Sample records for unique analytical tools

  1. Using Maps in Web Analytics to Evaluate the Impact of Web-Based Extension Programs

    ERIC Educational Resources Information Center

    Veregin, Howard

    2015-01-01

    Maps can be a valuable addition to the Web analytics toolbox for Extension programs that use the Web to disseminate information. Extension professionals use Web analytics tools to evaluate program impacts. Maps add a unique perspective through visualization and analysis of geographic patterns and their relationships to other variables. Maps can…

  2. A Simplified, General Approach to Simulating from Multivariate Copula Functions

    Treesearch

    Barry Goodwin

    2012-01-01

    Copulas have become an important analytic tool for characterizing multivariate distributions and dependence. One is often interested in simulating data from copula estimates. The process can be analytically and computationally complex and usually involves steps that are unique to a given parametric copula. We describe an alternative approach that uses ‘Probability-...
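
    The snippet above breaks off before the procedure is given, so the following is only a generic sketch of the most common simulation route for copulas (a Gaussian copula via correlated normals, the normal CDF, and marginal inverse CDFs), not necessarily the authors' approach. The correlation matrix and marginals are illustrative assumptions.

```python
# Generic sketch: simulating dependent data through a Gaussian copula.
# The correlation matrix and marginal distributions are illustrative
# assumptions, not the parametric copula used in the cited paper.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
corr = np.array([[1.0, 0.6],
                 [0.6, 1.0]])  # assumed copula correlation

# 1. Draw correlated standard normal variates.
z = rng.multivariate_normal(mean=np.zeros(2), cov=corr, size=10_000)

# 2. Map them to uniforms with the standard normal CDF (the copula sample).
u = stats.norm.cdf(z)

# 3. Push the uniforms through inverse CDFs of the desired marginals.
x1 = stats.gamma(a=2.0, scale=1.5).ppf(u[:, 0])  # assumed gamma marginal
x2 = stats.lognorm(s=0.5).ppf(u[:, 1])           # assumed lognormal marginal

print(np.corrcoef(x1, x2)[0, 1])  # dependence induced in the simulated data
```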

  3. Experiments with Analytic Centers: A confluence of data, tools and help in using them.

    NASA Astrophysics Data System (ADS)

    Little, M. M.; Crichton, D. J.; Hines, K.; Cole, M.; Quam, B. M.

    2017-12-01

    Traditional repositories have been primarily focused on data stewardship. Over the past two decades, data scientists have attempted to overlay a superstructure to make these repositories more amenable to analysis tasks, with limited success. This poster will summarize lessons learned and some realizations regarding what it takes to create an analytic center. As the volume of Earth Science data grows and the sophistication of analytic tools improves, a pattern has emerged that indicates different science communities uniquely apply a selection of tools to the data to produce scientific results. Infrequently do the experiences of one group help steer other groups. How can the information technology community seed these domains with tools that conform to the thought processes and experiences of that particular science group? What types of successful technology infusions have occurred, and how does technology get adopted? AIST has been experimenting with the management of this analytic center process; this paper will summarize the results and indicate a direction for future infusion attempts.

  4. Mechanical and Electronic Approaches to Improve the Sensitivity of Microcantilever Sensors

    PubMed Central

    Mutyala, Madhu Santosh Ku; Bandhanadham, Deepika; Pan, Liu; Pendyala, Vijaya Rohini; Ji, Hai-Feng

    2010-01-01

    Advances in the field of Micro Electro Mechanical Systems (MEMS) and their uses now offer unique opportunities in the design of ultrasensitive analytical tools. The analytical community continues to search for cost-effective, reliable, and even portable analytical techniques that can give fast, reliable results for a variety of chemicals and biomolecules. Microcantilevers (MCLs) have emerged as a unique platform for label-free biosensors or bioassays. Several electronic designs, including piezoresistive, piezoelectric, and capacitive approaches, have been applied to measure the bending or frequency change of the MCLs upon exposure to chemicals. This review summarizes mechanical, fabrication, and electronic approaches to increase the sensitivity of microcantilever (MCL) sensors. PMID:20975987

  5. AMOEBA: Designing for Collaboration in Computer Science Classrooms through Live Learning Analytics

    ERIC Educational Resources Information Center

    Berland, Matthew; Davis, Don; Smith, Carmen Petrick

    2015-01-01

    AMOEBA is a unique tool to support teachers' orchestration of collaboration among novice programmers in a non-traditional programming environment. The AMOEBA tool was designed and utilized to facilitate collaboration in a classroom setting in real time among novice middle school and high school programmers utilizing the IPRO programming…

  6. Manipulability, force, and compliance analysis for planar continuum manipulators

    NASA Technical Reports Server (NTRS)

    Gravagne, Ian A.; Walker, Ian D.

    2002-01-01

    Continuum manipulators, inspired by the natural capabilities of elephant trunks and octopus tentacles, may find niche applications in areas like human-robot interaction, multiarm manipulation, and unknown environment exploration. However, their true capabilities will remain largely inaccessible without proper analytical tools to evaluate their unique properties. Ellipsoids have long served as one of the foremost analytical tools available to the robotics researcher, and the purpose of this paper is to first formulate, and then to examine, three types of ellipsoids for continuum robots: manipulability, force, and compliance.

  7. Manipulability, force, and compliance analysis for planar continuum manipulators.

    PubMed

    Gravagne, Ian A; Walker, Ian D

    2002-06-01

    Continuum manipulators, inspired by the natural capabilities of elephant trunks and octopus tentacles, may find niche applications in areas like human-robot interaction, multiarm manipulation, and unknown environment exploration. However, their true capabilities will remain largely inaccessible without proper analytical tools to evaluate their unique properties. Ellipsoids have long served as one of the foremost analytical tools available to the robotics researcher, and the purpose of this paper is to first formulate, and then to examine, three types of ellipsoids for continuum robots: manipulability, force, and compliance.
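
    For readers unfamiliar with the ellipsoids named in the two records above, the classical rigid-link definitions are recalled below as background only; the paper's contribution is their generalization to continuum manipulators, which is not reproduced here. J denotes the manipulator Jacobian with \dot{x} = J\dot{q}, assumed to have full row rank.

```latex
% Classical manipulability and force ellipsoids for a rigid-link manipulator
% with Jacobian J (full row rank assumed); the continuum-robot versions in the
% paper require a different formulation and are not reproduced here.
E_v = \left\{ \dot{x} : \dot{x}^{\mathsf{T}} \bigl(J J^{\mathsf{T}}\bigr)^{-1} \dot{x} \le 1 \right\},
\qquad
E_f = \left\{ f : f^{\mathsf{T}} \bigl(J J^{\mathsf{T}}\bigr) f \le 1 \right\}.
```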

  8. A note on a simplified and general approach to simulating from multivariate copula functions

    Treesearch

    Barry K. Goodwin

    2013-01-01

    Copulas have become an important analytic tool for characterizing multivariate distributions and dependence. One is often interested in simulating data from copula estimates. The process can be analytically and computationally complex and usually involves steps that are unique to a given parametric copula. We describe an alternative approach that uses ‘Probability-...

  9. Recombinant drugs-on-a-chip: The usage of capillary electrophoresis and trends in miniaturized systems - A review.

    PubMed

    Morbioli, Giorgio Gianini; Mazzu-Nascimento, Thiago; Aquino, Adriano; Cervantes, Cesar; Carrilho, Emanuel

    2016-09-07

    We present here a critical review covering conventional analytical tools of recombinant drug analysis and discuss their evolution towards miniaturized systems foreseeing a possible unique recombinant drug-on-a-chip device. Recombinant protein drugs and/or pro-drug analysis require sensitive and reproducible analytical techniques for quality control to ensure safety and efficacy of drugs according to regulatory agencies. The versatility of miniaturized systems combined with their low-cost could become a major trend in recombinant drugs and bioprocess analysis. Miniaturized systems are capable of performing conventional analytical and proteomic tasks, allowing for interfaces with other powerful techniques, such as mass spectrometry. Microdevices can be applied during the different stages of recombinant drug processing, such as gene isolation, DNA amplification, cell culture, protein expression, protein separation, and analysis. In addition, organs-on-chips have appeared as a viable alternative to testing biodrug pharmacokinetics and pharmacodynamics, demonstrating the capabilities of the miniaturized systems. The integration of individual established microfluidic operations and analytical tools in a single device is a challenge to be overcome to achieve a unique recombinant drug-on-a-chip device. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. Physics of cosmological cascades and observable properties

    NASA Astrophysics Data System (ADS)

    Fitoussi, T.; Belmont, R.; Malzac, J.; Marcowith, A.; Cohen-Tanugi, J.; Jean, P.

    2017-04-01

    TeV photons from extragalactic sources are absorbed in the intergalactic medium and initiate electromagnetic cascades. These cascades offer a unique tool to probe the properties of the universe at cosmological scales. We present a new Monte Carlo code dedicated to the physics of such cascades. This code has been tested against both published results and analytical approximations, and is made publicly available. Using this numerical tool, we investigate the main cascade properties (spectrum, halo extension and time delays), and study in detail their dependence on the physical parameters (extragalactic magnetic field, extragalactic background light, source redshift, source spectrum and beaming emission). The limitations of analytical solutions are emphasized. In particular, analytical approximations account only for the first generation of photons and higher branches of the cascade tree are neglected.

  11. A Shoebox Polarimeter: An Inexpensive Analytical Tool for Teachers and Students

    ERIC Educational Resources Information Center

    Mehta, Akash; Greenbowe, Thomas J.

    2011-01-01

    A polarimeter can determine the optical activity of an organic or inorganic compound by providing information about the optical rotation of plane-polarized light when transmitted through that compound. This "Journal" has reported various construction methods for polarimeters. We report a unique construction using a shoebox, recycled office…

  12. Vortex-Lattice Utilization [in aeronautical engineering and aircraft design]

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The many novel, innovative, and unique implementations and applications of the vortex-lattice method to aerodynamic design and analysis, which have been performed by industry, government, and universities, are presented. Although this analytical tool is not new, it continues to be utilized and refined in the aeronautical community.

  13. Surprises and insights from long-term aquatic datasets and experiments

    Treesearch

    Walter K. Dodds; Christopher T. Robinson; Evelyn E. Gaiser; Gretchen J.A. Hansen; Heather Powell; Joseph M. Smith; Nathaniel B. Morse; Sherri L. Johnson; Stanley V. Gregory; Tisza Bell; Timothy K. Kratz; William H. McDowell

    2012-01-01

    Long-term research on freshwater ecosystems provides insights that can be difficult to obtain from other approaches. Widespread monitoring of ecologically relevant water-quality parameters spanning decades can facilitate important tests of ecological principles. Unique long-term data sets and analytical tools are increasingly available, allowing for powerful and...

  14. Genomics Portals: integrative web-platform for mining genomics data.

    PubMed

    Shinde, Kaustubh; Phatak, Mukta; Johannes, Freudenberg M; Chen, Jing; Li, Qian; Vineet, Joshi K; Hu, Zhen; Ghosh, Krishnendu; Meller, Jaroslaw; Medvedovic, Mario

    2010-01-13

    A large amount of experimental data generated by modern high-throughput technologies is available through various public repositories. Our knowledge about molecular interaction networks, functional biological pathways and transcriptional regulatory modules is rapidly expanding, and is being organized in lists of functionally related genes. Jointly, these two sources of information hold a tremendous potential for gaining new insights into functioning of living systems. Genomics Portals platform integrates access to an extensive knowledge base and a large database of human, mouse, and rat genomics data with basic analytical visualization tools. It provides the context for analyzing and interpreting new experimental data and the tool for effective mining of a large number of publicly available genomics datasets stored in the back-end databases. The uniqueness of this platform lies in the volume and the diversity of genomics data that can be accessed and analyzed (gene expression, ChIP-chip, ChIP-seq, epigenomics, computationally predicted binding sites, etc), and the integration with an extensive knowledge base that can be used in such analysis. The integrated access to primary genomics data, functional knowledge and analytical tools makes Genomics Portals platform a unique tool for interpreting results of new genomics experiments and for mining the vast amount of data stored in the Genomics Portals backend databases. Genomics Portals can be accessed and used freely at http://GenomicsPortals.org.

  15. Genomics Portals: integrative web-platform for mining genomics data

    PubMed Central

    2010-01-01

    Background A large amount of experimental data generated by modern high-throughput technologies is available through various public repositories. Our knowledge about molecular interaction networks, functional biological pathways and transcriptional regulatory modules is rapidly expanding, and is being organized in lists of functionally related genes. Jointly, these two sources of information hold a tremendous potential for gaining new insights into functioning of living systems. Results Genomics Portals platform integrates access to an extensive knowledge base and a large database of human, mouse, and rat genomics data with basic analytical visualization tools. It provides the context for analyzing and interpreting new experimental data and the tool for effective mining of a large number of publicly available genomics datasets stored in the back-end databases. The uniqueness of this platform lies in the volume and the diversity of genomics data that can be accessed and analyzed (gene expression, ChIP-chip, ChIP-seq, epigenomics, computationally predicted binding sites, etc), and the integration with an extensive knowledge base that can be used in such analysis. Conclusion The integrated access to primary genomics data, functional knowledge and analytical tools makes Genomics Portals platform a unique tool for interpreting results of new genomics experiments and for mining the vast amount of data stored in the Genomics Portals backend databases. Genomics Portals can be accessed and used freely at http://GenomicsPortals.org. PMID:20070909

  16. Analysis of Distribution System and Domestic Service Line Pipe Deposits to Understand Water Treatment/Metal Release Relationships

    EPA Science Inventory

    This project puts the U.S. Environmental Protection Agency (EPA) into a unique position of being able to bring analytical tools to bear to solve or anticipate future drinking water infrastructure water quality and metallic or cement material performance problems, for which little...

  17. Single Cell Proteomics in Biomedicine: High-dimensional Data Acquisition, Visualization and Analysis

    PubMed Central

    Su, Yapeng; Shi, Qihui; Wei, Wei

    2017-01-01

    New insights on cellular heterogeneity in the last decade provoke the development of a variety of single cell omics tools at a lightning pace. The resultant high-dimensional single cell data generated by these tools require new theoretical approaches and analytical algorithms for effective visualization and interpretation. In this review, we briefly survey the state-of-the-art single cell proteomic tools with a particular focus on data acquisition and quantification, followed by an elaboration of a number of statistical and computational approaches developed to date for dissecting the high-dimensional single cell data. The underlying assumptions, unique features and limitations of the analytical methods with the designated biological questions they seek to answer will be discussed. Particular attention will be given to those information theoretical approaches that are anchored in a set of first principles of physics and can yield detailed (and often surprising) predictions. PMID:28128880

  18. Model and Analytic Processes for Export License Assessments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thompson, Sandra E.; Whitney, Paul D.; Weimar, Mark R.

    2011-09-29

    This paper represents the Department of Energy Office of Nonproliferation Research and Development (NA-22) Simulations, Algorithms and Modeling (SAM) Program's first effort to identify and frame analytical methods and tools to aid export control professionals in effectively predicting proliferation intent, a complex, multi-step and multi-agency process. The report focuses on analytical modeling methodologies that alone, or combined, may improve the proliferation export control license approval process. It is a follow-up to an earlier paper describing information sources and environments related to international nuclear technology transfer. This report describes the decision criteria used to evaluate modeling techniques and tools to determine which approaches will be investigated during the final 2 years of the project. The report also details the motivation for why new modeling techniques and tools are needed. The analytical modeling methodologies will enable analysts to evaluate the information environment for relevance to detecting proliferation intent, with specific focus on assessing risks associated with transferring dual-use technologies. Dual-use technologies can be used in both weapons and commercial enterprises. A decision-framework was developed to evaluate which of the different analytical modeling methodologies would be most appropriate conditional on the uniqueness of the approach, data availability, laboratory capabilities, relevance to NA-22 and Office of Arms Control and Nonproliferation (NA-24) research needs and the impact if successful. Modeling methodologies were divided into whether they could help micro-level assessments (e.g., help improve individual license assessments) or macro-level assessment. Macro-level assessment focuses on suppliers, technology, consumers, economies, and proliferation context. Macro-level assessment technologies scored higher in the area of uniqueness because less work has been done at the macro level. An approach to developing testable hypotheses for the macro-level assessment methodologies is provided. The outcome of this work suggests that we should develop a Bayes Net for micro-level analysis and continue to focus on Bayes Net, System Dynamics and Economic Input/Output models for assessing macro-level problems. Simultaneously, we need to develop metrics for assessing intent in export control, including the risks and consequences associated with all aspects of export control.

  19. Fusion Analytics: A Data Integration System for Public Health and Medical Disaster Response Decision Support

    PubMed Central

    Passman, Dina B.

    2013-01-01

    Objective The objective of this demonstration is to show conference attendees how they can integrate, analyze, and visualize diverse data types from across a variety of systems by leveraging an off-the-shelf enterprise business intelligence (EBI) solution to support decision-making in disasters. Introduction Fusion Analytics is the data integration system developed by the Fusion Cell at the U.S. Department of Health and Human Services (HHS), Office of the Assistant Secretary for Preparedness and Response (ASPR). Fusion Analytics meaningfully augments traditional public and population health surveillance reporting by providing web-based data analysis and visualization tools. Methods Fusion Analytics serves as a one-stop-shop for the web-based data visualizations of multiple real-time data sources within ASPR. The 24-7 web availability makes it an ideal analytic tool for situational awareness and response, allowing stakeholders to access the portal from any internet-enabled device without installing any software. The Fusion Analytics data integration system was built using off-the-shelf EBI software. Fusion Analytics leverages the full power of statistical analysis software and delivers reports to users in a secure web-based environment. Fusion Analytics provides an example of how public health staff can develop and deploy a robust public health informatics solution using an off-the-shelf product and with limited development funding. It also provides the unique example of a public health information system that combines patient data for traditional disease surveillance with manpower and resource data to provide overall decision support for federal public health and medical disaster response operations. Conclusions We are currently in a unique position within public health. On the one hand, we have been gaining greater and greater access to electronic data of all kinds over the last few years. On the other, we are working in a time of reduced government spending to support leveraging this data for decision support with robust analytics and visualizations. Fusion Analytics provides an opportunity for attendees to see how various types of data are integrated into a single application for population health decision support. It also can provide them with ideas of how they can use their own staff to create analyses and reports that support their public health activities.

  20. Single cell proteomics in biomedicine: High-dimensional data acquisition, visualization, and analysis.

    PubMed

    Su, Yapeng; Shi, Qihui; Wei, Wei

    2017-02-01

    New insights on cellular heterogeneity in the last decade provoke the development of a variety of single cell omics tools at a lightning pace. The resultant high-dimensional single cell data generated by these tools require new theoretical approaches and analytical algorithms for effective visualization and interpretation. In this review, we briefly survey the state-of-the-art single cell proteomic tools with a particular focus on data acquisition and quantification, followed by an elaboration of a number of statistical and computational approaches developed to date for dissecting the high-dimensional single cell data. The underlying assumptions, unique features, and limitations of the analytical methods with the designated biological questions they seek to answer will be discussed. Particular attention will be given to those information theoretical approaches that are anchored in a set of first principles of physics and can yield detailed (and often surprising) predictions. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Parametrization of local CR automorphisms by finite jets and applications

    NASA Astrophysics Data System (ADS)

    Lamel, Bernhard; Mir, Nordine

    2007-04-01

    For any real-analytic hypersurface $M \subset \mathbb{C}^N$ which does not contain any complex-analytic subvariety of positive dimension, we show that for every point $p \in M$ the local real-analytic CR automorphisms of $M$ fixing $p$ can be parametrized real-analytically by their $\ell_p$ jets at $p$. As a direct application, we derive a Lie group structure for the topological group $\operatorname{Aut}(M,p)$. Furthermore, we also show that the order $\ell_p$ of the jet space in which the group $\operatorname{Aut}(M,p)$ embeds can be chosen to depend upper-semicontinuously on $p$. As a first consequence, it follows that given any compact real-analytic hypersurface $M$ in $\mathbb{C}^N$, there exists an integer $k$ depending only on $M$ such that for every point $p \in M$, germs at $p$ of CR diffeomorphisms mapping $M$ into another real-analytic hypersurface in $\mathbb{C}^N$ are uniquely determined by their $k$-jet at that point. Another consequence is the following boundary version of H. Cartan's uniqueness theorem: given any bounded domain $\Omega$ with smooth real-analytic boundary, there exists an integer $k$ depending only on $\partial\Omega$ such that if $H\colon \Omega \to \Omega$ is a proper holomorphic mapping extending smoothly up to $\partial\Omega$ near some point $p \in \partial\Omega$ with the same $k$-jet at $p$ as that of the identity mapping, then necessarily $H = \mathrm{Id}$. Our parametrization theorem also holds for the stability group of any essentially finite minimal real-analytic CR manifold of arbitrary codimension. One of the new main tools developed in the paper, which may be of independent interest, is a parametrization theorem for invertible solutions of a certain kind of singular analytic equations, which, roughly speaking, consists of inverting certain families of parametrized maps with singularities.

  2. Analytical Tools for Affordability Analysis

    DTIC Science & Technology

    2015-04-30

    flunk this basic test from their inception. —Honorable Ashton B. Carter (2010), Under Secretary of Defense for Acquisition, Technology, and Logistics... Testing, and Evaluation] funding has been lost to cancelled programs. (Decker & Wagner, 2011) The Army is scarcely unique in this regard. All... econometric model of how schedule affects cost should take advantage of these different cost categories and treat them separately when they are known

  3. Simulation validation and management

    NASA Astrophysics Data System (ADS)

    Illgen, John D.

    1995-06-01

    Illgen Simulation Technologies, Inc., has been working on interactive verification and validation programs for the past six years. As a result, they have evolved a methodology that has been adopted and successfully implemented by a number of different verification and validation programs. This methodology employs a unique case of computer-assisted software engineering (CASE) tools to reverse engineer source code and produce analytical outputs (flow charts and tables) that aid the engineer/analyst in the verification and validation process. We have found that the use of CASE tools saves time, which equates to improvements in both schedule and cost. This paper will describe the ISTI-developed methodology and how CASE tools are used in its support. Case studies will be discussed.

  4. Visualization techniques for computer network defense

    NASA Astrophysics Data System (ADS)

    Beaver, Justin M.; Steed, Chad A.; Patton, Robert M.; Cui, Xiaohui; Schultz, Matthew

    2011-06-01

    Effective visual analysis of computer network defense (CND) information is challenging due to the volume and complexity of both the raw and analyzed network data. A typical CND is comprised of multiple niche intrusion detection tools, each of which performs network data analysis and produces a unique alerting output. The state-of-the-practice in the situational awareness of CND data is the prevalent use of custom-developed scripts by Information Technology (IT) professionals to retrieve, organize, and understand potential threat events. We propose a new visual analytics framework, called the Oak Ridge Cyber Analytics (ORCA) system, for CND data that allows an operator to interact with all detection tool outputs simultaneously. Aggregated alert events are presented in multiple coordinated views with timeline, cluster, and swarm model analysis displays. These displays are complemented with both supervised and semi-supervised machine learning classifiers. The intent of the visual analytics framework is to improve CND situational awareness, to enable an analyst to quickly navigate and analyze thousands of detected events, and to combine sophisticated data analysis techniques with interactive visualization such that patterns of anomalous activities may be more easily identified and investigated.

  5. MemAxes: Visualization and Analytics for Characterizing Complex Memory Performance Behaviors.

    PubMed

    Gimenez, Alfredo; Gamblin, Todd; Jusufi, Ilir; Bhatele, Abhinav; Schulz, Martin; Bremer, Peer-Timo; Hamann, Bernd

    2018-07-01

    Memory performance is often a major bottleneck for high-performance computing (HPC) applications. Deepening memory hierarchies, complex memory management, and non-uniform access times have made memory performance behavior difficult to characterize, and users require novel, sophisticated tools to analyze and optimize this aspect of their codes. Existing tools target only specific factors of memory performance, such as hardware layout, allocations, or access instructions. However, today's tools do not suffice to characterize the complex relationships between these factors. Further, they require advanced expertise to be used effectively. We present MemAxes, a tool based on a novel approach for analytic-driven visualization of memory performance data. MemAxes uniquely allows users to analyze the different aspects related to memory performance by providing multiple visual contexts for a centralized dataset. We define mappings of sampled memory access data to new and existing visual metaphors, each of which enables a user to perform different analysis tasks. We present methods to guide user interaction by scoring subsets of the data based on known performance problems. This scoring is used to provide visual cues and automatically extract clusters of interest. We designed MemAxes in collaboration with experts in HPC and demonstrate its effectiveness in case studies.

  6. 21st century toolkit for optimizing population health through precision nutrition.

    PubMed

    O'Sullivan, Aifric; Henrick, Bethany; Dixon, Bonnie; Barile, Daniela; Zivkovic, Angela; Smilowitz, Jennifer; Lemay, Danielle; Martin, William; German, J Bruce; Schaefer, Sara Elizabeth

    2017-07-05

    Scientific, technological, and economic progress over the last 100 years all but eradicated problems of widespread food shortage and nutrient deficiency in developed nations. But now society is faced with a new set of nutrition problems related to energy imbalance and metabolic disease, which require new kinds of solutions. Recent developments in the area of new analytical tools enable us to systematically study large quantities of detailed and multidimensional metabolic and health data, providing the opportunity to address current nutrition problems through an approach called Precision Nutrition. This approach integrates different kinds of "big data" to expand our understanding of the complexity and diversity of human metabolism in response to diet. With these tools, we can more fully elucidate each individual's unique phenotype, or the current state of health, as determined by the interactions among biology, environment, and behavior. The tools of precision nutrition include genomics, metabolomics, microbiomics, phenotyping, high-throughput analytical chemistry techniques, longitudinal tracking with body sensors, informatics, data science, and sophisticated educational and behavioral interventions. These tools are enabling the development of more personalized and predictive dietary guidance and interventions that have the potential to transform how the public makes food choices and greatly improve population health.

  7. Bacterial discrimination by means of a universal array approach mediated by LDR (ligase detection reaction)

    PubMed Central

    Busti, Elena; Bordoni, Roberta; Castiglioni, Bianca; Monciardini, Paolo; Sosio, Margherita; Donadio, Stefano; Consolandi, Clarissa; Rossi Bernardi, Luigi; Battaglia, Cristina; De Bellis, Gianluca

    2002-01-01

    Background PCR amplification of bacterial 16S rRNA genes provides the most comprehensive and flexible means of sampling bacterial communities. Sequence analysis of these cloned fragments can provide a qualitative and quantitative insight of the microbial population under scrutiny although this approach is not suited to large-scale screenings. Other methods, such as denaturing gradient gel electrophoresis, heteroduplex or terminal restriction fragment analysis are rapid and therefore amenable to field-scale experiments. A very recent addition to these analytical tools is represented by microarray technology. Results Here we present our results using a Universal DNA Microarray approach as an analytical tool for bacterial discrimination. The proposed procedure is based on the properties of the DNA ligation reaction and requires the design of two probes specific for each target sequence. One oligo carries a fluorescent label and the other a unique sequence (cZipCode or complementary ZipCode) which identifies a ligation product. Ligated fragments, obtained in presence of a proper template (a PCR amplified fragment of the 16S rRNA gene) contain either the fluorescent label or the unique sequence and therefore are addressed to the location on the microarray where the ZipCode sequence has been spotted. Such an array is therefore "Universal", being unrelated to a specific molecular analysis. Here we present the design of probes specific for some groups of bacteria and their application to bacterial diagnostics. Conclusions The combined use of selective probes, ligation reaction and the Universal Array approach yielded an analytical procedure with a good power of discrimination among bacteria. PMID:12243651

  8. A review of the analytical simulation of aircraft crash dynamics

    NASA Technical Reports Server (NTRS)

    Fasanella, Edwin L.; Carden, Huey D.; Boitnott, Richard L.; Hayduk, Robert J.

    1990-01-01

    A large number of full scale tests of general aviation aircraft, helicopters, and one unique air-to-ground controlled impact of a transport aircraft were performed. Additionally, research was also conducted on seat dynamic performance, load-limiting seats, load limiting subfloor designs, and emergency-locator-transmitters (ELTs). Computer programs were developed to provide designers with methods for predicting accelerations, velocities, and displacements of collapsing structure and for estimating the human response to crash loads. The results of full scale aircraft and component tests were used to verify and guide the development of analytical simulation tools and to demonstrate impact load attenuating concepts. Analytical simulation of metal and composite aircraft crash dynamics are addressed. Finite element models are examined to determine their degree of corroboration by experimental data and to reveal deficiencies requiring further development.

  9. Analyte-Responsive Hydrogels: Intelligent Materials for Biosensing and Drug Delivery.

    PubMed

    Culver, Heidi R; Clegg, John R; Peppas, Nicholas A

    2017-02-21

    Nature has mastered the art of molecular recognition. For example, using synergistic non-covalent interactions, proteins can distinguish between molecules and bind a partner with incredible affinity and specificity. Scientists have developed, and continue to develop, techniques to investigate and better understand molecular recognition. As a consequence, analyte-responsive hydrogels that mimic these recognitive processes have emerged as a class of intelligent materials. These materials are unique not only in the type of analyte to which they respond but also in how molecular recognition is achieved and how the hydrogel responds to the analyte. Traditional intelligent hydrogels can respond to environmental cues such as pH, temperature, and ionic strength. The functional monomers used to make these hydrogels can be varied to achieve responsive behavior. For analyte-responsive hydrogels, molecular recognition can also be achieved by incorporating biomolecules with inherent molecular recognition properties (e.g., nucleic acids, peptides, enzymes, etc.) into the polymer network. Furthermore, in addition to typical swelling/syneresis responses, these materials exhibit unique responsive behaviors, such as gel assembly or disassembly, upon interaction with the target analyte. With the diverse tools available for molecular recognition and the ability to generate unique responsive behaviors, analyte-responsive hydrogels have found great utility in a wide range of applications. In this Account, we discuss strategies for making four different classes of analyte-responsive hydrogels, specifically, non-imprinted, molecularly imprinted, biomolecule-containing, and enzymatically responsive hydrogels. Then we explore how these materials have been incorporated into sensors and drug delivery systems, highlighting examples that demonstrate the versatility of these materials. For example, in addition to the molecular recognition properties of analyte-responsive hydrogels, the physicochemical changes that are induced upon analyte binding can be exploited to generate a detectable signal for sensing applications. As research in this area has grown, a number of creative approaches for improving the selectivity and sensitivity (i.e., detection limit) of these sensors have emerged. For applications in drug delivery systems, therapeutic release can be triggered by competitive molecular interactions or physicochemical changes in the network. Additionally, including degradable units within the network can enable sustained and responsive therapeutic release. Several exciting examples exploiting the analyte-responsive behavior of hydrogels for the treatment of cancer, diabetes, and irritable bowel syndrome are discussed in detail. We expect that creative and combinatorial approaches used in the design of analyte-responsive hydrogels will continue to yield materials with great potential in the fields of sensing and drug delivery.

  10. SIFT-MS and FA-MS methods for ambient gas phase analysis: developments and applications in the UK.

    PubMed

    Smith, David; Španěl, Patrik

    2015-04-21

    Selected ion flow tube mass spectrometry, SIFT-MS, a relatively new gas/vapour phase analytical method, is derived from the much earlier selected ion flow tube, SIFT, used for the study of gas phase ion-molecule reactions. Both the SIFT and SIFT-MS techniques were conceived and developed in the UK, the former at Birmingham University, the latter at Keele University along with the complementary flowing afterglow mass spectrometry, FA-MS, technique. The focus of this short review is largely to describe the origins, developments and, most importantly, the unique features of SIFT-MS as an analytical tool for ambient analysis and to indicate its growing use to analyse humid air, especially exhaled breath, its unique place as an on-line, real-time analytical method and its growing use and applications as a non-invasive diagnostic in clinical diagnosis and therapeutic monitoring, principally within several UK universities and hospitals, and briefly in the wider world. A few case studies are outlined that show the potential of SIFT-MS and FA-MS in the detection and quantification of metabolites in exhaled breath as a step towards recognising pathophysiology indicative of disease and the presence of bacterial and fungal infection of the airways and lungs. Particular cases include the detection of Pseudomonas aeruginosa infection of the airways of patients with cystic fibrosis (SIFT-MS) and the measurement of total body water in patients with chronic kidney disease (FA-MS). The growing exploitation of SIFT-MS in other areas of research and commerce is briefly listed to show the wide utility of this unique UK-developed analytical method, and future prospects and developments are alluded to.

  11. Proteoglycomics: Recent Progress and Future Challenges

    PubMed Central

    Ly, Mellisa; Laremore, Tatiana N.

    2010-01-01

    Abstract Proteoglycomics is a systematic study of structure, expression, and function of proteoglycans, a posttranslationally modified subset of a proteome. Although relying on the established technologies of proteomics and glycomics, proteoglycomics research requires unique approaches for elucidating structure–function relationships of both proteoglycan components, glycosaminoglycan chain, and core protein. This review discusses our current understanding of structure and function of proteoglycans, major players in the development, normal physiology, and disease. A brief outline of the proteoglycomic sample preparation and analysis is provided along with examples of several recent proteoglycomic studies. Unique challenges in the characterization of glycosaminoglycan component of proteoglycans are discussed, with emphasis on the many analytical tools used and the types of information they provide. PMID:20450439

  12. Development and biological applications of optical tweezers and Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Xie, Chang'an

    Optical tweezers is a three-dimensional manipulation tool that employs a gradient force originating from a single highly focused laser beam. Raman spectroscopy is a molecular analytical tool that can give a highly unique "fingerprint" for each substance by measuring the unique vibrations of its molecules. The combination of these two optical techniques offers a new tool for the manipulation and identification of single biological cells and microscopic particles. In this thesis, we designed and implemented a Laser-Tweezers-Raman-Spectroscopy (LTRS) system, also called the Raman-tweezers, for the simultaneous capture and analysis of both biological particles and non-biological particles. We show that microparticles can be conveniently captured at the focus of a laser beam and the Raman spectra of trapped particles can be acquired with high quality. The LTRS system overcomes the intrinsic Brownian motion and cell motility of microparticles in solution and provides a promising tool for identifying suspicious agents in situ. In order to increase the signal-to-noise ratio, several schemes were employed in the LTRS system to reduce the blank noise and the fluorescence signal coming from analytes and the surrounding background. These techniques include near-infrared excitation, optical levitation, confocal microscopy, and frequency-shifted Raman difference. The LTRS system has been applied to studies in cell biology at the single-cell level. With the Raman-tweezers system we built, we studied the dynamic physiological processes of single living cells, including the cell cycle, the transcription and translation of recombinant protein in transgenic yeast cells, and T cell activation. We also studied cell damage and the associated biochemical processes in optical traps and under UV radiation, and evaluated heating by near-infrared Raman spectroscopy. These studies show that the Raman-tweezers system can provide rapid and reliable diagnosis of cellular disorders and can be used as a valuable tool to study cellular processes within single living cells or intracellular organelles, and may aid research in molecular and cellular biology.

  13. Visualization Techniques for Computer Network Defense

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beaver, Justin M; Steed, Chad A; Patton, Robert M

    2011-01-01

    Effective visual analysis of computer network defense (CND) information is challenging due to the volume and complexity of both the raw and analyzed network data. A typical CND is comprised of multiple niche intrusion detection tools, each of which performs network data analysis and produces a unique alerting output. The state-of-the-practice in the situational awareness of CND data is the prevalent use of custom-developed scripts by Information Technology (IT) professionals to retrieve, organize, and understand potential threat events. We propose a new visual analytics framework, called the Oak Ridge Cyber Analytics (ORCA) system, for CND data that allows an operator to interact with all detection tool outputs simultaneously. Aggregated alert events are presented in multiple coordinated views with timeline, cluster, and swarm model analysis displays. These displays are complemented with both supervised and semi-supervised machine learning classifiers. The intent of the visual analytics framework is to improve CND situational awareness, to enable an analyst to quickly navigate and analyze thousands of detected events, and to combine sophisticated data analysis techniques with interactive visualization such that patterns of anomalous activities may be more easily identified and investigated.

  14. Bioinspired Methodology for Artificial Olfaction

    PubMed Central

    Raman, Baranidharan; Hertz, Joshua L.; Benkstein, Kurt D.; Semancik, Steve

    2008-01-01

    Artificial olfaction is a potential tool for noninvasive chemical monitoring. Application of “electronic noses” typically involves recognition of “pretrained” chemicals, while long-term operation and generalization of training to allow chemical classification of “unknown” analytes remain challenges. The latter analytical capability is critically important, as it is unfeasible to pre-expose the sensor to every analyte it might encounter. Here, we demonstrate a biologically inspired approach where the recognition and generalization problems are decoupled and resolved in a hierarchical fashion. Analyte composition is refined in a progression from general (e.g., target is a hydrocarbon) to precise (e.g., target is ethane), using highly optimized response features for each step. We validate this approach using a MEMS-based chemiresistive microsensor array. We show that this approach, a unique departure from existing methodologies in artificial olfaction, allows the recognition module to better mitigate sensor-aging effects and to better classify unknowns, enhancing the utility of chemical sensors for real-world applications. PMID:18855409

  15. Thin silica shell coated Ag assembled nanostructures for expanding generality of SERS analytes

    PubMed Central

    Kang, Yoo-Lee; Lee, Minwoo; Kang, Homan; Kim, Jaehi; Pham, Xuan-Hung; Kim, Tae Han; Hahm, Eunil; Lee, Yoon-Sik; Jeong, Dae Hong

    2017-01-01

    Surface-enhanced Raman scattering (SERS) provides a unique non-destructive spectroscopic fingerprint for chemical detection. However, intrinsic differences in the affinity of analyte molecules to the metal surface hinder SERS as a universal quantitative detection tool for various analyte molecules simultaneously. This must be overcome while keeping close proximity of analyte molecules to the metal surface. Moreover, assembled metal nanoparticle (NP) structures might be more beneficial for sensitive and reliable detection of chemicals than single-NP structures. For this purpose, here we introduce thin silica-coated and assembled Ag NPs (SiO2@Ag@SiO2 NPs) for simultaneous and quantitative detection of chemicals that have different intrinsic affinities to silver metal. These SiO2@Ag@SiO2 NPs could detect each SERS peak of aniline or 4-aminothiophenol (4-ATP) from the mixture with limits of detection (LOD) of 93 ppm and 54 ppb, respectively. E-field distribution based on interparticle distance was simulated using discrete dipole approximation (DDA) calculation to gain insight into enhanced scattering of these thin silica-coated Ag NP assemblies. These NPs were successfully applied to detect aniline in river water and tap water. Results suggest that SiO2@Ag@SiO2 NP-based SERS detection systems can be used as a simple and universal detection tool for environmental pollutants and food safety. PMID:28570633

  16. Analytical design of a hyper-spectral imaging spectrometer utilizing a convex grating

    NASA Astrophysics Data System (ADS)

    Kim, Seo H.; Kong, Hong J.; Ku, Hana; Lee, Jun H.

    2012-09-01

    This paper describes a new design method for hyper-spectral imaging spectrometers utilizing a convex grating. Hyper-spectral imaging (HSI) systems are powerful tools in the field of remote sensing. HSI systems collect at least 100 spectral bands of 10~20 nm width. Because the spectral signature is distinct and unique for each material, it should be possible to discriminate between one material and another based on differences in their spectral signatures. I mathematically analyzed the parameters for an intelligent initial design. The main concept is the derivation of the "ring of minimum aberration without vignetting". This work is a kind of analytical design of an Offner imaging spectrometer. Several experimental methods will also be devised to evaluate the performance of the imaging spectrometer.

  17. Imaging MALDI MS of Dosed Brain Tissues Utilizing an Alternative Analyte Pre-extraction Approach

    NASA Astrophysics Data System (ADS)

    Quiason, Cristine M.; Shahidi-Latham, Sheerin K.

    2015-06-01

    Matrix-assisted laser desorption ionization (MALDI) imaging mass spectrometry has been adopted in the pharmaceutical industry as a useful tool to detect xenobiotic distribution within tissues. A unique sample preparation approach for MALDI imaging has been described here for the extraction and detection of cobimetinib and clozapine, which were previously undetectable in mouse and rat brain using a single matrix application step. Employing a combination of a buffer wash and a cyclohexane pre-extraction step prior to standard matrix application, the xenobiotics were successfully extracted and detected with an 8- to 20-fold gain in sensitivity. This alternative approach for sample preparation could serve as an advantageous option when encountering difficult-to-detect analytes.

  18. LipidQC: Method Validation Tool for Visual Comparison to SRM 1950 Using NIST Interlaboratory Comparison Exercise Lipid Consensus Mean Estimate Values.

    PubMed

    Ulmer, Candice Z; Ragland, Jared M; Koelmel, Jeremy P; Heckert, Alan; Jones, Christina M; Garrett, Timothy J; Yost, Richard A; Bowden, John A

    2017-12-19

    As advances in analytical separation techniques, mass spectrometry instrumentation, and data processing platforms continue to spur growth in the lipidomics field, more structurally unique lipid species are detected and annotated. The lipidomics community is in need of benchmark reference values to assess the validity of various lipidomics workflows in providing accurate quantitative measurements across the diverse lipidome. LipidQC addresses the harmonization challenge in lipid quantitation by providing a semiautomated process, independent of analytical platform, for visual comparison of experimental results of National Institute of Standards and Technology Standard Reference Material (SRM) 1950, "Metabolites in Frozen Human Plasma", against benchmark consensus mean concentrations derived from the NIST Lipidomics Interlaboratory Comparison Exercise.
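
    As a rough illustration of the comparison LipidQC automates, the sketch below scores a few experimental SRM 1950 lipid concentrations against consensus mean estimates. The lipid names, concentration values, column names, and the two-standard-deviation criterion are assumptions made for illustration; they are not taken from the NIST exercise or from the tool's actual interface.

```python
# Hypothetical sketch of comparing experimental SRM 1950 lipid concentrations
# to interlaboratory consensus mean estimates. All values and column names are
# invented for illustration.
import pandas as pd

consensus = pd.DataFrame({
    "lipid":          ["PC 34:1", "SM 34:1;O2", "CE 18:2"],
    "consensus_mean": [210.0, 95.0, 640.0],   # nmol/mL, assumed values
    "consensus_sd":   [25.0, 12.0, 80.0],
})
experimental = pd.DataFrame({
    "lipid":    ["PC 34:1", "SM 34:1;O2", "CE 18:2"],
    "measured": [198.0, 121.0, 655.0],
})

merged = experimental.merge(consensus, on="lipid")
merged["z_score"] = (merged["measured"] - merged["consensus_mean"]) / merged["consensus_sd"]
merged["within_2sd"] = merged["z_score"].abs() <= 2  # assumed acceptance band
print(merged[["lipid", "measured", "consensus_mean", "z_score", "within_2sd"]])
```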

  19. Electrochemical lectin based biosensors as a label-free tool in glycomics

    PubMed Central

    Bertók, Tomáš; Katrlík, Jaroslav; Gemeiner, Peter; Tkac, Jan

    2016-01-01

    Glycans and other saccharide moieties attached to proteins and lipids, or present on the surface of a cell, are actively involved in numerous physiological or pathological processes. Their structural flexibility (which is based on the formation of various kinds of linkages between saccharides) makes glycans superb “identity cards”. In fact, glycans can form more “words” or “codes” (i.e., unique sequences) from the same number of “letters” (building blocks) than DNA or proteins. Glycans are physicochemically similar and it is not a trivial task to identify their sequence, or - even more challenging - to link a given glycan to a particular physiological or pathological process. Lectins can recognise differences in glycan compositions even in their bound state and are therefore among the most useful tools for deciphering the “glycocode”. Thus, lectin-based biosensors working in a label-free mode can effectively complement the current weaponry of analytical tools in glycomics. This review gives an introduction into the area of glycomics and then focuses on the design, analytical performance, and practical utility of lectin-based electrochemical label-free biosensors for the detection of isolated glycoproteins or intact cells. PMID:27239071

  20. Heterogeneous postsurgical data analytics for predictive modeling of mortality risks in intensive care units.

    PubMed

    Yun Chen; Hui Yang

    2014-01-01

    The rapid advancements of biomedical instrumentation and healthcare technology have resulted in data-rich environments in hospitals. However, the meaningful information extracted from rich datasets is limited. There is a dire need to go beyond current medical practices, and develop data-driven methods and tools that will enable and help (i) the handling of big data, (ii) the extraction of data-driven knowledge, (iii) the exploitation of acquired knowledge for optimizing clinical decisions. The present study focuses on the prediction of mortality rates in Intensive Care Units (ICU) using patient-specific healthcare recordings. It is worth mentioning that postsurgical monitoring in ICU leads to massive datasets with unique properties, e.g., variable heterogeneity, patient heterogeneity, and time asynchronization. To cope with the challenges in ICU datasets, we developed the postsurgical decision support system with a series of analytical tools, including data categorization, data pre-processing, feature extraction, feature selection, and predictive modeling. Experimental results show that the proposed data-driven methodology outperforms traditional approaches and yields better results based on the evaluation of real-world ICU data from 4000 subjects in the database. This research shows great potentials for the use of data-driven analytics to improve the quality of healthcare services.
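
    The record above lists a chain of pre-processing, feature extraction, feature selection, and predictive modeling. A minimal sketch of such a chain is shown below on synthetic, class-imbalanced data standing in for per-patient ICU features; the data, feature counts, and choice of logistic regression are assumptions for illustration and do not reproduce the authors' system.

```python
# Minimal sketch of a mortality-risk prediction chain: scaling, feature
# selection, and a predictive model, evaluated with cross-validation.
# Synthetic data and model choice are illustrative assumptions only.
from sklearn.datasets import make_classification
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Stand-in for heterogeneous ICU features aggregated per patient.
X, y = make_classification(n_samples=4000, n_features=60, n_informative=12,
                           weights=[0.9, 0.1], random_state=0)

pipeline = Pipeline([
    ("scale",  StandardScaler()),                   # pre-processing
    ("select", SelectKBest(f_classif, k=20)),       # feature selection
    ("model",  LogisticRegression(max_iter=1000)),  # predictive modeling
])

auc = cross_val_score(pipeline, X, y, cv=5, scoring="roc_auc")
print("cross-validated AUC:", round(auc.mean(), 3))
```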

  1. Surface-enhanced Raman spectroscopy for the detection of pathogenic DNA and protein in foods

    NASA Astrophysics Data System (ADS)

    Chowdhury, Mustafa H.; Atkinson, Brad; Good, Theresa; Cote, Gerard L.

    2003-07-01

    Traditional Raman spectroscopy, while extremely sensitive to structure and conformation, is an ineffective tool for the detection of bioanalytes at the sub-millimolar level. Surface Enhanced Raman Spectroscopy (SERS) is a technique developed more recently that has been used with applaudable success to enhance the Raman cross-section of a molecule by factors of 10⁶ to 10¹⁴. This technique can be exploited in a nanoscale biosensor for the detection of pathogenic proteins and DNA in foods by using a biorecognition molecule to bring a target analyte in close proximity to the metal surface. This is expected to produce a SERS signal of the target analyte, thus making it possible to easily discriminate between the target analyte and possible confounders. In order for the sensor to be effective, the Raman spectra of the target analyte would have to be distinct from those of the biorecognition molecule, as both would be in close proximity to the metal surface and thus be subjected to the SERS effect. In our preliminary studies we have successfully used citrate-reduced silver colloidal particles to obtain unique SERS spectra of α-helical and β-sheet bovine serum albumin (BSA) that served as models of an α-helical antibody (biorecognition element) and a β-sheet target protein (pathogenic prion). In addition, the unique SERS spectra of double-stranded and single-stranded DNA were also obtained, where the single-stranded DNA served as the model for the biorecognition element and the double-stranded DNA served as the model for the DNA probe/target hybrid. This provides a confirmation of the feasibility of the method, which opens opportunities for potentially widespread applications in the detection of food pathogens, biowarfare agents, and other bio-analytes.

  2. Analytical modelling of Halbach linear generator incorporating pole shifting and piece-wise spring for ocean wave energy harvesting

    NASA Astrophysics Data System (ADS)

    Tan, Yimin; Lin, Kejian; Zu, Jean W.

    2018-05-01

    The Halbach permanent magnet (PM) array has attracted tremendous research attention in the development of electromagnetic generators for its unique properties. This paper has proposed a generalized analytical model for linear generators. Slotted stator pole-shifting and the implementation of a Halbach array have been combined for the first time. Initially, the magnetization components of the Halbach array have been determined using Fourier decomposition. Then, based on the magnetic scalar potential method, the magnetic field distribution has been derived employing specially treated boundary conditions. FEM analysis has been conducted to verify the analytical model. A slotted linear PM generator with Halbach PM has been constructed to validate the model and further improved using piece-wise springs to trigger full-range reciprocating motion. A dynamic model has been developed to characterize the dynamic behavior of the slider. This analytical method provides an effective tool in the development and optimization of Halbach PM generators. The experimental results indicate that piece-wise springs can be employed to improve generator performance under low excitation frequency.
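
    The abstract above states that the Halbach array's magnetization components are expanded by Fourier decomposition before the field solution. For orientation, a generic odd-harmonic form of such an expansion over a pole pitch \tau_p is written out below; the actual coefficients, the pole-shifting terms, and the coordinate conventions used in the paper are not reproduced here and should be treated as assumptions.

```latex
% Generic Fourier decomposition of the magnetization of a linear Halbach array
% (pole pitch \tau_p, odd harmonics n); coefficients M_{xn}, M_{yn} depend on
% segment widths and remanence, and the paper's exact series may differ.
\begin{aligned}
M_x(x) &= \sum_{n=1,3,5,\dots} M_{xn}\, \sin\!\left(\frac{n\pi x}{\tau_p}\right), \\
M_y(x) &= \sum_{n=1,3,5,\dots} M_{yn}\, \cos\!\left(\frac{n\pi x}{\tau_p}\right).
\end{aligned}
```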

  3. Singular value decomposition for the truncated Hilbert transform

    NASA Astrophysics Data System (ADS)

    Katsevich, A.

    2010-11-01

    Starting from a breakthrough result by Gelfand and Graev, inversion of the Hilbert transform became a very important tool for image reconstruction in tomography. In particular, their result is useful when the tomographic data are truncated and one deals with an interior problem. As was established recently, the interior problem admits a stable and unique solution when some a priori information about the object being scanned is available. The most common approach to solving the interior problem is based on converting it to the Hilbert transform and performing analytic continuation. Depending on what type of tomographic data are available, one gets different Hilbert inversion problems. In this paper, we consider two such problems and establish singular value decomposition for the operators involved. We also propose algorithms for performing analytic continuation.
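
    Since the record above turns on the Hilbert transform restricted to an interval, the standard definition of the finite (truncated) Hilbert transform is recalled below as background; the sign and normalization conventions, and the particular truncation geometries studied in the paper, may differ from this generic form.

```latex
% Finite (truncated) Hilbert transform of f on an interval (a, b),
% taken as a principal-value integral; conventions vary by author.
\bigl(H_{a,b} f\bigr)(x) \;=\; \frac{1}{\pi}\,\mathrm{p.v.}\!\int_{a}^{b} \frac{f(y)}{y - x}\, dy,
\qquad x \in (a, b).
```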

  4. Mining Health-Related Issues in Consumer Product Reviews by Using Scalable Text Analytics

    PubMed Central

    Torii, Manabu; Tilak, Sameer S.; Doan, Son; Zisook, Daniel S.; Fan, Jung-wei

    2016-01-01

    In an era when most of our life activities are digitized and recorded, opportunities abound to gain insights about population health. Online product reviews present a unique data source that is currently underexplored. Health-related information, although scarce, can be systematically mined in online product reviews. Leveraging natural language processing and machine learning tools, we were able to mine 1.3 million grocery product reviews for health-related information. The objectives of the study were as follows: (1) conduct quantitative and qualitative analysis on the types of health issues found in consumer product reviews; (2) develop a machine learning classifier to detect reviews that contain health-related issues; and (3) gain insights about the task characteristics and challenges for text analytics to guide future research. PMID:27375358

  5. Mining Health-Related Issues in Consumer Product Reviews by Using Scalable Text Analytics.

    PubMed

    Torii, Manabu; Tilak, Sameer S; Doan, Son; Zisook, Daniel S; Fan, Jung-Wei

    2016-01-01

    In an era when most of our life activities are digitized and recorded, opportunities abound to gain insights about population health. Online product reviews present a unique data source that is currently underexplored. Health-related information, although scarce, can be systematically mined in online product reviews. Leveraging natural language processing and machine learning tools, we were able to mine 1.3 million grocery product reviews for health-related information. The objectives of the study were as follows: (1) conduct quantitative and qualitative analysis on the types of health issues found in consumer product reviews; (2) develop a machine learning classifier to detect reviews that contain health-related issues; and (3) gain insights about the task characteristics and challenges for text analytics to guide future research.
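
    A minimal sketch of the second objective, a classifier that flags reviews containing health-related issues, is given below. It uses a generic TF-IDF plus logistic regression baseline from scikit-learn with invented toy reviews and labels; the authors' actual features and model are not described here, so this only illustrates the task setup.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented toy examples: label 1 = review mentions a health-related issue
reviews = [
    "This snack gave me a terrible headache and an upset stomach",
    "Great flavor, my kids love these crackers",
    "I broke out in hives after eating this, it must contain traces of nuts",
    "Arrived quickly and the packaging was intact",
]
labels = [1, 0, 1, 0]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2), min_df=1),
                    LogisticRegression(max_iter=1000))
clf.fit(reviews, labels)

print(clf.predict(["caused a severe allergic reaction",
                   "tastes fine, would buy again"]))
```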

  6. Coastal On-line Assessment and Synthesis Tool 2.0

    NASA Technical Reports Server (NTRS)

    Brown, Richard; Navard, Andrew; Nguyen, Beth

    2011-01-01

    COAST (Coastal On-line Assessment and Synthesis Tool) is a 3D, open-source Earth data browser developed by leveraging and enhancing previous NASA open-source tools. These tools use satellite imagery and elevation data in a way that allows any user to zoom from orbit view down into any place on Earth, and enables the user to experience Earth terrain in a visually rich 3D view. The benefits associated with taking advantage of an open-source geo-browser are that it is free, extensible, and offers a worldwide developer community that is available to provide additional development and improvement potential. What makes COAST unique is that it simplifies the process of locating and accessing data sources, and allows a user to combine them into a multi-layered and/or multi-temporal visual analytical look into possible data interrelationships and coeffectors for coastal environment phenomenology. COAST provides users with new data visual analytic capabilities. COAST has been upgraded to maximize use of open-source data access, viewing, and data manipulation software tools. The COAST 2.0 toolset has been developed to increase access to a larger realm of the most commonly implemented data formats used by the coastal science community. New and enhanced functionalities that upgrade COAST to COAST 2.0 include the development of the Temporal Visualization Tool (TVT) plug-in, the Recursive Online Remote Data-Data Mapper (RECORD-DM) utility, the Import Data Tool (IDT), and the Add Points Tool (APT). With these improvements, users can integrate their own data with other data sources, and visualize the resulting layers of different data types (such as spatial and spectral, for simultaneous visual analysis), and visualize temporal changes in areas of interest.

  7. Scattering from phase-separated vesicles. I. An analytical form factor for multiple static domains

    DOE PAGES

    Heberle, Frederick A.; Anghel, Vinicius N. P.; Katsaras, John

    2015-08-18

    This is the first in a series of studies considering elastic scattering from laterally heterogeneous lipid vesicles containing multiple domains. Unique among biophysical tools, small-angle neutron scattering can in principle give detailed information about the size, shape and spatial arrangement of domains. A general theory for scattering from laterally heterogeneous vesicles is presented, and the analytical form factor for static domains with arbitrary spatial configuration is derived, including a simplification for uniformly sized round domains. The validity of the model, including series truncation effects, is assessed by comparison with simulated data obtained from a Monte Carlo method. Several aspects of the analytical solution for scattering intensity are discussed in the context of small-angle neutron scattering data, including the effect of varying domain size and number, as well as solvent contrast. Finally, the analysis indicates that effects of domain formation are most pronounced when the vesicle's average scattering length density matches that of the surrounding solvent.

  8. Evaluation of analytical performance based on partial order methodology.

    PubMed

    Carlsen, Lars; Bruggemann, Rainer; Kenessova, Olga; Erzhigitov, Erkin

    2015-01-01

    Classical measurements of performance are typically based on linear scales. However, in analytical chemistry a simple scale may not be sufficient to analyze analytical performance appropriately. Here partial order methodology can be helpful. Within the context described here, partial order analysis can be seen as an ordinal analysis of data matrices, especially to simplify the relative comparisons of objects according to their data profiles (the ordered set of values an object has). Hence, partial order methodology offers a unique possibility to evaluate analytical performance. In the present work, data as provided by laboratories through interlaboratory comparisons or proficiency testing are used as an illustrative example. However, the presented scheme is likewise applicable to the comparison of analytical methods or simply as a tool for the optimization of an analytical method. The methodology can be applied without presumptions or pretreatment of the analytical data in order to evaluate the analytical performance taking into account all indicators simultaneously, thus elucidating a "distance" from the true value. In the present illustrative example it is assumed that the laboratories analyze a given sample several times and subsequently report the mean value, the standard deviation and the skewness, which are used simultaneously for the evaluation of the analytical performance. The analyses lead to information concerning (1) a partial ordering of the laboratories, subsequently, (2) a "distance" to the reference laboratory and (3) a classification based on the concept of "peculiar points". Copyright © 2014 Elsevier B.V. All rights reserved.
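
    The core comparison step can be sketched in a few lines: each laboratory is described by an indicator profile, and one laboratory precedes another in the partial order only if it is at least as good on every indicator. The laboratory names, indicator choices, and values below are invented for illustration, and a full evaluation would additionally draw the Hasse diagram and identify peculiar points.

```python
# Invented laboratory profiles: (|bias from reference|, standard deviation, |skewness|),
# where lower is better for every indicator.
labs = {
    "Lab A": (0.10, 0.05, 0.2),
    "Lab B": (0.12, 0.04, 0.1),
    "Lab C": (0.30, 0.20, 0.6),
    "Lab D": (0.08, 0.25, 0.1),
}

def dominates(p, q):
    """p precedes q in the product order: at least as good everywhere, better somewhere."""
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

comparable = [(x, y) for x in labs for y in labs
              if x != y and dominates(labs[x], labs[y])]
incomparable = [(x, y) for x in labs for y in labs
                if x < y and not dominates(labs[x], labs[y])
                and not dominates(labs[y], labs[x])]

print("order relations   :", comparable)
print("incomparable pairs:", incomparable)
```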

  9. Optical Micromachining

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Under an SBIR (Small Business Innovation Research) contract with Marshall Space Flight Center, Potomac Photonics, Inc., constructed and demonstrated a unique tool that fills a need in the area of diffractive and refractive micro-optics. It is an integrated computer-aided design and computer-aided micro-machining workstation that will extend the benefits of diffractive and micro-optic technology to optical designers. Applications of diffractive optics include sensors and monitoring equipment, analytical instruments, and fiber optic distribution and communication. The company has been making diffractive elements with the system as a commercial service for the last year.

  10. Individual Human Cell Responses to Low Doses of Chemicals and Radiation Studied by Synchrotron Infrared Spectromicroscopy

    NASA Astrophysics Data System (ADS)

    Martin, Michael C.; Holman, Hoi-Ying N.; Blakely, Eleanor A.; Goth-Goldstein, Regine; McKinney, Wayne R.

    2000-03-01

    Vibrational spectroscopy, when combined with synchrotron radiation-based (SR) microscopy, is a powerful new analytical tool with high spatial resolution for detecting biochemical changes in individual living cells. In contrast to other microscopy methods that require fixing, drying, staining or labeling, SR FTIR microscopy probes intact living cells providing a composite view of all of the molecular responses and the ability to monitor the responses over time in the same cell. Observed spectral changes include all types of lesions induced in that cell as well as cellular responses to external and internal stresses. These spectral changes combined with other analytical tools may provide a fundamental understanding of the key molecular mechanisms induced in response to stresses created by low-doses of radiation and chemicals. In this study we used high spatial-resolution SR FTIR vibrational spectromicroscopy at ALS Beamline 1.4.3 as a sensitive analytical tool to detect chemical- and radiation-induced changes in individual human cells. Our preliminary spectral measurements indicate that this technique is sensitive enough to detect changes in nucleic acids and proteins of cells treated with environmentally relevant concentrations of oxidative stresses: bleomycin, hydrogen peroxide, and X-rays. We observe spectral changes that are unique to each exogenous stressor. This technique has the potential to distinguish changes from exogenous or endogenous oxidative processes. Future development of this technique will allow rapid monitoring of cellular processes such as drug metabolism, early detection of disease, bio-compatibility of implant materials, cellular repair mechanisms, self assembly of cellular apparatus, cell differentiation and fetal development.

  11. Rheoencephalographic and electroencephalographic measures of cognitive workload: analytical procedures.

    PubMed

    Montgomery, L D; Montgomery, R W; Guisado, R

    1995-05-01

    This investigation demonstrates the feasibility of mental workload assessment by rheoencephalographic (REG) and multichannel electroencephalographic (EEG) monitoring. During the performance of this research, unique testing, analytical and display procedures were developed for REG and EEG monitoring that extend the current state of the art and provide valuable tools for the study of cerebral circulatory and neural activity during cognition. REG records are analyzed to provide indices of the right and left hemisphere hemodynamic changes that take place during each test sequence. The EEG data are modeled using regression techniques and mathematically transformed to provide energy-density distributions of the scalp electrostatic field. These procedures permit concurrent REG/EEG cognitive testing not possible with current techniques. The introduction of a system for recording and analysis of cognitive REG/EEG test sequences facilitates the study of learning and memory disorders, dementia and other encephalopathies.

  12. Rheoencephalographic and electroencephalographic measures of cognitive workload: analytical procedures

    NASA Technical Reports Server (NTRS)

    Montgomery, L. D.; Montgomery, R. W.; Guisado, R.

    1995-01-01

    This investigation demonstrates the feasibility of mental workload assessment by rheoencephalographic (REG) and multichannel electroencephalographic (EEG) monitoring. During the performance of this research, unique testing, analytical and display procedures were developed for REG and EEG monitoring that extend the current state of the art and provide valuable tools for the study of cerebral circulatory and neural activity during cognition. REG records are analyzed to provide indices of the right and left hemisphere hemodynamic changes that take place during each test sequence. The EEG data are modeled using regression techniques and mathematically transformed to provide energy-density distributions of the scalp electrostatic field. These procedures permit concurrent REG/EEG cognitive testing not possible with current techniques. The introduction of a system for recording and analysis of cognitive REG/EEG test sequences facilitates the study of learning and memory disorders, dementia and other encephalopathies.

  13. A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.

    PubMed

    Płotka-Wasylka, J

    2018-05-01

    A new means for assessing analytical protocols relating to green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination, and was created using such tools as the National Environmental Methods Index (NEMI) or the Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol with five pentagrams can be used to evaluate and quantify the environmental impact involved in each step of an analytical methodology, shaded from green through yellow to red to depict low, medium, and high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples, and in polycyclic aromatic hydrocarbon determination by EPA methods. The GAPI tool not only provides an immediately perceptible perspective to the user/reader but also offers exhaustive information on the evaluated procedures. Copyright © 2018 Elsevier B.V. All rights reserved.

  14. Deciphering Phosphotyrosine-Dependent Signaling Networks in Cancer by SH2 Profiling

    PubMed Central

    Machida, Kazuya; Khenkhar, Malik

    2012-01-01

    It has been a decade since the introduction of SH2 profiling, a modular domain-based molecular diagnostics tool. This review covers the original concept of SH2 profiling, different analytical platforms, and their applications, from the detailed analysis of single proteins to broad screening in translational research. Illustrated by practical examples, we discuss the uniqueness and advantages of the approach as well as its limitations and challenges. We provide guidance for basic researchers and oncologists who may consider SH2 profiling in their respective cancer research, especially for those focusing on tyrosine phosphoproteomics. SH2 profiling can serve as an alternative phosphoproteomics tool to dissect aberrant tyrosine kinase pathways responsible for individual malignancies, with the goal of facilitating personalized diagnostics for the treatment of cancer. PMID:23226573

  15. Visualizing the BEC-BCS crossover in a two-dimensional Fermi gas: Pairing gaps and dynamical response functions from ab initio computations

    NASA Astrophysics Data System (ADS)

    Vitali, Ettore; Shi, Hao; Qin, Mingpu; Zhang, Shiwei

    2017-12-01

    Experiments with ultracold atoms provide a highly controllable laboratory setting with many unique opportunities for precision exploration of quantum many-body phenomena. The nature of such systems, with strong interaction and quantum entanglement, makes reliable theoretical calculations challenging. Especially difficult are excitation and dynamical properties, which are often the most directly relevant to experiment. We carry out exact numerical calculations, by Monte Carlo sampling of imaginary-time propagation of Slater determinants, to compute the pairing gap in the two-dimensional Fermi gas from first principles. Applying state-of-the-art analytic continuation techniques, we obtain the spectral function and the density and spin structure factors providing unique tools to visualize the BEC-BCS crossover. These quantities will allow for a direct comparison with experiments.

  16. Studying Behaviors Among Neurosurgery Residents Using Web 2.0 Analytic Tools.

    PubMed

    Davidson, Benjamin; Alotaibi, Naif M; Guha, Daipayan; Amaral, Sandi; Kulkarni, Abhaya V; Lozano, Andres M

    Web 2.0 technologies (e.g., blogs, social networks, and wikis) are increasingly being used by medical schools and postgraduate training programs as tools for information dissemination. These technologies offer the unique opportunity to track metrics of user engagement and interaction. Here, we employ Web 2.0 tools to assess academic behaviors among neurosurgery residents. We performed a retrospective review of all educational lectures, part of the core Neurosurgery Residency curriculum at the University of Toronto, posted on our teaching website (www.TheBrainSchool.net). Our website was developed using publicly available Web 2.0 platforms. Lecture usage was assessed by the number of clicks, and associations were explored with lecturer academic position, timing of examinations, and lecture/subspecialty topic. The overall number of clicks on 77 lectures was 1079. Most of these clicks were occurring during the in-training examination month (43%). Click numbers were significantly higher on lectures presented by faculty (mean = 18.6, standard deviation ± 4.1) compared to those delivered by residents (mean = 8.4, standard deviation ± 2.1) (p = 0.031). Lectures covering topics in functional neurosurgery received the most clicks (47%), followed by pediatric neurosurgery (22%). This study demonstrates the value of Web 2.0 analytic tools in examining resident study behavior. Residents tend to "cram" by downloading lectures in the same month of training examinations and display a preference for faculty-delivered lectures. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  17. FAST TRACK COMMUNICATION: Uniqueness of static black holes without analyticity

    NASA Astrophysics Data System (ADS)

    Chruściel, Piotr T.; Galloway, Gregory J.

    2010-08-01

    We show that the hypothesis of analyticity in the uniqueness theory of vacuum, or electrovacuum, static black holes is not needed. More generally, we show that prehorizons covering a closed set cannot occur in well-behaved domains of outer communications.

  18. Protein-centric N-glycoproteomics analysis of membrane and plasma membrane proteins.

    PubMed

    Sun, Bingyun; Hood, Leroy

    2014-06-06

    The advent of proteomics technology has transformed our understanding of biological membranes. The challenges for studying membrane proteins have inspired the development of many analytical and bioanalytical tools, and the techniques of glycoproteomics have emerged as an effective means to enrich and characterize membrane and plasma-membrane proteomes. This Review summarizes the development of various glycoproteomics techniques to overcome the hurdles formed by the unique structures and behaviors of membrane proteins with a focus on N-glycoproteomics. Example contributions of N-glycoproteomics to the understanding of membrane biology are provided, and the areas that require future technical breakthroughs are discussed.

  19. Thermodynamics of Gas Turbine Cycles with Analytic Derivatives in OpenMDAO

    NASA Technical Reports Server (NTRS)

    Gray, Justin; Chin, Jeffrey; Hearn, Tristan; Hendricks, Eric; Lavelle, Thomas; Martins, Joaquim R. R. A.

    2016-01-01

    A new equilibrium thermodynamics analysis tool was built based on the CEA method using the OpenMDAO framework. The new tool provides forward and adjoint analytic derivatives for use with gradient based optimization algorithms. The new tool was validated against the original CEA code to ensure an accurate analysis and the analytic derivatives were validated against finite-difference approximations. Performance comparisons between analytic and finite difference methods showed a significant speed advantage for the analytic methods. To further test the new analysis tool, a sample optimization was performed to find the optimal air-fuel equivalence ratio maximizing combustion temperature for a range of different pressures. Collectively, the results demonstrate the viability of the new tool to serve as the thermodynamic backbone for future work on a full propulsion modeling tool.
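
    For readers unfamiliar with how analytic derivatives are supplied in OpenMDAO, the sketch below shows the general pattern: a component declares its partial derivatives, implements compute_partials, and a gradient-based driver exploits them. The "combustion temperature" function, component name, and bounds are invented placeholders and not the CEA-based thermodynamics tool described above; only the OpenMDAO API usage (version 3.x, which handles unconnected design variables automatically) is intended to be representative.

```python
import openmdao.api as om

class Combustor(om.ExplicitComponent):
    """Toy component: temperature as a smooth function of equivalence ratio,
    peaking near 1.0 (purely illustrative physics, not CEA)."""

    def setup(self):
        self.add_input('phi', val=0.5)
        self.add_output('T', val=0.0)
        self.declare_partials('T', 'phi')      # analytic partial supplied below

    def compute(self, inputs, outputs):
        outputs['T'] = 2200.0 - 1500.0 * (inputs['phi'] - 1.0) ** 2

    def compute_partials(self, inputs, partials):
        partials['T', 'phi'] = -3000.0 * (inputs['phi'] - 1.0)

prob = om.Problem()
prob.model.add_subsystem('comb', Combustor(), promotes=['*'])
prob.model.add_design_var('phi', lower=0.2, upper=2.0)
prob.model.add_objective('T', scaler=-1.0)     # negative scaler -> maximize T

prob.driver = om.ScipyOptimizeDriver()
prob.driver.options['optimizer'] = 'SLSQP'

prob.setup()
prob.run_driver()
print(prob.get_val('phi'), prob.get_val('T'))
```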

  20. A new X-ray fluorescence spectroscopy for extraterrestrial materials using a muon beam

    PubMed Central

    Terada, K.; Ninomiya, K.; Osawa, T.; Tachibana, S.; Miyake, Y.; Kubo, M. K.; Kawamura, N.; Higemoto, W.; Tsuchiyama, A.; Ebihara, M.; Uesugi, M.

    2014-01-01

    The recent development of the intense pulsed muon source at J-PARC MUSE, Japan Proton Accelerator Research Complex/MUon Science Establishment (10⁶ s⁻¹ for a momentum of 60 MeV/c), enabled us to pioneer a new frontier in analytical sciences. Here, we report a non-destructive elemental analysis using µ⁻ capture. Controlling muon momentum from 32.5 to 57.5 MeV/c, we successfully demonstrate a depth-profile analysis of light elements (B, C, N, and O) from several mm-thick layered materials and non-destructive bulk analyses of meteorites containing organic materials. Muon beam analysis, enabling a bulk analysis of light to heavy elements without severe radioactivation, is a unique analytical method complementary to other non-destructive analyses. Furthermore, this technology can be used as a powerful tool to identify the content and distribution of organic components in future asteroidal return samples. PMID:24861282

  1. Visual analysis of large heterogeneous social networks by semantic and structural abstraction.

    PubMed

    Shen, Zeqian; Ma, Kwan-Liu; Eliassi-Rad, Tina

    2006-01-01

    Social network analysis is an active area of study beyond sociology. It uncovers the invisible relationships between actors in a network and provides understanding of social processes and behaviors. It has become an important technique in a variety of application areas such as the Web, organizational studies, and homeland security. This paper presents a visual analytics tool, OntoVis, for understanding large, heterogeneous social networks, in which nodes and links could represent different concepts and relations, respectively. These concepts and relations are related through an ontology (also known as a schema). OntoVis is named such because it uses information in the ontology associated with a social network to semantically prune a large, heterogeneous network. In addition to semantic abstraction, OntoVis also allows users to do structural abstraction and importance filtering to make large networks manageable and to facilitate analytic reasoning. All these unique capabilities of OntoVis are illustrated with several case studies.
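
    The semantic-abstraction idea, pruning a heterogeneous network down to the ontology concepts of interest, can be sketched with a small attributed graph. The node names, types, and edges below are invented, and this is not OntoVis itself; it only illustrates filtering a typed graph to an induced subgraph of selected node types.

```python
import networkx as nx

# Toy heterogeneous network with an ontology "type" attribute on every node.
G = nx.Graph()
G.add_nodes_from([
    ("alice", {"type": "person"}), ("bob", {"type": "person"}),
    ("acme", {"type": "organization"}), ("paper42", {"type": "document"}),
])
G.add_edges_from([("alice", "bob"), ("alice", "acme"), ("bob", "paper42")])

# Semantic abstraction: keep only the concepts of interest and the edges among them.
keep_types = {"person", "organization"}
pruned = G.subgraph(n for n, d in G.nodes(data=True) if d["type"] in keep_types)

print(list(pruned.nodes()), list(pruned.edges()))
```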

  2. A new X-ray fluorescence spectroscopy for extraterrestrial materials using a muon beam.

    PubMed

    Terada, K; Ninomiya, K; Osawa, T; Tachibana, S; Miyake, Y; Kubo, M K; Kawamura, N; Higemoto, W; Tsuchiyama, A; Ebihara, M; Uesugi, M

    2014-05-27

    The recent development of the intense pulsed muon source at J-PARC MUSE, Japan Proton Accelerator Research Complex/MUon Science Establishment (10(6) s(-1) for a momentum of 60 MeV/c), enabled us to pioneer a new frontier in analytical sciences. Here, we report a non-destructive elemental analysis using µ(-) capture. Controlling muon momentum from 32.5 to 57.5 MeV/c, we successfully demonstrate a depth-profile analysis of light elements (B, C, N, and O) from several mm-thick layered materials and non-destructive bulk analyses of meteorites containing organic materials. Muon beam analysis, enabling a bulk analysis of light to heavy elements without severe radioactivation, is a unique analytical method complementary to other non-destructive analyses. Furthermore, this technology can be used as a powerful tool to identify the content and distribution of organic components in future asteroidal return samples.

  3. First GIS Analysis of Modern Stone Tools Used by Wild Chimpanzees (Pan troglodytes verus) in Bossou, Guinea, West Africa

    PubMed Central

    Arroyo, Adrian; Matsuzawa, Tetsuro; de la Torre, Ignacio

    2015-01-01

    Stone tool use by wild chimpanzees of West Africa offers a unique opportunity to explore the evolutionary roots of technology during human evolution. However, detailed analyses of chimpanzee stone artifacts are still lacking, thus precluding a comparison with the earliest archaeological record. This paper presents the first systematic study of stone tools used by wild chimpanzees to crack open nuts in Bossou (Guinea-Conakry), and applies pioneering analytical techniques to such artifacts. Automatic morphometric GIS classification enabled the creation of maps of use wear over the stone tools (anvils, hammers, and hammers/anvils), which were blind tested against GIS spatial analysis of damage patterns identified visually. Our analysis shows that chimpanzee stone tool use wear can be systematized and specific damage patterns discerned, allowing discrimination between active and passive pounders in lithic assemblages. In summary, our results demonstrate the heuristic potential of combined suites of GIS techniques for the analysis of battered artifacts, and have enabled the creation of a referential framework of analysis in which wild chimpanzee battered tools can for the first time be directly compared to the early archaeological record. PMID:25793642

  4. Investigating Analytic Tools for e-Book Design in Early Literacy Learning

    ERIC Educational Resources Information Center

    Roskos, Kathleen; Brueck, Jeremy; Widman, Sarah

    2009-01-01

    Toward the goal of better e-book design to support early literacy learning, this study investigates analytic tools for examining design qualities of e-books for young children. Three research-based analytic tools related to e-book design were applied to a mixed genre collection of 50 e-books from popular online sites. Tool performance varied…

  5. Toxic genes present a unique phylogenetic signature.

    PubMed

    Avni, Eliran; Snir, Sagi

    2017-11-01

    Horizontal gene transfer (HGT) is a major part of the evolution of Archaea and Bacteria, to the extent that the validity of the Tree of Life concept for prokaryotes has been seriously questioned. The patterns and routes of HGT remain a subject of intense study and debate. It was discovered that while several genes exhibit rampant HGT across the whole prokaryotic tree of life, others are lethal to certain organisms and therefore cannot be successfully transferred to them. We distinguish between these two classes of genes and show analytically that genes found to be toxic to a specific species (E. coli) also resist HGT in general. Several tools we employ provide evidence to support that claim. One of those tools is the quartet plurality distribution (QPD), a mathematical tool that measures the tendency toward HGT over a large set of genes and species. When aggregated over a collection of genes, it can reveal important properties of this collection. We conclude that evidence of the toxicity of certain genes to a wide variety of prokaryotes is revealed using the new tool of quartet plurality distribution. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. Microchip-Based Single-Cell Functional Proteomics for Biomedical Applications

    PubMed Central

    Lu, Yao; Yang, Liu; Wei, Wei; Shi, Qihui

    2017-01-01

    Cellular heterogeneity has been widely recognized, but only recently have single-cell tools become available that allow characterizing heterogeneity at the genomic and proteomic levels. We review the technological advances in microchip-based toolkits for single-cell functional proteomics. Each of these tools has distinct advantages and limitations, and a few have advanced toward being applied to address biological or clinical problems that fail to be addressed by traditional population-based methods. High-throughput single-cell proteomic assays generate high-dimensional data sets that contain new information and thus require developing new analytical frameworks to extract new biology. In this review article, we highlight a few biological and clinical applications in which the microchip-based single-cell proteomic tools provide unique advantages. The examples include resolving functional heterogeneity and dynamics of immune cells, dissecting cell-cell interaction by creating well-controlled on-chip microenvironments, capturing high-resolution snapshots of immune system functions in patients for better immunotherapy and elucidating phosphoprotein signaling networks in cancer cells for guiding effective molecularly targeted therapies. PMID:28280819

  7. 33 CFR 385.33 - Revisions to models and analytical tools.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Management District, and other non-Federal sponsors shall rely on the best available science including models..., and assessment of projects. The selection of models and analytical tools shall be done in consultation... system-wide simulation models and analytical tools used in the evaluation and assessment of projects, and...

  8. Trends in access of plant biodiversity data revealed by Google Analytics

    PubMed Central

    Baxter, David G.; Hagedorn, Gregor; Legler, Ben; Gilbert, Edward; Thiele, Kevin; Vargas-Rodriguez, Yalma; Urbatsch, Lowell E.

    2014-01-01

    Abstract The amount of plant biodiversity data available via the web has exploded in the last decade, but making these data available requires a considerable investment of time and work, both vital considerations for organizations and institutions looking to validate the impact factors of these online works. Here we used Google Analytics (GA) to measure the value of this digital presence. In this paper we examine usage trends using 15 different GA accounts, spread across 451 institutions or botanical projects that comprise over five percent of the world's herbaria. Usage was studied both over a single year and over each account's full history. User data from the sample reveal: 1) over 17 million web sessions, 2) on five primary operating systems, 3) that search and direct traffic dominate, with minimal impact from social media, 4) that mobile and new device types have doubled each year for the past three years, 5) and that web browsers, the tools we use to interact with the web, are changing. Server-side analytics differ from site to site, making the comparison of their data sets difficult. However, use of Google Analytics erases the reporting heterogeneity of unique server-side analytics, as they can now be examined with a standard that provides clarity for data-driven decisions. The knowledge gained here empowers any collection-based environment, regardless of size, with metrics about usability, design, and possible directions for future development. PMID:25425933

  9. Trends in access of plant biodiversity data revealed by Google Analytics.

    PubMed

    Jones, Timothy Mark; Baxter, David G; Hagedorn, Gregor; Legler, Ben; Gilbert, Edward; Thiele, Kevin; Vargas-Rodriguez, Yalma; Urbatsch, Lowell E

    2014-01-01

    The amount of plant biodiversity data available via the web has exploded in the last decade, but making these data available requires a considerable investment of time and work, both vital considerations for organizations and institutions looking to validate the impact factors of these online works. Here we used Google Analytics (GA) to measure the value of this digital presence. In this paper we examine usage trends using 15 different GA accounts, spread across 451 institutions or botanical projects that comprise over five percent of the world's herbaria. Usage was studied both over a single year and over each account's full history. User data from the sample reveal: 1) over 17 million web sessions, 2) on five primary operating systems, 3) that search and direct traffic dominate, with minimal impact from social media, 4) that mobile and new device types have doubled each year for the past three years, 5) and that web browsers, the tools we use to interact with the web, are changing. Server-side analytics differ from site to site, making the comparison of their data sets difficult. However, use of Google Analytics erases the reporting heterogeneity of unique server-side analytics, as they can now be examined with a standard that provides clarity for data-driven decisions. The knowledge gained here empowers any collection-based environment, regardless of size, with metrics about usability, design, and possible directions for future development.

  10. Developing Healthcare Data Analytics APPs with Open Data Science Tools.

    PubMed

    Hao, Bibo; Sun, Wen; Yu, Yiqin; Xie, Guotong

    2017-01-01

    Recent advances in big data analytics provide more flexible, efficient, and open tools for researchers to gain insight from healthcare data. However, many of these tools require researchers to develop programs in languages such as Python or R, a skill set that many researchers in the healthcare data analytics area do not have. To make data science more approachable, we explored existing tools and developed a practice that can help data scientists convert existing analytics pipelines to user-friendly analytics APPs with rich interactions and features of real-time analysis. With this practice, data scientists can develop customized analytics pipelines as APPs in Jupyter Notebook and disseminate them to other researchers easily, and researchers can benefit from the shared notebooks to perform analysis tasks or reproduce research results much more easily.
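
    One common way to turn a notebook analysis into a lightweight interactive APP, in the spirit described above, is to wrap an analysis function in ipywidgets controls. The sketch below uses an invented toy readmission table and a hypothetical readmission_rate function; it is only a generic illustration of the pattern (not the authors' practice) and requires a running Jupyter environment to render the slider.

```python
import pandas as pd
import ipywidgets as widgets

# Invented toy cohort data
df = pd.DataFrame({"age": [34, 61, 47, 70, 55],
                   "readmitted": [0, 1, 0, 1, 1]})

def readmission_rate(min_age=18):
    """Analysis step exposed to the user: filter the cohort and report a rate."""
    cohort = df[df["age"] >= min_age]
    print(f"n = {len(cohort)}, readmission rate = {cohort['readmitted'].mean():.2f}")

# A slider re-runs the analysis whenever the user moves it.
widgets.interact(readmission_rate,
                 min_age=widgets.IntSlider(min=18, max=90, value=50))
```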

  11. Differentiation of Illusory and True Halo in Writing Scores

    ERIC Educational Resources Information Center

    Lai, Emily R.; Wolfe, Edward W.; Vickers, Daisy

    2015-01-01

    This report summarizes an empirical study that addresses two related topics within the context of writing assessment--illusory halo and how much unique information is provided by multiple analytic scores. Specifically, we address the issue of whether unique information is provided by analytic scores assigned to student writing, beyond what is…

  12. Factors Influencing Beliefs for Adoption of a Learning Analytics Tool: An Empirical Study

    ERIC Educational Resources Information Center

    Ali, Liaqat; Asadi, Mohsen; Gasevic, Dragan; Jovanovic, Jelena; Hatala, Marek

    2013-01-01

    Present research and development offer various learning analytics tools providing insights into different aspects of learning processes. Adoption of a specific tool for practice is based on how its learning analytics are perceived by educators to support their pedagogical and organizational goals. In this paper, we propose and empirically validate…

  13. A comparison of 1D analytical model and 3D finite element analysis with experiments for a rosen-type piezoelectric transformer.

    PubMed

    Boukazouha, F; Poulin-Vittrant, G; Tran-Huu-Hue, L P; Bavencoffe, M; Boubenider, F; Rguiti, M; Lethiecq, M

    2015-07-01

    This article is dedicated to the study of Piezoelectric Transformers (PTs), which offer promising solutions to the increasing need for integrated power electronics modules within autonomous systems. The advantages offered by such transformers include: immunity to electromagnetic disturbances; ease of miniaturisation, for example using conventional microfabrication processes; and enhanced performance in terms of voltage gain and power efficiency. Central to the adequate description of such transformers is the need for complex analytical modeling tools, especially if one is attempting to include combined contributions due to (i) mechanical phenomena owing to the propagation modes that differ at the primary and secondary sides of the PT; and (ii) electrical phenomena such as the voltage gain and power efficiency, which depend on the electrical load. The present work demonstrates an original one-dimensional (1D) analytical model dedicated to a Rosen-type PT, and simulation results are successively compared against those of a three-dimensional (3D) Finite Element Analysis (COMSOL Multiphysics software) and experimental results. The Rosen-type PT studied here is based on a single layer of soft PZT (P191) with dimensions 18 mm × 3 mm × 1.5 mm, operated at the second harmonic of 176 kHz. Detailed simulation and experimental results show that the presented 1D model predicts the experimentally measured voltage gain to within 10% error at the second and third resonance modes. Adjustment of the analytical model parameters is found to decrease errors relative to the experimental voltage gain to within 1%, whilst a 2.5% error on the output admittance magnitude at the second resonance mode was obtained. Relying on the single assumption of one-dimensionality, the present analytical model appears to be a useful tool for Rosen-type PT design and behavior understanding. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. Hasse diagram as a green analytical metrics tool: ranking of methods for benzo[a]pyrene determination in sediments.

    PubMed

    Bigus, Paulina; Tsakovski, Stefan; Simeonov, Vasil; Namieśnik, Jacek; Tobiszewski, Marek

    2016-05-01

    This study presents an application of the Hasse diagram technique (HDT) as an assessment tool to select the most appropriate analytical procedures according to their greenness or the best analytical performance. The dataset consists of analytical procedures for benzo[a]pyrene determination in sediment samples, described by 11 variables concerning their greenness and analytical performance. Two analyses with the HDT were performed: the first with metrological variables and the second with "green" variables as input data. The two HDT analyses ranked different analytical procedures as the most valuable, suggesting that green analytical chemistry is not in accordance with metrology when benzo[a]pyrene in sediment samples is determined. The HDT can thus be used as a decision support tool to choose the proper analytical procedure with respect to green analytical chemistry principles and analytical performance merits.

  15. An analysis of the influence of production conditions on the development of the microporous structure of the activated carbon fibres using the LBET method

    NASA Astrophysics Data System (ADS)

    Kwiatkowski, Mirosław

    2017-12-01

    The paper presents the results of research on the application of new analytical models of multilayer adsorption on heterogeneous surfaces combined with the unique fast multivariant identification procedure, together called the LBET method, as a tool for analysing the microporous structure of activated carbon fibres obtained from polyacrylonitrile by chemical activation using potassium and sodium hydroxides. The novel LBET method was employed in particular to evaluate the impact of the activator used and of the hydroxide-to-polyacrylonitrile ratio on the resulting microporous structure of the activated carbon fibres.

  16. Capillary electrophoresis for the analysis of contaminants in emerging food safety issues and food traceability.

    PubMed

    Vallejo-Cordoba, Belinda; González-Córdova, Aarón F

    2010-07-01

    This review presents an overview of the applicability of CE in the analysis of chemical and biological contaminants involved in emerging food safety issues. Additionally, the usefulness of CE-based genetic analyzers as a unique tool in food traceability verification systems is presented. First, analytical approaches for the determination of melamine and specific food allergens in different foods are discussed. Second, natural toxin analysis by CE is updated from the last review, reported in 2008. Finally, the analysis of prion proteins associated with the "mad cow" crisis and the application of CE-based genetic analyzers for meat traceability are summarized.

  17. Scalable Visual Analytics of Massive Textual Datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krishnan, Manoj Kumar; Bohn, Shawn J.; Cowley, Wendy E.

    2007-04-01

    This paper describes the first scalable implementation of a text processing engine used in visual analytics tools. These tools aid information analysts in interacting with and understanding large textual information content through visual interfaces. By developing a parallel implementation of the text processing engine, we enabled visual analytics tools to exploit cluster architectures and handle massive datasets. The paper describes key elements of our parallelization approach and demonstrates virtually linear scaling when processing multi-gigabyte data sets such as PubMed. This approach enables interactive analysis of large datasets beyond the capabilities of existing state-of-the-art visual analytics tools.
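
    The parallelization pattern described, partitioning a large corpus across workers and merging partial results, can be illustrated with a small data-parallel sketch. The code below is not the engine described in the paper; it assumes an invented toy corpus and uses Python's multiprocessing to compute and merge per-chunk term counts, and is meant to be run as a script.

```python
from collections import Counter
from multiprocessing import Pool

# Invented toy corpus; a real deployment would stream documents from disk or a cluster.
docs = ["signal transduction in cells", "visual analytics of text",
        "text mining of pubmed abstracts", "cells respond to signal"] * 1000

def count_terms(chunk):
    """Count term occurrences in one chunk of documents (runs in a worker process)."""
    c = Counter()
    for doc in chunk:
        c.update(doc.split())
    return c

if __name__ == "__main__":
    n_workers = 4
    chunks = [docs[i::n_workers] for i in range(n_workers)]   # round-robin partition
    with Pool(n_workers) as pool:
        partial_counts = pool.map(count_terms, chunks)
    total = sum(partial_counts, Counter())                    # merge partial results
    print(total.most_common(5))
```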

  18. Deriving Earth Science Data Analytics Tools/Techniques Requirements

    NASA Astrophysics Data System (ADS)

    Kempler, S. J.

    2015-12-01

    Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly devoid of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science datasets has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals. These goals are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  19. The generation of criteria for selecting analytical tools for landscape management

    Treesearch

    Marilyn Duffey-Armstrong

    1979-01-01

    This paper presents an approach to generating criteria for selecting the analytical tools used to assess visual resources for various landscape management tasks. The approach begins by first establishing the overall parameters for the visual assessment task, and follows by defining the primary requirements of the various sets of analytical tools to be used. Finally,...

  20. Visual analytics for aviation safety: A collaborative approach to sensemaking

    NASA Astrophysics Data System (ADS)

    Wade, Andrew

    Visual analytics, the "science of analytical reasoning facilitated by interactive visual interfaces", is more than just visualization. Understanding the human reasoning process is essential for designing effective visualization tools and providing correct analyses. This thesis describes the evolution, application and evaluation of a new method for studying analytical reasoning that we have labeled paired analysis. Paired analysis combines subject matter experts (SMEs) and tool experts (TE) in an analytic dyad, here used to investigate aircraft maintenance and safety data. The method was developed and evaluated using interviews, pilot studies and analytic sessions during an internship at the Boeing Company. By enabling a collaborative approach to sensemaking that can be captured by researchers, paired analysis yielded rich data on human analytical reasoning that can be used to support analytic tool development and analyst training. Keywords: visual analytics, paired analysis, sensemaking, boeing, collaborative analysis.

  1. MycoCosm, an Integrated Fungal Genomics Resource

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shabalov, Igor; Grigoriev, Igor

    2012-03-16

    MycoCosm is a web-based interactive fungal genomics resource, which was first released in March 2010, in response to an urgent call from the fungal community for integration of all fungal genomes and analytical tools in one place (Pan-fungal data resources meeting, Feb 21-22, 2010, Alexandria, VA). MycoCosm integrates genomics data and analysis tools to navigate through over 100 fungal genomes sequenced at JGI and elsewhere. This resource allows users to explore fungal genomes in the context of both genome-centric analysis and comparative genomics, and promotes user community participation in data submission, annotation and analysis. MycoCosm has over 4500 unique visitors/month or 35000+ visitors/year as well as hundreds of registered users contributing their data and expertise to this resource. Its scalable architecture allows significant expansion of the data expected from the JGI Fungal Genomics Program and its users, and integration with external resources used by the fungal community.

  2. Inferring subunit stoichiometry from single molecule photobleaching

    PubMed Central

    2013-01-01

    Single molecule photobleaching is a powerful tool for determining the stoichiometry of protein complexes. By attaching fluorophores to proteins of interest, the number of associated subunits in a complex can be deduced by imaging single molecules and counting fluorophore photobleaching steps. Because some bleaching steps might be unobserved, the ensemble of steps will be binomially distributed. In this work, it is shown that inferring the true composition of a complex from such data is nontrivial because binomially distributed observations present an ill-posed inference problem. That is, a unique and optimal estimate of the relevant parameters cannot be extracted from the observations. Because of this, a method has not been firmly established to quantify confidence when using this technique. This paper presents a general inference model for interpreting such data and provides methods for accurately estimating parameter confidence. The formalization and methods presented here provide a rigorous analytical basis for this pervasive experimental tool. PMID:23712552
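
    The inference problem can be made concrete with a short sketch: given observed bleaching-step counts assumed to follow a Binomial(n, p) model, the log-likelihood is profiled over candidate subunit numbers n and detection probabilities p. The step counts and parameter grids below are invented toy data, not the formalism of the paper; the sketch simply shows how similar the maximized likelihoods can be across neighboring n, which is the ill-posedness discussed above.

```python
import numpy as np
from scipy.stats import binom

# Invented observed photobleaching step counts for a set of single complexes
steps = np.array([3, 4, 2, 4, 3, 3, 4, 2, 4, 3, 1, 4])

# Profile the log-likelihood over candidate subunit counts n, maximizing over
# a grid of detection probabilities p for each n.
p_grid = np.linspace(0.4, 0.95, 56)
for n in range(steps.max(), steps.max() + 4):
    ll = np.array([binom.logpmf(steps, n, p).sum() for p in p_grid])
    print(f"n = {n}: max log-likelihood = {ll.max():.2f} at p = {p_grid[ll.argmax()]:.2f}")
```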

  3. In-Line Detection and Measurement of Molecular Contamination in Semiconductor Process Solutions

    NASA Astrophysics Data System (ADS)

    Wang, Jason; West, Michael; Han, Ye; McDonald, Robert C.; Yang, Wenjing; Ormond, Bob; Saini, Harmesh

    2005-09-01

    This paper discusses a fully automated metrology tool for detection and quantitative measurement of contamination, including cationic, anionic, metallic, organic, and molecular species present in semiconductor process solutions. The instrument is based on an electrospray ionization time-of-flight mass spectrometer (ESI-TOF/MS) platform. The tool can be used in diagnostic or analytical modes to understand process problems in addition to enabling routine metrology functions. Metrology functions include in-line contamination measurement with near real-time trend analysis. This paper discusses representative organic and molecular contamination measurement results in production process problem solving efforts. The examples include the analysis and identification of organic compounds in SC-1 pre-gate clean solution; urea, NMP (N-Methyl-2-pyrrolidone) and phosphoric acid contamination in UPW; and plasticizer and an organic sulfur-containing compound found in isopropyl alcohol (IPA). It is expected that these unique analytical and metrology capabilities will improve the understanding of the effect of organic and molecular contamination on device performance and yield. This will permit the development of quantitative correlations between contamination levels and process degradation. It is also expected that the ability to perform routine process chemistry metrology will lead to corresponding improvements in manufacturing process control and yield, the ability to avoid excursions and will improve the overall cost effectiveness of the semiconductor manufacturing process.

  4. Modeling and Simulation Tools for Heavy Lift Airships

    NASA Technical Reports Server (NTRS)

    Hochstetler, Ron; Chachad, Girish; Hardy, Gordon; Blanken, Matthew; Melton, John

    2016-01-01

    For conventional fixed wing and rotary wing aircraft a variety of modeling and simulation tools have been developed to provide designers the means to thoroughly investigate proposed designs and operational concepts. However, lighter-than-air (LTA) airships, hybrid air vehicles, and aerostats have some important aspects that are different from heavier-than-air (HTA) vehicles. In order to account for these differences, modifications are required to the standard design tools to fully characterize the LTA vehicle design and performance parameters. To address these LTA design and operational factors, LTA development organizations have created unique proprietary modeling tools, often at their own expense. An expansion of this limited LTA tool set could be accomplished by leveraging existing modeling and simulation capabilities available in the National laboratories and public research centers. Development of an expanded set of publicly available LTA modeling and simulation tools for LTA developers would mitigate the reliance on proprietary LTA design tools in use today. A set of well researched, open source, high fidelity LTA design modeling and simulation tools would advance LTA vehicle development and also provide the analytical basis for accurate LTA operational cost assessments. This paper will present the modeling and analysis tool capabilities required for LTA vehicle design, analysis of operations, and full life-cycle support. The tools currently available will be surveyed to identify the gaps between their capabilities and the LTA industry's needs. Options for development of new modeling and analysis capabilities to supplement contemporary tools will also be presented.

  5. Perspectives on making big data analytics work for oncology.

    PubMed

    El Naqa, Issam

    2016-12-01

    Oncology, with its unique combination of clinical, physical, technological, and biological data, provides an ideal case study for applying big data analytics to improve cancer treatment safety and outcomes. An oncology treatment course such as chemoradiotherapy can generate a large pool of information carrying the 5Vs hallmarks of big data. These data comprise a heterogeneous mixture of patient demographics, radiation/chemo dosimetry, multimodality imaging features, and biological markers generated over a treatment period that can span a few days to several weeks. Efforts using commercial and in-house tools are underway to facilitate data aggregation, ontology creation, sharing, visualization and varying analytics in a secure environment. However, open questions related to proper data structure representation and effective analytics tools to support oncology decision-making need to be addressed. It is recognized that oncology data constitute a mix of structured (tabulated) and unstructured (electronic documents) sources that need to be processed to facilitate searching and subsequent knowledge discovery from relational or NoSQL databases. In this context, methods based on advanced analytics and image feature extraction for oncology applications will be discussed. On the other hand, the classical p (variables) ≫ n (samples) inference problem of statistical learning is challenged in the big data realm, and this is particularly true for oncology applications where p-omics is witnessing exponential growth while the number of cancer incidences has generally plateaued over the past 5 years, leading to a quasi-linear growth in samples per patient. Within the big data paradigm, this kind of phenomenon may yield undesirable effects such as echo chamber anomalies, Yule-Simpson reversal paradox, or misleading ghost analytics. In this work, we will present these effects as they pertain to oncology and engage "small thinking" methodologies to counter them, ranging from incorporating prior knowledge and using information-theoretic techniques to modern ensemble machine learning approaches, or combinations of these. We will particularly discuss the pros and cons of different approaches to improve the mining of big data in oncology. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. Development of a new semi-analytical model for cross-borehole flow experiments in fractured media

    USGS Publications Warehouse

    Roubinet, Delphine; Irving, James; Day-Lewis, Frederick D.

    2015-01-01

    Analysis of borehole flow logs is a valuable technique for identifying the presence of fractures in the subsurface and estimating properties such as fracture connectivity, transmissivity and storativity. However, such estimation requires the development of analytical and/or numerical modeling tools that are well adapted to the complexity of the problem. In this paper, we present a new semi-analytical formulation for cross-borehole flow in fractured media that links transient vertical-flow velocities measured in one or a series of observation wells during hydraulic forcing to the transmissivity and storativity of the fractures intersected by these wells. In comparison with existing models, our approach presents major improvements in terms of computational expense and potential adaptation to a variety of fracture and experimental configurations. After derivation of the formulation, we demonstrate its application in the context of sensitivity analysis for a relatively simple two-fracture synthetic problem, as well as for field-data analysis to investigate fracture connectivity and estimate fracture hydraulic properties. These applications provide important insights regarding (i) the strong sensitivity of fracture property estimates to the overall connectivity of the system; and (ii) the non-uniqueness of the corresponding inverse problem for realistic fracture configurations.

  7. Analytical Utility of Mass Spectral Binning in Proteomic Experiments by SPectral Immonium Ion Detection (SPIID)*

    PubMed Central

    Kelstrup, Christian D.; Frese, Christian; Heck, Albert J. R.; Olsen, Jesper V.; Nielsen, Michael L.

    2014-01-01

    Unambiguous identification of tandem mass spectra is a cornerstone in mass-spectrometry-based proteomics. As the study of post-translational modifications (PTMs) by means of shotgun proteomics progresses in depth and coverage, the ability to correctly identify PTM-bearing peptides is essential, increasing the demand for advanced data interpretation. Several PTMs are known to generate unique fragment ions during tandem mass spectrometry, the so-called diagnostic ions, which unequivocally identify a given mass spectrum as related to a specific PTM. Although such ions offer tremendous analytical advantages, algorithms to decipher MS/MS spectra for the presence of diagnostic ions in an unbiased manner are currently lacking. Here, we present a systematic spectral-pattern-based approach for the discovery of diagnostic ions and new fragmentation mechanisms in shotgun proteomics datasets. The developed software tool is designed to analyze large sets of high-resolution peptide fragmentation spectra independent of the fragmentation method, instrument type, or protease employed. To benchmark the software tool, we analyzed large higher-energy collisional activation dissociation datasets of samples containing phosphorylation, ubiquitylation, SUMOylation, formylation, and lysine acetylation. Using the developed software tool, we were able to identify known diagnostic ions by comparing histograms of modified and unmodified peptide spectra. Because the investigated tandem mass spectra data were acquired with high mass accuracy, unambiguous interpretation and determination of the chemical composition for the majority of detected fragment ions was feasible. Collectively we present a freely available software tool that allows for comprehensive and automatic analysis of analogous product ions in tandem mass spectra and systematic mapping of fragmentation mechanisms related to common amino acids. PMID:24895383
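
    The histogram-comparison idea can be sketched as follows: fragment m/z values from modified and unmodified spectra are binned at high resolution, and bins strongly enriched in the modified set are flagged as candidate diagnostic ions. All spectra below are synthetic toy data with one planted recurring low-mass ion, and the binning width and pseudo-count ratio are arbitrary illustrative choices, not the scoring used by the published tool.

```python
import numpy as np

rng = np.random.default_rng(0)
unmod = rng.uniform(100, 1000, size=(200, 40))              # 200 spectra, 40 fragments each
mod = rng.uniform(100, 1000, size=(200, 40))
mod[:, 0] = 126.13 + rng.normal(0, 0.002, size=200)         # a recurring "diagnostic" ion

# High-resolution binning of all fragment m/z values from each spectrum set
bins = np.arange(100, 1000, 0.01)                            # 0.01 Th bins
h_mod, _ = np.histogram(mod.ravel(), bins=bins)
h_unmod, _ = np.histogram(unmod.ravel(), bins=bins)

# Crude pseudo-count enrichment ratio between the two histograms
enrichment = (h_mod + 1) / (h_unmod + 1)
top = np.argsort(enrichment)[-3:]
print("candidate diagnostic m/z bins:", bins[top])
```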

  8. Mid-Infrared Sensing of Organic Pollutants in Aqueous Environments

    PubMed Central

    Pejcic, Bobby; Myers, Matthew; Ross, Andrew

    2009-01-01

    The development of chemical sensors for monitoring the levels of organic pollutants in the aquatic environment has received a great deal of attention in recent decades. In particular, the mid-infrared (MIR) sensor based on attenuated total reflectance (ATR) is a promising analytical tool that has been used to detect a variety of hydrocarbon compounds (e.g., aromatics, alkyl halides, phenols) dissolved in water. It has been shown that under certain conditions the MIR-ATR sensor is capable of achieving detection limits in the 10–100 ppb concentration range. Since the infrared spectral features of every single organic molecule are unique, the sensor is highly selective, making it possible to distinguish between many different analytes simultaneously. This review paper discusses some of the parameters (e.g., membrane type, film thickness, conditioning) that dictate MIR-ATR sensor response. The performance of various chemoselective membranes used in the fabrication of the sensor is evaluated. Some of the challenges associated with long-term environmental monitoring are also discussed. PMID:22454582

  9. Nanopipettes as Monitoring Probes for the Single Living Cell: State of the Art and Future Directions in Molecular Biology.

    PubMed

    Bulbul, Gonca; Chaves, Gepoliano; Olivier, Joseph; Ozel, Rifat Emrah; Pourmand, Nader

    2018-06-06

    Examining the behavior of a single cell within its natural environment is valuable for understanding both the biological processes that control the function of cells and how injury or disease leads to pathological change of their function. Single-cell analysis can reveal information regarding the causes of genetic changes, and it can contribute to studies on the molecular basis of cell transformation and proliferation. By contrast, whole tissue biopsies can only yield information on a statistical average of several processes occurring in a population of different cells. Electrowetting within a nanopipette provides a nanobiopsy platform for the extraction of cellular material from single living cells. Additionally, functionalized nanopipette sensing probes can differentiate analytes based on their size, shape or charge density, making the technology uniquely suited to sensing changes in single-cell dynamics. In this review, we highlight the potential of nanopipette technology as a non-destructive analytical tool to monitor single living cells, with particular attention to integration into applications in molecular biology.

  10. Biosensors and their applications in detection of organophosphorus pesticides in the environment.

    PubMed

    Hassani, Shokoufeh; Momtaz, Saeideh; Vakhshiteh, Faezeh; Maghsoudi, Armin Salek; Ganjali, Mohammad Reza; Norouzi, Parviz; Abdollahi, Mohammad

    2017-01-01

    This review discusses past and recent advancements in biosensors for the detection of organophosphorus pesticides (OPs), given their widespread use during the last decades. Apart from their agricultural benefits, OPs also impose adverse toxicological effects on animal and human populations. Conventional approaches such as chromatographic techniques used for pesticide detection are associated with several limitations. Biosensor technology is unique in its detection sensitivity, selectivity, remarkable performance capabilities, simplicity, on-site operation, and ease of fabrication and incorporation of nanomaterials. This study also provides specifications of most OP biosensors reported to date, organized by their transducer system. In addition, we highlight the application of advanced complementary materials and analysis techniques in OP detection systems. The availability of these new materials, together with new sensing techniques, has led to the introduction of easy-to-use analytical tools of high sensitivity and specificity in the design and construction of OP biosensors. In this review, we elaborate on the achievements in sensing systems concerning innovative nanomaterials and analytical techniques, with emphasis on OPs.

  11. Membrane-based lateral flow immunochromatographic strip with nanoparticles as reporters for detection: A review.

    PubMed

    Huang, Xiaolin; Aguilar, Zoraida P; Xu, Hengyi; Lai, Weihua; Xiong, Yonghua

    2016-01-15

    The membrane-based lateral flow immunochromatographic strip (LFICS) is widely used in various fields because of its simplicity, rapidity (detection within 10 min), and low cost. However, early designs of membrane-based LFICS for preliminary screening only provide qualitative ("yes/no" signal) or semi-quantitative results without quantitative information. These designs often suffer from low signal intensity and poor sensitivity and are only capable of single-analyte detection rather than simultaneous detection of multiple analytes. The detection performance of LFICS has been considerably improved by incorporating different kinds of nanoparticles (NPs) as reporters. NPs can serve as alternative labels and improve the analytical sensitivity or limit of detection of LFICS because of their unique properties, such as optical absorption, fluorescence spectra, and magnetic properties. The controlled manipulation of NPs allows simultaneous or multiplexed detection with membrane-based LFICS. In this review, we discuss how colored (e.g., colloidal gold, carbon, and colloidal selenium NPs), luminescent (e.g., quantum dots, up-converting phosphor NPs, and dye-doped NPs), and magnetic NPs are integrated into membrane-based LFICS for the detection of target analytes. Gold NPs are also featured because of their wide applications. Different types and unique properties of NPs are briefly explained. This review focuses on examples of NP-based LFICS to illustrate novel concepts in various devices with potential applications as screening tools. This review also highlights the superiority of NP-based approaches over existing conventional strategies for clinical analysis, food safety, and environmental monitoring. The paper concludes with a short section on future research trends regarding NP-based LFICS. Copyright © 2015 Elsevier B.V. All rights reserved.

  12. WetDATA Hub: Democratizing Access to Water Data to Accelerate Innovation through Data Visualization, Predictive Analytics and Artificial Intelligence Applications

    NASA Astrophysics Data System (ADS)

    Sarni, W.

    2017-12-01

    Water scarcity and poor quality impact economic development, business growth, and social well-being. Water has become, in our generation, the foremost critical local, regional, and global issue of our time. Despite these needs, there is no water hub or water technology accelerator solely dedicated to water data and tools. The public and private sectors need vastly improved data management and visualization tools. This is the WetDATA opportunity - to develop a water data tech hub dedicated to water data acquisition, analytics, and visualization tools for informed policy and business decisions. WetDATA's tools will help incubate disruptive water data technologies and accelerate adoption of current water data solutions. WetDATA is a Colorado-based (501c3), global hub for water data analytics and technology innovation. WetDATA's vision is to be a global leader in water information and data technology innovation and to collaborate with other US and global water technology hubs. ROADMAP: * Portal (www.wetdata.org) to provide stakeholders with tools and resources to understand related water risks. * Initial activities to provide education, awareness, and tools to stakeholders to support implementation of the Colorado State Water Plan. * Leverage the Western States Water Council Water Data Exchange database. * Development of visualization, predictive analytics, and AI tools to engage stakeholders and provide actionable data and information. TOOLS: Education - information on water issues and risks at the local, state, national, and global scale. Visualizations - data analytics and visualization tools based upon the 2030 Water Resources Group methodology to support implementation of the Colorado State Water Plan. Predictive Analytics - publicly available water databases combined with machine learning to develop water availability forecasting tools, plus time-lapse images to support city and urban planning.

  13. Analytical Tools in School Finance Reform.

    ERIC Educational Resources Information Center

    Johns, R. L.

    This paper discusses the problem of analyzing variations in the educational opportunities provided by different school districts and describes how to assess the impact of school finance alternatives through use of various analytical tools. The author first examines relatively simple analytical methods, including calculation of per-pupil…

  14. Employing socially driven techniques for framing, contextualization, and collaboration in complex analytical threads

    NASA Astrophysics Data System (ADS)

    Wollocko, Arthur; Danczyk, Jennifer; Farry, Michael; Jenkins, Michael; Voshell, Martin

    2015-05-01

    The proliferation of sensor technologies continues to impact Intelligence Analysis (IA) work domains. Historical procurement focus on sensor platform development and acquisition has resulted in increasingly advanced collection systems; however, such systems often demonstrate classic data overload conditions by placing increased burdens on already overtaxed human operators and analysts. Support technologies and improved interfaces have begun to emerge to ease that burden, but these often focus on single modalities or sensor platforms rather than underlying operator and analyst support needs, resulting in systems that do not adequately leverage their natural human attentional competencies, unique skills, and training. One particular reason why emerging support tools often fail is due to the gap between military applications and their functions, and the functions and capabilities afforded by cutting edge technology employed daily by modern knowledge workers who are increasingly "digitally native." With the entry of Generation Y into these workplaces, "net generation" analysts, who are familiar with socially driven platforms that excel at giving users insight into large data sets while keeping cognitive burdens at a minimum, are creating opportunities for enhanced workflows. By using these ubiquitous platforms, net generation analysts have trained skills in discovering new information socially, tracking trends among affinity groups, and disseminating information. However, these functions are currently under-supported by existing tools. In this paper, we describe how socially driven techniques can be contextualized to frame complex analytical threads throughout the IA process. This paper focuses specifically on collaborative support technology development efforts for a team of operators and analysts. Our work focuses on under-supported functions in current working environments, and identifies opportunities to improve a team's ability to discover new information and disseminate insightful analytic findings. We describe our Cognitive Systems Engineering approach to developing a novel collaborative enterprise IA system that combines modern collaboration tools with familiar contemporary social technologies. Our current findings detail specific cognitive and collaborative work support functions that defined the design requirements for a prototype analyst collaborative support environment.

  15. Fungal Genomics Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grigoriev, Igor

    The JGI Fungal Genomics Program aims to scale up sequencing and analysis of fungal genomes to explore the diversity of fungi important for energy and the environment, and to promote functional studies on a system level. Combining new sequencing technologies and comparative genomics tools, JGI is now leading the world in fungal genome sequencing and analysis. Over 120 sequenced fungal genomes with analytical tools are available via MycoCosm (www.jgi.doe.gov/fungi), a web-portal for fungal biologists. Our model of interacting with user communities, unique among other sequencing centers, helps organize these communities, improves genome annotation and analysis work, and facilitates new larger-scale genomic projects. This resulted in 20 high-profile papers published in 2011 alone and contributing to the Genomics Encyclopedia of Fungi, which targets fungi related to plant health (symbionts, pathogens, and biocontrol agents) and biorefinery processes (cellulose degradation, sugar fermentation, industrial hosts). Our next grand challenges include larger scale exploration of fungal diversity (1000 fungal genomes), developing molecular tools for DOE-relevant model organisms, and analysis of complex systems and metagenomes.

  16. Spectral mapping tools from the earth sciences applied to spectral microscopy data.

    PubMed

    Harris, A Thomas

    2006-08-01

    Spectral imaging, originating from the field of earth remote sensing, is a powerful tool that is being increasingly used in a wide variety of applications for material identification. Several workers have used techniques like linear spectral unmixing (LSU) to discriminate materials in images derived from spectral microscopy. However, many spectral analysis algorithms rely on assumptions that are often violated in microscopy applications. This study explores algorithms originally developed as improvements on early earth imaging techniques that can be easily translated for use with spectral microscopy. To best demonstrate the application of earth remote sensing spectral analysis tools to spectral microscopy data, earth imaging software was used to analyze data acquired with a Leica confocal microscope with mechanical spectral scanning. For this study, spectral training signatures (often referred to as endmembers) were selected with the ENVI (ITT Visual Information Solutions, Boulder, CO) "spectral hourglass" processing flow, a series of tools that use the spectrally over-determined nature of hyperspectral data to find the most spectrally pure (or spectrally unique) pixels within the data set. This set of endmember signatures was then used in the full range of mapping algorithms available in ENVI to determine locations, and in some cases subpixel abundances of endmembers. Mapping and abundance images showed a broad agreement between the spectral analysis algorithms, supported through visual assessment of output classification images and through statistical analysis of the distribution of pixels within each endmember class. The powerful spectral analysis algorithms available in COTS software, the result of decades of research in earth imaging, are easily translated to new sources of spectral data. Although the scale between earth imagery and spectral microscopy is radically different, the problem is the same: mapping material locations and abundances based on unique spectral signatures. (c) 2006 International Society for Analytical Cytology.
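    As a hedged illustration of the linear spectral unmixing step mentioned above (not the ENVI implementation), assuming an endmember matrix has already been selected from spectrally pure pixels, per-pixel abundances can be estimated with non-negative least squares:

      # Linear spectral unmixing sketch: estimate per-pixel endmember abundances
      # by non-negative least squares, then renormalise them to sum to one.
      import numpy as np
      from scipy.optimize import nnls

      def unmix(cube, endmembers):
          """cube: (rows, cols, bands) image; endmembers: (bands, n_endmembers) matrix."""
          rows, cols, _ = cube.shape
          abundances = np.zeros((rows, cols, endmembers.shape[1]))
          for r in range(rows):
              for c in range(cols):
                  a, _ = nnls(endmembers, cube[r, c, :])
                  total = a.sum()
                  abundances[r, c, :] = a / total if total > 0 else a
          return abundances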

  17. Translating Metabolomics to Cardiovascular Biomarkers

    PubMed Central

    Senn, Todd; Hazen, Stanley L.; Tang, W. H. Wilson

    2012-01-01

    Metabolomics is the systematic study of the unique chemical fingerprints of small-molecules, or metabolite profiles, that are related to a variety of cellular metabolic processes in a cell, organ, or organism. While mRNA gene expression data and proteomic analyses do not tell the whole story of what might be happening in a cell, metabolic profiling provides direct and indirect physiologic insights that can potentially be detectable in a wide range of biospecimens. Although not specific to cardiac conditions, translating metabolomics to cardiovascular biomarkers has followed the traditional path of biomarker discovery from identification and confirmation to clinical validation and bedside testing. With technological advances in metabolomic tools (such as nuclear magnetic resonance spectroscopy and mass spectrometry) and more sophisticated bioinformatics and analytical techniques, the ability to measure low-molecular-weight metabolites in biospecimens provides a unique insight into established and novel metabolic pathways. Systemic metabolomics may provide physiologic understanding of cardiovascular disease states beyond traditional profiling, and may involve descriptions of metabolic responses of an individual or population to therapeutic interventions or environmental exposures. PMID:22824112

  18. Analytical characterization of wine and its precursors by capillary electrophoresis.

    PubMed

    Gomez, Federico J V; Monasterio, Romina P; Vargas, Verónica Carolina Soto; Silva, María F

    2012-08-01

    The accurate determination of marker chemical species in grape, musts, and wines presents a unique analytical challenge with high impact on diverse areas of knowledge such as health, plant physiology, and economy. Capillary electromigration techniques have emerged as a powerful tool, allowing the separation and identification of highly polar compounds that cannot be easily separated by traditional HPLC methods, providing complementary information and permitting the simultaneous analysis of analytes with different nature in a single run. The main advantage of CE over traditional methods for wine analysis is that in most cases samples require no treatment other than filtration. The purpose of this article is to present a revision on capillary electromigration methods applied to the analysis of wine and its precursors over the last decade. The current state of the art of the topic is evaluated, with special emphasis on the natural compounds that have allowed wine to be considered as a functional food. The most representative revised compounds are phenolic compounds, amino acids, proteins, elemental species, mycotoxins, and organic acids. Finally, a discussion on future trends of the role of capillary electrophoresis in the field of analytical characterization of wines for routine analysis, wine classification, as well as multidisciplinary aspects of the so-called "from soil to glass" chain is presented. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Standard deviations of composition measurements in atom probe analyses. Part I conventional 1D atom probe.

    PubMed

    Danoix, F; Grancher, G; Bostel, A; Blavette, D

    2007-09-01

    Atom probe is a very powerful instrument for measuring concentrations on a sub-nanometric scale [M.K. Miller, G.D.W. Smith, Atom Probe Microanalysis, Principles and Applications to Materials Problems, Materials Research Society, Pittsburgh, 1989]. Atom probe is therefore a unique tool to study and characterise finely decomposed metallic materials. Composition profiles or 3D mapping can be realised by gathering elemental composition measurements. As the detector efficiency is generally not equal to 1, the measured compositions are only estimates of the actual values. The variance of the estimates depends on which information is to be estimated, and it can be calculated when the detection process is known. These two papers are devoted to giving complete analytical derivations and expressions of the variance of composition measurements in several situations encountered when using the atom probe. In this first paper, we concentrate on the analytical derivation of the variance for composition estimates obtained from a conventional one-dimensional (1D) atom probe. In particular, the existing expressions, and the basic hypotheses on which they rely, are reconsidered, and complete analytical demonstrations are established. In the second, companion paper, the case of the 3D atom probe is treated, highlighting how knowledge of the 3D positions of detected ions modifies the analytical derivation of the variance of local composition data.
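    As a hedged orientation to the kind of expression involved (the paper's full derivations, which account for the detection process itself, are not reproduced here), the simplest case treats the N ions detected in a small sampling volume as independent draws, so the estimated atomic fraction of a solute follows binomial statistics:

      \hat{C} = \frac{n}{N}, \qquad \operatorname{Var}(\hat{C}) \approx \frac{C\,(1-C)}{N}, \qquad \sigma_{\hat{C}} \approx \sqrt{\frac{C\,(1-C)}{N}},

    where n is the number of detected solute ions and N the total number of detected ions; a detector efficiency below 1 lowers N for a given analysed volume and therefore widens the uncertainty of the composition estimate.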

  20. An improved multiple flame photometric detector for gas chromatography.

    PubMed

    Clark, Adrian G; Thurbide, Kevin B

    2015-11-20

    An improved multiple flame photometric detector (mFPD) is introduced, based upon interconnecting fluidic channels within a planar stainless steel (SS) plate. Relative to the previous quartz tube mFPD prototype, the SS mFPD provides a 50% reduction in background emission levels, an orthogonal analytical flame, and easier, more sensitive operation. As a result, sulfur response in the SS mFPD spans 4 orders of magnitude, yields a minimum detectable limit near 9×10⁻¹² g S/s, and has a selectivity approaching 10⁴ over carbon. The device also exhibits exceptionally large resistance to hydrocarbon response quenching. Additionally, the SS mFPD uniquely allows analyte emission monitoring in the multiple worker flames for the first time. The findings suggest that this mode can potentially further improve upon the analytical flame response of sulfur (both linear HSO and quadratic S₂) and also phosphorus. Of note, the latter is nearly 20-fold stronger in S/N in the collective worker-flame response and provides 6 orders of linearity with a detection limit of about 2.0×10⁻¹³ g P/s. Overall, the results indicate that this new SS design notably improves the analytical performance of the mFPD and can provide a versatile and beneficial monitoring tool for gas chromatography. Copyright © 2015 Elsevier B.V. All rights reserved.

  1. ISS Biotechnology Facility - Overview of Analytical Tools for Cellular Biotechnology Investigations

    NASA Technical Reports Server (NTRS)

    Jeevarajan, A. S.; Towe, B. C.; Anderson, M. M.; Gonda, S. R.; Pellis, N. R.

    2001-01-01

    The ISS Biotechnology Facility (BTF) platform provides scientists with a unique opportunity to carry out diverse experiments in a microgravity environment for an extended period of time. Although considerable progress has been made in preserving cells on the ISS for long periods for later return to Earth, future biotechnology experiments would ideally monitor, process, and analyze cells on-orbit in a timely way. One aspect of our work has been directed towards developing biochemical sensors for pH, glucose, oxygen, and carbon dioxide for the perfused bioreactor system developed at Johnson Space Center. Another aspect is the examination and identification of new and advanced commercial biotechnologies that may have applications to on-orbit experiments.

  2. On the ambiguity of the reaction rate constants in multivariate curve resolution for reversible first-order reaction systems.

    PubMed

    Schröder, Henning; Sawall, Mathias; Kubis, Christoph; Selent, Detlef; Hess, Dieter; Franke, Robert; Börner, Armin; Neymeyr, Klaus

    2016-07-13

    If for a chemical reaction with a known reaction mechanism the concentration profiles are accessible only for certain species, e.g. only for the main product, then often the reaction rate constants cannot uniquely be determined from the concentration data. This is a well-known fact which includes the so-called slow-fast ambiguity. This work combines the question of unique or non-unique reaction rate constants with factor analytic methods of chemometrics. The idea is to reduce the rotational ambiguity of pure component factorizations by considering only those concentration factors which are possible solutions of the kinetic equations for a properly adapted set of reaction rate constants. The resulting set of reaction rate constants corresponds to those solutions of the rate equations which appear as feasible factors in a pure component factorization. The new analysis of the ambiguity of reaction rate constants extends recent research activities on the Area of Feasible Solutions (AFS). The consistency with a given chemical reaction scheme is shown to be a valuable tool in order to reduce the AFS. The new methods are applied to model and experimental data. Copyright © 2016 Elsevier B.V. All rights reserved.
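    A minimal numerical illustration of the slow-fast ambiguity mentioned above (the paper itself treats reversible first-order systems and factor-analytic constraints, which are not reproduced here), using the classic consecutive scheme A → B → C observed only through the intermediate B; the rate constants are illustrative:

      # For A -> B -> C, the intermediate profile
      #   B(t) = A0 * k1/(k2 - k1) * (exp(-k1*t) - exp(-k2*t))
      # changes only by a constant scale factor when k1 and k2 are swapped, so a
      # single measured profile of unknown scale (as in spectroscopy, where the
      # pure-component spectrum is unknown) cannot distinguish the two rate sets.
      import numpy as np

      def intermediate_profile(t, k1, k2, a0=1.0):
          return a0 * k1 / (k2 - k1) * (np.exp(-k1 * t) - np.exp(-k2 * t))

      t = np.linspace(0.0, 10.0, 200)
      b1 = intermediate_profile(t, k1=2.0, k2=0.3)
      b2 = intermediate_profile(t, k1=0.3, k2=2.0)
      ratio = b1[1:] / b2[1:]            # skip t = 0, where both profiles are zero
      print(ratio.min(), ratio.max())    # constant ratio 2.0/0.3, i.e. identical shape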

  3. Visual programming for next-generation sequencing data analytics.

    PubMed

    Milicchio, Franco; Rose, Rebecca; Bian, Jiang; Min, Jae; Prosperi, Mattia

    2016-01-01

    High-throughput or next-generation sequencing (NGS) technologies have become an established and affordable experimental framework in biological and medical sciences for all basic and translational research. Processing and analyzing NGS data is challenging. NGS data are big, heterogeneous, sparse, and error prone. Although a plethora of tools for NGS data analysis has emerged in the past decade, (i) software development is still lagging behind data generation capabilities, and (ii) there is a 'cultural' gap between the end user and the developer. Generic software template libraries specifically developed for NGS can help in dealing with the former problem, whilst coupling template libraries with visual programming may help with the latter. Here we scrutinize the state-of-the-art low-level software libraries implemented specifically for NGS and graphical tools for NGS analytics. An ideal developing environment for NGS should be modular (with a native library interface), scalable in computational methods (i.e. serial, multithread, distributed), transparent (platform-independent), interoperable (with external software interface), and usable (via an intuitive graphical user interface). These characteristics should facilitate both the run of standardized NGS pipelines and the development of new workflows based on technological advancements or users' needs. We discuss in detail the potential of a computational framework blending generic template programming and visual programming that addresses all of the current limitations. In the long term, a proper, well-developed (although not necessarily unique) software framework will bridge the current gap between data generation and hypothesis testing. This will eventually facilitate the development of novel diagnostic tools embedded in routine healthcare.

  4. Harnessing scientific literature reports for pharmacovigilance. Prototype software analytical tool development and usability testing.

    PubMed

    Sorbello, Alfred; Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier

    2017-03-22

    We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers' capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. A prototype, open source, web-based, software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. All usability test participants cited the tool's ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool's automated literature search relative to a manual 'all fields' PubMed search, missing drugs and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction.
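    The abstract does not state which disproportionality statistic the prototype computes; as a hedged illustration only, a common choice is the proportional reporting ratio (PRR) evaluated on a 2x2 table of drug/event co-occurrence counts, here imagined as counts of MeSH-indexed citations (all numbers below are invented):

      # Proportional reporting ratio (PRR) for one drug-event pair.
      #   a: citations mentioning drug D and event E
      #   b: citations mentioning drug D without event E
      #   c: citations mentioning event E without drug D
      #   d: citations mentioning neither
      def prr(a, b, c, d):
          rate_drug = a / (a + b)      # event rate among citations about the drug
          rate_other = c / (c + d)     # event rate among all other citations
          return rate_drug / rate_other

      print(prr(a=40, b=960, c=500, d=98500))   # about 7.9, a candidate signal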

  5. A Progressive Approach to Teaching Analytics in the Marketing Curriculum

    ERIC Educational Resources Information Center

    Liu, Yiyuan; Levin, Michael A.

    2018-01-01

    With the emerging use of analytics tools and methodologies in marketing, marketing educators have provided students training and experiences beyond the soft skills associated with understanding consumer behavior. Previous studies have only discussed how to apply analytics in course designs, tools, and related practices. However, there is a lack of…

  6. Is Chemically Synthesized Graphene ‘Really’ a Unique Substrate for SERS and Fluorescence Quenching?

    NASA Astrophysics Data System (ADS)

    Sil, Sanchita; Kuhar, Nikki; Acharya, Somnath; Umapathy, Siva

    2013-11-01

    We demonstrate the observation of Raman signals of different analytes adsorbed on carbonaceous materials such as chemically reduced graphene, graphene oxide (GO), multi-walled carbon nanotube (MWCNT), graphite and activated carbon. The analytes selected for the study were Rhodamine 6G (R6G) (under resonant conditions), Rhodamine B (RB), Nile blue (NBA), Crystal Violet (CV) and acetaminophen (paracetamol). All the analytes except paracetamol absorb and fluoresce in the visible region. In this article we provide experimental evidence that the observation of Raman signals of analytes on such carbonaceous materials is due more to resonance effects, suppression of fluorescence and efficient adsorption, and that this property is not unique to graphene or nanotubes but is prevalent across various types of carbon materials.

  7. Median of patient results as a tool for assessment of analytical stability.

    PubMed

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft; Sölétormos, György

    2015-06-15

    In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor long-term analytical stability as a supplement to the internal control procedures is often needed. Patient data from daily internal control schemes were used for monthly appraisal of analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. Seventy-five percent of the twenty analytes achieved on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and with the desirable specifications for bias. Patient results applied in analytical quality performance control procedures are the most reliable sources of material, as they represent the genuine substance of the measurements and therefore circumvent the problems associated with non-commutable materials in external assessment. Patient medians in the monthly monitoring of analytical stability in laboratory medicine are an inexpensive, simple and reliable tool to monitor the steadiness of analytical practice. Copyright © 2015 Elsevier B.V. All rights reserved.
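    A minimal sketch of the monitoring idea, assuming a table of dated patient results; the target value, the allowable-bias limit (taken from biological-variation data) and the column names are illustrative:

      # Compare each month's median of patient results against a long-term target
      # and flag months whose percentage deviation exceeds the allowable bias.
      import pandas as pd

      def flag_unstable_months(df, target_median, allowable_bias_pct):
          """df: columns 'date' (datetime64) and 'result' (numeric patient results)."""
          monthly = df.set_index("date")["result"].resample("M").median()
          deviation_pct = 100.0 * (monthly - target_median) / target_median
          return pd.DataFrame({"median": monthly,
                               "deviation_pct": deviation_pct,
                               "flagged": deviation_pct.abs() > allowable_bias_pct})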

  8. Optimum Design of LLC Resonant Converter using Inductance Ratio (Lm/Lr)

    NASA Astrophysics Data System (ADS)

    Palle, Kowstubha; Krishnaveni, K.; Ramesh Reddy, Kolli

    2017-06-01

    The main benefits of the LLC resonant dc/dc converter over conventional series and parallel resonant converters are its light-load regulation, lower circulating currents, larger bandwidth for zero-voltage switching, and reduced tuning of the switching frequency for a controlled output. A unique analytical tool, called fundamental harmonic approximation with peak gain adjustment, is used for designing the converter. In this paper, an optimum design of the converter is proposed by considering three different design criteria with different values of the inductance ratio (Lm/Lr) to achieve good efficiency at high input voltage. The optimum design includes analysis of the operating range, switching frequency range, primary-side switch losses and stability. The analysis is carried out with simulation using software tools such as MATLAB and PSIM. The performance of the optimized design is demonstrated for a design specification of 12 V, 5 A output operating with an input voltage range of 300-400 V using the FSFR 2100 IC of Texas Instruments.
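    For orientation, a minimal sketch of the fundamental-harmonic-approximation gain curve that such a design study builds on, using the common normalisation Ln = Lm/Lr, quality factor Q and normalised switching frequency fn; the numerical values are illustrative and the paper's peak-gain adjustment, loss and stability analysis are not reproduced:

      # FHA voltage gain of the LLC tank (series Lr-Cr feeding Lm in parallel
      # with the reflected load Rac), versus normalised frequency fn = fs/fr:
      #   |M(fn)| = 1 / sqrt( (1 + (1 - 1/fn**2)/Ln)**2 + Q**2 * (fn - 1/fn)**2 )
      import numpy as np

      def llc_gain(fn, Ln, Q):
          real = 1.0 + (1.0 - 1.0 / fn**2) / Ln
          imag = Q * (fn - 1.0 / fn)
          return 1.0 / np.sqrt(real**2 + imag**2)

      fn = np.linspace(0.3, 2.0, 500)
      for Ln in (3.0, 5.0, 7.0):                 # illustrative Lm/Lr ratios
          gain = llc_gain(fn, Ln, Q=0.4)
          print(f"Ln={Ln}: peak gain {gain.max():.2f} at fn={fn[gain.argmax()]:.2f}")

    Larger Ln (a larger magnetising inductance) lowers the attainable peak gain but reduces circulating current, which is the trade-off a peak-gain-adjusted design works within.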

  9. Global reach of direct-to-consumer advertising using social media for illicit online drug sales.

    PubMed

    Mackey, Tim Ken; Liang, Bryan A

    2013-05-29

    Illicit or rogue Internet pharmacies are a recognized global public health threat that have been identified as utilizing various forms of online marketing and promotion, including social media. To assess the accessibility of creating illicit no prescription direct-to-consumer advertising (DTCA) online pharmacy social media marketing (eDTCA2.0) and evaluate its potential global reach. We identified the top 4 social media platforms allowing eDTCA2.0. After determining applicable platforms (ie, Facebook, Twitter, Google+, and MySpace), we created a fictitious advertisement advertising no prescription drugs online and posted it to the identified social media platforms. Each advertisement linked to a unique website URL that consisted of a site error page. Employing Web search analytics, we tracked the number of users visiting these sites and their location. We used commercially available Internet tools and services, including website hosting, domain registration, and website analytic services. Illicit online pharmacy social media content for Facebook, Twitter, and MySpace remained accessible despite highly questionable and potentially illegal content. Fictitious advertisements promoting illicit sale of drugs generated aggregate unique user traffic of 2795 visits over a 10-month period. Further, traffic to our websites originated from a number of countries, including high-income and middle-income countries, and emerging markets. Our results indicate there are few barriers to entry for social media-based illicit online drug marketing. Further, illicit eDTCA2.0 has globalized outside US borders to other countries through unregulated Internet marketing.

  10. Experimental and analytical tools for evaluation of Stirling engine rod seal behavior

    NASA Technical Reports Server (NTRS)

    Krauter, A. I.; Cheng, H. S.

    1979-01-01

    The first year of a two year experimental and analytical program is reported. The program is directed at the elastohydrodynamic behavior of sliding elastomeric rod seals for the Stirling engine. During the year, experimental and analytical tools were developed for evaluating seal leakage, seal friction, and the fluid film thickness at the seal/cylinder interface.

  11. Analytics for Cyber Network Defense

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plantenga, Todd.; Kolda, Tamara Gibson

    2011-06-01

    This report provides a brief survey of analytics tools considered relevant to cyber network defense (CND). Ideas and tools come from fields such as statistics, data mining, and knowledge discovery. Some analytics are considered standard mathematical or statistical techniques, while others reflect current research directions. In all cases the report attempts to explain the relevance to CND with brief examples.

  12. Solar Data and Tools: Resources for Researchers, Industry, and Developers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-04-01

    In partnership with the U.S. Department of Energy SunShot Initiative, the National Renewable Energy Laboratory (NREL) has created a suite of analytical tools and data that can inform decisions about implementing solar and that are increasingly forming the basis of private-sector tools and services to solar consumers. The following solar energy data sets and analytical tools are available free to the public.

  13. Systems-Level Annotation of a Metabolomics Data Set Reduces 25 000 Features to Fewer than 1000 Unique Metabolites.

    PubMed

    Mahieu, Nathaniel G; Patti, Gary J

    2017-10-03

    When using liquid chromatography/mass spectrometry (LC/MS) to perform untargeted metabolomics, it is now routine to detect tens of thousands of features from biological samples. Poor understanding of the data, however, has complicated interpretation and masked the number of unique metabolites actually being measured in an experiment. Here we place an upper bound on the number of unique metabolites detected in Escherichia coli samples analyzed with one untargeted metabolomics method. We first group multiple features arising from the same analyte, which we call "degenerate features", using a context-driven annotation approach. Surprisingly, this analysis revealed thousands of previously unreported degeneracies that reduced the number of unique analytes to ∼2961. We then applied an orthogonal approach to remove nonbiological features from the data using the ¹³C-based credentialing technology. This further reduced the number of unique analytes to less than 1000. Our 90% reduction in data is 5-fold greater than previously published studies. On the basis of the results, we propose an alternative approach to untargeted metabolomics that relies on thoroughly annotated reference data sets. To this end, we introduce the creDBle database (http://creDBle.wustl.edu), which contains accurate mass, retention time, and MS/MS fragmentation data as well as annotations of all credentialed features.
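    A hedged sketch of the general idea of collapsing degenerate features (the published context-driven annotation and ¹³C credentialing workflows are not reproduced here): group features that co-elute and whose m/z spacings match common isotope or adduct offsets, so each group counts as one candidate analyte. The offsets and tolerances below are illustrative.

      # Group co-eluting LC/MS features whose m/z differences match common
      # degeneracies (identical m/z, 13C isotope spacing, Na-for-H adduct exchange).
      OFFSETS = [0.0, 1.00336, 21.98194]   # same m/z; 13C step; Na adduct minus H
      RT_TOL, MZ_TOL = 5.0, 0.005          # seconds, Da

      def group_features(features):
          """features: list of (mz, rt) tuples."""
          groups = []
          for mz, rt in features:
              for group in groups:
                  base_mz, base_rt = group[0]
                  if abs(rt - base_rt) <= RT_TOL and any(
                          abs(abs(mz - base_mz) - k * off) <= MZ_TOL
                          for off in OFFSETS for k in (1, 2, 3)):
                      group.append((mz, rt))
                      break
              else:
                  groups.append([(mz, rt)])
          return groups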

  14. Single Molecule Detection in Living Biological Cells using Carbon Nanotube Optical Probes

    NASA Astrophysics Data System (ADS)

    Strano, Michael

    2009-03-01

    Nanoscale sensing elements offer promise for single molecule analyte detection in physically or biologically constrained environments. Molecular adsorption can be amplified via modulation of sharp singularities in the electronic density of states that arise from 1D quantum confinement [1]. Single-walled carbon nanotubes (SWNT), as single molecule optical sensors [2-3], offer unique advantages such as photostable near-infrared (n-IR) emission for prolonged detection through biological media, single-molecule sensitivity, and nearly orthogonal optical modes for signal transduction that can be used to identify distinct classes of analytes. Selective binding to the SWNT surface is difficult to engineer [4]. In this lecture, we will briefly review the emerging field of fluorescent diagnostics using band gap emission from SWNT. In recent work, we demonstrate that even a single pair of SWNT provides at least four optical modes that can be modulated to uniquely fingerprint chemical agents by the degree to which they alter either the emission band intensity or wavelength. We validate this identification method in vitro by demonstrating detection and identification of six genotoxic analytes, including chemotherapeutic drugs and reactive oxygen species (ROS), which are spectroscopically differentiated into four distinct classes. We also demonstrate single-molecule sensitivity in detecting hydrogen peroxide, one of the most common genotoxins and an important cellular signal. Finally, we employ our sensing and fingerprinting method of these analytes in real time within live 3T3 cells, demonstrating the first multiplexed optical detection from a nanoscale biosensor and the first label-free tool to optically discriminate between genotoxins. We will also discuss our recent efforts to fabricate biomedical sensors for real time detection of glucose and other important physiologically relevant analytes in-vivo. The response of embedded SWNT in a swellable hydrogel construct to osmotic pressure gradients will be discussed, as well as its potential as a unique transduction mechanism for a new class of implantable sensors. References: [1] Saito, R., Dresselhaus, G. & Dresselhaus, M. S. Physical Properties of Carbon Nanotubes (Imperial College Press, London, 1998). [2] Barone, P. W., Baik, S., Heller, D. A. & Strano, M. S. Near-Infrared Optical Sensors Based on Single-Walled Carbon Nanotubes. Nature Materials 4, 86-92 (2005). [3] Jeng, E. S., Moll, A. E., Roy, A. C., Gastala, J. B. & Strano, M. S. Detection of DNA hybridization using the near infrared band-gap fluorescence of single-walled carbon nanotubes. Nano Letters 6, 371-375 (2006). [4] Heller, D. A. et al. Optical detection of DNA conformational polymorphism on single-walled carbon nanotubes. Science 311, 508-511 (2006).

  15. RSNA Diagnosis Live: A Novel Web-based Audience Response Tool to Promote Evidence-based Learning.

    PubMed

    Awan, Omer A; Shaikh, Faiq; Kalbfleisch, Brian; Siegel, Eliot L; Chang, Paul

    2017-01-01

    Audience response systems have become more commonplace in radiology residency programs in the last 10 years, as a means to engage learners and promote improved learning and retention. A variety of systems are currently in use. RSNA Diagnosis Live™ provides unique features that are innovative, particularly for radiology resident education. One specific example is the ability to annotate questions with subspecialty tags, which allows resident performance to be tracked over time. In addition, deficiencies in learning can be monitored for each trainee and analytics can be provided, allowing documentation of resident performance improvement. Finally, automated feedback is given not only to the instructor, but also to the trainee. Online supplemental material is available for this article. © RSNA, 2017.

  16. Plasma chemistry as a tool for green chemistry, environmental analysis and waste management.

    PubMed

    Mollah, M Y; Schennach, R; Patscheider, J; Promreuk, S; Cocke, D L

    2000-12-15

    The applications of plasma chemistry to environmental problems and to green chemistry are emerging fields that offer unique opportunities for advancement. There has been substantial progress in the application of plasmas to analytical diagnostics and to waste reduction and waste management. This review discusses the chemistry and physics necessary to a basic understanding of plasmas, something that has been missing from recent technical reviews. The current status of plasmas in environmental chemistry is summarized and emerging areas of application for plasmas are delineated. Plasmas are defined and discussed in terms of their properties that make them useful for environmental chemistry. Information is drawn from diverse fields to illustrate the potential applications of plasmas in analysis, materials modifications and hazardous waste treatments.

  17. Analyzing Water's Optical Absorption

    NASA Technical Reports Server (NTRS)

    2002-01-01

    A cooperative agreement between World Precision Instruments (WPI), Inc., and Stennis Space Center has led to the UltraPath(TM) device, which provides a more efficient method for analyzing the optical absorption of water samples at sea. UltraPath is a unique, high-performance absorbance spectrophotometer with user-selectable light path lengths. It is an ideal tool for any study requiring precise and highly sensitive spectroscopic determination of analytes, either in the laboratory or the field. As a low-cost, rugged, and portable system capable of high-sensitivity measurements in widely divergent waters, UltraPath will help scientists examine the role that coastal ocean environments play in the global carbon cycle. UltraPath(TM) is a trademark of World Precision Instruments, Inc. LWCC(TM) is a trademark of World Precision Instruments, Inc.

  18. Mining Mathematics in Textbook Lessons

    ERIC Educational Resources Information Center

    Ronda, Erlina; Adler, Jill

    2017-01-01

    In this paper, we propose an analytic tool for describing the mathematics made available to learn in a "textbook lesson". The tool is an adaptation of the Mathematics Discourse in Instruction (MDI) analytic tool that we developed to analyze what is made available to learn in teachers' lessons. Our motivation to adapt the use of the MDI…

  19. Fire behavior modeling-a decision tool

    Treesearch

    Jack Cohen; Bill Bradshaw

    1986-01-01

    The usefulness of an analytical model as a fire management decision tool is determined by the correspondence of its descriptive capability to the specific decision context. Fire managers must determine the usefulness of fire models as a decision tool when applied to varied situations. Because the wildland fire phenomenon is complex, analytical fire spread models will...

  20. Guidance for the Design and Adoption of Analytic Tools.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bandlow, Alisa

    2015-12-01

    The goal is to make software developers aware of common issues that can impede the adoption of analytic tools. This paper provides a summary of guidelines, lessons learned and existing research to explain what is currently known about what analysts want and how to better understand what tools they do and don't need.

  1. Information Management Workflow and Tools Enabling Multiscale Modeling Within ICME Paradigm

    NASA Technical Reports Server (NTRS)

    Arnold, Steven M.; Bednarcyk, Brett A.; Austin, Nic; Terentjev, Igor; Cebon, Dave; Marsden, Will

    2016-01-01

    With the increased emphasis on reducing the cost and time to market of new materials, the need for analytical tools that enable the virtual design and optimization of materials throughout their processing - internal structure - property - performance envelope, along with the capturing and storing of the associated material and model information across its lifecycle, has become critical. This need is also fueled by the demands for higher efficiency in material testing; consistency, quality and traceability of data; product design; engineering analysis; as well as control of access to proprietary or sensitive information. Fortunately, material information management systems and physics-based multiscale modeling methods have kept pace with the growing user demands. Herein, recent efforts to establish workflow for and demonstrate a unique set of web application tools for linking NASA GRC's Integrated Computational Materials Engineering (ICME) Granta MI database schema and NASA GRC's Integrated multiscale Micromechanics Analysis Code (ImMAC) software toolset are presented. The goal is to enable seamless coupling between both test data and simulation data, which is captured and tracked automatically within Granta MI®, with full model pedigree information. These tools, and this type of linkage, are foundational to realizing the full potential of ICME, in which materials processing, microstructure, properties, and performance are coupled to enable application-driven design and optimization of materials and structures.

  2. PACOM: A Versatile Tool for Integrating, Filtering, Visualizing, and Comparing Multiple Large Mass Spectrometry Proteomics Data Sets.

    PubMed

    Martínez-Bartolomé, Salvador; Medina-Aunon, J Alberto; López-García, Miguel Ángel; González-Tejedo, Carmen; Prieto, Gorka; Navajas, Rosana; Salazar-Donate, Emilio; Fernández-Costa, Carolina; Yates, John R; Albar, Juan Pablo

    2018-04-06

    Mass-spectrometry-based proteomics has evolved into a high-throughput technology in which numerous large-scale data sets are generated from diverse analytical platforms. Furthermore, several scientific journals and funding agencies have emphasized the storage of proteomics data in public repositories to facilitate its evaluation, inspection, and reanalysis. (1) As a consequence, public proteomics data repositories are growing rapidly. However, tools are needed to integrate multiple proteomics data sets to compare different experimental features or to perform quality control analysis. Here, we present a new Java stand-alone tool, Proteomics Assay COMparator (PACOM), that is able to import, combine, and simultaneously compare numerous proteomics experiments to check the integrity of the proteomic data as well as verify data quality. With PACOM, the user can detect sources of error that may have been introduced at any step of a proteomics workflow and that influence the final results. Data sets can be easily compared and integrated, and data quality and reproducibility can be visually assessed through a rich set of graphical representations of proteomics data features as well as a wide variety of data filters. Its flexibility and easy-to-use interface make PACOM a unique tool for daily use in a proteomics laboratory. PACOM is available at https://github.com/smdb21/pacom.

  3. SAM Radiochemical Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target radiochemical analytes in environmental samples can use this online query tool to identify analytical methods in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select radiochemical analytes.

  4. Analysis of the Yukawa gravitational potential in f (R ) gravity. II. Relativistic periastron advance

    NASA Astrophysics Data System (ADS)

    De Laurentis, Mariafelicia; De Martino, Ivan; Lazkoz, Ruth

    2018-05-01

    Alternative theories of gravity may serve to overcome several shortcomings of the standard cosmological model but, in their weak field limit, general relativity must be recovered so as to match the tight constraints at the Solar System scale. Therefore, testing such alternative models at scales of stellar systems could give a unique opportunity to confirm or rule them out. One of the most straightforward modifications is represented by analytical f (R )-gravity models that introduce a Yukawa-like modification to the Newtonian potential thus modifying the dynamics of particles. Using the geodesics equations, we have illustrated the amplitude of these modifications. First, we have integrated numerically the equations of motion showing the orbital precession of a particle around a massive object. Second, we have computed an analytic expression for the periastron advance of systems having their semimajor axis much shorter than the Yukawa-scale length. Finally, we have extended our results to the case of a binary system composed of two massive objects. Our analysis provides a powerful tool to obtain constraints on the underlying theory of gravity using current and forthcoming data sets.
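    For orientation, the Yukawa-corrected potential typically adopted in this family of analytic f(R) studies has the form sketched below; the coupling strength and scale length map onto the f(R) model parameters, although normalisation conventions vary between papers:

      \Phi(r) = -\frac{G M}{r}\left(1 + \alpha\, e^{-r/\lambda}\right),

    so that well inside the scale length the Newtonian behaviour is recovered (up to a rescaled effective gravitational constant), while the exponential term alters the dynamics at larger separations; the periastron advance discussed above follows from inserting this potential into the orbit equations and expanding in the regime where the semimajor axis is much shorter than \lambda.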

  5. The FuturICT education accelerator

    NASA Astrophysics Data System (ADS)

    Johnson, J.; Buckingham Shum, S.; Willis, A.; Bishop, S.; Zamenopoulos, T.; Swithenby, S.; MacKay, R.; Merali, Y.; Lorincz, A.; Costea, C.; Bourgine, P.; Louçã, J.; Kapenieks, A.; Kelley, P.; Caird, S.; Bromley, J.; Deakin Crick, R.; Goldspink, C.; Collet, P.; Carbone, A.; Helbing, D.

    2012-11-01

    Education is a major force for economic and social wellbeing. Despite high aspirations, education at all levels can be expensive and ineffective. Three Grand Challenges are identified: (1) enable people to learn orders of magnitude more effectively, (2) enable people to learn at orders of magnitude less cost, and (3) demonstrate success by exemplary interdisciplinary education in complex systems science. A ten year `man-on-the-moon' project is proposed in which FuturICT's unique combination of Complexity, Social and Computing Sciences could provide an urgently needed transdisciplinary language for making sense of educational systems. In close dialogue with educational theory and practice, and grounded in the emerging data science and learning analytics paradigms, this will translate into practical tools (both analytical and computational) for researchers, practitioners and leaders; generative principles for resilient educational ecosystems; and innovation for radically scalable, yet personalised, learner engagement and assessment. The proposed Education Accelerator will serve as a `wind tunnel' for testing these ideas in the context of real educational programmes, with an international virtual campus delivering complex systems education exploiting the new understanding of complex, social, computationally enhanced organisational structure developed within FuturICT.

  6. Development of an analytical microbial consortia method for enhancing performance monitoring at aerobic wastewater treatment plants.

    PubMed

    Razban, Behrooz; Nelson, Kristina Y; McMartin, Dena W; Cullimore, D Roy; Wall, Michelle; Wang, Dunling

    2012-01-01

    An analytical method to produce profiles of bacterial biomass fatty acid methyl esters (FAME) was developed, employing rapid agitation followed by static incubation (RASI) with selective media of wastewater microbial communities. The results were compiled to produce a unique library for comparison and performance analysis at a Wastewater Treatment Plant (WWTP). A total of 146 samples from the aerated WWTP, comprising 73 samples each of secondary and tertiary effluent, were analyzed. For comparison purposes, all samples were evaluated via a similarity index (SI), with secondary effluents producing an SI of 0.88 with 2.7% variation and tertiary samples producing an SI of 0.86 with 5.0% variation. The results also highlighted significant differences between the fatty acid profiles of the tertiary and secondary effluents, indicating considerable shifts in the bacterial community profile between these treatment phases. The WWTP performance results obtained using this method were highly replicable and reproducible, indicating that the protocol has potential as a performance-monitoring tool for aerated WWTPs. The results quickly and accurately reflect shifts in dominant bacterial communities that result when process operations and performance change.
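    The abstract does not define how the similarity index is computed; as a hedged illustration, one simple way to compare two FAME profiles is a cosine-style similarity on their relative-abundance vectors, sketched below with invented fatty-acid names and values:

      # Cosine-style similarity index in [0, 1] between two FAME profiles
      # (dicts mapping fatty-acid name -> relative abundance, in percent).
      import math

      def similarity_index(profile_a, profile_b):
          keys = sorted(set(profile_a) | set(profile_b))
          a = [profile_a.get(k, 0.0) for k in keys]
          b = [profile_b.get(k, 0.0) for k in keys]
          dot = sum(x * y for x, y in zip(a, b))
          norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
          return dot / norm if norm else 0.0

      secondary = {"16:0": 32.1, "18:1 w9c": 24.5, "cy19:0": 8.3}   # invented values
      tertiary = {"16:0": 35.0, "18:1 w9c": 20.1, "cy19:0": 11.0}
      print(round(similarity_index(secondary, tertiary), 2))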

  7. Collaborative Web-Enabled GeoAnalytics Applied to OECD Regional Data

    NASA Astrophysics Data System (ADS)

    Jern, Mikael

    Recent advances in web-enabled graphics technologies have the potential to make a dramatic impact on developing collaborative geovisual analytics (GeoAnalytics). In this paper, tools are introduced that help establish progress initiatives at international and sub-national levels aimed at measuring and collaborating, through statistical indicators, on economic, social and environmental developments, and at engaging both statisticians and the public in such activities. Given the global dimension of such a task, the "dream" of building a repository of progress indicators, where experts and public users can use collaborative GeoAnalytics tools to compare situations for two or more countries, regions or local communities, could be accomplished. While the benefits of GeoAnalytics tools are many, it remains a challenge to adapt these dynamic visual tools to the Internet. One example is dynamic web-enabled animation that enables statisticians to explore temporal, spatial and multivariate demographic data from multiple perspectives, discover interesting relationships, share their incremental discoveries with colleagues and finally communicate selected relevant knowledge to the public. These discoveries often emerge through the diverse backgrounds and experiences of expert domains and are precious in a creative analytic reasoning process. In this context, we introduce a demonstrator, "OECD eXplorer", a customized tool for interactively analyzing and collaborating on gained insights and discoveries, based on a novel story mechanism that captures, re-uses and shares task-related explorative events.

  8. Applications of nanopipettes in the analytical sciences.

    PubMed

    Morris, Celeste A; Friedman, Alicia K; Baker, Lane A

    2010-09-01

    In this review, we describe measurements and applications of interest to the analytical community that makes use of simple nanopipettes. Fabricated by applying heat during the separation of a glass capillary, nanopipettes provide a route for nanoscale studies of ion transport and for development of chemical and biochemical sensors. When mounted on a translation stage, nanopipettes also enable unique modes of imaging and material deposition. These facets of nanopipette research, as well as some of the unique properties of nanopipettes, will be discussed.

  9. Teaching Theory Construction With Initial Grounded Theory Tools: A Reflection on Lessons and Learning.

    PubMed

    Charmaz, Kathy

    2015-12-01

    This article addresses criticisms of qualitative research for spawning studies that lack analytic development and theoretical import. It focuses on teaching initial grounded theory tools while interviewing, coding, and writing memos for the purpose of scaling up the analytic level of students' research and advancing theory construction. Adopting these tools can improve teaching qualitative methods at all levels although doctoral education is emphasized here. What teachers cover in qualitative methods courses matters. The pedagogy presented here requires a supportive environment and relies on demonstration, collective participation, measured tasks, progressive analytic complexity, and accountability. Lessons learned from using initial grounded theory tools are exemplified in a doctoral student's coding and memo-writing excerpts that demonstrate progressive analytic development. The conclusion calls for increasing the number and depth of qualitative methods courses and for creating a cadre of expert qualitative methodologists. © The Author(s) 2015.

  10. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life... empirical tools should be used to support the life safety equivalency evaluation? 102-80.120 Section 102-80...

  11. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life... empirical tools should be used to support the life safety equivalency evaluation? 102-80.120 Section 102-80...

  12. Blood analytes of oceanic-juvenile loggerhead sea turtles (Caretta caretta) from Azorean waters: reference intervals, size-relevant correlations and comparisons to neritic loggerheads from western Atlantic coastal waters.

    PubMed

    Stacy, Nicole I; Bjorndal, Karen A; Perrault, Justin R; Martins, Helen R; Bolten, Alan B

    2018-01-01

    Blood analyte reference intervals are scarce for immature life stages of the loggerhead sea turtle ( Caretta caretta ). The objectives of this study were to (1) document reference intervals of packed cell volume (PCV) and 20 plasma chemistry analytes from wild oceanic-juvenile stage loggerhead turtles from Azorean waters, (2) investigate correlations with body size (minimum straight carapace length: SCL min ) and (3) compare plasma chemistry data to those from older, larger neritic juveniles (<80 cm SCL min ) and adult loggerheads (≥80 cm SCL min ) that have recruited to the West Atlantic in waters around Cape Canaveral, Florida. Twenty-eight Azorean loggerhead turtles with SCL min of 17.6-60.0 cm (mean 34.9 ± 12.1 cm) were captured, sampled and immediately released. Reference intervals are reported. There were several biologically relevant correlations of blood analytes with SCL min : positive correlations of PCV, proteins and triglycerides with SCL min indicated somatic growth, increasing diving activity and/or diet; negative correlations of tissue enzymes with SCL min suggested faster growth at smaller turtle size, while negative correlations of electrolytes with SCL min indicated differences in diet, environmental conditions and/or osmoregulation unique to the geographic location. Comparisons of loggerhead turtles from the Azores (i.e. oceanic) and Cape Canaveral (i.e. neritic) identified significant differences regarding diet, somatic growth, and/or environment: in Azorean turtles, albumin, triglycerides and bilirubin increased with SCL min , while alkaline phosphatase, lactate dehydrogenase and sodium decreased. In larger neritic Cape Canaveral turtles, aspartate aminotransferase increased with SCL min , while the albumin:globulin ratio, phosphorus and cholesterol decreased. These differences suggest unique physiological disparities between life stage development and migration, reflecting biological and habitat differences between the two populations. This information presents biologically important data that is applicable to stranded individual turtles and to the population level, a tool for the development of conservation strategies, and a baseline for future temporal and spatial investigations of the Azorean loggerhead sea turtle population.

  13. Blood analytes of oceanic-juvenile loggerhead sea turtles (Caretta caretta) from Azorean waters: reference intervals, size-relevant correlations and comparisons to neritic loggerheads from western Atlantic coastal waters

    PubMed Central

    Bjorndal, Karen A; Perrault, Justin R; Martins, Helen R; Bolten, Alan B

    2018-01-01

    Abstract Blood analyte reference intervals are scarce for immature life stages of the loggerhead sea turtle (Caretta caretta). The objectives of this study were to (1) document reference intervals of packed cell volume (PCV) and 20 plasma chemistry analytes from wild oceanic-juvenile stage loggerhead turtles from Azorean waters, (2) investigate correlations with body size (minimum straight carapace length: SCLmin) and (3) compare plasma chemistry data to those from older, larger neritic juveniles (<80 cm SCLmin) and adult loggerheads (≥80 cm SCLmin) that have recruited to the West Atlantic in waters around Cape Canaveral, Florida. Twenty-eight Azorean loggerhead turtles with SCLmin of 17.6–60.0 cm (mean 34.9 ± 12.1 cm) were captured, sampled and immediately released. Reference intervals are reported. There were several biologically relevant correlations of blood analytes with SCLmin: positive correlations of PCV, proteins and triglycerides with SCLmin indicated somatic growth, increasing diving activity and/or diet; negative correlations of tissue enzymes with SCLmin suggested faster growth at smaller turtle size, while negative correlations of electrolytes with SCLmin indicated differences in diet, environmental conditions and/or osmoregulation unique to the geographic location. Comparisons of loggerhead turtles from the Azores (i.e. oceanic) and Cape Canaveral (i.e. neritic) identified significant differences regarding diet, somatic growth, and/or environment: in Azorean turtles, albumin, triglycerides and bilirubin increased with SCLmin, while alkaline phosphatase, lactate dehydrogenase and sodium decreased. In larger neritic Cape Canaveral turtles, aspartate aminotransferase increased with SCLmin, while the albumin:globulin ratio, phosphorus and cholesterol decreased. These differences suggest unique physiological disparities between life stage development and migration, reflecting biological and habitat differences between the two populations. This information presents biologically important data that is applicable to stranded individual turtles and to the population level, a tool for the development of conservation strategies, and a baseline for future temporal and spatial investigations of the Azorean loggerhead sea turtle population. PMID:29479433

  14. Molecularly imprinted sol-gel nanofibers based solid phase microextraction coupled on-line with high performance liquid chromatography for selective determination of acesulfame.

    PubMed

    Moein, Mohammad Mahdi; Javanbakht, Mehran; Karimi, Mohammad; Akbari-Adergani, Behrouz

    2015-03-01

    Sol-gel based molecularly imprinted polymer (MIP) nanofiber was successfully fabricated by an electrospinning technique on the surface of a stainless steel bar. The manufactured tool was applied for on-line selective solid phase microextraction (SPME) and determination of acesulfame (ACF) as an artificial sweetener with high performance liquid chromatography (HPLC). The selective ability of the method for the extraction of ACF was investigated in the presence of some selected sweeteners such as saccharine (SCH), aspartame (ASP) and caffeine (CAF). Electrospinning of MIP sol-gel solution on the stainless steel bar provided an unbreakable sorbent with high thermal, mechanical, and chemical stability. Moreover, application of the MIP-SPME tool revealed a unique approach for the selective microextraction of the analyte in beverage samples. In this work, 3-(triethoxysilyl)-propylamine (TMSPA) was chosen as a precursor due to its ability to imprint the analyte by hydrogen bonding, van der Waals, and dipole-dipole interactions. Nylon 6 was also added as a backbone and support for the precursor, in which the sol could grow during the sol-gel process, making the solution electrospinnable. Various parameters affecting the extraction efficiency of the MIP-SPME tool, such as loading time, flow rate, desorption time, selectivity, and the sample volume, were evaluated. The linearity for ACF in beverage samples was in the range of 0.78-100.5 ng mL(-1). Limits of detection (LOD) and quantification (LOQ) were 0.23 and 0.78 ng mL(-1), respectively. The RSD values (n=5) were all below 3.5% at the 20 ng mL(-1) level. Copyright © 2014 Elsevier B.V. All rights reserved.

  15. Web Analytics

    EPA Pesticide Factsheets

    EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.

  16. Development of the biology card sorting task to measure conceptual expertise in biology.

    PubMed

    Smith, Julia I; Combs, Elijah D; Nagami, Paul H; Alto, Valerie M; Goh, Henry G; Gourdet, Muryam A A; Hough, Christina M; Nickell, Ashley E; Peer, Adrian G; Coley, John D; Tanner, Kimberly D

    2013-01-01

    There are widespread aspirations to focus undergraduate biology education on teaching students to think conceptually like biologists; however, there is a dearth of assessment tools designed to measure progress from novice to expert biological conceptual thinking. We present the development of a novel assessment tool, the Biology Card Sorting Task, designed to probe how individuals organize their conceptual knowledge of biology. While modeled on tasks from cognitive psychology, this task is unique in its design to test two hypothesized conceptual frameworks for the organization of biological knowledge: 1) a surface feature organization focused on organism type and 2) a deep feature organization focused on fundamental biological concepts. In this initial investigation of the Biology Card Sorting Task, each of six analytical measures showed statistically significant differences when used to compare the card sorting results of putative biological experts (biology faculty) and novices (non-biology major undergraduates). Consistently, biology faculty appeared to sort based on hypothesized deep features, while non-biology majors appeared to sort based on either surface features or nonhypothesized organizational frameworks. Results suggest that this novel task is robust in distinguishing populations of biology experts and biology novices and may be an adaptable tool for tracking emerging biology conceptual expertise.

  17. A graph algebra for scalable visual analytics.

    PubMed

    Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V

    2012-01-01

    Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide these tools' creation. Increasing data demands on computing require redesigning VA tools to consider performance and reliability in the context of analysis of exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data.
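
    As a rough illustration of the two atomic operators named above, the sketch below expresses selection and aggregation over a graph with networkx; the function names and the toy graph are assumptions, not the authors' framework.

```python
# A minimal sketch (not the authors' framework) of the two atomic operators
# the abstract names -- selection and aggregation -- expressed with networkx.
# The function names and the "club" attribute are illustrative assumptions.
import networkx as nx

def select(graph, predicate):
    """Selection: keep only the nodes whose attribute dict satisfies a predicate."""
    keep = [n for n, data in graph.nodes(data=True) if predicate(data)]
    return graph.subgraph(keep).copy()

def aggregate(graph, key):
    """Aggregation: collapse nodes sharing an attribute value into super-nodes."""
    summary = nx.Graph()
    for u, v in graph.edges():
        gu, gv = graph.nodes[u][key], graph.nodes[v][key]
        if gu != gv:
            w = summary.get_edge_data(gu, gv, default={"weight": 0})["weight"]
            summary.add_edge(gu, gv, weight=w + 1)
    return summary

G = nx.karate_club_graph()                            # toy graph with a "club" attribute
subset = select(G, lambda d: d["club"] == "Mr. Hi")   # selection operator
print(subset.number_of_nodes(), "nodes selected")
print(aggregate(G, key="club").edges(data=True))      # aggregation operator
```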

  18. Practical solution for control of the pre-analytical phase in decentralized clinical laboratories for meeting the requirements of the medical laboratory accreditation standard DIN EN ISO 15189.

    PubMed

    Vacata, Vladimir; Jahns-Streubel, Gerlinde; Baldus, Mirjana; Wood, William Graham

    2007-01-01

    This report was written in response to the article by Wood published recently in this journal. It describes a practical solution to the problems of controlling the pre-analytical phase in the clinical diagnostic laboratory. As an indicator of quality in the pre-analytical phase of sample processing, a target analyte was chosen which is sensitive to delay in centrifugation and/or analysis. The results of analyses of the samples sent by satellite medical practitioners were compared with those from an on-site hospital laboratory with a controllable optimized pre-analytical phase. The aim of the comparison was: (a) to identify those medical practices whose mean/median sample values significantly deviate from those of the control situation in the hospital laboratory due to the possible problems in the pre-analytical phase; (b) to aid these laboratories in the process of rectifying these problems. A Microsoft Excel-based Pre-Analytical Survey tool (PAS tool) has been developed which addresses the above mentioned problems. It has been tested on serum potassium which is known to be sensitive to delay and/or irregularities in sample treatment. The PAS tool has been shown to be one possibility for improving the quality of the analyses by identifying the sources of problems within the pre-analytical phase, thus allowing them to be rectified. Additionally, the PAS tool has an educational value and can also be adopted for use in other decentralized laboratories.
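
    A minimal sketch of the comparison logic described above is given below: per-sender medians of a delay-sensitive analyte (serum potassium) are compared against the on-site laboratory's median and flagged beyond an assumed tolerance. The data, tolerance, and names are illustrative, not the published PAS tool.

```python
# Minimal sketch, not the published PAS tool: flag sender practices whose
# median serum potassium deviates from the on-site laboratory's median by
# more than an (assumed) tolerance, as a crude pre-analytical quality check.
from statistics import median

hospital_potassium = [4.1, 4.3, 3.9, 4.0, 4.2]          # mmol/L, illustrative
practice_results = {
    "practice_A": [4.2, 4.0, 4.4, 4.1],
    "practice_B": [5.1, 5.4, 4.9, 5.2],                  # pattern suggesting delayed centrifugation
}
TOLERANCE = 0.4  # mmol/L, assumed cut-off

reference = median(hospital_potassium)
for practice, values in practice_results.items():
    deviation = median(values) - reference
    status = "review pre-analytical handling" if abs(deviation) > TOLERANCE else "ok"
    print(f"{practice}: median deviation {deviation:+.2f} mmol/L -> {status}")
```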

  19. EPA Tools and Resources Webinar: EPA’s Environmental Sampling and Analytical Methods for Environmental Remediation and Recovery

    EPA Pesticide Factsheets

    EPA’s Environmental Sampling and Analytical Methods (ESAM) is a website tool that supports the entire environmental characterization process from collection of samples all the way to their analyses.

  20. Agricultural trade networks and patterns of economic development.

    PubMed

    Shutters, Shade T; Muneepeerakul, Rachata

    2012-01-01

    International trade networks are manifestations of a complex combination of diverse underlying factors, both natural and social. Here we apply social network analytics to the international trade network of agricultural products to better understand the nature of this network and its relation to patterns of international development. Using a network tool known as triadic analysis we develop triad significance profiles for a series of agricultural commodities traded among countries. Results reveal a novel network "superfamily" combining properties of biological information processing networks and human social networks. To better understand this unique network signature, we examine in more detail the degree and triadic distributions within the trade network by country and commodity. Our results show that countries fall into two very distinct classes based on their triadic frequencies. Roughly 165 countries fall into one class while 18, all highly isolated with respect to international agricultural trade, fall into the other. Only Vietnam stands out as a unique case. Finally, we show that as a country becomes less isolated with respect to number of trading partners, the country's triadic signature follows a predictable trajectory that may correspond to a trajectory of development.
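
    The sketch below illustrates the basic building block of such a triadic analysis, a directed triad census computed with networkx; the toy trade edges are invented, and a full triad significance profile would additionally compare these counts against randomized networks, which is omitted here.

```python
# Sketch of a triad census on a small directed trade network using networkx;
# the edges are illustrative, not the paper's data, and no comparison against
# randomized networks (needed for a significance profile) is performed.
import networkx as nx

trade = nx.DiGraph()
trade.add_edges_from([
    ("USA", "VNM"), ("VNM", "USA"),
    ("USA", "BRA"), ("BRA", "CHN"), ("CHN", "USA"),
])

census = nx.triadic_census(trade)   # counts of the 16 directed triad types
for triad_type, count in sorted(census.items()):
    if count:
        print(triad_type, count)
```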

  1. Violent Video Game Effects on Aggression, Empathy, and Prosocial Behavior in Eastern and Western Countries: A Meta-Analytic Review

    ERIC Educational Resources Information Center

    Anderson, Craig A.; Shibuya, Akiko; Ihori, Nobuko; Swing, Edward L.; Bushman, Brad J.; Sakamoto, Akira; Rothstein, Hannah R.; Saleem, Muniba

    2010-01-01

    Meta-analytic procedures were used to test the effects of violent video games on aggressive behavior, aggressive cognition, aggressive affect, physiological arousal, empathy/desensitization, and prosocial behavior. Unique features of this meta-analytic review include (a) more restrictive methodological quality inclusion criteria than in past…

  2. Total Quality Management (TQM), an Overview

    DTIC Science & Technology

    1991-09-01

    Quality Management (TQM). It discusses the reasons TQM is a current growth industry, what it is, and how one implements it. It describes the basic analytical tools, statistical process control, some advanced analytical tools, tools used by process improvement teams to enhance their own operations, and action plans for making improvements. The final sections discuss assessing quality efforts and measuring the quality to knowledge

  3. Development of Advanced Life Prediction Tools for Elastic-Plastic Fatigue Crack Growth

    NASA Technical Reports Server (NTRS)

    Gregg, Wayne; McGill, Preston; Swanson, Greg; Wells, Doug; Throckmorton, D. A. (Technical Monitor)

    2001-01-01

    The objective of this viewgraph presentation is to develop a systematic approach to improving the fracture control process, including analytical tools, standards, guidelines, and awareness. Analytical tools are needed specifically for elastic-plastic fracture analysis, a regime that is currently treated empirically for the Space Shuttle External Tank (ET) and handled by simulated service testing of pre-cracked panels.

  4. Physiological and Anatomical Visual Analytics (PAVA) Background

    EPA Pesticide Factsheets

    The need to efficiently analyze human chemical disposition data from in vivo studies or in silico PBPK modeling efforts, and to see complex disposition data in a logical manner, has created a unique opportunity for visual analytics applied to PAD.

  5. Analytical framework and tool kit for SEA follow-up

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nilsson, Mans; Wiklund, Hans; Finnveden, Goeran

    2009-04-15

    Most Strategic Environmental Assessment (SEA) research and applications have so far neglected the ex post stages of the process, also called SEA follow-up. Tool kits and methodological frameworks for engaging effectively with SEA follow-up have been conspicuously missing. In particular, little has so far been learned from the much more mature evaluation literature although many aspects are similar. This paper provides an analytical framework and tool kit for SEA follow-up. It is based on insights and tools developed within programme evaluation and environmental systems analysis. It is also grounded in empirical studies into real planning and programming practices at the regional level, but should have relevance for SEA processes at all levels. The purpose of the framework is to promote a learning-oriented and integrated use of SEA follow-up in strategic decision making. It helps to identify appropriate tools and their use in the process, and to systematise the use of available data and knowledge across the planning organization and process. It distinguishes three stages in follow-up: scoping, analysis and learning, identifies the key functions and demonstrates the informational linkages to the strategic decision-making process. The associated tool kit includes specific analytical and deliberative tools. Many of these are applicable also ex ante, but are then used in a predictive mode rather than on the basis of real data. The analytical element of the framework is organized on the basis of programme theory and 'DPSIR' tools. The paper discusses three issues in the application of the framework: understanding the integration of organizations and knowledge; understanding planners' questions and analytical requirements; and understanding interests, incentives and reluctance to evaluate.

  6. A Visual Analytics Approach to Structured Data Analysis to Enhance Nonproliferation and Arms Control Verification Activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gillen, David S.

    Analysis activities for Nonproliferation and Arms Control verification require the use of many types of data. Tabular structured data, such as Excel spreadsheets and relational databases, have traditionally been used for data mining activities, where specific queries are issued against data to look for matching results. The application of visual analytics tools to structured data enables further exploration of datasets to promote discovery of previously unknown results. This paper discusses the application of a specific visual analytics tool to datasets related to the field of Arms Control and Nonproliferation to promote the use of visual analytics more broadly in this domain. Visual analytics focuses on analytical reasoning facilitated by interactive visual interfaces (Wong and Thomas 2004). It promotes exploratory analysis of data, and complements data mining technologies where known patterns can be mined for. Also, with a human in the loop, domain knowledge and subject matter expertise can be brought to bear. Visual analytics has not widely been applied to this domain. In this paper, we will focus on one type of data: structured data, and show the results of applying a specific visual analytics tool to answer questions in the Arms Control and Nonproliferation domain. We chose to use the T.Rex tool, a visual analytics tool developed at PNNL, which uses a variety of visual exploration patterns to discover relationships in structured datasets, including a facet view, graph view, matrix view, and timeline view. The facet view enables discovery of relationships between categorical information, such as countries and locations. The graph tool visualizes node-link relationship patterns, such as the flow of materials being shipped between parties. The matrix visualization shows highly correlated categories of information. The timeline view shows temporal patterns in data. In this paper, we will use T.Rex with two different datasets to demonstrate how interactive exploration of the data can aid an analyst with arms control and nonproliferation verification activities. Using a dataset from PIERS (PIERS 2014), we will show how container shipment imports and exports can aid an analyst in understanding the shipping patterns between two countries. We will also use T.Rex to examine a collection of research publications from the IAEA International Nuclear Information System (IAEA 2014) to discover collaborations of concern. We hope this paper will encourage the use of visual analytics for structured data analysis in the field of nonproliferation and arms control verification. Our paper outlines some of the challenges that exist before broad adoption of these kinds of tools can occur and offers next steps to overcome these challenges.

  7. Pilot testing of SHRP 2 reliability data and analytical products: Minnesota.

    DOT National Transportation Integrated Search

    2015-01-01

    The Minnesota pilot site has undertaken an effort to test data and analytical tools developed through the Strategic Highway Research Program (SHRP) 2 Reliability focus area. The purpose of these tools is to facilitate the improvement of travel time r...

  8. Verification of a SEU model for advanced 1-micron CMOS structures using heavy ions

    NASA Technical Reports Server (NTRS)

    Cable, J. S.; Carter, J. R.; Witteles, A. A.

    1986-01-01

    Modeling and test results are reported for 1 micron CMOS circuits. Analytical predictions are correlated with experimental data, and sensitivities to process and design variations are discussed. Unique features involved in predicting the SEU performance of these devices are described. The results show that the critical charge for upset exhibits a strong dependence on pulse width for very fast devices, and upset predictions must factor in the pulse shape. Acceptable SEU error rates can be achieved for a 1 micron bulk CMOS process. A thin retrograde well provides complete SEU immunity for N channel hits at normal incidence angle. Source interconnect resistance can be an important parameter in determining upset rates, and Cf-252 testing can be a valuable tool for cost-effective SEU testing.

  9. Compound Capillary Flows in Complex Containers: Drop Tower Test Results

    NASA Astrophysics Data System (ADS)

    Bolleddula, Daniel A.; Chen, Yongkang; Semerjian, Ben; Tavan, Noël; Weislogel, Mark M.

    2010-10-01

    Drop towers continue to provide unique capabilities to investigate capillary flow phenomena relevant to terrestrial and space-based capillary fluidics applications. In this study, certain 'capillary rise' flows and the value of drop tower experimental investigations are briefly reviewed. A new analytic solution for flows along planar interior edges is presented. A selection of test cell geometries are then discussed where compound capillary flows occur spontaneously and simultaneously over local and global length scales. Sample experimental results are provided. Tertiary experiments on a family of asymmetric geometries that isolate the global component of such flows are then presented along with a qualitative analysis that may be used to either avoid or exploit such flows. The latter may also serve as a design tool with which to assess the impact of inadvertent container asymmetry.

  10. [Review on the feeding ecology and migration patterns of sharks using stable isotopes].

    PubMed

    Li, Yun-Kai

    2014-09-01

    With its rapidly increasing use in ecology, stable isotope analysis (SIA) has become a powerful tool and a complement to traditional methods for investigating the trophic ecology of animals. Sharks play a keystone role in marine food webs as apex predators and have become a frontier topic of food web studies and marine conservation because of their unique evolutionary characteristics. Recently, SIA has been applied to trophic ecology studies of shark species. Here, we review the current applications of SIA to sharks, focusing on the tissues available for analysis, standardized analytical approaches, diet-tissue discrimination factors, diet shift investigation, prediction of migration patterns and niche-width analyses, with the aim of gaining a better understanding of stable-isotope dynamics in shark biology and ecology research.
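
    As a worked illustration of the kind of isotope-based diet inference covered in this review, the sketch below applies a simple two-source, one-isotope mixing model with an assumed diet-tissue discrimination factor; all delta values are invented for illustration.

```python
# Hedged illustration of a simple two-source, one-isotope mixing model of the
# kind used in trophic studies this review covers; the delta values and the
# diet-tissue discrimination factor (DTDF) below are assumptions.
d13C_shark_tissue = -16.0      # permil, consumer tissue (illustrative)
DTDF = 1.0                     # assumed diet-tissue discrimination factor (permil)
d13C_prey_pelagic = -18.5      # source 1 (illustrative)
d13C_prey_benthic = -14.5      # source 2 (illustrative)

diet_d13C = d13C_shark_tissue - DTDF   # back-correct the consumer value to diet
fraction_pelagic = (diet_d13C - d13C_prey_benthic) / (d13C_prey_pelagic - d13C_prey_benthic)
print(f"estimated pelagic prey contribution: {fraction_pelagic:.0%}")
```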

  11. SAM Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target chemical, radiochemical, pathogens, and biotoxin analytes in environmental samples can use this online query tool to identify analytical methods included in EPA's Selected Analytical Methods for Environmental Remediation

  12. Global Reach of Direct-to-Consumer Advertising Using Social Media for Illicit Online Drug Sales

    PubMed Central

    Liang, Bryan A

    2013-01-01

    Background Illicit or rogue Internet pharmacies are a recognized global public health threat that have been identified as utilizing various forms of online marketing and promotion, including social media. Objective To assess the accessibility of creating illicit no prescription direct-to-consumer advertising (DTCA) online pharmacy social media marketing (eDTCA2.0) and evaluate its potential global reach. Methods We identified the top 4 social media platforms allowing eDTCA2.0. After determining applicable platforms (ie, Facebook, Twitter, Google+, and MySpace), we created a fictitious advertisement advertising no prescription drugs online and posted it to the identified social media platforms. Each advertisement linked to a unique website URL that consisted of a site error page. Employing Web search analytics, we tracked the number of users visiting these sites and their location. We used commercially available Internet tools and services, including website hosting, domain registration, and website analytic services. Results Illicit online pharmacy social media content for Facebook, Twitter, and MySpace remained accessible despite highly questionable and potentially illegal content. Fictitious advertisements promoting illicit sale of drugs generated aggregate unique user traffic of 2795 visits over a 10-month period. Further, traffic to our websites originated from a number of countries, including high-income and middle-income countries, and emerging markets. Conclusions Our results indicate there are few barriers to entry for social media–based illicit online drug marketing. Further, illicit eDTCA2.0 has globalized outside US borders to other countries through unregulated Internet marketing. PMID:23718965

  13. Surface modified capillary electrophoresis combined with in solution isoelectric focusing and MALDI-TOF/TOF MS: a gel-free multidimensional electrophoresis approach for proteomic profiling--exemplified on human follicular fluid.

    PubMed

    Hanrieder, Jörg; Zuberovic, Aida; Bergquist, Jonas

    2009-04-24

    Development of miniaturized analytical tools continues to be of great interest to face the challenges in proteomic analysis of complex biological samples such as human body fluids. In the light of these challenges, special emphasis is put on the speed and simplicity of newly designed technological approaches as well as the need for cost efficiency and low sample consumption. In this study, we present an alternative multidimensional bottom-up approach for proteomic profiling for fast, efficient and sensitive protein analysis in complex biological matrices. The presented setup was based on sample pre-fractionation using microscale in solution isoelectric focusing (IEF) followed by tryptic digestion and subsequent capillary electrophoresis (CE) coupled off-line to matrix assisted laser desorption/ionization time of flight tandem mass spectrometry (MALDI TOF MS/MS). For high performance CE-separation, PolyE-323 modified capillaries were applied to minimize analyte-wall interactions. The potential of the analytical setup was demonstrated on human follicular fluid (hFF) representing a typical complex human body fluid with clinical implication. The obtained results show significant identification of 73 unique proteins (identified at 95% significance level), including mostly acute phase proteins but also protein identities that are well known to be extensively involved in follicular development.

  14. FPI: FM Success through Analytics

    ERIC Educational Resources Information Center

    Hickling, Duane

    2013-01-01

    The APPA Facilities Performance Indicators (FPI) is perhaps one of the most powerful analytical tools that institutional facilities professionals have at their disposal. It is a diagnostic facilities performance management tool that addresses the essential questions that facilities executives must answer to effectively perform their roles. It…

  15. Modification and validation of an analytical source model for external beam radiotherapy Monte Carlo dose calculations.

    PubMed

    Davidson, Scott E; Cui, Jing; Kry, Stephen; Deasy, Joseph O; Ibbott, Geoffrey S; Vicic, Milos; White, R Allen; Followill, David S

    2016-08-01

    A dose calculation tool, which combines the accuracy of the dose planning method (DPM) Monte Carlo code and the versatility of a practical analytical multisource model, which was previously reported, has been improved and validated for the Varian 6 and 10 MV linear accelerators (linacs). The calculation tool can be used to calculate doses in advanced clinical application studies. One shortcoming of current clinical trials that report dose from patient plans is the lack of a standardized dose calculation methodology. Because commercial treatment planning systems (TPSs) have their own dose calculation algorithms and the clinical trial participant who uses these systems is responsible for commissioning the beam model, variation exists in the reported calculated dose distributions. Today's modern linac is manufactured to tight specifications so that variability within a linac model is quite low. The expectation is that a single dose calculation tool for a specific linac model can be used to accurately recalculate dose from patient plans that have been submitted to the clinical trial community from any institution. The calculation tool would provide for a more meaningful outcome analysis. The analytical source model was described by a primary point source, a secondary extra-focal source, and a contaminant electron source. Off-axis energy softening and fluence effects were also included. Hyperbolic functions have been incorporated into the model to correct for the changes in output and in electron contamination with field size. A multileaf collimator (MLC) model is included to facilitate phantom and patient dose calculations. An offset to the MLC leaf positions was used to correct for the rudimentary assumed primary point source. Dose calculations of the depth dose and profiles for field sizes 4 × 4 to 40 × 40 cm agree with measurement within 2% of the maximum dose or 2 mm distance to agreement (DTA) for 95% of the data points tested. The model was capable of predicting the depth of the maximum dose within 1 mm. Anthropomorphic phantom benchmark testing of modulated and patterned MLC treatment plans showed agreement with measurement within 3% in target regions using thermoluminescent dosimeters (TLD). Using radiochromic film normalized to TLD, a gamma criterion of 3% of maximum dose and 2 mm DTA was applied with a pass rate of at least 85% in the high dose, high gradient, and low dose regions. Finally, recalculations of patient plans using DPM showed good agreement relative to a commercial TPS when comparing dose volume histograms and 2D dose distributions. A unique analytical source model coupled to the dose planning method Monte Carlo dose calculation code has been modified and validated using basic beam data and anthropomorphic phantom measurement. While this tool can be applied in general use for a particular linac model, specifically it was developed to provide a singular methodology to independently assess treatment plan dose distributions from those clinical institutions participating in National Cancer Institute trials.
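
    Purely as an illustration of the hyperbolic field-size correction mentioned above, the sketch below fits a tanh-shaped output-factor curve to made-up data; the functional form, parameters, and values are assumptions and do not reproduce the published source model.

```python
# Illustrative only: fit a hyperbolic (tanh) output-factor correction of the
# general kind the abstract mentions. The functional form, parameters, and
# data points are assumptions, not the published source model.
import numpy as np
from scipy.optimize import curve_fit

def output_factor(field_size_cm, a, b, c):
    return a + b * np.tanh(c * field_size_cm)

fs = np.array([4.0, 6.0, 10.0, 20.0, 30.0, 40.0])             # square field sizes (cm)
of_measured = np.array([0.95, 0.97, 1.00, 1.04, 1.06, 1.07])  # made-up relative outputs

params, _ = curve_fit(output_factor, fs, of_measured, p0=[1.0, 0.1, 0.1])
print("fitted a, b, c:", params)
print("predicted OF at 15 cm:", output_factor(15.0, *params))
```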

  16. Analytical Chemistry Laboratory

    NASA Technical Reports Server (NTRS)

    Anderson, Mark

    2013-01-01

    The Analytical Chemistry and Material Development Group maintains a capability in chemical analysis, materials R&D failure analysis and contamination control. The uniquely qualified staff and facility support the needs of flight projects, science instrument development and various technical tasks, as well as Cal Tech.

  17. Bibliometric mapping: eight decades of analytical chemistry, with special focus on the use of mass spectrometry.

    PubMed

    Waaijer, Cathelijn J F; Palmblad, Magnus

    2015-01-01

    In this Feature we use automatic bibliometric mapping tools to visualize the history of analytical chemistry from the 1920s until the present. In particular, we have focused on the application of mass spectrometry in different fields. The analysis shows major shifts in research focus and use of mass spectrometry. We conclude by discussing the application of bibliometric mapping and visualization tools in analytical chemists' research.

  18. SAM Pathogen Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target pathogen analytes in environmental samples can use this online query tool to identify analytical methods in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select pathogens.

  19. Note: Model identification and analysis of bivalent analyte surface plasmon resonance data.

    PubMed

    Tiwari, Purushottam Babu; Üren, Aykut; He, Jin; Darici, Yesim; Wang, Xuewen

    2015-10-01

    Surface plasmon resonance (SPR) is a widely used, affinity based, label-free biophysical technique to investigate biomolecular interactions. The extraction of rate constants requires accurate identification of the particular binding model. The bivalent analyte model involves coupled non-linear differential equations. No clear procedure to identify the bivalent analyte mechanism has been established. In this report, we propose a unique signature for the bivalent analyte model. This signature can be used to distinguish the bivalent analyte model from other biphasic models. The proposed method is demonstrated using experimentally measured SPR sensorgrams.
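
    A minimal sketch of the coupled, non-linear rate equations typically written for a bivalent analyte binding an immobilized ligand is shown below; the rate constants, concentrations, and the exact statistical-factor convention are assumptions for illustration, not values or conventions taken from this note.

```python
# A minimal sketch of the coupled ODEs for a bivalent-analyte binding model
# (analyte A with two sites binding immobilized ligand B). Rate constants,
# concentrations, and the statistical-factor convention are assumptions.
import numpy as np
from scipy.integrate import solve_ivp

ka1, kd1 = 1e5, 1e-2        # first binding step (1/M/s, 1/s), assumed
ka2, kd2 = 1e-3, 1e-3       # second (cross-linking) step (1/RU/s, 1/s), assumed
C, Bmax = 50e-9, 100.0      # analyte concentration (M) and surface capacity (RU), assumed

def bivalent(t, y):
    AB, AB2 = y
    B = Bmax - AB - 2.0 * AB2                         # free ligand sites
    dAB = ka1 * C * B - kd1 * AB - ka2 * AB * B + kd2 * AB2
    dAB2 = ka2 * AB * B - kd2 * AB2
    return [dAB, dAB2]

sol = solve_ivp(bivalent, (0.0, 300.0), [0.0, 0.0], dense_output=True)
t = np.linspace(0.0, 300.0, 7)
AB, AB2 = sol.sol(t)
print("response (RU):", np.round(AB + AB2, 2))        # sensorgram ~ total bound analyte
```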

  20. The role of analytical chemistry in Niger Delta petroleum exploration: a review.

    PubMed

    Akinlua, Akinsehinwa

    2012-06-12

    Petroleum and organic matter from which the petroleum is derived are composed of organic compounds with some trace elements. These compounds give an insight into the origin, thermal maturity and paleoenvironmental history of petroleum, which are essential elements in petroleum exploration. Analytical techniques are the main tools used to acquire these geochemical data. Due to progress in the development of new analytical techniques, many long-standing petroleum exploration problems have been resolved. Analytical chemistry has played a significant role in the development of the petroleum resources of the Niger Delta. Various analytical techniques that have aided the success of petroleum exploration in the Niger Delta are discussed. The analytical techniques that have helped to understand the petroleum system of the basin are also described. Recent and emerging analytical methodologies, including green analytical methods, applicable to petroleum exploration, particularly in the Niger Delta petroleum province, are also discussed in this paper. Analytical chemistry is an invaluable tool in finding the Niger Delta oils. Copyright © 2011 Elsevier B.V. All rights reserved.

  1. Development of computer-based analytical tool for assessing physical protection system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mardhi, Alim, E-mail: alim-m@batan.go.id; Chulalongkorn University, Faculty of Engineering, Nuclear Engineering Department, 254 Phayathai Road, Pathumwan, Bangkok Thailand. 10330; Pengvanich, Phongphaeth, E-mail: ppengvan@gmail.com

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts directed at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a practical way to assess likely threat scenarios. Several tools, such as EASI and SAPE, are currently available for immediate use; however, for our research purposes it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool that utilizes a network methodology for modelling adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool has the capability to analyze the most critical path and to quantify the probability of system effectiveness as a performance measure.

  2. Control/structure interaction conceptual design tool

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.

    1990-01-01

    The JPL Control/Structure Interaction Program is developing new analytical methods for designing micro-precision spacecraft with controlled structures. One of these, the Conceptual Design Tool, will illustrate innovative new approaches to the integration of multi-disciplinary analysis and design methods. The tool will be used to demonstrate homogeneity of presentation, uniform data representation across analytical methods, and integrated systems modeling. The tool differs from current 'integrated systems' that support design teams most notably in its support for the new CSI multi-disciplinary engineer. The design tool will utilize a three dimensional solid model of the spacecraft under design as the central data organization metaphor. Various analytical methods, such as finite element structural analysis, control system analysis, and mechanical configuration layout, will store and retrieve data from a hierarchical, object oriented data structure that supports assemblies of components with associated data and algorithms. In addition to managing numerical model data, the tool will assist the designer in organizing, stating, and tracking system requirements.
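
    The sketch below is a hypothetical illustration of the hierarchical, object-oriented component structure described above; the class, attribute, and example names are invented and are not taken from the JPL tool.

```python
# Hypothetical sketch of the kind of hierarchical, object-oriented structure
# the abstract describes: assemblies of components, each carrying its own
# data and sub-assemblies. Class and attribute names are illustrative only.
from dataclasses import dataclass, field

@dataclass
class Component:
    name: str
    data: dict = field(default_factory=dict)       # e.g. model references, requirements
    children: list = field(default_factory=list)   # sub-assemblies / parts

    def add(self, child):
        self.children.append(child)
        return child

    def walk(self, depth=0):
        yield depth, self
        for child in self.children:
            yield from child.walk(depth + 1)

spacecraft = Component("spacecraft", {"requirement": "micro-precision pointing"})
bus = spacecraft.add(Component("bus", {"mass_kg": 850}))
optics = spacecraft.add(Component("optical bench", {"controller": "LQG"}))
optics.add(Component("metering truss", {"fem_nodes": 1200}))

for depth, node in spacecraft.walk():
    print("  " * depth + f"{node.name}: {node.data}")
```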

  3. Development of computer-based analytical tool for assessing physical protection system

    NASA Astrophysics Data System (ADS)

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts directed at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a practical way to assess likely threat scenarios. Several tools, such as EASI and SAPE, are currently available for immediate use; however, for our research purposes it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool that utilizes a network methodology for modelling adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool has the capability to analyze the most critical path and to quantify the probability of system effectiveness as a performance measure.
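
    As a rough sketch of the network approach described in this record (and in the OSTI version above), the code below models a facility as a directed graph whose edges carry detection probabilities and finds the most critical adversary path, the one most likely to go undetected; the layout, probabilities, and the specific path metric are assumptions, not the authors' implementation.

```python
# A rough sketch (assumptions, not the authors' tool) of the network idea in
# the abstract: model the facility as a graph whose edges carry a detection
# probability, and find the adversary path most likely to evade detection by
# minimizing sum(-log(1 - p_detect)) with Dijkstra.
import math
import networkx as nx

facility = nx.DiGraph()
edges = [  # (from, to, probability of detection on that segment), illustrative
    ("outside", "fence", 0.5), ("fence", "door", 0.7),
    ("outside", "gate", 0.3), ("gate", "door", 0.4),
    ("door", "target", 0.8),
]
for u, v, p in edges:
    facility.add_edge(u, v, weight=-math.log(1.0 - p), p=p)

path = nx.dijkstra_path(facility, "outside", "target")
p_nondetect = math.prod(1.0 - facility[u][v]["p"] for u, v in zip(path, path[1:]))
print("most critical path:", " -> ".join(path))
print("probability the whole path goes undetected: %.3f" % p_nondetect)
print("system detection effectiveness along it: %.3f" % (1.0 - p_nondetect))
```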

  4. Active controls: A look at analytical methods and associated tools

    NASA Technical Reports Server (NTRS)

    Newsom, J. R.; Adams, W. M., Jr.; Mukhopadhyay, V.; Tiffany, S. H.; Abel, I.

    1984-01-01

    A review of analytical methods and associated tools for active controls analysis and design problems is presented. Approaches employed to develop mathematical models suitable for control system analysis and/or design are discussed. Significant efforts have been expended to develop tools to generate the models from the standpoint of control system designers' needs and develop the tools necessary to analyze and design active control systems. Representative examples of these tools are discussed. Examples where results from the methods and tools have been compared with experimental data are also presented. Finally, a perspective on future trends in analysis and design methods is presented.

  5. SAM Biotoxin Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target biotoxin analytes in environmental samples can use this online query tool to identify analytical methods included in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select biotoxins.

  6. SAM Chemical Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target chemical, radiochemical, pathogens, and biotoxin analytes in environmental samples can use this online query tool to identify analytical methods in EPA's Selected Analytical Methods for Environmental Remediation and Recovery

  7. Using Learning Analytics to Support Engagement in Collaborative Writing

    ERIC Educational Resources Information Center

    Liu, Ming; Pardo, Abelardo; Liu, Li

    2017-01-01

    Online collaborative writing tools provide an efficient way to complete a writing task. However, existing tools only focus on technological affordances and ignore the importance of social affordances in a collaborative learning environment. This article describes a learning analytic system that analyzes writing behaviors, and creates…

  8. Application of Learning Analytics Using Clustering Data Mining for Students' Disposition Analysis

    ERIC Educational Resources Information Center

    Bharara, Sanyam; Sabitha, Sai; Bansal, Abhay

    2018-01-01

    Learning Analytics (LA) is an emerging field in which sophisticated analytic tools are used to improve learning and education. It draws from, and is closely tied to, a series of other fields of study like business intelligence, web analytics, academic analytics, educational data mining, and action analytics. The main objective of this research…

  9. Theory-Led Design of Instruments and Representations in Learning Analytics: Developing a Novel Tool for Orchestration of Online Collaborative Learning

    ERIC Educational Resources Information Center

    Kelly, Nick; Thompson, Kate; Yeoman, Pippa

    2015-01-01

    This paper describes theory-led design as a way of developing novel tools for learning analytics (LA). It focuses upon the domain of automated discourse analysis (ADA) of group learning activities to help an instructor to orchestrate online groups in real-time. The paper outlines the literature on the development of LA tools within the domain of…

  10. Evaluation methodology for comparing memory and communication of analytic processes in visual analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ragan, Eric D; Goodall, John R

    2014-01-01

    Provenance tools can help capture and represent the history of analytic processes. In addition to supporting analytic performance, provenance tools can be used to support memory of the process and communication of the steps to others. Objective evaluation methods are needed to evaluate how well provenance tools support analysts' memory and communication of analytic processes. In this paper, we present several methods for the evaluation of process memory, and we discuss the advantages and limitations of each. We discuss methods for determining a baseline process for comparison, and we describe various methods that can be used to elicit process recall, step ordering, and time estimations. Additionally, we discuss methods for conducting quantitative and qualitative analyses of process memory. By organizing possible memory evaluation methods and providing a meta-analysis of the potential benefits and drawbacks of different approaches, this paper can inform study design and encourage objective evaluation of process memory and communication.

  11. [Progress in the application of laser ablation ICP-MS to surface microanalysis in material science].

    PubMed

    Zhang, Yong; Jia, Yun-hai; Chen, Ji-wen; Shen, Xue-jing; Liu, Ying; Zhao, Leiz; Li, Dong-ling; Hang, Peng-cheng; Zhao, Zhen; Fan, Wan-lun; Wang, Hai-zhou

    2014-08-01

    In the present paper, the apparatus and theory of surface analysis are introduced, and progress in the application of laser ablation ICP-MS to microanalysis in the ferrous, nonferrous and semiconductor fields is reviewed in detail. Compared with traditional surface analytical tools, such as SEM/EDS (scanning electron microscopy/energy-dispersive spectroscopy), EPMA (electron probe microanalysis) and AES (Auger electron spectroscopy), its advantages are little or no sample preparation, spatial resolution adjustable to the analytical demand, multi-element analysis and high sensitivity. It is now a powerful complement to traditional surface analytical tools. As LA-ICP-MS technology matures, more and more analytical workers will use this powerful tool, and LA-ICP-MS may become as prominent in the elemental analysis field as LIBS (laser-induced breakdown spectroscopy).

  12. Novel Substrates as Sources of Ancient DNA: Prospects and Hurdles

    PubMed Central

    Green, Eleanor Joan

    2017-01-01

    Following the discovery in the late 1980s that hard tissues such as bones and teeth preserve genetic information, the field of ancient DNA analysis has typically concentrated upon these substrates. The onset of high-throughput sequencing, combined with optimized DNA recovery methods, has enabled the analysis of a myriad of ancient species and specimens worldwide, dating back to the Middle Pleistocene. Despite the growing sophistication of analytical techniques, the genetic analysis of substrates other than bone and dentine remains comparatively “novel”. Here, we review analyses of other biological substrates which offer great potential for elucidating phylogenetic relationships, paleoenvironments, and microbial ecosystems, including (1) archaeological artifacts and ecofacts; (2) calcified and/or mineralized biological deposits; and (3) biological and cultural archives. We conclude that there is a pressing need for more refined models of DNA preservation and bespoke tools for DNA extraction and analysis to authenticate and maximize the utility of the data obtained. With such tools in place the potential for neglected or underexploited substrates to provide a unique insight into phylogenetics, microbial evolution and evolutionary processes will be realized. PMID:28703741

  13. Molecular Imprinting Technology in Quartz Crystal Microbalance (QCM) Sensors.

    PubMed

    Emir Diltemiz, Sibel; Keçili, Rüstem; Ersöz, Arzu; Say, Rıdvan

    2017-02-24

    Molecularly imprinted polymers (MIPs) as artificial antibodies have received considerable scientific attention in the past years in the field of (bio)sensors since they have unique features that distinguish them from natural antibodies such as robustness, multiple binding sites, low cost, facile preparation and high stability under extreme operation conditions (higher pH and temperature values, etc.). On the other hand, the Quartz Crystal Microbalance (QCM) is an analytical tool based on the measurement of small mass changes on the sensor surface. QCM sensors are practical and convenient monitoring tools because of their specificity, sensitivity, high accuracy, stability and reproducibility. QCM devices are highly suitable for converting the recognition process achieved using MIP-based memories into a sensor signal. Therefore, the combination of a QCM and MIPs as synthetic receptors enhances the sensitivity through MIP process-based multiplexed binding sites using size, 3D-shape and chemical function having molecular memories of the prepared sensor system toward the target compound to be detected. This review aims to highlight and summarize the recent progress and studies in the field of (bio)sensor systems based on QCMs combined with molecular imprinting technology.
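
    A small worked example of the Sauerbrey relation commonly used to convert a QCM frequency shift into an areal mass change is given below; the crystal constants are typical textbook values for a 5 MHz AT-cut quartz crystal and the frequency shift is invented, none of it taken from this review.

```python
# Worked example of the Sauerbrey relation often used to turn a QCM frequency
# shift into an areal mass change. Constants are typical textbook values for
# AT-cut quartz; the frequency shift is an illustrative assumption.
f0 = 5.0e6            # fundamental resonance frequency (Hz)
rho_q = 2.648         # quartz density (g/cm^3)
mu_q = 2.947e11       # quartz shear modulus (g/(cm*s^2))
delta_f = -12.0       # measured frequency shift (Hz), illustrative

# Sauerbrey: delta_f = -2 * f0**2 * (delta_m / A) / sqrt(rho_q * mu_q)
# Rearranged for the areal mass change delta_m / A (g/cm^2):
delta_m_per_area = -delta_f * (rho_q * mu_q) ** 0.5 / (2.0 * f0 ** 2)
print(f"areal mass uptake: {delta_m_per_area * 1e9:.0f} ng/cm^2")
```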

  14. DataUp 2.0: Improving On a Tool For Helping Researchers Archive, Manage, and Share Their Tabular Data

    NASA Astrophysics Data System (ADS)

    Strasser, C.; Borda, S.; Cruse, P.; Kunze, J.

    2013-12-01

    There are many barriers to data management and sharing among earth and environmental scientists; among the most significant are a lack of knowledge about best practices for data management, metadata standards, or appropriate data repositories for archiving and sharing data. Last year we developed an open source web application, DataUp, to help researchers overcome these barriers. DataUp helps scientists to (1) determine whether their file is CSV compatible, (2) generate metadata in a standard format, (3) retrieve an identifier to facilitate data citation, and (4) deposit their data into a repository. With funding from the NSF via a supplemental grant to the DataONE project, we are working to improve upon DataUp. Our main goal for DataUp 2.0 is to ensure organizations and repositories are able to adopt and adapt DataUp to meet their unique needs, including connecting to analytical tools, adding new metadata schema, and expanding the list of connected data repositories. DataUp is a collaborative project between the California Digital Library, DataONE, the San Diego Supercomputing Center, and Microsoft Research Connections.

  15. The neXtProt peptide uniqueness checker: a tool for the proteomics community.

    PubMed

    Schaeffer, Mathieu; Gateau, Alain; Teixeira, Daniel; Michel, Pierre-André; Zahn-Zabal, Monique; Lane, Lydie

    2017-11-01

    The neXtProt peptide uniqueness checker allows scientists to define which peptides can be used to validate the existence of human proteins, i.e. map uniquely versus multiply to human protein sequences taking into account isobaric substitutions, alternative splicing and single amino acid variants. The pepx program is available at https://github.com/calipho-sib/pepx and can be launched from the command line or through a cgi web interface. Indexing requires a sequence file in FASTA format. The peptide uniqueness checker tool is freely available on the web at https://www.nextprot.org/tools/peptide-uniqueness-checker and from the neXtProt API at https://api.nextprot.org/. lydie.lane@sib.swiss. © The Author(s) 2017. Published by Oxford University Press.
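
    The simplified sketch below captures the core of such a uniqueness check: counting how many sequences in a FASTA file contain a peptide while treating isoleucine and leucine as indistinguishable; the file path and peptide are placeholders, and the sketch does not reproduce pepx indexing, splice variants, or single amino acid variants.

```python
# A simplified sketch of the uniqueness check the abstract describes: count
# how many protein sequences in a FASTA file contain a given peptide,
# treating isoleucine and leucine as indistinguishable (isobaric). It does
# not reproduce pepx indexing, splice variants, or single amino acid variants.

def load_fasta(path):
    seqs, header, chunks = {}, None, []
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if line.startswith(">"):
                if header:
                    seqs[header] = "".join(chunks)
                header, chunks = line[1:].split()[0], []
            elif line:
                chunks.append(line)
    if header:
        seqs[header] = "".join(chunks)
    return seqs

def mapping_count(peptide, proteome):
    iso = lambda s: s.upper().replace("I", "L")   # I and L are isobaric in MS
    needle = iso(peptide)
    return sum(1 for seq in proteome.values() if needle in iso(seq))

proteome = load_fasta("human_proteome.fasta")     # placeholder file path
n = mapping_count("SLDDYNHLR", proteome)          # placeholder peptide
print("unique" if n == 1 else f"maps to {n} sequences")
```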

  16. Might "Unique" Factors Be "Common"? On the Possibility of Indeterminate Common-Unique Covariances

    ERIC Educational Resources Information Center

    Grayson, Dave

    2006-01-01

    The present paper shows that the usual factor analytic structured data dispersion matrix lambda psi lambda' + delta can readily arise from a set of scores y = lambda eta + epsilon, where the "common" (eta) and "unique" (epsilon) factors have nonzero covariance: gamma = Cov(epsilon, eta) is not equal to 0. Implications of this finding are discussed…
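
    To make the covariance algebra behind this claim concrete, the LaTeX fragment below sketches the dispersion matrix implied when the common and unique factors covary, in the abstract's notation (lambda, psi, delta, gamma); it is a reconstruction of the standard algebra, not the paper's own derivation.

```latex
% Model: y = \Lambda\eta + \epsilon with Cov(\eta)=\Psi, Cov(\epsilon)=\Delta,
% and Cov(\epsilon,\eta)=\Gamma, which is allowed to be nonzero.
\begin{align*}
\operatorname{Cov}(y)
  &= \Lambda \Psi \Lambda' + \Lambda \Gamma' + \Gamma \Lambda' + \Delta .
\end{align*}
% Only Cov(y) is observed, so different parameter sets (\Lambda,\Psi,\Gamma,\Delta),
% including ones with \Gamma \neq 0, can reproduce a matrix of the familiar form
% \Lambda\Psi\Lambda' + \Delta; this is the indeterminacy the abstract refers to.
```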

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, D.W.; Heinrich, R.R.; Graczyk, D.G.

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for fiscal year 1988 (October 1987 through September 1988). The Analytical Chemistry Laboratory is a full-cost recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. In addition, the ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, D.W.; Heinrich, R.R.; Graczyk, D.G.

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year 1989 (October 1988 through September 1989). The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. In addition, the ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques.

  19. Total analysis systems with Thermochromic Etching Discs technology.

    PubMed

    Avella-Oliver, Miquel; Morais, Sergi; Carrascosa, Javier; Puchades, Rosa; Maquieira, Ángel

    2014-12-16

    A new analytical system based on Thermochromic Etching Discs (TED) technology is presented. TED comprises a number of attractive features such as track independency, selective irradiation, a high power laser, and the capability to create useful assay platforms. The analytical versatility of this tool opens up a wide range of possibilities to design new compact disc-based total analysis systems applicable in chemistry and life sciences. In this paper, TED analytical implementation is described and discussed, and their analytical potential is supported by several applications. Microarray immunoassay, immunofiltration assay, solution measurement, and cell culture approaches are herein addressed in order to demonstrate the practical capacity of this system. The analytical usefulness of TED technology is herein demonstrated, describing how to exploit this tool for developing truly integrated analytical systems that provide solutions within the point of care framework.

  20. “RaMassays”: Synergistic Enhancement of Plasmon-Free Raman Scattering and Mass Spectrometry for Multimodal Analysis of Small Molecules

    NASA Astrophysics Data System (ADS)

    Alessandri, Ivano; Vassalini, Irene; Bertuzzi, Michela; Bontempi, Nicolò; Memo, Maurizio; Gianoncelli, Alessandra

    2016-10-01

    SiO2/TiO2 core/shell (T-rex) beads were exploited as “all-in-one” building-block materials to create analytical assays that combine plasmon-free surface enhanced Raman scattering (SERS) and surface assisted laser desorption/ionization (SALDI) mass spectrometry (RaMassays). Such a multi-modal approach relies on the unique optical properties of T-rex beads, which are able to harvest and manage light in both UV and Vis range, making ionization and Raman scattering more efficient. RaMassays were successfully applied to the detection of small (molecular weight, M.W. <400 Da) molecules with a key relevance in biochemistry and pharmaceutical analysis. Caffeine and cocaine were utilized as molecular probes to test the combined SERS/SALDI response of RaMassays, showing excellent sensitivity and reproducibility. The differentiation between amphetamine/ephedrine and theophylline/theobromine couples demonstrated the synergistic reciprocal reinforcement of SERS and SALDI. Finally, the conversion of L-tyrosine in L-DOPA was utilized to probe RaMassays as analytical tools for characterizing reaction intermediates without introducing any spurious effects. RaMassays exhibit important advantages over plasmonic nanoparticles in terms of reproducibility, absence of interference and potential integration in multiplexed devices.

  1. Metabolite profiling of a NIST Standard Reference Material for human plasma (SRM 1950): GC-MS, LC-MS, NMR, and clinical laboratory analyses, libraries, and web-based resources.

    PubMed

    Simón-Manso, Yamil; Lowenthal, Mark S; Kilpatrick, Lisa E; Sampson, Maureen L; Telu, Kelly H; Rudnick, Paul A; Mallard, W Gary; Bearden, Daniel W; Schock, Tracey B; Tchekhovskoi, Dmitrii V; Blonder, Niksa; Yan, Xinjian; Liang, Yuxue; Zheng, Yufang; Wallace, William E; Neta, Pedatsur; Phinney, Karen W; Remaley, Alan T; Stein, Stephen E

    2013-12-17

    Recent progress in metabolomics and the development of increasingly sensitive analytical techniques have renewed interest in global profiling, i.e., semiquantitative monitoring of all chemical constituents of biological fluids. In this work, we have performed global profiling of NIST SRM 1950, "Metabolites in Human Plasma", using GC-MS, LC-MS, and NMR. Metabolome coverage, difficulties, and reproducibility of the experiments on each platform are discussed. A total of 353 metabolites have been identified in this material. GC-MS provides 65 unique identifications, and most of the identifications from NMR overlap with the LC-MS identifications, except for some small sugars that are not directly found by LC-MS. Also, repeatability and intermediate precision analyses show that the SRM 1950 profiling is reproducible enough to consider this material as a good choice to distinguish between analytical and biological variability. Clinical laboratory data shows that most results are within the reference ranges for each assay. In-house computational tools have been developed or modified for MS data processing and interactive web display. All data and programs are freely available online at http://peptide.nist.gov/ and http://srmd.nist.gov/ .
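
    As an illustration of how cross-platform coverage like this can be tallied, the following is a minimal Python sketch that compares identification lists across platforms with set operations. The metabolite names are placeholders, not the actual SRM 1950 assignments or the NIST tools.

```python
# Minimal sketch: tallying overlapping vs. platform-unique metabolite
# identifications with set operations. The names below are placeholders,
# not the actual SRM 1950 assignments.
gc_ms = {"alanine", "glycine", "cholesterol", "palmitic acid"}
lc_ms = {"alanine", "glycine", "cholesterol", "sphingosine"}
nmr = {"alanine", "glycine", "glucose"}

all_ids = gc_ms | lc_ms | nmr
print(f"total identified: {len(all_ids)}")
print(f"unique to GC-MS:  {sorted(gc_ms - (lc_ms | nmr))}")
print(f"NMR not in LC-MS: {sorted(nmr - lc_ms)}")  # e.g., small sugars
```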

  2. UV-Vis as quantification tool for solubilized lignin following a single-shot steam process.

    PubMed

    Lee, Roland A; Bédard, Charles; Berberi, Véronique; Beauchet, Romain; Lavoie, Jean-Michel

    2013-09-01

    In this short communication, UV/Vis was used as an analytical tool for the quantification of lignin concentrations in aqueous media. A significant correlation was determined between absorbance and the concentration of lignin in solution. For this study, lignin was produced from different types of biomass (willow, aspen, softwood, canary grass and hemp) using steam processes. Quantification was performed at 212, 225, 237, 270, 280 and 287 nm. UV-Vis quantification of lignin was found suitable for different types of biomass, making this a timesaving analytical system that could serve as a Process Analytical Tool (PAT) in biorefineries using steam processes or comparable approaches. Copyright © 2013 Elsevier Ltd. All rights reserved.
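
    The absorbance-concentration correlation described above is, in essence, a Beer-Lambert-style linear calibration at a fixed wavelength. Below is a minimal Python sketch of such a calibration; the standard concentrations and absorbances are illustrative values, not the paper's data.

```python
import numpy as np

# Illustrative calibration standards (g/L) and absorbances at 280 nm;
# the numbers are made up for demonstration, not the study's measurements.
conc = np.array([0.05, 0.10, 0.20, 0.40, 0.80])
abs_280 = np.array([0.11, 0.22, 0.45, 0.88, 1.74])

# Least-squares fit of A = slope * c + intercept (Beer-Lambert behaviour)
slope, intercept = np.polyfit(conc, abs_280, 1)
r = np.corrcoef(conc, abs_280)[0, 1]
print(f"slope = {slope:.3f} L/g, intercept = {intercept:.3f}, r^2 = {r**2:.4f}")

# Estimate the concentration of an unknown extract from its absorbance
unknown_abs = 0.60
print(f"estimated lignin: {(unknown_abs - intercept) / slope:.3f} g/L")
```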

  3. The Role of Transport Phenomena in Whispering Gallery Mode Optical Biosensor Performance

    NASA Astrophysics Data System (ADS)

    Gamba, Jason

    Whispering gallery mode (WGM) optical resonator sensors have emerged as promising tools for label-free detection of biomolecules in solution. These devices have even demonstrated single-molecule limits of detection in complex biological fluids. This extraordinary sensitivity makes them ideal for low-concentration analytical and diagnostic measurements, but a great deal of work must be done toward understanding and optimizing their performance before they are capable of reliable quantitative measurements. The present work explores the physical processes behind this extreme sensitivity and how to best take advantage of them for practical applications of this technology. I begin by examining the nature of the interaction between the intense electromagnetic fields that build up in the optical biosensor and the biomolecules that bind to its surface. This work addresses the need for a coherent and thorough physical model that can be used to predict sensor behavior for a range of experimental parameters. While this knowledge will prove critical for the development of this technology, it has also shone a light on nonlinear thermo-optical and optical phenomena that these devices are uniquely suited to probing. The surprisingly rapid transient response of toroidal WGM biosensors despite sub-femtomolar analyte concentrations is also addressed. The development of asymmetric boundary layers around these devices under flow is revealed to enhance the capture rate of proteins from solution compared to the spherical sensors used previously. These lessons will guide the design of flow systems to minimize measurement time and consumption of precious sample, a key factor in any medically relevant assay. Finally, experimental results suggesting that WGM biosensors could be used to improve the quantitative detection of small-molecule biomarkers in exhaled breath condensate demonstrate how their exceptional sensitivity and transient response can enable the use of this noninvasive method to probe respiratory distress. WGM biosensors are unlike any other analytical tool, and the work presented here focuses on answering engineering questions surrounding their performance and potential.

  4. Development of the Biology Card Sorting Task to Measure Conceptual Expertise in Biology

    PubMed Central

    Smith, Julia I.; Combs, Elijah D.; Nagami, Paul H.; Alto, Valerie M.; Goh, Henry G.; Gourdet, Muryam A. A.; Hough, Christina M.; Nickell, Ashley E.; Peer, Adrian G.; Coley, John D.; Tanner, Kimberly D.

    2013-01-01

    There are widespread aspirations to focus undergraduate biology education on teaching students to think conceptually like biologists; however, there is a dearth of assessment tools designed to measure progress from novice to expert biological conceptual thinking. We present the development of a novel assessment tool, the Biology Card Sorting Task, designed to probe how individuals organize their conceptual knowledge of biology. While modeled on tasks from cognitive psychology, this task is unique in its design to test two hypothesized conceptual frameworks for the organization of biological knowledge: 1) a surface feature organization focused on organism type and 2) a deep feature organization focused on fundamental biological concepts. In this initial investigation of the Biology Card Sorting Task, each of six analytical measures showed statistically significant differences when used to compare the card sorting results of putative biological experts (biology faculty) and novices (non–biology major undergraduates). Consistently, biology faculty appeared to sort based on hypothesized deep features, while non–biology majors appeared to sort based on either surface features or nonhypothesized organizational frameworks. Results suggest that this novel task is robust in distinguishing populations of biology experts and biology novices and may be an adaptable tool for tracking emerging biology conceptual expertise. PMID:24297290

  5. Investigation of the subcellular architecture of L7 neurons of Aplysia californica using magnetic resonance microscopy (MRM) at 7.8 microns.

    PubMed

    Lee, Choong H; Flint, Jeremy J; Hansen, Brian; Blackband, Stephen J

    2015-06-10

    Magnetic resonance microscopy (MRM) is a non-invasive diagnostic tool which is well-suited to directly resolve cellular structures in ex vivo and in vitro tissues without the use of exogenous contrast agents. Recent advances in its capability to visualize mammalian cellular structure in intact tissues have reinvigorated analytical interest in aquatic cell models whose previous findings warrant up-to-date validation of subcellular components. Even though the sensitivity of MRM is lower than that of other microscopy technologies, its strength lies in the fact that it relies on the same image contrast mechanisms as clinical MRI, which makes it a unique tool for improving our ability to interpret human diagnostic imaging through high resolution studies of well-controlled biological model systems. Here, we investigate the subcellular MR signal characteristics of isolated cells of Aplysia californica at an in-plane resolution of 7.8 μm. In addition, direct correlation and positive identification of subcellular architecture in the cells is achieved through well-established histology. We hope this methodology will serve as the groundwork for studying pathophysiological changes through perturbation studies and allow for the development of disease-specific cellular modeling tools. Such an approach promises to reveal the MR contrast changes underlying cellular mechanisms in various human diseases, for example in ischemic stroke.

  6. Deriving Earth Science Data Analytics Requirements

    NASA Technical Reports Server (NTRS)

    Kempler, Steven J.

    2015-01-01

    Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, goals which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of requirements for data analytics tools and techniques that would support specific ESDA goal types. Representative existing data analytics tools and techniques relevant to ESDA will also be addressed.

  7. Ultramicroelectrode Array Based Sensors: A Promising Analytical Tool for Environmental Monitoring

    PubMed Central

    Orozco, Jahir; Fernández-Sánchez, César; Jiménez-Jorquera, Cecilia

    2010-01-01

    The particular analytical performance of ultramicroelectrode arrays (UMEAs) has attracted high interest from the research community and has led to the development of a variety of electroanalytical applications. UMEA-based approaches have proven to be powerful, simple, rapid and cost-effective analytical tools for environmental analysis compared to available conventional electrodes and standardised analytical techniques. An overview of the fabrication processes of UMEAs, their characterization and the applications carried out by the Spanish scientific community is presented. A brief explanation of the theoretical aspects that underlie their electrochemical behavior is also given. Finally, the applications of this transducer platform in the environmental field are discussed. PMID:22315551

  8. THE IMPORTANCE OF PROPER INTENSITY CALIBRATION FOR RAMAN ANALYSIS OF LOW-LEVEL ANALYTES IN WATER

    EPA Science Inventory

    Modern dispersive Raman spectroscopy offers unique advantages for the analysis of low-concentration analytes in aqueous solution. However, we have found that proper intensity calibration is critical for obtaining these benefits. This is true not only for producing spectra with ...

  9. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life...

  10. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life...

  11. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life...

  12. (Re)braiding to Tell: Using "Trenzas" as a Metaphorical-Analytical Tool in Qualitative Research

    ERIC Educational Resources Information Center

    Quiñones, Sandra

    2016-01-01

    Metaphors can be used in qualitative research to illuminate the meanings of participant experiences and examine phenomena from insightful and creative perspectives. The purpose of this paper is to illustrate how I utilized "trenzas" (braids) as a metaphorical and analytical tool for understanding the experiences and perspectives of…

  13. Feasibility model of a high reliability five-year tape transport. Volume 3: Appendices. [detailed drawing and analytical tools used in analyses

    NASA Technical Reports Server (NTRS)

    Meyers, A. P.; Davidson, W. A.; Gortowski, R. C.

    1973-01-01

    Detailed drawings of the five year tape transport are presented. Analytical tools used in the various analyses are described. These analyses include: tape guidance, tape stress over crowned rollers, tape pack stress program, response (computer) program, and control system electronics description.

  14. Challenges and Opportunities in Analysing Students Modelling

    ERIC Educational Resources Information Center

    Blanco-Anaya, Paloma; Justi, Rosária; Díaz de Bustamante, Joaquín

    2017-01-01

    Modelling-based teaching activities have been designed and analysed from distinct theoretical perspectives. In this paper, we use one of them--the model of modelling diagram (MMD)--as an analytical tool in a regular classroom context. This paper examines the challenges that arise when the MMD is used as an analytical tool to characterise the…

  15. Equity Analytics: A Methodological Approach for Quantifying Participation Patterns in Mathematics Classroom Discourse

    ERIC Educational Resources Information Center

    Reinholz, Daniel L.; Shah, Niral

    2018-01-01

    Equity in mathematics classroom discourse is a pressing concern, but analyzing issues of equity using observational tools remains a challenge. In this article, we propose equity analytics as a quantitative approach to analyzing aspects of equity and inequity in classrooms. We introduce a classroom observation tool that focuses on relatively…

  16. Comprehensive data resources and analytical tools for pathological association of aminoacyl tRNA synthetases with cancer

    PubMed Central

    Lee, Ji-Hyun; You, Sungyong; Hyeon, Do Young; Kang, Byeongsoo; Kim, Hyerim; Park, Kyoung Mii; Han, Byungwoo; Hwang, Daehee; Kim, Sunghoon

    2015-01-01

    Mammalian cells have cytoplasmic and mitochondrial aminoacyl-tRNA synthetases (ARSs) that catalyze aminoacylation of tRNAs during protein synthesis. Despite their housekeeping functions in protein synthesis, ARSs and ARS-interacting multifunctional proteins (AIMPs) have recently been shown to play important roles in disease pathogenesis through their interactions with disease-related molecules. However, there is a lack of data resources and analytical tools that can be used to examine disease associations of ARS/AIMPs. Here, we developed an Integrated Database for ARSs (IDA), a resource database including cancer genomic/proteomic and interaction data of ARS/AIMPs. IDA includes mRNA expression, somatic mutation, copy number variation and phosphorylation data of ARS/AIMPs and their interacting proteins in various cancers. IDA further includes an array of analytical tools for exploration of disease associations of ARS/AIMPs, identification of disease-associated ARS/AIMP interactors and reconstruction of ARS-dependent disease-perturbed network models. Therefore, IDA provides both comprehensive data resources and analytical tools for understanding the potential roles of ARS/AIMPs in cancers. Database URL: http://ida.biocon.re.kr/, http://ars.biocon.re.kr/ PMID:25824651

  17. The challenge of big data in public health: an opportunity for visual analytics.

    PubMed

    Ola, Oluwakemi; Sedig, Kamran

    2014-01-01

    Public health (PH) data can generally be characterized as big data. The efficient and effective use of this data determines the extent to which PH stakeholders can sufficiently address societal health concerns as they engage in a variety of work activities. As stakeholders interact with data, they engage in various cognitive activities such as analytical reasoning, decision-making, interpreting, and problem solving. Performing these activities with big data is a challenge for the unaided mind as stakeholders encounter obstacles relating to the data's volume, variety, velocity, and veracity. Such being the case, computer-based information tools are needed to support PH stakeholders. Unfortunately, while existing computational tools are beneficial in addressing certain work activities, they fall short in supporting cognitive activities that involve working with large, heterogeneous, and complex bodies of data. This paper presents visual analytics (VA) tools, a nascent category of computational tools that integrate data analytics with interactive visualizations, to facilitate the performance of cognitive activities involving big data. Historically, PH has lagged behind other sectors in embracing new computational technology. In this paper, we discuss the role that VA tools can play in addressing the challenges presented by big data. In doing so, we demonstrate the potential benefit of incorporating VA tools into PH practice, in addition to highlighting the need for further systematic and focused research.

  18. The Challenge of Big Data in Public Health: An Opportunity for Visual Analytics

    PubMed Central

    Ola, Oluwakemi; Sedig, Kamran

    2014-01-01

    Public health (PH) data can generally be characterized as big data. The efficient and effective use of this data determines the extent to which PH stakeholders can sufficiently address societal health concerns as they engage in a variety of work activities. As stakeholders interact with data, they engage in various cognitive activities such as analytical reasoning, decision-making, interpreting, and problem solving. Performing these activities with big data is a challenge for the unaided mind as stakeholders encounter obstacles relating to the data’s volume, variety, velocity, and veracity. Such being the case, computer-based information tools are needed to support PH stakeholders. Unfortunately, while existing computational tools are beneficial in addressing certain work activities, they fall short in supporting cognitive activities that involve working with large, heterogeneous, and complex bodies of data. This paper presents visual analytics (VA) tools, a nascent category of computational tools that integrate data analytics with interactive visualizations, to facilitate the performance of cognitive activities involving big data. Historically, PH has lagged behind other sectors in embracing new computational technology. In this paper, we discuss the role that VA tools can play in addressing the challenges presented by big data. In doing so, we demonstrate the potential benefit of incorporating VA tools into PH practice, in addition to highlighting the need for further systematic and focused research. PMID:24678376

  19. Identification challenges for large space structures

    NASA Technical Reports Server (NTRS)

    Pappa, Richard S.

    1990-01-01

    The paper examines the on-orbit modal identification of large space structures, stressing the importance of planning and experience, in preparation for the Space Station Structural Characterization Experiment (SSSCE) for the Space Station Freedom. The necessary information to foresee and overcome practical difficulties is considered in connection with seven key factors, including test objectives, dynamic complexity of the structure, data quality, extent of exploratory studies, availability and understanding of software tools, experience with similar problems, and pretest analytical conditions. These factors affect identification success in ground tests. Comparisons with similar ground tests of assembled systems are discussed, showing that the constraints of space tests make these factors more significant. The absence of data and experiences relating to on-orbit modal identification testing is shown to make identification a uniquely mathematical problem, although all spacecraft are constructed and verified by proven engineering methods.

  20. Nanomaterials as analytical tools for genosensors.

    PubMed

    Abu-Salah, Khalid M; Alrokyan, Salman A; Khan, Muhammad Naziruddin; Ansari, Anees Ahmad

    2010-01-01

    Nanomaterials are being increasingly used for the development of electrochemical DNA biosensors, due to the unique electrocatalytic properties found in nanoscale materials. They offer excellent prospects for interfacing biological recognition events with electronic signal transduction and for designing a new generation of bioelectronic devices exhibiting novel functions. In particular, nanomaterials such as noble metal nanoparticles (Au, Pt), carbon nanotubes (CNTs), magnetic nanoparticles, quantum dots and metal oxide nanoparticles have been actively investigated for their applications in DNA biosensors, which have become a new interdisciplinary frontier between biological detection and material science. In this article, we address some of the main advances in this field over the past few years, discussing the issues and challenges with the aim of stimulating a broader interest in developing nanomaterial-based biosensors and improving their applications in disease diagnosis and food safety examination.

  1. Nanomaterials as Analytical Tools for Genosensors

    PubMed Central

    Abu-Salah, Khalid M.; Alrokyan, Salman A.; Khan, Muhammad Naziruddin; Ansari, Anees Ahmad

    2010-01-01

    Nanomaterials are being increasingly used for the development of electrochemical DNA biosensors, due to the unique electrocatalytic properties found in nanoscale materials. They offer excellent prospects for interfacing biological recognition events with electronic signal transduction and for designing a new generation of bioelectronic devices exhibiting novel functions. In particular, nanomaterials such as noble metal nanoparticles (Au, Pt), carbon nanotubes (CNTs), magnetic nanoparticles, quantum dots and metal oxide nanoparticles have been actively investigated for their applications in DNA biosensors, which have become a new interdisciplinary frontier between biological detection and material science. In this article, we address some of the main advances in this field over the past few years, discussing the issues and challenges with the aim of stimulating a broader interest in developing nanomaterial-based biosensors and improving their applications in disease diagnosis and food safety examination. PMID:22315580

  2. Characterization of interfacial socket pressure in transhumeral prostheses: A case series.

    PubMed

    Schofield, Jonathon S; Schoepp, Katherine R; Williams, Heather E; Carey, Jason P; Marasco, Paul D; Hebert, Jacqueline S

    2017-01-01

    One of the most important factors in successful upper limb prostheses is the socket design. Sockets must be individually fabricated to arrive at a geometry that suits the user's morphology and appropriately distributes the pressures associated with prosthetic use across the residual limb. In higher levels of amputation, such as transhumeral, this challenge is amplified as prosthetic weight and the physical demands placed on the residual limb are heightened. Yet, in the upper limb, socket fabrication is largely driven by heuristic practices. An analytical understanding of the interactions between the socket and residual limb is absent in literature. This work describes techniques, adapted from lower limb prosthetic research, to empirically characterize the pressure distribution occurring between the residual limb and well-fit transhumeral prosthetic sockets. A case series analyzing the result of four participants with transhumeral amputation is presented. A Tekscan VersaTek pressure measurement system and FaroArm Edge coordinate measurement machine were employed to capture socket-residual limb interface pressures and geometrically register these values to the anatomy of participants. Participants performed two static poses with their prosthesis under two separate loading conditions. Surface pressure maps were constructed from the data, highlighting pressure distribution patterns, anatomical locations bearing maximum pressure, and the relative pressure magnitudes. Pressure distribution patterns demonstrated unique characteristics across the four participants that could be traced to individual socket design considerations. This work presents a technique that implements commercially available tools to quantitatively characterize upper limb socket-residual limb interactions. This is a fundamental first step toward improved socket designs developed through informed, analytically-based design tools.

  3. Characterization of interfacial socket pressure in transhumeral prostheses: A case series

    PubMed Central

    Schoepp, Katherine R.; Williams, Heather E.; Carey, Jason P.; Marasco, Paul D.

    2017-01-01

    One of the most important factors in successful upper limb prostheses is the socket design. Sockets must be individually fabricated to arrive at a geometry that suits the user’s morphology and appropriately distributes the pressures associated with prosthetic use across the residual limb. In higher levels of amputation, such as transhumeral, this challenge is amplified as prosthetic weight and the physical demands placed on the residual limb are heightened. Yet, in the upper limb, socket fabrication is largely driven by heuristic practices. An analytical understanding of the interactions between the socket and residual limb is absent in literature. This work describes techniques, adapted from lower limb prosthetic research, to empirically characterize the pressure distribution occurring between the residual limb and well-fit transhumeral prosthetic sockets. A case series analyzing the result of four participants with transhumeral amputation is presented. A Tekscan VersaTek pressure measurement system and FaroArm Edge coordinate measurement machine were employed to capture socket-residual limb interface pressures and geometrically register these values to the anatomy of participants. Participants performed two static poses with their prosthesis under two separate loading conditions. Surface pressure maps were constructed from the data, highlighting pressure distribution patterns, anatomical locations bearing maximum pressure, and the relative pressure magnitudes. Pressure distribution patterns demonstrated unique characteristics across the four participants that could be traced to individual socket design considerations. This work presents a technique that implements commercially available tools to quantitatively characterize upper limb socket-residual limb interactions. This is a fundamental first step toward improved socket designs developed through informed, analytically-based design tools. PMID:28575012

  4. Utilization of machine learning for prediction of post-traumatic stress: a re-examination of cortisol in the prediction and pathways to non-remitting PTSD

    PubMed Central

    Galatzer-Levy, I R; Ma, S; Statnikov, A; Yehuda, R; Shalev, A Y

    2017-01-01

    To date, studies of biological risk factors have revealed inconsistent relationships with subsequent post-traumatic stress disorder (PTSD). The inconsistent signal may reflect the use of data analytic tools that are ill-equipped for modeling the complex interactions between biological and environmental factors that underlie post-traumatic psychopathology. Further, using symptom-based diagnostic status as the group outcome overlooks the inherent heterogeneity of PTSD, potentially contributing to failures to replicate. To examine the potential yield of novel analytic tools, we reanalyzed data from a large longitudinal study of individuals identified following trauma in the general emergency room (ER) that failed to find a linear association between cortisol response to traumatic events and subsequent PTSD. First, latent growth mixture modeling empirically identified trajectories of post-traumatic symptoms, which then were used as the study outcome. Next, support vector machines with feature selection identified sets of features with stable predictive accuracy and built robust classifiers of trajectory membership (area under the receiver operator characteristic curve (AUC)=0.82 (95% confidence interval (CI)=0.80–0.85)) that combined clinical, neuroendocrine, psychophysiological and demographic information. Finally, graph induction algorithms revealed a unique path from childhood trauma, via lower cortisol during ER admission, to non-remitting PTSD. Traditional general linear modeling methods then confirmed the newly revealed association, thereby delineating a specific target population for early endocrine interventions. Advanced computational approaches offer innovative ways for uncovering clinically significant, non-shared biological signals in heterogeneous samples. PMID:28323285
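
    The machine-learning portion of this workflow (feature selection feeding a support vector classifier evaluated by cross-validated AUC) can be sketched in a few lines of scikit-learn. The snippet below uses synthetic data and generic settings as stand-ins; it is not the authors' pipeline, features, or parameters.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for clinical/neuroendocrine/psychophysiological features;
# y would be trajectory membership (e.g., non-remitting vs. remitting).
X, y = make_classification(n_samples=300, n_features=50, n_informative=8,
                           random_state=0)

# Feature selection lives inside the pipeline so it is refit per CV fold,
# keeping the AUC estimate free of information leakage.
clf = make_pipeline(StandardScaler(),
                    SelectKBest(f_classif, k=10),
                    SVC(kernel="linear"))
auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print(f"cross-validated AUC: {auc.mean():.2f} +/- {auc.std():.2f}")
```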

  5. Utilization of machine learning for prediction of post-traumatic stress: a re-examination of cortisol in the prediction and pathways to non-remitting PTSD.

    PubMed

    Galatzer-Levy, I R; Ma, S; Statnikov, A; Yehuda, R; Shalev, A Y

    2017-03-21

    To date, studies of biological risk factors have revealed inconsistent relationships with subsequent post-traumatic stress disorder (PTSD). The inconsistent signal may reflect the use of data analytic tools that are ill-equipped for modeling the complex interactions between biological and environmental factors that underlie post-traumatic psychopathology. Further, using symptom-based diagnostic status as the group outcome overlooks the inherent heterogeneity of PTSD, potentially contributing to failures to replicate. To examine the potential yield of novel analytic tools, we reanalyzed data from a large longitudinal study of individuals identified following trauma in the general emergency room (ER) that failed to find a linear association between cortisol response to traumatic events and subsequent PTSD. First, latent growth mixture modeling empirically identified trajectories of post-traumatic symptoms, which then were used as the study outcome. Next, support vector machines with feature selection identified sets of features with stable predictive accuracy and built robust classifiers of trajectory membership (area under the receiver operator characteristic curve (AUC)=0.82 (95% confidence interval (CI)=0.80-0.85)) that combined clinical, neuroendocrine, psychophysiological and demographic information. Finally, graph induction algorithms revealed a unique path from childhood trauma, via lower cortisol during ER admission, to non-remitting PTSD. Traditional general linear modeling methods then confirmed the newly revealed association, thereby delineating a specific target population for early endocrine interventions. Advanced computational approaches offer innovative ways for uncovering clinically significant, non-shared biological signals in heterogeneous samples.

  6. Using a patterned grating structure to create lipid bilayer platforms insensitive to air bubbles.

    PubMed

    Han, Chung-Ta; Chao, Ling

    2015-01-07

    Supported lipid bilayers (SLBs) have been used for various biosensing applications. The bilayer structure enables embedded lipid membrane species to maintain their native orientation, and the two-dimensional fluidity is crucial for numerous biomolecular interactions to occur. The platform integrated with a microfluidic device for reagent transport and exchange has great potential to be applied with surface analytical tools. However, SLBs can easily be destroyed by air bubbles during assay reagent transport and exchange. Here, we created a patterned obstacle grating structured surface in a microfluidic channel to protect SLBs from being destroyed by air bubbles. Unlike all of the previous approaches using chemical modification or adding protection layers to strengthen lipid bilayers, the uniqueness of this approach is that it uses the patterned obstacles to physically trap water above the bilayers to prevent the air-water interface from directly coming into contact with and peeling the bilayers. We showed that our platform with certain grating geometry criteria can provide promising protection to SLBs from air bubbles. The required obstacle distance was found to decrease when we increased the air-bubble movement speed. In addition, the interaction assay results from streptavidin and biotinylated lipids in the confined SLBs suggested that receptors at the SLBs retained the interaction ability after air-bubble treatment. The results showed that the developed SLB platform can preserve both high membrane fluidity and high accessibility to the outside environment, which have never been simultaneously achieved before. Incorporating the built platforms with some surface analytical tools could open the bottleneck of building highly robust in vitro cell-membrane-related bioassays.

  7. Quality Indicators for Learning Analytics

    ERIC Educational Resources Information Center

    Scheffel, Maren; Drachsler, Hendrik; Stoyanov, Slavi; Specht, Marcus

    2014-01-01

    This article proposes a framework of quality indicators for learning analytics that aims to standardise the evaluation of learning analytics tools and to provide a mean to capture evidence for the impact of learning analytics on educational practices in a standardised manner. The criteria of the framework and its quality indicators are based on…

  8. Agricultural Trade Networks and Patterns of Economic Development

    PubMed Central

    Shutters, Shade T.; Muneepeerakul, Rachata

    2012-01-01

    International trade networks are manifestations of a complex combination of diverse underlying factors, both natural and social. Here we apply social network analytics to the international trade network of agricultural products to better understand the nature of this network and its relation to patterns of international development. Using a network tool known as triadic analysis we develop triad significance profiles for a series of agricultural commodities traded among countries. Results reveal a novel network “superfamily” combining properties of biological information processing networks and human social networks. To better understand this unique network signature, we examine in more detail the degree and triadic distributions within the trade network by country and commodity. Our results show that countries fall into two very distinct classes based on their triadic frequencies. Roughly 165 countries fall into one class while 18, all highly isolated with respect to international agricultural trade, fall into the other. Only Vietnam stands out as a unique case. Finally, we show that as a country becomes less isolated with respect to number of trading partners, the country's triadic signature follows a predictable trajectory that may correspond to a trajectory of development. PMID:22768310
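
    Triadic analysis of a directed trade network can be prototyped with networkx's triadic census, which counts the 16 possible directed triad types and is the starting point for a triad significance profile once the counts are compared against randomized networks. The toy network below uses placeholder countries and edges, not the agricultural trade data analyzed in the paper.

```python
import networkx as nx

# Toy directed trade network: an edge u -> v means u exports a commodity to v.
# Country codes and edges are illustrative placeholders.
G = nx.DiGraph([("USA", "MEX"), ("MEX", "USA"), ("BRA", "USA"),
                ("BRA", "MEX"), ("VNM", "USA"), ("USA", "BRA")])

# Counts of the 16 possible directed triad types; a significance profile
# would z-score these against an ensemble of degree-preserving random graphs.
census = nx.triadic_census(G)
for triad, count in sorted(census.items()):
    if count:
        print(triad, count)
```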

  9. HIV cure research community engagement in North Carolina: a mixed-methods evaluation of a crowdsourcing contest.

    PubMed

    Mathews, Allison; Farley, Samantha; Blumberg, Meredith; Knight, Kimberley; Hightow-Weidman, Lisa; Muessig, Kate; Rennie, Stuart; Tucker, Joseph

    2017-10-01

    The purpose of this study was to evaluate the feasibility of using a crowdsourcing contest to promote HIV cure research community engagement. Crowdsourcing contests are open calls for community participation to achieve a task, in this case to engage local communities about HIV cure research. Our contest solicited images and videos of what HIV cure meant to people. Contestants submitted entries to IdeaScale, an encrypted online contest platform. We used a mixed-methods study design to evaluate the contest. Engagement was assessed through attendance at promotional events and social media user analytics. Google Analytics measured contest website user-engagement statistics. Text from contest video entries was transcribed, coded and analysed using MAXQDA. There were 144 attendees at three promotional events and 32 entries from 39 contestants. Most individuals who submitted entries were black (n = 31), had some college education (n = 18) and were aged 18-23 years (n = 23). Social media analytics showed 684 unique page followers, 2233 unique page visits, 585 unique video views and an overall reach of 80,624 unique users. Contest submissions covered themes related to the community's role in shaping the future of HIV cure through education, social justice, creativity and stigma reduction. Crowdsourcing contests are feasible for engaging community members in HIV cure research. Community contributions to crowdsourcing contests provide useful content for culturally relevant and locally responsive research engagement.

  10. Chemometrics-based process analytical technology (PAT) tools: applications and adaptation in pharmaceutical and biopharmaceutical industries.

    PubMed

    Challa, Shruthi; Potumarthi, Ravichandra

    2013-01-01

    Process analytical technology (PAT) is used to monitor and control critical process parameters in raw materials and in-process products to maintain critical quality attributes and build quality into the product. Process analytical technology can be successfully implemented in pharmaceutical and biopharmaceutical industries not only to impart quality into the products but also to prevent out-of-specification results and improve productivity. PAT implementation eliminates the drawbacks of traditional methods, which involve excessive sampling, and facilitates rapid testing through direct sampling without any destruction of the sample. However, to successfully adapt PAT tools to the pharmaceutical and biopharmaceutical environment, a thorough understanding of the process is needed, along with mathematical and statistical tools to analyze the large multidimensional spectral data generated by PAT tools. Chemometrics is a chemical discipline which incorporates both statistical and mathematical methods to obtain and analyze relevant information from PAT spectral tools. Applications of commonly used PAT tools in combination with appropriate chemometric methods, along with their advantages and working principles, are discussed. Finally, the systematic application of PAT tools in the biopharmaceutical environment to control critical process parameters for achieving product quality is diagrammatically represented.
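
    A common chemometric pairing of PAT spectra with a reference quality attribute is partial least squares (PLS) regression. The sketch below, on synthetic spectra, shows the general pattern of fitting a PLS model and estimating a cross-validated prediction error; it is illustrative only and not tied to any specific PAT instrument or product.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)

# Synthetic "spectra": 60 samples x 200 wavelengths, with the reference
# property (e.g., a hypothetical API concentration) encoded in one broad band.
n, p = 60, 200
y = rng.uniform(0, 1, n)
band = np.exp(-0.5 * ((np.arange(p) - 80) / 15) ** 2)
X = np.outer(y, band) + 0.02 * rng.standard_normal((n, p))

# PLS compresses the collinear spectral channels into a few latent variables.
pls = PLSRegression(n_components=3)
y_cv = cross_val_predict(pls, X, y, cv=5).ravel()
rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))
print(f"RMSECV: {rmsecv:.3f}")
```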

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    T.Rex is used to explore tabular data sets containing up to ten million records to help rapidly understand a previously unknown data set. Analysis can quickly identify patterns of interest and the records and fields that capture those patterns. T.Rex contains a growing set of deep analytical tools and supports robust export capabilities so that selected data can be incorporated into other specialized tools for further analysis. T.Rex is flexible in ingesting different types and formats of data, allowing the user to interactively experiment and perform trial-and-error guesses on the structure of the data; it also has a variety of linked visual analytic tools that enable exploration of the data to find relevant content, relationships among content, and trends within the content, and to capture knowledge about the content. Finally, T.Rex has a rich export capability to extract relevant subsets of a larger data source, so users can further analyze their data in other analytic tools.

  12. T.Rex

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-06-08

    T.Rex is used to explore tabular data sets containing up to ten million records to help rapidly understand a previously unknown data set. Analysis can quickly identify patterns of interest and the records and fields that capture those patterns. T.Rex contains a growing set of deep analytical tools and supports robust export capabilities so that selected data can be incorporated into other specialized tools for further analysis. T.Rex is flexible in ingesting different types and formats of data, allowing the user to interactively experiment and perform trial-and-error guesses on the structure of the data; it also has a variety of linked visual analytic tools that enable exploration of the data to find relevant content, relationships among content, and trends within the content, and to capture knowledge about the content. Finally, T.Rex has a rich export capability to extract relevant subsets of a larger data source, so users can further analyze their data in other analytic tools.

  13. Evaluating Business Intelligence/Business Analytics Software for Use in the Information Systems Curriculum

    ERIC Educational Resources Information Center

    Davis, Gary Alan; Woratschek, Charles R.

    2015-01-01

    Business Intelligence (BI) and Business Analytics (BA) Software has been included in many Information Systems (IS) curricula. This study surveyed current and past undergraduate and graduate students to evaluate various BI/BA tools. Specifically, this study compared several software tools from two of the major software providers in the BI/BA field.…

  14. Topology based data analysis identifies a subgroup of breast cancers with a unique mutational profile and excellent survival.

    PubMed

    Nicolau, Monica; Levine, Arnold J; Carlsson, Gunnar

    2011-04-26

    High-throughput biological data, whether generated as sequencing, transcriptional microarrays, proteomic, or other means, continues to require analytic methods that address its high dimensional aspects. Because the computational part of data analysis ultimately identifies shape characteristics in the organization of data sets, the mathematics of shape recognition in high dimensions continues to be a crucial part of data analysis. This article introduces a method that extracts information from high-throughput microarray data and, by using topology, provides greater depth of information than current analytic techniques. The method, termed Progression Analysis of Disease (PAD), first identifies robust aspects of cluster analysis, then goes deeper to find a multitude of biologically meaningful shape characteristics in these data. Additionally, because PAD incorporates a visualization tool, it provides a simple picture or graph that can be used to further explore these data. Although PAD can be applied to a wide range of high-throughput data types, it is used here as an example to analyze breast cancer transcriptional data. This identified a unique subgroup of Estrogen Receptor-positive (ER(+)) breast cancers that express high levels of c-MYB and low levels of innate inflammatory genes. These patients exhibit 100% survival and no metastasis. No supervised step beyond distinction between tumor and healthy patients was used to identify this subtype. The group has a clear and distinct, statistically significant molecular signature, it highlights coherent biology but is invisible to cluster methods, and does not fit into the accepted classification of Luminal A/B, Normal-like subtypes of ER(+) breast cancers. We denote the group as c-MYB(+) breast cancer.
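
    PAD builds on topological ideas related to the Mapper construction: a filter (lens) function, an overlapping cover of its range, and clustering within each cover element, with clusters that share samples linked into a graph. The following is a rough, simplified sketch of that general idea on synthetic data, with an arbitrary filter and clustering choice; it is not the authors' PAD implementation.

```python
import numpy as np
from sklearn.cluster import AgglomerativeClustering

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))   # stand-in for expression profiles
filt = np.linalg.norm(X, axis=1)     # a simple filter (lens) function

# Cover the filter range with overlapping intervals, cluster within each,
# then link clusters from different intervals that share samples.
lo, hi, n_bins, overlap = filt.min(), filt.max(), 8, 0.3
width = (hi - lo) / n_bins
nodes = []
for i in range(n_bins):
    a = lo + i * width - overlap * width
    b = lo + (i + 1) * width + overlap * width
    idx = np.where((filt >= a) & (filt <= b))[0]
    if len(idx) < 2:
        continue
    labels = AgglomerativeClustering(n_clusters=2).fit_predict(X[idx])
    for lab in np.unique(labels):
        nodes.append(set(idx[labels == lab]))

edges = {(i, j) for i in range(len(nodes)) for j in range(i + 1, len(nodes))
         if nodes[i] & nodes[j]}
print(f"{len(nodes)} nodes and {len(edges)} edges in the Mapper-style graph")
```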

  15. Lipidomic analysis for carbonyl species derived from fish oil using liquid chromatography-tandem mass spectrometry.

    PubMed

    Suh, Joon Hyuk; Niu, Yue S; Hung, Wei-Lun; Ho, Chi-Tang; Wang, Yu

    2017-06-01

    Lipid peroxidation gives rise to carbonyl species, some of which are reactive and play a role in the pathogenesis of numerous human diseases. Oils are ubiquitous sources that can be easily oxidized to generate these compounds under oxidative stress. In the present work, we developed a targeted lipidomic method for the simultaneous determination of thirty-five aldehydes and ketones derived from fish oil, the omega-3 fatty acid-rich source, using liquid chromatography-tandem mass spectrometry (LC-MS/MS). The analytes include highly toxic reactive carbonyl species (RCS) such as acrolein, crotonaldehyde, trans-4-hydroxy-2-hexenal (HHE), trans-4-hydroxy-2-nonenal (HNE), trans-4-oxo-2-nonenal (ONE), glyoxal and methylglyoxal, all of which are promising biomarkers of lipid peroxidation. They were formed using in vitro Fe(II)-mediated oxidation and derivatized using 2,4-dinitrophenylhydrazine (DNPH) to enable quantitative assay. Before analysis, solid phase extraction (SPE) was used to further clean the samples. Distinctly different patterns of carbonyl compound generation between omega-3 and omega-6 fatty acids were observed using this lipidomic approach. The method was validated and successfully applied to monitor the formation of carbonyl species by lipid peroxidation in ten different fish oil products. Correlations between the monitored analytes and their parent fatty acids were also tested using Pearson's correlation test. The results indicate that our method is a useful analytical tool for lipid peroxidation studies. Copyright © 2017 Elsevier B.V. All rights reserved.
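
    The correlation-testing step can be illustrated with a short scipy sketch relating the level of one carbonyl species to its parent fatty acid content across products. The numbers below are invented for demonstration and are not the study's measurements.

```python
import numpy as np
from scipy import stats

# Illustrative values across ten fish oil products (made-up numbers):
# HHE (an omega-3-derived aldehyde) vs. total omega-3 fatty acid content.
hhe = np.array([1.2, 2.0, 0.8, 3.1, 2.6, 1.9, 0.7, 2.4, 3.5, 1.5])
omega3 = np.array([18, 30, 12, 42, 38, 27, 10, 33, 47, 22])

# Pearson's correlation test: linear association and its p-value.
r, p = stats.pearsonr(hhe, omega3)
print(f"Pearson r = {r:.2f}, p = {p:.3g}")
```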

  16. VVER-440 and VVER-1000 reactor dosimetry benchmark - BUGLE-96 versus ALPAN VII.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duo, J. I.

    2011-07-01

    Document available in abstract form only; full text of document follows: Analytical results of the vodo-vodyanoi energetichesky reactor (VVER)-440 and VVER-1000 reactor dosimetry benchmarks developed from engineering mockups at the Nuclear Research Institute Rez LR-0 reactor are discussed. These benchmarks provide accurate determination of radiation field parameters in the vicinity and over the thickness of the reactor pressure vessel. Measurements are compared to calculated results with two sets of tools: the TORT discrete ordinates code with the BUGLE-96 cross-section library versus the newly Westinghouse-developed RAPTOR-M3G and ALPAN VII.0. The parallel code RAPTOR-M3G enables detailed neutron distributions in energy and space in reduced computational time. The ALPAN VII.0 cross-section library is based on ENDF/B-VII.0 and is designed for reactor dosimetry applications. It uses a unique broad group structure to enhance resolution in the thermal-neutron-energy range compared to other analogous libraries. The comparison of fast neutron (E > 0.5 MeV) results shows good agreement (within 10%) between the BUGLE-96 and ALPAN VII.0 libraries. Furthermore, the results compare well with analogous results of participants of the REDOS program (2005). Finally, the analytical results for fast neutrons agree within 15% with the measurements for most locations in all three mockups. In general, however, the analytical results underestimate the attenuation through the reactor pressure vessel thickness compared to the measurements. (authors)

  17. Ion/Neutral, Ion/Electron, Ion/Photon, and Ion/Ion Interactions in Tandem Mass Spectrometry: Do we need them all? Are they enough?

    PubMed Central

    McLuckey, Scott A.; Mentinova, Marija

    2011-01-01

    A range of strategies and tools has been developed to facilitate the determination of primary structures of analyte molecules of interest via tandem mass spectrometry (MS/MS). The two main factors that determine the primary structural information present in an MS/MS spectrum are the type of ion generated from the analyte molecule and the dissociation method. The ion-type subjected to dissociation is determined by the ionization method/conditions and ion transformation processes that might take place after initial gas-phase ion formation. Furthermore, the range of analyte-related ion types can be expanded via derivatization reactions prior to mass spectrometry. Dissociation methods include those that simply alter the population of internal states of the mass-selected ion (i.e., activation methods like collision-induced dissociation) as well as processes that rely on transformation of the ion-type prior to dissociation (e.g., electron capture dissociation). A variety of ionic interactions has been studied for the purpose of ion dissociation and ion transformation that include ion/neutral, ion/photon, ion/electron, and ion/ion interactions. A wide range of phenomena has been observed, many of which have been explored/developed as means for structural analysis. The techniques arising from these phenomena are discussed within the context of the elements of structure determination in tandem mass spectrometry, viz., ion-type definition and dissociation. Unique aspects of the various ion interactions are emphasized along with any barriers to widespread implementation. PMID:21472539

  18. Foraminiferal Stable Isotope Geochemistry At The Micrometer Scale: Is It A Dream Or Reality?

    NASA Astrophysics Data System (ADS)

    Misra, S.; Shuttleworth, S.; Lloyd, N. S.; Sadekov, A.; Elderfield, H.

    2012-12-01

    Over the last few decades, trace metal and stable isotope compositions of foraminiferal shells have become major tools for studying past oceans and associated climate change. Empirical calibrations of δ11B, δ18O, Mg/Ca, Cd/Ca, and Ba/Ca shell compositions have linked them to various environmental parameters such as seawater pH, temperature, salinity and productivity. Despite their common use as proxies, little is known about the mechanisms of trace metal incorporation into foraminiferal calcite. Trace metal partition coefficients for foraminiferal calcite are significantly different from those of inorganic calcite precipitates, underlining the strong biological control on metal transport to the calcification sites and their incorporation into the calcite. The microscale distribution of light element isotopes (e.g., Li, B, Mg) could potentially provide unique insight into these biomineralization processes, improving our understanding of foraminiferal geochemistry. In this work we explore the potential of recent advances in analytical geochemistry by employing laser ablation and multi-collector ICP-MS to study the microscale distribution of Mg isotopes across individual foraminiferal shells and δ11B and δ7Li analyses of individual shell chambers. The analytical setup includes an Analyte.G2 193 nm excimer laser ablation system with a two-volume ablation cell connected to a Thermo Scientific NEPTUNE Plus MC-ICP-MS with the Jet Interface option. We will discuss the method's limitations and advantages for foraminiferal geochemistry as well as our data on the Mg isotope distribution within shells of planktonic foraminifera.
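
    Isotope compositions such as δ11B and δ7Li are reported in delta notation, the per mil deviation of a measured isotope ratio from a reference standard. A minimal sketch of that conversion is shown below; the measured ratio is a placeholder, and the SRM 951 reference ratio is only approximate.

```python
def delta_per_mil(r_sample: float, r_standard: float) -> float:
    """Delta notation in per mil: deviation of a sample isotope ratio
    from a reference standard ratio."""
    return (r_sample / r_standard - 1.0) * 1000.0

# Placeholder 11B/10B ratios; NIST SRM 951 boric acid is the usual boron
# reference, and the ratio used here is only approximate.
R_SRM951 = 4.0436
r_measured = 4.125   # hypothetical measured sample ratio
print(f"d11B = {delta_per_mil(r_measured, R_SRM951):+.2f} per mil")
```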

  19. Modification and validation of an analytical source model for external beam radiotherapy Monte Carlo dose calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davidson, Scott E., E-mail: sedavids@utmb.edu

    Purpose: A dose calculation tool, which combines the accuracy of the dose planning method (DPM) Monte Carlo code with the versatility of a practical analytical multisource model, previously reported, has been improved and validated for the Varian 6 and 10 MV linear accelerators (linacs). The calculation tool can be used to calculate doses in advanced clinical application studies. One shortcoming of current clinical trials that report dose from patient plans is the lack of a standardized dose calculation methodology. Because commercial treatment planning systems (TPSs) have their own dose calculation algorithms and the clinical trial participant who uses these systems is responsible for commissioning the beam model, variation exists in the reported calculated dose distributions. Today's modern linac is manufactured to tight specifications, so variability within a linac model is quite low. The expectation is that a single dose calculation tool for a specific linac model can be used to accurately recalculate dose from patient plans that have been submitted to the clinical trial community from any institution. The calculation tool would provide for a more meaningful outcome analysis. Methods: The analytical source model was described by a primary point source, a secondary extra-focal source, and a contaminant electron source. Off-axis energy softening and fluence effects were also included. Hyperbolic functions have been incorporated into the model to correct for the changes in output and in electron contamination with field size. A multileaf collimator (MLC) model is included to facilitate phantom and patient dose calculations. An offset to the MLC leaf positions was used to correct for the rudimentary assumed primary point source. Results: Dose calculations of the depth dose and profiles for field sizes 4 × 4 to 40 × 40 cm agree with measurement within 2% of the maximum dose or 2 mm distance to agreement (DTA) for 95% of the data points tested. The model was capable of predicting the depth of the maximum dose within 1 mm. Anthropomorphic phantom benchmark testing of modulated and patterned MLC treatment plans showed agreement with measurement within 3% in target regions using thermoluminescent dosimeters (TLDs). Using radiochromic film normalized to TLD, a gamma criterion of 3% of maximum dose and 2 mm DTA was applied, with a pass rate of at least 85% in the high dose, high gradient, and low dose regions. Finally, recalculations of patient plans using DPM showed good agreement relative to a commercial TPS when comparing dose volume histograms and 2D dose distributions. Conclusions: A unique analytical source model coupled to the dose planning method Monte Carlo dose calculation code has been modified and validated using basic beam data and anthropomorphic phantom measurement. While this tool can be applied in general use for a particular linac model, it was specifically developed to provide a single methodology to independently assess treatment plan dose distributions from clinical institutions participating in National Cancer Institute trials.
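
    The film-versus-TLD comparison above relies on a gamma criterion of 3% of maximum dose and 2 mm distance to agreement. A much-simplified, one-dimensional global gamma evaluation is sketched below on synthetic profiles; it is meant only to show the idea of the metric, not to reproduce the validation software used in the study.

```python
import numpy as np

def gamma_1d(dose_ref, dose_eval, x, dose_crit=0.03, dist_crit=2.0):
    """Simplified global 1D gamma: for each reference point, search all
    evaluated points for the minimum combined dose/distance deviation."""
    d_norm = dose_crit * dose_ref.max()      # 3% of maximum dose (global)
    gam = np.empty_like(dose_ref)
    for i, (xi, di) in enumerate(zip(x, dose_ref)):
        dd = (dose_eval - di) / d_norm
        dx = (x - xi) / dist_crit
        gam[i] = np.sqrt(dd ** 2 + dx ** 2).min()
    return gam

# Synthetic profiles on a 1 mm grid: evaluated profile slightly shifted and scaled.
x = np.arange(0.0, 100.0, 1.0)
ref = 100 * np.exp(-0.5 * ((x - 50) / 15) ** 2)
ev = 1.02 * 100 * np.exp(-0.5 * ((x - 51) / 15) ** 2)

g = gamma_1d(ref, ev, x)
print(f"gamma pass rate (gamma <= 1): {100 * np.mean(g <= 1):.1f}%")
```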

  20. IBM's Health Analytics and Clinical Decision Support.

    PubMed

    Kohn, M S; Sun, J; Knoop, S; Shabo, A; Carmeli, B; Sow, D; Syed-Mahmood, T; Rapp, W

    2014-08-15

    This survey explores the role of big data and health analytics developed by IBM in supporting the transformation of healthcare by augmenting evidence-based decision-making. Some problems in healthcare and strategies for change are described. It is argued that change requires better decisions, which, in turn, require better use of the many kinds of healthcare information. Analytic resources that address each of the information challenges are described. Examples of the role of each of the resources are given. There are powerful analytic tools that utilize the various kinds of big data in healthcare to help clinicians make more personalized, evidenced-based decisions. Such resources can extract relevant information and provide insights that clinicians can use to make evidence-supported decisions. There are early suggestions that these resources have clinical value. As with all analytic tools, they are limited by the amount and quality of data. Big data is an inevitable part of the future of healthcare. There is a compelling need to manage and use big data to make better decisions to support the transformation of healthcare to the personalized, evidence-supported model of the future. Cognitive computing resources are necessary to manage the challenges in employing big data in healthcare. Such tools have been and are being developed. The analytic resources, themselves, do not drive, but support healthcare transformation.

  1. CancerLectinDB: a database of lectins relevant to cancer.

    PubMed

    Damodaran, Deepa; Jeyakani, Justin; Chauhan, Alok; Kumar, Nirmal; Chandra, Nagasuma R; Surolia, Avadhesha

    2008-04-01

    The role of lectins in mediating cancer metastasis, apoptosis as well as various other signaling events has been well established in the past few years. Data on various aspects of the role of lectins in cancer are being accumulated at a rapid pace. The data on lectins available in the literature are so diverse that it becomes difficult and time-consuming, if not impossible, to comprehend the advances in various areas and obtain the maximum benefit. Not only do the lectins vary significantly in their individual functional roles, but they are also diverse in their sequences, structures, binding site architectures, quaternary structures, carbohydrate affinities and specificities as well as their potential applications. An organization of these seemingly independent data into a common framework is essential in order to achieve effective use of all the data towards understanding the roles of different lectins in different aspects of cancer and any resulting applications. An integrated knowledge base (CancerLectinDB) together with appropriate analytical tools has therefore been developed for lectins relevant to any aspect of cancer, by collating and integrating diverse data. This database is unique in terms of providing sequence, structural, and functional annotations for lectins from all known sources in cancer and is expected to be a useful addition to the number of glycan-related resources now available to the community. The database has been implemented using MySQL on a Linux platform and web-enabled using Perl-CGI and Java tools. Data for individual lectins pertain to taxonomic, biochemical, domain architecture, molecular sequence and structural details as well as carbohydrate specificities. Extensive links have also been provided for relevant bioinformatics resources and analytical tools. Availability of diverse data integrated into a common framework is expected to be of high value for various studies on lectin cancer biology. CancerLectinDB can be accessed through http://proline.physics.iisc.ernet.in/cancerdb.

  2. Voice-enabled Knowledge Engine using Flood Ontology and Natural Language Processing

    NASA Astrophysics Data System (ADS)

    Sermet, M. Y.; Demir, I.; Krajewski, W. F.

    2015-12-01

    The Iowa Flood Information System (IFIS) is a web-based platform developed by the Iowa Flood Center (IFC) to provide access to flood inundation maps, real-time flood conditions, flood forecasts, flood-related data, information and interactive visualizations for communities in Iowa. The IFIS is designed for use by the general public, often people with no domain knowledge and a limited general science background. To improve effective communication with such an audience, we have introduced a voice-enabled knowledge engine on flood-related issues in IFIS. Instead of requiring users to navigate the many features and interfaces of the information system and web-based sources, the system provides dynamic computations based on a collection of built-in data, analysis, and methods. The IFIS Knowledge Engine connects to real-time stream gauges, in-house data sources, and analysis and visualization tools to answer natural language questions. Our goal is to systematize data and modeling results on flood-related issues in Iowa and to provide an interface for definitive answers to factual queries. The knowledge engine aims to make all flood-related knowledge in Iowa easily accessible to everyone and to support voice-enabled natural language input. We aim to integrate and curate all flood-related data, implement analytical and visualization tools, and make it possible to compute answers from questions. The IFIS explicitly implements analytical methods and models, as algorithms, and curates all flood-related data and resources so that all these resources are computable. The IFIS Knowledge Engine computes the answer by deriving it from its computational knowledge base: it processes the statement, accesses the data warehouse, runs complex database queries on the server side, and returns outputs in various formats. This presentation provides an overview of the IFIS Knowledge Engine, its unique information interface and functionality as an educational tool, and discusses future plans for providing knowledge on flood-related issues and resources. The IFIS Knowledge Engine provides an alternative access method to the comprehensive set of tools and data resources available in IFIS. The current implementation of the system accepts free-form input and provides voice recognition capabilities within browser and mobile applications.
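
    A toy sketch of the kind of routing such a knowledge engine performs, mapping a free-form question to a built-in data lookup; the intents, keywords, gauge identifier, and handler names are hypothetical and are not taken from the IFIS implementation.

        def current_stage(gauge_id):
            # Placeholder for a real-time stream gauge lookup (values invented)
            return {"USGS-05454500": 4.2}.get(gauge_id)

        def route(question):
            """Very small keyword-based router from a question to a built-in computation."""
            q = question.lower()
            if "stage" in q or "water level" in q:
                return ("stream_stage", current_stage("USGS-05454500"))
            if "forecast" in q:
                return ("flood_forecast", "see forecast module")
            return ("unknown", None)

        print(route("What is the water level on the Iowa River right now?"))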

  3. The Impact of Proactive Student-Success Coaching Using Predictive Analytics on Community College Students

    ERIC Educational Resources Information Center

    Hall, Mark Monroe

    2017-01-01

    The purpose of this study was to examine the effects of proactive student-success coaching, informed by predictive analytics, on student academic performance and persistence. Specifically, semester GPA and semester-to-semester student persistence were the investigated outcomes. Uniquely, the community college focused the intervention on only…

  4. Ionic liquids in solid-phase microextraction: a review.

    PubMed

    Ho, Tien D; Canestraro, Anthony J; Anderson, Jared L

    2011-06-10

    Solid-phase microextraction (SPME) has undergone a surge in popularity within the field of analytical chemistry in the past two decades since its introduction. Owing to its nature of extraction, SPME has become widely known as a quick and cost-effective sample preparation technique. Although SPME has demonstrated extraordinary versatility in sampling capabilities, the technique continues to experience a tremendous growth in innovation. Presently, increasing efforts have been directed towards the engineering of novel sorbent material in order to expand the applicability of SPME for a wider range of analytes and matrices. This review highlights the application of ionic liquids (ILs) and polymeric ionic liquids (PILs) as innovative sorbent materials for SPME. Characterized by their unique physico-chemical properties, these compounds can be structurally-designed to selectively extract target analytes based on unique molecular interactions. To examine the advantages of IL and PIL-based sorbent coatings in SPME, the field is reviewed by gathering available experimental data and exploring the sensitivity, linear calibration range, as well as detection limits for a variety of target analytes in the methods that have been developed. Copyright © 2011 Elsevier B.V. All rights reserved.

  5. Harnessing Scientific Literature Reports for Pharmacovigilance

    PubMed Central

    Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier

    2017-01-01

    Objectives: We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers’ capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. Methods: A prototype, open source, web-based, software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. Results: All usability test participants cited the tool’s ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool’s automated literature search relative to a manual ‘all fields’ PubMed search, missing drugs and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Conclusions: Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction. PMID:28326432
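
    The abstract does not spell out which disproportionality statistic the prototype computes, so the snippet below shows one standard measure such tools build on, the proportional reporting ratio (PRR), applied to an invented 2x2 table of MeSH-derived drug-event citation counts; the counts and the PRR > 2 screening heuristic are illustrative assumptions, not the FDA tool's actual algorithm.

        def prr(a, b, c, d):
            """Proportional reporting ratio for a drug-event 2x2 table:
            a = citations with the drug and the event, b = drug without the event,
            c = event without the drug, d = neither."""
            return (a / (a + b)) / (c / (c + d))

        # Invented citation counts for one candidate drug-adverse event pair
        a, b, c, d = 40, 960, 200, 98800
        score = prr(a, b, c, d)
        # A common screening heuristic flags PRR > 2 when at least 3 reports exist
        print(f"PRR = {score:.2f}, flagged = {score > 2 and a >= 3}")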

  6. moocRP: Enabling Open Learning Analytics with an Open Source Platform for Data Distribution, Analysis, and Visualization

    ERIC Educational Resources Information Center

    Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin

    2016-01-01

    In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features as well as provides a simple analytics module upload…

  7. Adequacy of surface analytical tools for studying the tribology of ceramics

    NASA Technical Reports Server (NTRS)

    Sliney, H. E.

    1986-01-01

    Surface analytical tools are very beneficial in tribological studies of ceramics. Traditional methods of optical microscopy, XRD, XRF, and SEM should be combined with newer surface-sensitive techniques, especially AES and XPS. ISS and SIMS can also be useful in providing additional composition details. Tunneling microscopy and electron energy loss spectroscopy are less known techniques that may also prove useful.

  8. Understanding Emotions as Situated, Embodied, and Fissured: Thinking with Theory to Create an Analytical Tool

    ERIC Educational Resources Information Center

    Kuby, Candace R.

    2014-01-01

    An emerging theoretical perspective is that emotions are a verb or something we do in relation to others. Studies that demonstrate ways to analyze emotions from a performative stance are scarce. In this article, a new analytical tool is introduced; a critical performative analysis of emotion (CPAE) that draws upon three theoretical perspectives:…

  9. Analytical Tools for Affordability Analysis

    DTIC Science & Technology

    2015-05-01

    Excerpted cost-model topics include unit cost as a function of learning and rate (Womer), and learning with forgetting (Benkard), in which learning depreciates over time. Prepared by David Tate, Cost Analysis and Research Division, Institute for Defense Analyses, Alexandria, VA.

  10. Operational Analysis of Time-Optimal Maneuvering for Imaging Spacecraft

    DTIC Science & Technology

    2013-03-01

    The analysis of time-optimal maneuvering for the Singapore-developed X-SAT imaging spacecraft is facilitated through the use of AGI's Systems Tool Kit (STK) software and an Analytic Hierarchy Process (AHP)-based approach.

  11. Social Capital: An Analytical Tool for Exploring Lifelong Learning and Community Development. CRLRA Discussion Paper.

    ERIC Educational Resources Information Center

    Kilpatrick, Sue; Field, John; Falk, Ian

    The possibility of using the concept of social capital as an analytical tool for exploring lifelong learning and community development was examined. The following were among the topics considered: (1) differences between definitions of the concept of social capital that are based on collective benefit and those that define social capital as a…

  12. The Metaphorical Department Head: Using Metaphors as Analytic Tools to Investigate the Role of Department Head

    ERIC Educational Resources Information Center

    Paranosic, Nikola; Riveros, Augusto

    2017-01-01

    This paper reports the results of a study that examined the ways a group of department heads in Ontario, Canada, describe their role. Despite their ubiquity and importance, department heads have been seldom investigated in the educational leadership literature. The study uses the metaphor as an analytic tool to examine the ways participants talked…

  13. Analytical Tools for Behavioral Influences Operations

    DTIC Science & Technology

    2003-12-01

    Excerpts from the report: the project is envisioned as a foundation for future work by NASIC analysts, who will use the tools identified in this study. Though the study took all three categories into account, most (90%) of the SRA team’s effort was focused on identifying and analyzing …

  14. IGSN at Work in the Land Down Under: Exploiting an International Sample Identifier System to Enhance Reproducibility of Australian Geochemical and Geochronological Data.

    NASA Astrophysics Data System (ADS)

    Bastrakova, I.; Klump, J. F.; McInnes, B.; Wyborn, L. A.; Brown, A.

    2015-12-01

    The International Geo-Sample Number (IGSN) provides a globally unique identifier for physical samples used to generate analytical data. This unique identifier provides the ability to link each physical sample to any analytical data undertaken on that sample, as well as to any publications derived from any data derived on the sample. IGSN is particularly important for geochemical and geochronological data, where numerous analytical techniques can be undertaken at multiple analytical facilities not only on the parent rock sample itself, but also on derived sample splits and mineral separates. Australia now has three agencies implementing IGSN: Geoscience Australia, CSIRO and Curtin University. All three have now combined into a single project, funded by the Australian Research Data Services program, to better coordinate the implementation of IGSN in Australia, in particular how these agencies allocate IGSN identifiers. The project will register samples from pilot applications in each agency including the CSIRO National Collection of Mineral Spectra database, the Geoscience Australia sample collection, and the Digital Mineral Library of the John De Laeter Centre for Isotope Research at Curtin University. These local agency catalogues will then be aggregated into an Australian portal, which will ultimately be expanded for all geoscience specimens. The development of this portal will also involve developing a common core metadata schema for the description of Australian geoscience specimens, as well as formulating agreed governance models for registering Australian samples. These developments aim to enable a common approach across Australian academic, research organisations and government agencies for the unique identification of geoscience specimens and any analytical data and/or publications derived from them. The emerging pattern of governance and technical collaboration established in Australia may also serve as a blueprint for similar collaborations internationally.

  15. Use of the MATRIXx Integrated Toolkit on the Microwave Anisotropy Probe Attitude Control System

    NASA Technical Reports Server (NTRS)

    Ward, David K.; Andrews, Stephen F.; McComas, David C.; ODonnell, James R., Jr.

    1999-01-01

    Recent advances in analytical software tools allow the analysis, simulation, flight code, and documentation of an algorithm to be generated from a single source, all within one integrated analytical design package. NASA's Microwave Anisotropy Probe project has used one such package, Integrated Systems' MATRIXx suite, in the design of the spacecraft's Attitude Control System. The project's experience with the linear analysis, simulation, code generation, and documentation tools will be presented and compared with more traditional development tools. In particular, the quality of the flight software generated will be examined in detail. Finally, lessons learned on each of the tools will be shared.

  16. Identification of "At Risk" Students Using Learning Analytics: The Ethical Dilemmas of Intervention Strategies in a Higher Education Institution

    ERIC Educational Resources Information Center

    Lawson, Celeste; Beer, Colin; Rossi, Dolene; Moore, Teresa; Fleming, Julie

    2016-01-01

    Learning analytics is an emerging field in which sophisticated analytic tools are used to inform and improve learning and teaching. Researchers within a regional university in Australia identified an association between interaction and student success in online courses and subsequently developed a learning analytics system aimed at informing…

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, D.W.; Heinrich, R.R.; Graczyk, D.G.

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year 1991 (October 1990 through September 1991). This is the eighth annual report for the ACL. The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. In addition, the ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques.

  18. Tools for studying dry-cured ham processing by using computed tomography.

    PubMed

    Santos-Garcés, Eva; Muñoz, Israel; Gou, Pere; Sala, Xavier; Fulladosa, Elena

    2012-01-11

    An accurate knowledge and optimization of dry-cured ham elaboration processes could help to reduce operating costs and maximize product quality. The development of nondestructive tools to characterize chemical parameters such as salt and water contents and a_w during processing is of special interest. In this paper, predictive models for salt content (R² = 0.960, RMSECV = 0.393), water content (R² = 0.912, RMSECV = 1.751), and a_w (R² = 0.906, RMSECV = 0.008), which comprise the whole elaboration process, were developed. These predictive models were used to develop analytical tools such as distribution diagrams, line profiles, and regions of interest (ROIs) from the acquired computed tomography (CT) scans. These CT analytical tools provided quantitative information on salt, water, and a_w in terms of content but also distribution throughout the process. The information obtained was applied to two industrial case studies. The main drawback of the predictive models and CT analytical tools is the disturbance that fat produces in water content and a_w predictions.
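
    A minimal sketch of the kind of CT analytical tools named above (line profiles and ROI statistics over a predicted salt-content map); the 3D array, its units, and the index ranges are hypothetical stand-ins, not the authors' data or code.

        import numpy as np

        def line_profile(volume, slice_idx, row_idx):
            """Values along one row of one CT slice of the predicted map."""
            return volume[slice_idx, row_idx, :]

        def roi_mean(volume, z0, z1, y0, y1, x0, x1):
            """Mean predicted value inside a rectangular region of interest."""
            return volume[z0:z1, y0:y1, x0:x1].mean()

        # Hypothetical 3D map of predicted salt content (% w/w) derived from CT values
        salt_map = np.random.default_rng(0).uniform(1.0, 6.0, size=(50, 128, 128))
        print(line_profile(salt_map, 25, 64)[:5])
        print(f"ROI mean salt content: {roi_mean(salt_map, 20, 30, 40, 90, 40, 90):.2f} %")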

  19. Development of Multi-slice Analytical Tool to Support BIM-based Design Process

    NASA Astrophysics Data System (ADS)

    Atmodiwirjo, P.; Johanes, M.; Yatmo, Y. A.

    2017-03-01

    This paper describes the on-going development of a computational tool to analyse architecture and interior space based on a multi-slice representation approach that is integrated with Building Information Modelling (BIM). Architecture and interior space is experienced as a dynamic entity with spatial properties that may vary from one part of the space to another, so the representation of space through standard architectural drawings is sometimes not sufficient. The representation of space as a series of slices, each with certain properties, therefore becomes important, so that the different characteristics in each part of the space can inform the design process. The analytical tool is developed as a stand-alone application that utilises data exported from a generic BIM modelling tool. The tool would be useful in assisting design development processes that apply BIM, particularly for the design of architecture and interior spaces that are experienced as continuous spaces. The tool allows the identification of how the spatial properties change dynamically throughout the space and allows the prediction of potential design problems. Integrating the multi-slice analytical tool into a BIM-based design process could thereby assist architects to generate better designs and to avoid unnecessary costs that are often caused by failure to identify problems during design development stages.

  20. Sustainability Tools Inventory Initial Gap Analysis

    EPA Science Inventory

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consu...

  1. Origin of Analyte-Induced Porous Silicon Photoluminescence Quenching.

    PubMed

    Reynard, Justin M; Van Gorder, Nathan S; Bright, Frank V

    2017-09-01

    We report on gaseous analyte-induced photoluminescence (PL) quenching of porous silicon, as-prepared (ap-pSi) and oxidized (ox-pSi). By using steady-state and emission wavelength-dependent time-resolved intensity luminescence measurements in concert with a global analysis scheme, we find that the analyte-induced quenching is best described by a three-component static quenching model. In the model, there are blue, green, and red emitters (associated with the nanocrystallite core and surface trap states) that each exhibit unique analyte-emitter association constants and these association constants are a consequence of differences in the pSi surface chemistries.
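
    The three-component static quenching model is not written out in the abstract; for orientation, a standard static (associative) quenching expression for a single emitter class, summed over the blue, green, and red components, takes roughly the following form (a textbook relation offered as an assumption about the model's structure, not the authors' exact equations):

        \frac{I_0^{(i)}}{I^{(i)}} = 1 + K_i\,[Q], \qquad
        I(\lambda) = \sum_{i \in \{\mathrm{blue},\,\mathrm{green},\,\mathrm{red}\}} \frac{I_0^{(i)}(\lambda)}{1 + K_i\,[Q]}

    where K_i is the analyte-emitter association constant for component i and [Q] is the gaseous analyte concentration.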

  2. New Tools to Prepare ACE Cross-section Files for MCNP Analytic Test Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.

    Monte Carlo calculations using one-group cross sections, multigroup cross sections, or simple continuous-energy cross sections are often used to: (1) verify production codes against known analytical solutions, (2) verify new methods and algorithms that do not involve detailed collision physics, (3) compare Monte Carlo calculation methods with deterministic methods, and (4) teach fundamentals to students. In this work we describe two new tools for preparing the ACE cross-section files to be used by MCNP® for these analytic test problems, simple_ace.pl and simple_ace_mg.pl.
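
    As an example of the analytic solutions such simplified cross-section sets make possible, the sketch below evaluates the one-group infinite-medium multiplication factor, a quantity that a Monte Carlo eigenvalue result can be checked against; the cross-section values are arbitrary illustrative numbers and the snippet is not part of the simple_ace tools.

        def k_infinity(nu, sigma_f, sigma_a):
            """One-group infinite-medium multiplication factor, k_inf = nu * Sigma_f / Sigma_a."""
            return nu * sigma_f / sigma_a

        # Arbitrary one-group macroscopic cross sections (cm^-1) and neutrons per fission
        nu, sigma_f, sigma_a = 2.5, 0.05, 0.12
        print(f"analytic k_inf = {k_infinity(nu, sigma_f, sigma_a):.4f}")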

  3. Using geovisual analytics in Google Earth to understand disease distribution: a case study of campylobacteriosis in the Czech Republic (2008-2012).

    PubMed

    Marek, Lukáš; Tuček, Pavel; Pászto, Vít

    2015-01-28

    Visual analytics aims to connect the processing power of information technologies with the user's capacity for logical thinking and reasoning through complex visual interaction. Moreover, most of these data contain a spatial component, so the need for geovisual tools and methods arises. One can either develop one's own system, although the dissemination of findings and its usability might then be problematic, or utilize a widespread and well-known platform. The aim of this paper is to prove the applicability of Google Earth™ software as a tool for geovisual analytics that helps to understand the spatio-temporal patterns of disease distribution. We combined complex joint spatio-temporal analysis with comprehensive visualisation. We analysed the spatio-temporal distribution of campylobacteriosis in the Czech Republic between 2008 and 2012. We applied three main approaches in the study: (1) geovisual analytics of the surveillance data, visualised in the form of a bubble chart; (2) geovisual analytics of the disease's weekly incidence surfaces computed by spatio-temporal kriging; and (3) spatio-temporal scan statistics employed in order to identify clusters of affected municipalities with high or low rates. The final data are stored in Keyhole Markup Language files and visualised in Google Earth™ in order to apply geovisual analytics. Using geovisual analytics we were able to display and retrieve information from a complex dataset efficiently. Instead of searching for patterns in a series of static maps or using numerical statistics, we created a set of interactive visualisations in order to explore and communicate the results of the analyses to a wider audience. The results of the geovisual analytics identified periodical patterns in the behaviour of the disease as well as fourteen spatio-temporal clusters of increased relative risk. We prove that Google Earth™ software is a usable tool for the geovisual analysis of disease distribution. Google Earth™ has many indisputable advantages (it is widespread, freely available, and intuitive, and offers space-time visualisation capabilities, animations, and communication of results); nevertheless, it still needs to be combined with pre-processing tools that prepare the data in a form suitable for geovisual analytics itself.
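
    The study's results are exported as Keyhole Markup Language (KML) for display in Google Earth™; the snippet below is a minimal, generic illustration of writing one cluster as a KML placemark in Python. The field names (relative risk, week range) and coordinates are invented for the example and do not reproduce the authors' schema.

        KML_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
        <kml xmlns="http://www.opengis.net/kml/2.2"><Document><Placemark>
        <name>{name}</name>
        <description>relative risk {rr:.2f}; weeks {weeks}</description>
        <Point><coordinates>{lon},{lat},0</coordinates></Point>
        </Placemark></Document></kml>
        """

        def write_cluster_kml(path, name, lon, lat, rr, weeks):
            """Write a single spatio-temporal cluster as a KML placemark file."""
            with open(path, "w", encoding="utf-8") as fh:
                fh.write(KML_TEMPLATE.format(name=name, lon=lon, lat=lat, rr=rr, weeks=weeks))

        # Invented cluster roughly centred on Olomouc, Czech Republic
        write_cluster_kml("cluster.kml", "cluster 1", 17.251, 49.594, 1.87, "23-31/2010")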

  4. Unique Sensor Plane Maps Invisible Toxins for First Responders

    ScienceCinema

    Kroutil, Robert; Thomas, Mark; Aten, Keith

    2018-05-30

    A unique airborne emergency response tool, ASPECT is a Los Alamos/U.S. Environmental Protection Agency project that can put chemical and radiological mapping tools in the air over an accident scene. The name ASPECT is an acronym for Airborne Spectral Photometric Environmental Collection Technology.

  5. Learner Dashboards a Double-Edged Sword? Students' Sense-Making of a Collaborative Critical Reading and Learning Analytics Environment for Fostering 21st-Century Literacies

    ERIC Educational Resources Information Center

    Pei-Ling Tan, Jennifer; Koh, Elizabeth; Jonathan, Christin; Yang, Simon

    2017-01-01

    The affordances of learning analytics (LA) tools and solutions are being increasingly harnessed for enhancing 21st century pedagogical and learning strategies and outcomes. However, use cases and empirical understandings of students' experiences with LA tools and environments aimed at fostering 21st century literacies, especially in the K-12…

  6. Visual Information for the Desktop, version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2006-03-29

    VZIN integrates visual analytics capabilities into popular desktop tools to aid a user in searching and understanding an information space. VZIN allows users to Drag-Drop-Visualize-Explore-Organize information within tools such as Microsoft Office, Windows Explorer, Excel, and Outlook. VZIN is tailorable to specific client or industry requirements. VZIN follows the desktop metaphors so that advanced analytical capabilities are available with minimal user training.

  7. Commentary on "Theory-Led Design of Instruments and Representations in Learning Analytics: Developing a Novel Tool for Orchestration of Online Collaborative Learning"

    ERIC Educational Resources Information Center

    Teplovs, Chris

    2015-01-01

    This commentary reflects on the contributions to learning analytics and theory by a paper that describes how multiple theoretical frameworks were woven together to inform the creation of a new, automated discourse analysis tool. The commentary highlights the contributions of the original paper, provides some alternative approaches, and touches on…

  8. Time as a Key Topic in Health Professionals’ Perceptions of Clinical Handovers

    PubMed Central

    Watson, Bernadette M.; Jones, Liz; Cretchley, Julia

    2014-01-01

    Clinical handovers are an essential part of the daily care and treatment of hospital patients. We invoked a language and social psychology lens to investigate how different health professional groups discussed the communication problems and strengths they experienced in handovers. We conducted in-depth interviews with three different health professional groups within a large metropolitan hospital. We used Leximancer text analytics software as a tool to analyze the data. Results showed that time was of concern to all groups in both similar and diverse ways. All professionals discussed time management, time pressures, and the difficulties of coordinating different handovers. Each professional group had its own unique perceptions and priorities about handovers. Our findings indicated that health professionals understood what was required for handover improvement but did not have the extra capacity to alter their current environment. Hospital management, with clinicians, need to implement handover schedule processes that prioritize interprofessional representation. PMID:28462291

  9. Investigation of burn effect on skin using simultaneous Raman-Brillouin spectroscopy, and fluorescence microspectroscopy

    NASA Astrophysics Data System (ADS)

    Coker, Zachary; Meng, Zhaokai; Troyanova-Wood, Maria; Traverso, Andrew; Ballmann, Charles; Petrov, Georgi; Ibey, Bennett L.; Yakovlev, Vladislav

    2017-02-01

    Burns are thermal injuries that can completely damage or at least compromise the protective function of skin, and affect the ability of tissues to manage moisture. Burn-damaged tissues exhibit lower elasticity than healthy tissues, due to significantly reduced water concentrations and plasma retention. Current methods for determining burn intensity are limited to visual inspection and potential hospital x-ray examination. We present a unique confocal microscope capable of measuring Raman and Brillouin spectra simultaneously, with concurrent fluorescence investigation from a single spatial location, and demonstrate its application by investigating and characterizing the properties of burn-afflicted tissue on a chicken skin model. Raman and Brillouin scattering offer complementary information about a material's chemical and mechanical structure, while fluorescence can serve as a useful diagnostic indicator and imaging tool. The developed instrument has the potential for very diverse analytical applications in basic biomedical science and biomedical diagnostics and imaging.

  10. Box truss analysis and technology development. Task 1: Mesh analysis and control

    NASA Technical Reports Server (NTRS)

    Bachtell, E. E.; Bettadapur, S. S.; Coyner, J. V.

    1985-01-01

    An analytical tool was developed to model, analyze and predict the RF performance of box truss antennas with reflective mesh surfaces. The analysis system is unique in that it integrates custom-written programs for cord-tied mesh surfaces, thereby drastically reducing the cost of analysis. The analysis system is capable of determining the RF performance of antennas under any type of manufacturing or operating environment by integrating the various disciplines of design, finite element analysis, surface best-fit analysis and RF analysis. The Integrated Mesh Analysis System consists of six separate programs: the Mesh Tie System Model Generator, the Loadcase Generator, the Model Optimizer, the Model Solver, the Surface Topography Solver and the RF Performance Solver. Additionally, a study using the mesh analysis system was performed to determine the effect of on-orbit calibration, i.e., surface adjustment, on a typical box truss antenna.
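
    The report's own RF performance solver is not described in detail here; as background for why the surface best-fit step matters, the standard Ruze relation links the RMS surface error of a reflector mesh to its gain loss (a general antenna-engineering formula, offered as context rather than as the report's method):

        \frac{G}{G_0} = \exp\!\left[-\left(\frac{4\pi\varepsilon}{\lambda}\right)^{2}\right]

    where ε is the RMS surface error after best fit and λ is the operating wavelength.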

  11. One health-one medicine: unifying human and animal medicine within an evolutionary paradigm.

    PubMed

    Currier, Russell W; Steele, James H

    2011-08-01

    One health is a concept dating from early civilization that promoted the view that there was no major distinction between animal and human medicine. Although it persisted through the 19th century, this common vision was then all but forgotten in the early 20th century. It is now experiencing a renaissance, coincident with an awakening of the role that evolutionary biology plays in human and animal health, including sexually transmitted infections (STIs). A number of STIs in humans have comparable infections in animals; likewise, both humans and animals have STIs unique to each mammalian camp. These similarities and differences offer opportunities for basic medical and public health studies, including evolutionary insights that can be gleaned from ongoing interdisciplinary investigation--especially with the molecular analytical tools available--in what can become a golden age of mutually helpful discovery. © 2011 New York Academy of Sciences.

  12. Distributed representations in memory: Insights from functional brain imaging

    PubMed Central

    Rissman, Jesse; Wagner, Anthony D.

    2015-01-01

    Forging new memories for facts and events, holding critical details in mind on a moment-to-moment basis, and retrieving knowledge in the service of current goals all depend on a complex interplay between neural ensembles throughout the brain. Over the past decade, researchers have increasingly leveraged powerful analytical tools (e.g., multi-voxel pattern analysis) to decode the information represented within distributed fMRI activity patterns. In this review, we discuss how these methods can sensitively index neural representations of perceptual and semantic content, and how leverage on the engagement of distributed representations provides unique insights into distinct aspects of memory-guided behavior. We emphasize that, in addition to characterizing the contents of memories, analyses of distributed patterns shed light on the processes that influence how information is encoded, maintained, or retrieved, and thus inform memory theory. We conclude by highlighting open questions about memory that can be addressed through distributed pattern analyses. PMID:21943171

  13. Symmetry breaking and the geometry of reduced density matrices

    NASA Astrophysics Data System (ADS)

    Zauner, V.; Draxler, D.; Vanderstraeten, L.; Haegeman, J.; Verstraete, F.

    2016-11-01

    The concept of symmetry breaking and the emergence of corresponding local order parameters constitute the pillars of modern many-body physics. We demonstrate that the existence of symmetry breaking is a consequence of the geometric structure of the convex set of reduced density matrices of all possible many-body wavefunctions. The surfaces of these convex bodies exhibit non-analyticities, which signal the emergence of symmetry breaking and of an associated order parameter, and also show different characteristics for different types of phase transitions. We illustrate this with three paradigmatic examples of many-body systems exhibiting symmetry breaking: the quantum Ising model, the classical q-state Potts model in two dimensions at finite temperature, and the ideal Bose gas in three dimensions at finite temperature. This state-based viewpoint on phase transitions provides a unique novel tool for studying exotic many-body phenomena in quantum and classical systems.
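
    For concreteness, the quantum Ising model mentioned above is usually taken to be the transverse-field Ising chain, with the longitudinal magnetization as the symmetry-breaking order parameter; the form below is the standard textbook Hamiltonian, quoted as an assumption about which model is meant rather than from the paper itself:

        H = -J\sum_{\langle i,j\rangle} \sigma^{z}_{i}\sigma^{z}_{j} - h\sum_{i}\sigma^{x}_{i},
        \qquad m = \langle \sigma^{z}_{i}\rangle

    Here the global Z_2 symmetry σ^z → -σ^z is spontaneously broken in the ordered phase (for the one-dimensional chain at zero temperature, when h < J), and m plays the role of the local order parameter whose emergence the convex-geometry argument addresses.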

  14. Space Station Freedom Data Assessment Study

    NASA Technical Reports Server (NTRS)

    Johnson, Anngienetta R.; Deskevich, Joseph

    1990-01-01

    The SSF Data Assessment Study was initiated to identify payload and operations data requirements to be supported in the Space Station era. To initiate the study, payload requirements from the projected SSF user community were obtained using an electronic questionnaire. The results of the questionnaire were incorporated in a personal-computer-compatible database used for mission scheduling and end-to-end communications analyses. This paper discusses data flow paths and associated latencies, communications bottlenecks, resource needs versus availability, payload scheduling 'warning flags', and payload data loading requirements for each major milestone in the Space Station buildup sequence. This paper also presents the statistical and analytical assessments produced using the database, an experiment scheduling program, and a Space Station unique end-to-end simulation model. The modeling concepts and simulation methodologies presented in this paper provide a foundation for forecasting communication requirements and identifying modeling tools to be used in the SSF Tactical Operations Planning (TOP) process.

  15. Classification of Ion Mobility Data Using the Neural Network Approach

    NASA Technical Reports Server (NTRS)

    Duong, T. A.; Kanik, I.

    2005-01-01

    Determination of the atmospheric and surface elemental and molecular composition of various solar system bodies is essential to the development of a firm understanding of the origin and evolution of the solar system. Furthermore, such data are needed to address the intriguing question of whether or not life exists or once existed elsewhere in the Solar System. As such, these measurements are among the primary scientific goals of NASA's current and future planetary missions. In recent years, significant progress toward both miniaturization and field portability of in situ analytical separation and detection devices has been made with future planetary explorations in mind. However, despite all these advances, accurate in situ identification of atmospheric and surface compounds remains a big challenge. In response, we are developing various hardware and software tools that would enable us to uniquely identify species of interest in a complex chemical environment.

  16. 3D-Printed Graphene/Polylactic Acid Electrodes Promise High Sensitivity in Electroanalysis.

    PubMed

    Manzanares Palenzuela, C Lorena; Novotný, Filip; Krupička, Petr; Sofer, Zdeněk; Pumera, Martin

    2018-05-01

    Additive manufacturing provides a unique tool for prototyping structures toward electrochemical sensing, due to its ability to produce highly versatile, tailored-shaped devices in a low-cost and fast way with minimized waste. Here we present 3D-printed graphene electrodes for electrochemical sensing. Ring- and disc-shaped electrodes were 3D-printed with a Fused Deposition Modeling printer and characterized using cyclic voltammetry and scanning electron microscopy. Different redox probes (K3Fe(CN)6:K4Fe(CN)6, FeCl3, ascorbic acid, Ru(NH3)6Cl3, and ferrocene monocarboxylic acid) were used to assess the electrochemical performance of these devices. Finally, the electrochemical detection of picric acid and ascorbic acid was carried out as proof-of-concept analytes for sensing applications. Such customizable platforms represent promising alternatives to conventional electrodes for a wide range of sensing applications.

  17. Investigation of Outbreaks Complicated by Universal Exposure

    PubMed Central

    Bousema, Teun; Oliver, Isabel

    2012-01-01

    Outbreaks in which most or all persons were exposed to the same suspected source of infection, so-called universal exposure, are common. They represent a challenge for public health specialists because conducting analytical studies in such investigations is complicated by the absence of a nonexposed group. We describe different strategies that can support investigations of outbreaks with universal exposure. The value of descriptive epidemiology, extensive environmental investigation, and the hypothesis-generation phase cannot be overemphasized. An exposure that seems universal may in fact not be universal when additional aspects of the exposure are taken into account. Each exposure has unique characteristics that may not be captured when investigators rely on the tools readily at hand, such as standard questionnaires. We therefore encourage field epidemiologists to be creative and consider the use of alternative data sources or original techniques in their investigations of outbreaks with universal exposure. PMID:23092616

  18. Proposed in situ secondary ion mass spectrometry on Mars.

    PubMed

    Inglebert, R L; Klossa, B; Lorin, J C; Thomas, R

    1995-01-01

    Secondary ion mass spectrometry is a powerful analytical tool, which has the potentiality, through molecular ion emission, of detecting minor phases, as well as the unique capability of directly measuring isotope abundances in mineral or organic phases without any prior physical, chemical or thermal processing. Applied to the in situ analysis of the Martian regolith, it can provide evidence of the presence of carbonates and, by inference (if carbonates constitute significant deposits), of past liquid water--a necessary condition for the development of life. In addition, oxygen isotopic composition of carbonates preserves a record of the temperature at which this phase precipitated and may therefore help decipher the past climatology of Mars. Detection of a carbon isotopic composition shift between carbonates and organic matter (on Earth, the result of a kinetic fractionation effect during photosynthesis) would provide a definite clue regarding the existence of a past biochemical activity on Mars.
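
    The oxygen isotopic compositions referred to above are conventionally reported in delta notation relative to a reference standard; the definition below is the standard geochemical convention (general background, not a Mars-specific result from this paper):

        \delta^{18}\mathrm{O} = \left(\frac{\left({}^{18}\mathrm{O}/{}^{16}\mathrm{O}\right)_{\mathrm{sample}}}{\left({}^{18}\mathrm{O}/{}^{16}\mathrm{O}\right)_{\mathrm{standard}}} - 1\right)\times 1000 \quad \text{(in per mil)}

    Carbonate paleotemperatures then follow from empirical calibrations of the temperature dependence of oxygen-isotope fractionation between carbonate and water.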

  19. Search Analytics: Automated Learning, Analysis, and Search with Open Source

    NASA Astrophysics Data System (ADS)

    Hundman, K.; Mattmann, C. A.; Hyon, J.; Ramirez, P.

    2016-12-01

    The sheer volume of unstructured scientific data makes comprehensive human analysis impossible, resulting in missed opportunities to identify relationships, trends, gaps, and outliers. As the open source community continues to grow, tools like Apache Tika, Apache Solr, Stanford's DeepDive, and Data-Driven Documents (D3) can help address this challenge. With a focus on journal publications and conference abstracts often in the form of PDF and Microsoft Office documents, we've initiated an exploratory NASA Advanced Concepts project aiming to use the aforementioned open source text analytics tools to build a data-driven justification for the HyspIRI Decadal Survey mission. We call this capability Search Analytics, and it fuses and augments these open source tools to enable the automatic discovery and extraction of salient information. In the case of HyspIRI, a hyperspectral infrared imager mission, key findings resulted from the extractions and visualizations of relationships from thousands of unstructured scientific documents. The relationships include links between satellites (e.g. Landsat 8), domain-specific measurements (e.g. spectral coverage) and subjects (e.g. invasive species). Using the above open source tools, Search Analytics mined and characterized a corpus of information that would be infeasible for a human to process. More broadly, Search Analytics offers insights into various scientific and commercial applications enabled through missions and instrumentation with specific technical capabilities. For example, the following phrases were extracted in close proximity within a publication: "In this study, hyperspectral images…with high spatial resolution (1 m) were analyzed to detect cutleaf teasel in two areas. …Classification of cutleaf teasel reached a users accuracy of 82 to 84%." Without reading a single paper we can use Search Analytics to automatically identify that a 1 m spatial resolution provides a cutleaf teasel detection users accuracy of 82-84%, which could have tangible, direct downstream implications for crop protection. Automatically assimilating this information expedites and supplements human analysis, and, ultimately, Search Analytics and its foundation of open source tools will result in more efficient scientific investment and research.
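
    A toy illustration of the kind of relationship extraction the quoted passage describes (linking an instrument capability to a detection outcome in free text); this is a simplistic regular-expression sketch, not the actual Tika/Solr/DeepDive pipeline, and the patterns are assumptions tuned to the example sentence only.

        import re

        text = ("In this study, hyperspectral images with high spatial resolution (1 m) "
                "were analyzed to detect cutleaf teasel in two areas. Classification of "
                "cutleaf teasel reached a users accuracy of 82 to 84%.")

        resolution = re.search(r"spatial resolution \(([\d.]+\s*m)\)", text)
        accuracy = re.search(r"accuracy of (\d+ to \d+%)", text)
        subject = re.search(r"detect ([a-z ]+?) in", text)

        if resolution and accuracy and subject:
            # Emit a simple (capability, subject, outcome) triple for downstream indexing
            print((resolution.group(1), subject.group(1).strip(), accuracy.group(1)))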

  20. Molecular Imprinting Technology in Quartz Crystal Microbalance (QCM) Sensors

    PubMed Central

    Emir Diltemiz, Sibel; Keçili, Rüstem; Ersöz, Arzu; Say, Rıdvan

    2017-01-01

    Molecularly imprinted polymers (MIPs) as artificial antibodies have received considerable scientific attention in the past years in the field of (bio)sensors since they have unique features that distinguish them from natural antibodies such as robustness, multiple binding sites, low cost, facile preparation and high stability under extreme operation conditions (higher pH and temperature values, etc.). On the other hand, the Quartz Crystal Microbalance (QCM) is an analytical tool based on the measurement of small mass changes on the sensor surface. QCM sensors are practical and convenient monitoring tools because of their specificity, sensitivity, high accuracy, stability and reproducibility. QCM devices are highly suitable for converting the recognition process achieved using MIP-based memories into a sensor signal. Therefore, the combination of a QCM and MIPs as synthetic receptors enhances the sensitivity through MIP process-based multiplexed binding sites using size, 3D-shape and chemical function having molecular memories of the prepared sensor system toward the target compound to be detected. This review aims to highlight and summarize the recent progress and studies in the field of (bio)sensor systems based on QCMs combined with molecular imprinting technology. PMID:28245588
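
    The mass sensitivity that underlies QCM detection is commonly summarized by the Sauerbrey relation between the frequency shift and the added rigid mass (a standard relation quoted as background, not a result of this review):

        \Delta f = -\frac{2 f_0^{2}}{A\sqrt{\rho_q\,\mu_q}}\,\Delta m

    where f_0 is the fundamental resonance frequency, A the piezoelectrically active electrode area, and ρ_q, μ_q the density and shear modulus of quartz; MIP recognition layers convert selective binding of the target into such a mass change.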

  1. Effect of Instructional Strategies and Individual Differences: A Meta-Analytic Assessment.

    ERIC Educational Resources Information Center

    Baker, Rose M.; Dwyer, Francis

    2005-01-01

    This meta-analytic study is unique and significant in that all the 1,341 learners in 11 studies completed the Group Embedded Figures Test (GEFT), interacted with the same instructional module, and completed the same five criterion tests measuring different types of educational objectives. Studies varied in presentation mode and type of independent…

  2. From Theory Use to Theory Building in Learning Analytics: A Commentary on "Learning Analytics to Support Teachers during Synchronous CSCL"

    ERIC Educational Resources Information Center

    Chen, Bodong

    2015-01-01

    In this commentary on Van Leeuwen (2015, this issue), I explore the relation between theory and practice in learning analytics. Specifically, I caution against adhering to one specific theoretical doctrine while ignoring others, suggest deeper applications of cognitive load theory to understanding teaching with analytics tools, and comment on…

  3. Microgenetic Learning Analytics Methods: Workshop Report

    ERIC Educational Resources Information Center

    Aghababyan, Ani; Martin, Taylor; Janisiewicz, Philip; Close, Kevin

    2016-01-01

    Learning analytics is an emerging discipline and, as such, benefits from new tools and methodological approaches. This work reviews and summarizes our workshop on microgenetic data analysis techniques using R, held at the second annual Learning Analytics Summer Institute in Cambridge, Massachusetts, on 30 June 2014. Specifically, this paper…

  4. IBM’s Health Analytics and Clinical Decision Support

    PubMed Central

    Sun, J.; Knoop, S.; Shabo, A.; Carmeli, B.; Sow, D.; Syed-Mahmood, T.; Rapp, W.

    2014-01-01

    Objectives: This survey explores the role of big data and health analytics developed by IBM in supporting the transformation of healthcare by augmenting evidence-based decision-making. Methods: Some problems in healthcare and strategies for change are described. It is argued that change requires better decisions, which, in turn, require better use of the many kinds of healthcare information. Analytic resources that address each of the information challenges are described. Examples of the role of each of the resources are given. Results: There are powerful analytic tools that utilize the various kinds of big data in healthcare to help clinicians make more personalized, evidenced-based decisions. Such resources can extract relevant information and provide insights that clinicians can use to make evidence-supported decisions. There are early suggestions that these resources have clinical value. As with all analytic tools, they are limited by the amount and quality of data. Conclusion: Big data is an inevitable part of the future of healthcare. There is a compelling need to manage and use big data to make better decisions to support the transformation of healthcare to the personalized, evidence-supported model of the future. Cognitive computing resources are necessary to manage the challenges in employing big data in healthcare. Such tools have been and are being developed. The analytic resources, themselves, do not drive, but support healthcare transformation. PMID:25123736

  5. Space station structures and dynamics test program

    NASA Technical Reports Server (NTRS)

    Moore, Carleton J.; Townsend, John S.; Ivey, Edward W.

    1987-01-01

    The design, construction, and operation of a low-Earth orbit space station poses unique challenges for development and implementation of new technology. The technology arises from the special requirement that the station be built and constructed to function in a weightless environment, where static loads are minimal and secondary to system dynamics and control problems. One specific challenge confronting NASA is the development of a dynamics test program for: (1) defining space station design requirements, and (2) identifying the characterizing phenomena affecting the station's design and development. A general definition of the space station dynamic test program, as proposed by MSFC, forms the subject of this report. The test proposal is a comprehensive structural dynamics program to be launched in support of the space station. The test program will help to define the key issues and/or problems inherent to large space structure analysis, design, and testing. Development of a parametric data base and verification of the math models and analytical analysis tools necessary for engineering support of the station's design, construction, and operation provide the impetus for the dynamics test program. The philosophy is to integrate dynamics into the design phase through extensive ground testing and analytical ground simulations of generic systems, prototype elements, and subassemblies. On-orbit testing of the station will also be used to define its capability.

  6. Insights from advanced analytics at the Veterans Health Administration.

    PubMed

    Fihn, Stephan D; Francis, Joseph; Clancy, Carolyn; Nielson, Christopher; Nelson, Karin; Rumsfeld, John; Cullen, Theresa; Bates, Jack; Graham, Gail L

    2014-07-01

    Health care has lagged behind other industries in its use of advanced analytics. The Veterans Health Administration (VHA) has three decades of experience collecting data about the veterans it serves nationwide through locally developed information systems that use a common electronic health record. In 2006 the VHA began to build its Corporate Data Warehouse, a repository for patient-level data aggregated from across the VHA's national health system. This article provides a high-level overview of the VHA's evolution toward "big data," defined as the rapid evolution of applying advanced tools and approaches to large, complex, and rapidly changing data sets. It illustrates how advanced analysis is already supporting the VHA's activities, which range from routine clinical care of individual patients--for example, monitoring medication administration and predicting risk of adverse outcomes--to evaluating a systemwide initiative to bring the principles of the patient-centered medical home to all veterans. The article also shares some of the challenges, concerns, insights, and responses that have emerged along the way, such as the need to smoothly integrate new functions into clinical workflow. While the VHA is unique in many ways, its experience may offer important insights for other health care systems nationwide as they venture into the realm of big data. Project HOPE—The People-to-People Health Foundation, Inc.

  7. Space Shuttle Columbia Post-Accident Analysis and Investigation

    NASA Technical Reports Server (NTRS)

    McDanels, Steven J.

    2006-01-01

    Although the loss of the Space Shuttle Columbia and its crew was tragic, the circumstances offered a unique opportunity to examine a multitude of components which had experienced one of the harshest environments ever encountered by engineered materials: a breakup at a velocity in excess of Mach 18 and an altitude exceeding 200,000 feet (63 km), resulting in a debris field 645 miles/1,038 km long and 10 miles/16 km wide. Various analytical tools were employed to ascertain the sequence of events leading to the disintegration of the Orbiter and to characterize the features of the debris. The testing and analyses all indicated that a breach in a left wing reinforced carbon/carbon composite leading edge panel was the access point for hot gases generated during re-entry to penetrate the structure of the vehicle and compromise the integrity of the materials and components in that area of the Shuttle. The analytical and elemental testing utilized such techniques as X-Ray Diffraction (XRD), Energy Dispersive X-Ray (EDX) dot mapping, Electron Micro Probe Analysis (EMPA), and X-Ray Photoelectron Spectroscopy (XPS) to characterize the deposition of intermetallics adjacent to the suspected location of the plasma breach in the leading edge of the left wing, Fig. 1.

  8. “RaMassays”: Synergistic Enhancement of Plasmon-Free Raman Scattering and Mass Spectrometry for Multimodal Analysis of Small Molecules

    PubMed Central

    Alessandri, Ivano; Vassalini, Irene; Bertuzzi, Michela; Bontempi, Nicolò; Memo, Maurizio; Gianoncelli, Alessandra

    2016-01-01

    SiO2/TiO2 core/shell (T-rex) beads were exploited as “all-in-one” building-block materials to create analytical assays that combine plasmon-free surface enhanced Raman scattering (SERS) and surface assisted laser desorption/ionization (SALDI) mass spectrometry (RaMassays). Such a multi-modal approach relies on the unique optical properties of T-rex beads, which are able to harvest and manage light in both UV and Vis range, making ionization and Raman scattering more efficient. RaMassays were successfully applied to the detection of small (molecular weight, M.W. <400 Da) molecules with a key relevance in biochemistry and pharmaceutical analysis. Caffeine and cocaine were utilized as molecular probes to test the combined SERS/SALDI response of RaMassays, showing excellent sensitivity and reproducibility. The differentiation between amphetamine/ephedrine and theophylline/theobromine couples demonstrated the synergistic reciprocal reinforcement of SERS and SALDI. Finally, the conversion of L-tyrosine in L-DOPA was utilized to probe RaMassays as analytical tools for characterizing reaction intermediates without introducing any spurious effects. RaMassays exhibit important advantages over plasmonic nanoparticles in terms of reproducibility, absence of interference and potential integration in multiplexed devices. PMID:27698368

  9. Safety risk assessment using analytic hierarchy process (AHP) during planning and budgeting of construction projects.

    PubMed

    Aminbakhsh, Saman; Gunduz, Murat; Sonmez, Rifat

    2013-09-01

    The inherent and unique risks on construction projects quite often present key challenges to contractors. Health and safety risks are among the most significant risks in construction projects since the construction industry is characterized by a relatively high injury and death rate compared to other industries. In construction project management, safety risk assessment is an important step toward identifying potential hazards and evaluating the risks associated with the hazards. Adequate prioritization of safety risks during risk assessment is crucial for planning, budgeting, and management of safety related risks. In this paper, a safety risk assessment framework is presented based on the theory of cost of safety (COS) model and the analytic hierarchy process (AHP). The main contribution of the proposed framework is that it presents a robust method for prioritization of safety risks in construction projects to create a rational budget and to set realistic goals without compromising safety. The framework provides a decision tool for the decision makers to determine the adequate accident/injury prevention investments while considering the funding limits. The proposed safety risk framework is illustrated using a real-life construction project and the advantages and limitations of the framework are discussed. Copyright © 2013 National Safety Council and Elsevier Ltd. All rights reserved.
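
    A minimal sketch of the AHP step such a framework relies on: deriving priority weights for competing safety risks from a reciprocal pairwise comparison matrix and checking its consistency ratio. The comparison values are invented for illustration and the snippet does not reproduce the paper's cost-of-safety model.

        import numpy as np

        # Invented pairwise comparisons of three safety risks (Saaty 1-9 scale)
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        weights = np.abs(eigvecs[:, k].real)
        weights /= weights.sum()                     # priority vector over the risks

        n = A.shape[0]
        ci = (eigvals.real[k] - n) / (n - 1)         # consistency index
        ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]          # Saaty's random index for n = 3..5
        print("weights:", np.round(weights, 3), " CR:", round(ci / ri, 3))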

  10. The color of complexes and UV-vis spectroscopy as an analytical tool of Alfred Werner's group at the University of Zurich.

    PubMed

    Fox, Thomas; Berke, Heinz

    2014-01-01

    Two PhD theses (Alexander Gordienko, 1912; Johannes Angerstein, 1914) and a dissertation in partial fulfillment of a PhD thesis (H. S. French, Zurich, 1914) are reviewed that deal with hitherto unpublished UV-vis spectroscopy work on coordination compounds in the group of Alfred Werner. The method of measurement of UV-vis spectra at Alfred Werner's time is described in detail. Examples of spectra of complexes are given, which were partly interpreted in terms of structure (cis ↔ trans configuration, counting the number of bands for structural relationships, and shift of general spectral features by consecutive replacement of ligands). A more complete interpretation of spectra was hampered at Alfred Werner's time by the lack of a light absorption theory and a correct theory of electron excitation, and the lack of a ligand field theory for coordination compounds. The experimental difficulty of data acquisition and of spectral interpretation might explain why this method did not experience a breakthrough in Alfred Werner's group and did not come to play a more prominent role as an important analytical method. Nevertheless, the application of UV-vis spectroscopy to coordination compounds was unique and novel, and attests to Alfred Werner's great aptitude and keenness to always try and go beyond conventional practice.

  11. ExDom: an integrated database for comparative analysis of the exon–intron structures of protein domains in eukaryotes

    PubMed Central

    Bhasi, Ashwini; Philip, Philge; Manikandan, Vinu; Senapathy, Periannan

    2009-01-01

    We have developed ExDom, a unique database for the comparative analysis of the exon–intron structures of 96 680 protein domains from seven eukaryotic organisms (Homo sapiens, Mus musculus, Bos taurus, Rattus norvegicus, Danio rerio, Gallus gallus and Arabidopsis thaliana). ExDom provides integrated access to exon-domain data through a sophisticated web interface which has the following analytical capabilities: (i) intergenomic and intragenomic comparative analysis of exon–intron structure of domains; (ii) color-coded graphical display of the domain architecture of proteins correlated with their corresponding exon-intron structures; (iii) graphical analysis of multiple sequence alignments of amino acid and coding nucleotide sequences of homologous protein domains from seven organisms; (iv) comparative graphical display of exon distributions within the tertiary structures of protein domains; and (v) visualization of exon–intron structures of alternative transcripts of a gene correlated to variations in the domain architecture of corresponding protein isoforms. These novel analytical features are highly suited for detailed investigations on the exon–intron structure of domains and make ExDom a powerful tool for exploring several key questions concerning the function, origin and evolution of genes and proteins. ExDom database is freely accessible at: http://66.170.16.154/ExDom/. PMID:18984624

  12. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data.

    PubMed

    Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-03-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial missing step in the current era is, however, our limited understanding of biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are the gene set enrichment tools. This study reports on GeneAnalytics™ (geneanalytics.genecards.org), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs LifeMap Science's GeneCards suite, including the GeneCards®--the human gene database; the MalaCards--the human diseases database; and the PathCards--the biological pathways database. Expression-based analysis in GeneAnalytics relies on LifeMap Discovery®--the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling an advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analysis and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics, and others yet to emerge on the postgenomics horizon.

  13. Visualizing statistical significance of disease clusters using cartograms.

    PubMed

    Kronenfeld, Barry J; Wong, David W S

    2017-05-15

    Health officials and epidemiological researchers often use maps of disease rates to identify potential disease clusters. Because these maps exaggerate the prominence of low-density districts and hide potential clusters in urban (high-density) areas, many researchers have used density-equalizing maps (cartograms) as a basis for epidemiological mapping. However, there are no existing guidelines for visually assessing statistical uncertainty on such maps. To address this shortcoming, we develop techniques for visual determination of statistical significance of clusters spanning one or more districts on a cartogram. We developed the techniques within a geovisual analytics framework that does not rely on automated significance testing, and can therefore facilitate visual analysis to detect clusters that automated techniques might miss. On a cartogram of the at-risk population, the statistical significance of a disease cluster can be determined from the rate, area, and shape of the cluster under standard hypothesis testing scenarios. We develop formulae to determine, for a given rate, the area required for statistical significance of a priori and a posteriori designated regions under certain test assumptions. Uniquely, our approach enables dynamic inference of aggregate regions formed by combining individual districts. The method is implemented in interactive tools that provide choropleth mapping, automated legend construction and dynamic search tools to facilitate cluster detection and assessment of the validity of tested assumptions. A case study of leukemia incidence analysis in California demonstrates the ability to visually distinguish between statistically significant and insignificant regions. The proposed geovisual analytics approach enables intuitive visual assessment of statistical significance of arbitrarily defined regions on a cartogram. Our research prompts a broader discussion of the role of geovisual exploratory analyses in disease mapping and the appropriate framework for visually assessing the statistical significance of spatial clusters.
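
    The kind of test underlying cluster assessment on a density-equalizing cartogram can be pictured with a small, generic sketch: because area is proportional to the at-risk population, a background rate and a region's area fix the expected case count, which can be compared to the observed count with a one-sided Poisson test. This is a standard formulation, not necessarily the exact formulae of Kronenfeld and Wong, and all numbers below are hypothetical.

        # Generic one-sided Poisson test for a candidate cluster on a cartogram.
        from scipy.stats import poisson

        def cluster_p_value(observed_cases, region_area, persons_per_unit_area, background_rate):
            """P-value that a region's rate exceeds the background rate."""
            population = region_area * persons_per_unit_area   # cartogram: area ~ population
            expected = background_rate * population
            return poisson.sf(observed_cases - 1, expected)    # P(X >= observed | H0)

        # Hypothetical region: 12 observed cases where about 5.1 would be expected
        print(cluster_p_value(12, region_area=340.0, persons_per_unit_area=150.0,
                              background_rate=1e-4))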

  14. Metabolonote: A Wiki-Based Database for Managing Hierarchical Metadata of Metabolome Analyses

    PubMed Central

    Ara, Takeshi; Enomoto, Mitsuo; Arita, Masanori; Ikeda, Chiaki; Kera, Kota; Yamada, Manabu; Nishioka, Takaaki; Ikeda, Tasuku; Nihei, Yoshito; Shibata, Daisuke; Kanaya, Shigehiko; Sakurai, Nozomu

    2015-01-01

    Metabolomics – technology for comprehensive detection of small molecules in an organism – lags behind the other “omics” in terms of publication and dissemination of experimental data. Among the reasons for this are difficulty precisely recording information about complicated analytical experiments (metadata), existence of various databases with their own metadata descriptions, and low reusability of the published data, resulting in submitters (the researchers who generate the data) being insufficiently motivated. To tackle these issues, we developed Metabolonote, a Semantic MediaWiki-based database designed specifically for managing metabolomic metadata. We also defined a metadata and data description format, called “Togo Metabolome Data” (TogoMD), with an ID system that is required for unique access to each level of the tree-structured metadata such as study purpose, sample, analytical method, and data analysis. Separation of the management of metadata from that of data and permission to attach related information to the metadata provide advantages for submitters, readers, and database developers. The metadata are enriched with information such as links to comparable data, thereby functioning as a hub of related data resources. They also enhance not only readers’ understanding and use of data but also submitters’ motivation to publish the data. The metadata are computationally shared among other systems via APIs, which facilitate the construction of novel databases by database developers. A permission system that allows publication of immature metadata and feedback from readers also helps submitters to improve their metadata. Hence, this aspect of Metabolonote, as a metadata preparation tool, is complementary to high-quality and persistent data repositories such as MetaboLights. A total of 808 metadata for analyzed data obtained from 35 biological species are published currently. Metabolonote and related tools are available free of cost at http://metabolonote.kazusa.or.jp/. PMID:25905099

  15. Metabolonote: a wiki-based database for managing hierarchical metadata of metabolome analyses.

    PubMed

    Ara, Takeshi; Enomoto, Mitsuo; Arita, Masanori; Ikeda, Chiaki; Kera, Kota; Yamada, Manabu; Nishioka, Takaaki; Ikeda, Tasuku; Nihei, Yoshito; Shibata, Daisuke; Kanaya, Shigehiko; Sakurai, Nozomu

    2015-01-01

    Metabolomics - technology for comprehensive detection of small molecules in an organism - lags behind the other "omics" in terms of publication and dissemination of experimental data. Among the reasons for this are difficulty precisely recording information about complicated analytical experiments (metadata), existence of various databases with their own metadata descriptions, and low reusability of the published data, resulting in submitters (the researchers who generate the data) being insufficiently motivated. To tackle these issues, we developed Metabolonote, a Semantic MediaWiki-based database designed specifically for managing metabolomic metadata. We also defined a metadata and data description format, called "Togo Metabolome Data" (TogoMD), with an ID system that is required for unique access to each level of the tree-structured metadata such as study purpose, sample, analytical method, and data analysis. Separation of the management of metadata from that of data and permission to attach related information to the metadata provide advantages for submitters, readers, and database developers. The metadata are enriched with information such as links to comparable data, thereby functioning as a hub of related data resources. They also enhance not only readers' understanding and use of data but also submitters' motivation to publish the data. The metadata are computationally shared among other systems via APIs, which facilitate the construction of novel databases by database developers. A permission system that allows publication of immature metadata and feedback from readers also helps submitters to improve their metadata. Hence, this aspect of Metabolonote, as a metadata preparation tool, is complementary to high-quality and persistent data repositories such as MetaboLights. A total of 808 metadata for analyzed data obtained from 35 biological species are published currently. Metabolonote and related tools are available free of cost at http://metabolonote.kazusa.or.jp/.

  16. A Reasoning And Hypothesis-Generation Framework Based On Scalable Graph Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sukumar, Sreenivas Rangan

    Finding actionable insights from data has always been difficult. As the scale and forms of data increase tremendously, the task of finding value becomes even more challenging. Data scientists at Oak Ridge National Laboratory are leveraging unique leadership infrastructure (e.g. Urika-XA and Urika-GD appliances) to develop scalable algorithms for semantic, logical and statistical reasoning with unstructured Big Data. We present the deployment of such a framework called ORiGAMI (Oak Ridge Graph Analytics for Medical Innovations) on the National Library of Medicine's SEMANTIC Medline (archive of medical knowledge since 1994). Medline contains over 70 million knowledge nuggets published in 23.5 million papers in the medical literature, with thousands more added daily. ORiGAMI is available as an open-science medical hypothesis generation tool - both as a web service and an application programming interface (API) at http://hypothesis.ornl.gov. Since becoming an online service, ORiGAMI has enabled clinical subject-matter experts to: (i) discover the relationship between beta-blocker treatment and diabetic retinopathy; (ii) hypothesize that xylene is an environmental carcinogen; and (iii) aid doctors with diagnosis of challenging cases when rare diseases manifest with common symptoms. In 2015, ORiGAMI was featured at the Historical Clinical Pathological Conference in Baltimore as a demonstration of artificial intelligence applied to medicine and at IEEE/ACM Supercomputing, and was recognized as a Centennial Showcase Exhibit at the Radiological Society of North America (RSNA) Conference in Chicago. The final paper will describe the workflow built for the Cray Urika-XA and Urika-GD appliances that is able to reason with the knowledge of every published medical paper every time a clinical researcher uses the tool.
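
    The style of hypothesis generation described above can be illustrated, in heavily simplified form, by Swanson-style "ABC" open discovery over a concept co-occurrence graph: candidate hypotheses are concepts linked to a source concept only through intermediates. ORiGAMI's actual algorithms and the Semantic Medline predication data are far richer; the graph, edges, and terms below are purely illustrative.

        # Toy open-discovery sketch on a concept co-occurrence graph.
        import networkx as nx

        G = nx.Graph()
        G.add_edges_from([
            ("beta-blocker", "intraocular pressure"),
            ("intraocular pressure", "diabetic retinopathy"),
            ("beta-blocker", "heart rate"),
            ("xylene", "oxidative stress"),
            ("oxidative stress", "carcinogenesis"),
        ])

        def open_discovery(graph, source):
            """Yield (intermediate, target) pairs: targets linked to the source only
            indirectly, i.e. candidate hypotheses for expert review."""
            for b in graph.neighbors(source):
                for c in graph.neighbors(b):
                    if c != source and not graph.has_edge(source, c):
                        yield b, c

        print(list(open_discovery(G, "beta-blocker")))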

  17. Analytical solutions of one-dimensional multispecies reactive transport in a permeable reactive barrier-aquifer system

    NASA Astrophysics Data System (ADS)

    Mieles, John; Zhan, Hongbin

    2012-06-01

    The permeable reactive barrier (PRB) remediation technology has proven to be more cost-effective than conventional pump-and-treat systems, and has demonstrated the ability to rapidly reduce the concentrations of specific chemicals of concern (COCs) by up to several orders of magnitude in some scenarios. This study derives new steady-state analytical solutions to multispecies reactive transport in a PRB-aquifer (dual domain) system. The advantage of the dual domain model is that it can account for the potential existence of natural degradation in the aquifer when designing the required PRB thickness. The study focuses primarily on the steady-state analytical solutions of the tetrachloroethene (PCE) serial degradation pathway and secondly on the analytical solutions of the parallel degradation pathway. The solutions in this study can also be applied to other types of dual domain systems with distinct flow and transport properties. The steady-state analytical solutions are shown to be accurate and the numerical program RT3D is selected for comparison. The results of this study are novel in that the solutions provide improved modeling flexibility including: 1) every species can have unique first-order reaction rates and unique retardation factors, and 2) daughter species can be modeled with their individual input concentrations or solely as byproducts of the parent species. The steady-state analytical solutions exhibit a limitation that occurs when interspecies reaction rate factors equal each other, which results in undefined solutions. Excel spreadsheet programs were created to facilitate prompt application of the steady-state analytical solutions, for both the serial and parallel degradation pathways.
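
    For orientation, the single-species, steady-state building block underlying solutions of this kind is the 1-D advection-dispersion equation with first-order decay, D C'' - v C' - k C = 0 with C(0) = C0 and C bounded downstream, whose decaying solution is C(x) = C0 exp[x (v - sqrt(v^2 + 4 k D)) / (2D)]. The sketch below evaluates that textbook profile; it is not the paper's coupled multispecies, dual-domain (PRB plus aquifer) solution, and the parameter values are hypothetical.

        # Steady-state 1-D advection-dispersion with first-order decay (single species).
        import numpy as np

        def steady_state_concentration(x, c0, v, D, k):
            """Concentration profile C(x) for steady transport with first-order decay."""
            r = (v - np.sqrt(v**2 + 4.0 * k * D)) / (2.0 * D)   # decaying root
            return c0 * np.exp(r * np.asarray(x))

        x = np.linspace(0.0, 2.0, 5)   # distance through a barrier [m], hypothetical
        print(steady_state_concentration(x, c0=1.0, v=0.1, D=0.01, k=0.5))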

  18. Data Intensive Computing on Amazon Web Services

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Magana-Zook, S. A.

    The Geophysical Monitoring Program (GMP) has spent the past few years building up the capability to perform data intensive computing using what have been referred to as “big data” tools. These big data tools would be used against massive archives of seismic signals (>300 TB) to conduct research not previously possible. Examples of such tools include Hadoop (HDFS, MapReduce), HBase, Hive, Storm, Spark, Solr, and many more by the day. These tools are useful for performing data analytics on datasets that exceed the resources of traditional analytic approaches. To this end, a research big data cluster (“Cluster A”) was set up as a collaboration between GMP and Livermore Computing (LC).

  19. Haze Gray Paint and the U.S. Navy: A Procurement Process Review

    DTIC Science & Technology

    2017-12-01

    The research encompasses both qualitative and quantitative analytical tools utilizing historical demand data for Silicone Alkyd and 1K Polysiloxane paint to determine the inventory level of 1K Polysiloxane in support of the fleet. As discussed in the Summary section, this research used a qualitative and a quantitative approach to analyze the Polysiloxane ...

  20. Enterprise Systems Value-Based R&D Portfolio Analytics: Methods, Processes, and Tools

    DTIC Science & Technology

    2014-01-14

    Enterprise Systems Value-Based R&D Portfolio Analytics: Methods, Processes, and Tools. Final Technical Report SERC-2014-TR-041-1, January 14 ... by the U.S. Department of Defense through the Systems Engineering Research Center (SERC) under Contract H98230-08-D-0171 (Task Order 0026, RT 51) ... SERC is a federally funded University Affiliated Research Center managed by Stevens Institute of Technology.

  1. Visualization and Analytics Software Tools for Peregrine System |

    Science.gov Websites

    Visualization and analytics software tools available for the Peregrine system include R, a language and environment for statistical computing and graphics (see the R web site for more information); FastX, for OpenGL-based applications (see the FastX page for more information); and ParaView, an open ...

  2. Dynamic Vision for Control

    DTIC Science & Technology

    2006-07-27

    The goal of this project was to develop analytical and computational tools to make vision a viable sensor for ... (vision.ucla.edu, July 27, 2006). We have proposed the framework of stereoscopic segmentation, where multiple images of the same objects were jointly processed to extract geometry ...

  3. Analytical Tools for the Application of Operational Culture: A Case Study in the Trans-Sahel

    DTIC Science & Technology

    2011-03-28

    Study Team Working Paper 3: Research Methods Discussion for the Study Team. Generating empirical materials in grounded theory ... research I have conducted using these methods. Analytical Tools for the Application of Operational Culture: A Case Study in the ... "Survey and a Case Study," Kjeller, Norway: FFI. Glaser, B. G. & Strauss, A. L. (1967). The Discovery of Grounded Theory ...

  4. Demonstrating Success: Web Analytics and Continuous Improvement

    ERIC Educational Resources Information Center

    Loftus, Wayne

    2012-01-01

    As free and low-cost Web analytics tools become more sophisticated, libraries' approach to user analysis can become more nuanced and precise. Tracking appropriate metrics with a well-formulated analytics program can inform design decisions, demonstrate the degree to which those decisions have succeeded, and thereby inform the next iteration in the…

  5. Understanding Education Involving Geovisual Analytics

    ERIC Educational Resources Information Center

    Stenliden, Linnea

    2013-01-01

    Handling the vast amounts of data and information available in contemporary society is a challenge. Geovisual Analytics provides technology designed to increase the effectiveness of information interpretation and analytical task solving. To date, little attention has been paid to the role such tools can play in education and to the extent to which…

  6. Development of analytical cell support for vitrification at the West Valley Demonstration Project. Topical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barber, F.H.; Borek, T.T.; Christopher, J.Z.

    1997-12-01

    Analytical and Process Chemistry (A&PC) support is essential to the high-level waste vitrification campaign at the West Valley Demonstration Project (WVDP). A&PC characterizes the waste, providing information necessary to formulate the recipe for the target radioactive glass product. High-level waste (HLW) samples are prepared and analyzed in the analytical cells (ACs) and Sample Storage Cell (SSC) on the third floor of the main plant. The high levels of radioactivity in the samples require handling them in the shielded cells with remote manipulators. The analytical hot cells and third floor laboratories were refurbished to ensure optimal uninterrupted operation during the vitrification campaign. New and modified instrumentation, tools, sample preparation and analysis techniques, and equipment and training were required for A&PC to support vitrification. Analytical Cell Mockup Units (ACMUs) were designed to facilitate method development, scientist and technician training, and planning for analytical process flow. The ACMUs were fabricated and installed to simulate the analytical cell environment and dimensions. New techniques, equipment, and tools could be evaluated in the ACMUs without the consequences of generating or handling radioactive waste. Tools were fabricated, handling and disposal of wastes was addressed, and spatial arrangements for equipment were refined. As a result of the work at the ACMUs, the remote preparation and analysis methods and the equipment and tools were ready for installation into the ACs and SSC in July 1995. Before use in the hot cells, all remote methods had been validated and four to eight technicians were trained on each. Fine tuning of the procedures has been ongoing at the ACs based on input from A&PC technicians. Working at the ACs presents greater challenges than development at the ACMUs did. The ACMU work and further refinements in the ACs have resulted in a reduction in analysis turnaround time (TAT).

  7. Distributed Generation Interconnection Collaborative | NREL

    Science.gov Websites

    ... reduce paperwork, and improve customer service. Analytical Methods for Interconnection: Many utilities and jurisdictions are seeking the right screening and analytical methods and tools to meet their reliability ...

  8. CCMC: Serving research and space weather communities with unique space weather services, innovative tools and resources

    NASA Astrophysics Data System (ADS)

    Zheng, Yihua; Kuznetsova, Maria M.; Pulkkinen, Antti; Maddox, Marlo

    2015-04-01

    With the addition of the Space Weather Research Center (a sub-team within CCMC) in 2010 to address NASA's own space weather needs, CCMC has become a unique entity that not only facilitates research through providing access to the state-of-the-art space science and space weather models, but also plays a critical role in providing unique space weather services to NASA robotic missions, developing innovative tools and transitioning research to operations via user feedback. With scientists, forecasters and software developers working together within one team, through close and direct connection with space weather customers and a trusted relationship with model developers, CCMC is flexible, nimble and effective in meeting customer needs. In this presentation, we highlight a few unique aspects of CCMC/SWRC's space weather services, such as addressing space weather throughout the solar system, pushing the frontier of space weather forecasting via the ensemble approach, providing direct personnel and tool support for spacecraft anomaly resolution, prompting development of multi-purpose tools and knowledge bases, and educating and engaging the next generation of space weather scientists.

  9. Integration of gas chromatography mass spectrometry methods for differentiating ricin preparation methods.

    PubMed

    Wunschel, David S; Melville, Angela M; Ehrhardt, Christopher J; Colburn, Heather A; Victry, Kristin D; Antolick, Kathryn C; Wahl, Jon H; Wahl, Karen L

    2012-05-07

    The investigation of crimes involving chemical or biological agents is infrequent, but presents unique analytical challenges. The protein toxin ricin is encountered more frequently than other agents and is found in the seeds of Ricinus communis, commonly known as the castor plant. Typically, the toxin is extracted from castor seeds utilizing a variety of different recipes that result in varying purity of the toxin. Moreover, these various purification steps can also leave or differentially remove a variety of exogenous and endogenous residual components with the toxin that may indicate the type and number of purification steps involved. We have applied three gas chromatography-mass spectrometry (GC-MS) based analytical methods to measure the variation in seed carbohydrates and castor oil ricinoleic acid, as well as the presence of solvents used for purification. These methods were applied to the same samples prepared using four previously identified toxin preparation methods, starting from four varieties of castor seeds. The individual data sets for seed carbohydrate profiles, ricinoleic acid, or acetone amount each provided information capable of differentiating different types of toxin preparations across seed types. However, the integration of the data sets using multivariate factor analysis provided a clear distinction of all samples based on the preparation method, independent of the seed source. In particular, the abundance of mannose, arabinose, fucose, ricinoleic acid, and acetone were shown to be important differentiating factors. These complementary tools provide a more confident determination of the method of toxin preparation than would be possible using a single analytical method.
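
    The data-integration step described above can be pictured with a small sketch: the carbohydrate, ricinoleic acid, and residual solvent measurements are combined into one feature table per sample and projected into a low-dimensional space in which preparation methods separate. PCA is shown here as a closely related stand-in for the multivariate factor analysis reported in the paper, and every feature value below is invented.

        # Combine GC-MS feature sets and reduce them for preparation-method comparison.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        # columns: mannose, arabinose, fucose, ricinoleic acid, acetone (arbitrary units)
        samples = np.array([
            [0.90, 0.40, 0.20, 5.00, 0.0],   # hypothetical crude extract
            [0.50, 0.20, 0.10, 1.20, 0.0],   # hypothetical solvent-free protocol
            [0.10, 0.05, 0.02, 0.30, 2.1],   # hypothetical acetone-precipitation protocol
            [0.12, 0.06, 0.03, 0.25, 1.9],
        ])

        scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(samples))
        print(scores)   # preparations should separate along the leading components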

  10. Laser-induced breakdown spectroscopy (LIBS), part II: review of instrumental and methodological approaches to material analysis and applications to different fields.

    PubMed

    Hahn, David W; Omenetto, Nicoló

    2012-04-01

    The first part of this two-part review focused on the fundamental and diagnostics aspects of laser-induced plasmas, only touching briefly upon concepts such as sensitivity and detection limits and largely omitting any discussion of the vast panorama of the practical applications of the technique. Clearly a true LIBS community has emerged, which promises to quicken the pace of LIBS developments, applications, and implementations. With this second part, a more applied flavor is taken, and its intended goal is summarizing the current state-of-the-art of analytical LIBS, providing a contemporary snapshot of LIBS applications, and highlighting new directions in laser-induced breakdown spectroscopy, such as novel approaches, instrumental developments, and advanced use of chemometric tools. More specifically, we discuss instrumental and analytical approaches (e.g., double- and multi-pulse LIBS to improve the sensitivity), calibration-free approaches, hyphenated approaches in which techniques such as Raman and fluorescence are coupled with LIBS to increase sensitivity and information power, resonantly enhanced LIBS approaches, signal processing and optimization (e.g., signal-to-noise analysis), and finally applications. An attempt is made to provide an updated view of the role played by LIBS in the various fields, with emphasis on applications considered to be unique. We finally try to assess where LIBS is going as an analytical field, where in our opinion it should go, and what should still be done for consolidating the technique as a mature method of chemical analysis. © 2012 Society for Applied Spectroscopy

  11. Analysis of Long-Term Temperature Variations in the Human Body.

    PubMed

    Dakappa, Pradeepa Hoskeri; Mahabala, Chakrapani

    2015-01-01

    Body temperature is a continuous physiological variable. In normal healthy adults, oral temperature is estimated to vary between 36.1°C and 37.2°C. Fever is a complex host response to many external and internal agents and is a potential contributor to many clinical conditions. Despite being one of the foremost vital signs, temperature and its analysis and variations during many pathological conditions have yet to be examined in detail using mathematical techniques. Classical fever patterns based on recordings obtained every 8-12 h have been developed. However, such patterns do not provide meaningful information in diagnosing diseases. Because fever is a host response, it is likely that there could be a unique response to specific etiologies. Continuous long-term temperature monitoring and pattern analysis using specific analytical methods developed in engineering and physics could aid in revealing unique fever responses of hosts and in different clinical conditions. Furthermore, such analysis can potentially be used as a novel diagnostic tool and to study the effect of pharmaceutical agents and other therapeutic protocols. Thus, the goal of our article is to present a comprehensive review of the recent relevant literature and analyze the current state of research regarding temperature variations in the human body.

  12. The Prescription Opioid Epidemic: Social Media Responses to the Residents' Perspective Article.

    PubMed

    Choo, Esther K; Mazer-Amirshahi, Maryann; Juurlink, David; Kobner, Scott; Scott, Kevin; Lin, Michelle

    2016-01-01

    In June 2014, Annals of Emergency Medicine collaborated with the Academic Life in Emergency Medicine (ALiEM) blog-based Web site to host an online discussion session featuring the Annals Residents' Perspective article "The Opioid Prescription Epidemic and the Role of Emergency Medicine" by Poon and Greenwood-Ericksen. This dialogue included a live videocast with the authors and other experts, a detailed discussion on the ALiEM Web site's comment section, and real-time conversations on Twitter. Engagement was tracked through various Web analytic tools, and themes were identified by content curation. The dialogue resulted in 1,262 unique page views from 433 cities in 41 countries on the ALiEM Web site, 408,498 Twitter impressions, and 168 views of the video interview with the authors. Four major themes about prescription opioids identified included the following: physician knowledge, inconsistent medical education, balance between overprescribing and effective pain management, and approaches to solutions. Free social media technologies provide a unique opportunity to engage with a diverse community of emergency medicine and non-emergency medicine clinicians, nurses, learners, and even patients. Such technologies may allow more rapid hypothesis generation for future research and more accelerated knowledge translation. Copyright © 2015 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.

  13. Multicriteria decision analysis in ranking of analytical procedures for aldrin determination in water.

    PubMed

    Tobiszewski, Marek; Orłowski, Aleksander

    2015-03-27

    The study presents the possibility of applying multi-criteria decision analysis (MCDA) when choosing analytical procedures with low environmental impact. A type of MCDA, the Preference Ranking Organization Method for Enrichment Evaluations (PROMETHEE), was chosen as a versatile tool that meets the requirements of analytical chemists acting as decision makers. Twenty-five analytical procedures for aldrin determination in water samples (as an example) were selected as input alternatives for the MCDA analysis. Nine different criteria describing the alternatives were chosen from different groups: metrological, economic and, most importantly, environmental impact. The weights for each criterion were obtained from questionnaires sent to experts, giving three different scenarios for the MCDA results. The results of the analysis show that PROMETHEE is a very promising tool for choosing an analytical procedure with respect to its greenness. The rankings for all three scenarios placed solid-phase microextraction and liquid-phase microextraction based procedures high, while liquid-liquid extraction, solid-phase extraction and stir bar sorptive extraction based procedures were placed low in the ranking. The results show that although some of the experts do not intentionally choose green analytical chemistry procedures, their MCDA choices are in accordance with green chemistry principles. The PROMETHEE ranking results were compared with more widely accepted green analytical chemistry tools, NEMI and Eco-Scale. As PROMETHEE involved more factors than NEMI, the assessment results were only weakly correlated. In contrast, the results of the Eco-Scale assessment were well correlated, as both methodologies involved similar assessment criteria. Copyright © 2015 Elsevier B.V. All rights reserved.
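
    A bare-bones PROMETHEE II calculation, shown only to make the outranking-flow idea concrete: pairwise preferences per criterion are weighted and aggregated into net flows that rank the alternatives. The criteria, weights, and scores below are invented and do not correspond to the questionnaire-derived weights or the 25 aldrin procedures assessed in the study.

        # Minimal PROMETHEE II with the "usual" (strict) preference function.
        import numpy as np

        def promethee_ii(scores, weights, maximize):
            """Net outranking flows; higher flow = preferred alternative."""
            X = np.asarray(scores, dtype=float)
            n, _ = X.shape
            signs = np.where(maximize, 1.0, -1.0)
            phi = np.zeros(n)
            for a in range(n):
                for b in range(n):
                    if a == b:
                        continue
                    d = signs * (X[a] - X[b])
                    pi_ab = np.dot(weights, (d > 0).astype(float))   # a preferred to b
                    pi_ba = np.dot(weights, (-d > 0).astype(float))  # b preferred to a
                    phi[a] += (pi_ab - pi_ba) / (n - 1)
            return phi

        # rows: LLE, SPE, SPME; columns: LOD, solvent volume, waste, analysis time
        scores = [[0.10, 60, 80, 45],
                  [0.05, 10, 15, 30],
                  [0.02,  0,  1, 40]]
        weights = np.array([0.4, 0.2, 0.3, 0.1])
        maximize = np.array([False, False, False, False])   # all criteria to be minimised
        print(promethee_ii(scores, weights, maximize))       # SPME expected to rank first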

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, D.W.; Heinrich, R.R.; Jensen, K.J.

    The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of technical support services to the scientific and engineering programs at ANL. In addition, ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques. The purpose of this report is to summarize the technical and administrative activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year 1985 (October 1984 through September 1985). This is the second annual report for the ACL. 4 figs., 1 tab.

  15. A Comprehensive Tool and Analytical Pathway for Differential Molecular Profiling and Biomarker Discovery

    DTIC Science & Technology

    2014-10-20

    ... three possibilities: AKR, B6, and BALB_B) and MUP Protein (containing two possibilities: Intact and Denatured), then you can view a plot of the Strain ... the tags for the last two labels. Again, if the attribute Strain has three tags: AKR, B6, ... (AFRL-RH-WP-TR-2014-0131, A Comprehensive Tool and Analytical Pathway for Differential Molecular Profiling and Biomarker Discovery)

  16. Development of a Suite of Analytical Tools for Energy and Water Infrastructure Knowledge Discovery

    NASA Astrophysics Data System (ADS)

    Morton, A.; Piburn, J.; Stewart, R.; Chandola, V.

    2017-12-01

    Energy and water generation and delivery systems are inherently interconnected. With demand for energy growing, the energy sector is experiencing increasing competition for water. With increasing population and changing environmental, socioeconomic, and demographic scenarios, new technology and investment decisions must be made for optimized and sustainable energy-water resource management. This also requires novel scientific insights into the complex interdependencies of energy-water infrastructures across multiple space and time scales. To address this need, we've developed a suite of analytical tools to support an integrated data driven modeling, analysis, and visualization capability for understanding, designing, and developing efficient local and regional practices related to the energy-water nexus. This work reviews the analytical capabilities available along with a series of case studies designed to demonstrate the potential of these tools for illuminating energy-water nexus solutions and supporting strategic (federal) policy decisions.

  17. Modeling of the Global Water Cycle - Analytical Models

    Treesearch

    Yongqiang Liu; Roni Avissar

    2005-01-01

    Both numerical and analytical models of coupled atmosphere and its underlying ground components (land, ocean, ice) are useful tools for modeling the global and regional water cycle. Unlike complex three-dimensional climate models, which need very large computing resources and involve a large number of complicated interactions often difficult to interpret, analytical...

  18. Using Interactive Data Visualizations for Exploratory Analysis in Undergraduate Genomics Coursework: Field Study Findings and Guidelines

    NASA Astrophysics Data System (ADS)

    Mirel, Barbara; Kumar, Anuj; Nong, Paige; Su, Gang; Meng, Fan

    2016-02-01

    Life scientists increasingly use visual analytics to explore large data sets and generate hypotheses. Undergraduate biology majors should be learning these same methods. Yet visual analytics is one of the most underdeveloped areas of undergraduate biology education. This study sought to determine the feasibility of undergraduate biology majors conducting exploratory analysis using the same interactive data visualizations as practicing scientists. We examined 22 upper level undergraduates in a genomics course as they engaged in a case-based inquiry with an interactive heat map. We qualitatively and quantitatively analyzed students' visual analytic behaviors, reasoning and outcomes to identify student performance patterns, commonly shared efficiencies and task completion. We analyzed students' successes and difficulties in applying knowledge and skills relevant to the visual analytics case and related gaps in knowledge and skill to associated tool designs. Findings show that undergraduate engagement in visual analytics is feasible and could be further strengthened through tool usability improvements. We identify these improvements. We speculate, as well, on instructional considerations that our findings suggested may also enhance visual analytics in case-based modules.

  19. Using Interactive Data Visualizations for Exploratory Analysis in Undergraduate Genomics Coursework: Field Study Findings and Guidelines

    PubMed Central

    Kumar, Anuj; Nong, Paige; Su, Gang; Meng, Fan

    2016-01-01

    Life scientists increasingly use visual analytics to explore large data sets and generate hypotheses. Undergraduate biology majors should be learning these same methods. Yet visual analytics is one of the most underdeveloped areas of undergraduate biology education. This study sought to determine the feasibility of undergraduate biology majors conducting exploratory analysis using the same interactive data visualizations as practicing scientists. We examined 22 upper level undergraduates in a genomics course as they engaged in a case-based inquiry with an interactive heat map. We qualitatively and quantitatively analyzed students’ visual analytic behaviors, reasoning and outcomes to identify student performance patterns, commonly shared efficiencies and task completion. We analyzed students’ successes and difficulties in applying knowledge and skills relevant to the visual analytics case and related gaps in knowledge and skill to associated tool designs. Findings show that undergraduate engagement in visual analytics is feasible and could be further strengthened through tool usability improvements. We identify these improvements. We speculate, as well, on instructional considerations that our findings suggested may also enhance visual analytics in case-based modules. PMID:26877625

  20. Physics Mining of Multi-Source Data Sets

    NASA Technical Reports Server (NTRS)

    Helly, John; Karimabadi, Homa; Sipes, Tamara

    2012-01-01

    Powerful new parallel data mining algorithms can produce diagnostic and prognostic numerical models and analyses from observational data. These techniques yield higher-resolution measures than ever before of environmental parameters by fusing synoptic imagery and time-series measurements. These techniques are general and relevant to observational data, including raster, vector, and scalar, and can be applied in all Earth- and environmental science domains. Because they can be highly automated and are parallel, they scale to large spatial domains and are well suited to change and gap detection. This makes it possible to analyze spatial and temporal gaps in information, and facilitates within-mission replanning to optimize the allocation of observational resources. The basis of the innovation is the extension of a recently developed set of algorithms packaged into MineTool to multi-variate time-series data. MineTool is unique in that it automates the various steps of the data mining process, thus making it amenable to autonomous analysis of large data sets. Unlike techniques such as Artificial Neural Nets, which yield a blackbox solution, MineTool's outcome is always an analytical model in parametric form that expresses the output in terms of the input variables. This has the advantage that the derived equation can then be used to gain insight into the physical relevance and relative importance of the parameters and coefficients in the model. This is referred to as physics-mining of data. The capabilities of MineTool are extended to include both supervised and unsupervised algorithms, handle multi-type data sets, and parallelize it.
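
    The contrast with black-box models can be made concrete with a small sketch: fit the coefficients of an explicit parametric expression so that the output is an analytical function of the inputs, which can then be inspected for physical meaning. Ordinary least squares on a chosen basis stands in here for MineTool's own model construction, and the data are synthetic.

        # Fit an explicit parametric model y ~ c0 + c1*x1 + c2*x2 + c3*x1*x2.
        import numpy as np

        rng = np.random.default_rng(1)
        x1, x2 = rng.uniform(0, 1, 200), rng.uniform(0, 1, 200)
        y = 2.0 * x1 - 0.5 * x2 + 1.5 * x1 * x2 + 0.01 * rng.standard_normal(200)

        X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2])   # design matrix
        coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
        print(coeffs)   # recovered coefficients of the explicit analytical expression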

  1. Plastid: nucleotide-resolution analysis of next-generation sequencing and genomics data.

    PubMed

    Dunn, Joshua G; Weissman, Jonathan S

    2016-11-22

    Next-generation sequencing (NGS) informs many biological questions with unprecedented depth and nucleotide resolution. These assays have created a need for analytical tools that enable users to manipulate data nucleotide-by-nucleotide robustly and easily. Furthermore, because many NGS assays encode information jointly within multiple properties of read alignments - for example, in ribosome profiling, the locations of ribosomes are jointly encoded in alignment coordinates and length - analytical tools are often required to extract the biological meaning from the alignments before analysis. Many assay-specific pipelines exist for this purpose, but there remains a need for user-friendly, generalized, nucleotide-resolution tools that are not limited to specific experimental regimes or analytical workflows. Plastid is a Python library designed specifically for nucleotide-resolution analysis of genomics and NGS data. As such, Plastid is designed to extract assay-specific information from read alignments while retaining generality and extensibility to novel NGS assays. Plastid represents NGS and other biological data as arrays of values associated with genomic or transcriptomic positions, and contains configurable tools to convert data from a variety of sources to such arrays. Plastid also includes numerous tools to manipulate even discontinuous genomic features, such as spliced transcripts, with nucleotide precision. Plastid automatically handles conversion between genomic and feature-centric coordinates, accounting for splicing and strand, freeing users of burdensome accounting. Finally, Plastid's data models use consistent and familiar biological idioms, enabling even beginners to develop sophisticated analytical workflows with minimal effort. Plastid is a versatile toolkit that has been used to analyze data from multiple NGS assays, including RNA-seq, ribosome profiling, and DMS-seq. It forms the genomic engine of our ORF annotation tool, ORF-RATER, and is readily adapted to novel NGS assays. Examples, tutorials, and extensive documentation can be found at https://plastid.readthedocs.io .
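
    The position-wise array idiom described above can be illustrated with plain NumPy: per-nucleotide counts over a chromosome are sliced and concatenated over a transcript's exons, strand-aware, to give a profile in transcript coordinates. This sketch deliberately avoids naming Plastid's own classes and functions, whose exact signatures are not reproduced here, and the read counts and exon coordinates are hypothetical.

        # Conceptual per-nucleotide count arrays, independent of Plastid's API.
        import numpy as np

        chrom_counts = np.zeros(10_000, dtype=int)   # per-nucleotide counts on one chromosome
        chrom_counts[1205:1260] += 3                 # pretend some aligned reads landed here
        chrom_counts[2400:2455] += 1

        def transcript_profile(counts, exons, strand="+"):
            """Concatenate per-nucleotide counts over spliced exons, in 5'->3' transcript
            coordinates (reverse the array for minus-strand transcripts)."""
            profile = np.concatenate([counts[start:end] for start, end in exons])
            return profile[::-1] if strand == "-" else profile

        exons = [(1200, 1300), (2400, 2500)]         # hypothetical two-exon transcript
        profile = transcript_profile(chrom_counts, exons, strand="+")
        print(profile.sum(), len(profile))           # total signal and transcript length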

  2. Web GIS in practice IX: a demonstration of geospatial visual analytics using Microsoft Live Labs Pivot technology and WHO mortality data

    PubMed Central

    2011-01-01

    The goal of visual analytics is to facilitate the discourse between the user and the data by providing dynamic displays and versatile visual interaction opportunities with the data that can support analytical reasoning and the exploration of data from multiple user-customisable aspects. This paper introduces geospatial visual analytics, a specialised subtype of visual analytics, and provides pointers to a number of learning resources about the subject, as well as some examples of human health, surveillance, emergency management and epidemiology-related geospatial visual analytics applications and examples of free software tools that readers can experiment with, such as Google Public Data Explorer. The authors also present a practical demonstration of geospatial visual analytics using partial data for 35 countries from a publicly available World Health Organization (WHO) mortality dataset and Microsoft Live Labs Pivot technology, a free, general purpose visual analytics tool that offers a fresh way to visually browse and arrange massive amounts of data and images online and also supports geographic and temporal classifications of datasets featuring geospatial and temporal components. Interested readers can download a Zip archive (included with the manuscript as an additional file) containing all files, modules and library functions used to deploy the WHO mortality data Pivot collection described in this paper. PMID:21410968

  3. Web GIS in practice IX: a demonstration of geospatial visual analytics using Microsoft Live Labs Pivot technology and WHO mortality data.

    PubMed

    Kamel Boulos, Maged N; Viangteeravat, Teeradache; Anyanwu, Matthew N; Ra Nagisetty, Venkateswara; Kuscu, Emin

    2011-03-16

    The goal of visual analytics is to facilitate the discourse between the user and the data by providing dynamic displays and versatile visual interaction opportunities with the data that can support analytical reasoning and the exploration of data from multiple user-customisable aspects. This paper introduces geospatial visual analytics, a specialised subtype of visual analytics, and provides pointers to a number of learning resources about the subject, as well as some examples of human health, surveillance, emergency management and epidemiology-related geospatial visual analytics applications and examples of free software tools that readers can experiment with, such as Google Public Data Explorer. The authors also present a practical demonstration of geospatial visual analytics using partial data for 35 countries from a publicly available World Health Organization (WHO) mortality dataset and Microsoft Live Labs Pivot technology, a free, general purpose visual analytics tool that offers a fresh way to visually browse and arrange massive amounts of data and images online and also supports geographic and temporal classifications of datasets featuring geospatial and temporal components. Interested readers can download a Zip archive (included with the manuscript as an additional file) containing all files, modules and library functions used to deploy the WHO mortality data Pivot collection described in this paper.

  4. Google Analytics – Index of Resources

    EPA Pesticide Factsheets

    Find how-to and best practice resources and training for accessing and understanding EPA's Google Analytics (GA) tools, including how to create reports that will help you improve and maintain the web areas you manage.

  5. Drawing as a Unique Mental Development Tool for Young Children: Interpersonal and Intrapersonal Dialogues

    ERIC Educational Resources Information Center

    Brooks, Margaret

    2005-01-01

    Using examples from children drawing in a year one classroom, this article examines firstly, how drawing operates as a unique mental tool, and secondly, the role of drawing in the construction and development of knowledge. Young children utilize prior knowledge and experience to negotiate and construct meaning through their interactions with…

  6. Analytical and Empirical Modeling of Wear and Forces of CBN Tool in Hard Turning - A Review

    NASA Astrophysics Data System (ADS)

    Patel, Vallabh Dahyabhai; Gandhi, Anishkumar Hasmukhlal

    2017-08-01

    Machining of steel material having hardness above 45 HRC (Hardness-Rockwell C) is referred to as hard turning. There are numerous models that should be scrutinized and implemented to gain optimum performance in hard turning. Various models of hard turning by cubic boron nitride tools have been reviewed, in an attempt to utilize appropriate empirical and analytical models. Validation of the steady-state flank and crater wear model, Usui's wear model, forces due to oblique cutting theory, the extended Lee and Shaffer force model, chip formation, and progressive flank wear is depicted in this review paper. Effort has been made to understand the relationship between tool wear and tool force under different cutting conditions and tool geometries so that the appropriate model can be used according to user requirements in hard turning.
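
    As a reminder of the form of one of the reviewed relations, a Usui-type wear-rate equation can be written as dW/dt = A * sigma_n * V_s * exp(-B/T), with sigma_n the normal stress on the tool face, V_s the sliding velocity, T the absolute interface temperature, and A, B empirical constants. The constants and cutting conditions in the sketch below are placeholders, not calibrated CBN values from the review.

        # Evaluate a Usui-type wear-rate relation with placeholder constants.
        import math

        def usui_wear_rate(sigma_n_pa, v_s_m_per_s, temp_k, A=1.0e-12, B=5.0e3):
            """Wear rate (arbitrary units) from the Usui-type equation."""
            return A * sigma_n_pa * v_s_m_per_s * math.exp(-B / temp_k)

        print(usui_wear_rate(sigma_n_pa=1.2e9, v_s_m_per_s=2.5, temp_k=1150.0))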

  7. A Tool Supporting Collaborative Data Analytics Workflow Design and Management

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Bao, Q.; Lee, T. J.

    2016-12-01

    Collaborative experiment design could significantly enhance the sharing and adoption of the data analytics algorithms and models emerging in Earth science. Existing data-oriented workflow tools, however, are not suitable for supporting collaborative design of such workflows: for example, they do not support real-time co-design; they cannot track how a workflow evolves over time based on changing designs contributed by multiple Earth scientists; and they do not capture and retrieve collaboration knowledge on workflow design (the discussions that lead to a design). To address the aforementioned challenges, we have designed and developed a technique supporting collaborative data-oriented workflow composition and management, as a key component toward supporting big data collaboration through the Internet. Reproducibility and scalability are two major targets demanding fundamental infrastructural support. One outcome of the project is a software tool that supports an elastic number of groups of Earth scientists collaboratively designing and composing data analytics workflows through the Internet. Instead of recreating the wheel, we have extended an existing workflow tool, VisTrails, into an online collaborative environment as a proof of concept.

  8. Tunable Ionization Modes of a Flowing Atmospheric-Pressure Afterglow (FAPA) Ambient Ionization Source.

    PubMed

    Badal, Sunil P; Michalak, Shawn D; Chan, George C-Y; You, Yi; Shelley, Jacob T

    2016-04-05

    Plasma-based ambient desorption/ionization sources are versatile in that they enable direct ionization of gaseous samples as well as desorption/ionization of analytes from liquid and solid samples. However, ionization matrix effects, caused by competitive ionization processes, can worsen sensitivity or even inhibit detection altogether. The present study is focused on expanding the analytical capabilities of the flowing atmospheric-pressure afterglow (FAPA) source by exploring additional types of ionization chemistry. Specifically, it was found that the abundance and type of reagent ions produced by the FAPA source and, thus, the corresponding ionization pathways of analytes, can be altered by changing the source working conditions. High abundance of proton-transfer reagent ions was observed with relatively high gas flow rates and low discharge currents. Conversely, charge-transfer reagent species were most abundant at low gas flows and high discharge currents. A rather nonpolar model analyte, biphenyl, was found to significantly change ionization pathway based on source operating parameters. Different analyte ions (e.g., MH(+) via proton-transfer and M(+.) via charge-transfer) were formed under unique operating parameters, demonstrating two different operating regimes. These tunable ionization modes of the FAPA were used to enable or enhance detection of analytes that traditionally exhibit low sensitivity in plasma-based ADI-MS analyses. In one example, 2,2'-dichloroquaterphenyl, which is difficult or impossible to detect with proton-transfer FAPA or direct analysis in real time (DART), was detected under charge-transfer FAPA conditions. Overall, this unique mode of operation increases the number and range of detectable analytes and has the potential to lessen ionization matrix effects in ADI-MS analyses.

  9. Review: visual analytics of climate networks

    NASA Astrophysics Data System (ADS)

    Nocke, T.; Buschmann, S.; Donges, J. F.; Marwan, N.; Schulz, H.-J.; Tominski, C.

    2015-09-01

    Network analysis has become an important approach in studying complex spatiotemporal behaviour within geophysical observation and simulation data. This new field produces increasing numbers of large geo-referenced networks to be analysed. Particular focus lies currently on the network analysis of the complex statistical interrelationship structure within climatological fields. The standard procedure for such network analyses is the extraction of network measures in combination with static standard visualisation methods. Existing interactive visualisation methods and tools for geo-referenced network exploration are often either not known to the analyst or their potential is not fully exploited. To fill this gap, we illustrate how interactive visual analytics methods in combination with geovisualisation can be tailored for visual climate network investigation. Therefore, the paper provides a problem analysis relating the multiple visualisation challenges to a survey undertaken with network analysts from the research fields of climate and complex systems science. Then, as an overview for the interested practitioner, we review the state-of-the-art in climate network visualisation and provide an overview of existing tools. As a further contribution, we introduce the visual network analytics tools CGV and GTX, providing tailored solutions for climate network analysis, including alternative geographic projections, edge bundling, and 3-D network support. Using these tools, the paper illustrates the application potentials of visual analytics for climate networks based on several use cases including examples from global, regional, and multi-layered climate networks.

  10. Review: visual analytics of climate networks

    NASA Astrophysics Data System (ADS)

    Nocke, T.; Buschmann, S.; Donges, J. F.; Marwan, N.; Schulz, H.-J.; Tominski, C.

    2015-04-01

    Network analysis has become an important approach in studying complex spatiotemporal behaviour within geophysical observation and simulation data. This new field produces increasing amounts of large geo-referenced networks to be analysed. Particular focus lies currently on the network analysis of the complex statistical interrelationship structure within climatological fields. The standard procedure for such network analyses is the extraction of network measures in combination with static standard visualisation methods. Existing interactive visualisation methods and tools for geo-referenced network exploration are often either not known to the analyst or their potential is not fully exploited. To fill this gap, we illustrate how interactive visual analytics methods in combination with geovisualisation can be tailored for visual climate network investigation. Therefore, the paper provides a problem analysis, relating the multiple visualisation challenges with a survey undertaken with network analysts from the research fields of climate and complex systems science. Then, as an overview for the interested practitioner, we review the state-of-the-art in climate network visualisation and provide an overview of existing tools. As a further contribution, we introduce the visual network analytics tools CGV and GTX, providing tailored solutions for climate network analysis, including alternative geographic projections, edge bundling, and 3-D network support. Using these tools, the paper illustrates the application potentials of visual analytics for climate networks based on several use cases including examples from global, regional, and multi-layered climate networks.
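
    For readers new to the underlying data structure, the sketch below builds a toy climate network by thresholding correlations between grid-point time series, which is the kind of geo-referenced network these tools visualise. Real climate networks use much longer records, significance testing, and the tailored visual analytics discussed above; the series, threshold, and node count here are invented.

        # Toy climate-network construction by correlation thresholding.
        import numpy as np
        import networkx as nx

        rng = np.random.default_rng(0)
        n_nodes, n_steps = 30, 200
        series = rng.standard_normal((n_nodes, n_steps))            # grid-point time series
        series[1] = 0.8 * series[0] + 0.2 * rng.standard_normal(n_steps)   # plant one dependency

        corr = np.corrcoef(series)
        threshold = 0.5
        G = nx.Graph()
        G.add_nodes_from(range(n_nodes))
        for i in range(n_nodes):
            for j in range(i + 1, n_nodes):
                if abs(corr[i, j]) >= threshold:
                    G.add_edge(i, j, weight=corr[i, j])

        print(G.number_of_edges(), sorted(G.edges())[:5])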

  11. Applying Pragmatics Principles for Interaction with Visual Analytics.

    PubMed

    Hoque, Enamul; Setlur, Vidya; Tory, Melanie; Dykeman, Isaac

    2018-01-01

    Interactive visual data analysis is most productive when users can focus on answering the questions they have about their data, rather than focusing on how to operate the interface to the analysis tool. One viable approach to engaging users in interactive conversations with their data is a natural language interface to visualizations. These interfaces have the potential to be both more expressive and more accessible than other interaction paradigms. We explore how principles from language pragmatics can be applied to the flow of visual analytical conversations, using natural language as an input modality. We evaluate the effectiveness of pragmatics support in our system Evizeon, and present design considerations for conversation interfaces to visual analytics tools.

  12. Modeling Multi-wavelength Stellar Astrometry. III. Determination of the Absolute Masses of Exoplanets and Their Host Stars

    NASA Astrophysics Data System (ADS)

    Coughlin, J. L.; López-Morales, Mercedes

    2012-05-01

    Astrometric measurements of stellar systems are becoming significantly more precise and common, with many ground- and space-based instruments and missions approaching 1 μas precision. We examine the multi-wavelength astrometric orbits of exoplanetary systems via both analytical formulae and numerical modeling. Exoplanets have a combination of reflected and thermally emitted light that causes the photocenter of the system to shift increasingly farther away from the host star with increasing wavelength. We find that, if observed at long enough wavelengths, the planet can dominate the astrometric motion of the system, making it possible to directly measure the orbits of both the planet and star and thus directly determine their physical masses using multi-wavelength astrometry. In general, this technique works best for, though is certainly not limited to, systems that have large, high-mass stars and large, low-mass planets, which is a unique parameter space not covered by other exoplanet characterization techniques. Exoplanets that happen to transit their host star present unique cases where the physical radii of the planet and star can be directly determined via astrometry alone. Planetary albedos and day-night contrast ratios may also be probed via this technique due to the unique signature they impart on the observed astrometric orbits. We develop a tool to examine the prospects for near-term detection of this effect, and give examples of some exoplanets that appear to be good targets for detection in the K to N infrared observing bands, if the required precision can be achieved.
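
    For orientation only (conventional notation, not transcribed from the paper's own equations), the wavelength dependence described above can be summarised by the flux-weighted photocentre of the unresolved star-planet pair, where f_* and f_p denote stellar and planetary fluxes and x_*, x_p the positions relative to the barycentre; in LaTeX notation:

        % Flux-weighted photocentre of an unresolved star-planet system (illustrative notation)
        \mathbf{x}_{\mathrm{phot}}(\lambda) =
          \frac{f_*(\lambda)\,\mathbf{x}_* + f_p(\lambda)\,\mathbf{x}_p}
               {f_*(\lambda) + f_p(\lambda)}

    Because f_p(λ)/f_*(λ) grows toward longer wavelengths, the photocentre moves away from the star, which is the effect the paper exploits.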

  13. Semi-Analytical Models of CO2 Injection into Deep Saline Aquifers: Evaluation of the Area of Review and Leakage through Abandoned Wells

    EPA Science Inventory

    This presentation will provide a conceptual preview of an Area of Review (AoR) tool being developed by EPA’s Office of Research and Development that applies analytic and semi-analytical mathematical solutions to elucidate potential risks associated with geologic sequestration of ...

  14. The Broad Application of Data Science and Analytics: Essential Tools for the Liberal Arts Graduate

    ERIC Educational Resources Information Center

    Cárdenas-Navia, Isabel; Fitzgerald, Brian K.

    2015-01-01

    New technologies and data science are transforming a wide range of organizations into analytics-intensive enterprises. Despite the resulting demand for graduates with experience in the application of analytics, though, undergraduate education has been slow to change. The academic and policy communities have engaged in a decade-long conversation…

  15. Sigma metrics as a tool for evaluating the performance of internal quality control in a clinical chemistry laboratory.

    PubMed

    Kumar, B Vinodh; Mohan, Thuthi

    2018-01-01

    Six Sigma is one of the most popular quality management system tools employed for process improvement. The Six Sigma methods are usually applied when the outcome of the process can be measured. This study was done to assess the performance of individual biochemical parameters on a Sigma Scale by calculating the sigma metrics for individual parameters and to follow the Westgard guidelines for appropriate Westgard rules and levels of internal quality control (IQC) that need to be processed to improve target analyte performance based on the sigma metrics. This is a retrospective study, and data required for the study were extracted between July 2015 and June 2016 from a Secondary Care Government Hospital, Chennai. The data obtained for the study were the IQC coefficient of variation percentage and the External Quality Assurance Scheme (EQAS) bias percentage for 16 biochemical parameters. For the level 1 IQC, four analytes (alkaline phosphatase, magnesium, triglyceride, and high-density lipoprotein-cholesterol) showed an ideal performance of ≥6 sigma level, and five analytes (urea, total bilirubin, albumin, cholesterol, and potassium) showed an average performance of <3 sigma level. For the level 2 IQC, the same four analytes as in level 1 showed a performance of ≥6 sigma level, and four analytes (urea, albumin, cholesterol, and potassium) showed an average performance of <3 sigma level. For all analytes below the 6 sigma level, the quality goal index (QGI) was <0.8, indicating imprecision as the area requiring improvement, except for cholesterol, whose QGI of >1.2 indicated inaccuracy. This study shows that the sigma metric is a good quality tool to assess the analytical performance of a clinical chemistry laboratory. Thus, sigma metric analysis provides a benchmark for the laboratory to design a protocol for IQC, address poor assay performance, and assess the efficiency of existing laboratory processes.
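
    As a back-of-the-envelope illustration of the sigma-metric and quality goal index calculations discussed in this record (the allowable error, bias, and CV values below are invented, not the study's data):

        # Sketch: sigma metric and quality goal index (QGI) for one analyte.
        # sigma = (TEa% - |bias%|) / CV%   and   QGI = |bias%| / (1.5 * CV%)
        # Numbers are illustrative, not taken from the study.
        def sigma_metric(tea_pct, bias_pct, cv_pct):
            return (tea_pct - abs(bias_pct)) / cv_pct

        def quality_goal_index(bias_pct, cv_pct):
            return abs(bias_pct) / (1.5 * cv_pct)

        tea, bias, cv = 10.0, 2.0, 1.5          # hypothetical allowable total error, bias, imprecision (%)
        print(sigma_metric(tea, bias, cv))      # >= 6 would indicate 'ideal' performance
        print(quality_goal_index(bias, cv))     # < 0.8 points to imprecision, > 1.2 to inaccuracy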

  16. Toward a Visualization-Supported Workflow for Cyber Alert Management using Threat Models and Human-Centered Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Franklin, Lyndsey; Pirrung, Megan A.; Blaha, Leslie M.

    Cyber network analysts follow complex processes in their investigations of potential threats to their network. Much research is dedicated to providing automated tool support in the effort to make their tasks more efficient, accurate, and timely. This tool support comes in a variety of implementations, from machine learning algorithms that monitor streams of data to visual analytic environments for exploring rich and noisy data sets. Cyber analysts, however, often speak of a need for tools which help them merge the data they already have and help them establish appropriate baselines against which to compare potential anomalies. Furthermore, existing threat models that cyber analysts regularly use to structure their investigation are not often leveraged in support tools. We report on our work with cyber analysts to understand their analytic process and how one such model, the MITRE ATT&CK Matrix [32], is used to structure their analytic thinking. We present our efforts to map specific data needed by analysts into the threat model to inform our eventual visualization designs. We examine data mapping for gaps where the threat model is under-supported by either data or tools. We discuss these gaps as potential design spaces for future research efforts. We also discuss the design of a prototype tool that combines machine-learning and visualization components to support cyber analysts working with this threat model.

  17. Magnetic ionic liquids in analytical chemistry: A review.

    PubMed

    Clark, Kevin D; Nacham, Omprakash; Purslow, Jeffrey A; Pierson, Stephen A; Anderson, Jared L

    2016-08-31

    Magnetic ionic liquids (MILs) have recently generated a cascade of innovative applications in numerous areas of analytical chemistry. By incorporating a paramagnetic component within the cation or anion, MILs exhibit a strong response toward external magnetic fields. Careful design of the MIL structure has yielded magnetoactive compounds with unique physicochemical properties including high magnetic moments, enhanced hydrophobicity, and the ability to solvate a broad range of molecules. The structural tunability and paramagnetic properties of MILs have enabled magnet-based technologies that can easily be added to the analytical method workflow, complement needed extraction requirements, or target specific analytes. This review highlights the application of MILs in analytical chemistry and examines the important structural features of MILs that largely influence their physicochemical and magnetic properties. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Tool making, hand morphology and fossil hominins.

    PubMed

    Marzke, Mary W

    2013-11-19

    Was stone tool making a factor in the evolution of human hand morphology? Is it possible to find evidence in fossil hominin hands for this capability? These questions are being addressed with increasingly sophisticated studies that are testing two hypotheses; (i) that humans have unique patterns of grip and hand movement capabilities compatible with effective stone tool making and use of the tools and, if this is the case, (ii) that there exist unique patterns of morphology in human hands that are consistent with these capabilities. Comparative analyses of human stone tool behaviours and chimpanzee feeding behaviours have revealed a distinctive set of forceful pinch grips by humans that are effective in the control of stones by one hand during manufacture and use of the tools. Comparative dissections, kinematic analyses and biomechanical studies indicate that humans do have a unique pattern of muscle architecture and joint surface form and functions consistent with the derived capabilities. A major remaining challenge is to identify skeletal features that reflect the full morphological pattern, and therefore may serve as clues to fossil hominin manipulative capabilities. Hominin fossils are evaluated for evidence of patterns of derived human grip and stress-accommodation features.

  19. Tool making, hand morphology and fossil hominins

    PubMed Central

    Marzke, Mary W.

    2013-01-01

    Was stone tool making a factor in the evolution of human hand morphology? Is it possible to find evidence in fossil hominin hands for this capability? These questions are being addressed with increasingly sophisticated studies that are testing two hypotheses; (i) that humans have unique patterns of grip and hand movement capabilities compatible with effective stone tool making and use of the tools and, if this is the case, (ii) that there exist unique patterns of morphology in human hands that are consistent with these capabilities. Comparative analyses of human stone tool behaviours and chimpanzee feeding behaviours have revealed a distinctive set of forceful pinch grips by humans that are effective in the control of stones by one hand during manufacture and use of the tools. Comparative dissections, kinematic analyses and biomechanical studies indicate that humans do have a unique pattern of muscle architecture and joint surface form and functions consistent with the derived capabilities. A major remaining challenge is to identify skeletal features that reflect the full morphological pattern, and therefore may serve as clues to fossil hominin manipulative capabilities. Hominin fossils are evaluated for evidence of patterns of derived human grip and stress-accommodation features. PMID:24101624

  20. Accelerated bridge construction (ABC) decision making and economic modeling tool.

    DOT National Transportation Integrated Search

    2011-12-01

    In this FHWA-sponsored pool funded study, a set of decision making tools, based on the Analytic Hierarchy Process (AHP) was developed. This tool set is prepared for transportation specialists and decision-makers to determine if ABC is more effective ...
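
    The Analytic Hierarchy Process named above derives criterion weights from a pairwise-comparison matrix; the sketch below uses the textbook principal-eigenvector approach with invented judgement values, and is not necessarily the tool set's exact implementation.

        # Sketch: AHP criterion weights and consistency index from a pairwise-comparison matrix.
        # The judgement values (cost vs time vs safety) are illustrative.
        import numpy as np

        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)                      # principal eigenvalue
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                                     # normalised priority weights

        ci = (eigvals.real[k] - len(A)) / (len(A) - 1)   # consistency index
        print(w, ci)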

  1. 17 CFR 49.17 - Access to SDR data.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... legal and statutory responsibilities under the Act and related regulations. (2) Monitoring tools. A registered swap data repository is required to provide the Commission with proper tools for the monitoring... data structure and content. These monitoring tools shall be substantially similar in analytical...

  2. 17 CFR 49.17 - Access to SDR data.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... legal and statutory responsibilities under the Act and related regulations. (2) Monitoring tools. A registered swap data repository is required to provide the Commission with proper tools for the monitoring... data structure and content. These monitoring tools shall be substantially similar in analytical...

  3. Assessment Tools' Indicators for Sustainability in Universities: An Analytical Overview

    ERIC Educational Resources Information Center

    Alghamdi, Naif; den Heijer, Alexandra; de Jonge, Hans

    2017-01-01

    Purpose: The purpose of this paper is to analyse 12 assessment tools of sustainability in universities and develop the structure and the contents of these tools to be more intelligible. The configuration of the tools reviewed highlights indicators that clearly communicate only the essential information. This paper explores how the theoretical…

  4. Art, Meet Chemistry; Chemistry, Meet Art: Case Studies, Current Literature, and Instrumental Methods Combined to Create a Hands-On Experience for Nonmajors and Instrumental Analysis Students

    ERIC Educational Resources Information Center

    Nivens, Delana A.; Padgett, Clifford W.; Chase, Jeffery M.; Verges, Katie J.; Jamieson, Deborah S.

    2010-01-01

    Case studies and current literature are combined with spectroscopic analysis to provide a unique chemistry experience for art history students and to provide a unique inquiry-based laboratory experiment for analytical chemistry students. The XRF analysis method was used to demonstrate to nonscience majors (art history students) a powerful…

  5. MRMPlus: an open source quality control and assessment tool for SRM/MRM assay development.

    PubMed

    Aiyetan, Paul; Thomas, Stefani N; Zhang, Zhen; Zhang, Hui

    2015-12-12

    Selected and multiple reaction monitoring involves monitoring a multiplexed assay of proteotypic peptides and associated transitions in mass spectrometry runs. To describe peptides and associated transitions as stable, quantifiable, and reproducible representatives of proteins of interest, experimental and analytical validation is required. However, inadequate and disparate analytical tools and validation methods predispose assay performance measures to errors and inconsistencies. Implemented as a freely available, open-source tool in the platform-independent Java programming language, MRMPlus computes analytical measures as recommended recently by the Clinical Proteomics Tumor Analysis Consortium Assay Development Working Group for "Tier 2" assays - that is, non-clinical assays sufficient to measure changes due to both biological and experimental perturbations. Computed measures include: limit of detection, lower limit of quantification, linearity, carry-over, partial validation of specificity, and upper limit of quantification. MRMPlus streamlines the assay development analytical workflow and therefore minimizes error predisposition. MRMPlus may also be used for performance estimation for targeted assays not described by the Assay Development Working Group. MRMPlus' source code and compiled binaries can be freely downloaded from https://bitbucket.org/paiyetan/mrmplusgui and https://bitbucket.org/paiyetan/mrmplusgui/downloads respectively.
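
    As a generic illustration of two of the figures of merit listed in this record (not MRMPlus's actual algorithm), limit of detection and lower limit of quantification are often estimated from a linear calibration curve as 3.3·σ/slope and 10·σ/slope; the concentrations and responses below are hypothetical.

        # Sketch: LOD / LLOQ from a linear calibration curve (generic approach, not MRMPlus code).
        import numpy as np

        conc = np.array([0.5, 1, 2, 5, 10, 20])          # hypothetical spiked concentrations
        resp = np.array([0.9, 2.1, 4.2, 9.8, 19.5, 41])  # hypothetical peak-area responses

        slope, intercept = np.polyfit(conc, resp, 1)
        residual_sd = np.std(resp - (slope * conc + intercept), ddof=2)

        lod = 3.3 * residual_sd / slope                  # limit of detection
        lloq = 10 * residual_sd / slope                  # lower limit of quantification
        print(lod, lloq)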

  6. Analytical challenges for conducting rapid metabolism characterization for QIVIVE.

    PubMed

    Tolonen, Ari; Pelkonen, Olavi

    2015-06-05

    For quantitative in vitro-in vivo extrapolation (QIVIVE) of metabolism for the purposes of toxicokinetics prediction, a precise and robust analytical technique for identifying and measuring a chemical and its metabolites is an absolute prerequisite. Currently, high-resolution mass spectrometry (HR-MS) is a tool of choice for a majority of organic relatively lipophilic molecules, linked with a LC separation tool and simultaneous UV-detection. However, additional techniques such as gas chromatography, radiometric measurements and NMR, are required to cover the whole spectrum of chemical structures. To accumulate enough reliable and robust data for the validation of QIVIVE, there are some partially opposing needs: Detailed delineation of the in vitro test system to produce a reliable toxicokinetic measure for a studied chemical, and a throughput capacity of the in vitro set-up and the analytical tool as high as possible. We discuss current analytical challenges for the identification and quantification of chemicals and their metabolites, both stable and reactive, focusing especially on LC-MS techniques, but simultaneously attempting to pinpoint factors associated with sample preparation, testing conditions and strengths and weaknesses of a particular technique available for a particular task. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  7. Big data analytics in immunology: a knowledge-based approach.

    PubMed

    Zhang, Guang Lan; Sun, Jing; Chitkushev, Lou; Brusic, Vladimir

    2014-01-01

    With the vast amount of immunological data available, immunology research is entering the big data era. These data vary in granularity, quality, and complexity and are stored in various formats, including publications, technical reports, and databases. The challenge is to make the transition from data to actionable knowledge and wisdom and bridge the knowledge gap and application gap. We report a knowledge-based approach based on a framework called KB-builder that facilitates data mining by enabling fast development and deployment of web-accessible immunological data knowledge warehouses. Immunological knowledge discovery relies heavily on both the availability of accurate, up-to-date, and well-organized data and the proper analytics tools. We propose the use of knowledge-based approaches by developing knowledgebases combining well-annotated data with specialized analytical tools and integrating them into analytical workflow. A set of well-defined workflow types with rich summarization and visualization capacity facilitates the transformation from data to critical information and knowledge. By using KB-builder, we enabled streamlining of normally time-consuming processes of database development. The knowledgebases built using KB-builder will speed up rational vaccine design by providing accurate and well-annotated data coupled with tailored computational analysis tools and workflow.

  8. Update on SLD Engineering Tools Development

    NASA Technical Reports Server (NTRS)

    Miller, Dean R.; Potapczuk, Mark G.; Bond, Thomas H.

    2004-01-01

    The airworthiness authorities (FAA, JAA, Transport Canada) will be releasing a draft rule in the 2006 timeframe concerning the operation of aircraft in a Supercooled Large Droplet (SLD) environment aloft. The draft rule will require aircraft manufacturers to demonstrate that their aircraft can operate safely in an SLD environment for a period of time to facilitate a safe exit from the condition. It is anticipated that aircraft manufacturers will require a capability to demonstrate compliance with this rule via experimental means (icing tunnels or tankers) and by analytical means (ice prediction codes). Since existing icing research facilities and analytical codes were not developed to account for SLD conditions, current engineering tools are not adequate to support compliance activities in SLD conditions. Therefore, existing capabilities need to be augmented to include SLD conditions. In response to this need, NASA and its partners conceived a strategy or Roadmap for developing experimental and analytical SLD simulation tools. Following review and refinement by the airworthiness authorities and other international research partners, this technical strategy has been crystallized into a project plan to guide the SLD Engineering Tool Development effort. This paper will provide a brief overview of the latest version of the project plan and technical rationale, and provide a status of selected SLD Engineering Tool Development research tasks which are currently underway.

  9. 76 FR 70517 - Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-14

    ... requested. These systems generally also provide analytics, spreadsheets, and other tools designed to enable funds to analyze the data presented, as well as communication tools to process fund instructions...

  10. Process analytical technology in the pharmaceutical industry: a toolkit for continuous improvement.

    PubMed

    Scott, Bradley; Wilcock, Anne

    2006-01-01

    Process analytical technology (PAT) refers to a series of tools used to ensure that quality is built into products while at the same time improving the understanding of processes, increasing efficiency, and decreasing costs. It has not been widely adopted by the pharmaceutical industry. As the setting for this paper, the current pharmaceutical manufacturing paradigm and PAT guidance to date are discussed prior to the review of PAT principles and tools, benefits, and challenges. The PAT toolkit contains process analyzers, multivariate analysis tools, process control tools, and continuous improvement/knowledge management/information technology systems. The integration and implementation of these tools is complex, and has resulted in uncertainty with respect to both regulation and validation. The paucity of staff knowledgeable in this area may complicate adoption. Studies to quantitate the benefits resulting from the adoption of PAT within the pharmaceutical industry would be a valuable addition to the qualitative studies that are currently available.
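
    One common multivariate-analysis tool in the PAT toolkit described above is principal component analysis combined with a Hotelling T² statistic for batch or process monitoring; the sketch below is a hedged illustration with random stand-in data, and the control limit is a simple empirical choice rather than an industry-standard setting.

        # Sketch: PCA-based multivariate process monitoring with a Hotelling T^2 statistic.
        # Data, component count, and control limit are illustrative.
        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(1)
        X_ref = rng.standard_normal((50, 20))            # 50 in-control batches x 20 process variables

        pca = PCA(n_components=3).fit(X_ref)
        scores = pca.transform(X_ref)
        t2_ref = np.sum(scores**2 / pca.explained_variance_, axis=1)   # Hotelling T^2 per batch
        limit = np.percentile(t2_ref, 99)                # crude empirical control limit

        x_new = rng.standard_normal((1, 20))             # a new batch to check
        t2_new = np.sum(pca.transform(x_new)**2 / pca.explained_variance_, axis=1)[0]
        print(t2_new > limit)                            # flag if outside the control limit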

  11. The Rack-Gear Tool Generation Modelling. Non-Analytical Method Developed in CATIA, Using the Relative Generating Trajectories Method

    NASA Astrophysics Data System (ADS)

    Teodor, V. G.; Baroiu, N.; Susac, F.; Oancea, N.

    2016-11-01

    Modelling the family of surfaces associated with a pair of rolling centrodes, when the profile of the rack-gear's teeth is known from direct measurement as a coordinate matrix, has as its goal determining the generating quality for an imposed kinematics of the relative motion of the tool with respect to the blank. In this way, it is possible to determine the generating geometrical error, as a baseline component of the total error. The generation modelling allows highlighting potential errors of the generating tool, so that its profile can be corrected before the tool is used in the machining process. A method developed in CATIA is proposed, based on a new approach, namely the method of “relative generating trajectories”. The analytical foundation is presented, together with some applications to known models of rack-gear type tools used on Maag gear-cutting machines.

  12. SU-C-204-01: A Fast Analytical Approach for Prompt Gamma and PET Predictions in a TPS for Proton Range Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kroniger, K; Herzog, M; Landry, G

    2015-06-15

    Purpose: We describe and demonstrate a fast analytical tool for prompt-gamma emission prediction based on filter functions applied to the depth dose profile. We present the implementation in a treatment planning system (TPS) of the same algorithm for positron emitter distributions. Methods: The prediction of the desired observable is based on the convolution of filter functions with the depth dose profile. For both prompt-gammas and positron emitters, the results of Monte Carlo simulations (MC) are compared with those of the analytical tool. For prompt-gamma emission from inelastic proton-induced reactions, homogeneous and inhomogeneous phantoms along with patient data are used as irradiation targets of mono-energetic proton pencil beams. The accuracy of the tool is assessed in terms of the shape of the analytically calculated depth profiles and their absolute yields, compared to MC. For the positron emitters, the method is implemented in a research RayStation TPS and compared to MC predictions. Digital phantoms and patient data are used and positron emitter spatial density distributions are analyzed. Results: Calculated prompt-gamma profiles agree with MC within 3% in terms of absolute yield and reproduce the correct shape. Based on an arbitrary reference material and by means of 6 filter functions (one per chemical element), profiles in any other material composed of those elements can be predicted. The TPS-implemented algorithm is accurate enough to enable, via the analytically calculated positron emitter profiles, detection of range differences between the TPS and MC with errors of the order of 1–2 mm. Conclusion: The proposed analytical method predicts prompt-gamma and positron emitter profiles which generally agree with the distributions obtained by a full MC. The implementation of the tool in a TPS shows that reliable profiles can be obtained directly from the dose calculated by the TPS, without the need for a full MC simulation.
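
    A minimal sketch of the convolution idea in this record, applying a filter function to a depth-dose profile to predict an emission-like profile; the dose curve and filter shape below are placeholders, not the authors' per-element fitted filters.

        # Sketch: emission-like profile from convolving a filter function with a depth-dose curve.
        # The dose curve and filter are placeholders; the real method uses per-element fitted filters.
        import numpy as np

        z = np.linspace(0, 150, 300)                                 # depth (mm), illustrative grid
        dose = np.exp(-((z - 100) / 8.0) ** 2) + 0.3 * (z < 100)     # crude Bragg-curve-like shape

        filt = np.exp(-np.abs(z - 75) / 10.0)                        # placeholder filter function
        filt /= filt.sum()

        emission = np.convolve(dose, filt, mode="same")              # predicted profile on the same grid
        print(emission.argmax())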

  13. Applications of Raman Spectroscopy in Biopharmaceutical Manufacturing: A Short Review.

    PubMed

    Buckley, Kevin; Ryder, Alan G

    2017-06-01

    The production of active pharmaceutical ingredients (APIs) is currently undergoing its biggest transformation in a century. The changes are based on the rapid and dramatic introduction of protein- and macromolecule-based drugs (collectively known as biopharmaceuticals) and can be traced back to the huge investment in biomedical science (in particular in genomics and proteomics) that has been ongoing since the 1970s. Biopharmaceuticals (or biologics) are manufactured using biological-expression systems (such as mammalian, bacterial, insect cells, etc.) and have spawned a large (>€35 billion sales annually in Europe) and growing biopharmaceutical industry (BioPharma). The structural and chemical complexity of biologics, combined with the intricacy of cell-based manufacturing, imposes a huge analytical burden to correctly characterize and quantify both processes (upstream) and products (downstream). In small molecule manufacturing, advances in analytical and computational methods have been extensively exploited to generate process analytical technologies (PAT) that are now used for routine process control, leading to more efficient processes and safer medicines. In the analytical domain, biologic manufacturing is considerably behind and there is both a huge scope and need to produce relevant PAT tools with which to better control processes, and better characterize product macromolecules. Raman spectroscopy, a vibrational spectroscopy with a number of useful properties (nondestructive, non-contact, robustness) has significant potential advantages in BioPharma. Key among them are intrinsically high molecular specificity, the ability to measure in water, the requirement for minimal (or no) sample pre-treatment, the flexibility of sampling configurations, and suitability for automation. Here, we review and discuss a representative selection of the more important Raman applications in BioPharma (with particular emphasis on mammalian cell culture). The review shows that the properties of Raman have been successfully exploited to deliver unique and useful analytical solutions, particularly for online process monitoring. However, it also shows that its inherent susceptibility to fluorescence interference and the weakness of the Raman effect mean that it can never be a panacea. In particular, Raman-based methods are intrinsically limited by the chemical complexity and wide analyte-concentration-profiles of cell culture media/bioprocessing broths which limit their use for quantitative analysis. Nevertheless, with appropriate foreknowledge of these limitations and good experimental design, robust analytical methods can be produced. In addition, new technological developments such as time-resolved detectors, advanced lasers, and plasmonics offer potential of new Raman-based methods to resolve existing limitations and/or provide new analytical insights.

  14. Predicted Arabidopsis Interactome Resource and Gene Set Linkage Analysis: A Transcriptomic Analysis Resource.

    PubMed

    Yao, Heng; Wang, Xiaoxuan; Chen, Pengcheng; Hai, Ling; Jin, Kang; Yao, Lixia; Mao, Chuanzao; Chen, Xin

    2018-05-01

    An advanced functional understanding of omics data is important for elucidating the design logic of physiological processes in plants and effectively controlling desired traits in plants. We present the latest versions of the Predicted Arabidopsis Interactome Resource (PAIR) and of the gene set linkage analysis (GSLA) tool, which enable the interpretation of an observed transcriptomic change (differentially expressed genes [DEGs]) in Arabidopsis ( Arabidopsis thaliana ) with respect to its functional impact for biological processes. PAIR version 5.0 integrates functional association data between genes in multiple forms and infers 335,301 putative functional interactions. GSLA relies on this high-confidence inferred functional association network to expand our perception of the functional impacts of an observed transcriptomic change. GSLA then interprets the biological significance of the observed DEGs using established biological concepts (annotation terms), describing not only the DEGs themselves but also their potential functional impacts. This unique analytical capability can help researchers gain deeper insights into their experimental results and highlight prospective directions for further investigation. We demonstrate the utility of GSLA with two case studies in which GSLA uncovered how molecular events may have caused physiological changes through their collective functional influence on biological processes. Furthermore, we showed that typical annotation-enrichment tools were unable to produce similar insights to PAIR/GSLA. The PAIR version 5.0-inferred interactome and GSLA Web tool both can be accessed at http://public.synergylab.cn/pair/. © 2018 American Society of Plant Biologists. All Rights Reserved.

  15. Web-based assessments of physical activity in youth: considerations for design and scale calibration.

    PubMed

    Saint-Maurice, Pedro F; Welk, Gregory J

    2014-12-01

    This paper describes the design and methods involved in calibrating a Web-based self-report instrument to estimate physical activity behavior. The limitations of self-report measures are well known, but calibration methods enable the reported information to be equated to estimates obtained from objective data. This paper summarizes design considerations for effective development and calibration of physical activity self-report measures. Each of the design considerations is put into context and followed by a practical application based on our ongoing calibration research with a promising online self-report tool called the Youth Activity Profile (YAP). We first describe the overall concept of calibration and how this influences the selection of appropriate self-report tools for this population. We point out the advantages and disadvantages of different monitoring devices since the choice of the criterion measure and the strategies used to minimize error in the measure can dramatically improve the quality of the data. We summarize strategies to ensure quality control in data collection and discuss analytical considerations involved in group- vs individual-level inference. For cross-validation procedures, we describe the advantages of equivalence testing procedures that directly test and quantify agreement. Lastly, we introduce the unique challenges encountered when transitioning from paper to a Web-based tool. The Web offers considerable potential for broad adoption but an iterative calibration approach focused on continued refinement is needed to ensure that estimates are generalizable across individuals, regions, seasons and countries.
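
    The equivalence-testing idea mentioned above (directly testing agreement rather than testing for a difference) can be sketched as two one-sided t-tests on paired differences; the ±5-minute margin and the paired data below are hypothetical and do not come from the YAP calibration work.

        # Sketch: two one-sided tests (TOST) for equivalence of self-report vs monitor minutes.
        # The equivalence margin (+/- 5 min) and the paired data are hypothetical.
        import numpy as np
        from scipy import stats

        diff = np.array([2.0, -1.5, 3.0, 0.5, -2.0, 1.0, 2.5, -0.5])   # self-report minus monitor
        margin = 5.0

        n = len(diff)
        se = diff.std(ddof=1) / np.sqrt(n)
        t_lower = (diff.mean() + margin) / se            # H0: mean difference <= -margin
        t_upper = (diff.mean() - margin) / se            # H0: mean difference >= +margin

        p_lower = 1 - stats.t.cdf(t_lower, df=n - 1)
        p_upper = stats.t.cdf(t_upper, df=n - 1)
        print(max(p_lower, p_upper) < 0.05)              # True -> conclude equivalence at the 5% level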

  16. Analytical approximations of the firing rate of an adaptive exponential integrate-and-fire neuron in the presence of synaptic noise.

    PubMed

    Hertäg, Loreen; Durstewitz, Daniel; Brunel, Nicolas

    2014-01-01

    Computational models offer a unique tool for understanding the network-dynamical mechanisms which mediate between physiological and biophysical properties, and behavioral function. A traditional challenge in computational neuroscience is, however, that simple neuronal models which can be studied analytically fail to reproduce the diversity of electrophysiological behaviors seen in real neurons, while detailed neuronal models which do reproduce such diversity are intractable analytically and computationally expensive. A number of intermediate models have been proposed whose aim is to capture the diversity of firing behaviors and spike times of real neurons while entailing the simplest possible mathematical description. One such model is the exponential integrate-and-fire neuron with spike rate adaptation (aEIF) which consists of two differential equations for the membrane potential (V) and an adaptation current (w). Despite its simplicity, it can reproduce a wide variety of physiologically observed spiking patterns, can be fit to physiological recordings quantitatively, and, once done so, is able to predict spike times on traces not used for model fitting. Here we compute the steady-state firing rate of aEIF in the presence of Gaussian synaptic noise, using two approaches. The first approach is based on the 2-dimensional Fokker-Planck equation that describes the (V,w)-probability distribution, which is solved using an expansion in the ratio between the time constants of the two variables. The second is based on the firing rate of the EIF model, which is averaged over the distribution of the w variable. These analytically derived closed-form expressions were tested on simulations from a large variety of model cells quantitatively fitted to in vitro electrophysiological recordings from pyramidal cells and interneurons. Theoretical predictions closely agreed with the firing rate of the simulated cells fed with in-vivo-like synaptic noise.
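
    For reference, the standard form of the aEIF model named in this record, written with conventional symbols rather than transcribed from the paper itself, is (in LaTeX notation):

        % Adaptive exponential integrate-and-fire (aEIF) model, conventional notation
        C \frac{dV}{dt} = -g_L (V - E_L) + g_L \Delta_T \exp\!\left(\frac{V - V_T}{\Delta_T}\right) - w + I(t),
        \qquad
        \tau_w \frac{dw}{dt} = a (V - E_L) - w

    with the reset rule that, when V crosses the spike cut-off, V is set back to V_r and w is incremented by b. The two time constants of V and w are what the paper's Fokker-Planck expansion exploits.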

  17. Synthesis of Feedback Controller for Chaotic Systems by Means of Evolutionary Techniques

    NASA Astrophysics Data System (ADS)

    Senkerik, Roman; Oplatkova, Zuzana; Zelinka, Ivan; Davendra, Donald; Jasek, Roman

    2011-06-01

    This research deals with the synthesis of control laws for three selected discrete chaotic systems by means of analytic programming. The novelty of the approach is that analytic programming, a tool for symbolic regression, is used for this kind of difficult problem. The paper describes analytic programming, the chaotic systems, and the cost function used. For the experiments, the Self-Organizing Migrating Algorithm (SOMA) was used together with analytic programming.

  18. THE FUTURE OF SUSTAINABLE MANAGEMENT APPROACHES AND REVITALIZATION TOOLS-ELECTRONIC (SMARTE): 2006-2010

    EPA Science Inventory

    SMARTe is being developed to give stakeholders information resources, analytical tools, communication strategies, and a decision analysis approach to be able to make better decisions regarding future uses of property. The development of the communication tools and decision analys...

  19. Paper SERS chromatography for detection of trace analytes in complex samples

    NASA Astrophysics Data System (ADS)

    Yu, Wei W.; White, Ian M.

    2013-05-01

    We report the application of paper SERS substrates for the detection of trace quantities of multiple analytes in a complex sample in the form of paper chromatography. Paper chromatography facilitates the separation of different analytes from a complex sample into distinct sections in the chromatogram, which can then be uniquely identified using SERS. As an example, the separation and quantitative detection of heroin in a highly fluorescent mixture is demonstrated. Paper SERS chromatography has obvious applications, including law enforcement, food safety, and border protection, and facilitates the rapid detection of chemical and biological threats at the point of sampling.

  20. NCI-FDA Interagency Oncology Task Force Workshop Provides Guidance for Analytical Validation of Protein-based Multiplex Assays | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    An NCI-FDA Interagency Oncology Task Force (IOTF) Molecular Diagnostics Workshop was held on October 30, 2008 in Cambridge, MA, to discuss requirements for analytical validation of protein-based multiplex technologies in the context of their intended use. This workshop, developed through NCI's Clinical Proteomic Technologies for Cancer initiative and the FDA, focused on technology-specific analytical validation processes to be addressed prior to use in clinical settings. Making this workshop unique, a case study approach was used to discuss issues related to

  1. Multi-Instrument Characterization of the Surfaces and Materials in Microfabricated, Carbon Nanotube-Templated Thin Layer Chromatography Plates. An Analogy to ‘The Blind Men and the Elephant’

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jensen, David S.; Kanyal, Supriya S.; Madaan, Nitesh

    Herein we apply a suite of surface/materials analytical tools to characterize some of the materials created in the production of microfabricated thin layer chromatography plates. Techniques used include X-ray photoelectron spectroscopy (XPS), valence band spectroscopy, static time-of-flight secondary ion mass spectrometry (ToF-SIMS) in both positive and negative ion modes, Rutherford backscattering spectroscopy (RBS), and helium ion microscopy (HIM). Materials characterized include: the Si(100) substrate with native oxide: Si/SiO2, alumina (35 nm) deposited as a diffusion barrier on the Si/SiO2: Si/SiO2/Al2O3, iron (6 nm) thermally evaporated on the Al2O3: Si/SiO2/Al2O3/Fe, the iron film annealed in H2 to make Fe catalyst nanoparticles: Si/SiO2/Al2O3/Fe(NP), and carbon nanotubes (CNTs) grown from the Fe nanoparticles: Si/SiO2/Al2O3/Fe(NP)/CNT. The Fe thin films and nanoparticles are found in an oxidized state. Some of the analyses of the CNTs/CNT forests reported appear to be unique: the CNT forest appears to exhibit an interesting ‘channeling’ phenomenon by RBS, we observe an odd-even effect in the ToF-SIMS spectra of Cn- species for n = 1 – 6, with ions at even n showing greater intensity than the neighboring signals, and ions with n ≥ 6 showing a steady decrease in intensity, and valence band characterization of CNTs using X-radiation is reported. The information obtained from the combination of the different analytical tools provides a more complete understanding of our materials than a single technique, which is analogous to the story of ‘The Blind Men and the Elephant’. (Of course there is increasing emphasis on the use of multiple characterization tools in surface and materials analysis.) The raw XPS and ToF-SIMS spectra from this study will be submitted to Surface Science Spectra for archiving.

  2. Farm Management Support on Cloud Computing Platform: A System for Cropland Monitoring Using Multi-Source Remotely Sensed Data

    NASA Astrophysics Data System (ADS)

    Coburn, C. A.; Qin, Y.; Zhang, J.; Staenz, K.

    2015-12-01

    Food security is one of the most pressing issues facing humankind. Recent estimates predict that over one billion people don't have enough food to meet their basic nutritional needs. The ability of remote sensing tools to monitor and model crop production and predict crop yield is essential for providing governments and farmers with vital information to ensure food security. Google Earth Engine (GEE) is a cloud computing platform, which integrates storage and processing algorithms for massive remotely sensed imagery and vector data sets. By providing the capabilities of storing and analyzing the data sets, it provides an ideal platform for the development of advanced analytic tools for extracting key variables used in regional and national food security systems. With the high performance computing and storing capabilities of GEE, a cloud-computing based system for near real-time crop land monitoring was developed using multi-source remotely sensed data over large areas. The system is able to process and visualize the MODIS time series NDVI profile in conjunction with Landsat 8 image segmentation for crop monitoring. With multi-temporal Landsat 8 imagery, the crop fields are extracted using the image segmentation algorithm developed by Baatz et al.[1]. The MODIS time series NDVI data are modeled by TIMESAT [2], a software package developed for analyzing time series of satellite data. The seasonality of MODIS time series data, for example, the start date of the growing season, length of growing season, and NDVI peak at a field-level are obtained for evaluating the crop-growth conditions. The system fuses MODIS time series NDVI data and Landsat 8 imagery to provide information of near real-time crop-growth conditions through the visualization of MODIS NDVI time series and comparison of multi-year NDVI profiles. Stakeholders, i.e., farmers and government officers, are able to obtain crop-growth information at crop-field level online. This unique utilization of GEE in combination with advanced analytic and extraction techniques provides a vital remote sensing tool for decision makers and scientists with a high-degree of flexibility to adapt to different uses.
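
    As a rough, generic sketch of the field-level seasonality metrics mentioned above (start of season, season length, NDVI peak), the snippet below applies a simple amplitude-threshold rule to an NDVI profile; it is not the TIMESAT fitting procedure, and the profile and 20%-of-amplitude threshold are invented.

        # Sketch: crude seasonality metrics from an NDVI time series (not the TIMESAT algorithm).
        import numpy as np

        doy = np.arange(0, 365, 16)                                  # MODIS-like 16-day composites
        ndvi = 0.2 + 0.5 * np.exp(-((doy - 200) / 45.0) ** 2)        # hypothetical seasonal profile

        threshold = ndvi.min() + 0.2 * (ndvi.max() - ndvi.min())     # 20%-of-amplitude rule of thumb
        growing = np.where(ndvi > threshold)[0]

        start_of_season = doy[growing[0]]
        end_of_season = doy[growing[-1]]
        season_length = end_of_season - start_of_season
        peak_ndvi = ndvi.max()
        print(start_of_season, season_length, peak_ndvi)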

  3. Edesign: Primer and Enhanced Internal Probe Design Tool for Quantitative PCR Experiments and Genotyping Assays.

    PubMed

    Kimura, Yasumasa; Soma, Takahiro; Kasahara, Naoko; Delobel, Diane; Hanami, Takeshi; Tanaka, Yuki; de Hoon, Michiel J L; Hayashizaki, Yoshihide; Usui, Kengo; Harbers, Matthias

    2016-01-01

    Analytical PCR experiments preferably use internal probes for monitoring the amplification reaction and specific detection of the amplicon. Such internal probes have to be designed in close context with the amplification primers, and may require additional considerations for the detection of genetic variations. Here we describe Edesign, a new online and stand-alone tool for designing sets of PCR primers together with an internal probe for conducting quantitative real-time PCR (qPCR) and genotypic experiments. Edesign can be used for selecting standard DNA oligonucleotides like for instance TaqMan probes, but has been further extended with new functions and enhanced design features for Eprobes. Eprobes, with their single thiazole orange-labelled nucleotide, allow for highly sensitive genotypic assays because of their higher DNA binding affinity as compared to standard DNA oligonucleotides. Using new thermodynamic parameters, Edesign considers unique features of Eprobes during primer and probe design for establishing qPCR experiments and genotyping by melting curve analysis. Additional functions in Edesign allow probe design for effective discrimination between wild-type sequences and genetic variations either using standard DNA oligonucleotides or Eprobes. Edesign can be freely accessed online at http://www.dnaform.com/edesign2/, and the source code is available for download.
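
    As a toy illustration of the thermodynamic screening a primer/probe design tool performs, the sketch below uses the classical GC-content approximations for melting temperature; these are not Edesign's nearest-neighbour or Eprobe parameters, and the primer sequence is hypothetical.

        # Sketch: rough melting-temperature estimates for a candidate primer (classical rules only).
        def tm_wallace(seq):                     # Wallace rule, reasonable for oligos shorter than ~14 nt
            s = seq.upper()
            return 2 * (s.count("A") + s.count("T")) + 4 * (s.count("G") + s.count("C"))

        def tm_gc(seq):                          # rough GC-content formula for longer oligos
            s = seq.upper()
            gc = s.count("G") + s.count("C")
            return 64.9 + 41.0 * (gc - 16.4) / len(s)

        primer = "AGCTTGCACGTAGCATGCCA"          # hypothetical 20-mer
        print(tm_wallace(primer), tm_gc(primer))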

  4. Hidden Costs: the ethics of cost-effectiveness analyses for health interventions in resource-limited settings

    PubMed Central

    Rutstein, Sarah E.; Price, Joan T.; Rosenberg, Nora E.; Rennie, Stuart M.; Biddle, Andrea K.; Miller, William C.

    2017-01-01

    Cost-effectiveness analysis (CEA) is an increasingly appealing tool for evaluating and comparing health-related interventions in resource-limited settings. The goal is to inform decision-makers regarding the health benefits and associated costs of alternative interventions, helping guide allocation of limited resources by prioritizing interventions that offer the most health for the least money. Although only one component of a more complex decision-making process, CEAs influence the distribution of healthcare resources, directly influencing morbidity and mortality for the world’s most vulnerable populations. However, CEA-associated measures are frequently setting-specific valuations, and CEA outcomes may violate ethical principles of equity and distributive justice. We examine the assumptions and analytical tools used in CEAs that may conflict with societal values. We then evaluate contextual features unique to resource-limited settings, including the source of health-state utilities and disability weights; implications of CEA thresholds in light of economic uncertainty; and the role of external donors. Finally, we explore opportunities to help align interpretation of CEA outcomes with values and budgetary constraints in resource-limited settings. The ethical implications of CEAs in resource-limited settings are vast. It is imperative that CEA outcome summary measures and implementation thresholds adequately reflect societal values and ethical priorities in resource-limited settings. PMID:27141969

  5. Edesign: Primer and Enhanced Internal Probe Design Tool for Quantitative PCR Experiments and Genotyping Assays

    PubMed Central

    Kasahara, Naoko; Delobel, Diane; Hanami, Takeshi; Tanaka, Yuki; de Hoon, Michiel J. L.; Hayashizaki, Yoshihide; Usui, Kengo; Harbers, Matthias

    2016-01-01

    Analytical PCR experiments preferably use internal probes for monitoring the amplification reaction and specific detection of the amplicon. Such internal probes have to be designed in close context with the amplification primers, and may require additional considerations for the detection of genetic variations. Here we describe Edesign, a new online and stand-alone tool for designing sets of PCR primers together with an internal probe for conducting quantitative real-time PCR (qPCR) and genotypic experiments. Edesign can be used for selecting standard DNA oligonucleotides like for instance TaqMan probes, but has been further extended with new functions and enhanced design features for Eprobes. Eprobes, with their single thiazole orange-labelled nucleotide, allow for highly sensitive genotypic assays because of their higher DNA binding affinity as compared to standard DNA oligonucleotides. Using new thermodynamic parameters, Edesign considers unique features of Eprobes during primer and probe design for establishing qPCR experiments and genotyping by melting curve analysis. Additional functions in Edesign allow probe design for effective discrimination between wild-type sequences and genetic variations either using standard DNA oligonucleotides or Eprobes. Edesign can be freely accessed online at http://www.dnaform.com/edesign2/, and the source code is available for download. PMID:26863543

  6. Contributions of adipose tissue architectural and tensile properties toward defining healthy and unhealthy obesity.

    PubMed

    Lackey, Denise E; Burk, David H; Ali, Mohamed R; Mostaedi, Rouzbeh; Smith, William H; Park, Jiyoung; Scherer, Philipp E; Seay, Shundra A; McCoin, Colin S; Bonaldo, Paolo; Adams, Sean H

    2014-02-01

    The extracellular matrix (ECM) plays an important role in the maintenance of white adipose tissue (WAT) architecture and function, and proper ECM remodeling is critical to support WAT malleability to accommodate changes in energy storage needs. Obesity and adipocyte hypertrophy place a strain on the ECM remodeling machinery, which may promote disordered ECM and altered tissue integrity and could promote proinflammatory and cell stress signals. To explore these questions, new methods were developed to quantify omental and subcutaneous WAT tensile strength and WAT collagen content by three-dimensional confocal imaging, using collagen VI knockout mice as a methods validation tool. These methods, combined with comprehensive measurement of WAT ECM proteolytic enzymes, transcript, and blood analyte analyses, were used to identify unique pathophenotypes of metabolic syndrome and type 2 diabetes mellitus in obese women, using multivariate statistical modeling and univariate comparisons with weight-matched healthy obese individuals. In addition to the expected differences in inflammation and glycemic control, approximately 20 ECM-related factors, including omental tensile strength, collagen, and enzyme transcripts, helped discriminate metabolically compromised obesity. This is consistent with the hypothesis that WAT ECM physiology is intimately linked to metabolic health in obese humans, and the studies provide new tools to explore this relationship.

  7. Proposed Facility Modifications to Support Propulsion Systems Testing Under Simulated Space Conditions at Plum Brook Station's Spacecraft Propulsion Research Facility (B-2)

    NASA Technical Reports Server (NTRS)

    Edwards, Daryl A.

    2008-01-01

    Preparing NASA's Plum Brook Station's Spacecraft Propulsion Research Facility (B-2) to support NASA's new generation of launch vehicles has raised many challenges for B-2's support staff. The facility provides a unique capability to test chemical propulsion systems/vehicles while simulating space thermal and vacuum environments. Designed and constructed in the early 1960s to support upper stage cryogenic engine/vehicle system development, the Plum Brook Station B-2 facility will require modifications to support the larger, more powerful, and more advanced engine systems for the next generation of vehicles leaving earth's orbit. Engine design improvements over the years have included large area expansion ratio nozzles, greater combustion chamber pressures, and advanced materials. Consequently, it has become necessary to determine what facility changes are required and how the facility can be adapted to support varying customers and their specific test needs. Exhaust system performance, including understanding the present facility capabilities, is the primary focus of this work. A variety of approaches and analytical tools are being employed to gain this understanding. This presentation discusses some of the challenges in applying these tools to this project and expected facility configuration to support the varying customer needs.

  8. Proposed Facility Modifications to Support Propulsion Systems Testing Under Simulated Space Conditions at Plum Brook Station's Spacecraft Propulsion Research Facility (B-2)

    NASA Technical Reports Server (NTRS)

    Edwards, Daryl A.

    2007-01-01

    Preparing NASA's Plum Brook Station's Spacecraft Propulsion Research Facility (B-2) to support NASA's new generation of launch vehicles has raised many challenges for B-2's support staff. The facility provides a unique capability to test chemical propulsion systems/vehicles while simulating space thermal and vacuum environments. Designed and constructed 4 decades ago to support upper stage cryogenic engine/vehicle system development, the Plum Brook Station B-2 facility will require modifications to support the larger, more powerful, and more advanced engine systems for the next generation of vehicles leaving earth's orbit. Engine design improvements over the years have included large area expansion ratio nozzles, greater combustion chamber pressures, and advanced materials. Consequently, it has become necessary to determine what facility changes are required and how the facility can be adapted to support varying customers and their specific test needs. Instrumental in this task is understanding the present facility capabilities and identifying what reasonable changes can be implemented. A variety of approaches and analytical tools are being employed to gain this understanding. This paper discusses some of the challenges in applying these tools to this project and expected facility configuration to support the varying customer needs.

  9. Hidden costs: The ethics of cost-effectiveness analyses for health interventions in resource-limited settings.

    PubMed

    Rutstein, Sarah E; Price, Joan T; Rosenberg, Nora E; Rennie, Stuart M; Biddle, Andrea K; Miller, William C

    2017-10-01

    Cost-effectiveness analysis (CEA) is an increasingly appealing tool for evaluating and comparing health-related interventions in resource-limited settings. The goal is to inform decision-makers regarding the health benefits and associated costs of alternative interventions, helping guide allocation of limited resources by prioritising interventions that offer the most health for the least money. Although only one component of a more complex decision-making process, CEAs influence the distribution of health-care resources, directly influencing morbidity and mortality for the world's most vulnerable populations. However, CEA-associated measures are frequently setting-specific valuations, and CEA outcomes may violate ethical principles of equity and distributive justice. We examine the assumptions and analytical tools used in CEAs that may conflict with societal values. We then evaluate contextual features unique to resource-limited settings, including the source of health-state utilities and disability weights, implications of CEA thresholds in light of economic uncertainty, and the role of external donors. Finally, we explore opportunities to help align interpretation of CEA outcomes with values and budgetary constraints in resource-limited settings. The ethical implications of CEAs in resource-limited settings are vast. It is imperative that CEA outcome summary measures and implementation thresholds adequately reflect societal values and ethical priorities in resource-limited settings.

  10. Evaluating Changes in the Molecular Composition and Optical Properties of Pacific Ocean Dissolved Organic Matter (DOM) caused by Borodeuteride Reduction

    NASA Astrophysics Data System (ADS)

    Bianca, M.; Blough, N. V.; Del Vecchio, R.; Cartisano, C. M.; Schmitt-Kopplin, P.; Gonsior, M.

    2016-02-01

    Electrospray ionization Fourier transform ion cyclotron resonance mass spectrometry (ESI-FT-ICR MS) is a powerful tool to obtain detailed molecular information for complex DOM and was combined in this study with optical measurements to determine the molecular fingerprint of Pacific Ocean DOM before and after borodeuteride reduction. Selective chemical reduction using sodium borodeuteride has previously been demonstrated to produce unique mass markers of ketone- and aldehyde-containing species in ultrahigh resolution mass spectrometry. These functional groups have also been proposed to be responsible for chromophoric dissolved organic matter (CDOM) long-wavelength optical properties through charge transfer interactions, and their chemical reduction has been shown to irreversibly alter the CDOM optical properties. ESI-FT-ICR MS coupled with borodeuteride reduction was thus applied to the reference material Suwannee River Fulvic Acid (SRFA) and to CDOM extracts collected from Station ALOHA in the North Pacific Ocean during December 2014. Results showed distinct differences between samples collected at different depths, indicating that the combination of FT-ICR-MS with borodeuteride reduction is a useful analytical tool to further understand marine DOM molecular composition. When this method is combined with optical measurements, specific insights into the CDOM composition can also be obtained.

  11. Statistical Approaches to Assess Biosimilarity from Analytical Data.

    PubMed

    Burdick, Richard; Coffey, Todd; Gutka, Hiten; Gratzl, Gyöngyi; Conlon, Hugh D; Huang, Chi-Ting; Boyne, Michael; Kuehne, Henriette

    2017-01-01

    Protein therapeutics have unique critical quality attributes (CQAs) that define their purity, potency, and safety. The analytical methods used to assess CQAs must be able to distinguish clinically meaningful differences in comparator products, and the most important CQAs should be evaluated with the most statistical rigor. High-risk CQA measurements assess the most important attributes that directly impact the clinical mechanism of action or have known implications for safety, while the moderate- to low-risk characteristics may have a lower direct impact and thereby may have a broader range to establish similarity. Statistical equivalence testing is applied for high-risk CQA measurements to establish the degree of similarity (e.g., highly similar fingerprint, highly similar, or similar) of selected attributes. Notably, some high-risk CQAs (e.g., primary sequence or disulfide bonding) are qualitative (e.g., the same as the originator or not the same) and therefore not amenable to equivalence testing. For biosimilars, an important step is the acquisition of a sufficient number of unique originator drug product lots to measure the variability in the originator drug manufacturing process and provide sufficient statistical power for the analytical data comparisons. Together, these analytical evaluations, along with PK/PD and safety data (immunogenicity), provide the data necessary to determine if the totality of the evidence warrants a designation of biosimilarity and subsequent licensure for marketing in the USA. In this paper, a case study approach is used to provide examples of analytical similarity exercises and the appropriateness of statistical approaches for the example data.
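
    A hedged sketch of one equivalence-testing scheme of the kind described for high-risk quantitative CQAs: compare a confidence interval for the mean difference against a margin derived from the reference lots' variability. The ±1.5·SD margin, the 90% interval, and the lot values below are illustrative assumptions, not a regulatory prescription taken from this record.

        # Sketch: equivalence assessment for a quantitative CQA between test and reference lots.
        # Margin = 1.5 * SD of reference lots (illustrative choice); conclude equivalence if the
        # 90% confidence interval for the mean difference lies within +/- margin.
        import numpy as np
        from scipy import stats

        ref = np.array([98.0, 101.5, 99.2, 100.8, 97.9, 100.1, 99.6, 101.0])   # hypothetical potency, %
        test = np.array([99.1, 100.2, 98.8, 100.5, 99.7, 100.9])

        margin = 1.5 * ref.std(ddof=1)
        diff = test.mean() - ref.mean()
        se = np.sqrt(test.var(ddof=1) / len(test) + ref.var(ddof=1) / len(ref))
        df = len(test) + len(ref) - 2                      # simple degrees-of-freedom choice
        half_width = stats.t.ppf(0.95, df) * se            # 90% two-sided interval

        lower, upper = diff - half_width, diff + half_width
        print((-margin < lower) and (upper < margin))      # True -> within the equivalence margin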

  12. A novel analytical method for pharmaceutical polymorphs by terahertz spectroscopy and the optimization of crystal form at the discovery stage.

    PubMed

    Ikeda, Yukihiro; Ishihara, Yoko; Moriwaki, Toshiya; Kato, Eiji; Terada, Katsuhide

    2010-01-01

    A novel analytical method for the determination of pharmaceutical polymorphs was developed using terahertz spectroscopy. It was found that each polymorph of a substance shows a specific terahertz absorption spectrum. In particular, analysis of the second-derivative spectrum was enormously beneficial for discriminating closely related polymorphs that were difficult to discern by powder X-ray diffractometry. Crystal forms obtained by crystallization from various solvents and stored under various conditions were specifically characterized by the second derivative of each terahertz spectrum. Fractional polymorphic transformation of substances stored under stressed conditions was also identified by terahertz spectroscopy during solid-state stability testing, but could not be detected by powder X-ray diffractometry. Since polymorphs could be characterized clearly by terahertz spectroscopy, further physicochemical studies could be conducted in a timely manner. The development form of the compound examined was determined from the results of comprehensive physicochemical studies that included thermodynamic relationships as well as chemical and physicochemical stability. In conclusion, terahertz spectroscopy, which has unique power in the elucidation of molecular interactions within a crystal lattice, can play a more important role in physicochemical research. Terahertz spectroscopy has great potential as a tool for polymorph determination, particularly since the second derivative of the terahertz spectrum possesses high sensitivity for pharmaceutical polymorphs.
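
    The discrimination step described above rests on numerically differentiating each spectrum twice. The following sketch uses synthetic Gaussian bands standing in for two polymorphs; the Savitzky-Golay filter and its settings are assumptions chosen for illustration, not the authors' procedure.

      # Hedged sketch: second-derivative analysis of terahertz absorption spectra,
      # as used above to discriminate closely related polymorphs. The spectra are
      # synthetic Gaussians standing in for two polymorphs with overlapping bands.
      import numpy as np
      from scipy.signal import savgol_filter

      freq = np.linspace(0.5, 3.0, 500)                       # THz axis
      form_a = np.exp(-((freq - 1.50) / 0.10) ** 2)           # polymorph A band
      form_b = np.exp(-((freq - 1.55) / 0.10) ** 2)           # polymorph B band, slightly shifted

      # The second derivative sharpens overlapping features and suppresses baseline slope
      d2_a = savgol_filter(form_a, window_length=21, polyorder=3, deriv=2)
      d2_b = savgol_filter(form_b, window_length=21, polyorder=3, deriv=2)

      # Minima of the second derivative mark the band positions of the original spectra
      print("Form A band:", round(freq[np.argmin(d2_a)], 3), "THz")
      print("Form B band:", round(freq[np.argmin(d2_b)], 3), "THz")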

  13. Cytobank: providing an analytics platform for community cytometry data analysis and collaboration.

    PubMed

    Chen, Tiffany J; Kotecha, Nikesh

    2014-01-01

    Cytometry is used extensively in clinical and laboratory settings to diagnose and track cell subsets in blood and tissue. High-throughput, single-cell approaches leveraging cytometry are developed and applied in the computational and systems biology communities by researchers, who seek to improve the diagnosis of human diseases, map the structures of cell signaling networks, and identify new cell types. Data analysis and management present a bottleneck in the flow of knowledge from bench to clinic. Multi-parameter flow and mass cytometry enable identification of signaling profiles of patient cell samples. Currently, this process is manual, requiring hours of work to summarize multi-dimensional data and translate these data for input into other analysis programs. In addition, the increase in the number and size of collaborative cytometry studies as well as the computational complexity of analytical tools require the ability to assemble sufficient and appropriately configured computing capacity on demand. There is a critical need for platforms that can be used by both clinical and basic researchers who routinely rely on cytometry. Recent advances provide a unique opportunity to facilitate collaboration and analysis and management of cytometry data. Specifically, advances in cloud computing and virtualization are enabling efficient use of large computing resources for analysis and backup. An example is Cytobank, a platform that allows researchers to annotate, analyze, and share results along with the underlying single-cell data.

  14. Using Empirical Models for Communication Prediction of Spacecraft

    NASA Technical Reports Server (NTRS)

    Quasny, Todd

    2015-01-01

    A viable communication path to a spacecraft is vital for its successful operation. For human spaceflight, a reliable and predictable communication link between the spacecraft and the ground is essential not only for the safety of the vehicle and the success of the mission, but for the safety of the humans on board as well. However, analytical models of these communication links are challenged by unique characteristics of space and the vehicle itself. For example, the effects on a radio-frequency signal traveling through a spacecraft's solar array during high-energy solar events can be difficult to model, and thus to predict. This presentation covers the use of empirical methods for communication link prediction, using the International Space Station (ISS) and its associated historical data as the verification platform and test bed. These empirical methods can then be incorporated into communication prediction and automation tools for the ISS in order to better understand the quality of the communication path given a myriad of variables, including solar array positions, line of sight to satellites, position of the sun, and other dynamic structures on the outside of the ISS. The image on the left below shows the current analytical model of one of the communication systems on the ISS. The image on the right shows a rudimentary empirical model of the same system based on historical archived data from the ISS.
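
    As an illustration of the empirical approach described above, a regression model can be fit to archived link-quality records and then queried for an upcoming geometry. The feature names, synthetic data, and choice of learner below are illustrative assumptions and are not drawn from the ISS tools themselves.

      # Hedged sketch of an empirical link-quality model learned from historical
      # records, in the spirit of the approach above. Feature names, the synthetic
      # data, and the learner are illustrative assumptions only.
      import numpy as np
      from sklearn.ensemble import RandomForestRegressor

      rng = np.random.default_rng(0)
      n = 2000
      solar_array_angle = rng.uniform(0, 360, n)
      elevation = rng.uniform(0, 90, n)
      sun_angle = rng.uniform(0, 180, n)
      # Toy "historical" signal margin: degrades at low elevation and near array blockage
      signal_margin = (10 + 0.1 * elevation
                       - 3 * np.exp(-((solar_array_angle - 180) / 20) ** 2)
                       + rng.normal(0, 0.5, n))

      X = np.column_stack([solar_array_angle, elevation, sun_angle])
      model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, signal_margin)

      # Predict the expected margin for an upcoming configuration
      print(model.predict([[175.0, 35.0, 80.0]]))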

  15. Key statistical and analytical issues for evaluating treatment effects in periodontal research.

    PubMed

    Tu, Yu-Kang; Gilthorpe, Mark S

    2012-06-01

    Statistics is an indispensable tool for evaluating treatment effects in clinical research. Due to the complexities of periodontal disease progression and data collection, statistical analyses for periodontal research have been a great challenge for both clinicians and statisticians. The aim of this article is to provide an overview of several basic, but important, statistical issues related to the evaluation of treatment effects and to clarify some common statistical misconceptions. Some of these issues are general, concerning many disciplines, and some are unique to periodontal research. We first discuss several statistical concepts that have sometimes been overlooked or misunderstood by periodontal researchers. For instance, decisions about whether to use the t-test or analysis of covariance, or whether to use parametric tests such as the t-test or its non-parametric counterpart, the Mann-Whitney U-test, have perplexed many periodontal researchers. We also describe more advanced methodological issues that have sometimes been overlooked by researchers. For instance, the phenomenon of regression to the mean is a fundamental issue to be considered when evaluating treatment effects, and collinearity amongst covariates is a conundrum that must be resolved when explaining and predicting treatment effects. Quick and easy solutions to these methodological and analytical issues are not always available in the literature, and careful statistical thinking is paramount when conducting useful and meaningful research. © 2012 John Wiley & Sons A/S.
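
    The analysis choices mentioned above (t-test versus its non-parametric counterpart, and baseline adjustment to guard against regression to the mean) can be illustrated on synthetic data; everything below, including the effect sizes and variable names, is a made-up example rather than anything taken from the article.

      # Hedged sketch: t-test, Mann-Whitney U-test, and an ANCOVA-style model
      # adjusting post-treatment probing depth for its baseline value.
      import numpy as np
      import pandas as pd
      from scipy import stats
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(1)
      baseline = rng.normal(6.0, 1.0, 40)                     # mm probing depth
      group = np.repeat(["control", "test"], 20)
      change = np.where(group == "test", -1.5, -0.8) + 0.3 * (baseline - 6.0)
      post = baseline + change + rng.normal(0, 0.5, 40)

      t, p_t = stats.ttest_ind(post[group == "test"], post[group == "control"])
      u, p_u = stats.mannwhitneyu(post[group == "test"], post[group == "control"])

      # ANCOVA: post-treatment value adjusted for baseline (guards against
      # regression to the mean when groups differ at baseline)
      df = pd.DataFrame({"post": post, "baseline": baseline, "group": group})
      ancova = smf.ols("post ~ baseline + C(group)", data=df).fit()
      print(p_t, p_u)
      print(ancova.params)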

  16. Refraction-enhanced backlit imaging of axially symmetric inertial confinement fusion plasmas.

    PubMed

    Koch, Jeffrey A; Landen, Otto L; Suter, Laurence J; Masse, Laurent P; Clark, Daniel S; Ross, James S; Mackinnon, Andrew J; Meezan, Nathan B; Thomas, Cliff A; Ping, Yuan

    2013-05-20

    X-ray backlit radiographs of dense plasma shells can be significantly altered by refraction of x rays that would otherwise travel along straight-ray paths, and this effect can be a powerful tool for diagnosing the spatial structure of the plasma being radiographed. We explore the conditions under which refraction effects may be observed, and we use analytical and numerical approaches to quantify these effects for one-dimensional radial opacity and density profiles characteristic of inertial-confinement fusion (ICF) implosions. We also show how analytical and numerical approaches allow approximate radial plasma opacity and density profiles to be inferred from point-projection refraction-enhanced radiography data. This imaging technique can provide unique data on electron density profiles in ICF plasmas that cannot be obtained using other techniques, and the uniform illumination provided by point-like x-ray backlighters eliminates a significant source of uncertainty in inferences of plasma opacity profiles from area-backlit pinhole imaging data when the backlight spatial profile cannot be independently characterized. The technique is particularly suited to in-flight radiography of imploding low-opacity shells surrounding hydrogen ice, because refraction is sensitive to the electron density of the hydrogen plasma even when it is invisible to absorption radiography. It may also provide an alternative approach to timing shockwaves created by the implosion drive that are currently invisible to absorption radiography.

  17. Medical schools viewed from a political perspective: how political skills can improve education leadership.

    PubMed

    Nordquist, Jonas; Grigsby, R Kevin

    2011-12-01

    Political science offers a unique perspective from which to inform education leadership practice. This article views leadership in the health professions through the lens of political science research and offers suggestions for how theories derived from political science can be used to develop education leadership practice. Political science is rarely used in the health professions education literature. This article illuminates how this discipline can generate a more nuanced understanding of leadership in health professions education by offering a terminology, a conceptual framework and insights derived from more than 80 years of empirical work. Previous research supports the premise that successful leaders have a good understanding of political processes. Studies show current health professional education is characterised by the influence of interest groups. At the same time, the need for urgent reform of health professional education is evident. Terminology, concepts and analytical models from political science can be used to develop the political understanding of education leaders and to ultimately support the necessary changes. The analytical concepts of interest and power are applicable to current health professional education. The model presented - analysing the policy process - provides us with a tool to fine-tune our understanding of leadership challenges and hence to communicate, analyse and create strategies that allow health professional education to better meet tomorrow's challenges. © Blackwell Publishing Ltd 2011.

  18. A composite CBRN surveillance and testing service

    NASA Astrophysics Data System (ADS)

    Niemeyer, Debra M.

    2004-08-01

    The terrorist threat coupled with a global military mission necessitates quick and accurate identification of environmental hazards, and CBRN early warning. The Air Force Institute for Operational Health (AFIOH) provides fundamental support to protect personnel from, and mitigate the effects of, untoward hazard exposures. The organization has sustained healthy communities since 1955; its charter is to enhance warfighter mission effectiveness, protect health, improve readiness and reduce costs, and assess and manage risks to human health and safety, operational performance and the environment. The AFIOH Surveillance Directorate provides forward deployed and reach-back surveillance, agent identification, and environmental regulatory compliance testing. Three unique laboratories process and analyze over two million environmental samples and clinical specimens per year, providing analytical chemistry, radiological assessment, and infectious disease testing, in addition to supporting Air Force and Department of Defense (DoD) clinical reference laboratory and force health protection testing. Each laboratory has an applied or investigational testing section where new technologies and techniques are evaluated, and provides expert consultative support to assist in technology assessments and test analyses. The Epidemiology Surveillance Laboratory and Analytical Chemistry Laboratory are critical assets of the Centers for Disease Control and Prevention (CDC) National Laboratory Response Network. Deployable assets provide direct support to the Combatant Commander and include the Air Force Radiological Assessment Team and the Biological Augmentation Team. Within this diverse directorate, the synergistic CBRN response capabilities are a commander's force protection tool, critical to maintaining combat power.

  19. Behavior-Based Budget Management Using Predictive Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Troy Hiltbrand

    Historically, the mechanisms to perform forecasting have primarily used two common factors as a basis for future predictions: time and money. While time and money are very important aspects of determining future budgetary spend patterns, organizations represent a complex system of unique individuals with a myriad of associated behaviors, and all of these behaviors have bearing on how budget is utilized. When looking at forecasted budgets, it becomes a guessing game about how budget managers will behave under a given set of conditions. This becomes relatively messy when human nature is introduced, as different managers will react very differently under similar circumstances. While one manager becomes ultra-conservative during periods of financial austerity, another might be unfazed and continue to spend as they have in the past. Both might revert into a state of budgetary protectionism, masking what is truly happening at the budget-holder level, in order to keep as much budget and influence as possible while at the same time sacrificing the greater good of the organization. To more accurately predict future outcomes, the models should consider not only time and money but also other behavioral patterns that have been observed across the organization. The field of predictive analytics is poised to provide the tools and methodologies needed for organizations to do just this: capture and leverage behaviors of the past to predict the future.
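
    A minimal sketch of this idea is shown below: per-manager behavioral features derived from a toy spend ledger are added alongside the budget amount in a simple forecast model. The ledger, feature names, and linear model are illustrative assumptions rather than a description of any specific tool.

      # Hedged sketch: adding simple behavioral features (derived from each
      # manager's own history) to a spend forecast. All data are toy values.
      import pandas as pd
      from sklearn.linear_model import LinearRegression

      ledger = pd.DataFrame({
          "manager_id": [1, 1, 1, 2, 2, 2, 3, 3, 3],
          "month":      [1, 2, 3, 1, 2, 3, 1, 2, 3],
          "budget":     [100, 100, 100, 80, 80, 80, 120, 120, 120],
          "spend":      [90, 95, 99, 60, 55, 79, 118, 119, 121],
      })
      ledger["utilization"] = ledger["spend"] / ledger["budget"]

      # Behavioral features summarizing how each manager has behaved so far
      feats = ledger.groupby("manager_id")["utilization"].agg(
          mean_util="mean", util_volatility="std").reset_index()

      data = ledger.merge(feats, on="manager_id")
      X = data[["budget", "mean_util", "util_volatility"]]
      model = LinearRegression().fit(X, data["spend"])
      print(dict(zip(X.columns, model.coef_.round(2))))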

  20. Two-dimensional wavelet transform feature extraction for porous silicon chemical sensors.

    PubMed

    Murguía, José S; Vergara, Alexander; Vargas-Olmos, Cecilia; Wong, Travis J; Fonollosa, Jordi; Huerta, Ramón

    2013-06-27

    Designing reliable, fast-responding, highly sensitive, and low-power-consuming chemo-sensory systems has long been a major goal in chemo-sensing. This goal, however, presents a difficult challenge because a set of chemo-sensory detectors exhibiting all of these ideal characteristics remains largely unrealizable to date. This paper presents a unique perspective on capturing more in-depth insights into the physicochemical interactions of two distinct, selectively chemically modified porous silicon (pSi) film-based optical gas sensors by implementing an innovative signal-processing methodology, namely the two-dimensional discrete wavelet transform. Specifically, the method consists of using the two-dimensional discrete wavelet transform as a feature extraction method to capture the non-stationary behavior of the bi-dimensional pSi rugate sensor response. Utilizing a comprehensive set of measurements collected from each of the aforementioned optically based chemical sensors, we evaluate the significance of our approach on a complex, six-dimensional chemical analyte discrimination/quantification task. Due to the bi-dimensional aspects naturally governing the optical sensor response to chemical analytes, our findings provide evidence that the proposed feature extraction strategy may be a valuable tool to deepen our understanding of the performance of optically based chemical sensors, as well as an important step toward attaining their implementation in more realistic chemo-sensing applications. Copyright © 2013 Elsevier B.V. All rights reserved.
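
    The core feature-extraction step can be sketched with a 2-D discrete wavelet transform applied to a stand-in response map; the pywt library, the db4 wavelet, and the energy-per-sub-band features below are assumptions chosen for illustration, not the authors' exact pipeline.

      # Hedged sketch of 2-D discrete wavelet transform feature extraction on a
      # synthetic sensor-response "image" (e.g., time x wavelength).
      import numpy as np
      import pywt

      response = np.random.rand(128, 128)            # stand-in for a pSi sensor response map

      # Single-level 2-D DWT: approximation + horizontal/vertical/diagonal details
      cA, (cH, cV, cD) = pywt.dwt2(response, "db4")

      # Simple feature vector: energy of each sub-band
      features = [float(np.sum(c ** 2)) for c in (cA, cH, cV, cD)]
      print(features)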

  1. Statistically Qualified Neuro-Analytic system and Method for Process Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.

    1998-11-04

    An apparatus and method for monitoring a process involve development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two steps: deterministic model adaptation and stochastic model adaptation. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation error minimization technique. Stochastic model adaptation involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates the measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system.

  2. Analytical Tools to Improve Optimization Procedures for Lateral Flow Assays

    PubMed Central

    Hsieh, Helen V.; Dantzler, Jeffrey L.; Weigl, Bernhard H.

    2017-01-01

    Immunochromatographic or lateral flow assays (LFAs) are inexpensive, easy to use, point-of-care medical diagnostic tests that are found in arenas ranging from a doctor’s office in Manhattan to a rural medical clinic in low resource settings. The simplicity in the LFA itself belies the complex task of optimization required to make the test sensitive, rapid and easy to use. Currently, the manufacturers develop LFAs by empirical optimization of material components (e.g., analytical membranes, conjugate pads and sample pads), biological reagents (e.g., antibodies, blocking reagents and buffers) and the design of delivery geometry. In this paper, we will review conventional optimization and then focus on the latter and outline analytical tools, such as dynamic light scattering and optical biosensors, as well as methods, such as microfluidic flow design and mechanistic models. We are applying these tools to find non-obvious optima of lateral flow assays for improved sensitivity, specificity and manufacturing robustness. PMID:28555034

  3. User's and reference guide to the INEL RML/analytical radiochemistry sample tracking database version 1.00

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Femec, D.A.

    This report discusses the sample tracking database in use at the Idaho National Engineering Laboratory (INEL) by the Radiation Measurements Laboratory (RML) and Analytical Radiochemistry. The database was designed in-house to meet the specific needs of the RML and Analytical Radiochemistry. The report consists of two parts, a user's guide and a reference guide. The user's guide presents some of the fundamentals needed by anyone who will be using the database via its user interface. The reference guide describes the design of both the database and the user interface. Briefly mentioned in the reference guide are the code-generating tools, CREATE-SCHEMA and BUILD-SCREEN, written to automatically generate code for the database and its user interface. The appendices contain the input files used by these tools to create code for the sample tracking database. The output files generated by these tools are also included in the appendices.

  4. Exhaled breath condensate – from an analytical point of view

    PubMed Central

    Dodig, Slavica; Čepelak, Ivana

    2013-01-01

    Over the past three decades, the goal of many researchers has been the analysis of exhaled breath condensate (EBC) as a noninvasively obtained sample. Total quality in the laboratory diagnostic process of EBC analysis was investigated across the pre-analytical (formation, collection, storage of EBC), analytical (sensitivity of applied methods, standardization) and post-analytical (interpretation of results) phases. EBC analysis is still used as a research tool. Limitations in the pre-analytical, analytical, and post-analytical phases of EBC analysis are numerous: concentrations of EBC constituents are low, single-analyte methods lack sensitivity, multi-analyte methods have not been fully explored, and reference values are not established. When all pre-analytical, analytical and post-analytical requirements are met, EBC biomarkers as well as biomarker patterns can be selected and EBC analysis can hopefully be used in clinical practice, in both the diagnosis and the longitudinal follow-up of patients, resulting in better outcomes of disease. PMID:24266297

  5. Software Tools on the Peregrine System | High-Performance Computing | NREL

    Science.gov Websites

    NREL provides a variety of software tools on the Peregrine system, including debugger and performance-analysis tools, tools for understanding the behavior of MPI applications, Intel VTune, an environment for statistical computing and graphics, and VirtualGL/TurboVNC for remote visualization and analytics.

  6. Mass spectrometry as a quantitative tool in plant metabolomics

    PubMed Central

    Jorge, Tiago F.; Mata, Ana T.

    2016-01-01

    Metabolomics is a research field used to acquire comprehensive information on the composition of a metabolite pool to provide a functional screen of the cellular state. Studies of the plant metabolome include the analysis of a wide range of chemical species with very diverse physico-chemical properties, and therefore powerful analytical tools are required for the separation, characterization and quantification of this vast compound diversity present in plant matrices. In this review, challenges in the use of mass spectrometry (MS) as a quantitative tool in plant metabolomics experiments are discussed, and important criteria for the development and validation of MS-based analytical methods provided. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644967

  7. DCMS: A data analytics and management system for molecular simulation.

    PubMed

    Kumar, Anand; Grupcev, Vladimir; Berrada, Meryem; Fogarty, Joseph C; Tu, Yi-Cheng; Zhu, Xingquan; Pandit, Sagar A; Xia, Yuni

    Molecular Simulation (MS) is a powerful tool for studying physical/chemical features of large systems and has seen applications in many scientific and engineering domains. During the simulation process, the experiments generate data on a very large number of atoms, whose spatial and temporal relationships are to be observed for scientific analysis. The sheer data volumes and their intensive interactions impose significant challenges for data accessing, managing, and analysis. To date, existing MS software systems fall short on storage and handling of MS data, mainly because of the lack of a platform to support applications that involve intensive data access and analytical processing. In this paper, we present the database-centric molecular simulation (DCMS) system our team developed in the past few years. The main idea behind DCMS is to store MS data in a relational database management system (DBMS) to take advantage of the declarative query interface (i.e., SQL), data access methods, query processing, and optimization mechanisms of modern DBMSs. A unique challenge is to handle the analytical queries that are often compute-intensive. For that, we developed novel indexing and query processing strategies (including algorithms running on modern co-processors) as integrated components of the DBMS. As a result, researchers can upload and analyze their data using efficient functions implemented inside the DBMS. Index structures are generated to store analysis results that may be interesting to other users, so that the results are readily available without duplicating the analysis. We have developed a prototype of DCMS based on the PostgreSQL system, and experiments using real MS data and workloads show that DCMS significantly outperforms existing MS software systems. We also used it as a platform to test other data management issues such as security and compression.
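
    The database-centric idea can be sketched generically: store atom trajectories in a relational table and run analytical queries declaratively. The schema and query below are illustrative assumptions (and use SQLite for self-containment), not the actual PostgreSQL-based DCMS schema.

      # Hedged sketch: atom trajectories in a relational table, analyzed with SQL.
      import sqlite3

      con = sqlite3.connect(":memory:")
      con.execute("CREATE TABLE atoms (frame INT, atom_id INT, x REAL, y REAL, z REAL)")
      con.executemany("INSERT INTO atoms VALUES (?, ?, ?, ?, ?)",
                      [(0, 1, 0.0, 0.0, 0.0), (0, 2, 1.2, 0.0, 0.0),
                       (1, 1, 0.1, 0.0, 0.0), (1, 2, 1.3, 0.1, 0.0)])

      # Example analytical query: per-frame centroid of the system
      rows = con.execute("SELECT frame, AVG(x), AVG(y), AVG(z) FROM atoms GROUP BY frame")
      for frame, cx, cy, cz in rows:
          print(frame, cx, cy, cz)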

  8. Self-Catalyzing Chemiluminescence of Luminol-Diazonium Ion and Its Application for Catalyst-Free Hydrogen Peroxide Detection and Rat Arthritis Imaging.

    PubMed

    Zhao, Chunxin; Cui, Hongbo; Duan, Jing; Zhang, Shenghai; Lv, Jiagen

    2018-02-06

    We report the unique self-catalyzing chemiluminescence (CL) of the luminol-diazonium ion (N2+-luminol) and its analytical potential. Visible CL emission was initially observed when N2+-luminol was subjected to alkaline aqueous H2O2 without the aid of any catalysts. Further experimental investigations found peroxidase-like activity of N2+-luminol in the cleavage of H2O2 into the hydroxyl radical (•OH). Together with other experimental evidence, the suggested CL mechanism is the activation of N2+-luminol and its dediazotization product, 3-hydroxyl luminol, by •OH into the corresponding intermediate radicals, followed by further oxidation to excited-state 3-N2+-phthalic acid and 3-hydroxyphthalic acid, which finally produce 415 nm CL. The self-catalyzing CL of N2+-luminol provides an opportunity to achieve attractive catalyst-free CL detection of H2O2. Experiments demonstrated 10^-8 M level detection sensitivity to H2O2, as well as to glucose or uric acid if presubjected to glucose oxidase or uricase. With the example determination of serum glucose and uric acid, N2+-luminol shows its analytical potential for other analytes linked to the production or consumption of H2O2. Under physiological conditions, N2+-luminol exhibits highly selective and sensitive CL toward singlet oxygen (1O2) among the common reactive oxygen species. This capacity supports the application of N2+-luminol for detecting 1O2 in live animals. By imaging arthritis in LEW rats, N2+-luminol CL is demonstrated as a potential tool for mapping inflammation-relevant biological events in a live body.

  9. 14 CFR 1214.810 - Integration of payloads.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... performing the following typical Spacelab-payload mission management functions: (1) Analytical design of the... integration of experiments into racks and/or onto pallets. (5) Provision of payload unique software for use...

  10. NASTRAN as an analytical research tool for composite mechanics and composite structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Sinclair, J. H.; Sullivan, T. L.

    1976-01-01

    Selected examples are described in which NASTRAN is used as an analysis research tool for composite mechanics and for composite structural components. The examples were selected to illustrate the importance of using NASTRAN as an analysis tool in this rapidly advancing field.

  11. FDT 2.0: Improving scalability of the fuzzy decision tree induction tool - integrating database storage.

    PubMed

    Durham, Erin-Elizabeth A; Yu, Xiaxia; Harrison, Robert W

    2014-12-01

    Effective machine-learning handles large datasets efficiently. One key feature of handling large data is the use of databases such as MySQL. The freeware fuzzy decision tree induction tool, FDT, is a scalable supervised-classification software tool implementing fuzzy decision trees. It is based on an optimized fuzzy ID3 (FID3) algorithm. FDT 2.0 improves upon FDT 1.0 by bridging the gap between data science and data engineering: it combines a robust decisioning tool with data retention for future decisions, so that the tool does not need to be recalibrated from scratch every time a new decision is required. In this paper we briefly review the analytical capabilities of the freeware FDT tool and its major features and functionalities; examples of large biological datasets from HIV, microRNAs and sRNAs are included. This work shows how to integrate fuzzy decision algorithms with modern database technology. In addition, we show that integrating the fuzzy decision tree induction tool with database storage allows for optimal user satisfaction in today's Data Analytics world.
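
    The data-retention pattern described above (train once, persist the fitted model, and reload it for later decisions instead of recalibrating from scratch) can be sketched generically. The snippet below is not the FDT codebase; it uses a plain scikit-learn decision tree and SQLite purely as stand-ins for the idea.

      # Hedged sketch: persist a trained decision model in a database so later
      # sessions can reuse it without retraining.
      import sqlite3
      import pickle
      from sklearn.tree import DecisionTreeClassifier

      X = [[0.1, 0.9], [0.8, 0.2], [0.2, 0.7], [0.9, 0.1]]   # toy fuzzy membership values
      y = [0, 1, 0, 1]
      clf = DecisionTreeClassifier().fit(X, y)

      con = sqlite3.connect("fdt_store.db")                  # hypothetical store
      con.execute("CREATE TABLE IF NOT EXISTS models (name TEXT PRIMARY KEY, blob BLOB)")
      con.execute("INSERT OR REPLACE INTO models VALUES (?, ?)", ("fdt_v1", pickle.dumps(clf)))
      con.commit()

      # A later session reloads the stored model instead of recalibrating
      blob = con.execute("SELECT blob FROM models WHERE name = ?", ("fdt_v1",)).fetchone()[0]
      restored = pickle.loads(blob)
      print(restored.predict([[0.15, 0.85]]))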

  12. Modeling Biodegradation and Reactive Transport: Analytical and Numerical Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Y; Glascoe, L

    The computational modeling of the biodegradation of contaminated groundwater systems, accounting for biochemical reactions coupled to contaminant transport, is a valuable tool for both the field engineer/planner with limited computational resources and the expert computational researcher less constrained by time and computer power. There exist several analytical and numerical computer models that have been and are being developed to cover the practical needs put forth by users and to fulfill this spectrum of computational demands. Generally, analytical models provide rapid and convenient screening tools running on very limited computational power, while numerical models can provide more detailed information with consequent requirements of greater computational time and effort. While these analytical and numerical computer models can provide accurate and adequate information to produce defensible remediation strategies, decisions based on inadequate modeling output or on over-analysis can have costly and risky consequences. In this chapter we consider both analytical and numerical modeling approaches to biodegradation and reactive transport. Both approaches are discussed and analyzed in terms of achieving bioremediation goals, recognizing that there is always a tradeoff between computational cost and the resolution of simulated systems.

  13. Coping with Volume and Variety in Temporal Event Sequences: Strategies for Sharpening Analytic Focus.

    PubMed

    Fan Du; Shneiderman, Ben; Plaisant, Catherine; Malik, Sana; Perer, Adam

    2017-06-01

    The growing volume and variety of data presents both opportunities and challenges for visual analytics. Addressing these challenges is needed for big data to provide valuable insights and novel solutions for business, security, social media, and healthcare. In the case of temporal event sequence analytics it is the number of events in the data and variety of temporal sequence patterns that challenges users of visual analytic tools. This paper describes 15 strategies for sharpening analytic focus that analysts can use to reduce the data volume and pattern variety. Four groups of strategies are proposed: (1) extraction strategies, (2) temporal folding, (3) pattern simplification strategies, and (4) iterative strategies. For each strategy, we provide examples of the use and impact of this strategy on volume and/or variety. Examples are selected from 20 case studies gathered from either our own work, the literature, or based on email interviews with individuals who conducted the analyses and developers who observed analysts using the tools. Finally, we discuss how these strategies might be combined and report on the feedback from 10 senior event sequence analysts.
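
    One extraction strategy of the kind catalogued above can be sketched in a few lines; the record/event/timestamp columns and the "first occurrence only" rule are a hypothetical example, not data or a strategy taken from the case studies.

      # Hedged sketch: reduce event-sequence volume by keeping only the first
      # occurrence of each event type within each record.
      import pandas as pd

      events = pd.DataFrame({
          "record_id": [1, 1, 1, 2, 2, 2],
          "event":     ["admit", "lab", "lab", "admit", "med", "lab"],
          "timestamp": pd.to_datetime(["2020-01-01", "2020-01-02", "2020-01-03",
                                       "2020-02-01", "2020-02-01", "2020-02-05"]),
      })

      first_only = (events.sort_values("timestamp")
                          .drop_duplicates(subset=["record_id", "event"], keep="first"))
      print(first_only)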

  14. Diving deeper into Zebrafish development of social behavior: analyzing high resolution data.

    PubMed

    Buske, Christine; Gerlai, Robert

    2014-08-30

    Vertebrate model organisms have been utilized in high throughput screening but only with substantial cost and human capital investment. The zebrafish is a vertebrate model species that is a promising and cost effective candidate for efficient high throughput screening. Larval zebrafish have already been successfully employed in this regard (Lessman, 2011), but adult zebrafish also show great promise. High throughput screening requires the use of a large number of subjects and collection of substantial amount of data. Collection of data is only one of the demanding aspects of screening. However, in most screening approaches that involve behavioral data the main bottleneck that slows throughput is the time consuming aspect of analysis of the collected data. Some automated analytical tools do exist, but often they only work for one subject at a time, eliminating the possibility of fully utilizing zebrafish as a screening tool. This is a particularly important limitation for such complex phenotypes as social behavior. Testing multiple fish at a time can reveal complex social interactions but it may also allow the identification of outliers from a group of mutagenized or pharmacologically treated fish. Here, we describe a novel method using a custom software tool developed within our laboratory, which enables tracking multiple fish, in combination with a sophisticated analytical approach for summarizing and analyzing high resolution behavioral data. This paper focuses on the latter, the analytic tool, which we have developed using the R programming language and environment for statistical computing. We argue that combining sophisticated data collection methods with appropriate analytical tools will propel zebrafish into the future of neurobehavioral genetic research. Copyright © 2014. Published by Elsevier B.V.

  15. ANALYTICAL TOOLS FOR GROUNDWATER POLLUTION ASSESSMENT

    EPA Science Inventory

    This paper deals with the development of analytical screening-exposure models (indices) and their potential application to regulate the use of hazardous chemicals and the design of groundwater buffer strips. The indices describe the leaching of solutes below the root zone (mass f...

  16. Structuring modeling and simulation analysis for evacuation planning and operations.

    DOT National Transportation Integrated Search

    2009-06-01

    This document is intended to provide guidance to decision-makers at agencies and jurisdictions considering the role of analytical tools in evacuation planning and operations. It is often unclear what kind of analytical approach may be of most value, ...

  17. Chemometrics in analytical chemistry-part I: history, experimental design and data analysis tools.

    PubMed

    Brereton, Richard G; Jansen, Jeroen; Lopes, João; Marini, Federico; Pomerantsev, Alexey; Rodionova, Oxana; Roger, Jean Michel; Walczak, Beata; Tauler, Romà

    2017-10-01

    Chemometrics has achieved major recognition and progress in the analytical chemistry field. In the first part of this tutorial, major achievements and contributions of chemometrics to some of the more important stages of the analytical process, like experimental design, sampling, and data analysis (including data pretreatment and fusion), are summarised. The tutorial is intended to give a general updated overview of the chemometrics field to further contribute to its dissemination and promotion in analytical chemistry.

  18. Analyticity in Time and Smoothing Effect of Solutions to Nonlinear Schrödinger Equations

    NASA Astrophysics Data System (ADS)

    Hayashi, Nakao; Kato, Keiichi

    In this paper we consider analyticity in time and smoothing effect of solutions to nonlinear Schrödinger equations where . We prove that if φ satisfies then there exists a unique solution of (1) and positive constants T, C0, C1 such that is analytic in time and space variables for and and has an analytic continuation on and In the case the condition (2) can be relaxed as follows: where m= 0 if n= 1, p= 1, m= 1 if n= 2, and m= 1 if n= 3, p= 1.

  19. The link between employee attitudes and employee effectiveness: Data matrix of meta-analytic estimates based on 1161 unique correlations.

    PubMed

    Mackay, Michael M

    2016-09-01

    This article offers a correlation matrix of meta-analytic estimates between various employee job attitudes (i.e., Employee engagement, job satisfaction, job involvement, and organizational commitment) and indicators of employee effectiveness (i.e., Focal performance, contextual performance, turnover intention, and absenteeism). The meta-analytic correlations in the matrix are based on over 1100 individual studies representing over 340,000 employees. Data was collected worldwide via employee self-report surveys. Structural path analyses based on the matrix, and the interpretation of the data, can be found in "Investigating the incremental validity of employee engagement in the prediction of employee effectiveness: a meta-analytic path analysis" (Mackay et al., 2016) [1].

  20. Analytical Tools Interface for Landscape Assessments

    EPA Science Inventory

    Environmental management practices are trending away from simple, local-scale assessments toward complex, multiple-stressor regional assessments. Landscape ecology provides the theory behind these assessments while geographic information systems (GIS) supply the tools to implemen...

  1. SolarPILOT | Concentrating Solar Power | NREL

    Science.gov Websites

    Unlike exclusively ray-tracing tools, SolarPILOT runs an analytical simulation engine alongside a ray-tracing core (the SolTrace simulation engine) for more detailed simulations.

  2. Brain Connectivity as a DNA Sequencing Problem

    NASA Astrophysics Data System (ADS)

    Zador, Anthony

    The mammalian cortex consists of millions or billions of neurons, each connected to thousands of other neurons. Traditional methods for determining brain connectivity rely on microscopy to visualize neuronal connections, but such methods are slow, labor-intensive, and often lack single-neuron resolution. We have recently developed a new method, MAPseq, to recast the determination of brain wiring into a form that can exploit the tremendous recent advances in high-throughput DNA sequencing. DNA sequencing technology has outpaced even Moore's law, so that the cost of sequencing the human genome has dropped from a billion dollars in 2001 to below a thousand dollars today. MAPseq works by introducing random sequences of DNA ("barcodes") to tag neurons uniquely. With MAPseq, we can determine the connectivity of over 50K single neurons in a single mouse cortex in about a week, an unprecedented throughput, ushering in the era of "big data" for brain wiring. We are now developing analytical tools and algorithms to make sense of these novel data sets.

  3. Ion Mobility Spectrometry-Mass Spectrometry Coupled with Gas-Phase Hydrogen/Deuterium Exchange for Metabolomics Analyses

    NASA Astrophysics Data System (ADS)

    Maleki, Hossein; Karanji, Ahmad K.; Majuta, Sandra; Maurer, Megan M.; Valentine, Stephen J.

    2018-02-01

    Ion mobility spectrometry-mass spectrometry (IMS-MS) in combination with gas-phase hydrogen/deuterium exchange (HDX) and collision-induced dissociation (CID) is evaluated as an analytical method for small-molecule standard and mixture characterization. Experiments show that compound ions exhibit unique HDX reactivities that can be used to distinguish different species. Additionally, it is shown that gas-phase HDX kinetics can be exploited to provide even further distinguishing capabilities by using different partial pressures of reagent gas. The relative HDX reactivity of a wide variety of molecules is discussed in light of the various molecular structures. Additionally, hydrogen accessibility scoring (HAS) and HDX kinetics modeling of candidate (in silico) ion structures are utilized to estimate the relative ion conformer populations giving rise to specific HDX behavior. These data interpretation methods are discussed with a focus on developing predictive tools for HDX behavior. Finally, an example is provided in which ion mobility information is supplemented with HDX reactivity data to aid identification efforts of compounds in a metabolite extract.
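
    The kinetics-modeling idea can be illustrated with a simple single-exponential uptake fit; the model form, synthetic data, and parameter names below are assumptions for illustration and are not taken from the study.

      # Hedged sketch: fit a toy gas-phase HDX uptake curve to estimate the number
      # of exchangeable hydrogens and an apparent rate constant.
      import numpy as np
      from scipy.optimize import curve_fit

      def hdx_uptake(t, n_exchangeable, k):
          """Average number of deuteriums incorporated after exposure time t."""
          return n_exchangeable * (1.0 - np.exp(-k * t))

      t = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])       # exposure time (arbitrary units)
      d_uptake = np.array([0.4, 1.1, 2.6, 4.1, 4.8, 5.0])  # measured mass shift (Da)

      (n_fit, k_fit), _ = curve_fit(hdx_uptake, t, d_uptake, p0=[5.0, 1.0])
      print(f"exchangeable hydrogens ~ {n_fit:.1f}, apparent rate constant ~ {k_fit:.2f}")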

  4. Dynamic Response of a Planetary Gear System Using a Finite Element/Contact Mechanics Model

    NASA Technical Reports Server (NTRS)

    Parker, Robert G.; Agashe, Vinayak; Vijayakar, Sandeep M.

    2000-01-01

    The dynamic response of a helicopter planetary gear system is examined over a wide range of operating speeds and torques. The analysis tool is a unique, semianalytical finite element formulation that admits precise representation of the tooth geometry and contact forces that are crucial in gear dynamics. Importantly, no a priori specification of static transmission error excitation or mesh frequency variation is required; the dynamic contact forces are evaluated internally at each time step. The calculated response shows classical resonances when a harmonic of mesh frequency coincides with a natural frequency. However, peculiar behavior occurs where resonances expected to be excited at a given speed are absent. This absence of particular modes is explained by analytical relationships that depend on the planetary configuration and mesh frequency harmonic. The torque sensitivity of the dynamic response is examined and compared to static analyses. Rotation mode response is shown to be more sensitive to input torque than translational mode response.

  5. Fluorescent Sensors Based on Aggregation-Induced Emission: Recent Advances and Perspectives.

    PubMed

    Gao, Meng; Tang, Ben Zhong

    2017-10-27

    Fluorescent sensors, with the advantages of excellent sensitivity, rapid response, and easy operation, are emerging as powerful tools in environmental monitoring, biological research, and disease diagnosis. However, conventional fluorophores with π-planar structures usually suffer from serious self-quenching in the aggregated state, poor photostability, and small Stokes shifts. In contrast to conventional aggregation-caused quenching (ACQ) fluorophores, the newly emerged aggregation-induced emission fluorogens (AIEgens) feature high emission efficiency in the aggregated state, which provides unique opportunities for various sensing applications with the advantages of high signal-to-noise ratio, strong photostability, and large Stokes shift. In this review, we first give a brief introduction to the AIE concept and the turn-on sensing principles. Then, we discuss recent examples of AIE sensors according to the types of analytes. Finally, we give a perspective on the future development of AIE sensors. We hope this review will inspire further efforts devoted to this emerging field.

  6. Merging spatially variant physical process models under an optimized systems dynamics framework.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cain, William O.; Lowry, Thomas Stephen; Pierce, Suzanne A.

    The complexity of water resource issues, their interconnectedness with other systems, and the involvement of competing stakeholders often overwhelm decision-makers and inhibit the creation of clear management strategies. While a range of modeling tools and procedures exist to address these problems, they tend to be case specific and generally emphasize either a quantitative and overly analytic approach or a qualitative, dialogue-based approach lacking the ability to fully explore the consequences of different policy decisions. The integration of these two approaches is needed to drive toward final decisions and engender effective outcomes. Given these limitations, the Computer Assisted Dispute Resolution system (CADRe) was developed to aid in stakeholder-inclusive resource planning. This modeling and negotiation system uniquely addresses resource concerns by developing a spatially varying system dynamics model as well as innovative global optimization search techniques to maximize outcomes from participatory dialogues. Ultimately, the core system architecture of CADRe also serves as the cornerstone upon which key scientific innovations and challenges can be addressed.

  7. Preparation and characterization of solid lipid nanoparticles-a review.

    PubMed

    Parhi, Rabinarayan; Suresh, Padilama

    2012-03-01

    In the present scenario, most newly developed and newly discovered drugs pose a real challenge to formulation scientists due to their poor aqueous solubility, which in turn is responsible for poor bioavailability. One approach to overcome this problem is packaging of the drug into a particulate carrier system. Among various carriers, lipids have emerged as very attractive candidates because of their unique property of enhancing the bioavailability of poorly water-soluble drugs. Solid lipid, one of the physical forms of lipid, is used to formulate nanoparticles, popularly known as solid lipid nanoparticles (SLNs), as an alternative carrier system to emulsions, liposomes and polymeric micro- and nano-particles. SLNs combine advantages of the traditional systems but avoid some of their major disadvantages. This paper reviews numerous production techniques for SLNs along with their advantages and disadvantages. Special attention is paid to the characterization of SLNs using various analytical tools. It also emphasizes the physical state of the lipid (supercooled melts, different lipid modifications).

  8. The development and testing of the Lens Antenna Deployment Demonstration (LADD) test article

    NASA Technical Reports Server (NTRS)

    Pugh, Mark L.; Denton, Robert J., Jr.; Strange, Timothy J.

    1993-01-01

    The USAF Rome Laboratory and NASA Marshall Space Flight Center, through contract to Grumman Corporation, have developed a space-qualifiable test article for the Strategic Defense Initiative Organization to demonstrate the critical structural and mechanical elements of single-axis roll-out membrane deployment for Space Based Radar (SBR) applications. The Lens Antenna Deployment Demonstration (LADD) test article, originally designed as a shuttle-attached flight experiment, is a large precision space structure which is representative of operational designs for space-fed lens antennas. Although the flight experiment was cancelled due to funding constraints and major revisions in the Strategic Defense System (SDS) architecture, development of this test article was completed in June 1989. To take full advantage of the existence of this unique structure, a series of ground tests are proposed which include static, dynamic, and thermal measurements in a simulated space environment. An equally important objective of these tests is the verification of the analytical tools used to design and develop large precision space structures.

  9. 2D FT-ICR MS of Calmodulin: A Top-Down and Bottom-Up Approach.

    PubMed

    Floris, Federico; van Agthoven, Maria; Chiron, Lionel; Soulby, Andrew J; Wootton, Christopher A; Lam, Yuko P Y; Barrow, Mark P; Delsuc, Marc-André; O'Connor, Peter B

    2016-09-01

    Two-dimensional Fourier transform ion cyclotron resonance mass spectrometry (2D FT-ICR MS) allows data-independent fragmentation of all ions in a sample and correlation of fragment ions to their precursors through the modulation of precursor ion cyclotron radii prior to fragmentation. Previous results show that implementation of 2D FT-ICR MS with infrared multi-photon dissociation (IRMPD) and electron capture dissociation (ECD) has turned this method into a useful analytical tool. In this work, IRMPD tandem mass spectrometry of calmodulin (CaM) has been performed both in one-dimensional and two-dimensional FT-ICR MS using top-down and bottom-up approaches. 2D IRMPD FT-ICR MS is used to achieve extensive inter-residue bond cleavage and assignment for CaM, using its unique features for fragment identification in an experiment that consumes less time and sample than the equivalent sequential MS/MS experiments.

  10. The response of vegetation to geochemical conditions

    NASA Technical Reports Server (NTRS)

    Mouat, D. A.

    1983-01-01

    An understanding of the factors of vegetation response to changes in the geochemistry of the environment may give exploration geologists and other researchers an additional and effective tool for rock type discrimination. The factors of vegetation response can be grouped into three principal categories: structural or morphological factors, taxonomic factors which include indicator flora as well as vegetation assemblages, and spectral factors which represent the manner in which the vegetation interacts with electromagnetic radiation. The response of these factors over areas of anomalous mineralization is often unique and may be due to nutrient deficiencies and/or imbalances, toxicity and stress caused by anomalous mineral concentrations in the soil, low water retention, and plant competition. The successful use of geobotanical techniques results from the integration of the geobotanical observations with other techniques. The use of remote sensing in such a program must be predicated on those factors which can be discriminated within the constraints of the spatial, spectral, radiometric, and temporal resolutions of the sensing system and with appropriate analytical techniques.

  11. Assessment of transcutaneous vaccine delivery by optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Kamali, T.; Doronin, A.; Rattanapak, T.; Hook, S.; Meglinski, I.

    2012-08-01

    Immunization is one of the most efficient and cost-effective means for the prevention of diseases. The latest trend for inducing protective immunity is topical application of vaccines to intact skin rather than invasive administration via injection. Apart from being a non-invasive route of drug delivery, skin itself also offers advantages through the presence of cells of the immune system in both the dermis and epidermis. However, vaccine penetration through the outermost layers of skin is limited by the barrier provided by the stratum corneum. In the current study, utilizing conventional optical coherence tomography (OCT), we investigate the transcutaneous delivery of a nano-particulate peptide vaccine into mouse skin in vivo. We demonstrate that a front of molecular diffusion within the skin can be clearly observed by using cross-correlations of successive 2D OCT images. Thus, OCT provides a unique tool for quantitative assessment of the dynamics of diffusion of drugs, target compounds, analytes, cosmetics and various chemical agents in biological tissues in vivo.

  12. Hydrodynamics on Supercomputers: Interacting Binary Stars

    NASA Astrophysics Data System (ADS)

    Blondin, J. M.

    1997-05-01

    The interaction of close binary stars accounts for a wide variety of peculiar objects scattered throughout our Galaxy. The unique features of Algols, Symbiotics, X-ray binaries, cataclysmic variables and many others are linked to the dynamics of the circumstellar gas which can take forms from tidal streams and accretion disks to colliding stellar winds. As in many other areas of astrophysics, large scale computing has provided a powerful new tool in the study of interacting binaries. In the research to be described, hydrodynamic simulations are used to create a "laboratory", within which one can "experiment": change the system and observe (and predict) the effects of those changes. This type of numerical experimentation, when buttressed by analytic studies, provides a means of interpreting observations, identifying and understanding the relevant physics, and visualizing the physical system. The results of such experiments will be shown, including the structure of tidal streams in Roche lobe overflow systems, mass accretion in X-ray binaries, and the formation of accretion disks.

  13. ExoMars Raman laser spectrometer for Exomars

    NASA Astrophysics Data System (ADS)

    Rull, F.; Sansano, A.; Díaz, E.; Canora, C. P.; Moral, A. G.; Tato, C.; Colombo, M.; Belenguer, T.; Fernández, M.; Manfredi, J. A. R.; Canchal, R.; Dávila, B.; Jiménez, A.; Gallego, P.; Ibarmia, S.; Prieto, J. A. R.; Santiago, A.; Pla, J.; Ramos, G.; Díaz, C.; González, C.

    2011-10-01

    The Raman Laser Spectrometer (RLS) is one of the Pasteur payload instruments of the ExoMars mission, within ESA's Aurora Exploration Programme. The main scientific objective of ExoMars 2018 is "searching for evidence of past and present life on Mars". Raman spectroscopy is used to analyze the vibrational modes of a substance in the solid, liquid or gas state. It relies on the inelastic scattering (Raman scattering) of monochromatic light by atoms and molecules. The radiation-matter interaction results in the energy of the exciting photons being shifted up or down. The shift in energy appears as a spectral distribution and therefore provides a unique fingerprint by which substances can be identified and structurally analyzed. The RLS is being developed by a European consortium composed of Spanish, French, German and UK partners. It will perform Raman spectroscopy on crushed powdered samples inside the Rover's Analytical Laboratory Drawer. Instrument performance is being evaluated by means of simulation tools and the development of an instrument prototype.

  14. Impact of non-ideal analyte behavior on the separation of protein aggregates by asymmetric flow field-flow fractionation.

    PubMed

    Boll, Björn; Josse, Lena; Heubach, Anja; Hochenauer, Sophie; Finkler, Christof; Huwyler, Jörg; Koulov, Atanas V

    2018-04-25

    Asymmetric flow field-flow fractionation is a valuable tool for the characterization of protein aggregates in biotechnology owing to its broad size range and unique separation principle. However, in practice asymmetric flow field-flow fractionation is non-trivial to use due to major deviations from theory and the influence on separation of various factors that are not fully understood. Here we report methods to assess the non-ideal effects that influence asymmetric flow field-flow fractionation separation and, for the first time, identify experimentally the main factors that impact it. Furthermore, we propose new approaches to minimize such non-ideal behavior, showing that by adjusting the mobile phase composition (pH and ionic strength) the resolution of asymmetric flow field-flow fractionation separation can be drastically improved. Additionally, we propose a best-practice method for new proteins. This article is protected by copyright. All rights reserved.

  15. Clonality of bacterial consortia in root canals and subjacent gingival crevices.

    PubMed

    Parahitiyawa, Nipuna B; Chu, Frederick C S; Leung, Wai K; Yam, Wing C; Jin, Li Jian; Samaranayake, Lakshman P

    2015-02-01

    No oral niche can be considered to be segregated from the subjacent milieu because of the complex community behavior and nature of the oral biofilms. The aim of this study was to address the paucity of information on how these species are clonally related to the subjacent gingival crevice bacteria. We utilized a metagenomic approach of amplifying 16S rDNA from genomic DNA, cloning, sequencing and analysis using LIBSHUFF software to assess the genetic homogeneity of the bacterial species from two infected root canals and subjacent gingival crevices. The four niches studied yielded 186 clones representing 54 phylotypes. Clone library comparisons using LIBSHUFF software indicated that each niche was inhabited by a unique flora. Further, 42% of the clones were of hitherto unknown phylotypes indicating the extent of bacterial diversity, especially in infected root canals and subjacent gingival crevices. We believe data generated through this novel analytical tool shed new light on understanding oral microbial ecosystems. © 2014 Wiley Publishing Asia Pty Ltd.

  16. Comparison of methods for measurement of organic compounds at ultra-trace level: analytical criteria and application to analysis of amino acids in extraterrestrial samples.

    PubMed

    Vandenabeele-Trambouze, O; Claeys-Bruno, M; Dobrijevic, M; Rodier, C; Borruat, G; Commeyras, A; Garrelly, L

    2005-02-01

    The need for criteria to compare different analytical methods for measuring extraterrestrial organic matter at ultra-trace levels in relatively small and unique samples (e.g., fragments of meteorites, micrometeorites, planetary samples) is discussed. We emphasize the need to standardize the description of future analyses, and take the first step toward a proposed international laboratory network for performance testing.

  17. Review: To be or not to be an identifiable model. Is this a relevant question in animal science modelling?

    PubMed

    Muñoz-Tamayo, R; Puillet, L; Daniel, J B; Sauvant, D; Martin, O; Taghipoor, M; Blavy, P

    2018-04-01

    What is a good (useful) mathematical model in animal science? For models constructed for prediction purposes, the question of model adequacy (usefulness) has been traditionally tackled by statistical analysis applied to observed experimental data relative to model-predicted variables. However, little attention has been paid to analytic tools that exploit the mathematical properties of the model equations. For example, in the context of model calibration, before attempting a numerical estimation of the model parameters, we might want to know if we have any chance of success in estimating a unique best value of the model parameters from available measurements. This question of uniqueness is referred to as structural identifiability; a mathematical property that is defined on the sole basis of the model structure within a hypothetical ideal experiment determined by a setting of model inputs (stimuli) and observable variables (measurements). Structural identifiability analysis applied to dynamic models described by ordinary differential equations (ODEs) is a common practice in control engineering and system identification. This analysis demands mathematical technicalities that are beyond the academic background of animal science, which might explain the lack of pervasiveness of identifiability analysis in animal science modelling. To fill this gap, in this paper we address the analysis of structural identifiability from a practitioner perspective by capitalizing on the use of dedicated software tools. Our objectives are (i) to provide a comprehensive explanation of the structural identifiability notion for the community of animal science modelling, (ii) to assess the relevance of identifiability analysis in animal science modelling and (iii) to motivate the community to use identifiability analysis in the modelling practice (when the identifiability question is relevant). We focus our study on ODE models. By using illustrative examples that include published mathematical models describing lactation in cattle, we show how structural identifiability analysis can contribute to advancing mathematical modelling in animal science towards the production of useful models and, moreover, highly informative experiments via optimal experiment design. Rather than attempting to impose a systematic identifiability analysis to the modelling community during model developments, we wish to open a window towards the discovery of a powerful tool for model construction and experiment design.
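
    The uniqueness question can be made concrete with a toy example: in the model dx/dt = -(k1 + k2)x with only x observed, the two rate constants enter the output solely through their sum, so neither is structurally identifiable on its own. The symbolic check below is a minimal illustration with sympy, not one of the dedicated software tools discussed in the paper.

      # Hedged sketch: structural non-identifiability in dx/dt = -(k1 + k2) * x.
      # The observed output depends on k1 and k2 only through their sum.
      import sympy as sp

      t, k1, k2, x0 = sp.symbols("t k1 k2 x0", positive=True)
      x = x0 * sp.exp(-(k1 + k2) * t)           # analytic solution of the ODE

      # Re-parameterize with s = k1 + k2: the output loses all dependence on k2,
      # so only the sum s is identifiable from measurements of x.
      s = sp.symbols("s", positive=True)
      print(sp.simplify(x.subs({k1: s - k2})))  # -> x0*exp(-s*t)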

  18. From Particles and Point Clouds to Voxel Models: High Resolution Modeling of Dynamic Landscapes in Open Source GIS

    NASA Astrophysics Data System (ADS)

    Mitasova, H.; Hardin, E. J.; Kratochvilova, A.; Landa, M.

    2012-12-01

    Multitemporal data acquired by modern mapping technologies provide unique insights into processes driving land surface dynamics. These high resolution data also offer an opportunity to improve the theoretical foundations and accuracy of process-based simulations of evolving landforms. We discuss the development of a new generation of visualization and analytics tools for GRASS GIS designed for 3D multitemporal data from repeated lidar surveys and from landscape process simulations. We focus on data and simulation methods that are based on point sampling of continuous fields and lead to representation of evolving surfaces as a series of raster map layers or voxel models. For multitemporal lidar data we present workflows that combine open source point cloud processing tools with GRASS GIS and custom python scripts to model and analyze dynamics of coastal topography (Figure 1), and we outline the development of a coastal analysis toolbox. The simulations focus on a particle sampling method for solving continuity equations and its application to geospatial modeling of landscape processes. In addition to water and sediment transport models, already implemented in GIS, the new capabilities under development combine OpenFOAM for wind shear stress simulation with a new module for aeolian sand transport and dune evolution simulations. Comparison of observed dynamics with the results of simulations is supported by a new, integrated 2D and 3D visualization interface that provides highly interactive and intuitive access to the redesigned and enhanced visualization tools. Several case studies will be used to illustrate the presented methods and tools, demonstrate the power of workflows built with FOSS, and highlight their interoperability.
    Figure 1. Isosurfaces representing the evolution of the shoreline and a z = 4.5 m contour between the years 1997-2011 at Cape Hatteras, NC, extracted from a voxel model derived from a series of lidar-based DEMs.
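
    A minimal numpy sketch of the point-sampling-to-raster step described above: scattered lidar points are binned into a mean-elevation grid, one grid per survey epoch. The function name, cell size, and binning scheme are illustrative stand-ins, not the actual GRASS GIS workflow modules.

```python
import numpy as np

def points_to_raster(x, y, z, cell=1.0):
    """Bin scattered lidar points into a mean-elevation raster (illustrative
    stand-in for the point-to-raster steps of the GRASS GIS workflow)."""
    cols = ((x - x.min()) / cell).astype(int)
    rows = ((y - y.min()) / cell).astype(int)
    shape = (rows.max() + 1, cols.max() + 1)
    sums = np.zeros(shape)
    counts = np.zeros(shape)
    np.add.at(sums, (rows, cols), z)
    np.add.at(counts, (rows, cols), 1)
    dem = np.full(shape, np.nan)
    hit = counts > 0
    dem[hit] = sums[hit] / counts[hit]
    return dem

# Repeating this for each survey epoch yields the series of raster layers from
# which per-cell change maps or voxel stacks can be assembled.
```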

  19. Social media in the emergency medicine residency curriculum: social media responses to the residents' perspective article.

    PubMed

    Hayes, Bryan D; Kobner, Scott; Trueger, N Seth; Yiu, Stella; Lin, Michelle

    2015-05-01

    In July to August 2014, Annals of Emergency Medicine continued a collaboration with an academic Web site, Academic Life in Emergency Medicine (ALiEM), to host an online discussion session featuring the 2014 Annals Residents' Perspective article "Integration of Social Media in Emergency Medicine Residency Curriculum" by Scott et al. The objective was to describe a 14-day worldwide clinician dialogue about evidence, opinions, and early relevant innovations revolving around the featured article and made possible by the immediacy of social media technologies. Six online facilitators hosted the multimodal discussion on the ALiEM Web site, Twitter, and YouTube, which featured 3 preselected questions. Engagement was tracked through various Web analytic tools, and themes were identified by content curation. The dialogue resulted in 1,222 unique page views from 325 cities in 32 countries on the ALiEM Web site, 569,403 Twitter impressions, and 120 views of the video interview with the authors. Five major themes we identified in the discussion included curriculum design, pedagogy, and learning theory; digital curation skills of the 21st-century emergency medicine practitioner; engagement challenges; proposed solutions; and best practice examples. The immediacy of social media technologies provides clinicians the unique opportunity to engage a worldwide audience within a relatively short time frame. Copyright © 2015 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.

  20. Next-generation technologies for spatial proteomics: Integrating ultra-high speed MALDI-TOF and high mass resolution MALDI FTICR imaging mass spectrometry for protein analysis.

    PubMed

    Spraggins, Jeffrey M; Rizzo, David G; Moore, Jessica L; Noto, Michael J; Skaar, Eric P; Caprioli, Richard M

    2016-06-01

    MALDI imaging mass spectrometry is a powerful analytical tool enabling the visualization of biomolecules in tissue. However, there are unique challenges associated with protein imaging experiments including the need for higher spatial resolution capabilities, improved image acquisition rates, and better molecular specificity. Here we demonstrate the capabilities of ultra-high speed MALDI-TOF and high mass resolution MALDI FTICR IMS platforms as they relate to these challenges. High spatial resolution MALDI-TOF protein images of rat brain tissue and cystic fibrosis lung tissue were acquired at image acquisition rates >25 pixels/s. Structures as small as 50 μm were spatially resolved and proteins associated with host immune response were observed in cystic fibrosis lung tissue. Ultra-high speed MALDI-TOF enables unique applications including megapixel molecular imaging as demonstrated for lipid analysis of cystic fibrosis lung tissue. Additionally, imaging experiments using MALDI FTICR IMS were shown to produce data with high mass accuracy (<5 ppm) and resolving power (∼75 000 at m/z 5000) for proteins up to ∼20 kDa. Analysis of clear cell renal cell carcinoma using MALDI FTICR IMS identified specific proteins localized to healthy tissue regions, within the tumor, and also in areas of increased vascularization around the tumor. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
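
    The mass accuracy and resolving power figures quoted above follow from two standard definitions; the short sketch below restates them with invented example numbers (not values taken from the paper).

```python
def ppm_error(m_measured, m_theoretical):
    """Mass accuracy in parts per million."""
    return (m_measured - m_theoretical) / m_theoretical * 1e6

def resolving_power(m, delta_m_fwhm):
    """Mass resolving power R = m / (peak width at half maximum)."""
    return m / delta_m_fwhm

# Illustrative numbers only:
print(ppm_error(5000.02, 5000.0))      # 4 ppm
print(resolving_power(5000.0, 0.067))  # ~75,000
```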

  1. A results-based process for evaluation of diverse visual analytics tools

    NASA Astrophysics Data System (ADS)

    Rubin, Gary; Berger, David H.

    2013-05-01

    With the pervasiveness of still and full-motion imagery in commercial and military applications, the need to ingest and analyze these media has grown rapidly in recent years. Additionally, video hosting and live camera websites provide a near real-time view of our changing world with unprecedented spatial coverage. To take advantage of these controlled and crowd-sourced opportunities, sophisticated visual analytics (VA) tools are required to accurately and efficiently convert raw imagery into usable information. Whether investing in VA products or evaluating algorithms for potential development, it is important for stakeholders to understand the capabilities and limitations of visual analytics tools. Visual analytics algorithms are being applied to problems related to Intelligence, Surveillance, and Reconnaissance (ISR), facility security, and public safety monitoring, to name a few. The diversity of requirements means that a one-size-fits-all approach to performance assessment will not work. We present a process for evaluating the efficacy of algorithms in real-world conditions, thereby allowing users and developers of video analytics software to understand software capabilities and identify potential shortcomings. The results-based approach described in this paper uses an analysis of end-user requirements and Concept of Operations (CONOPS) to define Measures of Effectiveness (MOEs), test data requirements, and evaluation strategies. We define metrics that individually do not fully characterize a system, but when used together, are a powerful way to reveal both strengths and weaknesses. We provide examples of data products, such as heatmaps, performance maps, detection timelines, and rank-based probability-of-detection curves.

  2. Assessment of Multiple Solvents for Extraction and Direct GC-MS Determination of the Phytochemical Inventory of Sansevieria Extrafoliar Nectar Droplets.

    PubMed

    Gaylor, Michael O; Juntunen, Hope L; Hazelwood, Donna; Videau, Patrick

    2018-04-01

    Considerable effort has been devoted to analytical determinations of sugar and amino acid constituents of plant nectars, with the primary aim of understanding their ecological roles, yet few studies have reported more exhaustive organic compound inventories of plant nectars or extrafoliar nectars. This work evaluated the efficacy of four solvents (ethyl acetate, dichloromethane, toluene and hexane) to extract the greatest number of organic compound classes and unique compounds from extrafoliar nectar drops produced by Sansevieria spp. Aggregation of the results from each solvent revealed that 240 unique compounds were extracted in total, with 42.5% of those detected in multiple extracts. Aliphatic hydrocarbons dominated in all but the ethyl acetate extracts, with 44 unique aliphatic hydrocarbons detected in dichloromethane (DCM) extracts, followed by 41, 19 and 8 in hexane, toluene and ethyl acetate extracts, respectively. Hexane extracted the most unique compounds (79), followed by DCM (73), ethyl acetate (56) and toluene (32). Integrated total ion chromatographic peak areas of extracted compound classes were positively correlated with numbers of unique compounds detected within those classes. In addition to demonstrating that multi-solvent extraction with direct GC-MS detection is a suitable analytical approach for determining secondary nectar constituents, to the best of our knowledge, this study also represents: (i) the first attempt to inventory the secondary phytochemical constituents of Sansevieria spp. extrafoliar nectar secretions and (ii) the largest organic solvent extractable compound inventory reported for any plant matrix to date.

  3. An Analytic Tool to Investigate the Effect of Binder on the Sensitivity of HMX-Based Plastic Bonded Explosives in the Skid Test

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hayden, D. W.

    This project will develop an analytical tool to calculate performance of HMX-based PBXs in the skid test. The skid-test is used as a means to measure sensitivity for large charges in handling situations. Each series of skid tests requires dozens of drops of large billets. It is proposed that the reaction (or lack of one) of PBXs in the skid test is governed by the mechanical properties of the binder. If true, one might be able to develop an analytical tool to estimate skid test behavior for new PBX formulations. Others over the past 50 years have tried to develop similar models. This project will research and summarize the works of others and couple the work of three of them into an analytical tool that can be run on a PC to calculate the drop height of HMX-based PBXs. Detonation due to dropping a billet is argued to be a dynamic thermal event. To avoid detonation, the heat created due to friction at impact must be conducted into the charge or the target faster than the chemical kinetics can create additional energy. The methodology will involve numerically solving the Frank-Kamenetskii equation in one dimension. The analytical problem needs to be bounded in terms of how much heat is introduced to the billet and for how long. Assuming an inelastic collision with no rebound, the billet will be in contact with the target for a short duration determined by the equations of motion. For the purposes of the calculations, it will be assumed that if a detonation is to occur, it will transpire within that time. The surface temperature will be raised according to the friction created using the equations of motion of dropping the billet on a rigid surface. The study will connect the works of Charles Anderson, Alan Randolph, Larry Hatler, Alfonse Popolato, and Charles Mader into a single PC-based analytic tool. Anderson's equations of motion will be used to calculate the temperature rise upon impact, the time this temperature is maintained (contact time) will be obtained from the work of Hatler et al., and the reactive temperature rise will be obtained from Mader's work. Finally, the assessment of when a detonation occurs will be derived from Bowden and Yoffe's thermal explosion theory (hot spot).
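
    As a rough sketch of the kind of calculation described (and not the actual tool or HMX data), the 1-D reactive heat-conduction problem of Frank-Kamenetskii type can be marched explicitly in time: conduction plus an Arrhenius source term, with a heated surface boundary standing in for frictional heating during contact. All material constants below are placeholders.

```python
import numpy as np

# 1-D heat conduction with an Arrhenius heat source:
#   dT/dt = alpha * d2T/dx2 + (Q*A/(rho*c)) * exp(-E/(R*T))
# Placeholder parameters -- NOT real HMX/PBX property data.
alpha, QA_rc, E, R = 1e-7, 1e12, 1.5e5, 8.314
L, nx = 1e-3, 101                  # 1 mm slab, 101 nodes
dx = L / (nx - 1)
dt = 0.4 * dx**2 / alpha           # explicit stability limit
T = np.full(nx, 300.0)             # initial temperature (K)
T[0] = 700.0                       # frictionally heated surface during contact

for step in range(20000):
    lap = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
    source = QA_rc * np.exp(-E / (R * T[1:-1]))
    T[1:-1] += dt * (alpha * lap + source)
    T[-1] = T[-2]                  # insulated far boundary
    if T.max() > 1500.0:           # crude "thermal runaway" criterion
        print(f"runaway after {step * dt:.3e} s")
        break
```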

  4. Bio-TDS: bioscience query tool discovery system.

    PubMed

    Gnimpieba, Etienne Z; VanDiermen, Menno S; Gustafson, Shayla M; Conn, Bill; Lushbough, Carol M

    2017-01-04

    Bioinformatics and computational biology play a critical role in bioscience and biomedical research. As researchers design their experimental projects, one major challenge is to find the most relevant bioinformatics toolkits that will lead to new knowledge discovery from their data. The Bio-TDS (Bioscience Query Tool Discovery Systems, http://biotds.org/) has been developed to assist researchers in retrieving the most applicable analytic tools by allowing them to formulate their questions as free text. The Bio-TDS is a flexible retrieval system that affords users from multiple bioscience domains (e.g. genomic, proteomic, bio-imaging) the ability to query over 12 000 analytic tool descriptions integrated from well-established, community repositories. One of the primary components of the Bio-TDS is the ontology and natural language processing workflow for annotation, curation, query processing, and evaluation. The Bio-TDS's scientific impact was evaluated using sample questions posed by researchers retrieved from Biostars, a site focusing on biological data analysis. The Bio-TDS was compared to five similar bioscience analytic tool retrieval systems with the Bio-TDS outperforming the others in terms of relevance and completeness. The Bio-TDS offers researchers the capacity to associate their bioscience question with the most relevant computational toolsets required for the data analysis in their knowledge discovery process. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
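
    For orientation only, a minimal free-text retrieval baseline over tool descriptions is sketched below with scikit-learn; this is not the Bio-TDS ontology/NLP pipeline, and the tool names and descriptions are invented.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy tool descriptions (placeholders, not real Bio-TDS records).
tools = {
    "ToolA": "align short sequencing reads to a reference genome",
    "ToolB": "call single nucleotide variants from aligned reads",
    "ToolC": "segment and count cells in fluorescence microscopy images",
}

vec = TfidfVectorizer(stop_words="english")
doc_matrix = vec.fit_transform(tools.values())

query = "which tool can I use to detect SNVs in my aligned reads?"
scores = cosine_similarity(vec.transform([query]), doc_matrix).ravel()
for name, score in sorted(zip(tools, scores), key=lambda p: -p[1]):
    print(f"{name}: {score:.2f}")   # ToolB should rank first for this query
```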

  5. Optimization of Turbine Engine Cycle Analysis with Analytic Derivatives

    NASA Technical Reports Server (NTRS)

    Hearn, Tristan; Hendricks, Eric; Chin, Jeffrey; Gray, Justin; Moore, Kenneth T.

    2016-01-01

    A new engine cycle analysis tool, called Pycycle, was recently built using the OpenMDAO framework. This tool uses equilibrium chemistry based thermodynamics, and provides analytic derivatives. This allows for stable and efficient use of gradient-based optimization and sensitivity analysis methods on engine cycle models, without requiring the use of finite difference derivative approximation methods. To demonstrate this, a gradient-based design optimization was performed on a multi-point turbofan engine model. Results demonstrate very favorable performance compared to an optimization of an identical model using finite-difference approximated derivatives.
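
    The benefit of analytic derivatives can be seen even without OpenMDAO or Pycycle; the hedged sketch below contrasts a gradient-based optimization supplied with an analytic gradient against the same optimization relying on internal finite differencing, on a toy objective that merely stands in for a cycle-analysis figure of merit.

```python
import numpy as np
from scipy.optimize import minimize

def f(x):
    """Toy smooth objective standing in for a cycle-analysis figure of merit."""
    return (x[0] - 1.0)**2 + 10.0 * (x[1] - x[0]**2)**2

def grad_f(x):
    """Analytic gradient -- no finite-difference step-size issues."""
    return np.array([
        2.0 * (x[0] - 1.0) - 40.0 * x[0] * (x[1] - x[0]**2),
        20.0 * (x[1] - x[0]**2),
    ])

x0 = np.array([-1.2, 1.0])
with_analytic = minimize(f, x0, jac=grad_f, method="BFGS")
with_fd = minimize(f, x0, method="BFGS")   # gradient approximated internally
print(with_analytic.nfev, with_fd.nfev)    # analytic case needs far fewer f-evals
```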

  6. Exploring the Potential of Predictive Analytics and Big Data in Emergency Care.

    PubMed

    Janke, Alexander T; Overbeek, Daniel L; Kocher, Keith E; Levy, Phillip D

    2016-02-01

    Clinical research often focuses on resource-intensive causal inference, whereas the potential of predictive analytics with constantly increasing big data sources remains largely unexplored. Basic prediction, divorced from causal inference, is much easier with big data. Emergency care may benefit from this simpler application of big data. Historically, predictive analytics have played an important role in emergency care as simple heuristics for risk stratification. These tools generally follow a standard approach: parsimonious criteria, easy computability, and independent validation with distinct populations. Simplicity in a prediction tool is valuable, but technological advances make it no longer a necessity. Emergency care could benefit from clinical predictions built using data science tools with abundant potential input variables available in electronic medical records. Patients' risks could be stratified more precisely with large pools of data and lower resource requirements for comparing each clinical encounter to those that came before it, benefiting clinical decisionmaking and health systems operations. The largest value of predictive analytics comes early in the clinical encounter, in which diagnostic and prognostic uncertainty are high and resource-committing decisions need to be made. We propose an agenda for widening the application of predictive analytics in emergency care. Throughout, we express cautious optimism because there are myriad challenges related to database infrastructure, practitioner uptake, and patient acceptance. The quality of routinely compiled clinical data will remain an important limitation. Complementing big data sources with prospective data may be necessary if predictive analytics are to achieve their full potential to improve care quality in the emergency department. Copyright © 2015 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.

  7. Data and Tools | Concentrating Solar Power | NREL

    Science.gov Websites

    Solar Power tower Integrated Layout and Optimization Tool (SolarPILOT(tm)): the SolarPILOT code combines the rapid layout and optimization capability of the analytical DELSOL3 program with the accuracy and …

  8. Training the next generation analyst using red cell analytics

    NASA Astrophysics Data System (ADS)

    Graham, Meghan N.; Graham, Jacob L.

    2016-05-01

    We have seen significant change in the study and practice of human reasoning in recent years from both a theoretical and methodological perspective. Ubiquitous communication coupled with advances in computing and a plethora of analytic support tools have created a push for instantaneous reporting and analysis. This notion is particularly prevalent in law enforcement, emergency services and the intelligence community (IC), where commanders (and their civilian leadership) expect not only a bird's-eye view of operations as they occur, but a play-by-play analysis of operational effectiveness. This paper explores the use of Red Cell Analytics (RCA) as pedagogy to train the next-gen analyst. A group of Penn State students in the College of Information Sciences and Technology at the University Park campus of The Pennsylvania State University have been practicing Red Team Analysis since 2008. RCA draws heavily from the military application of the same concept, except that student RCA problems are typically non-military in nature. RCA students utilize a suite of analytic tools and methods to explore and develop red-cell tactics, techniques and procedures (TTPs), and apply their tradecraft across a broad threat spectrum, from student-life issues to threats to national security. The strength of RCA is not always realized by the solution but by the exploration of the analytic pathway. This paper describes the concept and use of red cell analytics to teach and promote the use of structured analytic techniques, analytic writing and critical thinking in the area of security and risk and intelligence training.

  9. Mixed Initiative Visual Analytics Using Task-Driven Recommendations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, Kristin A.; Cramer, Nicholas O.; Israel, David

    2015-12-07

    Visual data analysis is composed of a collection of cognitive actions and tasks to decompose, internalize, and recombine data to produce knowledge and insight. Visual analytic tools provide interactive visual interfaces to data to support tasks involved in discovery and sensemaking, including forming hypotheses, asking questions, and evaluating and organizing evidence. Myriad analytic models can be incorporated into visual analytic systems, at the cost of increasing complexity in the analytic discourse between user and system. Techniques exist to increase the usability of interacting with such analytic models, such as inferring data models from user interactions to steer the underlying models of the system via semantic interaction, shielding users from having to do so explicitly. Such approaches are often also referred to as mixed-initiative systems. Researchers studying the sensemaking process have called for development of tools that facilitate analytic sensemaking through a combination of human and automated activities. However, design guidelines do not exist for mixed-initiative visual analytic systems to support iterative sensemaking. In this paper, we present a candidate set of design guidelines and introduce the Active Data Environment (ADE) prototype, a spatial workspace supporting the analytic process via task recommendations invoked by inferences on user interactions within the workspace. ADE recommends data and relationships based on a task model, enabling users to co-reason with the system about their data in a single, spatial workspace. This paper provides an illustrative use case, a technical description of ADE, and a discussion of the strengths and limitations of the approach.

  10. ATTIRE (analytical tools for thermal infrared engineering): A sensor simulation and modeling package

    NASA Astrophysics Data System (ADS)

    Jaggi, S.

    1993-02-01

    The Advanced Sensor Development Laboratory (ASDL) at the Stennis Space Center develops, maintains and calibrates remote sensing instruments for the National Aeronautics & Space Administration (NASA). To perform system design trade-offs and analysis, and to establish system parameters, ASDL has developed a software package for analytical simulation of sensor systems. This package, called 'Analytical Tools for Thermal InfraRed Engineering' (ATTIRE), simulates the various components of a sensor system. The software allows each subsystem of the sensor to be analyzed independently for its performance. These performance parameters are then integrated to obtain system-level information such as Signal-to-Noise Ratio (SNR), Noise Equivalent Radiance (NER), Noise Equivalent Temperature Difference (NETD) etc. This paper describes the uses of the package and the physics that were used to derive the performance parameters.

  11. Earth Science Data Analytics: Bridging Tools and Techniques with the Co-Analysis of Large, Heterogeneous Datasets

    NASA Technical Reports Server (NTRS)

    Kempler, Steve; Mathews, Tiffany

    2016-01-01

    The continuum of ever-evolving data management systems affords great opportunities to the enhancement of knowledge and facilitation of science research. To take advantage of these opportunities, it is essential to understand and develop methods that enable data relationships to be examined and the information to be manipulated. This presentation describes the efforts of the Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster to understand, define, and facilitate the implementation of ESDA to advance science research. As a result of the void of Earth science data analytics publication material, the cluster has defined ESDA along with 10 goals to set the framework for a common understanding of tools and techniques that are available and still needed to support ESDA.

  12. Advancements in nano-enabled therapeutics for neuroHIV management.

    PubMed

    Kaushik, Ajeet; Jayant, Rahul Dev; Nair, Madhavan

    This viewpoint is a global call to promote fundamental and applied research aiming toward designing smart nanocarriers of desired properties, novel noninvasive strategies to open the blood-brain barrier (BBB), delivery/release of single/multiple therapeutic agents across the BBB to eradicate neurohuman immunodeficiency virus (HIV), strategies for on-demand site-specific release of antiretroviral therapy, developing novel nanoformulations capable to recognize and eradicate latently infected HIV reservoirs, and developing novel smart analytical diagnostic tools to detect and monitor HIV infection. Thus, investigation of novel nanoformulations, methodologies for site-specific delivery/release, analytical methods, and diagnostic tools would be of high significance to eradicate and monitor neuroacquired immunodeficiency syndrome. Overall, these developments will certainly help to develop personalized nanomedicines to cure HIV and to develop smart HIV-monitoring analytical systems for disease management.

  13. Existence of a coupled system of fractional differential equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ibrahim, Rabha W.; Siri, Zailan

    2015-10-22

    We establish the existence and uniqueness of solutions for a fractional coupled system containing Schrödinger equations. Such a system appears in quantum mechanics. We confirm that the fractional system under consideration admits a global solution in appropriate functional spaces, and the solution is shown to be unique. The method is based on an analytic technique from fixed point theory. The fractional differential operator is taken in the sense of the Riemann-Liouville differential operator.
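
    For reference, the Riemann-Liouville fractional derivative referred to above has the standard definition (stated here from the general literature, not quoted from the paper):

```latex
% Riemann–Liouville fractional derivative of order \alpha, with n-1 < \alpha \le n:
D^{\alpha} f(t) \;=\; \frac{1}{\Gamma(n-\alpha)}\,\frac{d^{n}}{dt^{n}}
\int_{0}^{t} (t-s)^{\,n-\alpha-1} f(s)\, ds .
```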

  14. Analyte species and concentration identification using differentially functionalized microcantilever arrays and artificial neural networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Senesac, Larry R; Datskos, Panos G; Sepaniak, Michael J

    2006-01-01

    In the present work, we have performed analyte species and concentration identification using an array of ten differentially functionalized microcantilevers coupled with a back-propagation artificial neural network pattern recognition algorithm. The array consists of ten nanostructured silicon microcantilevers functionalized by polymeric and gas chromatography phases and macrocyclic receptors as spatially dense, differentially responding sensing layers for identification and quantitation of individual analyte(s) and their binary mixtures. The array response (i.e. cantilever bending) to analyte vapor was measured by an optical readout scheme and the responses were recorded for a selection of individual analytes as well as several binary mixtures. An artificial neural network (ANN) was designed and trained to recognize not only the individual analytes and binary mixtures, but also to determine the concentration of individual components in a mixture. To the best of our knowledge, ANNs have not been applied to microcantilever array responses previously to determine concentrations of individual analytes. The trained ANN correctly identified the eleven test analyte(s) as individual components, most with probabilities greater than 97%, whereas it did not misidentify an unknown (untrained) analyte. Demonstrated unique aspects of this work include an ability to measure binary mixtures and provide both qualitative (identification) and quantitative (concentration) information with array-ANN-based sensor methodologies.
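
    A minimal scikit-learn stand-in for the classification step described above: a small feed-forward network mapping 10-element cantilever response vectors to analyte labels. The data are synthetic and the network size is arbitrary; this is not the authors' trained model.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n_cantilevers, n_per_class = 10, 50
# Synthetic "response patterns" for three analytes (placeholders, not real data).
prototypes = rng.normal(size=(3, n_cantilevers))
X = np.vstack([p + 0.1 * rng.normal(size=(n_per_class, n_cantilevers))
               for p in prototypes])
y = np.repeat(["analyte_A", "analyte_B", "analyte_C"], n_per_class)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X, y)
print(clf.predict(prototypes + 0.1 * rng.normal(size=prototypes.shape)))
```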

  15. Removal of uranium from soil samples for ICP-OES analysis of RCRA metals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wero, M.; Lederer-Cano, A.; Billy, C.

    1995-12-01

    Soil samples containing high levels of uranium present unique analytical problems when analyzed for toxic metals (Ag, As, Ba, Cd, Cr, Cu, Ni, Pb, Se and Tl) because of the spectral interference of uranium in the ICP-OES emission spectrometer. Methods to remove uranium from the digestates of soil samples, known to be high in uranium, have been developed that reduce the initial uranium concentration (1-3%) to less than 500 ppm. UTEVA ion exchange columns, used as an ICP-OES analytical pre-treatment, reduces uranium to acceptable levels, permitting good analytical results of the RCRA metals by ICP-OES.

  16. PAVA: Physiological and Anatomical Visual Analytics for Mapping of Tissue-Specific Concentration and Time-Course Data

    EPA Science Inventory

    We describe the development and implementation of a Physiological and Anatomical Visual Analytics tool (PAVA), a web browser-based application, used to visualize experimental/simulated chemical time-course data (dosimetry), epidemiological data and Physiologically-Annotated Data ...

  17. Analytical Model-Based Design Optimization of a Transverse Flux Machine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hasan, Iftekhar; Husain, Tausif; Sozer, Yilmaz

    This paper proposes an analytical machine design tool using magnetic equivalent circuit (MEC)-based particle swarm optimization (PSO) for a double-sided, flux-concentrating transverse flux machine (TFM). The magnetic equivalent circuit method is applied to analytically establish the relationship between the design objective and the input variables of prospective TFM designs. This is computationally less intensive and more time efficient than finite element solvers. A PSO algorithm is then used to design a machine with the highest torque density within the specified power range along with some geometric design constraints. The stator pole length, magnet length, and rotor thickness are the variables that define the optimization search space. Finite element analysis (FEA) was carried out to verify the performance of the MEC-PSO optimized machine. The proposed analytical design tool helps save computation time by at least 50% when compared to commercial FEA-based optimization programs, with results found to be in agreement with less than 5% error.
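
    To make the PSO step concrete, here is a bare-bones particle swarm loop over three bounded design variables (stand-ins for stator pole length, magnet length, and rotor thickness) minimizing a toy objective; the MEC evaluation, bounds, and constants are all placeholders, not the paper's model.

```python
import numpy as np

def objective(v):
    """Toy stand-in for the MEC-evaluated objective (e.g., negative torque density)."""
    return np.sum((v - np.array([0.5, 0.3, 0.2]))**2)

rng = np.random.default_rng(1)
lo, hi = np.array([0.1, 0.1, 0.05]), np.array([1.0, 0.8, 0.5])   # design bounds
n_particles, n_iter, w, c1, c2 = 20, 100, 0.7, 1.5, 1.5

x = rng.uniform(lo, hi, size=(n_particles, 3))
v = np.zeros_like(x)
pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = np.clip(x + v, lo, hi)
    f = np.array([objective(p) for p in x])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = x[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()

print(gbest)   # converges near [0.5, 0.3, 0.2] for this toy objective
```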

  18. Reducing the barriers against analytical epidemiological studies in investigations of local foodborne disease outbreaks in Germany - a starter kit for local health authorities.

    PubMed

    Werber, D; Bernard, H

    2014-02-27

    Thousands of infectious food-borne disease outbreaks (FBDO) are reported annually to the European Food Safety Authority within the framework of the zoonoses Directive (2003/99/EC). Most recognised FBDO occur locally following point source exposure, but only few are investigated using analytical epidemiological studies. In Germany, and probably also in other countries of the European Union, this seems to be particularly true for those investigated by local health authorities. Analytical studies, usually cohort studies or case–control studies, are a powerful tool to identify suspect food vehicles. Therefore, from a public health and food safety perspective, their more frequent usage is highly desirable. We have developed a small toolbox consisting of a strategic concept and a simple software tool for data entry and analysis, with the objective to increase the use of analytical studies in the investigation of local point source FBDO in Germany.
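
    The core calculation such a starter kit automates is standard 2x2-table epidemiology: a risk ratio for a cohort design and an odds ratio (with a Wald confidence interval) for a case-control design. The sketch below uses invented counts purely for illustration.

```python
import math

# 2x2 table from a hypothetical point-source outbreak cohort (invented counts):
#                         ill   not ill
ill_exp, well_exp = 30, 20        # ate the suspect food
ill_unexp, well_unexp = 5, 45     # did not eat it

risk_exp = ill_exp / (ill_exp + well_exp)
risk_unexp = ill_unexp / (ill_unexp + well_unexp)
risk_ratio = risk_exp / risk_unexp

odds_ratio = (ill_exp * well_unexp) / (ill_unexp * well_exp)
se_log_or = math.sqrt(1/ill_exp + 1/well_exp + 1/ill_unexp + 1/well_unexp)
ci = (math.exp(math.log(odds_ratio) - 1.96 * se_log_or),
      math.exp(math.log(odds_ratio) + 1.96 * se_log_or))

print(f"RR = {risk_ratio:.1f}, OR = {odds_ratio:.1f}, 95% CI {ci[0]:.1f}-{ci[1]:.1f}")
```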

  19. Electrochemical Enzyme Biosensors Revisited: Old Solutions for New Problems.

    PubMed

    Monteiro, Tiago; Almeida, Maria Gabriela

    2018-05-14

    Worldwide legislation is driving the development of novel and highly efficient analytical tools for assessing the composition of every material that interacts with consumers or nature. Biosensor technology is one of the most active R&D domains in the analytical sciences, focused on the challenge of taking analytical chemistry to the field. Electrochemical biosensors based on redox enzymes, in particular, are highly appealing due to their usual quick response, high selectivity and sensitivity, low cost and portable dimensions. This review paper aims to provide an overview of the most important advances made in the field since the proposal of the first biosensor, the well-known hand-held glucose meter. The first section addresses the current needs and challenges for novel analytical tools, followed by a brief description of the different components and configurations of biosensing devices, and the fundamentals of enzyme kinetics and amperometry. The following sections emphasize enzyme-based amperometric biosensors and the different stages of their development.
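
    For reference, the enzyme-kinetics relation underlying most amperometric enzyme biosensors is the standard Michaelis-Menten form (stated from the general literature, not quoted from the review):

```latex
% Michaelis–Menten kinetics: at steady state the biosensor current tracks the
% enzymatic rate, which is approximately linear in substrate concentration when [S] << K_M.
v \;=\; \frac{V_{\max}\,[S]}{K_{M} + [S]}
```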

  20. Multivariable Hermite polynomials and phase-space dynamics

    NASA Technical Reports Server (NTRS)

    Dattoli, G.; Torre, Amalia; Lorenzutta, S.; Maino, G.; Chiccoli, C.

    1994-01-01

    The phase-space approach to classical and quantum systems demands advanced analytical tools. Such an approach characterizes the evolution of a physical system through a set of variables, reducing to the canonically conjugate variables in the classical limit. It often happens that phase-space distributions can be written in terms of quadratic forms involving the variables quoted above. A significant analytical tool to treat these problems may come from the generalized many-variable Hermite polynomials, defined on quadratic forms in R^n. They form an orthonormal system in many dimensions and seem the natural tool to treat the harmonic oscillator dynamics in phase-space. In this contribution we discuss the properties of these polynomials and present some applications to physical problems.
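
    As a reference point only: the familiar one-variable generating relation is shown below; the many-variable polynomials discussed in the paper generalize the exponent to a quadratic form in R^n, with normalization conventions that vary by author (this block is not taken from the paper itself).

```latex
% One-variable Hermite generating function (the n-variable case replaces the
% exponent by a quadratic form; normalizations differ between authors):
e^{\,2xt - t^{2}} \;=\; \sum_{n=0}^{\infty} H_{n}(x)\,\frac{t^{n}}{n!}
```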

  1. HIV diversity and drug resistance from plasma and non-plasma analytes in a large treatment programme in western Kenya.

    PubMed

    Kantor, Rami; DeLong, Allison; Balamane, Maya; Schreier, Leeann; Lloyd, Robert M; Injera, Wilfred; Kamle, Lydia; Mambo, Fidelis; Muyonga, Sarah; Katzenstein, David; Hogan, Joseph; Buziba, Nathan; Diero, Lameck

    2014-01-01

    Antiretroviral resistance leads to treatment failure and resistance transmission. Resistance data in western Kenya are limited. Collection of non-plasma analytes may provide additional resistance information. We assessed HIV diversity using the REGA tool, transmitted resistance by the WHO mutation list and acquired resistance upon first-line failure by the IAS-USA mutation list, at the Academic Model Providing Access to Healthcare (AMPATH), a major treatment programme in western Kenya. Plasma and four non-plasma analytes, dried blood-spots (DBS), dried plasma-spots (DPS), ViveST(TM)-plasma (STP) and ViveST-blood (STB), were compared to identify diversity and evaluate sequence concordance. Among 122 patients, 62 were treatment-naïve and 60 treatment-experienced; 61% were female, median age 35 years, median CD4 182 cells/µL, median viral-load 4.6 log10 copies/mL. One hundred and ninety-six sequences were available for 107/122 (88%) patients, 58/62 (94%) treatment-naïve and 49/60 (82%) treated; 100/122 (82%) plasma, 37/78 (47%) attempted DBS, 16/45 (36%) attempted DPS, 14/44 (32%) attempted STP from fresh plasma and 23/34 (68%) from frozen plasma, and 5/42 (12%) attempted STB. Plasma and DBS genotyping success increased at higher VL and shorter shipment-to-genotyping time. Main subtypes were A (62%), D (15%) and C (6%). Transmitted resistance was found in 1.8% of plasma sequences, and 7% combining analytes. Plasma resistance mutations were identified in 91% of treated patients, 76% NRTI, 91% NNRTI; 76% dual-class; 60% with intermediate-high predicted resistance to future treatment options; with novel mutation co-occurrence patterns. Nearly 88% of plasma mutations were identified in DBS, 89% in DPS and 94% in STP. Of 23 discordant mutations, 92% in plasma and 60% in non-plasma analytes were mixtures. Mean whole-sequence discordance from frozen plasma reference was 1.1% for plasma-DBS, 1.2% plasma-DPS, 2.0% plasma-STP and 2.3% plasma-STB. Of 23 plasma-STP discordances, one mutation was identified in plasma and 22 in STP (p<0.05). Discordance was inversely significantly related to VL for DBS. In a large treatment programme in western Kenya, we report high HIV-1 subtype diversity; low plasma transmitted resistance, increasing when multiple analytes were combined; and high-acquired resistance with unique mutation patterns. Resistance surveillance may be augmented by using non-plasma analytes for lower-cost genotyping in resource-limited settings.

  2. A unique linkage of administrative and clinical registry databases to expand analytic possibilities in pediatric heart transplantation research.

    PubMed

    Godown, Justin; Thurm, Cary; Dodd, Debra A; Soslow, Jonathan H; Feingold, Brian; Smith, Andrew H; Mettler, Bret A; Thompson, Bryn; Hall, Matt

    2017-12-01

    Large clinical, research, and administrative databases are increasingly utilized to facilitate pediatric heart transplant (HTx) research. Linking databases has proven to be a robust strategy across multiple disciplines to expand the possible analyses that can be performed while leveraging the strengths of each dataset. We describe a unique linkage of the Scientific Registry of Transplant Recipients (SRTR) database and the Pediatric Health Information System (PHIS) administrative database to provide a platform to assess resource utilization in pediatric HTx. All pediatric patients (1999-2016) who underwent HTx at a hospital enrolled in the PHIS database were identified. A linkage was performed between the SRTR and PHIS databases in a stepwise approach using indirect identifiers. To determine the feasibility of using these linked data to assess resource utilization, total and post-HTx hospital costs were assessed. A total of 3188 unique transplants were identified as being present in both databases and amenable to linkage. Linkage of SRTR and PHIS data was successful in 3057 (95.9%) patients, of whom 2896 (90.8%) had complete cost data. Median total and post-HTx hospital costs were $518,906 (IQR $324,199-$889,738), and $334,490 (IQR $235,506-$498,803) respectively with significant differences based on patient demographics and clinical characteristics at HTx. Linkage of the SRTR and PHIS databases is feasible and provides an invaluable tool to assess resource utilization. Our analysis provides contemporary cost data for pediatric HTx from the largest US sample reported to date. It also provides a platform for expanded analyses in the pediatric HTx population. Copyright © 2017 Elsevier Inc. All rights reserved.
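
    A hedged pandas sketch of stepwise linkage on indirect identifiers is shown below. The field names (center, transplant date, date of birth, sex) and the exact-then-relaxed matching strategy are assumptions for illustration, not the actual SRTR/PHIS linkage keys or algorithm.

```python
import pandas as pd

# Hypothetical extracts; column names are illustrative, not the real schemas.
srtr = pd.DataFrame({"center": ["A", "A", "B"],
                     "tx_date": pd.to_datetime(["2010-03-01", "2012-07-15", "2011-01-20"]),
                     "dob": pd.to_datetime(["2005-05-05", "2011-02-02", "1999-12-31"]),
                     "sex": ["F", "M", "F"]})
phis = pd.DataFrame({"center": ["A", "B", "A"],
                     "admit_tx_date": pd.to_datetime(["2010-03-01", "2011-01-20", "2012-07-15"]),
                     "dob": pd.to_datetime(["2005-05-05", "1999-12-31", "2011-02-02"]),
                     "sex": ["F", "F", "M"],
                     "total_cost": [520000, 310000, 890000]})

# Step 1: exact match on the full set of indirect identifiers.
linked = srtr.merge(phis,
                    left_on=["center", "tx_date", "dob", "sex"],
                    right_on=["center", "admit_tx_date", "dob", "sex"],
                    how="inner")
print(linked[["center", "tx_date", "total_cost"]])
# Later steps would relax one identifier at a time (e.g., +/- a few days on dates)
# for records that remain unmatched.
```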

  3. Sand moulds milling for one-of-a-kind pieces

    NASA Astrophysics Data System (ADS)

    Rodríguez, A.; Calleja, A.; Olvera, D.; Peñafiel, F. J.; López de Lacalle, L. N.

    2012-04-01

    Time to market is a critical measurement for today's foundry market. By combining 3D digitizing and sand-block milling it is possible to reduce this time. Avoiding the use of a wood pattern, this technique is useful for art pieces or unique parts, when only one component is necessary. The key to the proposed methodology is to achieve enough tool life with conventional tool qualities, avoiding the risk of sand destruction or damage. A special study of tool wear is presented in this work, examining different tool materials and different sand types. Two examples of unique parts are also presented, following the proposed methodology in order to reduce the time and cost of the rapid reproduction of very short batches.

  4. High Performance Visualization using Query-Driven Visualizationand Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bethel, E. Wes; Campbell, Scott; Dart, Eli

    2006-06-15

    Query-driven visualization and analytics is a unique approach for high-performance visualization that offers new capabilities for knowledge discovery and hypothesis testing. The new capabilities, akin to finding needles in haystacks, are the result of combining technologies from the fields of scientific visualization and scientific data management. This approach is crucial for rapid data analysis and visualization in the petascale regime. This article describes how query-driven visualization is applied to a hero-sized network traffic analysis problem.

  5. Contribution of Electrochemistry to the Biomedical and Pharmaceutical Analytical Sciences.

    PubMed

    Kauffmann, Jean-Michel; Patris, Stephanie; Vandeput, Marie; Sarakbi, Ahmad; Sakira, Abdul Karim

    2016-01-01

    All analytical techniques have experienced major progress over the last ten years, and electroanalysis is part of this trend. The unique characteristics of phenomena occurring at the electrode-solution interface along with the variety of electrochemical methods currently available allow for a broad spectrum of applications. Potentiometric, conductometric, voltammetric and amperometric methods are briefly reviewed with a critical view in terms of performance of the developed instrumentation with special emphasis on pharmaceutical and biomedical applications.

  6. Human Factors in Streaming Data Analysis: Challenges and Opportunities for Information Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dasgupta, Aritra; Arendt, Dustin L.; Franklin, Lyndsey

    State-of-the-art visual analytics models and frameworks mostly assume a static snapshot of the data, while in many cases it is a stream with constant updates and changes. Exploration of streaming data poses unique challenges as machine-level computations and abstractions need to be synchronized with the visual representation of the data and the temporally evolving human insights. In the visual analytics literature, we lack a thorough characterization of streaming data and analysis of the challenges associated with task abstraction, visualization design, and adaptation of the role of human-in-the-loop for exploration of data streams. We aim to fill this gap by conducting a survey of the state-of-the-art in visual analytics of streaming data for systematically describing the contributions and shortcomings of current techniques and analyzing the research gaps that need to be addressed in the future. Our contributions are: i) problem characterization for identifying challenges that are unique to streaming data analysis tasks, ii) a survey and analysis of the state-of-the-art in streaming data visualization research with a focus on the visualization design space for dynamic data and the role of the human-in-the-loop, and iii) reflections on the design trade-offs for streaming visual analytics techniques and their practical applicability in real-world application scenarios.

  7. Sigma metrics as a tool for evaluating the performance of internal quality control in a clinical chemistry laboratory

    PubMed Central

    Kumar, B. Vinodh; Mohan, Thuthi

    2018-01-01

    OBJECTIVE: Six Sigma is one of the most popular quality management system tools employed for process improvement. The Six Sigma methods are usually applied when the outcome of the process can be measured. This study was done to assess the performance of individual biochemical parameters on a Sigma Scale by calculating the sigma metrics for individual parameters and to follow the Westgard guidelines for appropriate Westgard rules and levels of internal quality control (IQC) that need to be processed to improve target analyte performance based on the sigma metrics. MATERIALS AND METHODS: This is a retrospective study, and data required for the study were extracted between July 2015 and June 2016 from a Secondary Care Government Hospital, Chennai. The data obtained for the study are IQC - coefficient of variation percentage and External Quality Assurance Scheme (EQAS) - Bias% for 16 biochemical parameters. RESULTS: For the level 1 IQC, four analytes (alkaline phosphatase, magnesium, triglyceride, and high-density lipoprotein-cholesterol) showed an ideal performance of ≥6 sigma level, five analytes (urea, total bilirubin, albumin, cholesterol, and potassium) showed an average performance of <3 sigma level, and for the level 2 IQC, the same four analytes as in level 1 showed a performance of ≥6 sigma level, and four analytes (urea, albumin, cholesterol, and potassium) showed an average performance of <3 sigma level. For all analytes below the 6 sigma level, the quality goal index (QGI) was <0.8, indicating that the area requiring improvement was imprecision, except for cholesterol, whose QGI of >1.2 indicated inaccuracy. CONCLUSION: This study shows that the sigma metric is a good quality tool to assess the analytical performance of a clinical chemistry laboratory. Thus, sigma metric analysis provides a benchmark for the laboratory to design a protocol for IQC, address poor assay performance, and assess the efficiency of existing laboratory processes. PMID:29692587
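
    The two quantities used in such studies follow standard formulas: sigma = (TEa − bias)/CV and QGI = bias/(1.5 × CV), with all terms expressed in percent. The short sketch below restates them with placeholder values, not data from the study.

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma = (allowable total error - bias) / imprecision, all in %."""
    return (tea_pct - bias_pct) / cv_pct

def quality_goal_index(bias_pct, cv_pct):
    """QGI < 0.8 points to imprecision, > 1.2 to inaccuracy (bias)."""
    return bias_pct / (1.5 * cv_pct)

# Illustrative values for one analyte (not values from the study):
tea, bias, cv = 10.0, 2.0, 1.2
print(f"sigma = {sigma_metric(tea, bias, cv):.1f}, "
      f"QGI = {quality_goal_index(bias, cv):.2f}")
```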

  8. Tool to Prioritize Energy Efficiency Investments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farese, P.; Gelman, R.; Hendron, R.

    2012-08-01

    To provide analytic support of the U.S. Department of Energy's Office of the Building Technology Program (BTP), NREL developed a Microsoft Excel-based tool to provide an open and objective comparison of the hundreds of investment opportunities available to BTP. This tool uses established methodologies to evaluate the energy savings and cost of those savings.

  9. Tools for Educational Data Mining: A Review

    ERIC Educational Resources Information Center

    Slater, Stefan; Joksimovic, Srecko; Kovanovic, Vitomir; Baker, Ryan S.; Gasevic, Dragan

    2017-01-01

    In recent years, a wide array of tools have emerged for the purposes of conducting educational data mining (EDM) and/or learning analytics (LA) research. In this article, we hope to highlight some of the most widely used, most accessible, and most powerful tools available for the researcher interested in conducting EDM/LA research. We will…

  10. Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Tengfang; Flapper, Joris; Ke, Jing

    The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry – including four dairy processes – cheese, fluid milk, butter, and milk powder.

  11. Predictive Data Tools Find Uses in Schools

    ERIC Educational Resources Information Center

    Sparks, Sarah D.

    2011-01-01

    The use of analytic tools to predict student performance is exploding in higher education, and experts say the tools show even more promise for K-12 schools, in everything from teacher placement to dropout prevention. Use of such statistical techniques is hindered in precollegiate schools, however, by a lack of researchers trained to help…

  12. A consumer guide: tools to manage vegetation and fuels.

    Treesearch

    David L. Peterson; Louisa Evers; Rebecca A. Gravenmier; Ellen Eberhardt

    2007-01-01

    Current efforts to improve the scientific basis for fire management on public lands will benefit from more efficient transfer of technical information and tools that support planning, implementation, and effectiveness of vegetation and hazardous fuel treatments. The technical scope, complexity, and relevant spatial scale of analytical and decision support tools differ...

  13. A Business Analytics Software Tool for Monitoring and Predicting Radiology Throughput Performance.

    PubMed

    Jones, Stephen; Cournane, Seán; Sheehy, Niall; Hederman, Lucy

    2016-12-01

    Business analytics (BA) is increasingly being utilised by radiology departments to analyse and present data. It encompasses statistical analysis, forecasting and predictive modelling and is used as an umbrella term for decision support and business intelligence systems. The primary aim of this study was to determine whether utilising BA technologies could contribute towards improved decision support and resource management within radiology departments. A set of information technology requirements were identified with key stakeholders, and a prototype BA software tool was designed, developed and implemented. A qualitative evaluation of the tool was carried out through a series of semi-structured interviews with key stakeholders. Feedback was collated, and emergent themes were identified. The results indicated that BA software applications can provide visibility of radiology performance data across all time horizons. The study demonstrated that the tool could potentially assist with improving operational efficiencies and management of radiology resources.

  14. Comparison of Left Ventricular Hypertrophy by Electrocardiography and Echocardiography in Children Using Analytics Tool.

    PubMed

    Tague, Lauren; Wiggs, Justin; Li, Qianxi; McCarter, Robert; Sherwin, Elizabeth; Weinberg, Jacqueline; Sable, Craig

    2018-05-17

    Left ventricular hypertrophy (LVH) is a common finding on pediatric electrocardiography (ECG) leading to many referrals for echocardiography (echo). This study utilizes a novel analytics tool that combines ECG and echo databases to evaluate ECG as a screening tool for LVH. SQL Server 2012 data warehouse incorporated ECG and echo databases for all patients from a single institution from 2006 to 2016. Customized queries identified patients 0-18 years old with LVH on ECG and an echo performed within 24 h. Using data visualization (Tableau) and analytic (Stata 14) software, ECG and echo findings were compared. Of 437,699 encounters, 4637 met inclusion criteria. ECG had high sensitivity (≥ 90%) but poor specificity (43%), and low positive predictive value (< 20%) for echo abnormalities. ECG performed only 11-22% better than chance (AROC = 0.50). 83% of subjects with LVH on ECG had normal left ventricle (LV) structure and size on echo. African-Americans with LVH were least likely to have an abnormal echo. There was a low correlation between V 6 R on ECG and echo-derived Z score of left ventricle diastolic diameter (r = 0.14) and LV mass index (r = 0.24). The data analytics client was able to mine a database of ECG and echo reports, comparing LVH by ECG and LV measurements and qualitative findings by echo, identifying an abnormal LV by echo in only 17% of cases with LVH on ECG. This novel tool is useful for rapid data mining for both clinical and research endeavors.
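
    The screening statistics quoted above come from ordinary confusion-matrix arithmetic; the sketch below restates the definitions with invented counts chosen only to reproduce the qualitative pattern (high sensitivity, poor specificity, low PPV), not the study's data.

```python
def screening_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity and positive predictive value from a 2x2 table."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    ppv = tp / (tp + fp)
    return sens, spec, ppv

# Invented counts: ECG-LVH screening vs. echo-confirmed abnormal LV.
tp, fp, fn, tn = 90, 500, 10, 400
sens, spec, ppv = screening_metrics(tp, fp, fn, tn)
print(f"sensitivity {sens:.0%}, specificity {spec:.0%}, PPV {ppv:.0%}")
# High sensitivity with poor specificity yields a low PPV, the pattern reported above.
```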

  15. A tool for selective inline quantification of co-eluting proteins in chromatography using spectral analysis and partial least squares regression.

    PubMed

    Brestrich, Nina; Briskot, Till; Osberghaus, Anna; Hubbuch, Jürgen

    2014-07-01

    Selective quantification of co-eluting proteins in chromatography is usually performed by offline analytics. This is time-consuming and can lead to late detection of irregularities in chromatography processes. To overcome this analytical bottleneck, a methodology for selective protein quantification in multicomponent mixtures by means of spectral data and partial least squares regression was presented in two previous studies. In this paper, a powerful integration of software and chromatography hardware will be introduced that enables the applicability of this methodology for a selective inline quantification of co-eluting proteins in chromatography. A specific setup consisting of a conventional liquid chromatography system, a diode array detector, and a software interface to Matlab® was developed. The established tool for selective inline quantification was successfully applied for a peak deconvolution of a co-eluting ternary protein mixture consisting of lysozyme, ribonuclease A, and cytochrome c on SP Sepharose FF. Compared to common offline analytics based on collected fractions, no loss of information regarding the retention volumes and peak flanks was observed. A comparison between the mass balances of both analytical methods showed that the inline quantification tool can be applied for a rapid determination of pool yields. Finally, the achieved inline peak deconvolution was successfully applied to make product purity-based real-time pooling decisions. This makes the established tool for selective inline quantification a valuable approach for inline monitoring and control of chromatographic purification steps and just-in-time reaction to process irregularities. © 2014 Wiley Periodicals, Inc.
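
    A hedged scikit-learn sketch of the calibration idea: fit a PLS model on spectra of mixtures with known concentrations, then predict the three component concentrations from a newly recorded spectrum. The spectra are synthetic and this omits the hardware/Matlab integration described in the paper.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n_wavelengths, n_samples = 200, 40
# Synthetic pure-component spectra for three co-eluting proteins (placeholders).
pure = np.abs(rng.normal(size=(3, n_wavelengths)))
C_train = rng.uniform(0, 1, size=(n_samples, 3))          # known concentrations
A_train = C_train @ pure + 0.01 * rng.normal(size=(n_samples, n_wavelengths))

pls = PLSRegression(n_components=3)
pls.fit(A_train, C_train)

# "Inline" use: predict the three concentrations from a newly recorded spectrum.
c_true = np.array([[0.2, 0.5, 0.3]])
a_new = c_true @ pure + 0.01 * rng.normal(size=(1, n_wavelengths))
print(pls.predict(a_new))    # approximately [0.2, 0.5, 0.3]
```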

  16. Recent Methodology in Ginseng Analysis

    PubMed Central

    Baek, Seung-Hoon; Bae, Ok-Nam; Park, Jeong Hill

    2012-01-01

    As much as the popularity of ginseng in herbal prescriptions or remedies, ginseng has become the focus of research in many scientific fields. Analytical methodologies for ginseng, referred to as ginseng analysis hereafter, have been developed for bioactive component discovery, phytochemical profiling, quality control, and pharmacokinetic studies. This review summarizes the most recent advances in ginseng analysis in the past half-decade including emerging techniques and analytical trends. Ginseng analysis includes all of the leading analytical tools and serves as a representative model for the analytical research of herbal medicines. PMID:23717112

  17. Comprehensive characterizations of nanoparticle biodistribution following systemic injection in mice

    NASA Astrophysics Data System (ADS)

    Liao, Wei-Yin; Li, Hui-Jing; Chang, Ming-Yao; Tang, Alan C. L.; Hoffman, Allan S.; Hsieh, Patrick C. H.

    2013-10-01

    Various nanoparticle (NP) properties such as shape and surface charge have been studied in an attempt to enhance the efficacy of NPs in biomedical applications. When trying to undermine the precise biodistribution of NPs within the target organs, the analytical method becomes the determining factor in measuring the precise quantity of distributed NPs. High performance liquid chromatography (HPLC) represents a more powerful tool in quantifying NP biodistribution compared to conventional analytical methods such as an in vivo imaging system (IVIS). This, in part, is due to better curve linearity offered by HPLC than IVIS. Furthermore, HPLC enables us to fully analyze each gram of NPs present in the organs without compromising the signals and the depth-related sensitivity as is the case in IVIS measurements. In addition, we found that changing physiological conditions improved large NP (200-500 nm) distribution in brain tissue. These results reveal the importance of selecting analytic tools and physiological environment when characterizing NP biodistribution for future nanoscale toxicology, therapeutics and diagnostics. Electronic supplementary information (ESI) available. See DOI: 10.1039/c3nr03954d

  18. ANAlyte: A modular image analysis tool for ANA testing with indirect immunofluorescence.

    PubMed

    Di Cataldo, Santa; Tonti, Simone; Bottino, Andrea; Ficarra, Elisa

    2016-05-01

    The automated analysis of indirect immunofluorescence images for Anti-Nuclear Autoantibody (ANA) testing is a fairly recent field that is receiving ever-growing interest from the research community. ANA testing leverages the categorization of intensity level and fluorescent pattern of IIF images of HEp-2 cells to perform a differential diagnosis of important autoimmune diseases. Nevertheless, it suffers from a tremendous lack of repeatability due to subjectivity in the visual interpretation of the images. The automatization of the analysis is seen as the only valid solution to this problem. Several works in the literature address individual steps of the work-flow; nonetheless, integrating such steps and assessing their effectiveness as a whole is still an open challenge. We present a modular tool, ANAlyte, able to characterize an IIF image in terms of fluorescent intensity level and fluorescent pattern without any user interaction. For this purpose, ANAlyte integrates the following: (i) an Intensity Classifier module, which categorizes the intensity level of the input slide based on multi-scale contrast assessment; (ii) a Cell Segmenter module, which splits the input slide into individual HEp-2 cells; (iii) a Pattern Classifier module, which determines the fluorescent pattern of the slide based on the pattern of the individual cells. To demonstrate the accuracy and robustness of our tool, we experimentally validated ANAlyte on two different public benchmarks of IIF HEp-2 images with a rigorous leave-one-out cross-validation strategy. We obtained overall accuracies for fluorescent intensity and pattern classification of around 85% and above 90%, respectively. We assessed all results by comparisons with some of the most representative state-of-the-art works. Unlike most of the other works in the recent literature, ANAlyte aims at the automatization of all the major steps of ANA image analysis. Results on public benchmarks demonstrate that the tool can characterize HEp-2 slides in terms of intensity and fluorescent pattern with accuracy better than or comparable to state-of-the-art techniques, even when such techniques are run on manually segmented cells. Hence, ANAlyte can be proposed as a valid solution to the problem of ANA testing automatization. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  19. Constraint-Referenced Analytics of Algebra Learning

    ERIC Educational Resources Information Center

    Sutherland, Scot M.; White, Tobin F.

    2016-01-01

    The development of the constraint-referenced analytics tool for monitoring algebra learning activities presented here came from the desire, firstly, to take a more quantitative look at student responses in collaborative algebra activities and, secondly, to situate those activities in a more traditional introductory algebra setting focusing on…

  20. Towards an Analytic Foundation for Network Architecture

    DTIC Science & Technology

    2010-12-31

    In this project, we develop the analytic tools of stochastic optimization for wireless network design and apply them… A related cited publication: …and Mung Chiang, “DaVinci: Dynamically Adaptive Virtual Networks for a Customized Internet,” in Proc. ACM SIGCOMM CoNext Conference, December 2008.

  1. University Macro Analytic Simulation Model.

    ERIC Educational Resources Information Center

    Baron, Robert; Gulko, Warren

    The University Macro Analytic Simulation System (UMASS) has been designed as a forecasting tool to help university administrators make budgeting decisions. Alternative budgeting strategies can be tested on a computer model and then an operational alternative can be selected on the basis of the most desirable projected outcome. UMASS uses readily…

  2. Perspectives on bay-delta science and policy

    USGS Publications Warehouse

    Healey, Michael; Dettinger, Michael; Norgaard, Richard

    2016-01-01

    The State of Bay–Delta Science 2008 highlighted seven emerging perspectives on science and management of the Delta. These perspectives had important effects on policy and legislation concerning management of the Delta ecosystem and water exports. From the collection of papers that make up the State of Bay–Delta Science 2016, we derive another seven perspectives that augment those published in 2008. The new perspectives address nutrient and contaminant concentrations in Delta waters, the failure of the Delta food web to support native species, the role of multiple stressors in driving species toward extinction, and the emerging importance of extreme events in driving change in the ecosystem and the water supply. The scientific advances that underpin these new perspectives were made possible by new measurement and analytic tools. We briefly discuss some of these, including miniaturized acoustic fish tags, sensors for monitoring of water quality, analytic techniques for disaggregating complex contaminant mixtures, remote sensing to assess levee vulnerability, and multidimensional hydrodynamic modeling. Despite these new tools and scientific insights, species conservation objectives for the Delta are not being met. We believe that this lack of progress stems in part from the fact that science and policy do not incorporate sufficiently long-term perspectives. Looking forward half a century was central to the Delta Visioning process, but science and policy have not embraced this conceptual breadth. We are also concerned that protection and enhancement of the unique cultural, recreational, natural resource, and agricultural values of the Delta as an evolving place, as required by the Delta Reform Act, has received no critical study and analysis. Adopting wider and longer science and policy perspectives immediately encourages recognition of the need for evaluation, analysis, and public discourse on novel conservation approaches. These longer and wider perspectives also encourage more attention to the opportunities provided by heavily invaded ecosystems. It is past time to turn scientific and policy attention to these issues.

  3. Towards an acoustic model-based poroelastic imaging method: I. Theoretical foundation.

    PubMed

    Berry, Gearóid P; Bamber, Jeffrey C; Armstrong, Cecil G; Miller, Naomi R; Barbone, Paul E

    2006-04-01

    The ultrasonic measurement and imaging of tissue elasticity is currently under wide investigation and development as a clinical tool for the assessment of a broad range of diseases, but little account in this field has yet been taken of the fact that soft tissue is porous and contains mobile fluid. The ability to squeeze fluid out of tissue may have implications for conventional elasticity imaging, and may present opportunities for new investigative tools. When a homogeneous, isotropic, fluid-saturated poroelastic material with a linearly elastic solid phase and incompressible solid and fluid constituents is subjected to stress, the behaviour of the induced internal strain field is influenced by three material constants: the Young's modulus (E_s) and Poisson's ratio (ν_s) of the solid matrix and the permeability (k) of the solid matrix to the pore fluid. New analytical expressions were derived and used to model the time-dependent behaviour of the strain field inside simulated homogeneous cylindrical samples of such a poroelastic material undergoing sustained unconfined compression. A model-based reconstruction technique was developed to produce images of parameters related to the poroelastic material constants (E_s, ν_s, k) from a comparison of the measured and predicted time-dependent spatially varying radial strain. Tests of the method using simulated noisy strain data showed that it is capable of producing three unique parametric images: an image of the Poisson's ratio of the solid matrix, an image of the axial strain (which was not time-dependent subsequent to the application of the compression) and an image representing the product of the aggregate modulus E_s(1 − ν_s)/[(1 + ν_s)(1 − 2ν_s)] of the solid matrix and the permeability of the solid matrix to the pore fluid. The analytical expressions were further used to numerically validate a finite element model and to clarify previous work on poroelastography.
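    For reference, the aggregate modulus quoted above can be written compactly in terms of the solid-matrix constants (a standard linear-poroelasticity relation; the symbol H_A is introduced here only as a label and does not appear in the abstract):

        \[
          H_A \;=\; \frac{E_s\,(1-\nu_s)}{(1+\nu_s)\,(1-2\nu_s)}
        \]

    so the third parametric image corresponds to the product H_A k, where k is the permeability of the solid matrix to the pore fluid.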

  4. Statistically qualified neuro-analytic failure detection method and system

    DOEpatents

    Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.

    2002-03-02

    An apparatus and method for monitoring a process involve development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two stages: deterministic model adaptation and stochastic model modification of the deterministic model adaptation. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation error minimization technique. Stochastic model modification involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system. Illustrative of the method and apparatus, the method is applied to a peristaltic pump system.

  5. A portable fluorescent sensing system using multiple LEDs

    NASA Astrophysics Data System (ADS)

    Shin, Young-Ho; Barnett, Jonathan Z.; Gutierrez-Wing, M. Teresa; Rusch, Kelly A.; Choi, Jin-Woo

    2017-02-01

    This paper presents a portable fluorescent sensing system that utilizes different light emitting diode (LED) excitation lights for multiple target detection. In order to identify different analytes, three different wavelengths (385 nm, 448 nm, and 590 nm) of excitation light emitting diodes were used to selectively stimulate the target analytes. A highly sensitive silicon photomultiplier (SiPM) was used to detect the corresponding fluorescent signals from each analyte. Based on the unique fluorescent response of each analyte, it is possible to simultaneously differentiate one analyte from another in a mixture of target analytes. A portable system was designed and fabricated by integrating a display module, battery, data storage card, and sample loading tray into a compact 3D-printed jig. The portable sensor system was demonstrated for quantification and differentiation of microalgae (Chlorella vulgaris) and cyanobacteria (Spirulina) by measuring the fluorescent responses of chlorophyll a in microalgae and phycocyanin in cyanobacteria. The obtained results suggest that the developed portable sensor system could be used as a generic fluorescence sensor platform for on-site detection of multiple analytes of interest.
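    As an illustration of how responses at several excitation wavelengths can separate two fluorophores, a generic least-squares unmixing sketch follows (this is not the authors' algorithm, and the per-pigment response values are made up):

        import numpy as np

        # Hypothetical per-unit-concentration responses (arbitrary units) of the two
        # pigments at the three excitation LEDs (385, 448 and 590 nm).
        R = np.array([
            [0.9, 0.2],   # 385 nm: chlorophyll a, phycocyanin
            [1.0, 0.1],   # 448 nm
            [0.1, 0.8],   # 590 nm
        ])

        measured = np.array([1.25, 1.32, 0.53])   # SiPM readings of a mixed sample

        # Least-squares estimate of the two concentrations from the three readings.
        conc, *_ = np.linalg.lstsq(R, measured, rcond=None)
        print(dict(zip(["chlorophyll_a", "phycocyanin"], conc.round(3))))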

  6. Initiating an Online Reputation Monitoring System with Open Source Analytics Tools

    NASA Astrophysics Data System (ADS)

    Shuhud, Mohd Ilias M.; Alwi, Najwa Hayaati Md; Halim, Azni Haslizan Abd

    2018-05-01

    Online reputation is an invaluable asset for modern organizations, as it can help business performance, especially in sales and profit. However, if we are not aware of our reputation, it is difficult to maintain it. Thus, social media analytics is a new tool that can provide online reputation monitoring in various ways, such as sentiment analysis. As a result, numerous large-scale organizations have implemented Online Reputation Monitoring (ORM) systems. However, this solution should not be exclusive to high-income organizations, as many organizations, regardless of size and type, are now online. This research proposes an affordable and reliable ORM system using a combination of open source analytics tools for both novice practitioners and academicians. We also evaluate its prediction accuracy and find that the system provides acceptable predictions (sixty percent accuracy), with its majority-polarity predictions tallying with human annotation. The proposed system can help in supporting business decisions with flexible monitoring strategies, especially for organizations that want to initiate and administer ORM themselves at low cost.

  7. Updates in metabolomics tools and resources: 2014-2015.

    PubMed

    Misra, Biswapriya B; van der Hooft, Justin J J

    2016-01-01

    Data processing and interpretation represent the most challenging and time-consuming steps in high-throughput metabolomic experiments, regardless of the analytical platforms (MS or NMR spectroscopy based) used for data acquisition. Improved machinery in metabolomics generates increasingly complex datasets that create the need for more and better processing and analysis software and in silico approaches to understand the resulting data. However, a comprehensive source of information describing the utility of the most recently developed and released metabolomics resources--in the form of tools, software, and databases--is currently lacking. Thus, here we provide an overview of freely available and open-source tools, algorithms, and frameworks to make both upcoming and established metabolomics researchers aware of the recent developments in an attempt to advance and facilitate data processing workflows in their metabolomics research. The major topics include tools and resources for data processing, data annotation, and data visualization in MS and NMR-based metabolomics. Most of the tools described in this review are dedicated to untargeted metabolomics workflows; however, some more specialist tools are described as well. All tools and resources described, including their analytical and computational platform dependencies, are summarized in an overview table. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Chemical Sensor Array Response Modeling Using Quantitative Structure-Activity Relationships Technique

    NASA Astrophysics Data System (ADS)

    Shevade, Abhijit V.; Ryan, Margaret A.; Homer, Margie L.; Zhou, Hanying; Manfreda, Allison M.; Lara, Liana M.; Yen, Shiao-Pin S.; Jewell, April D.; Manatt, Kenneth S.; Kisor, Adam K.

    We have developed a Quantitative Structure-Activity Relationships (QSAR) based approach to correlate the response of chemical sensors in an array with molecular descriptors. A novel molecular descriptor set has been developed; this set combines descriptors of sensing film-analyte interactions, representing sensor response, with a basic analyte descriptor set commonly used in QSAR studies. The descriptors are obtained using a combination of molecular modeling tools and empirical and semi-empirical Quantitative Structure-Property Relationships (QSPR) methods. The sensors under investigation are polymer-carbon sensing films which have been exposed to analyte vapors at parts-per-million (ppm) concentrations; response is measured as change in film resistance. Statistically validated QSAR models have been developed using Genetic Function Approximations (GFA) for a sensor array for a given training data set. The applicability of the sensor response models has been tested by using them to predict the sensor activities for test analytes not considered in the training set during model development. The validated QSAR sensor response models show good predictive ability. The QSAR approach is a promising computational tool for sensing materials evaluation and selection. It can also be used to predict the response of an existing sensing film to new target analytes.
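    A minimal sketch of the general idea — regressing measured sensor responses on molecular descriptors and then predicting the response of a held-out analyte (plain least squares with illustrative numbers, not the GFA models used in the study):

        import numpy as np

        # Rows = training analytes, columns = molecular descriptors (hypothetical values).
        X_train = np.array([[1.2, 0.4, 3.1],
                            [0.8, 0.9, 2.2],
                            [1.5, 0.3, 4.0],
                            [0.6, 1.1, 1.8]])
        y_train = np.array([12.1, 9.4, 15.2, 7.9])   # measured film resistance changes (a.u.)

        # Fit a linear response model with an intercept term.
        A = np.hstack([X_train, np.ones((len(X_train), 1))])
        coef, *_ = np.linalg.lstsq(A, y_train, rcond=None)

        # Predict the response for an analyte not used in training.
        x_test = np.array([1.0, 0.7, 2.9, 1.0])      # descriptors plus the intercept term
        print("predicted response:", round(float(x_test @ coef), 2))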

  9. 3D FEM Simulation of Flank Wear in Turning

    NASA Astrophysics Data System (ADS)

    Attanasio, Aldo; Ceretti, Elisabetta; Giardini, Claudio

    2011-05-01

    This work deals with tool wear simulation. Studying the influence of tool wear on tool life, tool substitution policy, final part quality, surface integrity, cutting forces and power consumption is important for reducing global process costs. Adhesion, abrasion, erosion, diffusion, corrosion and fracture are some of the phenomena responsible for tool wear, depending on the selected cutting parameters: cutting velocity, feed rate, depth of cut, …. In some cases these wear mechanisms are described by analytical models as functions of process variables (temperature, pressure and sliding velocity along the cutting surface). These analytical models are suitable for implementation in FEM codes and can be utilized to simulate tool wear. In the present paper a commercial 3D FEM software has been customized to simulate tool wear during turning operations when cutting AISI 1045 carbon steel with an uncoated tungsten carbide tip. The FEM software was improved by means of a suitable subroutine able to modify the tool geometry on the basis of the estimated tool wear as the simulation goes on. Since, for the considered tool-workpiece material pair, the main phenomena generating wear are the abrasive and the diffusive ones, the tool wear model implemented in the subroutine was obtained as a combination of Usui's model and the Takeyama and Murata model. A comparison between experimental and simulated flank tool wear curves is reported, demonstrating that it is possible to simulate tool wear development.
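    A schematic sketch of how such a combined abrasive-plus-diffusive wear rate could be evaluated at a contact point is given below; the functional forms follow the commonly cited Usui and Takeyama-Murata expressions, and every constant is a placeholder rather than a value calibrated in the paper:

        import math

        def flank_wear_rate(normal_stress, sliding_velocity, temperature_K,
                            A=1e-8, B=5000.0,             # Usui-type abrasive/adhesive term
                            D=1e-2, E=75000.0, R=8.314):  # Arrhenius diffusive term
            """Flank wear rate (arbitrary units) as the sum of an Usui-type term and a
            Takeyama-Murata-type diffusion term. All constants are placeholders."""
            abrasive = A * normal_stress * sliding_velocity * math.exp(-B / temperature_K)
            diffusive = D * math.exp(-E / (R * temperature_K))
            return abrasive + diffusive

        # Example: contact pressure 1.2 GPa, sliding speed 2.5 m/s, 1100 K at the flank.
        print(flank_wear_rate(1.2e9, 2.5, 1100.0))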

  10. EDCATS: An Evaluation Tool

    NASA Technical Reports Server (NTRS)

    Heard, Pamala D.

    1998-01-01

    The purpose of this research is to explore the development of Marshall Space Flight Center Unique Programs. These academic tools provide the Education Program Office with important information from the Education Computer Aided Tracking System (EDCATS). This system is equipped to provide on-line data entry, evaluation, analysis, and report generation, with full archiving for all phases of the evaluation process. Another purpose is to develop reports and data that are tailored to Marshall Space Flight Center Unique Programs. It also attempts to acquire knowledge on how, why, and where information is derived. As a result, a user will be better prepared to decide which available tool is the most feasible for their reports.

  11. The pathway not taken: understanding 'omics data in the perinatal context.

    PubMed

    Edlow, Andrea G; Slonim, Donna K; Wick, Heather C; Hui, Lisa; Bianchi, Diana W

    2015-07-01

    'Omics analysis of large datasets has an increasingly important role in perinatal research, but understanding gene expression analyses in the fetal context remains a challenge. We compared the interpretation provided by a widely used systems biology resource (ingenuity pathway analysis [IPA]) with that from gene set enrichment analysis (GSEA) with functional annotation curated specifically for the fetus (Developmental FunctionaL Annotation at Tufts [DFLAT]). Using amniotic fluid supernatant transcriptome datasets previously produced by our group, we analyzed 3 different developmental perturbations: aneuploidy (Trisomy 21 [T21]), hemodynamic (twin-twin transfusion syndrome [TTTS]), and metabolic (maternal obesity) vs sex- and gestational age-matched control subjects. Differentially expressed probe sets were identified with the use of paired t-tests with the Benjamini-Hochberg correction for multiple testing (P < .05). Functional analyses were performed with IPA and GSEA/DFLAT. Outputs were compared for biologic relevance to the fetus. Compared with control subjects, there were 414 significantly dysregulated probe sets in T21 fetuses, 2226 in TTTS recipient twins, and 470 in fetuses of obese women. Each analytic output was unique but complementary. For T21, both IPA and GSEA/DFLAT identified dysregulation of brain, cardiovascular, and integumentary system development. For TTTS, both analytic tools identified dysregulation of cell growth/proliferation, immune and inflammatory signaling, brain, and cardiovascular development. For maternal obesity, both tools identified dysregulation of immune and inflammatory signaling, brain and musculoskeletal development, and cell death. GSEA/DFLAT identified substantially more dysregulated biologic functions in fetuses of obese women (1203 vs 151). For all 3 datasets, GSEA/DFLAT provided more comprehensive information about brain development. IPA consistently provided more detailed annotation about cell death. IPA produced many dysregulated terms that pertained to cancer (14 in T21, 109 in TTTS, 26 in maternal obesity); GSEA/DFLAT did not. Interpretation of the fetal amniotic fluid supernatant transcriptome depends on the analytic program, which suggests that >1 resource should be used. Within IPA, physiologic cellular proliferation in the fetus produced many "false positive" annotations that pertained to cancer, which reflects its bias toward adult diseases. This study supports the use of gene annotation resources with a developmental focus, such as DFLAT, for 'omics studies in perinatal medicine. Copyright © 2015 Elsevier Inc. All rights reserved.
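    For readers unfamiliar with the multiple-testing step mentioned in this abstract, a minimal Benjamini-Hochberg sketch follows (generic procedure with illustrative p-values, not the study's data):

        import numpy as np

        def benjamini_hochberg(pvals, alpha=0.05):
            """Return a boolean mask of the p-values declared significant at FDR alpha."""
            p = np.asarray(pvals, dtype=float)
            order = np.argsort(p)
            m = len(p)
            thresholds = alpha * np.arange(1, m + 1) / m       # k/m * alpha for rank k
            passed = p[order] <= thresholds
            significant = np.zeros(m, dtype=bool)
            if passed.any():
                cutoff = np.max(np.nonzero(passed)[0])         # largest rank meeting the bound
                significant[order[:cutoff + 1]] = True
            return significant

        print(benjamini_hochberg([0.001, 0.008, 0.039, 0.041, 0.20, 0.74]))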

  12. Unique determination of stratified steady water waves from pressure

    NASA Astrophysics Data System (ADS)

    Chen, Robin Ming; Walsh, Samuel

    2018-01-01

    Consider a two-dimensional stratified solitary wave propagating through a body of water that is bounded below by an impermeable ocean bed. In this work, we study how such a wave can be recovered from data consisting of the wave speed, upstream and downstream density and velocity profile, and the trace of the pressure on the bed. In particular, we prove that this data uniquely determines the wave, both in the (real) analytic and Sobolev regimes.

  13. The Miami Barrel: An Innovation in Forensic Firearms Identification

    ERIC Educational Resources Information Center

    Fadul, Thomas G., Jr.

    2009-01-01

    The scientific foundation of firearm and tool mark identification is that each firearm/tool produces a signature of identification (striation/impression) that is unique to that firearm/tool, and that, through examining the individual striations/impressions, the signature can be positively identified to the firearm/tool that produced it. There is no set…

  14. Stream Lifetimes Against Planetary Encounters

    NASA Technical Reports Server (NTRS)

    Valsecchi, G. B.; Lega, E.; Froeschle, Cl.

    2011-01-01

    We study, both analytically and numerically, the perturbation induced by an encounter with a planet on a meteoroid stream. Our analytical tool is the extension of Öpik's theory of close encounters, that we apply to streams described by geocentric variables. The resulting formulae are used to compute the rate at which a stream is dispersed by planetary encounters into the sporadic background. We have verified the accuracy of the analytical model using a numerical test.

  15. Micro- and nanofluidic systems in devices for biological, medical and environmental research

    NASA Astrophysics Data System (ADS)

    Evstrapov, A. A.

    2017-11-01

    The use of micro- and nanofluidic systems in modern analytical instruments allows a number of unique capabilities to be realized and ultra-high measurement sensitivity to be achieved. The possibility of manipulating individual biological objects (cells, bacteria, viruses, proteins, nucleic acids) in a liquid medium has driven the development of microchip-platform devices for methods such as chromatographic and electrophoretic analyses, the polymerase chain reaction, nucleic acid sequencing, immunoassay, and cytometric studies. Developments in micro- and nanofabrication technologies, materials science, surface chemistry, analytical chemistry and cell engineering have led to the creation of unique systems such as “lab-on-a-chip” and “human-on-a-chip”. This article discusses materials commonly used in microfluidics and methods of making functional structures. Examples are shown of integrating nanoscale structures into microfluidic devices to implement new features and improve the technical characteristics of devices and systems.

  16. Comparative analytics of infusion pump data across multiple hospital systems.

    PubMed

    Catlin, Ann Christine; Malloy, William X; Arthur, Karen J; Gaston, Cindy; Young, James; Fernando, Sudheera; Fernando, Ruchith

    2015-02-15

    A Web-based analytics system for conducting inhouse evaluations and cross-facility comparisons of alert data generated by smart infusion pumps is described. The Infusion Pump Informatics (IPI) project, a collaborative effort led by research scientists at Purdue University, was launched in 2009 to provide advanced analytics and tools for workflow analyses to assist hospitals in determining the significance of smart-pump alerts and reducing nuisance alerts. The IPI system allows facility-specific analyses of alert patterns and trends, as well as cross-facility comparisons of alert data uploaded by more than 55 participating institutions using different types of smart pumps. Tools accessible through the IPI portal include (1) charts displaying aggregated or breakout data on the top drugs associated with alerts, numbers of alerts per device or care area, and override-to-alert ratios, (2) investigative reports that can be used to characterize and analyze pump-programming errors in a variety of ways (e.g., by drug, by infusion type, by time of day), and (3) "drill-down" workflow analytics enabling users to evaluate alert patterns—both internally and in relation to patterns at other hospitals—in a quick and efficient stepwise fashion. The formation of the IPI analytics system to support a community of hospitals has been successful in providing sophisticated tools for member facilities to review, investigate, and efficiently analyze smart-pump alert data, not only within a member facility but also across other member facilities, to further enhance smart pump drug library design. Copyright © 2015 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  17. A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.

    PubMed

    Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco

    2018-01-01

    One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting of risks that may affect method capabilities and supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the tool hazard analysis and critical control points. This tool provides the possibility to find the steps in an analytical procedure with higher impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction, and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated method procedure critical control points, we conclude that the analytical methodology with the lowest risk for performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events by residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, the technique is a challenging method to implement in a quality control laboratory: It is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of circumstances or future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and can assess and manage method performance risk using such tools. This paper discusses, considering the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool provides the possibility to find the method procedural steps with higher impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the better alternative analytical methodology in residual cellular DNA analysis. © PDA, Inc. 2018.

  18. A GRAPHICAL DIAGNOSTIC METHOD FOR ASSESSING THE ROTATION IN FACTOR ANALYTICAL MODELS OF ATMOSPHERIC POLLUTION. (R831078)

    EPA Science Inventory

    Factor analytic tools such as principal component analysis (PCA) and positive matrix factorization (PMF), suffer from rotational ambiguity in the results: different solutions (factors) provide equally good fits to the measured data. The PMF model imposes non-negativity of both...

  19. LATUX: An Iterative Workflow for Designing, Validating, and Deploying Learning Analytics Visualizations

    ERIC Educational Resources Information Center

    Martinez-Maldonado, Roberto; Pardo, Abelardo; Mirriahi, Negin; Yacef, Kalina; Kay, Judy; Clayphan, Andrew

    2015-01-01

    Designing, validating, and deploying learning analytics tools for instructors or students is a challenge that requires techniques and methods from different disciplines, such as software engineering, human-computer interaction, computer graphics, educational design, and psychology. Whilst each has established its own design methodologies, we now…

  20. Microprocessor Based Temperature Control of Liquid Delivery with Flow Disturbances.

    ERIC Educational Resources Information Center

    Kaya, Azmi

    1982-01-01

    Discusses analytical design and experimental verification of a PID control value for a temperature controlled liquid delivery system, demonstrating that the analytical design techniques can be experimentally verified by using digital controls as a tool. Digital control instrumentation and implementation are also demonstrated and documented for…

  1. A thermal biosensor based on enzyme reaction.

    PubMed

    Zheng, Yi-Hua; Hua, Tse-Chao; Xu, Fei

    2005-01-01

    Application of the thermal biosensor as an analytical tool is promising due to advantages such as universality, simplicity and quick response. A novel thermal biosensor based on an enzyme reaction has been developed. This biosensor is a flow injection analysis system and consists of two channels with an enzyme reaction column and a reference column. The reference column, which is included to eliminate unspecific heat, is inactive with respect to the specific enzyme reaction of the ingredient to be detected. The specific enzyme reaction takes place in the enzyme reaction column at a constant temperature maintained by a thermoelectric thermostat. A thermal sensor based on a thermoelectric module containing 127 BiTe thermocouples in series is used to monitor the temperature difference between the two streams from the enzyme reaction column and the reference column. The analytical example for dichlorvos shows that this biosensor can be used as an analytical tool in medicine and biology.

  2. Optical Drug Monitoring: Photoacoustic Imaging of Nanosensors to Monitor Therapeutic Lithium In Vivo

    PubMed Central

    Cash, Kevin J.; Li, Chiye; Xia, Jun; Wang, Lihong V.; Clark, Heather A.

    2015-01-01

    Personalized medicine could revolutionize how primary care physicians treat chronic disease and how researchers study fundamental biological questions. To realize this goal we need to develop more robust, modular tools and imaging approaches for in vivo monitoring of analytes. In this report, we demonstrate that synthetic nanosensors can measure physiologic parameters with photoacoustic contrast, and we apply that platform to continuously track lithium levels in vivo. Photoacoustic imaging achieves imaging depths that are unattainable with fluorescence or multiphoton microscopy. We validated the photoacoustic results that illustrate the superior imaging depth and quality of photoacoustic imaging with optical measurements. This powerful combination of techniques will unlock the ability to measure analyte changes in deep tissue and will open up photoacoustic imaging as a diagnostic tool for continuous physiological tracking of a wide range of analytes. PMID:25588028

  3. Optical drug monitoring: photoacoustic imaging of nanosensors to monitor therapeutic lithium in vivo.

    PubMed

    Cash, Kevin J; Li, Chiye; Xia, Jun; Wang, Lihong V; Clark, Heather A

    2015-02-24

    Personalized medicine could revolutionize how primary care physicians treat chronic disease and how researchers study fundamental biological questions. To realize this goal, we need to develop more robust, modular tools and imaging approaches for in vivo monitoring of analytes. In this report, we demonstrate that synthetic nanosensors can measure physiologic parameters with photoacoustic contrast, and we apply that platform to continuously track lithium levels in vivo. Photoacoustic imaging achieves imaging depths that are unattainable with fluorescence or multiphoton microscopy. We validated the photoacoustic results that illustrate the superior imaging depth and quality of photoacoustic imaging with optical measurements. This powerful combination of techniques will unlock the ability to measure analyte changes in deep tissue and will open up photoacoustic imaging as a diagnostic tool for continuous physiological tracking of a wide range of analytes.

  4. Bright solitons in non-equilibrium coherent quantum matter

    PubMed Central

    Pinsker, F.; Flayac, H.

    2016-01-01

    We theoretically demonstrate a mechanism for bright soliton generation in spinor non-equilibrium Bose–Einstein condensates made of atoms or quasi-particles such as polaritons in semiconductor microcavities. We give analytical expressions for bright (half) solitons as minimizing functions of a generalized non-conservative Lagrangian elucidating the unique features of inter and intra-competition in non-equilibrium systems. The analytical results are supported by a detailed numerical analysis that further shows the rich soliton dynamics inferred by their instability and mutual cross-interactions. PMID:26997892

  5. Dual nozzle aerodynamic and cooling analysis study

    NASA Technical Reports Server (NTRS)

    Meagher, G. M.

    1981-01-01

    Analytical models to predict performance and operating characteristics of dual nozzle concepts were developed and improved. Aerodynamic models are available to define flow characteristics and bleed requirements for both the dual throat and dual expander concepts. Advanced analytical techniques were utilized to provide quantitative estimates of the bleed flow, boundary layer, and shock effects within dual nozzle engines. Thermal analyses were performed to define cooling requirements for baseline configurations, and special studies of unique dual nozzle cooling problems defined feasible means of achieving adequate cooling.

  6. Improving a complex finite-difference ground water flow model through the use of an analytic element screening model

    USGS Publications Warehouse

    Hunt, R.J.; Anderson, M.P.; Kelson, V.A.

    1998-01-01

    This paper demonstrates that analytic element models have potential as powerful screening tools that can facilitate or improve calibration of more complicated finite-difference and finite-element models. We demonstrate how a two-dimensional analytic element model was used to identify errors in a complex three-dimensional finite-difference model caused by incorrect specification of boundary conditions. An improved finite-difference model was developed using boundary conditions developed from a far-field analytic element model. Calibration of a revised finite-difference model was achieved using fewer zones of hydraulic conductivity and lake bed conductance than the original finite-difference model. Calibration statistics were also improved in that simulated base-flows were much closer to measured values. The improved calibration is due mainly to improved specification of the boundary conditions made possible by first solving the far-field problem with an analytic element model.

  7. The Efficacy of Violence Prediction: A Meta-Analytic Comparison of Nine Risk Assessment Tools

    ERIC Educational Resources Information Center

    Yang, Min; Wong, Stephen C. P.; Coid, Jeremy

    2010-01-01

    Actuarial risk assessment tools are used extensively to predict future violence, but previous studies comparing their predictive accuracies have produced inconsistent findings as a result of various methodological issues. We conducted meta-analyses of the effect sizes of 9 commonly used risk assessment tools and their subscales to compare their…

  8. Data Visualization: An Exploratory Study into the Software Tools Used by Businesses

    ERIC Educational Resources Information Center

    Diamond, Michael; Mattia, Angela

    2017-01-01

    Data visualization is a key component to business and data analytics, allowing analysts in businesses to create tools such as dashboards for business executives. Various software packages allow businesses to create these tools in order to manipulate data for making informed business decisions. The focus is to examine what skills employers are…

  9. Data Visualization: An Exploratory Study into the Software Tools Used by Businesses

    ERIC Educational Resources Information Center

    Diamond, Michael; Mattia, Angela

    2015-01-01

    Data visualization is a key component to business and data analytics, allowing analysts in businesses to create tools such as dashboards for business executives. Various software packages allow businesses to create these tools in order to manipulate data for making informed business decisions. The focus is to examine what skills employers are…

  10. Process monitoring and visualization solutions for hot-melt extrusion: a review.

    PubMed

    Saerens, Lien; Vervaet, Chris; Remon, Jean Paul; De Beer, Thomas

    2014-02-01

    Hot-melt extrusion (HME) is applied as a continuous pharmaceutical manufacturing process for the production of a variety of dosage forms and formulations. To ensure the continuity of this process, the quality of the extrudates must be assessed continuously during manufacturing. The objective of this review is to provide an overview and evaluation of the available process analytical techniques which can be applied in hot-melt extrusion. Pharmaceutical extruders are equipped with traditional (univariate) process monitoring tools, observing barrel and die temperatures, throughput, screw speed, torque, drive amperage, melt pressure and melt temperature. The relevance of several spectroscopic process analytical techniques for monitoring and control of pharmaceutical HME has been explored recently. Nevertheless, many other sensors visualizing HME and measuring diverse critical product and process parameters with potential use in pharmaceutical extrusion are available, and were thoroughly studied in polymer extrusion. The implementation of process analytical tools in HME serves two purposes: (1) improving process understanding by monitoring and visualizing the material behaviour and (2) monitoring and analysing critical product and process parameters for process control, making it possible to maintain a desired process state and guarantee the quality of the end product. This review is the first to provide an evaluation of the process analytical tools applied for pharmaceutical HME monitoring and control, and discusses techniques that have been used in polymer extrusion that have potential for monitoring and control of pharmaceutical HME. © 2013 Royal Pharmaceutical Society.

  11. METABOLOMICS AS A DIAGNOSTIC TOOL FOR SMALL FISH TOXICOLOGY RESEARCH

    EPA Science Inventory

    Metabolomics involves the application of advanced analytical and statistical tools to profile changes in levels of endogenous metabolites in tissues and biofluids resulting from disease onset or stress. While certain metabolites are being specifically targeted in these studies, w...

  12. Analytical tools for identifying bicycle route suitability, coverage, and continuity.

    DOT National Transportation Integrated Search

    2012-05-01

    This report presents new tools created to assess bicycle suitability using geographic information systems (GIS). Bicycle suitability is a rating of how appropriate a roadway is for bicycle travel based on attributes of the roadway, such as vehi...

  13. Analytic regularization of uniform cubic B-spline deformation fields.

    PubMed

    Shackleford, James A; Yang, Qi; Lourenço, Ana M; Shusharina, Nadya; Kandasamy, Nagarajan; Sharp, Gregory C

    2012-01-01

    Image registration is inherently ill-posed, and lacks a unique solution. In the context of medical applications, it is desirable to avoid solutions that describe physically unsound deformations within the patient anatomy. Among the accepted methods of regularizing non-rigid image registration to provide solutions applicable to medical practice is the penalty of thin-plate bending energy. In this paper, we develop an exact, analytic method for computing the bending energy of a three-dimensional B-spline deformation field as a quadratic matrix operation on the spline coefficient values. Results presented on ten thoracic case studies indicate the analytic solution is between 61 and 1371 times faster than a numerical central differencing solution.
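    As a generic illustration of the "quadratic matrix operation" idea (not the paper's actual operator), the bending-energy penalty of a spline deformation can be evaluated as a quadratic form in the coefficient vector:

        import numpy as np

        rng = np.random.default_rng(0)
        n = 12                            # number of B-spline coefficients (toy size)
        c = rng.normal(size=n)            # stacked coefficient values of the deformation

        # Stand-in for the precomputed symmetric positive semi-definite bending matrix;
        # in the paper it is assembled analytically from integrals of B-spline basis
        # second derivatives, which is not reproduced here.
        M = rng.normal(size=(n, n))
        K = M @ M.T

        bending_energy = c @ K @ c        # S = c^T K c, no numerical differencing needed
        print(bending_energy)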

  14. Resilience Simulation for Water, Power & Road Networks

    NASA Astrophysics Data System (ADS)

    Clark, S. S.; Seager, T. P.; Chester, M.; Eisenberg, D. A.; Sweet, D.; Linkov, I.

    2014-12-01

    The increasing frequency, scale, and damages associated with recent catastrophic events has called for a shift in focus from evading losses through risk analysis to improving threat preparation, planning, absorption, recovery, and adaptation through resilience. However, neither underlying theory nor analytic tools have kept pace with resilience rhetoric. As a consequence, current approaches to engineering resilience analysis often conflate resilience and robustness or collapse into a deeper commitment to the risk analytic paradigm proven problematic in the first place. This research seeks a generalizable understanding of resilience that is applicable in multiple disciplinary contexts. We adopt a unique investigative perspective by coupling social and technical analysis with human subjects research to discover the adaptive actions, ideas and decisions that contribute to resilience in three socio-technical infrastructure systems: electric power, water, and roadways. Our research integrates physical models representing network objects with examination of the knowledge systems and social interactions revealed by human subjects making decisions in a simulated crisis environment. To ensure a diversity of contexts, we model electric power, water, roadway and knowledge networks for Phoenix AZ and Indianapolis IN. We synthesize this in a new computer-based Resilient Infrastructure Simulation Environment (RISE) to allow individuals, groups (including students) and experts to test different network design configurations and crisis response approaches. By observing simulated failures and best performances, we expect a generalizable understanding of resilience may emerge that yields a measureable understanding of the sensing, anticipating, adapting, and learning processes that are essential to resilient organizations.

  15. Risk analytics for hedge funds

    NASA Astrophysics Data System (ADS)

    Pareek, Ankur

    2005-05-01

    The rapid growth of the hedge fund industry presents a significant business opportunity for institutional investors, particularly in the form of portfolio diversification. To facilitate this, there is a need to develop a new set of risk analytics for investments consisting of hedge funds, with the ultimate aim of creating transparency in risk measurement without compromising the proprietary investment strategies of hedge funds. As well documented in the literature, the use of dynamic, option-like strategies by most hedge funds makes their returns highly non-normal with fat tails and high kurtosis, thus rendering Value at Risk (VaR) and other mean-variance analysis methods unsuitable for hedge fund risk quantification. This paper looks at some unique concerns for hedge fund risk management and concentrates on two approaches from the physical world to model the non-linearities and dynamic correlations in hedge fund portfolio returns: Self Organizing Criticality (SOC) and Random Matrix Theory (RMT). Random Matrix Theory analyzes the correlation matrix between different hedge fund styles and filters random noise from genuine correlations arising from interactions within the system. As seen in the results of the portfolio risk analysis, it leads to better portfolio risk forecastability and thus to optimum allocation of resources to different hedge fund styles. The results also prove the efficacy of self-organized criticality and implied portfolio correlation as tools for risk management and style selection for portfolios of hedge funds, being particularly effective during non-linear market crashes.
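    A minimal sketch of the RMT filtering step described above — comparing the eigenvalues of an empirical correlation matrix with the Marchenko-Pastur bound expected for purely random data (synthetic returns, illustrative only):

        import numpy as np

        rng = np.random.default_rng(1)
        T, N = 500, 20                        # observations x hedge-fund style series
        returns = rng.normal(size=(T, N))     # synthetic standardized return series

        corr = np.corrcoef(returns, rowvar=False)
        eigvals = np.linalg.eigvalsh(corr)

        # Marchenko-Pastur upper edge for the eigenvalues of a random correlation matrix.
        q = N / T
        lambda_max = (1 + np.sqrt(q)) ** 2

        genuine = eigvals[eigvals > lambda_max]   # eigenvalues carrying non-random structure
        print(f"MP upper edge: {lambda_max:.3f}; eigenvalues above it: {np.round(genuine, 3)}")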

  16. Integration of Gas Chromatography Mass Spectrometry Methods for Differentiating Ricin Preparation Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wunschel, David S.; Melville, Angela M.; Ehrhardt, Christopher J.

    2012-05-17

    The investigation of crimes involving chemical or biological agents is infrequent, but presents unique analytical challenges. The protein toxin ricin is encountered more frequently than other agents and is found in the seeds of the castor plant Ricinus communis. Typically, the toxin is extracted from castor seeds utilizing a variety of different recipes that result in varying purity of the toxin. Moreover, these various purification steps can also leave or differentially remove a variety of exogenous and endogenous residual components with the toxin that may indicate the type and number of purification steps involved. We have applied three gas chromatographic - mass spectrometric (GC-MS) based analytical methods to measure the variation in seed carbohydrates and castor oil ricinoleic acid as well as the presence of solvents used for purification. These methods were applied to the same samples prepared using four previously identified toxin preparation methods starting from four varieties of castor seeds. The individual data sets for seed carbohydrate profiles, ricinoleic acid or acetone amount each provided information capable of differentiating different types of toxin preparations across seed types. However, the integration of the data sets using multivariate factor analysis provided a clear distinction of all samples based on the preparation method and independent of the seed source. In particular the abundance of mannose, arabinose, fucose, ricinoleic acid and acetone were shown to be important differentiating factors. These complementary tools provide a more confident determination of the method of toxin preparation.
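    A generic sketch of the data-integration idea — combining features from the different assays into a single matrix and projecting the samples onto a few factors (plain PCA via the singular value decomposition; the study used multivariate factor analysis, and the feature values below are made up):

        import numpy as np

        # Rows = toxin preparations, columns = measured features
        # (e.g., mannose, arabinose, fucose, ricinoleic acid, acetone), made-up values.
        X = np.array([[5.1, 2.0, 0.8, 9.5, 0.1],
                      [4.8, 2.2, 0.7, 9.1, 0.2],
                      [1.2, 0.5, 0.2, 3.3, 4.8],
                      [1.0, 0.6, 0.3, 3.0, 5.1]])

        Xs = (X - X.mean(axis=0)) / X.std(axis=0)       # autoscale each feature
        U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
        scores = U * S                                   # sample scores on the factors

        print(np.round(scores[:, :2], 2))   # a two-factor projection separates the two groups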

  17. Dried haematic microsamples and LC-MS/MS for the analysis of natural and synthetic cannabinoids.

    PubMed

    Protti, Michele; Rudge, James; Sberna, Angelo Eliseo; Gerra, Gilberto; Mercolini, Laura

    2017-02-15

    Synthetic cannabinoids are new psychoactive substances (NPS) with effects similar to those of the natural ones found in Cannabis derivatives. They have rapidly entered the illicit market, often sold as alternatives to substances under international control. The need to identify and quantify an unprecedented and growing number of new compounds represents a unique challenge for toxicological, forensic and anti-doping analysis. Dried blood spots (DBS) have been used within the bioanalytical framework in place of plasma or serum, in order to reduce invasiveness, lower sample size, simplify handling, storage and shipping of samples and to facilitate home-based and on-field applications. However, DBS implementation has been limited mainly by concerns related to the haematocrit effect on method accuracy. Volumetric absorptive microsampling (VAMS™), a second-generation dried miniaturized sampling technology, has been developed precisely to eliminate the haematocrit effect, thus providing accurate sampling while still allowing feasible sample processing. An original LC-MS/MS method was herein developed and validated for the analysis of THC and its 2 main metabolites, together with 10 representative synthetic cannabinoids, in both DBS and VAMS dried microsamples. The ultimate goal of this work is to provide highly innovative DBS and VAMS analytical protocols, whose performances were extensively optimized and compared, in order to provide effective and alternative tools that can be applied for natural and synthetic cannabinoid determination in place of classical analytical strategies. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. New developments of X-ray fluorescence imaging techniques in laboratory

    NASA Astrophysics Data System (ADS)

    Tsuji, Kouichi; Matsuno, Tsuyoshi; Takimoto, Yuki; Yamanashi, Masaki; Kometani, Noritsugu; Sasaki, Yuji C.; Hasegawa, Takeshi; Kato, Shuichi; Yamada, Takashi; Shoji, Takashi; Kawahara, Naoki

    2015-11-01

    X-ray fluorescence (XRF) analysis is a well-established analytical technique with a long research history. Many applications have been reported in various fields, such as the environmental, archeological, biological, and forensic sciences as well as industry. This is because XRF has the unique advantage of being a nondestructive analytical tool with good precision for quantitative analysis. Recent advances in XRF analysis have been realized by the development of new x-ray optics and x-ray detectors. Advanced x-ray focusing optics enable the creation of a micro x-ray beam, leading to micro-XRF analysis and XRF imaging. A confocal micro-XRF technique has been applied for the visualization of elemental distributions inside samples. This technique was applied to liquid samples and to monitoring chemical reactions such as the metal corrosion of steel samples in NaCl solutions. In addition, principal component analysis was applied to reduce the background intensity in XRF spectra obtained during XRF mapping, leading to improved spatial resolution of confocal micro-XRF images. In parallel, the authors have proposed a wavelength dispersive XRF (WD-XRF) imaging spectrometer for fast elemental imaging. A new two-dimensional x-ray detector, the Pilatus detector, was applied for WD-XRF imaging. Fast XRF imaging in 1 s or even less was demonstrated for Euro coins and industrial samples. In this review paper, these recent advances in XRF imaging, especially in a laboratory setting, will be introduced.

  19. Applications of a broad-spectrum tool for conservation and fisheries analysis: aquatic gap analysis

    USGS Publications Warehouse

    McKenna, James E.; Steen, Paul J.; Lyons, John; Stewart, Jana S.

    2009-01-01

    Natural resources support all of our social and economic activities, as well as our biological existence. Humans have little control over most of the physical, biological, and sociological conditions dictating the status and capacity of natural resources in any particular area. However, the most rapid and threatening influences on natural resources typically are anthropogenic overuse and degradation. In addition, living natural resources (i.e., organisms) do not respect political boundaries, but are aware of their optimal habitat and environmental conditions. Most organisms have wider spatial ranges than the jurisdictional boundaries of environmental agencies that deal with them; even within those jurisdictions, information is patchy and disconnected. Planning and projecting effects of ecological management are difficult, because many organisms, habitat conditions, and interactions are involved. Conservation and responsible resource use involves wise management and manipulation of the aspects of the environment and biological communities that can be effectively changed. Tools and data sets that provide new insights and analysis capabilities can enhance the ability of resource managers to make wise decisions and plan effective, long-term management strategies. Aquatic gap analysis has been developed to provide those benefits. Gap analysis is more than just the assessment of the match or mis-match (i.e., gaps) between habitats of ecological value and areas with an appropriate level of environmental protection (e.g., refuges, parks, preserves), as the name suggests. Rather, a Gap Analysis project is a process which leads to an organized database of georeferenced information and previously available tools to examine conservation and other ecological issues; it provides a geographic analysis platform that serves as a foundation for aquatic ecological studies. This analytical tool box allows one to conduct assessments of all habitat elements within an area of interest. Aquatic gap analysis naturally focuses on aquatic habitats. The analytical tools are largely based on specification of the species-habitat relations for the system and organism group of interest (Morrison et al. 2003; McKenna et al. 2006; Steen et al. 2006; Sowa et al. 2007). The Great Lakes Regional Aquatic Gap Analysis (GLGap) project focuses primarily on lotic habitat of the U.S. Great Lakes drainage basin and associated states and has been developed to address fish and fisheries issues. These tools are unique because they allow us to address problems at a range of scales from the region to the stream segment and include the ability to predict species specific occurrence or abundance for most of the fish species in the study area. The results and types of questions that can be addressed provide better global understanding of the ecological context within which specific natural resources fit (e.g., neighboring environments and resources, and large and small scale processes). The geographic analysis platform consists of broad and flexible geospatial tools (and associated data) with many potential applications. The objectives of this article are to provide a brief overview of GLGap methods and analysis tools, and demonstrate conservation and planning applications of those data and tools. 
Although there are many potential applications, we will highlight just three: (1) support for the Eastern Brook Trout Joint Venture (EBTJV), (2) Aquatic Life classification in Wisconsin, and (3) an educational tool that makes use of Google Earth (use of trade or product names does not imply endorsement by the U.S. Government) and Internet accessibility.

  20. Integration Process for Payloads in the Fluids and Combustion Facility

    NASA Technical Reports Server (NTRS)

    Free, James M.; Nall, Marsha M.

    2001-01-01

    The Fluids and Combustion Facility (FCF) is an ISS research facility located in the United States Laboratory (US Lab), Destiny. The FCF is a multi-discipline facility that performs microgravity research primarily in fluids physics science and combustion science. This facility remains on-orbit and provides accommodations to multi-user and Principal Investigator (PI)-unique hardware. The FCF is designed to accommodate 15 PIs per year. In order to allow for this number of payloads per year, the FCF has developed an end-to-end analytical and physical integration process. The process includes provision of integration tools, products and interface management throughout the life of the payload. The payload is provided with a single point of contact from the facility and works with that interface from PI selection through post-flight processing. The process utilizes electronic tools for creation of interface documents/agreements, storage of payload data and rollup for facility submittals to ISS. Additionally, the process provides integration to and testing with flight-like simulators prior to payload delivery to KSC. These simulators allow the payload to be tested in the flight configuration and final facility interface and science verifications to be performed. The process also provides for support to the payload from the FCF through the Payload Safety Review Panel (PSRP). Finally, the process includes support in the development of operational products and the operation of the payload on-orbit.

  1. DataUp: Helping manage and archive data within the researcher's workflow

    NASA Astrophysics Data System (ADS)

    Strasser, C.

    2012-12-01

    There are many barriers to data management and sharing among earth and environmental scientists; among the most significant are lacks of knowledge about best practices for data management, metadata standards, or appropriate data repositories for archiving and sharing data. We have developed an open-source add-in for Excel and an open source web application intended to help researchers overcome these barriers. DataUp helps scientists to (1) determine whether their file is CSV compatible, (2) generate metadata in a standard format, (3) retrieve an identifier to facilitate data citation, and (4) deposit their data into a repository. The researcher does not need a prior relationship with a data repository to use DataUp; the newly implemented ONEShare repository, a DataONE member node, is available for any researcher to archive and share their data. By meeting researchers where they already work, in spreadsheets, DataUp becomes part of the researcher's workflow and data management and sharing becomes easier. Future enhancement of DataUp will rely on members of the community adopting and adapting the DataUp tools to meet their unique needs, including connecting to analytical tools, adding new metadata schema, and expanding the list of connected data repositories. DataUp is a collaborative project between Microsoft Research Connections, the University of California's California Digital Library, the Gordon and Betty Moore Foundation, and DataONE.

  2. MAMA User Guide v2.0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gaschen, Brian Keith; Bloch, Jeffrey Joseph; Porter, Reid

    Morphological signatures of bulk SNM materials have significant promise, but these potential signatures are not fully utilized. This document describes software tools, collectively called the MAMA (Morphological Analysis for Material Attribution) software, that can help provide robust and accurate quantification of morphological features in bulk material microscopy images (optical, SEM). Although many of the specific tools are not unique to MAMA, the software package has been designed specifically for nuclear material morphological analysis and is at a point where it can be easily adapted (by Los Alamos or by collaborators) in response to new, different, or changing forensics needs. The current release of the MAMA software includes only the image quantification, description, and annotation functionality. Only limited information on a sample, its pedigree, and its chemistry is recorded inside this part of the software. This was a decision based on initial feedback and on the fact that there are several analytical chemistry databases being developed within the community. Currently MAMA is a standalone program that can export quantification results in a basic text format that can be imported into other programs such as Excel and Access. There is also a basic report-generating feature that produces HTML-formatted pages of the same information. We will be working with collaborators to provide better integration of MAMA into their particular systems, databases, and workflows.

  3. Laser induced breakdown spectroscopy (LIBS) as a rapid tool for material analysis

    NASA Astrophysics Data System (ADS)

    Hussain, T.; Gondal, M. A.

    2013-06-01

    Laser induced breakdown spectroscopy (LIBS) is a novel technique for elemental analysis based on laser-generated plasma. In this technique, laser pulses ablate the sample, resulting in vaporization and ionization of the sample in a hot plasma that is then analyzed by a spectrometer. The elements are identified by their unique spectral signatures. A LIBS system was developed for elemental analysis of solid and liquid samples. The developed system was applied for qualitative as well as quantitative measurement of the elemental concentrations present in iron slag and open-pit ore samples. The plasma was generated by focusing a pulsed Nd:YAG laser at 1064 nm on test samples to study the capabilities of LIBS as a rapid tool for material analysis. The concentrations of various elements of environmental significance, such as cadmium, calcium, magnesium, chromium, manganese, titanium, barium, phosphorus, copper, iron, and zinc, in these samples were determined. Optimal experimental conditions for improving the sensitivity of the developed LIBS system were evaluated through a parametric dependence study. The LIBS results were compared with results obtained using a standard analytical technique, inductively coupled plasma emission spectroscopy (ICP). Limits of detection (LOD) of the LIBS system were also estimated for the above-mentioned elements. This study demonstrates that LIBS could be highly appropriate for rapid online analysis of iron slag and open-pit waste.
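
    A common way to estimate the limits of detection mentioned above is the 3-sigma criterion, LOD = 3·s_blank / m, where s_blank is the standard deviation of repeated blank (background) readings and m is the slope of the calibration curve for the emission line of interest. The sketch below uses illustrative numbers, not the authors' data.

        import numpy as np

        # Calibration data for one emission line (illustrative values only)
        concentration = np.array([0.0, 5.0, 10.0, 20.0, 40.0])    # ppm
        intensity = np.array([12.0, 55.0, 98.0, 190.0, 370.0])    # arbitrary counts

        # Slope of the linear calibration curve
        slope, intercept = np.polyfit(concentration, intensity, 1)

        # Standard deviation of repeated blank measurements
        blank_readings = np.array([11.2, 12.5, 11.8, 12.9, 11.6])
        s_blank = blank_readings.std(ddof=1)

        # 3-sigma limit of detection
        lod = 3.0 * s_blank / slope
        print(f"slope = {slope:.2f} counts/ppm, LOD = {lod:.2f} ppm")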

  4. Analytics to Better Interpret and Use Large Amounts of Heterogeneous Data

    NASA Astrophysics Data System (ADS)

    Mathews, T. J.; Baskin, W. E.; Rinsland, P. L.

    2014-12-01

    Data scientists at NASA's Atmospheric Science Data Center (ASDC) are seasoned software application developers who have worked with the creation, archival, and distribution of large datasets (multiple terabytes and larger). In order for ASDC data scientists to implement the most efficient processes for cataloging and organizing data access applications, they must be intimately familiar with the data contained in the datasets with which they are working. Key technologies that are critical components of the background of ASDC data scientists include large RDBMSs (relational database management systems) and NoSQL databases; web services; service-oriented architectures; structured and unstructured data access; and processing algorithms. However, as prices of data storage and processing decrease, sources of data increase, and technologies advance, granting more people access to data in real or near-real time, data scientists are being pressured to accelerate their ability to identify and analyze vast amounts of data. With existing tools this is becoming exceedingly challenging to accomplish. For example, NASA's Earth Science Data and Information System (ESDIS) alone grew from just over 4 PB of data in 2009 to nearly 6 PB in 2011, and to roughly 10 PB in 2013. With data from at least ten new missions to be added to the ESDIS holdings by 2017, the current volume will continue to grow exponentially and drive the need to analyze more data even faster. Though there are many highly efficient, off-the-shelf analytics tools available, these tools mainly cater to business data, which is predominantly unstructured. Consequently, very few analytics tools interface well to archived Earth science data, which is predominantly heterogeneous and structured. This presentation will identify use cases for data analytics from an Earth science perspective in order to begin to identify specific tools that may be able to address those challenges.

  5. Benefits and limitations of using decision analytic tools to assess uncertainty and prioritize Landscape Conservation Cooperative information needs

    USGS Publications Warehouse

    Post van der Burg, Max; Cullinane Thomas, Catherine; Holcombe, Tracy R.; Nelson, Richard D.

    2016-01-01

    The Landscape Conservation Cooperatives (LCCs) are a network of partnerships throughout North America that are tasked with integrating science and management to support more effective delivery of conservation at a landscape scale. In order to achieve this integration, some LCCs have adopted the approach of providing their partners with better scientific information in an effort to facilitate more effective and coordinated conservation decisions. Taking this approach has led many LCCs to begin funding research to provide the information for improved decision making. To ensure that funding goes to research projects with the highest likelihood of leading to more integrated, broad-scale conservation, some LCCs have also developed approaches for prioritizing which information needs will be of most benefit to their partnerships. We describe two case studies in which decision analytic tools were used to quantitatively assess the relative importance of information for decisions made by partners in the Plains and Prairie Potholes LCC. The results of the case studies point toward a few valuable lessons about using these tools with LCCs. Decision analytic tools tend to shift the focus away from research-oriented discussions and toward discussions about how information is used to make better decisions. However, many technical experts do not have enough knowledge about decision-making contexts to fully inform the latter type of discussion. When assessed in the right decision context, decision analyses can show where uncertainties actually affect optimal decisions and where they do not. This helps technical experts understand that not all research is valuable for improving decision making. Perhaps most importantly, our results suggest that decision analytic tools may be more useful to LCCs as a way of developing integrated objectives for coordinating partner decisions across the landscape, rather than simply ranking research priorities.
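
    A standard decision analytic quantity behind this kind of prioritization is the expected value of perfect information (EVPI): the expected gain from resolving an uncertainty before committing to a management action. The sketch below uses made-up payoffs, not the case-study data; an EVPI near zero indicates that the uncertainty does not change the optimal decision, so research aimed at resolving it ranks low.

        import numpy as np

        # Hypothetical payoff table: rows = management actions, columns = uncertain states
        payoff = np.array([
            [60.0, 20.0],   # action A under state 1, state 2
            [40.0, 45.0],   # action B under state 1, state 2
        ])
        p_state = np.array([0.6, 0.4])  # current belief about which state holds

        # Acting now, under uncertainty: choose the action with the best expected payoff
        ev_now = (payoff @ p_state).max()

        # With perfect information: learn the state first, then pick the best action for it
        ev_perfect = (payoff.max(axis=0) * p_state).sum()

        evpi = ev_perfect - ev_now
        print(f"EV now = {ev_now:.1f}, EV with perfect info = {ev_perfect:.1f}, EVPI = {evpi:.1f}")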

  6. An overview of city analytics

    PubMed Central

    Higham, Desmond J.; Batty, Michael; Bettencourt, Luís M. A.; Greetham, Danica Vukadinović; Grindrod, Peter

    2017-01-01

    We introduce the 14 articles in the Royal Society Open Science themed issue on City Analytics. To provide a high-level, strategic overview, we summarize the topics addressed and the analytical tools deployed. We then give a more detailed account of the individual contributions. Our overall aims are (i) to highlight exciting advances in this emerging, interdisciplinary field, (ii) to encourage further activity and (iii) to emphasize the variety of new, public-domain datasets that are available to researchers. PMID:28386454

  7. TECRA Unique test for rapid detection of Salmonella in food: collaborative study.

    PubMed

    Hughes, D; Dailianis, A E; Hill, L; McIntyre, D A; Anderson, A

    2001-01-01

    The TECRA Unique Salmonella test uses the principle of immunoenrichment to allow rapid detection of salmonellae in food. A collaborative study was conducted to compare the TECRA Salmonella Unique test with the reference culture method given in the U.S. Food and Drug Administration's Bacteriological Analytical Manual. Three food types (milk powder, pepper, and soy flour) were analyzed in Australia and two food types (milk chocolate and dried egg) were analyzed in the United States. Forty-one collaborators participated in the study. For each of the 5 foods at each of the 3 levels, a comparison using the chi-square test for independence with continuity correction showed no significant difference (p ≥ 0.05) between the proportion of positive test samples for the Unique test and that for the reference method.
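
    The method comparison described above is a standard 2x2 chi-square test of independence with Yates' continuity correction; a minimal sketch (made-up counts, not the study's results) follows.

        from scipy.stats import chi2_contingency

        # 2x2 table of positive / negative results (made-up counts, not the study's data)
        #               positive  negative
        observed = [[18, 12],     # TECRA Unique test
                    [20, 10]]     # BAM reference culture method

        chi2, p_value, dof, expected = chi2_contingency(observed, correction=True)
        print(f"chi2 = {chi2:.3f}, p = {p_value:.3f}")
        # p >= 0.05 would indicate no significant difference between the two methods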

  8. Sigma Metrics Across the Total Testing Process.

    PubMed

    Charuruks, Navapun

    2017-03-01

    Laboratory quality control has been developed over several decades to ensure patients' safety, broadening from a statistical quality control focus on the analytical phase to the total laboratory process. The sigma concept provides a convenient way to quantify the number of errors in the extra-analytical and analytical phases through defects per million (DPM) and the sigma metric equation. Participation in a sigma verification program can be a convenient way to monitor analytical performance for continuous quality improvement. Improvement of sigma-scale performance has been shown from our data. New tools and techniques for integration are needed. Copyright © 2016 Elsevier Inc. All rights reserved.
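
    For the analytical phase, the sigma metric referred to above is commonly computed as sigma = (TEa - |bias|) / CV, with all terms in percent; extra-analytical steps are usually converted from a defect rate (defects per million) via the normal quantile plus the conventional 1.5-sigma shift. The sketch below uses these standard formulas with illustrative numbers, not the article's data.

        from scipy.stats import norm

        def sigma_analytical(tea_pct, bias_pct, cv_pct):
            """Sigma metric for the analytical phase: (TEa - |bias|) / CV, all in percent."""
            return (tea_pct - abs(bias_pct)) / cv_pct

        def sigma_from_dpm(dpm):
            """Convert defects per million to a sigma level, including the 1.5-sigma shift."""
            return norm.isf(dpm / 1_000_000) + 1.5

        # Illustrative numbers only
        print(f"analytical sigma = {sigma_analytical(10.0, 2.0, 2.0):.1f}")          # -> 4.0
        print(f"extra-analytical sigma at 1350 DPM = {sigma_from_dpm(1350):.1f}")    # ~ 4.5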

  9. Light aircraft crash safety program

    NASA Technical Reports Server (NTRS)

    Thomson, R. G.; Hayduk, R. J.

    1974-01-01

    NASA is embarked upon research and development tasks aimed at providing the general aviation industry with a reliable crashworthy airframe design technology. The goals of the NASA program are: reliable analytical techniques for predicting the nonlinear behavior of structures; significant design improvements of airframes; and simulated full-scale crash test data. The analytical tools will include both simplified procedures for estimating energy absorption characteristics and more complex computer programs for analysis of general airframe structures under crash loading conditions. The analytical techniques being developed both in-house and under contract are described, and a comparison of some analytical predictions with experimental results is shown.

  10. Optimal rendezvous in the neighborhood of a circular orbit

    NASA Technical Reports Server (NTRS)

    Jones, J. B.

    1975-01-01

    The minimum velocity change rendezvous solutions, when the motion may be linearized about a circular orbit, fall into two separate regions; the phase-for-free region and the general region. Phase-for-free solutions are derived from the optimum transfer solutions, require the same velocity change expenditure, but may not be unique. Analytic solutions are presented in two of the three subregions. An algorithm is presented for determining the unique solutions in the general region. Various sources of initial conditions are discussed and three examples presented.
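
    For context, the usual linearization of relative motion about a circular reference orbit is the Clohessy-Wiltshire (Hill) system, written here in its standard form (x radial, y along-track, z cross-track, n the mean motion of the reference orbit); this is stated as background, not as a reproduction of the paper's derivation.

        \begin{aligned}
        \ddot{x} - 3n^{2}x - 2n\dot{y} &= 0,\\
        \ddot{y} + 2n\dot{x} &= 0,\\
        \ddot{z} + n^{2}z &= 0.
        \end{aligned}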

  11. Patent databases and analytical tools for space technology commercialization (Part 2)

    NASA Astrophysics Data System (ADS)

    Hulsey, William N., III

    2002-07-01

    A shift in the space industry has occurred that requires technology developers to understand the basics of intellectual property law. Global harmonization facilitates this understanding, and Internet-based tools make it easier to track these rights and the facts affecting them.

  12. Geospatial methods and data analysis for assessing distribution of grazing livestock

    USDA-ARS?s Scientific Manuscript database

    Free-ranging livestock research must begin with a well-conceived problem statement and employ appropriate data acquisition tools and analytical techniques to accomplish the research objective. These requirements are especially critical in addressing animal distribution. Tools and statistics used t...

  13. Integration of bus stop counts data with census data for improving bus service.

    DOT National Transportation Integrated Search

    2016-04-01

    This research project produced an open source transit market data visualization and analysis tool suite, The Bus Transit Market Analyst (BTMA), which contains user-friendly GIS mapping and data analytics tools, and state-of-the-art transit demand...

  14. A theoretical study on pure bending of hexagonal close-packed metal sheet

    NASA Astrophysics Data System (ADS)

    Mehrabi, Hamed; Yang, Chunhui

    2018-05-01

    Hexagonal close-packed (HCP) metals have quite different mechanical behaviours in comparison to conventional cubic metals such as steels and aluminum alloys [1, 2]. They exhibit a significant tension-compression asymmetry in initial yielding and subsequent plastic hardening. This unique behaviour can be attributed to their limited crystal symmetry, which leads to twinning deformation [3-5]. It strongly influences sheet metal forming of such metals, especially roll forming, in which bending is dominant. Hence, it is crucial to represent the constitutive relations of HCP metals accurately to estimate bending moment-curvature behaviours. In this paper, an analytical model for asymmetric elastoplastic pure bending based on the Cazacu-Barlat asymmetric yield function [6] is presented. This yield function captures the asymmetric tension-compression behaviour of HCP metals by using the second and third invariants of the stress deviator tensor and a specified constant, which can be expressed in terms of the uniaxial yield stresses in tension and compression. As a case study, the analytical model is applied to predict the moment-curvature behaviour of AZ31B magnesium alloy sheets under uniaxial loading. Furthermore, the analytical model is implemented as a user-defined material through the UMAT interface in Abaqus [7, 8] for pure bending simulations. The results show that the analytical model can reasonably capture the asymmetric tension-compression behaviour of the magnesium alloy, and the predicted moment-curvature behaviour agrees well with the experimental results. The numerical results also show better accuracy with the Cazacu-Barlat yield function than with the von Mises yield function, whose results are more conservative than the analytical ones.
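
    For context, the isotropic form of the Cazacu-Barlat criterion cited as [6] is usually written in terms of the second and third invariants of the stress deviator, with the constant fixed by the uniaxial yield stresses in tension and compression; the form below is our reading of that standard statement and may differ from the paper's exact notation.

        f = J_{2}^{3/2} - c\,J_{3} = \tau_{Y}^{3},
        \qquad
        c = \frac{3\sqrt{3}\,\left(\sigma_{T}^{3} - \sigma_{C}^{3}\right)}{2\left(\sigma_{T}^{3} + \sigma_{C}^{3}\right)}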

  15. Analytic Result for the Two-loop Six-point NMHV Amplitude in N = 4 Super Yang-Mills Theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dixon, Lance J.; /SLAC; Drummond, James M.

    2012-02-15

    We provide a simple analytic formula for the two-loop six-point ratio function of planar N = 4 super Yang-Mills theory. This result extends the analytic knowledge of multi-loop six-point amplitudes beyond those with maximal helicity violation. We make a natural ansatz for the symbols of the relevant functions appearing in the two-loop amplitude, and impose various consistency conditions, including symmetry, the absence of spurious poles, the correct collinear behavior, and agreement with the operator product expansion for light-like (super) Wilson loops. This information reduces the ansatz to a small number of relatively simple functions. In order to fix these parameters uniquely, we utilize an explicit representation of the amplitude in terms of loop integrals that can be evaluated analytically in various kinematic limits. The final compact analytic result is expressed in terms of classical polylogarithms, whose arguments are rational functions of the dual conformal cross-ratios, plus precisely two functions that are not of this type. One of the functions, the loop integral Ω^(2), also plays a key role in a new representation of the remainder function R_6^(2) in the maximally helicity violating sector. Another interesting feature at two loops is the appearance of a new (parity odd) x (parity odd) sector of the amplitude, which is absent at one loop, and which is uniquely determined in a natural way in terms of the more familiar (parity even) x (parity even) part. The second non-polylogarithmic function, the loop integral Ω̃^(2), characterizes this sector. Both Ω^(2) and Ω̃^(2) can be expressed as one-dimensional integrals over classical polylogarithms with rational arguments.

  16. Unique Outcomes in the Narratives of Young Adults Who Experienced Dating Violence as Adolescents.

    PubMed

    Draucker, Claire Burke; Smith, Carolyn; Mazurczyk, Jill; Thomas, Destini; Ramirez, Patricia; McNealy, Kim; Thomas, Jade; Martsolf, Donna S

    2016-01-01

    Narrative therapy, an approach based on the reauthoring of life narratives, may be a useful psychotherapeutic strategy for youth who have experienced dating violence. A cornerstone of narrative therapy is the concept of unique outcomes, which are moments that stand in contrast to a client's otherwise problem-saturated narratives. The purpose of this study was to identify and categorize unique outcomes embedded in narratives about adolescent dating violence. Text units representing unique outcomes were extracted from transcripts of interviews with 88 young adults who had experienced dating violence and were categorized using standard content analytic techniques. Six categories of unique outcome stories were identified: facing-facts stories, standing-up-for-myself stories, cutting-it-off stories, cutting-'em-loose stories, getting-back-on-track stories, and changing-it-up stories. This typology of unique outcomes can inform clinicians who work with clients who have a history of adolescent dating violence. © The Author(s) 2015.

  17. Let's Talk Learning Analytics: A Framework for Implementation in Relation to Student Retention

    ERIC Educational Resources Information Center

    West, Deborah; Heath, David; Huijser, Henk

    2016-01-01

    This paper presents a dialogical tool for the advancement of learning analytics implementation for student retention in Higher Education institutions. The framework was developed as an outcome of a project commissioned and funded by the Australian Government's "Office for Learning and Teaching". The project took a mixed-method approach…

  18. Making Sense of Game-Based User Data: Learning Analytics in Applied Games

    ERIC Educational Resources Information Center

    Steiner, Christina M.; Kickmeier-Rus, Michael D.; Albert, Dietrich

    2015-01-01

    Digital learning games are useful educational tools with high motivational potential. With the application of games for instruction there comes the need of acknowledging learning game experiences also in the context of educational assessment. Learning analytics provides new opportunities for supporting assessment in and of educational games. We…

  19. Gear Spline Coupling Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Yi; Errichello, Robert

    2013-08-29

    An analytical model is developed to evaluate the design of a spline coupling. For a given torque and shaft misalignment, the model calculates the number of teeth in contact, tooth loads, stiffnesses, stresses, and safety factors. The analytic model provides essential spline coupling design and modeling information and could be easily integrated into gearbox design and simulation tools.
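
    As a rough illustration of the kind of quantity such a model produces, the sketch below computes an idealized, equal-load-sharing tooth load and a crude bearing-stress safety factor; this is a deliberately simplified estimate with made-up numbers, not the report's actual model.

        def spline_tooth_load(torque_nm, pitch_diameter_m, n_teeth_in_contact):
            """Nominal tangential load per tooth, assuming all engaged teeth share equally."""
            tangential_force = 2.0 * torque_nm / pitch_diameter_m  # total force at the pitch radius
            return tangential_force / n_teeth_in_contact

        def bearing_safety_factor(allowable_stress_pa, tooth_load_n, bearing_area_m2):
            """Crude safety factor against tooth bearing (contact) stress."""
            bearing_stress = tooth_load_n / bearing_area_m2
            return allowable_stress_pa / bearing_stress

        # Illustrative numbers only: 5 kN*m torque, 150 mm pitch diameter, 20 teeth in contact
        load = spline_tooth_load(5_000.0, 0.150, 20)
        print(f"load per tooth = {load / 1e3:.1f} kN, "
              f"safety factor = {bearing_safety_factor(350e6, load, 4.0e-4):.1f}")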

  20. Challenges of Using Learning Analytics Techniques to Support Mobile Learning

    ERIC Educational Resources Information Center

    Arrigo, Marco; Fulantelli, Giovanni; Taibi, Davide

    2015-01-01

    Evaluation of Mobile Learning remains an open research issue, especially as regards the activities that take place outside the classroom. In this context, Learning Analytics can provide answers, and offer the appropriate tools to enhance Mobile Learning experiences. In this poster we introduce a task-interaction framework, using learning analytics…
