Science.gov

Sample records for additional analytical tools

  1. Proteomics: analytical tools and techniques.

    PubMed

    MacCoss, M J; Yates, J R

    2001-09-01

    Scientists have long been interested in measuring the effects of different stimuli on protein expression and metabolism. Analytical methods are being developed for the automated separation, identification, and quantitation of all of the proteins within the cell. Soon, investigators will be able to observe the effects of an experiment on every protein (as opposed to a selected few). This review presents a discussion of recent technological advances in proteomics in addition to exploring current methodological limitations.

  2. Analytic tools for information warfare

    SciTech Connect

    Vandewart, R.L.; Craft, R.L.

    1996-05-01

    Information warfare and system surety (tradeoffs between system functionality, security, safety, reliability, cost, usability) have many mechanisms in common. Sandia's experience has shown that an information system must be assessed from a system perspective in order to adequately identify and mitigate the risks present in the system. While some tools are available to help in this work, the process is largely manual. An integrated, extensible set of assessment tools would help the surety analyst. This paper describes one approach to surety assessment used at Sandia, identifies the difficulties in this process, and proposes a set of features desirable in an automated environment to support this process.

  3. Analytical Tools in School Finance Reform.

    ERIC Educational Resources Information Center

    Johns, R. L.

    This paper discusses the problem of analyzing variations in the educational opportunities provided by different school districts and describes how to assess the impact of school finance alternatives through use of various analytical tools. The author first examines relatively simple analytical methods, including calculation of per-pupil…

  4. Analytical Web Tool for CERES Products

    NASA Astrophysics Data System (ADS)

    Mitrescu, C.; Chu, C.; Doelling, D.

    2012-12-01

    The CERES project provides the community with climate-quality observed TOA fluxes, consistent cloud properties, and computed profile and surface fluxes. The 11-year data set has proven invaluable to the remote sensing and climate modeling communities for studies of annual global mean energy, meridional heat transport, consistent clouds and fluxes, and climate trends. Moreover, a broader audience interested in Earth's radiative properties, such as green energy, health, and environmental companies, has shown interest in CERES-derived products. A few years ago, the CERES team started developing a new web-based Ordering Tool tailored to this wide diversity of users. Recognizing the potential that Web 2.0 technologies offer for both Quality Control (QC) and scientific data visualization and manipulation, the CERES team began introducing a series of specialized functions to address these needs. Presented in an attractive, easy-to-use, modern web-based format, the Ordering Tool added the following analytical functions: i) 1-D histograms that display the distribution of a data field and identify outliers, useful for QC purposes; ii) an "Anomaly" map that shows the regional differences between the current month and the climatological monthly mean; iii) a 2-D histogram that can either identify potential problems with the data (i.e., a QC function) or provide a global view of trends and/or correlations between various CERES flux, cloud, aerosol, and atmospheric properties. The large volume and diversity of data, together with on-the-fly execution, were the main challenges that had to be tackled. Depending on the application, execution was done on either the browser side or the server side with the help of auxiliary files. Additional challenges came from the use of various open source applications, the multitude of CERES products, and the need for a seamless transition from previous development. For the future, we plan on expanding the analytical capabilities of the
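    The "Anomaly" map described above reduces to a simple operation: subtract the climatological monthly mean from the current month's gridded field. The following is a minimal sketch of that calculation; the array layout, grid size, and values are hypothetical and not the Ordering Tool's actual implementation.

```python
# Minimal sketch of the "Anomaly" map computation described above:
# current month minus the climatological monthly mean, per grid box.
# The data layout and numbers are hypothetical, not CERES code.
import numpy as np

def monthly_anomaly(fluxes, year_index, month_index):
    """fluxes: array of shape (n_years, 12, n_lat, n_lon) of monthly-mean fields."""
    climatology = fluxes[:, month_index, :, :].mean(axis=0)   # multi-year mean for that month
    return fluxes[year_index, month_index, :, :] - climatology

# Synthetic example: an 11-year record on a coarse 10 x 10 degree grid
rng = np.random.default_rng(0)
fluxes = 240.0 + rng.normal(0.0, 5.0, size=(11, 12, 18, 36))
anomaly = monthly_anomaly(fluxes, year_index=10, month_index=6)
print(anomaly.shape, round(float(anomaly.mean()), 2))
```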

  5. ATIVS: analytical tool for influenza virus surveillance.

    PubMed

    Liao, Yu-Chieh; Ko, Chin-Yu; Tsai, Ming-Hsin; Lee, Min-Shi; Hsiung, Chao A

    2009-07-01

    The WHO Global Influenza Surveillance Network has routinely performed genetic and antigenic analyses of human influenza viruses to monitor influenza activity. Although these analyses provide supporting data for the selection of vaccine strains, it seems desirable to have user-friendly tools to visualize the antigenic evolution of influenza viruses for the purpose of surveillance. To meet this need, we have developed a web server, ATIVS (Analytical Tool for Influenza Virus Surveillance), for analyzing serological data of all influenza viruses and hemagglutinin sequence data of human influenza A/H3N2 viruses so as to generate antigenic maps for influenza surveillance and vaccine strain selection. Functionalities are described and examples are provided to illustrate its usefulness and performance. The ATIVS web server is available at http://influenza.nhri.org.tw/ATIVS/.

  6. Aptamers: molecular tools for analytical applications.

    PubMed

    Mairal, Teresa; Ozalp, Veli Cengiz; Lozano Sánchez, Pablo; Mir, Mònica; Katakis, Ioanis; O'Sullivan, Ciara K

    2008-02-01

    Aptamers are artificial nucleic acid ligands, specifically generated against certain targets, such as amino acids, drugs, proteins or other molecules. In nature they exist as a nucleic acid based genetic regulatory element called a riboswitch. For generation of artificial ligands, they are isolated from combinatorial libraries of synthetic nucleic acid by exponential enrichment, via an in vitro iterative process of adsorption, recovery and reamplification known as systematic evolution of ligands by exponential enrichment (SELEX). Thanks to their unique characteristics and chemical structure, aptamers offer themselves as ideal candidates for use in analytical devices and techniques. Recent progress in the aptamer selection and incorporation of aptamers into molecular beacon structures will ensure the application of aptamers for functional and quantitative proteomics and high-throughput screening for drug discovery, as well as in various analytical applications. The properties of aptamers as well as recent developments in improved, time-efficient methods for their selection and stabilization are outlined. The use of these powerful molecular tools for analysis and the advantages they offer over existing affinity biocomponents are discussed. Finally the evolving use of aptamers in specific analytical applications such as chromatography, ELISA-type assays, biosensors and affinity PCR as well as current avenues of research and future perspectives conclude this review.

  7. Aptamers: molecular tools for analytical applications.

    PubMed

    Mairal, Teresa; Ozalp, Veli Cengiz; Lozano Sánchez, Pablo; Mir, Mònica; Katakis, Ioanis; O'Sullivan, Ciara K

    2008-02-01

    Aptamers are artificial nucleic acid ligands, specifically generated against certain targets, such as amino acids, drugs, proteins or other molecules. In nature they exist as a nucleic acid based genetic regulatory element called a riboswitch. For generation of artificial ligands, they are isolated from combinatorial libraries of synthetic nucleic acid by exponential enrichment, via an in vitro iterative process of adsorption, recovery and reamplification known as systematic evolution of ligands by exponential enrichment (SELEX). Thanks to their unique characteristics and chemical structure, aptamers offer themselves as ideal candidates for use in analytical devices and techniques. Recent progress in the aptamer selection and incorporation of aptamers into molecular beacon structures will ensure the application of aptamers for functional and quantitative proteomics and high-throughput screening for drug discovery, as well as in various analytical applications. The properties of aptamers as well as recent developments in improved, time-efficient methods for their selection and stabilization are outlined. The use of these powerful molecular tools for analysis and the advantages they offer over existing affinity biocomponents are discussed. Finally the evolving use of aptamers in specific analytical applications such as chromatography, ELISA-type assays, biosensors and affinity PCR as well as current avenues of research and future perspectives conclude this review. PMID:17581746

  8. Additive manufacturing of tools for lapping glass

    NASA Astrophysics Data System (ADS)

    Williams, Wesley B.

    2013-09-01

    Additive manufacturing technologies have the ability to directly produce parts with complex geometries without the need for secondary processes, tooling or fixtures. This ability was used to produce concave lapping tools with a VFlash 3D printer from 3D Systems. The lapping tools were first designed in Creo Parametric with a defined constant radius and radial groove pattern. The models were converted to stereolithography files which the VFlash used in building the parts, layer by layer, from a UV curable resin. The tools were rotated at 60 rpm and used with 120 grit and 220 grit silicon carbide lapping paste to lap 0.750" diameter fused silica workpieces. The samples developed a matte appearance on the lapped surface that started as a ring at the edge of the workpiece and expanded to the center. This indicated that as material was removed, the workpiece radius was beginning to match the tool radius. The workpieces were then cleaned and lapped on a second tool (with equivalent geometry) using a 3000 grit corundum aluminum oxide lapping paste, until a near specular surface was achieved. By using lapping tools that have been additively manufactured, fused silica workpieces can be lapped to approach a specified convex geometry. This approach may enable more rapid lapping of near net shape workpieces that minimize the material removal required by subsequent polishing. This research may also enable development of new lapping tool geometry and groove patterns for improved loose abrasive finishing.

  9. Cryogenic Propellant Feed System Analytical Tool Development

    NASA Technical Reports Server (NTRS)

    Lusby, Brian S.; Miranda, Bruno M.; Collins, Jacob A.

    2011-01-01

    The Propulsion Systems Branch at NASA's Lyndon B. Johnson Space Center (JSC) has developed a parametric analytical tool to address the need to rapidly predict heat leak into propellant distribution lines based on insulation type, installation technique, line supports, penetrations, and instrumentation. The Propellant Feed System Analytical Tool (PFSAT) will also determine the optimum orifice diameter for an optional thermodynamic vent system (TVS) to counteract heat leak into the feed line and ensure temperature constraints at the end of the feed line are met. PFSAT was developed primarily in Fortran 90 because of its numerical efficiency and its capability to directly access real fluid property subroutines in the Reference Fluid Thermodynamic and Transport Properties (REFPROP) database developed by NIST. A Microsoft Excel front end was implemented to provide convenient portability of PFSAT among a wide variety of potential users and to supply a user-friendly graphical user interface (GUI) developed in Visual Basic for Applications (VBA). The focus of PFSAT is on-orbit reaction control systems and orbital maneuvering systems, but it may be used to predict heat leak into ground-based transfer lines as well. PFSAT is expected to be used for rapid initial design of cryogenic propellant distribution lines and thermodynamic vent systems. Once validated, PFSAT will support concept trades for a variety of cryogenic fluid transfer systems on spacecraft, including planetary landers, transfer vehicles, and propellant depots, as well as surface-based transfer systems. The details of the development of PFSAT, its user interface, and the program structure will be presented.
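    The abstract does not publish PFSAT's equations, but the kind of heat-leak estimate it automates can be illustrated with a simple steady-state model. The sketch below assumes radial conduction through a single uniform insulation layer and uses invented parameter values; it is not PFSAT code and omits supports, penetrations, and instrumentation.

```python
# Illustrative only: steady radial conduction heat leak through a uniform
# insulation layer on a cryogenic feed line (not PFSAT's actual model).
import math

def insulation_heat_leak(k_ins, length, r_line, t_ins, T_warm, T_cold):
    """k_ins: insulation conductivity (W/m-K); length: line length (m);
    r_line: outer radius of the bare line (m); t_ins: insulation thickness (m);
    T_warm, T_cold: warm-boundary and propellant temperatures (K)."""
    r_outer = r_line + t_ins
    return 2.0 * math.pi * k_ins * length * (T_warm - T_cold) / math.log(r_outer / r_line)

# Hypothetical numbers: 5 m line, 1 cm radius, 2.5 cm blanket, LOX-like temperature
q = insulation_heat_leak(k_ins=1.0e-4, length=5.0, r_line=0.01,
                         t_ins=0.025, T_warm=300.0, T_cold=90.0)
print(f"estimated heat leak: {q:.2f} W")
```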

  10. ANALYTICAL TOOL DEVELOPMENT FOR AFTERTREATMENT SUB-SYSTEMS INTEGRATION

    SciTech Connect

    Bolton, B; Fan, A; Goney, K; Pavlova-MacKinnon, Z; Sisken, K; Zhang, H

    2003-08-24

    The stringent emissions standards of 2007 and beyond require complex engine, aftertreatment and vehicle systems with a high degree of sub-system interaction and flexible control solutions. This necessitates a system-based approach to technology development, in addition to individual sub-system optimization. Analytical tools can provide an effective means to evaluate and develop such complex technology interactions as well as to understand phenomena that are either too expensive or impossible to study with conventional experimental means. The analytical effort can also guide experimental development and thus lead to efficient utilization of available experimental resources. A suite of analytical models has been developed to represent PM and NOx aftertreatment sub-systems. These models range from computationally inexpensive zero-dimensional models for real-time control applications to CFD-based, multi-dimensional models with detailed temporal and spatial resolution. Such models, in conjunction with well established engine modeling tools such as engine cycle simulation, engine controls modeling, CFD models of non-combusting and combusting flow, and vehicle models, provide a comprehensive analytical toolbox for complete engine, aftertreatment and vehicle sub-systems development and system integration applications. However, the fidelity of aftertreatment models and their application going forward is limited by the lack of fundamental kinetic data.
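    As an illustration of the "computationally inexpensive zero-dimensional" end of this model range, the sketch below treats a catalyst brick as a single well-mixed volume with first-order conversion and an Arrhenius rate constant. The kinetic parameters are invented for demonstration and are not the calibrated values used in the toolbox described above.

```python
# Hedged sketch of a zero-dimensional aftertreatment model: first-order
# NOx conversion across a catalyst with an Arrhenius rate constant.
# Kinetic parameters are hypothetical, not calibrated values.
import math

R_GAS = 8.314  # J/(mol K)

def nox_conversion(T_catalyst, space_velocity, A=1.0e8, Ea=60.0e3):
    """T_catalyst in K; space_velocity in 1/s (inverse residence time);
    A (1/s) and Ea (J/mol) are illustrative kinetic parameters."""
    k = A * math.exp(-Ea / (R_GAS * T_catalyst))
    residence_time = 1.0 / space_velocity
    return 1.0 - math.exp(-k * residence_time)

for T in (450.0, 500.0, 550.0):
    print(f"T = {T:.0f} K -> conversion = {nox_conversion(T, space_velocity=30.0):.2f}")
```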

  11. A Qualitative Evaluation of Evolution of a Learning Analytics Tool

    ERIC Educational Resources Information Center

    Ali, Liaqat; Hatala, Marek; Gasevic, Dragan; Jovanovic, Jelena

    2012-01-01

    LOCO-Analyst is a learning analytics tool we developed to provide educators with feedback on students' learning activities and performance. Evaluation of the first version of the tool led to the enhancement of the tool's data visualization, user interface, and supported feedback types. The second evaluation of the improved tool allowed us to see…

  12. ANALYTICAL TOOLS FOR GROUNDWATER POLLUTION ASSESSMENT

    EPA Science Inventory

    This paper deals with the development of analytical screening-exposure models (indices) and their potential application to regulate the use of hazardous chemicals and the design of groundwater buffer strips. The indices describe the leaching of solutes below the root zone (mass f...

  13. Analytical Tools for Space Suit Design

    NASA Technical Reports Server (NTRS)

    Aitchison, Lindsay

    2011-01-01

    As indicated by the implementation of multiple small project teams within the agency, NASA is adopting a lean approach to hardware development that emphasizes quick product realization and rapid response to shifting program and agency goals. Over the past two decades, space suit design has been evolutionary in approach with emphasis on building prototypes then testing with the largest practical range of subjects possible. The results of these efforts show continuous improvement but make scaled design and performance predictions almost impossible with limited budgets and little time. Thus, in an effort to start changing the way NASA approaches space suit design and analysis, the Advanced Space Suit group has initiated the development of an integrated design and analysis tool. It is a multi-year, if not decadal, development effort that, when fully implemented, is envisioned to generate analysis of any given space suit architecture or, conversely, predictions of ideal space suit architectures given specific mission parameters. The master tool will exchange information to and from a set of five sub-tool groups in order to generate the desired output. The basic functions of each sub-tool group, the initial relationships between the sub-tools, and a comparison to state-of-the-art software and tools are discussed.

  14. Electronic tongue: An analytical gustatory tool

    PubMed Central

    Latha, Rewanthwar Swathi; Lakshmi, P. K.

    2012-01-01

    Taste is an important organoleptic property governing acceptance of products administered through the mouth. However, the majority of drugs available are bitter in taste. For patient acceptability and compliance, bitter-tasting drugs are masked by adding several flavoring agents. Thus, taste assessment is one important quality control parameter for evaluating taste-masked formulations. The primary method for the taste measurement of drug substances and formulations is the use of human panelists. The use of sensory panelists in industry is difficult and problematic because of the potential toxicity of drugs and the subjectivity of taste panelists; recruiting panelists, maintaining their motivation, and maintaining the panel are significantly more difficult when working with unpleasant products. Furthermore, Food and Drug Administration (FDA)-unapproved molecules cannot be tested. Therefore, an analytical taste-sensing multichannel sensory system, called an electronic tongue (e-tongue or artificial tongue), which can assess taste, has been replacing sensory panelists. The e-tongue thus offers benefits such as reduced reliance on human panels. The present review focuses on the electrochemical concepts in instrumentation, performance qualification of the e-tongue, and applications in various fields. PMID:22470887

  15. Guidance for the Design and Adoption of Analytic Tools.

    SciTech Connect

    Bandlow, Alisa

    2015-12-01

    The goal is to make software developers aware of common issues that can impede the adoption of analytic tools. This paper provides a summary of guidelines, lessons learned and existing research to explain what is currently known about what analysts want and how to better understand what tools they do and don't need.

  16. Deriving Earth Science Data Analytics Tools/Techniques Requirements

    NASA Astrophysics Data System (ADS)

    Kempler, S. J.

    2015-12-01

    Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data sets has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, goals which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  17. Active controls: A look at analytical methods and associated tools

    NASA Technical Reports Server (NTRS)

    Newsom, J. R.; Adams, W. M., Jr.; Mukhopadhyay, V.; Tiffany, S. H.; Abel, I.

    1984-01-01

    A review of analytical methods and associated tools for active controls analysis and design problems is presented. Approaches employed to develop mathematical models suitable for control system analysis and/or design are discussed. Significant efforts have been expended to develop tools to generate the models from the standpoint of control system designers' needs and develop the tools necessary to analyze and design active control systems. Representative examples of these tools are discussed. Examples where results from the methods and tools have been compared with experimental data are also presented. Finally, a perspective on future trends in analysis and design methods is presented.

  18. Analytical Tools for Cloudscope Ice Measurement

    NASA Technical Reports Server (NTRS)

    Arnott, W. Patrick

    1998-01-01

    The cloudscope is a ground or aircraft instrument for viewing ice crystals impacted on a sapphire window. It is essentially a simple optical microscope with an attached compact CCD video camera whose output is recorded on a Hi-8 mm video cassette recorder equipped with digital time and date recording capability. In aircraft operation the window is at a stagnation point of the flow, so adiabatic compression heats the window to sublimate the ice crystals so that later impacting crystals can be imaged as well. A film heater is used for ground-based operation to provide sublimation, and it can also be used to provide extra heat for aircraft operation. The compact video camera can be focused manually by the operator, and a beam splitter and miniature bulb combination provides illumination for night operation. Several shutter speeds are available to accommodate daytime illumination conditions in direct sunlight. The video images can be used directly to qualitatively assess the crystal content of cirrus clouds and contrails. Quantitative size spectra are obtained with the tools described in this report. Selected portions of the video images are digitized using a PCI bus frame grabber to form a short movie segment or stack using NIH (National Institutes of Health) Image software with custom macros developed at DRI. The stack can be Fourier-transform filtered with custom, easy-to-design filters to reduce most objectionable video artifacts. Particle quantification of each slice of the stack is performed using digital image analysis. Data recorded for each particle include particle number and centroid, frame number in the stack, particle area, perimeter, equivalent ellipse maximum and minimum radii, ellipse angle, and pixel number. Each valid particle in the stack is stamped with a unique number. This output can be used to obtain a semiquantitative appreciation of the crystal content. The particle information becomes the raw input for a subsequent program (FORTRAN) that
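    The particle measurements listed above (centroid, area, perimeter, equivalent-ellipse radii and angle) were produced with NIH Image macros; the sketch below shows an equivalent per-particle quantification in Python using scikit-image, with an arbitrary threshold and a synthetic frame standing in for a digitized video slice.

```python
# Sketch of the per-particle measurements described above, using scikit-image
# instead of the NIH Image macros actually used; threshold and frame are synthetic.
import numpy as np
from skimage import measure

def quantify_particles(frame, threshold=0.5):
    """frame: 2-D grayscale array for one slice of the digitized stack."""
    labels = measure.label(frame > threshold)
    particles = []
    for p in measure.regionprops(labels):
        particles.append({
            "particle": p.label,
            "centroid": p.centroid,
            "area_px": p.area,
            "perimeter_px": p.perimeter,
            "ellipse_max_radius": p.major_axis_length / 2.0,
            "ellipse_min_radius": p.minor_axis_length / 2.0,
            "ellipse_angle_rad": p.orientation,
        })
    return particles

# Synthetic frame containing two bright "crystals"
frame = np.zeros((64, 64))
frame[10:20, 10:30] = 1.0
frame[40:50, 40:46] = 1.0
for p in quantify_particles(frame):
    print(p["particle"], p["area_px"], round(p["perimeter_px"], 1))
```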

  19. Using Visual Analytics Tool for Improving Data Comprehension

    ERIC Educational Resources Information Center

    Géryk, Jan

    2015-01-01

    The efficacy of animated data visualizations in comparison with static data visualizations is still inconclusive. Some research has suggested that the failure to find benefits of animations may relate to the way they are constructed and perceived. In this paper, we present a visual analytics (VA) tool which makes use of enhanced animated…

  20. Medical text analytics tools for search and classification.

    PubMed

    Huang, Jimmy; An, Aijun; Hu, Vivian; Tu, Karen

    2009-01-01

    A text-analytic tool has been developed that accepts clinical medical data as input in order to produce patient details. The integrated tool has the following four characteristics. 1) It has a graphical user interface. 2) It has a free-text search tool that is designed to retrieve records using keywords such as "MI" for myocardial infarction. The result set is a display of those sentences in the medical records that contain the keywords. 3) It has three tools to classify patients based on the likelihood of a diagnosis of myocardial infarction or hypertension, or on their smoking status. 4) It generates a summary for each patient selected. Large medical data sets provided by the Institute for Clinical Evaluative Sciences were used during the project.

  1. Adequacy of surface analytical tools for studying the tribology of ceramics

    NASA Technical Reports Server (NTRS)

    Sliney, H. E.

    1986-01-01

    Surface analytical tools are very beneficial in tribological studies of ceramics. Traditional methods of optical microscopy, XRD, XRF, and SEM should be combined with newer surface-sensitive techniques, especially AES and XPS. ISS and SIMS can also be useful in providing additional composition details. Tunneling microscopy and electron energy loss spectroscopy are less well-known techniques that may also prove useful.

  2. 33 CFR 385.33 - Revisions to models and analytical tools.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... analytical tools. 385.33 Section 385.33 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE... Incorporating New Information Into the Plan § 385.33 Revisions to models and analytical tools. (a) In carrying... and other analytical tools for conducting analyses for the planning, design, construction,...

  3. Trial analytics--a tool for clinical trial management.

    PubMed

    Bose, Anindya; Das, Suman

    2012-01-01

    Prolonged timelines and large expenses associated with clinical trials have prompted a new focus on improving the operational efficiency of clinical trials through the use of Clinical Trial Management Systems (CTMS) in order to improve managerial control in trial conduct. However, current CTMS are not able to meet expectations due to various shortcomings, such as the inability to provide timely reporting and trend visualization within or beyond an organization. To overcome these shortcomings of CTMS, clinical researchers can apply a business intelligence (BI) framework to create Clinical Research Intelligence (CLRI) for optimization of data collection and analytics. This paper proposes the usage of an innovative and collaborative visualization tool (CTA) as a CTMS "add-on" to help overcome these deficiencies of traditional CTMS, with suitable examples.

  4. Enabling analytical and Modeling Tools for Enhanced Disease Surveillance

    SciTech Connect

    Dawn K. Manley

    2003-04-01

    Early detection, identification, and warning are essential to minimize casualties from a biological attack. For covert attacks, sick people are likely to provide the first indication of an attack. An enhanced medical surveillance system that synthesizes distributed health indicator information and rapidly analyzes the information can dramatically increase the number of lives saved. Current surveillance methods to detect both biological attacks and natural outbreaks are hindered by factors such as distributed ownership of information, incompatible data storage and analysis programs, and patient privacy concerns. Moreover, because data are not widely shared, few data mining algorithms have been tested on and applied to diverse health indicator data. This project addressed both integration of multiple data sources and development and integration of analytical tools for rapid detection of disease outbreaks. As a first prototype, we developed an application to query and display distributed patient records. This application incorporated need-to-know access control and incorporated data from standard commercial databases. We developed and tested two different algorithms for outbreak recognition. The first is a pattern recognition technique that searches for space-time data clusters that may signal a disease outbreak. The second is a genetic algorithm to design and train neural networks (GANN) that we applied toward disease forecasting. We tested these algorithms against influenza, respiratory illness, and Dengue Fever data. Through this LDRD in combination with other internal funding, we delivered a distributed simulation capability to synthesize disparate information and models for earlier recognition and improved decision-making in the event of a biological attack. The architecture incorporates user feedback and control so that a user's decision inputs can impact the scenario outcome as well as integrated security and role-based access-control for communicating between

  5. Ultrafast 2D NMR: an emerging tool in analytical spectroscopy.

    PubMed

    Giraudeau, Patrick; Frydman, Lucio

    2014-01-01

    Two-dimensional nuclear magnetic resonance (2D NMR) spectroscopy is widely used in chemical and biochemical analyses. Multidimensional NMR is also witnessing increased use in quantitative and metabolic screening applications. Conventional 2D NMR experiments, however, are affected by inherently long acquisition durations, arising from their need to sample the frequencies involved along their indirect domains in an incremented, scan-by-scan nature. A decade ago, a so-called ultrafast (UF) approach was proposed, capable of delivering arbitrary 2D NMR spectra involving any kind of homo- or heteronuclear correlation, in a single scan. During the intervening years, the performance of this subsecond 2D NMR methodology has been greatly improved, and UF 2D NMR is rapidly becoming a powerful analytical tool experiencing an expanded scope of applications. This review summarizes the principles and main developments that have contributed to the success of this approach and focuses on applications that have been recently demonstrated in various areas of analytical chemistry, from the real-time monitoring of chemical and biochemical processes to extensions in hyphenated techniques and in quantitative applications. PMID:25014342

  6. Network Analytical Tool for Monitoring Global Food Safety Highlights China

    PubMed Central

    Nepusz, Tamás; Petróczi, Andrea; Naughton, Declan P.

    2009-01-01

    Background: The Beijing Declaration on food safety and security was signed by over fifty countries with the aim of developing comprehensive programs for monitoring food safety and security on behalf of their citizens. Currently, comprehensive systems for food safety and security are absent in many countries, and the systems that are in place have been developed on different principles, allowing poor opportunities for integration. Methodology/Principal Findings: We have developed a user-friendly analytical tool based on network approaches for instant customized analysis of food alert patterns in the European dataset from the Rapid Alert System for Food and Feed. Data taken from alert logs between January 2003 and August 2008 were processed using network analysis to i) capture complexity, ii) analyze trends, and iii) predict possible effects of interventions by identifying patterns of reporting activities between countries. The detector and transgressor relationships between countries are readily identifiable, and countries are ranked using i) Google's PageRank algorithm and ii) the HITS algorithm of Kleinberg. The program identifies Iran, China and Turkey as the transgressors with the largest number of alerts. However, when characterized by impact, counting the transgressor index and the number of countries involved, China predominates as a transgressor country. Conclusions/Significance: This study reports the first development of a network analysis approach to inform countries on their transgressor and detector profiles as a user-friendly aid for the adoption of the Beijing Declaration. The ability to instantly access the country-specific components of the several thousand annual reports will enable each country to identify the major transgressors and detectors within its trading network. Moreover, the tool can be used to monitor trading countries for improved detector/transgressor ratios. PMID:19688088
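    The ranking step described above can be reproduced in outline with standard network-analysis libraries: alert reports form a directed graph from detecting country to transgressing country, and PageRank and HITS scores are computed on it. The edge list below is invented for illustration and is not RASFF data.

```python
# Hedged sketch of the detector/transgressor ranking described above.
# Edges run from the detecting country to the transgressing country,
# weighted by alert counts; the numbers here are invented, not RASFF data.
import networkx as nx

alerts = [
    ("Germany", "China", 40),
    ("Italy", "China", 25),
    ("UK", "Iran", 30),
    ("Germany", "Turkey", 20),
    ("France", "Turkey", 10),
]

G = nx.DiGraph()
for detector, transgressor, n_alerts in alerts:
    G.add_edge(detector, transgressor, weight=n_alerts)

pagerank = nx.pagerank(G, weight="weight")
hubs, authorities = nx.hits(G)   # hubs ~ active detectors, authorities ~ frequent transgressors

print(sorted(pagerank.items(), key=lambda kv: -kv[1])[:3])
print(sorted(authorities.items(), key=lambda kv: -kv[1])[:3])
```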

  7. Analytical tools for single-molecule fluorescence imaging in cellulo.

    PubMed

    Leake, M C

    2014-07-01

    Recent technological advances in cutting-edge ultrasensitive fluorescence microscopy have allowed single-molecule imaging experiments in living cells across all three domains of life to become commonplace. Single-molecule live-cell data is typically obtained in a low signal-to-noise ratio (SNR) regime sometimes only marginally in excess of 1, in which a combination of detector shot noise, sub-optimal probe photophysics, native cell autofluorescence and intrinsically underlying stochasticity of molecules result in highly noisy datasets for which underlying true molecular behaviour is non-trivial to discern. The ability to elucidate real molecular phenomena is essential in relating experimental single-molecule observations to both the biological system under study as well as offering insight into the fine details of the physical and chemical environments of the living cell. To confront this problem of faithful signal extraction and analysis in a noise-dominated regime, the 'needle in a haystack' challenge, such experiments benefit enormously from a suite of objective, automated, high-throughput analysis tools that can home in on the underlying 'molecular signature' and generate meaningful statistics across a large population of individual cells and molecules. Here, I discuss the development and application of several analytical methods applied to real case studies, including objective methods of segmenting cellular images from light microscopy data, tools to robustly localize and track single fluorescently-labelled molecules, algorithms to objectively interpret molecular mobility, analysis protocols to reliably estimate molecular stoichiometry and turnover, and methods to objectively render distributions of molecular parameters.
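    One of the analysis steps named above, robustly localizing single fluorescently-labelled molecules, is commonly done by fitting a 2-D Gaussian spot model to each candidate spot. The sketch below is a generic illustration of that step at deliberately low SNR, not the specific algorithms reviewed in the paper; the spot parameters and noise model are assumptions.

```python
# Generic illustration of single-molecule localization: least-squares fit of a
# 2-D Gaussian spot model to a noisy candidate spot (not the paper's own code).
import numpy as np
from scipy.optimize import curve_fit

def gauss2d(xy, amp, x0, y0, sigma, offset):
    x, y = xy
    return (amp * np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma ** 2)) + offset).ravel()

# Synthetic low-SNR spot with Poisson (shot) noise
rng = np.random.default_rng(1)
yy, xx = np.mgrid[0:15, 0:15]
truth = gauss2d((xx, yy), amp=50.0, x0=7.3, y0=6.8, sigma=1.6, offset=10.0).reshape(15, 15)
image = rng.poisson(truth).astype(float)

p0 = (image.max() - image.min(), 7.0, 7.0, 2.0, image.min())
popt, _ = curve_fit(gauss2d, (xx, yy), image.ravel(), p0=p0)
print(f"fitted centre: x = {popt[1]:.2f}, y = {popt[2]:.2f}")
```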

  8. Investigating Analytic Tools for e-Book Design in Early Literacy Learning

    ERIC Educational Resources Information Center

    Roskos, Kathleen; Brueck, Jeremy; Widman, Sarah

    2009-01-01

    Toward the goal of better e-book design to support early literacy learning, this study investigates analytic tools for examining design qualities of e-books for young children. Three research-based analytic tools related to e-book design were applied to a mixed genre collection of 50 e-books from popular online sites. Tool performance varied…

  9. Enhancing simulation of efficiency with analytical tools. [combining computer simulation and analytical techniques for cost reduction

    NASA Technical Reports Server (NTRS)

    Seltzer, S. M.

    1974-01-01

    Some means of combining both computer simulation and analytical techniques are indicated in order to mutually enhance their efficiency as design tools and to motivate those involved in engineering design to consider using such combinations. While the idea is not new, heavy reliance on computers often seems to overshadow the potential utility of analytical tools. Although the example used is drawn from the area of dynamics and control, the principles espoused are applicable to other fields. In the example, the parameter plane stability analysis technique is described briefly and extended beyond that reported in the literature to increase its utility (through a simple set of recursive formulas) and its applicability (through the portrayal of the effect of varying the sampling period of the computer). The numerical values that were rapidly selected by analysis were found to be correct for the hybrid computer simulation for which they were needed. This obviated the need for cut-and-try methods to choose the numerical values, thereby saving both time and computer utilization.

  10. 33 CFR 385.33 - Revisions to models and analytical tools.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... system-wide simulation models and analytical tools used in the evaluation and assessment of projects, and... 33 Navigation and Navigable Waters 3 2011-07-01 2011-07-01 false Revisions to models and... Incorporating New Information Into the Plan § 385.33 Revisions to models and analytical tools. (a) In...

  11. Factors Influencing Beliefs for Adoption of a Learning Analytics Tool: An Empirical Study

    ERIC Educational Resources Information Center

    Ali, Liaqat; Asadi, Mohsen; Gasevic, Dragan; Jovanovic, Jelena; Hatala, Marek

    2013-01-01

    Present research and development offer various learning analytics tools providing insights into different aspects of learning processes. Adoption of a specific tool for practice is based on how its learning analytics are perceived by educators to support their pedagogical and organizational goals. In this paper, we propose and empirically validate…

  12. Sociologics: An Analytical Tool for Examining Socioscientific Discourse.

    ERIC Educational Resources Information Center

    Fountain, Renee-Marie

    1998-01-01

    Argues that the framework of sociologists extends commonly used analytical frameworks in socioscientific research in education. Foregrounds the social construction of knowledge and highlights the nature of knowledge production. (DDR)

  13. Analytical quality by design: a tool for regulatory flexibility and robust analytics.

    PubMed

    Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDA) with regulatory flexibility for quality by design (QbD)-based analytical approaches. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows movement of the analytical method within the method operable design region (MODR). Unlike current methods, an analytical method developed using the analytical quality by design (AQbD) approach reduces the number of out-of-trend (OOT) results and out-of-specification (OOS) results due to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement analytical quality by design (AQbD) in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists about the implementation of AQbD in the pharmaceutical quality system and also correlates it with product quality by design and pharmaceutical analytical technology (PAT).

  14. Analytical Quality by Design: A Tool for Regulatory Flexibility and Robust Analytics

    PubMed Central

    Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDA) with regulatory flexibility for quality by design (QbD)-based analytical approaches. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows movement of the analytical method within the method operable design region (MODR). Unlike current methods, an analytical method developed using the analytical quality by design (AQbD) approach reduces the number of out-of-trend (OOT) results and out-of-specification (OOS) results due to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement analytical quality by design (AQbD) in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists about the implementation of AQbD in the pharmaceutical quality system and also correlates it with product quality by design and pharmaceutical analytical technology (PAT). PMID:25722723

  15. Academic Analytics: A New Tool for a New Era

    ERIC Educational Resources Information Center

    Campbell, John P.; DeBlois, Peter B.; Oblinger, Diana G.

    2007-01-01

    In responding to internal and external pressures for accountability in higher education, especially in the areas of improved learning outcomes and student success, IT leaders may soon become critical partners with academic and student affairs. IT can help answer this call for accountability through "academic analytics," which is emerging as a new…

  16. A results-based process for evaluation of diverse visual analytics tools

    NASA Astrophysics Data System (ADS)

    Rubin, Gary; Berger, David H.

    2013-05-01

    With the pervasiveness of still and full-motion imagery in commercial and military applications, the need to ingest and analyze these media has grown rapidly in recent years. Additionally, video hosting and live camera websites provide a near real-time view of our changing world with unprecedented spatial coverage. To take advantage of these controlled and crowd-sourced opportunities, sophisticated visual analytics (VA) tools are required to accurately and efficiently convert raw imagery into usable information. Whether investing in VA products or evaluating algorithms for potential development, it is important for stakeholders to understand the capabilities and limitations of visual analytics tools. Visual analytics algorithms are being applied to problems related to Intelligence, Surveillance, and Reconnaissance (ISR), facility security, and public safety monitoring, to name a few. The diversity of requirements means that a one-size-fits-all approach to performance assessment will not work. We present a process for evaluating the efficacy of algorithms in real-world conditions, thereby allowing users and developers of video analytics software to understand software capabilities and identify potential shortcomings. The results-based approach described in this paper uses an analysis of end-user requirements and Concept of Operations (CONOPS) to define Measures of Effectiveness (MOEs), test data requirements, and evaluation strategies. We define metrics that individually do not fully characterize a system, but when used together, are a powerful way to reveal both strengths and weaknesses. We provide examples of data products, such as heatmaps, performance maps, detection timelines, and rank-based probability-of-detection curves.

  17. Supporting student nurses in practice with additional online communication tools.

    PubMed

    Morley, Dawn A

    2014-01-01

    Student nurses' potential isolation and difficulties of learning on placement have been well documented and, despite attempts to make placement learning more effective, evidence indicates the continuing schism between formal learning at university and situated learning on placement. First year student nurses, entering placement for the first time, are particularly vulnerable to the vagaries of practice. During 2012 two first year student nurse seminar groups (52 students) were voluntarily recruited for a mixed method study to determine the usage of additional online communication support mechanisms (Facebook, wiki, an email group and traditional methods of support using individual email or phone) while undertaking their first five week clinical placement. The study explores the possibility of strengthening clinical learning and support by promoting the use of Web 2.0 support groups for student nurses. Results indicate a high level of interactivity in both peer and academic support in the use of Facebook and a high level of interactivity in one wiki group. Students' qualitative comments voice an appreciation of being able to access university and peer support whilst working individually on placement. Recommendations from the study challenge universities to use online communication tools already familiar to students to complement the support mechanisms that exist for practice learning. This is tempered by recognition of the responsibility of academics to ensure their students are aware of safe and effective online communication.

  18. Analytical tools for the analysis of β-carotene and its degradation products

    PubMed Central

    Stutz, H.; Bresgen, N.; Eckl, P. M.

    2015-01-01

    β-Carotene, the precursor of vitamin A, possesses pronounced radical scavenging properties. This has centered attention on β-carotene dietary supplementation in healthcare as well as in the therapy of degenerative disorders and several cancer types. However, two intervention trials with β-carotene have revealed adverse effects on two proband groups, that is, cigarette smokers and asbestos-exposed workers. Besides other causative reasons, the detrimental effects observed have been related to the oxidation products of β-carotene. Their generation originates in the polyene structure of β-carotene that is beneficial for radical scavenging, but is also prone to oxidation. Depending on the dominant degradation mechanism, bond cleavage might occur either randomly or at defined positions of the conjugated electron system, resulting in a diversity of cleavage products (CPs). Due to their instability and hydrophobicity, the handling of standards and real samples containing β-carotene and related CPs requires preventive measures during specimen preparation, analyte extraction, and final analysis, to avoid artificial degradation and to preserve the initial analyte portfolio. This review critically discusses different preparation strategies of standards and treatment solutions, and also addresses their protection from oxidation. Additionally, in vitro oxidation strategies for the generation of oxidative model compounds are surveyed. Extraction methods are discussed for volatile and non-volatile CPs individually. Gas chromatography (GC), (ultra)high performance liquid chromatography (U)HPLC, and capillary electrochromatography (CEC) are reviewed as analytical tools for final analyte analysis. For identity confirmation of analytes, mass spectrometry (MS) is indispensable, and the appropriate ionization principles are comprehensively discussed. The final sections cover analysis of real samples and aspects of quality assurance, namely matrix effects and method

  19. Single cell analytic tools for drug discovery and development

    PubMed Central

    Heath, James R.; Ribas, Antoni; Mischel, Paul S.

    2016-01-01

    The genetic, functional, or compositional heterogeneity of healthy and diseased tissues presents major challenges in drug discovery and development.1-3 In cancers, heterogeneity may be essential for tumor stability,4 but its precise role in tumor biology is poorly resolved. This challenges the design of accurate disease models for use in drug development, and can confound the interpretation of biomarker levels, and of patient responses to specific therapies. The complex nature of heterogeneous tissues has motivated the development of tools for single cell genomic, transcriptomic, and multiplex proteomic analysis. We review these tools, assess their advantages and limitations, and explore their potential applications in drug discovery and development. PMID:26669673

  20. ANALYTICAL TOOLS INTERFACE FOR LANDSCAPE ASSESSMENTS (ATTILA) ARCVIEW EXTENTION

    EPA Science Inventory

    Geographic Information Systems (GIS) have become a powerful tool in the field of landscape ecology. A common application of GIS is the generation of landscape metrics, which are quantitative measurements of the status or potential health of an area (e.g. ecological region, waters...

  1. Distributed Engine Control Empirical/Analytical Verification Tools

    NASA Technical Reports Server (NTRS)

    DeCastro, Jonathan; Hettler, Eric; Yedavalli, Rama; Mitra, Sayan

    2013-01-01

    NASA's vision for an intelligent engine will be realized with the development of a truly distributed control system featuring highly reliable, modular, and dependable components capable of both surviving the harsh engine operating environment and decentralized functionality. A set of control system verification tools was developed and applied to a C-MAPSS40K engine model, and metrics were established to assess the stability and performance of these control systems on the same platform. A software tool was developed that allows designers to assemble easily a distributed control system in software and immediately assess the overall impacts of the system on the target (simulated) platform, allowing control system designers to converge rapidly on acceptable architectures with consideration to all required hardware elements. The software developed in this program will be installed on a distributed hardware-in-the-loop (DHIL) simulation tool to assist NASA and the Distributed Engine Control Working Group (DECWG) in integrating DCS (distributed engine control systems) components onto existing and next-generation engines. The distributed engine control simulator blockset for MATLAB/Simulink and hardware simulator provides the capability to simulate virtual subcomponents, as well as swap actual subcomponents for hardware-in-the-loop (HIL) analysis. Subcomponents can be the communication network, smart sensor or actuator nodes, or a centralized control system. The distributed engine control blockset for MATLAB/Simulink is a software development tool. The software includes an engine simulation, a communication network simulation, control algorithms, and analysis algorithms set up in a modular environment for rapid simulation of different network architectures; the hardware consists of an embedded device running parts of the CMAPSS engine simulator and controlled through Simulink. The distributed engine control simulation, evaluation, and analysis technology provides unique

  2. Experimental and analytical tools for evaluation of Stirling engine rod seal behavior

    NASA Technical Reports Server (NTRS)

    Krauter, A. I.; Cheng, H. S.

    1979-01-01

    The first year of a two year experimental and analytical program is reported. The program is directed at the elastohydrodynamic behavior of sliding elastomeric rod seals for the Stirling engine. During the year, experimental and analytical tools were developed for evaluating seal leakage, seal friction, and the fluid film thickness at the seal/cylinder interface.

  3. Galileo's Discorsi as a Tool for the Analytical Art.

    PubMed

    Raphael, Renee Jennifer

    2015-01-01

    A heretofore overlooked response to Galileo's 1638 Discorsi is described by examining two extant copies of the text (one which has received little attention in the historiography, the other apparently unknown) which are heavily annotated. It is first demonstrated that these copies contain annotations made by Seth Ward and Sir Christopher Wren. This article then examines one feature of Ward's and Wren's responses to the Discorsi, namely their decision to re-write several of Galileo's geometrical demonstrations into the language of symbolic algebra. It is argued that this type of active reading of period mathematical texts may have been part of the regular scholarly and pedagogical practices of early modern British mathematicians like Ward and Wren. A set of Appendices contains a transcription and translation of the analytical solutions found in these annotated copies.

  4. Practical applications of surface analytic tools in tribology

    NASA Technical Reports Server (NTRS)

    Ferrante, J.

    1980-01-01

    A brief description of many of the widely used tools is presented. Of this list, those which have the highest applicability for giving elemental and/or compound analysis for problems of interest in tribology along with being truly surface sensitive (that is less than 10 atomic layers) are presented. The latter group is critiqued in detail in terms of strengths and weaknesses. Emphasis is placed on post facto analysis of experiments performed under real conditions (e.g., in air with lubricants). It is further indicated that such equipment could be used for screening and quality control.

  5. Practical applications of surface analytic tools in tribology

    NASA Technical Reports Server (NTRS)

    Ferrante, J.

    1980-01-01

    Many of the currently, widely used tools available for surface analysis are described. Those which have the highest applicability for giving elemental and/or compound analysis for problems of interest in tribology and are truly surface sensitive (that is, less than 10 atomic layers) are presented. The latter group is evaluated in detail in terms of strengths and weaknesses. Emphasis is placed on post facto analysis of experiments performed under 'real' conditions (e.g., in air with lubricants). It is further indicated that such equipment could be used for screening and quality control.

  6. Process models: analytical tools for managing industrial energy systems

    SciTech Connect

    Howe, S O; Pilati, D A; Balzer, C; Sparrow, F T

    1980-01-01

    How the process models developed at BNL are used to analyze industrial energy systems is described and illustrated. Following a brief overview of the industry modeling program, the general methodology of process modeling is discussed. The discussion highlights the important concepts, contents, inputs, and outputs of a typical process model. A model of the US pulp and paper industry is then discussed as a specific application of process modeling methodology. Applications addressed with the case study results include projections of energy demand, conservation technology assessment, energy-related tax policies, and sensitivity analysis. A subsequent discussion of these results supports the conclusion that industry process models are versatile and powerful tools for managing industrial energy systems.

  7. Analytical and Semi-Analytical Tools for the Design of Oscillatory Pumping Tests.

    PubMed

    Cardiff, Michael; Barrash, Warren

    2015-01-01

    Oscillatory pumping tests, in which flow is varied in a periodic fashion, provide a method for understanding aquifer heterogeneity that is complementary to strategies such as slug testing and constant-rate pumping tests. During oscillatory testing, pressure data collected at non-pumping wells can be processed to extract metrics, such as signal amplitude and phase lag, from a time series. These metrics are robust against common sensor problems (including drift and noise) and have been shown to provide information about aquifer heterogeneity. Field implementations of oscillatory pumping tests for characterization, however, are not common and thus there are few guidelines for their design and implementation. Here, we use available analytical solutions from the literature to develop design guidelines for oscillatory pumping tests, while considering practical field constraints. We present two key analytical results for design and analysis of oscillatory pumping tests. First, we provide methods for choosing testing frequencies and flow rates which maximize the signal amplitude that can be expected at a distance from an oscillating pumping well, given design constraints such as maximum/minimum oscillator frequency and maximum volume cycled. Preliminary data from field testing helps to validate the methodology. Second, we develop a semi-analytical method for computing the sensitivity of oscillatory signals to spatially distributed aquifer flow parameters. This method can be quickly applied to understand the "sensed" extent of an aquifer at a given testing frequency. Both results can be applied given only bulk aquifer parameter estimates, and can help to optimize design of oscillatory pumping test campaigns. PMID:25535805
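    The amplitude and phase-lag metrics described above can be extracted from an observation-well record by ordinary least squares once the stimulation frequency is known. The sketch below fits a sine/cosine pair plus a constant to a synthetic record with drift and noise; the frequency, amplitude, and noise levels are hypothetical, not the authors' field data or code.

```python
# Sketch of extracting oscillatory-test metrics (amplitude, phase lag) from a
# head record at a known pumping frequency; the record below is synthetic.
import numpy as np

def amplitude_and_phase(t, head, freq_hz):
    """Fit head ~ a*cos(wt) + b*sin(wt) + c; return (amplitude, phase) for A*cos(wt + phase)."""
    w = 2.0 * np.pi * freq_hz
    A = np.column_stack([np.cos(w * t), np.sin(w * t), np.ones_like(t)])
    (a, b, _), *_ = np.linalg.lstsq(A, head, rcond=None)
    return np.hypot(a, b), np.arctan2(-b, a)

# Synthetic record: 1 mHz oscillation with 3 cm amplitude, 0.6 rad lag, drift, noise
rng = np.random.default_rng(2)
t = np.arange(0.0, 4000.0, 1.0)                               # seconds
head = 0.03 * np.cos(2.0 * np.pi * 1.0e-3 * t - 0.6)
head += 2.0e-6 * t + rng.normal(0.0, 0.005, t.size)           # slow drift + sensor noise
amp, phase = amplitude_and_phase(t, head, freq_hz=1.0e-3)
print(f"amplitude ~ {amp:.3f} m, phase lag ~ {-phase:.2f} rad")
```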

  8. categoryCompare, an analytical tool based on feature annotations

    PubMed Central

    Flight, Robert M.; Harrison, Benjamin J.; Mohammad, Fahim; Bunge, Mary B.; Moon, Lawrence D. F.; Petruska, Jeffrey C.; Rouchka, Eric C.

    2014-01-01

    Assessment of high-throughput omics data initially focuses on relative or raw levels of a particular feature, such as an expression value for a transcript, protein, or metabolite. At a second level, analyses of annotations, including known or predicted functions and associations of each individual feature, attempt to distill biological context. Most currently available comparative- and meta-analysis methods are dependent on the availability of identical features across data sets, and concentrate on determining features that are differentially expressed across experiments, some of which may be considered "biomarkers." The heterogeneity of measurement platforms and inherent variability of biological systems confounds the search for robust biomarkers indicative of a particular condition. In many instances, however, multiple data sets show involvement of common biological processes or signaling pathways, even though individual features are not commonly measured or differentially expressed between them. We developed a methodology, categoryCompare, for cross-platform and cross-sample comparison of high-throughput data at the annotation level. We assessed the utility of the approach using hypothetical data, as well as determining similarities and differences in the set of processes in two instances: (1) denervated skin vs. denervated muscle, and (2) colon from Crohn's disease vs. colon from ulcerative colitis (UC). The hypothetical data showed that in many cases comparing annotations gave superior results to comparing only at the gene level. Improved analytical results depended as well on the number of genes included in the annotation term, the amount of noise in relation to the number of genes expressing in unenriched annotation categories, and the specific method in which samples are combined. In the skin vs. muscle denervation comparison, the tissues demonstrated markedly different responses. The Crohn's vs. UC comparison showed gross similarities in inflammatory
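    A much-simplified version of the annotation-level comparison described above is sketched below: each annotation category is scored by hypergeometric enrichment within each experiment's gene list, and the enriched categories, rather than the genes, are compared across experiments. The gene sets, categories, and cutoff are hypothetical, and this is not the authors' categoryCompare implementation.

```python
# Simplified, hypothetical sketch of annotation-level comparison: score each
# category by hypergeometric enrichment per experiment, then intersect the
# enriched categories across experiments (not the categoryCompare code itself).
from scipy.stats import hypergeom

def enriched_categories(hits, categories, universe_size, alpha=0.05):
    """hits: set of significant genes; categories: dict name -> set of member genes."""
    enriched = {}
    for name, members in categories.items():
        k = len(hits & members)                                    # hits inside the category
        p = hypergeom.sf(k - 1, universe_size, len(members), len(hits))
        if p < alpha:
            enriched[name] = p
    return enriched

categories = {"inflammation": set(range(0, 40)), "axon_growth": set(range(40, 90))}
exp1_hits = set(range(0, 25)) | {95, 96}       # genes significant in experiment 1
exp2_hits = set(range(5, 30)) | {97, 98}       # genes significant in experiment 2
shared = set(enriched_categories(exp1_hits, categories, 1000)) & \
         set(enriched_categories(exp2_hits, categories, 1000))
print("annotation categories shared across experiments:", shared)
```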

  9. Laser: a Tool for Optimization and Enhancement of Analytical Methods

    SciTech Connect

    Preisler, Jan

    1997-01-01

    In this work, we use lasers to enhance the possibilities of laser desorption methods and to optimize a coating procedure for capillary electrophoresis (CE). We use several different instrumental arrangements to characterize matrix-assisted laser desorption (MALD) at atmospheric pressure and in vacuum. In imaging mode, a 488-nm argon-ion laser beam is deflected by two acousto-optic deflectors to scan plumes desorbed at atmospheric pressure via absorption. All absorbing species, including neutral molecules, are monitored. Interesting features, e.g. differences between the initial plume and subsequent plumes desorbed from the same spot, or the formation of two plumes from one laser shot, are observed. Total plume absorbance can be correlated with the acoustic signal generated by the desorption event. A model equation for the plume velocity as a function of time is proposed. Alternatively, the use of a static laser beam for observation enables reliable determination of plume velocities even when they are very high. Static scattering detection reveals the negative influence of particle spallation on the MS signal. Ion formation during MALD was monitored using 193-nm light to photodissociate a portion of the insulin ion plume. These results define the optimal conditions for desorbing analytes from matrices, as opposed to achieving a compromise between efficient desorption and efficient ionization as is practiced in mass spectrometry. In the CE experiment, we examined changes in a poly(ethylene oxide) (PEO) coating by continuously monitoring the electroosmotic flow (EOF) in a fused-silica capillary during electrophoresis. An imaging CCD camera was used to follow the motion of a fluorescent neutral marker zone, excited by a 488-nm Ar-ion laser, along the length of the capillary. The PEO coating was shown to reduce the velocity of EOF by more than an order of magnitude compared to a bare capillary at pH 7.0. The coating protocol was important, especially at an intermediate pH of 7.7. The increase of p

  10. The RESET tephra database and associated analytical tools

    NASA Astrophysics Data System (ADS)

    Bronk Ramsey, Christopher; Housley, Rupert A.; Lane, Christine S.; Smith, Victoria C.; Pollard, A. Mark

    2015-06-01

    An open-access database has been set up to support the research project studying the 'Response of Humans to Abrupt Environmental Transitions' (RESET). The main methodology underlying this project was to use tephra layers to tie together and synchronise the chronologies of stratigraphic records at archaeological and environmental sites. The database has information on occurrences, and chemical compositions, of glass shards from tephra and cryptotephra deposits found across Europe. The data includes both information from the RESET project itself and from the published literature. With over 12,000 major element analyses and over 3000 trace element analyses on glass shards, relevant to 80 late Quaternary eruptions, the RESET project has generated an important archive of data. When added to the published information, the database described here has a total of more than 22,000 major element analyses and nearly 4000 trace element analyses on glass from over 240 eruptions. In addition to the database and its associated data, new methods of data analysis for assessing correlations have been developed as part of the project. In particular an approach using multi-dimensional kernel density estimates to evaluate the likelihood of tephra compositions matching is described here and tested on data generated as part of the RESET project.
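
    A minimal sketch of the kernel-density matching idea, assuming hypothetical major-element compositions; scipy's Gaussian KDE stands in for the project's multi-dimensional density estimates, and the consistency threshold is an arbitrary illustrative choice.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Hypothetical reference analyses for one eruption (rows = shards; columns
# might be SiO2, K2O, CaO in wt%); values are invented.
reference = np.array([
    [72.1, 4.8, 1.30],
    [71.8, 4.9, 1.40],
    [72.4, 4.7, 1.20],
    [71.9, 5.0, 1.30],
    [72.0, 4.8, 1.35],
])

# gaussian_kde expects variables in rows and observations in columns
kde = gaussian_kde(reference.T)

# Candidate shard from an archaeological site (hypothetical values)
candidate = np.array([[72.05], [4.85], [1.32]])
density = kde(candidate)[0]

# Compare the candidate's density with the densities of the reference shards
# themselves to judge whether it falls within the eruption's compositional field
ref_densities = kde(reference.T)
consistent = density >= np.percentile(ref_densities, 5)
```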

  11. Desktop modeling as a management tool for budgeting, forecasting, and reporting in an analytical laboratory

    SciTech Connect

    Hodge, C.A.

    1995-07-01

    Managers are often required to quickly and accurately estimate resource needs. At times, additional work can be absorbed without additional resources. At other times, threshold resource boundaries are exceeded requiring an additional quantum of a specific resource. Cost savings' estimates, resulting from a reduction in efforts, are also increasingly becoming a requirement of today's managers. The modeling effort described in this paper was designed to estimate instrumentation and manpower resource needs for an analytical laboratory. It was written using only simple spreadsheet software. Analysis can be readily performed with a minimum of input and results obtained in a matter of minutes. This model has been tuned with many years of empirical data yielding a high degree of capability. The model was expanded to meet other needs. It can be used to justify capital expenditure when the ultimate result is cost savings; to examine procedures and operations for efficiency increases; and for reporting and regulatory compliance. This paper demonstrates that accurate and credible estimates of resource needs can be readily obtained with a minimum of effort or specialized knowledge employing only tools that are readily available in today's business environment.

  12. Non invasive ventilation as an additional tool for exercise training.

    PubMed

    Ambrosino, Nicolino; Cigni, Paolo

    2015-01-01

    Recently, there has been increasing interest in the use of non invasive ventilation (NIV) to increase exercise capacity. In individuals with COPD, NIV during exercise reduces dyspnoea and increases exercise tolerance. Different modalities of mechanical ventilation have been used non-invasively as a tool to increase exercise tolerance in COPD, heart failure and lung and thoracic restrictive diseases. Inspiratory support provides symptomatic benefit by unloading the ventilatory muscles, whereas Continuous Positive Airway Pressure (CPAP) counterbalances the intrinsic positive end-expiratory pressure in COPD patients. Severe stable COPD patients undergoing home nocturnal NIV and daytime exercise training showed some benefits. Furthermore, it has been reported that in chronic hypercapnic COPD under long-term ventilatory support, NIV can also be administered during walking. Despite these results, the role of NIV as a routine component of pulmonary rehabilitation is still to be defined. PMID:25874110

  13. Environmental equity research: review with focus on outdoor air pollution research methods and analytic tools.

    PubMed

    Miao, Qun; Chen, Dongmei; Buzzelli, Michael; Aronson, Kristan J

    2015-01-01

    The objective of this study was to review environmental equity research on outdoor air pollution and, specifically, methods and tools used in research, published in English, with the aim of recommending the best methods and analytic tools. English language publications from 2000 to 2012 were identified in Google Scholar, Ovid MEDLINE, and PubMed. Research methodologies and results were reviewed and potential deficiencies and knowledge gaps identified. The publications show that exposure to outdoor air pollution differs by social factors, but findings are inconsistent in Canada. In terms of study designs, most were small and ecological and therefore prone to the ecological fallacy. Newer tools such as geographic information systems, modeling, and biomarkers offer improved precision in exposure measurement. Higher-quality research using large, individual-based samples and more precise analytic tools is needed to provide better evidence for policy-making to reduce environmental inequities.

  14. Analytic method for three-center nuclear attraction integrals: a generalization of the Gegenbauer addition theorem

    SciTech Connect

    Weatherford, C.A.

    1988-01-01

    A completely analytic method for evaluating three-center nuclear-attraction integrals for STOs is presented. The method exploits a separation of the STO into an evenly loaded solid harmonic and a 0s STO. The harmonics are translated to the molecular center of mass in closed finite terms. The 0s STO is translated using the Gegenbauer addition theorem; 1s STOs are translated using a single parametric differentiation of the 0s formula. Explicit formulas for the integrals are presented for arbitrarily located atoms. A numerical example is given to illustrate the method.

  15. 33 CFR 385.33 - Revisions to models and analytical tools.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 3 2014-07-01 2014-07-01 false Revisions to models and... Incorporating New Information Into the Plan § 385.33 Revisions to models and analytical tools. (a) In carrying... Management District, and other non-Federal sponsors shall rely on the best available science including...

  16. (Re)braiding to Tell: Using "Trenzas" as a Metaphorical-Analytical Tool in Qualitative Research

    ERIC Educational Resources Information Center

    Quiñones, Sandra

    2016-01-01

    Metaphors can be used in qualitative research to illuminate the meanings of participant experiences and examine phenomena from insightful and creative perspectives. The purpose of this paper is to illustrate how I utilized "trenzas" (braids) as a metaphorical and analytical tool for understanding the experiences and perspectives of…

  17. Understanding Emotions as Situated, Embodied, and Fissured: Thinking with Theory to Create an Analytical Tool

    ERIC Educational Resources Information Center

    Kuby, Candace R.

    2014-01-01

    An emerging theoretical perspective is that emotions are a verb or something we do in relation to others. Studies that demonstrate ways to analyze emotions from a performative stance are scarce. In this article, a new analytical tool is introduced; a critical performative analysis of emotion (CPAE) that draws upon three theoretical perspectives:…

  18. The Use of Economic Analytical Tools in Quantifying and Measuring Educational Benefits and Costs.

    ERIC Educational Resources Information Center

    Holleman, I. Thomas, Jr.

    The general objective of this study was to devise quantitative guidelines that school officials can accurately follow in using benefit-cost analysis, cost-effectiveness analysis, ratio analysis, and other similar economic analytical tools in their particular local situations. Specifically, the objectives were to determine guidelines for the…

  19. Social Capital: An Analytical Tool for Exploring Lifelong Learning and Community Development. CRLRA Discussion Paper.

    ERIC Educational Resources Information Center

    Kilpatrick, Sue; Field, John; Falk, Ian

    The possibility of using the concept of social capital as an analytical tool for exploring lifelong learning and community development was examined. The following were among the topics considered: (1) differences between definitions of the concept of social capital that are based on collective benefit and those that define social capital as a…

  20. Designing a Collaborative Visual Analytics Tool for Social and Technological Change Prediction.

    SciTech Connect

    Wong, Pak C.; Leung, Lai-Yung R.; Lu, Ning; Scott, Michael J.; Mackey, Patrick S.; Foote, Harlan P.; Correia, James; Taylor, Zachary T.; Xu, Jianhua; Unwin, Stephen D.; Sanfilippo, Antonio P.

    2009-09-01

    We describe our ongoing efforts to design and develop a collaborative visual analytics tool to interactively model social and technological change of our society in a future setting. The work involves an interdisciplinary team of scientists from atmospheric physics, electrical engineering, building engineering, social sciences, economics, public policy, and national security. The goal of the collaborative tool is to predict the impact of global climate change on the U.S. power grids and their implications for society and national security. These future scenarios provide critical assessment and information necessary for policymakers and stakeholders to help formulate a coherent, unified strategy toward shaping a safe and secure society. The paper introduces the problem background and related work, explains the motivation and rationale behind our design approach, presents our collaborative visual analytics tool and usage examples, and finally shares the development challenges and lessons learned from our investigation.

  1. Volume, Variety and Veracity of Big Data Analytics in NASA's Giovanni Tool

    NASA Astrophysics Data System (ADS)

    Lynnes, C.; Hegde, M.; Smit, C.; Pan, J.; Bryant, K.; Chidambaram, C.; Zhao, P.

    2013-12-01

    Earth Observation data have posed challenges to NASA users ever since the launch of several satellites around the turn of the century, generating volumes now measured in petabytes, a volume growth further increased by models assimilating the satellite data. One important approach to bringing Big Data Analytic capabilities to bear on the Volume of data has been the provision of server-side analysis capabilities. For instance, the Geospatial Interactive Online Visualization ANd aNalysis (Giovanni) tool provides a web interface to large volumes of gridded data from several EOSDIS data centers. Giovanni's main objective is to allow the user to explore its data holdings using various forms of visualization and data summarization or aggregation algorithms, thus allowing the user to examine statistics and pictures for the overall data, while eventually acquiring only the most useful data. Thus much of the preprocessing and data reduction aspects can take place on the server, delivering manageable information quantities to the user. In addition to Volume, Giovanni uses open standards to tackle the Variety aspect of Big Data, incorporating data stored in several formats, from several data centers, and making them available in a uniform data format and structure to both the Giovanni algorithms and the end user. The Veracity aspect of Big Data, perhaps the stickiest of wickets, is enhanced through features that enable reproducibility (provenance and URL-driven workflows), and by a Help Desk staffed by scientists with expertise in the science data.

  2. Hasse diagram as a green analytical metrics tool: ranking of methods for benzo[a]pyrene determination in sediments.

    PubMed

    Bigus, Paulina; Tsakovski, Stefan; Simeonov, Vasil; Namieśnik, Jacek; Tobiszewski, Marek

    2016-05-01

    This study presents an application of the Hasse diagram technique (HDT) as an assessment tool to select the most appropriate analytical procedures according to their greenness or the best analytical performance. The dataset consists of analytical procedures for benzo[a]pyrene determination in sediment samples, which were described by 11 variables concerning their greenness and analytical performance. Two analyses with the HDT were performed: the first one with metrological variables and the second one with "green" variables as input data. Both HDT analyses ranked different analytical procedures as the most valuable, suggesting that green analytical chemistry is not in accordance with metrology when benzo[a]pyrene in sediment samples is determined. The HDT can be used as a good decision support tool to choose the proper analytical procedure with respect to green analytical chemistry principles and analytical performance merits.
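
    The HDT builds a partial order from multi-criteria dominance relations. The sketch below shows that core step on invented greenness scores for four hypothetical procedures; it illustrates the ranking principle only and is not the authors' implementation.

```python
def dominates(a, b):
    """Procedure a dominates b if it is at least as good on every criterion
    and strictly better on at least one (higher scores assumed better)."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

# Hypothetical greenness scores for four benzo[a]pyrene procedures
# (e.g. low solvent use, low energy, low waste), rescaled so higher = greener
procedures = {
    "GC-MS_A":  (3, 2, 3),
    "HPLC-FLD": (2, 2, 2),
    "GC-MS_B":  (3, 3, 3),
    "IA-ELISA": (1, 3, 1),
}

order = [(a, b) for a in procedures for b in procedures
         if a != b and dominates(procedures[a], procedures[b])]

# Cover relations of the Hasse diagram: a covers b if a dominates b with no
# procedure strictly in between
covers = [(a, b) for (a, b) in order
          if not any((a, c) in order and (c, b) in order for c in procedures)]

# Maximal elements: procedures not dominated by any other
maximal = [a for a in procedures if not any(x == a for (_, x) in order)]
```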

  3. Simultaneous determination of antazoline and naphazoline by the net analyte signal standard addition method and spectrophotometric technique.

    PubMed

    Asadpour-Zeynali, Karim; Ghavami, Raoof; Esfandiari, Roghayeh; Soheili-Azad, Payam

    2010-01-01

    A novel net analyte signal standard addition method (NASSAM) was used for simultaneous determination of the drugs antazoline and naphazoline. The NASSAM can be applied for determination of analytes in the presence of known interferents. The proposed method eliminates the calibration and prediction steps of multivariate calibration methods; the determination is carried out in a single step for each analyte. In contrast to the H-point standard addition method, the accuracy of the predictions is independent of the shape of the analyte and interferent spectra. The net analyte signal concept was also used to calculate multivariate analytical figures of merit, such as LOD, selectivity, and sensitivity. The method was successfully applied to the simultaneous determination of antazoline and naphazoline in a commercial eye drop sample.
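
    A minimal numerical sketch of the net analyte signal standard-addition idea, using synthetic Gaussian "spectra" as stand-ins for antazoline and naphazoline (all spectra and concentrations are invented): the projection removes the interferent contribution, and a conventional standard-addition extrapolation then recovers the assumed concentration.

```python
import numpy as np

def nas_magnitude(spectrum, interferent_spectra):
    """Net analyte signal: the part of a measured spectrum orthogonal to the
    space spanned by the known interferent spectra (summarized here by its norm)."""
    X = np.column_stack(interferent_spectra)
    P = np.eye(X.shape[0]) - X @ np.linalg.pinv(X)   # projector onto interferent-free space
    return np.linalg.norm(P @ spectrum)

# --- Synthetic illustration ---
wl = np.linspace(220, 320, 101)
analyte_pure = np.exp(-((wl - 250) / 10) ** 2)       # stands in for antazoline
interferent  = np.exp(-((wl - 280) / 15) ** 2)       # stands in for naphazoline

c_true = 1.5                                         # "unknown" concentration (a.u.)
added = np.array([0.0, 1.0, 2.0, 3.0])               # standard additions of the analyte
mixtures = [(c_true + a) * analyte_pure + 0.8 * interferent for a in added]

nas = np.array([nas_magnitude(m, [interferent]) for m in mixtures])
slope, intercept = np.polyfit(added, nas, 1)
print("estimated concentration:", intercept / slope)  # ~1.5
```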

  4. Analytical and Empirical Modeling of Wear and Forces of CBN Tool in Hard Turning - A Review

    NASA Astrophysics Data System (ADS)

    Patel, Vallabh Dahyabhai; Gandhi, Anishkumar Hasmukhlal

    2016-06-01

    Machining of steel material having hardness above 45 HRC (Hardness-Rockwell C) is referred to as hard turning. There are numerous models which should be scrutinized and implemented to gain optimum performance in hard turning. Various models of hard turning by a cubic boron nitride tool have been reviewed, in an attempt to utilize appropriate empirical and analytical models. Validation of the steady-state flank and crater wear model, Usui's wear model, forces due to oblique cutting theory, the extended Lee and Shaffer's force model, chip formation, and progressive flank wear have been depicted in this review paper. Effort has been made to understand the relationship between tool wear and tool force based on the different cutting conditions and tool geometries so that an appropriate model can be used according to user requirements in hard turning.
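
    For orientation, one commonly cited form of Usui's wear-rate model mentioned above is reproduced below; the exact formulation and the fitted constants A and B vary between studies, so this should be read as a representative expression rather than the specific model validated in the paper.

```latex
\frac{dW}{dt} \;=\; A\,\sigma_n\,V_s\,\exp\!\left(-\frac{B}{T}\right)
```

    where σn is the normal stress on the tool face, Vs the sliding velocity, and T the absolute temperature at the tool-chip interface.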

  5. The impact of layer thickness on the performance of additively manufactured lapping tools

    NASA Astrophysics Data System (ADS)

    Williams, Wesley B.

    2015-10-01

    Lower cost additive manufacturing (AM) machines which have emerged in recent years are capable of producing tools, jigs, and fixtures that are useful in optical fabrication. In particular, AM tooling has been shown to be useful in lapping glass workpieces. Various AM machines are distinguished by the processes, materials, build times, and build resolution they provide. This research investigates the impact of varied build resolution (specifically layer resolution) on the lapping performance of tools built using the stereolithographic assembly (SLA) process in 50 μm and 100 μm layer thicknesses with a methacrylate photopolymer resin on a high resolution desktop printer. As with previous work, the lapping tools were shown to remove workpiece material during the lapping process, but the tools themselves also experienced significant wear on the order of 2-3 times the mass loss of the glass workpieces. The tool wear rates for the 100 μm and 50 μm layer tools were comparable, but the 50 μm layer tool was 74% more effective at removing material from the glass workpiece, which is attributed to some abrasive particles being trapped in the coarser surface of the 100 μm layer tooling and not being available to interact with the glass workpiece. Considering the tool wear, these additively manufactured tools are most appropriate for prototype tooling where the low cost (<$45) and quick turnaround make them attractive when compared to a machined tool.

  6. Development of computer-based analytical tool for assessing physical protection system

    NASA Astrophysics Data System (ADS)

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts directed at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool offers a practical way to evaluate likely threat scenarios. Several currently available tools, such as EASI and SAPE, can be used directly; however, for our research purposes it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool that uses a network methodological approach to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response functions. The tool has the capability to analyze the most critical path and to quantify the probability of effectiveness of the system as a performance measure.
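
    A toy sketch of the network approach to adversary-path analysis: path segments carry assumed detection probabilities and delays, and the "most critical" path is the one that minimizes cumulative detection. The layout, probabilities, and delays are invented, and a fuller model (as in EASI-style analyses) would also weigh the delay remaining after first detection against the response-force time.

```python
import math
import networkx as nx

# Hypothetical protection layout: nodes are areas, edges are path segments
# annotated with a detection probability and a delay (seconds).
G = nx.DiGraph()
segments = [
    ("offsite",  "fence",    0.5, 20),
    ("fence",    "yard",     0.3, 40),
    ("yard",     "building", 0.7, 60),
    ("building", "target",   0.9, 90),
    ("offsite",  "gate",     0.2, 10),
    ("gate",     "yard",     0.4, 30),
]
for u, v, p_det, delay in segments:
    # Weight so that the shortest path minimizes cumulative detection probability
    G.add_edge(u, v, weight=-math.log(1.0 - p_det), p_det=p_det, delay=delay)

path = nx.shortest_path(G, "offsite", "target", weight="weight")  # most critical path
edges = list(zip(path[:-1], path[1:]))
p_detect_any = 1.0 - math.prod(1.0 - G[u][v]["p_det"] for u, v in edges)
total_delay = sum(G[u][v]["delay"] for u, v in edges)
# A complete effectiveness measure would compare the delay accumulated after the
# first detection with the response time to decide whether interruption succeeds.
```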

  7. Analytical optimal controls for the state constrained addition and removal of cryoprotective agents

    PubMed Central

    Chicone, Carmen C.; Critser, John K.

    2014-01-01

    Cryobiology is a field with enormous scientific, financial and even cultural impact. Successful cryopreservation of cells and tissues depends on the equilibration of these materials with high concentrations of permeating chemicals (CPAs) such as glycerol or 1,2 propylene glycol. Because cells and tissues are exposed to highly anisosmotic conditions, the resulting gradients cause large volume fluctuations that have been shown to damage cells and tissues. On the other hand, there is evidence that toxicity to these high levels of chemicals is time dependent, and therefore it is ideal to minimize exposure time as well. Because solute and solvent flux is governed by a system of ordinary differential equations, CPA addition and removal from cells is an ideal context for the application of optimal control theory. Recently, we presented a mathematical synthesis of the optimal controls for the ODE system commonly used in cryobiology in the absence of state constraints and showed that controls defined by this synthesis were optimal. Here we define the appropriate model, analytically extend the previous theory to one encompassing state constraints, and as an example apply this to the critical and clinically important cell type of human oocytes, where current methodologies are either difficult to implement or have very limited success rates. We show that an enormous increase in equilibration efficiency can be achieved under the new protocols when compared to classic protocols, potentially allowing a greatly increased survival rate for human oocytes, and pointing to a direction for the cryopreservation of many other cell types. PMID:22527943
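
    The ODE system referred to above is commonly written in a two-parameter form such as the one below; this is a generic sketch of that formalism, included for orientation, and not necessarily the exact system used in the cited work.

```latex
\frac{dV_w}{dt} = -L_p A R T \left(M^{e} - M^{i}\right), \qquad
\frac{ds}{dt} = P_s A \left(m_s^{e} - m_s^{i}\right)
```

    where V_w is the intracellular water volume, s the intracellular moles of permeating CPA, L_p the hydraulic conductivity, P_s the solute permeability, A the membrane area, and M and m_s the total and CPA osmolalities outside (e) and inside (i) the cell; the control variables are the extracellular concentrations applied over time, subject to constraints on cell volume.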

  8. A new analytical equation of state for additive hard sphere fluid mixtures

    NASA Astrophysics Data System (ADS)

    Barrio, C.; Solana, J. R.

    A study has been made of the relation between the equation of state of additive binary hard sphere fluid mixtures and the equation of state of a pure hard sphere fluid for the same packing fraction. An analysis of the existing simulation data for a wide variety of compositions of the mixture and diameter ratios up to 1/0.2 makes it possible to conclude that the ratio of the excess compressibility factor of the mixture to that of the pure fluid is, to a very good approximation, a linear function of the packing fraction. This suggests the possibility of deriving the equation of state of the mixture from that of the pure fluid by using the second and third virial coefficients of the mixture, which are known analytically, to reproduce the linear relation mentioned above. When a suitable equation of state is chosen for the pure fluid, the results from the equation of state of the mixture thus obtained are in excellent agreement with simulation data. The predictions for the fourth and fifth virial coefficients also are very accurate compared with known numerical data.

  9. Analytical Design Package (ADP2): A computer aided engineering tool for aircraft transparency design

    NASA Technical Reports Server (NTRS)

    Wuerer, J. E.; Gran, M.; Held, T. W.

    1994-01-01

    The Analytical Design Package (ADP2) is being developed as a part of the Air Force Frameless Transparency Program (FTP). ADP2 is an integrated design tool consisting of existing analysis codes and Computer Aided Engineering (CAE) software. The objective of the ADP2 is to develop and confirm an integrated design methodology for frameless transparencies, related aircraft interfaces, and their corresponding tooling. The application of this methodology will generate high confidence for achieving a qualified part prior to mold fabrication. ADP2 is a customized integration of analysis codes, CAE software, and material databases. The primary CAE integration tool for the ADP2 is P3/PATRAN, a commercial-off-the-shelf (COTS) software tool. The open architecture of P3/PATRAN allows customized installations with different applications modules for specific site requirements. Integration of material databases allows the engineer to select a material, and those material properties are automatically called into the relevant analysis code. The ADP2 materials database will be composed of four independent schemas: CAE Design, Processing, Testing, and Logistics Support. The design of ADP2 places major emphasis on the seamless integration of CAE and analysis modules with a single intuitive graphical interface. This tool is being designed to serve and be used by an entire project team, i.e., analysts, designers, materials experts, and managers. The final version of the software will be delivered to the Air Force in Jan. 1994. The Analytical Design Package (ADP2) will then be ready for transfer to industry. The package will be capable of a wide range of design and manufacturing applications.

  10. Analytical Ultracentrifugation as a Tool to Study Nonspecific Protein–DNA Interactions

    PubMed Central

    Yang, Teng-Chieh; Catalano, Carlos Enrique; Maluf, Nasib Karl

    2016-01-01

    Analytical ultracentrifugation (AUC) is a powerful tool that can provide thermodynamic information on associating systems. Here, we discuss how to use the two fundamental AUC applications, sedimentation velocity (SV), and sedimentation equilibrium (SE), to study nonspecific protein–nucleic acid interactions, with a special emphasis on how to analyze the experimental data to extract thermodynamic information. We discuss three specific applications of this approach: (i) determination of nonspecific binding stoichiometry of E. coli integration host factor protein to dsDNA, (ii) characterization of nonspecific binding properties of Adenoviral IVa2 protein to dsDNA using SE-AUC, and (iii) analysis of the competition between specific and nonspecific DNA-binding interactions observed for E. coli integration host factor protein assembly on dsDNA. These approaches provide powerful tools that allow thermodynamic interrogation and thus a mechanistic understanding of how proteins bind nucleic acids by both specific and nonspecific interactions. PMID:26412658

  11. Generalized net analyte signal standard addition as a novel method for simultaneous determination: application in spectrophotometric determination of some pesticides.

    PubMed

    Asadpour-Zeynali, Karim; Saeb, Elhameh; Vallipour, Javad; Bamorowat, Mehdi

    2014-01-01

    Simultaneous spectrophotometric determination of three neonicotinoid insecticides (acetamiprid, imidacloprid, and thiamethoxam) by a novel method named generalized net analyte signal standard addition method (GNASSAM) in some binary and ternary synthetic mixtures was investigated. For this purpose, standard addition was performed using a single standard solution consisting of a mixture of standards of all analytes. Savings in time and amount of used materials are some of the advantages of this method. All determinations showed appropriate applicability of this method with less than 5% error. This method may be applied for linearly dependent data in the presence of known interferents. The GNASSAM combines the advantages of both the generalized standard addition method and net analyte signal; therefore, it may be a proper alternative for some other multivariate methods. PMID:24672886

  12. Process analytical tools for monitoring, understanding, and control of pharmaceutical fluidized bed granulation: A review.

    PubMed

    Burggraeve, Anneleen; Monteyne, Tinne; Vervaet, Chris; Remon, Jean Paul; De Beer, Thomas

    2013-01-01

    Fluidized bed granulation is a widely applied wet granulation technique in the pharmaceutical industry to produce solid dosage forms. The process involves the spraying of a binder liquid onto fluidizing powder particles. As a result, the (wetted) particles collide with each other and form larger permanent aggregates (granules). After spraying the required amount of granulation liquid, the wet granules are rapidly dried in the fluid bed granulator. Since the FDA launched its Process Analytical Technology initiative (and even before), a wide range of analytical process sensors has been used for real-time monitoring and control of fluid bed granulation processes. By applying various data analysis techniques to the multitude of data collected from the process analyzers implemented in fluid bed granulators, a deeper understanding of the process has been achieved. This review gives an overview of the process analytical technologies used during fluid bed granulation to monitor and control the process. The fundamentals of the mechanisms contributing to wet granule growth and the characteristics of fluid bed granulation processing are briefly discussed. This is followed by a detailed overview of the in-line applied process analyzers, contributing to improved fluid bed granulation understanding, modeling, control, and endpoint detection. Analysis and modeling tools enabling the extraction of the relevant information from the complex data collected during granulation and the control of the process are highlighted.

  13. XCluSim: a visual analytics tool for interactively comparing multiple clustering results of bioinformatics data

    PubMed Central

    2015-01-01

    Background Though cluster analysis has become a routine analytic task for bioinformatics research, it is still arduous for researchers to assess the quality of a clustering result. To select the best clustering method and its parameters for a dataset, researchers have to run multiple clustering algorithms and compare them. However, such a comparison task with multiple clustering results is cognitively demanding and laborious. Results In this paper, we present XCluSim, a visual analytics tool that enables users to interactively compare multiple clustering results based on the Visual Information Seeking Mantra. We build a taxonomy for categorizing existing techniques of clustering results visualization in terms of the Gestalt principles of grouping. Using the taxonomy, we choose the most appropriate interactive visualizations for presenting individual clustering results from different types of clustering algorithms. The efficacy of XCluSim is shown through case studies with a bioinformatician. Conclusions Compared to other relevant tools, XCluSim enables users to compare multiple clustering results in a more scalable manner. Moreover, XCluSim supports diverse clustering algorithms and dedicated visualizations and interactions for different types of clustering results, allowing more effective exploration of details on demand. Through case studies with a bioinformatics researcher, we received positive feedback on the functionalities of XCluSim, including its ability to help identify stably clustered items across multiple clustering results. PMID:26328893

  14. Material Development for Tooling Applications Using Big Area Additive Manufacturing (BAAM)

    SciTech Connect

    Duty, Chad E.; Drye, Tom; Franc, Alan

    2015-03-01

    Techmer Engineered Solutions (TES) is working with Oak Ridge National Laboratory (ORNL) to develop materials and evaluate their use for ORNL's recently developed Big Area Additive Manufacturing (BAAM) system for tooling applications. The first phase of the project established the performance of some commercially available polymer compositions deposited with the BAAM system. Carbon fiber reinforced ABS demonstrated a tensile strength of nearly 10 ksi, which is sufficient for a number of low temperature tooling applications.

  15. Revisiting the use of 'place' as an analytic tool for elucidating geographic issues central to Canadian rural palliative care.

    PubMed

    Giesbrecht, Melissa; Crooks, Valorie A; Castleden, Heather; Schuurman, Nadine; Skinner, Mark W; Williams, Allison M

    2016-09-01

    In 2010, Castleden and colleagues published a paper in this journal using the concept of 'place' as an analytic tool to understand the nature of palliative care provision in a rural region in British Columbia, Canada. This publication was based upon pilot data collected for a larger research project that has since been completed. With the addition of 40 semi-structured interviews with users and providers of palliative care in four other rural communities located across Canada, we revisit Castleden and colleagues' (2010) original framework. Applying the concept of place to the full dataset confirmed the previously published findings, but also revealed two new place-based dimensions related to experiences of rural palliative care in Canada: (1) borders and boundaries; and (2) 'making' place for palliative care progress. These new findings offer a refined understanding of the complex interconnections between various dimensions of place and palliative care in rural Canada. PMID:27521815

  16. Analytical representation of the higher virial coefficients of binary mixtures of additive hard spheres

    NASA Astrophysics Data System (ADS)

    Barrio, C.; Solana, J. R.

    2003-01-01

    Approximate expressions for the fourth and fifth virial coefficients of binary hard-sphere fluid mixtures are derived. The procedure used to obtain these expressions is based on that previously proposed by Wheatley [J. chem. Phys., 111, 5455 (1999)], but slightly modified. Wheatley's procedure starts from a prescribed general analytical form of the virial coefficients, from which the particular expression for each virial coefficient is obtained by imposing a number of limiting conditions on the general form. Here, we propose an alternative general expression for the virial coefficients and derive one more condition. This condition is satisfied when the fourth and fifth virial coefficients are expressed in the form we propose, but not when they are expressed in Wheatley's form. The agreement of the proposed analytical expressions with exact numerical data is excellent. The procedure can be extended to higher virial coefficients, although the lack of exact numerical data prevents any comparison.

  17. Single-scan 2D NMR: An Emerging Tool in Analytical Spectroscopy

    PubMed Central

    Giraudeau, Patrick; Frydman, Lucio

    2016-01-01

    Two-dimensional Nuclear Magnetic Resonance (2D NMR) spectroscopy is widely used in chemical and biochemical analyses. Multidimensional NMR is also witnessing increased use in quantitative and metabolic screening applications. Conventional 2D NMR experiments, however, are affected by inherently long acquisition durations, arising from their need to sample the frequencies involved along their indirect domains in an incremented, scan-by-scan manner. A decade ago a so-called “ultrafast” (UF) approach was proposed, capable of delivering arbitrary 2D NMR spectra involving any kind of homo- or hetero-nuclear correlations, in a single scan. During the intervening years the performance of this sub-second 2D NMR methodology has been greatly improved, and UF 2D NMR is rapidly becoming a powerful analytical tool with an expanding scope of applications. The present review summarizes the principles and the main developments which have contributed to the success of this approach, and focuses on applications which have recently been demonstrated in various areas of analytical chemistry, from the real-time monitoring of chemical and biochemical processes to extensions in hyphenated techniques and in quantitative applications. PMID:25014342

  18. Electron Tomography: A Three-Dimensional Analytic Tool for Hard and Soft Materials Research

    PubMed Central

    Alaidi, Osama; Rames, Matthew J.

    2016-01-01

    Three-dimensional (3D) structural analysis is essential to understand the relationship between the structure and function of an object. Many analytical techniques, such as X-ray diffraction, neutron spectroscopy, and electron microscopy imaging, are used to provide structural information. Transmission electron microscopy (TEM), one of the most popular analytic tools, has been widely used for structural analysis in both physical and biological sciences for many decades, in which 3D objects are projected into two-dimensional (2D) images. In many cases, 2D-projection images are insufficient to understand the relationship between the 3D structure and the function of nanoscale objects. Electron tomography (ET) is a technique that retrieves 3D structural information from a tilt series of 2D projections, and is gradually becoming a mature technology with sub-nanometer resolution. Distinct methods to overcome sample-based limitations have been separately developed in both physical and biological science, although they share some basic concepts of ET. This review discusses the common basis for 3D characterization, and specifies difficulties and solutions regarding both hard and soft materials research. It is hoped that novel solutions based on current state-of-the-art techniques for advanced applications in hybrid matter systems can be motivated. PMID:26087941

  19. Stable isotope ratio analysis: A potential analytical tool for the authentication of South African lamb meat.

    PubMed

    Erasmus, Sara Wilhelmina; Muller, Magdalena; van der Rijst, Marieta; Hoffman, Louwrens Christiaan

    2016-02-01

    Stable isotope ratios ((13)C/(12)C and (15)N/(14)N) of South African Dorper lambs from farms with different vegetation types were measured by isotope ratio mass spectrometry (IRMS), to evaluate it as a tool for the authentication of origin and feeding regime. Homogenised and defatted meat of the Longissimus lumborum (LL) muscle of lambs from seven different farms was assessed. The δ(13)C values were affected by the origin of the meat, mainly reflecting the diet. The Rûens and Free State farms had the lowest (p ⩽ 0.05) δ(15)N values, followed by the Northern Cape farms, with Hantam Karoo/Calvinia having the highest δ(15)N values. Discriminant analysis showed δ(13)C and δ(15)N differences as promising results for the use of IRMS as a reliable analytical tool for lamb meat authentication. The results suggest that diet, linked to origin, is an important factor to consider regarding region of origin classification for South African lamb.

  20. The use of meta-analytical tools in risk assessment for food safety.

    PubMed

    Gonzales-Barron, Ursula; Butler, Francis

    2011-06-01

    This communication deals with the use of meta-analysis as a valuable tool for the synthesis of food safety research and in quantitative risk assessment modelling. A common methodology for the conduct of meta-analysis (i.e., systematic review and data extraction, parameterisation of effect size, estimation of overall effect size, assessment of heterogeneity, and presentation of results) is explained by reviewing two meta-analyses derived from separate sets of primary studies of Salmonella in pork. Integrating different primary studies, the first meta-analysis elucidated for the first time a relationship between the proportion of Salmonella-carrier slaughter pigs entering the slaughter lines and the resulting proportion of contaminated carcasses at the point of evisceration; a finding that the individual studies on their own could not reveal. On the other hand, the second application showed that meta-analysis can be used to estimate the overall effect of a critical process stage (chilling) on the incidence of the pathogen under study. The derivation of a relationship between variables and of a probabilistic distribution are illustrations of the valuable quantitative information synthesised by meta-analytical tools, which can be incorporated in risk assessment modelling. Strengths and weaknesses of meta-analysis within the context of food safety are also discussed.
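
    A compact sketch of the random-effects pooling step that underlies such meta-analyses (the DerSimonian-Laird estimator); the effect sizes and variances below are invented and are not the Salmonella data from the cited studies.

```python
import numpy as np

def random_effects_meta(effects, variances):
    """DerSimonian-Laird random-effects pooling of study effect sizes.

    effects: per-study effect sizes (e.g. log odds ratios)
    variances: their within-study variances
    Returns the pooled effect, its standard error, and the heterogeneity tau^2.
    """
    effects = np.asarray(effects, float)
    variances = np.asarray(variances, float)
    w = 1.0 / variances                       # fixed-effect weights
    fixed = np.sum(w * effects) / np.sum(w)
    Q = np.sum(w * (effects - fixed) ** 2)    # Cochran's heterogeneity statistic
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - df) / c)             # between-study variance
    w_star = 1.0 / (variances + tau2)         # random-effects weights
    pooled = np.sum(w_star * effects) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se, tau2

# Illustrative, made-up study estimates
pooled, se, tau2 = random_effects_meta([0.40, 0.55, 0.30, 0.62], [0.02, 0.04, 0.03, 0.05])
```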

  1. STRMDEPL08 - An Extended Version of STRMDEPL with Additional Analytical Solutions to Calculate Streamflow Depletion by Nearby Pumping Wells

    USGS Publications Warehouse

    Reeves, Howard W.

    2008-01-01

    STRMDEPL, a one-dimensional model using two analytical solutions to calculate streamflow depletion by a nearby pumping well, was extended to account for two additional analytical solutions. The extended program is named STRMDEPL08. The original program incorporated solutions for a stream that fully penetrates the aquifer with and without streambed resistance to ground-water flow. The modified program includes solutions for a partially penetrating stream with streambed resistance and for a stream in an aquitard subjected to pumping from an underlying leaky aquifer. The code also was modified to allow the user to input pumping variations at other than 1-day intervals. The modified code is shown to correctly evaluate the analytical solutions and to provide correct results for half-day time intervals.
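
    One classical member of this family of solutions, for a fully penetrating stream without streambed resistance (often attributed to Glover and Balmer), expresses the depletion fraction with a complementary error function. The sketch below evaluates it for assumed aquifer parameters; it illustrates the solution type and is not STRMDEPL08 code.

```python
from math import sqrt, erfc

def depletion_fraction(d, S, T, t):
    """Fraction of the pumping rate supplied by streamflow depletion for a
    fully penetrating stream with no streambed resistance.

    d: well-to-stream distance (m), S: storativity (-),
    T: transmissivity (m^2/d), t: time since pumping began (d).
    """
    return erfc(sqrt(d * d * S / (4.0 * T * t)))

# Illustrative numbers (assumed, not from the report)
q_over_Q = depletion_fraction(d=150.0, S=0.05, T=300.0, t=30.0)
```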

  2. Lowering the Barrier to Reproducible Research by Publishing Provenance from Common Analytical Tools

    NASA Astrophysics Data System (ADS)

    Jones, M. B.; Slaughter, P.; Walker, L.; Jones, C. S.; Missier, P.; Ludäscher, B.; Cao, Y.; McPhillips, T.; Schildhauer, M.

    2015-12-01

    Scientific provenance describes the authenticity, origin, and processing history of research products and promotes scientific transparency by detailing the steps in computational workflows that produce derived products. These products include papers, findings, input data, software products to perform computations, and derived data and visualizations. The geosciences community values this type of information, and, at least theoretically, strives to base conclusions on computationally replicable findings. In practice, capturing detailed provenance is laborious and thus has been a low priority; beyond a lab notebook describing methods and results, few researchers capture and preserve detailed records of scientific provenance. We have built tools for capturing and publishing provenance that integrate into analytical environments that are in widespread use by geoscientists (R and Matlab). These tools lower the barrier to provenance generation by automating capture of critical information as researchers prepare data for analysis, develop, test, and execute models, and create visualizations. The 'recordr' library in R and the `matlab-dataone` library in Matlab provide shared functions to capture provenance with minimal changes to normal working procedures. Researchers can capture both scripted and interactive sessions, tag and manage these executions as they iterate over analyses, and then prune and publish provenance metadata and derived products to the DataONE federation of archival repositories. Provenance traces conform to the ProvONE model extension of W3C PROV, enabling interoperability across tools and languages. The capture system supports fine-grained versioning of science products and provenance traces. By assigning global identifiers such as DOIs, researchers can cite the computational processes used to reach findings. And, finally, DataONE has built a web portal to search, browse, and clearly display provenance relationships between input data, the software

  3. Measuring the Bright Side of Being Blue: A New Tool for Assessing Analytical Rumination in Depression

    PubMed Central

    Barbic, Skye P.; Durisko, Zachary; Andrews, Paul W.

    2014-01-01

    Background Diagnosis and management of depression occur frequently in the primary care setting. Current diagnostic and treatment practices across clinical populations focus on eliminating signs and symptoms of depression. However, there is debate that some interventions may pathologize normal, adaptive responses to stressors. Analytical rumination (AR) is an example of an adaptive response of depression that is characterized by enhanced cognitive function to help an individual focus on, analyze, and solve problems. To date, research on AR has been hampered by the lack of theoretically-derived and psychometrically sound instruments. This study developed and tested a clinically meaningful measure of AR. Methods Using expert panels and an extensive literature review, we developed a conceptual framework for AR and 22 candidate items. Items were field tested with 579 young adults, 140 of whom completed the items at a second time point. We used Rasch measurement methods to construct and test the item set, and traditional psychometric analyses to compare items to existing rating scales. Results Data were high quality (<1% missing; high reliability: Cronbach's alpha = 0.92, test-retest intraclass correlations >0.81; evidence for divergent validity). Evidence of misfit for 2 items suggested that a 20-item scale with 4-point response categories best captured the concept of AR, fitting the Rasch model (χ2 = 95.26; df = 76, p = 0.07), with high reliability (rp = 0.86), ordered response scale structure, and no item bias (gender, age, time). Conclusion Our study provides evidence for a 20-item Analytical Rumination Questionnaire (ARQ) that can be used to quantify AR in adults who experience symptoms of depression. The ARQ is psychometrically robust and a clinically useful tool for the assessment and improvement of depression in the primary care setting. Future work is needed to establish the validity of this measure in people with major
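
    For reference, the dichotomous Rasch model that anchors this family of analyses is shown below; the 4-point ARQ items use a polytomous extension of the same idea.

```latex
P(X_{ni} = 1 \mid \theta_n, \delta_i) \;=\; \frac{\exp(\theta_n - \delta_i)}{1 + \exp(\theta_n - \delta_i)}
```

    where θn is person n's level of the latent trait (here, analytical rumination) and δi is the difficulty (endorsability) of item i.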

  4. Additives

    NASA Technical Reports Server (NTRS)

    Smalheer, C. V.

    1973-01-01

    The chemistry of lubricant additives is discussed to show what the additives are chemically and what functions they perform in the lubrication of various kinds of equipment. Current theories regarding the mode of action of lubricant additives are presented. The additive groups discussed include the following: (1) detergents and dispersants, (2) corrosion inhibitors, (3) antioxidants, (4) viscosity index improvers, (5) pour point depressants, and (6) antifouling agents.

  5. Applying decision-making tools to national e-waste recycling policy: an example of Analytic Hierarchy Process.

    PubMed

    Lin, Chun-Hsu; Wen, Lihchyi; Tsai, Yue-Mi

    2010-05-01

    As policy making is in essence a process of discussion, decision-making tools have in many cases been proposed to resolve the differences of opinion among the different parties. In our project that sought to promote a country's performance in recycling, we used the Analytic Hierarchy Process (AHP) to evaluate the possibilities and determine the priority of the addition of new mandatory recycled waste, also referred to as Due Recycled Wastes, from candidate waste appliances. The evaluation process started with the collection of data based on telephone interviews and field investigations to understand the behavior of consumers as well as their overall opinions regarding the disposal of certain waste appliances. With the data serving as background information, the research team then implemented the Analytic Hierarchy Process using the information that formed an incomplete hierarchy structure in order to determine the priority for recycling. Since the number of objects to be evaluated exceeded the number that the AHP researchers had suggested, we reclassified the objects into four groups and added one more level of pair-wise comparisons, which substantially reduced the inconsistency in the judgment of the AHP participants. The project was found to serve as a flexible and achievable application of AHP to the environmental policy-making process. In addition, based on the project's outcomes derived from the project as a whole, the research team drew conclusions regarding the government's need to take back 15 of the items evaluated, and suggested instruments that could be used or recycling regulations that could be changed in the future. Further analysis on the top three items recommended by the results of the evaluation for recycling, namely, Compact Disks, Cellular Phones and Computer Keyboards, was then conducted to clarify their concrete feasibility. After the trial period for recycling ordered by the Taiwan Environmental Protection Administration, only Computer
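
    The core AHP computation is the derivation of priority weights from a reciprocal pairwise-comparison matrix, together with a consistency check. The sketch below uses the principal-eigenvector method on an invented comparison matrix; the single-criterion framing, judgments, and item count are illustrative only.

```python
import numpy as np

def ahp_priorities(pairwise):
    """Derive AHP priority weights from a reciprocal pairwise-comparison matrix
    (principal eigenvector method) and report the consistency ratio."""
    A = np.asarray(pairwise, float)
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                  # priority vector
    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)          # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}[n]  # Saaty's random index (selected n)
    return w, ci / ri                             # weights, consistency ratio (< 0.1 is acceptable)

# Hypothetical comparison of four candidate waste items on a single criterion
A = [[1,   3,   5,   2],
     [1/3, 1,   3,   1/2],
     [1/5, 1/3, 1,   1/4],
     [1/2, 2,   4,   1]]
weights, cr = ahp_priorities(A)
```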

  7. Experimental model and analytic solution for real-time observation of vehicle's additional steer angle

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaolong; Li, Liang; Pan, Deng; Cao, Chengmao; Song, Jian

    2014-03-01

    Current research on real-time observation of the vehicle roll steer angle and compliance steer angle (together referred to in this paper as the additional steer angle) mainly employs a linear vehicle dynamic model in which only the lateral acceleration of the vehicle body is considered. The observation accuracy of this method cannot meet the requirements of real-time vehicle stability control, especially under extreme driving conditions. This paper explores a solution based on an experimental method. First, a multi-body dynamic model of a passenger car is built in the ADAMS/Car software, and its dynamic accuracy is verified against the same vehicle's roadway data from steady-state circular tests. Based on this simulation platform, several factors influencing the additional steer angle under different driving conditions are quantitatively analyzed. The ε-SVR algorithm is then employed to build the additional steer angle prediction model, whose input vectors mainly comprise the sensor information of a standard electronic stability control (ESC) system. Typical slalom tests and FMVSS 126 tests are adopted to run simulations, train the model, and test its generalization performance. The test results show that the influence of lateral acceleration on the additional steer angle is largest (magnitudes up to 1°), followed by longitudinal acceleration-deceleration and road wave amplitude (magnitudes up to 0.3°). Moreover, both the prediction accuracy and the real-time computational performance of the model meet the control requirements of ESC. This research expands the available methods for accurate observation of the additional steer angle under extreme driving conditions.
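
    A minimal sketch of the ε-SVR prediction step described above, using scikit-learn on synthetic data standing in for ESC sensor channels and simulated additional steer angles; the feature set, target relationship, and hyperparameters are assumptions, not values from the paper.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Hypothetical training set built from multi-body simulation runs: standard ESC
# sensor channels as inputs, additional steer angle (deg) as the target.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))   # columns: lateral acc., long. acc., yaw rate, steering angle
y = 0.8 * X[:, 0] + 0.2 * X[:, 1] + 0.05 * rng.normal(size=500)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.01))
model.fit(X, y)
pred = model.predict(X[:5])     # online prediction would run on streaming ESC signals
```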

  8. Topological data analysis: A promising big data exploration tool in biology, analytical chemistry and physical chemistry.

    PubMed

    Offroy, Marc; Duponchel, Ludovic

    2016-03-01

    An important feature of experimental science is that data of various kinds are being produced at an unprecedented rate, mainly due to the development of new instrumental concepts and experimental methodologies. It is also clear that the nature of the acquired data is changing significantly. Indeed, in every area of science, data take the form of ever bigger tables, where all but a few of the columns (i.e. variables) turn out to be irrelevant to the questions of interest, and we do not necessarily know in advance which coordinates are the interesting ones. Big data in biology, analytical chemistry, or physical chemistry laboratories is a future that might be closer than any of us suppose. In this sense, new tools have to be developed in order to explore and exploit such data sets. Topological data analysis (TDA) is one of these. It was developed recently by topologists who discovered that topological concepts could be useful for data analysis. The main objective of this paper is to answer the question of why topology is well suited for the analysis of big data sets in many areas, and even more efficient than conventional data analysis methods. Raman analysis of single bacteria provides a good opportunity to demonstrate the potential of TDA for the exploration of various spectroscopic data sets under different experimental conditions (high noise levels, with/without spectral preprocessing, wavelength shifts, different spectral resolutions, missing data). PMID:26873463

  10. Sugar Maple Pigments Through the Fall and the Role of Anthocyanin as an Analytical Tool

    NASA Astrophysics Data System (ADS)

    Lindgren, E.; Rock, B.; Middleton, E.; Aber, J.

    2008-12-01

    Sugar maple habitat is projected to almost disappear in future climate scenarios. In fact, many institutions state that these trees are already in decline. Being able to detect sugar maple health could prove to be a useful analytical tool to monitor changes in phenology. Anthocyanin, a red pigment found in sugar maples, is thought to be a universal indicator of plant stress. It is very prominent in the spring during the first flush of leaves, as well as in the fall as leaves senesce. Determining an anthocyanin index that could be used with satellite systems will provide a greater understanding of tree phenology and the distribution of plant stress, both over large areas as well as changes over time. The utilization of anthocyanin for one of its functions, prevention of oxidative stress, may fluctuate in response to changing climatic conditions that occur during senescence or vary from year to year. By monitoring changes in pigment levels and antioxidant capacity through the fall, one may be able to draw conclusions about the ability to detect anthocyanin remotely from space-based systems, and possibly determine a more specific function for anthocyanin during fall senescence. These results could then be applied to track changes in tree stress.

  11. Raman-in-SEM, a multimodal and multiscale analytical tool: performance for materials and expertise.

    PubMed

    Wille, Guillaume; Bourrat, Xavier; Maubec, Nicolas; Lahfid, Abdeltif

    2014-12-01

    The availability of Raman spectroscopy in a powerful analytical scanning electron microscope (SEM) allows morphological, elemental, chemical, physical and electronic analysis without moving the sample between instruments. This paper documents the metrological performance of the SEMSCA commercial Raman interface operated in a low vacuum SEM. It provides multiscale and multimodal analyses such as Raman/EDS, Raman/cathodoluminescence or Raman/STEM (STEM: scanning transmission electron microscopy) as well as Raman spectroscopy on nanomaterials. Since Raman spectroscopy in a SEM can be influenced by several SEM-related phenomena, this paper firstly presents a comparison of this new tool with a conventional micro-Raman spectrometer. Then, some possible artefacts are documented, which are due to the impact of electron beam-induced contamination or cathodoluminescence contribution to the Raman spectra, especially with geological samples. These effects are easily overcome by changing or adapting the Raman spectrometer and the SEM settings and methodology. The adverse effect of cathodoluminescence is eliminated by using an SEM beam shutter during Raman acquisition. In contrast, this interface provides the ability to record the cathodoluminescence (CL) spectrum of a phase. In a second part, this study highlights the value and efficiency of the coupling in characterizing micrometric phases at the same point. This multimodal approach is illustrated with various issues encountered in geosciences.

  12. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ....120 Public Contracts and Property Management Federal Property Management Regulations System (Continued... Prevention Equivalent Level of Safety Analysis § 102-80.120 What analytical and empirical tools should be... models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to...

  13. Big Data Geo-Analytical Tool Development for Spatial Analysis Uncertainty Visualization and Quantification Needs

    NASA Astrophysics Data System (ADS)

    Rose, K.; Bauer, J. R.; Baker, D. V.

    2015-12-01

    As big data computing capabilities are increasingly paired with spatial analytical tools and approaches, there is a need to ensure that the uncertainty associated with the datasets used in these analyses is adequately incorporated and portrayed in results. Often the products of spatial analyses, big data and otherwise, are developed using discontinuous, sparse, and often point-driven data to represent continuous phenomena. Results from these analyses are generally presented without clear explanations of the uncertainty associated with the interpolated values. The Variable Grid Method (VGM) offers users a flexible approach designed for application to a variety of analyses where there is a need to study, evaluate, and analyze spatial trends and patterns while maintaining a connection to, and communicating, the uncertainty in the underlying spatial datasets. The VGM outputs a simultaneous visualization representative of the spatial data analyses and quantification of underlying uncertainties, which can be calculated using data related to sample density, sample variance, interpolation error, or uncertainty calculated from multiple simulations. In this presentation we will show how we are utilizing Hadoop to store and perform spatial analysis through the development of custom Spark and MapReduce applications that incorporate ESRI Hadoop libraries. The team will present custom 'Big Data' geospatial applications that run on the Hadoop cluster and integrate ESRI ArcMap with the team's probabilistic VGM approach. The VGM-Hadoop tool has been specially built as a multi-step MapReduce application running on the Hadoop cluster for the purpose of data reduction. This reduction is accomplished by generating multi-resolution, non-overlapping, attributed topology that is then further processed using ESRI's geostatistical analyst to convey a probabilistic model of a chosen study region. Finally, we will share our approach for implementation of data reduction and topology generation
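
    The essence of reporting an interpolated value together with a cell-level uncertainty can be sketched without Hadoop, Spark or the ESRI libraries. The quadtree-style refinement rule and the use of the per-cell sample variance as the uncertainty measure below are assumptions for illustration only, not the VGM implementation described above.

        import numpy as np

        def variable_grid(points, values, bounds, max_pts=50, min_size=0.01):
            """Quadtree-style variable grid: cells are refined where samples are dense.
            Each leaf reports the local mean plus a simple uncertainty proxy
            (sample variance and count).  Illustrative only."""
            x0, x1, y0, y1 = bounds
            inside = (points[:, 0] >= x0) & (points[:, 0] < x1) & \
                     (points[:, 1] >= y0) & (points[:, 1] < y1)
            vals = values[inside]
            if len(vals) <= max_pts or (x1 - x0) <= min_size:        # leaf cell
                if len(vals) == 0:
                    return []
                var = vals.var(ddof=1) if len(vals) > 1 else np.nan
                return [(bounds, vals.mean(), var, len(vals))]
            xm, ym = 0.5 * (x0 + x1), 0.5 * (y0 + y1)                # otherwise split into 4
            leaves = []
            for bx in ((x0, xm), (xm, x1)):
                for by in ((y0, ym), (ym, y1)):
                    leaves += variable_grid(points[inside], values[inside],
                                            (bx[0], bx[1], by[0], by[1]), max_pts, min_size)
            return leaves

        # toy usage: dense synthetic samples of a smooth field plus noise
        rng = np.random.default_rng(1)
        pts = rng.uniform(0, 1, size=(2000, 2))
        vals = np.sin(4 * pts[:, 0]) + 0.1 * rng.normal(size=2000)
        cells = variable_grid(pts, vals, (0.0, 1.0, 0.0, 1.0))
        print(len(cells), "cells; first cell:", cells[0])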

  14. Manuscript title: antifungal proteins from moulds: analytical tools and potential application to dry-ripened foods.

    PubMed

    Delgado, Josué; Owens, Rebecca A; Doyle, Sean; Asensio, Miguel A; Núñez, Félix

    2016-08-01

    Moulds growing on the surface of dry-ripened foods contribute to their sensory qualities, but some of them are able to produce mycotoxins that pose a hazard to consumers. Small cysteine-rich antifungal proteins (AFPs) from moulds are highly stable to pH and proteolysis and exhibit a broad inhibition spectrum against filamentous fungi, providing new chances to control hazardous moulds in fermented foods. The analytical tools for characterizing the cellular targets and affected pathways are reviewed. Strategies currently employed to study these mechanisms of action include 'omics' approaches that have come to the forefront in recent years, developing in tandem with genome sequencing of relevant organisms. These techniques contribute to a better understanding of the response of moulds against AFPs, allowing the design of complementary strategies to maximize or overcome the limitations of using AFPs on foods. AFPs alter chitin biosynthesis, and some fungi react inducing cell wall integrity (CWI) pathway. However, moulds able to increase chitin content at the cell wall by increasing proteins in either CWI or calmodulin-calcineurin signalling pathways will resist AFPs. Similarly, AFPs increase the intracellular levels of reactive oxygen species (ROS), and moulds increasing G-protein complex β subunit CpcB and/or enzymes to efficiently produce glutathione may evade apoptosis. Unknown aspects that need to be addressed include the interaction with mycotoxin production by less sensitive toxigenic moulds. However, significant steps have been taken to encourage the use of AFPs in intermediate-moisture foods, particularly for mould-ripened cheese and meat products. PMID:27394712

  15. A graphical tool for an analytical approach of scattering photons by the Compton effect

    NASA Astrophysics Data System (ADS)

    Scannavino, Francisco A.; Cruvinel, Paulo E.

    2012-05-01

    The photons scattered by the Compton effect can be used to characterize the physical properties of a given sample due to the influence that the electron density exerts on the number of scattered photons. However, scattering measurements involve experimental and physical factors that must be carefully analyzed to predict uncertainty in the detection of Compton photons. This paper presents a method for the optimization of the geometrical parameters of an experimental arrangement for Compton scattering analysis, based on its relations with the energy and incident flux of the X-ray photons. In addition, the tool enables the statistical analysis of the information displayed and includes the coefficient of variation (CV) measurement for a comparative evaluation of the physical parameters of the model established for the simulation.
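
    As a purely illustrative companion to the coefficient of variation mentioned above: if the number of detected Compton photons follows Poisson counting statistics, the CV of the count is 1/sqrt(N), so it grows as the geometry (solid angle, scattering probability) reduces the detected flux. The factors and numbers below are placeholders, not the model used in the paper.

        import numpy as np

        def compton_count_cv(n_incident, scatter_prob, solid_angle_fraction, efficiency):
            """CV of the detected Compton-photon count under Poisson statistics:
            CV = sigma / mean = 1 / sqrt(N_detected).  All factors are placeholders."""
            n_detected = n_incident * scatter_prob * solid_angle_fraction * efficiency
            return 1.0 / np.sqrt(n_detected)

        # halving the collection solid angle multiplies the CV by roughly sqrt(2)
        for frac in (2e-3, 1e-3):
            print("solid-angle fraction", frac, "-> CV =",
                  round(compton_count_cv(1e9, 1e-3, frac, 0.5), 4))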

  16. Soft x-ray microscopy - a powerful analytical tool to image magnetism down to fundamental length and times scales

    SciTech Connect

    Fischer, Peter

    2008-08-01

    The magnetic properties of low-dimensional solid state matter are of the utmost interest both scientifically and technologically. In addition to the charge of the electron, which is the basis of current electronics, taking the spin degree of freedom into account opens a new avenue for future spintronics applications. Progress towards a better physical understanding of the mechanisms and principles involved, as well as potential applications of nanomagnetic devices, can only be achieved with advanced analytical tools. Soft X-ray microscopy, providing a spatial resolution approaching 10 nm, a time resolution currently in the sub-ns regime and inherent elemental sensitivity, is a very promising technique for this purpose. This article reviews the recent achievements of magnetic soft X-ray microscopy with selected examples of spin torque phenomena, stochastic behavior on the nanoscale and spin dynamics in magnetic nanopatterns. The future potential with regard to addressing fundamental magnetic length and time scales, e.g. imaging femtosecond spin dynamics at upcoming X-ray sources, is pointed out.

  17. The Broad Application of Data Science and Analytics: Essential Tools for the Liberal Arts Graduate

    ERIC Educational Resources Information Center

    Cárdenas-Navia, Isabel; Fitzgerald, Brian K.

    2015-01-01

    New technologies and data science are transforming a wide range of organizations into analytics-intensive enterprises. Despite the resulting demand for graduates with experience in the application of analytics, though, undergraduate education has been slow to change. The academic and policy communities have engaged in a decade-long conversation…

  18. Algorithms for optimized maximum entropy and diagnostic tools for analytic continuation.

    PubMed

    Bergeron, Dominic; Tremblay, A-M S

    2016-08-01

    Analytic continuation of numerical data obtained in imaginary time or frequency has become an essential part of many branches of quantum computational physics. It is, however, an ill-conditioned procedure and thus a hard numerical problem. The maximum-entropy approach, based on Bayesian inference, is the most widely used method to tackle that problem. Although the approach is well established and among the most reliable and efficient ones, useful developments of the method and of its implementation are still possible. In addition, while a few free software implementations are available, a well-documented, optimized, general purpose, and user-friendly software dedicated to that specific task is still lacking. Here we analyze all aspects of the implementation that are critical for accuracy and speed and present a highly optimized approach to maximum entropy. Original algorithmic and conceptual contributions include (1) numerical approximations that yield a computational complexity that is almost independent of temperature and spectrum shape (including sharp Drude peaks in broad background, for example) while ensuring quantitative accuracy of the result whenever precision of the data is sufficient, (2) a robust method of choosing the entropy weight α that follows from a simple consistency condition of the approach and the observation that information- and noise-fitting regimes can be identified clearly from the behavior of χ^{2} with respect to α, and (3) several diagnostics to assess the reliability of the result. Benchmarks with test spectral functions of different complexity and an example with an actual physical simulation are presented. Our implementation, which covers most typical cases for fermions, bosons, and response functions, is available as an open source, user-friendly software.
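
    The statement above that the information-fitting and noise-fitting regimes can be identified from the behaviour of chi-squared as a function of alpha lends itself to a small numerical illustration. The synthetic chi2(alpha) curve and the maximum-curvature rule in log-log space used below are simplifying assumptions for illustration; they are not the authors' algorithm or implementation.

        import numpy as np

        def alpha_crossover(alphas, chi2):
            """Pick the alpha at the crossover between the noise-fitting regime
            (small alpha, chi2 nearly flat) and the information-fitting regime
            (large alpha, chi2 rising), taken here as the point of maximum
            curvature of log10(chi2) versus log10(alpha)."""
            x, y = np.log10(alphas), np.log10(chi2)
            d1 = np.gradient(y, x)
            d2 = np.gradient(d1, x)
            curvature = np.abs(d2) / (1.0 + d1**2) ** 1.5
            return alphas[np.argmax(curvature)]

        # synthetic chi2(alpha): flat noise-fitting floor, then power-law growth
        alphas = np.logspace(-3, 3, 61)
        chi2 = 100.0 * (1.0 + alphas**1.5)
        print("crossover alpha ~", alpha_crossover(alphas, chi2))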

  19. Algorithms for optimized maximum entropy and diagnostic tools for analytic continuation.

    PubMed

    Bergeron, Dominic; Tremblay, A-M S

    2016-08-01

    Analytic continuation of numerical data obtained in imaginary time or frequency has become an essential part of many branches of quantum computational physics. It is, however, an ill-conditioned procedure and thus a hard numerical problem. The maximum-entropy approach, based on Bayesian inference, is the most widely used method to tackle that problem. Although the approach is well established and among the most reliable and efficient ones, useful developments of the method and of its implementation are still possible. In addition, while a few free software implementations are available, a well-documented, optimized, general purpose, and user-friendly software dedicated to that specific task is still lacking. Here we analyze all aspects of the implementation that are critical for accuracy and speed and present a highly optimized approach to maximum entropy. Original algorithmic and conceptual contributions include (1) numerical approximations that yield a computational complexity that is almost independent of temperature and spectrum shape (including sharp Drude peaks in broad background, for example) while ensuring quantitative accuracy of the result whenever precision of the data is sufficient, (2) a robust method of choosing the entropy weight α that follows from a simple consistency condition of the approach and the observation that information- and noise-fitting regimes can be identified clearly from the behavior of χ^{2} with respect to α, and (3) several diagnostics to assess the reliability of the result. Benchmarks with test spectral functions of different complexity and an example with an actual physical simulation are presented. Our implementation, which covers most typical cases for fermions, bosons, and response functions, is available as an open source, user-friendly software. PMID:27627408

  20. Algorithms for optimized maximum entropy and diagnostic tools for analytic continuation

    NASA Astrophysics Data System (ADS)

    Bergeron, Dominic; Tremblay, A.-M. S.

    2016-08-01

    Analytic continuation of numerical data obtained in imaginary time or frequency has become an essential part of many branches of quantum computational physics. It is, however, an ill-conditioned procedure and thus a hard numerical problem. The maximum-entropy approach, based on Bayesian inference, is the most widely used method to tackle that problem. Although the approach is well established and among the most reliable and efficient ones, useful developments of the method and of its implementation are still possible. In addition, while a few free software implementations are available, a well-documented, optimized, general purpose, and user-friendly software dedicated to that specific task is still lacking. Here we analyze all aspects of the implementation that are critical for accuracy and speed and present a highly optimized approach to maximum entropy. Original algorithmic and conceptual contributions include (1) numerical approximations that yield a computational complexity that is almost independent of temperature and spectrum shape (including sharp Drude peaks in broad background, for example) while ensuring quantitative accuracy of the result whenever precision of the data is sufficient, (2) a robust method of choosing the entropy weight α that follows from a simple consistency condition of the approach and the observation that information- and noise-fitting regimes can be identified clearly from the behavior of χ2 with respect to α , and (3) several diagnostics to assess the reliability of the result. Benchmarks with test spectral functions of different complexity and an example with an actual physical simulation are presented. Our implementation, which covers most typical cases for fermions, bosons, and response functions, is available as an open source, user-friendly software.

  1. Repurposing mainstream CNC machine tools for laser-based additive manufacturing

    NASA Astrophysics Data System (ADS)

    Jones, Jason B.

    2016-04-01

    The advent of laser technology has been a key enabler for industrial 3D printing, known as Additive Manufacturing (AM). Despite its commercial success and unique technical capabilities, laser-based AM systems are not yet able to produce parts with the same accuracy and surface finish as CNC machining. To enable the geometry and material freedoms afforded by AM, yet achieve the precision and productivity of CNC machining, hybrid combinations of these two processes have started to gain traction. To achieve the benefits of combined processing, laser technology has been integrated into mainstream CNC machines - effectively repurposing them as hybrid manufacturing platforms. This paper reviews how this engineering challenge has prompted beam delivery innovations to allow automated changeover between laser processing and machining, using standard CNC tool changers. Handling laser-processing heads using the tool changer also enables automated changeover between different types of laser processing heads, further expanding the breadth of laser processing flexibility in a hybrid CNC. This paper highlights the development, challenges and future impact of hybrid CNCs on laser processing.

  2. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data

    PubMed Central

    Ben-Ari Fuchs, Shani; Lieder, Iris; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-01-01

    Abstract Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from “data-to-knowledge-to-innovation,” a crucial missing step in the current era is, however, our limited understanding of biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are the gene set enrichment tools. This study reports on GeneAnalytics™ (geneanalytics.genecards.org), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs the LifeMap Science's GeneCards suite, including the GeneCards®—the human gene database; the MalaCards—the human diseases database; and the PathCards—the biological pathways database. Expression-based analysis in GeneAnalytics relies on the LifeMap Discovery®—the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling advanced matching algorithm for gene–tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell “cards” in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analyses and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics

  3. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data.

    PubMed

    Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-03-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial missing step in the current era is, however, our limited understanding of biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are the gene set enrichment tools. This study reports on GeneAnalytics™ ( geneanalytics.genecards.org ), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs the LifeMap Science's GeneCards suite, including the GeneCards®--the human gene database; the MalaCards-the human diseases database; and the PathCards--the biological pathways database. Expression-based analysis in GeneAnalytics relies on the LifeMap Discovery®--the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analyses and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics

  4. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data.

    PubMed

    Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-03-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial missing step in the current era is, however, our limited understanding of biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are the gene set enrichment tools. This study reports on GeneAnalytics™ ( geneanalytics.genecards.org ), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs the LifeMap Science's GeneCards suite, including the GeneCards®--the human gene database; the MalaCards-the human diseases database; and the PathCards--the biological pathways database. Expression-based analysis in GeneAnalytics relies on the LifeMap Discovery®--the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analyses and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics

  5. EPIPOI: A user-friendly analytical tool for the extraction and visualization of temporal parameters from epidemiological time series

    PubMed Central

    2012-01-01

    Background There is an increasing need for processing and understanding relevant information generated by the systematic collection of public health data over time. However, the analysis of those time series usually requires advanced modeling techniques, which are not necessarily mastered by staff, technicians and researchers working on public health and epidemiology. Here a user-friendly tool, EPIPOI, is presented that facilitates the exploration and extraction of parameters describing trends, seasonality and anomalies that characterize epidemiological processes. It also enables the inspection of those parameters across geographic regions. Although the visual exploration and extraction of relevant parameters from time series data is crucial in epidemiological research, until now it had been largely restricted to specialists. Methods EPIPOI is freely available software developed in Matlab (The Mathworks Inc) that runs both on PC and Mac computers. Its friendly interface guides users intuitively through useful comparative analyses including the comparison of spatial patterns in temporal parameters. Results EPIPOI is able to handle complex analyses in an accessible way. A prototype has already been used to assist researchers in a variety of contexts from didactic use in public health workshops to the main analytical tool in published research. Conclusions EPIPOI can assist public health officials and students to explore time series data using a broad range of sophisticated analytical and visualization tools. It also provides an analytical environment where even advanced users can benefit by enabling a higher degree of control over model assumptions, such as those associated with detecting disease outbreaks and pandemics. PMID:23153033
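
    A minimal, non-Matlab sketch of the kind of decomposition EPIPOI exposes interactively: a linear trend plus annual and semi-annual harmonics fitted by least squares, with the residuals serving as candidate anomalies. The model form, the weekly period and all parameter values are common simplifications assumed for illustration, not EPIPOI's actual code.

        import numpy as np

        def harmonic_decompose(y, period=52.0, n_harmonics=2):
            """Fit linear trend + seasonal harmonics by ordinary least squares.
            Returns coefficients, fitted values and residuals (candidate anomalies)."""
            t = np.arange(len(y), dtype=float)
            cols = [np.ones_like(t), t]
            for k in range(1, n_harmonics + 1):
                cols += [np.sin(2 * np.pi * k * t / period),
                         np.cos(2 * np.pi * k * t / period)]
            X = np.column_stack(cols)
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            fitted = X @ beta
            return beta, fitted, y - fitted

        # toy weekly series: trend + annual cycle + noise + one outbreak-like spike
        rng = np.random.default_rng(0)
        t = np.arange(520)
        y = 100 + 0.05 * t + 20 * np.sin(2 * np.pi * t / 52) + rng.normal(0, 5, 520)
        y[300] += 60
        beta, fitted, resid = harmonic_decompose(y)
        print("largest anomaly at week", int(np.argmax(resid)))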

  6. Comprehensive data resources and analytical tools for pathological association of aminoacyl tRNA synthetases with cancer

    PubMed Central

    Lee, Ji-Hyun; You, Sungyong; Hyeon, Do Young; Kang, Byeongsoo; Kim, Hyerim; Park, Kyoung Mii; Han, Byungwoo; Hwang, Daehee; Kim, Sunghoon

    2015-01-01

    Mammalian cells have cytoplasmic and mitochondrial aminoacyl-tRNA synthetases (ARSs) that catalyze aminoacylation of tRNAs during protein synthesis. Despite their housekeeping functions in protein synthesis, ARSs and ARS-interacting multifunctional proteins (AIMPs) have recently been shown to play important roles in disease pathogenesis through their interactions with disease-related molecules. However, there is a lack of data resources and analytical tools that can be used to examine disease associations of ARS/AIMPs. Here, we developed an Integrated Database for ARSs (IDA), a resource database including cancer genomic/proteomic and interaction data of ARS/AIMPs. IDA includes mRNA expression, somatic mutation, copy number variation and phosphorylation data of ARS/AIMPs and their interacting proteins in various cancers. IDA further includes an array of analytical tools for exploration of disease association of ARS/AIMPs, identification of disease-associated ARS/AIMP interactors and reconstruction of ARS-dependent disease-perturbed network models. Therefore, IDA provides both comprehensive data resources and analytical tools for understanding potential roles of ARS/AIMPs in cancers. Database URL: http://ida.biocon.re.kr/, http://ars.biocon.re.kr/ PMID:25824651

  7. Multi-site study of additive genetic effects on fractional anisotropy of cerebral white matter: comparing meta and mega analytical approaches for data pooling

    PubMed Central

    Kochunov, Peter; Jahanshad, Neda; Sprooten, Emma; Nichols, Thomas E.; Mandl, René C.; Almasy, Laura; Booth, Tom; Brouwer, Rachel M.; Curran, Joanne E.; de Zubicaray, Greig I.; Dimitrova, Rali; Duggirala, Ravi; Fox, Peter T.; Hong, L. Elliot; Landman, Bennett A.; Lemaitre, Hervé; Lopez, Lorna; Martin, Nicholas G.; McMahon, Katie L.; Mitchell, Braxton D.; Olvera, Rene L.; Peterson, Charles P.; Starr, John M.; Sussmann, Jessika E.; Toga, Arthur W.; Wardlaw, Joanna M.; Wright, Margaret J.; Wright, Susan N.; Bastin, Mark E.; McIntosh, Andrew M.; Boomsma, Dorret I.; Kahn, René S.; den Braber, Anouk; de Geus, Eco JC; Deary, Ian J.; Hulshoff Pol, Hilleke E.; Williamson, Douglas E.; Blangero, John; van ’t Ent, Dennis; Thompson, Paul M.; Glahn, David C.

    2014-01-01

    Combining datasets across independent studies can boost statistical power by increasing the numbers of observations and can achieve more accurate estimates of effect sizes. This is especially important for genetic studies where a large number of observations are required to obtain sufficient power to detect and replicate genetic effects. There is a need to develop and evaluate methods for the joint analysis of rich datasets collected in imaging genetics studies. The ENIGMA-DTI consortium is developing and evaluating approaches for obtaining pooled estimates of heritability through meta- and mega-genetic analytical approaches, to estimate the general additive genetic contributions to the intersubject variance in fractional anisotropy (FA) measured from diffusion tensor imaging (DTI). We used the ENIGMA-DTI data harmonization protocol for uniform processing of DTI data from multiple sites. We evaluated this protocol in five family-based cohorts providing data from a total of 2248 children and adults (ages: 9–85) collected with various imaging protocols. We used the imaging genetics analysis tool, SOLAR-Eclipse, to combine twin and family data from Dutch, Australian and Mexican-American cohorts into one large “mega-family”. We showed that heritability estimates may vary from one cohort to another. We used two meta-analytical (the sample-size and standard-error weighted) approaches and a mega-genetic analysis to calculate heritability estimates across populations. We performed leave-one-out analysis of the joint estimates of heritability, removing a different cohort each time to understand the estimate variability. Overall, meta- and mega-genetic analyses of heritability produced robust estimates of heritability. PMID:24657781
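
    The two meta-analytical weightings mentioned above, sample-size weighting and standard-error (inverse-variance) weighting, reduce to simple weighted means of the cohort-level estimates, and the leave-one-out check is a loop over cohorts. The cohort numbers below are invented for illustration; a mega-analysis would instead pool the raw family data (e.g. in SOLAR-Eclipse) rather than the estimates.

        import numpy as np

        def pooled_heritability(h2, n, se):
            """Sample-size-weighted and inverse-variance-weighted pooled estimates."""
            h2, n, se = map(np.asarray, (h2, n, se))
            w_n  = n / n.sum()
            w_iv = (1.0 / se**2) / (1.0 / se**2).sum()
            return (w_n * h2).sum(), (w_iv * h2).sum()

        # hypothetical cohort-level FA heritability estimates
        h2 = [0.52, 0.61, 0.48, 0.70, 0.55]
        n  = [500, 900, 300, 250, 298]
        se = [0.06, 0.04, 0.08, 0.09, 0.07]
        print("pooled (sample-size, inverse-variance):", pooled_heritability(h2, n, se))

        # leave-one-out: how much does each cohort move the inverse-variance estimate?
        for i in range(len(h2)):
            keep = [j for j in range(len(h2)) if j != i]
            loo = pooled_heritability([h2[j] for j in keep],
                                      [n[j] for j in keep],
                                      [se[j] for j in keep])[1]
            print("drop cohort", i, "->", round(loo, 3))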

  8. Rapid Quantification of Melamine in Different Brands/Types of Milk Powders Using Standard Addition Net Analyte Signal and Near-Infrared Spectroscopy

    PubMed Central

    2016-01-01

    Multivariate calibration (MVC) and near-infrared (NIR) spectroscopy have demonstrated potential for rapid analysis of melamine in various dairy products. However, the practical application of ordinary MVC can be largely restricted because the prediction of a new sample from an uncalibrated batch would be subject to a significant bias due to matrix effect. In this study, the feasibility of using NIR spectroscopy and the standard addition (SA) net analyte signal (NAS) method (SANAS) for rapid quantification of melamine in different brands/types of milk powders was investigated. In SANAS, the NAS vector of melamine in an unknown sample as well as in a series of samples added with melamine standards was calculated and then the Euclidean norms of series standards were used to build a straightforward univariate regression model. The analysis results of 10 different brands/types of milk powders with melamine levels 0~0.12% (w/w) indicate that SANAS obtained accurate results with the root mean squared error of prediction (RMSEP) values ranging from 0.0012 to 0.0029. An additional advantage of NAS is to visualize and control the possible unwanted variations during standard addition. The proposed method will provide a practically useful tool for rapid and nondestructive quantification of melamine in different brands/types of milk powders. PMID:27525154
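
    Once the NAS vector has been computed for the unknown and for each spiked sample, the quantification step described above is an ordinary standard-addition extrapolation on the Euclidean norms. The sketch below assumes those norms are already in hand (the NAS projection itself is omitted) and uses invented numbers; it is not the authors' SANAS code.

        import numpy as np

        def standard_addition_predict(added_conc, nas_norm):
            """Univariate standard-addition regression on NAS norms:
            fit norm = b0 + b1 * added_conc and extrapolate to zero signal,
            so the unknown concentration is b0 / b1."""
            added_conc = np.asarray(added_conc, dtype=float)
            nas_norm = np.asarray(nas_norm, dtype=float)
            b1, b0 = np.polyfit(added_conc, nas_norm, 1)   # slope, intercept
            return b0 / b1

        # hypothetical melamine additions (% w/w) and NAS norms (0 added = unknown alone)
        added = [0.00, 0.02, 0.04, 0.06, 0.08]
        norms = [0.031, 0.052, 0.070, 0.093, 0.110]        # invented numbers
        print("predicted melamine (% w/w):", round(standard_addition_predict(added, norms), 4))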

  9. Rapid Quantification of Melamine in Different Brands/Types of Milk Powders Using Standard Addition Net Analyte Signal and Near-Infrared Spectroscopy.

    PubMed

    Tang, Bang-Cheng; Cai, Chen-Bo; Shi, Wei; Xu, Lu

    2016-01-01

    Multivariate calibration (MVC) and near-infrared (NIR) spectroscopy have demonstrated potential for rapid analysis of melamine in various dairy products. However, the practical application of ordinary MVC can be largely restricted because the prediction of a new sample from an uncalibrated batch would be subject to a significant bias due to matrix effect. In this study, the feasibility of using NIR spectroscopy and the standard addition (SA) net analyte signal (NAS) method (SANAS) for rapid quantification of melamine in different brands/types of milk powders was investigated. In SANAS, the NAS vector of melamine in an unknown sample as well as in a series of samples added with melamine standards was calculated and then the Euclidean norms of series standards were used to build a straightforward univariate regression model. The analysis results of 10 different brands/types of milk powders with melamine levels 0~0.12% (w/w) indicate that SANAS obtained accurate results with the root mean squared error of prediction (RMSEP) values ranging from 0.0012 to 0.0029. An additional advantage of NAS is to visualize and control the possible unwanted variations during standard addition. The proposed method will provide a practically useful tool for rapid and nondestructive quantification of melamine in different brands/types of milk powders. PMID:27525154

  10. Analytical prediction of chatter stability for variable pitch and variable helix milling tools

    NASA Astrophysics Data System (ADS)

    Sims, N. D.; Mann, B.; Huyanan, S.

    2008-11-01

    Regenerative chatter is a self-excited vibration that can occur during milling and other machining processes. It leads to a poor surface finish, premature tool wear, and potential damage to the machine or tool. Variable pitch and variable helix milling tools have been previously proposed to avoid the onset of regenerative chatter. Although variable pitch tools have been considered in some detail in previous research, this has generally focussed on behaviour at high radial immersions. In contrast there has been very little work focussed on predicting the stability of variable helix tools. In the present study, three solution processes are proposed for predicting the stability of variable pitch or helix milling tools. The first is a semi-discretisation formulation that performs spatial and temporal discretisation of the tool. Unlike previously published methods this can predict the stability of variable pitch or variable helix tools, at low or high radial immersions. The second is a time-averaged semi-discretisation formulation that assumes time-averaged cutting force coefficients. Unlike previous work, this can predict stability of variable helix tools at high radial immersion. The third is a temporal-finite element formulation that can predict the stability of variable pitch tools with a constant uniform helix angle, at low radial immersion. The model predictions are compared to previously published work on variable pitch tools, along with time-domain model simulations. Good agreement is found with both previously published results and the time-domain model. Furthermore, cyclic-fold bifurcations were found to exist for both variable pitch and variable helix tools at lower radial immersions.
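
    A heavily simplified flavour of the first formulation above: zeroth-order semi-discretisation of a single-degree-of-freedom milling model with one regenerative delay (uniform pitch, zero helix), judging stability from the spectral radius of the Floquet transition matrix assembled over one tooth period. The modal, cutting and geometric parameters are illustrative assumptions, and the variable-pitch and variable-helix formulations of the paper (multiple delays, axial discretisation) are not reproduced here.

        import numpy as np
        from scipy.linalg import expm

        def chatter_stable(a_p, spindle_rpm, m=40,
                           wn=2*np.pi*800.0, zeta=0.03, m_t=0.1,      # modal parameters (assumed)
                           Kt=6e8, Kn=2e8, n_teeth=2,
                           phi_in=0.0, phi_out=np.pi/2):
            """Zeroth-order semi-discretisation of a 1-DOF milling model,
            x'' + 2*zeta*wn*x' + wn^2*x = -(a_p*h(t)/m_t)*(x(t) - x(t - tau)),
            with h(t) the periodic directional cutting coefficient.  Returns True
            if the Floquet spectral radius over one tooth period is below 1."""
            tau = 60.0 / (spindle_rpm * n_teeth)          # tooth-passing period = delay
            dt = tau / m
            dim = m + 2                                   # state: [x, x', x_{i-1}, ..., x_{i-m}]
            Phi = np.eye(dim)
            for i in range(m):
                t = (i + 0.5) * dt                        # piecewise-constant coefficient per step
                h = 0.0
                for j in range(n_teeth):
                    phi = ((2*np.pi*spindle_rpm/60.0)*t + 2*np.pi*j/n_teeth) % (2*np.pi)
                    if phi_in <= phi <= phi_out:          # tooth engaged in the cut
                        h += np.sin(phi) * (Kt*np.cos(phi) + Kn*np.sin(phi))
                w = a_p * h / m_t
                A = np.array([[0.0, 1.0], [-(wn**2) - w, -2*zeta*wn]])
                B = np.array([0.0, w])                    # couples to the delayed position
                M = np.zeros((3, 3))                      # zero-order-hold discretisation trick
                M[:2, :2], M[:2, 2] = A, B
                E = expm(M * dt)
                P, R = E[:2, :2], E[:2, 2]
                D = np.zeros((dim, dim))
                D[:2, :2] = P
                D[:2, dim - 1] = R                        # delayed term uses x_{i-m}
                D[2, 0] = 1.0                             # shift register of past positions
                for k in range(3, dim):
                    D[k, k - 1] = 1.0
                Phi = D @ Phi
            return np.max(np.abs(np.linalg.eigvals(Phi))) < 1.0

        for rpm in (6000, 9000, 12000):                   # crude scan along one speed line
            print(rpm, "rpm, a_p = 1 mm:", chatter_stable(a_p=0.001, spindle_rpm=rpm))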

  11. The Blackbird Whistling or Just After? Vygotsky's Tool and Sign as an Analytic for Writing

    ERIC Educational Resources Information Center

    Imbrenda, Jon-Philip

    2016-01-01

    Based on Vygotsky's theory of the interplay of the tool and sign functions of language, this study presents a textual analysis of a corpus of student-authored texts to illuminate aspects of development evidenced through the dialectical tension of tool and sign. Data were drawn from a series of reflective memos I authored during a seminar for new…

  12. NASTRAN as an analytical research tool for composite mechanics and composite structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Sinclair, J. H.; Sullivan, T. L.

    1976-01-01

    Selected examples are described in which NASTRAN is used as an analysis research tool for composite mechanics and for composite structural components. The examples were selected to illustrate the importance of using NASTRAN as an analysis tool in this rapidly advancing field.

  13. An Analytic Tool to Investigate the Effect of Binder on the Sensitivity of HMX-Based Plastic Bonded Explosives in the Skid Test

    SciTech Connect

    Hayden, D. W.

    2004-11-01

    This project will develop an analytical tool to calculate the performance of HMX-based PBXs in the skid test. The skid test is used as a means to measure sensitivity for large charges in handling situations. Each series of skid tests requires dozens of drops of large billets. It is proposed that the reaction (or lack of one) of PBXs in the skid test is governed by the mechanical properties of the binder. If true, one might be able to develop an analytical tool to estimate skid test behavior for new PBX formulations. Others over the past 50 years have tried to develop similar models. This project will research and summarize the works of others and couple the work of three of them into an analytical tool that can be run on a PC to calculate the drop height of HMX-based PBXs. Detonation due to dropping a billet is argued to be a dynamic thermal event. To avoid detonation, the heat created due to friction at impact must be conducted into the charge or the target faster than the chemical kinetics can create additional energy. The methodology will involve numerically solving the Frank-Kamenetskii equation in one dimension. The analytical problem needs to be bounded in terms of how much heat is introduced to the billet and for how long. Assuming an inelastic collision with no rebound, the billet will be in contact with the target for a short duration determined by the equations of motion. For the purposes of the calculations, it will be assumed that if a detonation is to occur, it will transpire within that time. The surface temperature will be raised according to the friction created using the equations of motion of dropping the billet on a rigid surface. The study will connect the works of Charles Anderson, Alan Randolph, Larry Hatler, Alfonse Popolato, and Charles Mader into a single PC-based analytic tool. Anderson's equations of motion will be used to calculate the temperature rise upon impact; the time this temperature is maintained (contact time) will be obtained from the work of
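
    The bounded problem sketched above, frictional surface heating for a short contact time followed by a race between conduction and Arrhenius self-heating, can be illustrated with a one-dimensional explicit finite-difference solver for a Frank-Kamenetskii-type equation. The material constants, boundary treatment and runaway threshold below are placeholders, not values or methods taken from the cited authors.

        import numpy as np

        def skid_runaway(surface_rise=400.0, contact_ms=0.5, sim_ms=5.0,
                         k=0.4, rho=1860.0, cp=1130.0,         # thermal properties (placeholders)
                         Q=5.0e6, A=5.0e19, Ea=2.2e5, R=8.314, # Arrhenius source (placeholders)
                         L=2e-3, nx=200, T0=300.0, T_runaway=1500.0):
            """Explicit 1-D solution of dT/dt = alpha*d2T/dx2 + (Q*A/(rho*cp))*exp(-Ea/(R*T)).
            The impact face is held at T0 + surface_rise during the contact time and
            treated as insulated afterwards.  Returns True if runaway is reached."""
            alpha = k / (rho * cp)
            dx = L / (nx - 1)
            dt = 0.2 * dx**2 / alpha                           # stable explicit time step
            T = np.full(nx, T0)
            t = 0.0
            for _ in range(int(sim_ms * 1e-3 / dt)):
                lap = np.zeros_like(T)
                lap[1:-1] = (T[2:] - 2*T[1:-1] + T[:-2]) / dx**2
                source = (Q * A / (rho * cp)) * np.exp(-Ea / (R * T))
                T = T + dt * (alpha * lap + source)
                T[-1] = T0                                     # far side stays at ambient
                T[0] = T0 + surface_rise if t < contact_ms * 1e-3 else T[1]
                if T.max() > T_runaway:
                    return True
                t += dt
            return False

        print("thermal runaway predicted:", skid_runaway())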

  14. Analytical ultracentrifugation: A versatile tool for the characterisation of macromolecular complexes in solution.

    PubMed

    Patel, Trushar R; Winzor, Donald J; Scott, David J

    2016-02-15

    Analytical ultracentrifugation, an early technique developed for characterizing quantitatively the solution properties of macromolecules, remains a powerful aid to structural biologists in their quest to understand the formation of biologically important protein complexes at the molecular level. Treatment of the basic tenets of the sedimentation velocity and sedimentation equilibrium variants of analytical ultracentrifugation is followed by considerations of the roles that it, in conjunction with other physicochemical procedures, has played in resolving problems encountered in the delineation of complex formation for three biological systems - the cytoplasmic dynein complex, mitogen-activated protein kinase (ERK2) self-interaction, and the terminal catalytic complex in selenocysteine synthesis. PMID:26555086

  15. Benefits and limitations of using decision analytic tools to assess uncertainty and prioritize Landscape Conservation Cooperative information needs

    USGS Publications Warehouse

    Post van der Burg, Max; Cullinane Thomas, Catherine; Holcombe, Tracy R.; Nelson, Richard D.

    2016-01-01

    The Landscape Conservation Cooperatives (LCCs) are a network of partnerships throughout North America that are tasked with integrating science and management to support more effective delivery of conservation at a landscape scale. In order to achieve this integration, some LCCs have adopted the approach of providing their partners with better scientific information in an effort to facilitate more effective and coordinated conservation decisions. Taking this approach has led many LCCs to begin funding research to provide the information for improved decision making. To ensure that funding goes to research projects with the highest likelihood of leading to more integrated broad scale conservation, some LCCs have also developed approaches for prioritizing which information needs will be of most benefit to their partnerships. We describe two case studies in which decision analytic tools were used to quantitatively assess the relative importance of information for decisions made by partners in the Plains and Prairie Potholes LCC. The results of the case studies point toward a few valuable lessons in terms of using these tools with LCCs. Decision analytic tools tend to help shift focus away from research oriented discussions and toward discussions about how information is used in making better decisions. However, many technical experts do not have enough knowledge about decision making contexts to fully inform the latter type of discussion. When assessed in the right decision context, however, decision analyses can point out where uncertainties actually affect optimal decisions and where they do not. This helps technical experts understand that not all research is valuable in improving decision making. But perhaps most importantly, our results suggest that decision analytic tools may be more useful for LCCs as a way of developing integrated objectives for coordinating partner decisions across the landscape, rather than simply ranking research priorities.

  16. Usefulness of anterior uveitis as an additional tool for diagnosing incomplete Kawasaki disease

    PubMed Central

    Lee, Kyu Jin; Kim, Hyo Jin; Kim, Min Jae; Yoon, Ji Hong; Lee, Eun Jung; Lee, Jae Young; Oh, Jin Hee; Lee, Soon Ju; Lee, Kyung Yil

    2016-01-01

    Purpose There are no specific tests for diagnosing Kawasaki disease (KD). Additional diagnostic criteria are needed to prevent the delayed diagnosis of incomplete Kawasaki disease (IKD). This study compared the frequency of coronary artery lesions (CALs) in IKD patients with and without anterior uveitis (AU) and elucidated whether the finding of AU supported the diagnosis of IKD. Methods This study enrolled patients diagnosed with IKD at The Catholic University of Korea, Uijeongbu St. Mary's Hospital from January 2010 to December 2014. The patients were divided into 2 groups: group 1 included patients with IKD having AU; and group 2 included patients with IKD without AU. We analyzed the demographic and clinical data (age, gender, duration of fever, and the number of diagnostic criteria), laboratory results, and echocardiographic findings. Results Of 111 patients with IKD, 41 had uveitis (36.98%, group 1) and 70 did not (63.02%, group 2). Patients in group 1 had received a diagnosis and treatment earlier, and had fewer CALs (3 of 41, 7.3%) than those in group 2 (20 of 70, 28.5%) (P=0.008). All 3 patients with CALs in group 1 had coronary dilatation, while patients with CALs in group 2 had CALs ranging from coronary dilatation to giant aneurysm. Conclusion The diagnosis of IKD is challenging but can be supported by the presence of features such as AU. Group 1 had a lower risk of coronary artery disease than group 2. Therefore, the presence of AU is helpful in the early diagnosis and treatment of IKD and can be used as an additional diagnostic tool. PMID:27186227

  17. Visual analytical tool for evaluation of 10-year perioperative transfusion practice at a children's hospital.

    PubMed

    Gálvez, Jorge A; Ahumada, Luis; Simpao, Allan F; Lin, Elaina E; Bonafide, Christopher P; Choudhry, Dhruv; England, William R; Jawad, Abbas F; Friedman, David; Sesok-Pizzini, Debora A; Rehman, Mohamed A

    2014-01-01

    Children are a vulnerable population in the operating room, and are particularly at risk of complications from unanticipated hemorrhage. The decision to prepare blood products prior to surgery varies depending on the personal experience of the clinician caring for the patient. We present the first application of a data visualization technique to study large datasets in the context of blood product transfusions at a tertiary pediatric hospital. The visual analytical interface allows real-time interaction with datasets from 230 000 procedure records. Clinicians can use the visual analytical interface to analyze blood product usage based on procedure- and patient-specific factors, and then use that information to guide policies for ordering blood products.

  18. Biosensors as new analytical tool for detection of Genetically Modified Organisms (GMOs).

    PubMed

    Minunni, M; Tombelli, S; Mariotti, E; Mascini, M; Mascini, M

    2001-04-01

    Three different biosensors for detection of Genetically Modified Organisms (GMOs) are presented. The sensing principle is based on the affinity interaction between nucleic acids: the probe is immobilised on the sensor surface and the target analyte is free in solution. The immobilised probes are specific for most inserted sequences in GMOs: the promoter P35S and the terminator TNOS. Electrochemical methods with screen-printed electrodes, piezoelectric and optical (SPR) transduction principles were applied.

  19. The Facial Aesthetic index: An additional tool for assessing treatment need

    PubMed Central

    Sundareswaran, Shobha; Ramakrishnan, Ranjith

    2016-01-01

    Objectives: Facial Aesthetics, a major consideration in orthodontic diagnosis and treatment planning, may not be judged correctly and completely by simply analyzing dental occlusion or osseous structures. Despite this importance, there is no index to guarantee availability of treatment or prioritize patients based on their soft tissue treatment needs. Individuals having well-aligned teeth but unaesthetic convex profiles do not get included for treatment as per current malocclusion indices. The aim of this investigation is to develop an aesthetic index based on facial profiles which could be used as an additional tool with malocclusion indices. Materials and Methods: A chart showing typical facial profile changes due to underlying malocclusions was generated by soft tissue manipulations of standardized profile photographs of a well-balanced male and female face. A panel of 62 orthodontists judged the profile photographs of 100 patients with different soft tissue patterns for assessing profile variations and treatment need. The index was later tested in a cross-section of school population. Statistical analysis was done using “irr” package of R environment version 2.15.1. Results: The index exhibited very good reliability in determining profile variations (Fleiss kappa 0.866, P < 0.001), excellent reproducibility (kappa 0.9078), high sensitivity, and specificity (95.7%). Testing in population yielded excellent agreement among orthodontists (kappa 0.9286). Conclusions: A new Facial Aesthetic index, based on patient's soft tissue profile requirements is proposed, which can complement existing indices to ensure treatment to those in need. PMID:27127752

  20. Surveillance of Travellers: An Additional Tool for Tracking Antimalarial Drug Resistance in Endemic Countries

    PubMed Central

    Gharbi, Myriam; Flegg, Jennifer A.; Pradines, Bruno; Berenger, Ako; Ndiaye, Magatte; Djimdé, Abdoulaye A.; Roper, Cally; Hubert, Véronique; Kendjo, Eric; Venkatesan, Meera; Brasseur, Philippe; Gaye, Oumar; Offianan, André T.; Penali, Louis; Le Bras, Jacques; Guérin, Philippe J.; Members of the French National Reference Center for Imported Malaria Study

    2013-01-01

    Introduction There are growing concerns about the emergence of resistance to artemisinin-based combination therapies (ACTs). Since the widespread adoption of ACTs, there has been a decrease in the systematic surveillance of antimalarial drug resistance in many malaria-endemic countries. The aim of this work was to test whether data on travellers returning from Africa with malaria could serve as an additional surveillance system of local information sources for the emergence of drug resistance in endemic-countries. Methodology Data were collected from travellers with symptomatic Plasmodium falciparum malaria returning from Senegal (n = 1,993), Mali (n = 2,372), Cote d’Ivoire (n = 4,778) or Cameroon (n = 3,272) and recorded in the French Malaria Reference Centre during the period 1996–2011. Temporal trends of the proportion of parasite isolates that carried the mutant genotype, pfcrt 76T, a marker of resistance to chloroquine (CQ) and pfdhfr 108N, a marker of resistance to pyrimethamine, were compared for travellers and within-country surveys that were identified through a literature review in PubMed. The in vitro response to CQ was also compared between these two groups for parasites from Senegal. Results The trends in the proportion of parasites that carried pfcrt 76T, and pfdhfr 108N, were compared for parasites from travellers and patients within-country using the slopes of the curves over time; no significant differences in the trends were found for any of the 4 countries. These results were supported by in vitro analysis of parasites from the field in Senegal and travellers returning to France, where the trends were also not significantly different. Conclusion The results have not shown different trends in resistance between parasites derived from travellers or from parasites within-country. This work highlights the value of an international database of drug responses in travellers as an additional tool to assess the emergence of drug

  1. The use of analytical surface tools in the fundamental study of wear. [atomic nature of wear

    NASA Technical Reports Server (NTRS)

    Buckley, D. H.

    1977-01-01

    Various techniques and surface tools available for the study of the atomic nature of the wear of materials are reviewed. These include chemical etching, x-ray diffraction, electron diffraction, scanning electron microscopy, low-energy electron diffraction, Auger emission spectroscopy analysis, electron spectroscopy for chemical analysis, field ion microscopy, and the atom probe. Properties of the surface and wear surface regions which affect wear, such as surface energy, crystal structure, crystallographic orientation, mode of dislocation behavior, and cohesive binding, are discussed. A number of mechanisms involved in the generation of wear particles are identified with the aid of the aforementioned tools.

  2. Infrared Spectroscopy as a Versatile Analytical Tool for the Quantitative Determination of Antioxidants in Agricultural Products, Foods and Plants

    PubMed Central

    Cozzolino, Daniel

    2015-01-01

    Spectroscopic methods provide very useful qualitative and quantitative information about the biochemistry and chemistry of antioxidants. Near infrared (NIR) and mid infrared (MIR) spectroscopy are considered powerful, fast, accurate and non-destructive analytical tools that can be regarded as a replacement for traditional chemical analysis. In recent years, several reports can be found in the literature demonstrating the usefulness of these methods in the analysis of antioxidants in different organic matrices. This article reviews recent applications of infrared (NIR and MIR) spectroscopy in the analysis of antioxidant compounds in a wide range of samples such as agricultural products, foods and plants. PMID:26783838

  3. Energy-dispersive X-ray fluorescence systems as analytical tool for assessment of contaminated soils.

    PubMed

    Vanhoof, Chris; Corthouts, Valère; Tirez, Kristof

    2004-04-01

    To determine the heavy metal content in soil samples at contaminated locations, a static and time-consuming procedure is used in most cases. Soil samples are collected and analyzed in the laboratory at high quality and high analytical costs. Demand is growing from government and consultants for a more dynamic approach, and from customers for analyses performed in the field with immediate feedback of the analytical results. Especially during the follow-up of remediation projects or during the determination of the sampling strategy, field analyses are advisable. For this purpose four types of ED-XRF systems, ranging from portable up to high performance laboratory systems, have been evaluated. The evaluation criteria are based on the performance characteristics of all the ED-XRF systems, such as limit of detection, accuracy and measurement uncertainty, on the one hand, and the influence of sample pretreatment on the obtained results on the other. The study proved that the field portable system and the bench top system, placed in a mobile van, can be applied as field techniques, resulting in semi-quantitative analytical results. A limited homogenization of the analyzed sample significantly increases the representativeness of the soil sample. The ED-XRF systems can be differentiated by their limits of detection, which are a factor of 10 to 20 higher for the portable system. The accuracy of the results and the measurement uncertainty also improved using the bench top system. Therefore, the selection criteria for applicability of both field systems are based on the required detection level and also the required accuracy of the results.

  4. INTRODUCTION TO ATTILA (ANALYTICAL TOOLS INTERFACE FOR LANDSCAPE ASSESSMENTS): AN ARCVIEW EXTENSION

    EPA Science Inventory

    Geographic Information Systems (GIS) have become a powerful tool in the field of landscape ecology. A common application of GIS is the generation of landscape indicators, which are quantitative measurements of the status or potential health of an area (e.g. ecological region, wat...

  5. Molecular modelling: An analytical tool with a predictive character for investigating reactivity in molten salt media.

    NASA Astrophysics Data System (ADS)

    Picard, Gérard S.; Bouyer, Frédéric C.

    1995-04-01

    Possibilities offered by Molecular Modelling for studying homogeneous and interfacial processes and reactions in melts are discussed. A few typical illustrative examples covering some of the main research fields of molten salt chemistry and electrochemistry are given. Quantum chemistry calculations, Molecular Dynamics and Monte Carlo methods appear to be fantastic tools for analyzing and predicting reactivity in molten salts.

  6. Analytical Tools To Distinguish the Effects of Localization Error, Confinement, and Medium Elasticity on the Velocity Autocorrelation Function

    PubMed Central

    Weber, Stephanie C.; Thompson, Michael A.; Moerner, W.E.; Spakowitz, Andrew J.; Theriot, Julie A.

    2012-01-01

    Single particle tracking is a powerful technique for investigating the dynamic behavior of biological molecules. However, many of the analytical tools are prone to generate results that can lead to mistaken interpretations of the underlying transport process. Here, we explore the effects of localization error and confinement on the velocity autocorrelation function, Cυ. We show that calculation of Cυ across a range of discretizations can distinguish the effects of localization error, confinement, and medium elasticity. Thus, under certain regimes, Cυ can be used as a diagnostic tool to identify the underlying mechanism of anomalous diffusion. Finally, we apply our analysis to experimental data sets of chromosomal loci and RNA-protein particles in Escherichia coli. PMID:22713559
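
    A bare-bones version of the quantity discussed above: the velocity autocorrelation computed from a tracked trajectory, for velocities defined over different discretizations delta. The synthetic confined trajectory with added localization error, and all parameter values, are assumptions for illustration; this is not the authors' analysis code.

        import numpy as np

        def velocity_autocorrelation(x, delta, max_lag):
            """C_v(tau) for velocities defined over a discretization of delta frames:
            v(t) = (x(t + delta) - x(t)) / delta, correlated at lags 0..max_lag."""
            v = (x[delta:] - x[:-delta]) / delta
            c = np.empty(max_lag + 1)
            for lag in range(max_lag + 1):
                a, b = v[:len(v) - lag], v[lag:]
                c[lag] = np.mean(np.sum(a * b, axis=-1))
            return c / c[0]                                # normalized so C_v(0) = 1

        # synthetic confined 2-D trajectory (OU-like) plus localization error
        rng = np.random.default_rng(2)
        n, kappa, sigma, loc_err = 5000, 0.05, 0.10, 0.03
        x = np.zeros((n, 2))
        for i in range(1, n):
            x[i] = x[i-1] - kappa * x[i-1] + sigma * rng.normal(size=2)
        x_obs = x + loc_err * rng.normal(size=x.shape)
        for delta in (1, 5, 20):
            print("delta =", delta, " C_v at lag 1:",
                  round(velocity_autocorrelation(x_obs, delta, 5)[1], 3))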

  7. A Multi-facetted Visual Analytics Tool for Exploratory Analysis of Human Brain and Function Datasets

    PubMed Central

    Angulo, Diego A.; Schneider, Cyril; Oliver, James H.; Charpak, Nathalie; Hernandez, Jose T.

    2016-01-01

    Brain research typically requires large amounts of data from different sources, and often of different nature. The use of different software tools adapted to the nature of each data source can make research work cumbersome and time consuming. It follows that data is not often used to its fullest potential thus limiting exploratory analysis. This paper presents an ancillary software tool called BRAVIZ that integrates interactive visualization with real-time statistical analyses, facilitating access to multi-facetted neuroscience data and automating many cumbersome and error-prone tasks required to explore such data. Rather than relying on abstract numerical indicators, BRAVIZ emphasizes brain images as the main object of the analysis process of individuals or groups. BRAVIZ facilitates exploration of trends or relationships to gain an integrated view of the phenomena studied, thus motivating discovery of new hypotheses. A case study is presented that incorporates brain structure and function outcomes together with different types of clinical data. PMID:27601990

  11. Electrochemical treatment of olive mill wastewater: treatment extent and effluent phenolic compounds monitoring using some uncommon analytical tools.

    PubMed

    Belaid, Chokri; Khadraoui, Moncef; Mseddii, Salma; Kallel, Monem; Elleuch, Boubaker; Fauvarque, Jean-François

    2013-01-01

    Problems related to industrial effluents can be divided into two parts: (1) their toxicity, associated with a chemical content that should be removed before the wastewater is discharged into the receiving media; and (2) the difficulty of characterising and monitoring the pollution, caused by the complexity of these matrices. This investigation deals with both aspects: an electrochemical treatment of olive mill wastewater (OMW) on platinized expanded titanium electrodes in a modified Grignard reactor for toxicity removal, and the exploration of some specific analytical tools to monitor the elimination of phenolic compounds from the effluent. The results showed that electrochemical oxidation is able to remove or mitigate the OMW pollution. Indeed, 87% of the OMW color was removed and all aromatic compounds disappeared from the solution by anodic oxidation. Moreover, the chemical oxygen demand (COD) and the total organic carbon (TOC) were reduced by 55%. On the other hand, UV-visible spectrophotometry, gas chromatography/mass spectrometry, cyclic voltammetry and 13C nuclear magnetic resonance (NMR) showed that the treatment efficiently eliminates phenolic compounds from OMW. It was concluded that electrochemical oxidation in a modified Grignard reactor is a promising process for the destruction of all phenolic compounds present in OMW. Among the monitoring analytical tools applied, cyclic voltammetry and 13C NMR are introduced here for the first time to follow the progress of the OMW treatment, and they gave close insight into the disappearance of polyphenols.

  12. Analytical continuation in physical geodesy constructed by means of tools and formulas related to an ellipsoid of revolution

    NASA Astrophysics Data System (ADS)

    Holota, Petr; Nesvadba, Otakar

    2014-05-01

    In physical geodesy the mathematical tools applied for solving problems of potential theory are often essentially associated with the concept of the so-called spherical approximation (interpreted as a mapping). The same holds true for the method of analytical (harmonic) continuation, which is frequently considered a means of converting ground gravity anomalies or disturbances to corresponding values on a level surface close to the original boundary. In the development and implementation of this technique, the key role is played by the representation of a harmonic function through the famous Poisson formula and by the construction of a radial derivative operator on the basis of this formula. In this contribution an attempt is made to avoid the spherical approximation mentioned above and to develop mathematical tools that allow the concept of analytical continuation to be implemented in a more general case, in particular for converting ground gravity anomalies or disturbances to corresponding values on the surface of an oblate ellipsoid of revolution. The respective integral kernels are constructed with the aid of series of ellipsoidal harmonics and their summation, and the mathematical nature of the boundary data is discussed in more detail.
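
    For orientation, a standard textbook form of the Poisson formula referred to above (quoted here from the physical-geodesy literature, not from the paper itself) expresses a function V, harmonic outside a sphere of radius R, at an exterior point (r, θ, λ) in terms of its boundary values:

    \[
    V(r,\theta,\lambda)=\frac{R\,(r^{2}-R^{2})}{4\pi}\iint_{\sigma}\frac{V(R,\theta',\lambda')}{\ell^{3}}\,d\sigma,
    \qquad
    \ell=\sqrt{r^{2}+R^{2}-2\,rR\cos\psi},
    \]

    where σ is the unit sphere and ψ the spherical distance between the computation point and the integration point. The radial derivative operator used in analytical continuation follows from differentiating this kernel with respect to r; the contribution above replaces this spherical kernel with ellipsoidal counterparts built from series of ellipsoidal harmonics.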

  13. Towards a minimally invasive sampling tool for high resolution tissue analytical mapping

    NASA Astrophysics Data System (ADS)

    Gottardi, R.

    2015-09-01

    Multiple spatial mapping techniques of biological tissues have been proposed over the years, but all present limitations either in terms of resolution, analytical capacity or invasiveness. Ren et al (2015 Nanotechnology 26 284001) propose in their most recent work the use of a picosecond infrared laser (PIRL) under conditions of ultrafast desorption by impulsive vibrational excitation (DIVE) to extract small amounts of cellular and molecular components, conserving their viability, structure and activity. The PIRL DIVE technique would then work as a nanobiopsy with minimal damage to the surrounding tissues, which could potentially be applied for high resolution local structural characterization of tissues in health and disease with the spatial limit determined by the laser focus.

  14. ELISA and GC-MS as Teaching Tools in the Undergraduate Environmental Analytical Chemistry Laboratory

    NASA Astrophysics Data System (ADS)

    Wilson, Ruth I.; Mathers, Dan T.; Mabury, Scott A.; Jorgensen, Greg M.

    2000-12-01

    An undergraduate experiment for the analysis of potential water pollutants is described. Students are exposed to two complementary techniques, ELISA and GC-MS, for the analysis of a water sample containing atrazine, desethylatrazine, and simazine. Atrazine was chosen as the target analyte because of its wide usage in North America and its utility for students to predict environmental degradation products. The water sample is concentrated using solid-phase extraction for GC-MS, or diluted and analyzed using a competitive ELISA test kit for atrazine. The nature of the water sample is such that students generally find that ELISA gives an artificially high value for the concentration of atrazine. Students gain an appreciation for problems associated with measuring pollutants in the aqueous environment: sensitivity, accuracy, precision, and ease of analysis. This undergraduate laboratory provides an opportunity for students to learn several new analysis and sample preparation techniques and to critically evaluate these methods in terms of when they are most useful.

  15. Innovative analytical tools to characterize prebiotic carbohydrates of functional food interest.

    PubMed

    Corradini, Claudio; Lantano, Claudia; Cavazza, Antonella

    2013-05-01

    Functional foods are one of the most interesting areas of research and innovation in the food industry. A functional food or functional ingredient is considered to be any food or food component that provides health benefits beyond basic nutrition. Recently, consumers have shown interest in natural bioactive compounds as functional ingredients in the diet owing to their various beneficial effects for health. Water-soluble fibers and nondigestible oligosaccharides and polysaccharides can be defined as functional food ingredients. Fructooligosaccharides (FOS) and inulin are resistant to direct metabolism by the host and reach the caecocolon, where they are used by selected groups of beneficial bacteria. Furthermore, they are able to improve physical and structural properties of food, such as hydration, oil-holding capacity, viscosity, texture, sensory characteristics, and shelf-life. This article reviews major innovative analytical developments to screen and identify FOS, inulins, and the most employed nonstarch carbohydrates added or naturally present in functional food formulations. High-performance anion-exchange chromatography with pulsed electrochemical detection (HPAEC-PED) is one of the most employed analytical techniques for the characterization of those molecules. Mass spectrometry is also of great help, in particular matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS), which is able to provide extensive information regarding the molecular weight and length profiles of oligosaccharides and polysaccharides. Moreover, MALDI-TOF-MS in combination with HPAEC-PED has been shown to be of great value for the complementary information it can provide. Some other techniques, such as NMR spectroscopy, are also discussed, with relevant examples of recent applications. A number of articles have appeared in the literature in recent years regarding the analysis of inulin, FOS, and other carbohydrates of interest in the field and

  16. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    SciTech Connect

    Robinson Khosah

    2007-07-31

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project was conducted in two phases. Phase One included the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two involved the development of a platform for on-line data analysis. Phase Two included the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now technically completed.

  17. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM 2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2006-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-second month of development activities.

  18. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2003-09-01

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase 1, which is currently in progress and will take twelve months to complete, will include the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. In Phase 2, which will be completed in the second year of the project, a platform for on-line data analysis will be developed. Phase 2 will include the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its eleventh month of Phase 1

  19. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    SciTech Connect

    Robinson P. Khosah; Frank T. Alex

    2007-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-eighth month of development activities.

  20. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2005-08-01

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its thirty-sixth month of development activities.

  1. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2003-03-13

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase 1, which is currently in progress and will take twelve months to complete, will include the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. In Phase 2, which will be completed in the second year of the project, a platform for on-line data analysis will be developed. Phase 2 will include the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its sixth month of Phase 1

  2. Nano-Scale Secondary Ion Mass Spectrometry - A new analytical tool in biogeochemistry and soil ecology

    SciTech Connect

    Herrmann, A M; Ritz, K; Nunan, N; Clode, P L; Pett-Ridge, J; Kilburn, M R; Murphy, D V; O'Donnell, A G; Stockdale, E A

    2006-10-18

    Soils are structurally heterogeneous across a wide range of spatio-temporal scales. Consequently, external environmental conditions do not have a uniform effect throughout the soil, resulting in a large diversity of micro-habitats. It has been suggested that soil function can be studied without explicit consideration of such fine detail, but recent research has indicated that the micro-scale distribution of organisms may be of importance for a mechanistic understanding of many soil functions. Due to a lack of techniques with adequate sensitivity for data collection at appropriate scales, the question 'How important are various soil processes acting at different scales for ecological function?' is challenging to answer. The nano-scale secondary ion mass spectrometer (NanoSIMS) represents the latest generation of ion microprobes, which link high-resolution microscopy with isotopic analysis. The main advantage of NanoSIMS over other secondary ion mass spectrometers is the ability to operate at high mass resolution, whilst maintaining both excellent signal transmission and spatial resolution (~50 nm). NanoSIMS has been used previously in studies focusing on presolar materials from meteorites, in material science, biology, geology and mineralogy. Recently, the potential of NanoSIMS as a new tool in the study of biophysical interfaces in soils has been demonstrated. This paper describes the principles of NanoSIMS and discusses the potential of this tool to contribute to the field of biogeochemistry and soil ecology. Practical considerations (sample size and preparation, simultaneous collection of isotopes, mass resolution, isobaric interference and quantification of the isotopes of interest) are discussed. Adequate sample preparation that avoids artifact-induced biases in the interpretation of NanoSIMS data, together with the identification of regions of interest, are the main concerns in using NanoSIMS as a new tool in biogeochemistry and soil ecology. Finally, we review the areas of

  3. Mineotaur: a tool for high-content microscopy screen sharing and visual analytics.

    PubMed

    Antal, Bálint; Chessel, Anatole; Carazo Salas, Rafael E

    2015-01-01

    High-throughput/high-content microscopy-based screens are powerful tools for functional genomics, yielding intracellular information down to the level of single cells for thousands of genotypic conditions. However, accessing their data requires specialized knowledge, and most often the data are no longer analyzed after initial publication. We describe Mineotaur (http://www.mineotaur.org), an open-source, downloadable web application that allows easy online sharing and interactive visualisation of large screen datasets, facilitating their dissemination and further analysis, and enhancing their impact.

  4. Public Interest Energy Research (PIER) Program Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    SciTech Connect

    Xu, Tengfang; Flapper, Joris; Ke, Jing; Kramer, Klaas; Sathaye, Jayant

    2012-02-01

    The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry, covering four dairy processes: cheese, fluid milk, butter, and milk powder. The BEST-Dairy tool developed in this project provides three options for the user to benchmark each of the dairy products included in the tool, with each option differentiated by the level of detail of the process or plant, i.e., (1) plant level, (2) process-group level, and (3) process-step level. For each detail level, the tool accounts for differences in production and other variables affecting energy use in dairy processes. The BEST-Dairy tool can be applied to a wide range of dairy facilities to provide energy and water savings estimates, which are based upon comparisons with the best available reference cases established through reviewing information from international and national samples. We have performed and completed alpha- and beta-testing (field testing) of the BEST-Dairy tool, through which feedback from voluntary users in the U.S. dairy industry was gathered to validate and improve the tool's functionality. BEST-Dairy v1.2 was formally published in May 2011 and has been made available for free download from the internet (i.e., http://best-dairy.lbl.gov). A user's manual has been developed and published as the companion documentation for use with the BEST-Dairy tool. In addition, we also carried out technology transfer activities by engaging the dairy industry in the process of tool development and testing, including field testing, technical presentations, and technical assistance throughout the project. To date, users from more than ten countries in addition to those in the U.S. have downloaded BEST-Dairy from the LBNL website. It is expected that the use of the BEST-Dairy tool will advance understanding of energy and water

  5. Study of an ultrasound-based process analytical tool for homogenization of nanoparticulate pharmaceutical vehicles.

    PubMed

    Cavegn, Martin; Douglas, Ryan; Akkermans, Guy; Kuentz, Martin

    2011-08-01

    There are currently no adequate process analyzers for nanoparticulate viscosity enhancers. This article aims to evaluate ultrasonic resonator technology as a monitoring tool for the homogenization of nanoparticulate gels. Aqueous dispersions of colloidal microcrystalline cellulose (MCC) and a mixture of clay particles with xanthan gum were compared with colloidal silicon dioxide in oil. The processing was conducted using a laboratory-scale homogenizing vessel. The study first investigated the homogenization kinetics of the different systems and then focused on process factors in the case of colloidal MCC. Moreover, rheological properties were analyzed offline to assess the structure of the resulting gels. Results showed the suitability of ultrasound velocimetry to monitor the homogenization process. The obtained data were fitted using a novel heuristic model, and it was possible to identify characteristic homogenization times for each formulation. The subsequent study of the process factors demonstrated that ultrasonic process analysis was as sensitive as offline rheological measurements in detecting subtle manufacturing changes. It can be concluded that the ultrasonic method was able to successfully assess homogenization of nanoparticulate viscosity enhancers. This novel technique can become a vital tool for the development and production of pharmaceutical suspensions in the future.
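
    The authors' heuristic model is not specified in the abstract; the sketch below assumes a simple exponential approach to a plateau purely to illustrate how a characteristic homogenization time could be extracted from in-line ultrasound velocity data.

```python
import numpy as np
from scipy.optimize import curve_fit

# The abstract does not give the authors' heuristic model, so this sketch
# assumes a simple exponential approach of the measured ultrasound velocity to
# a plateau, v(t) = v_inf + (v_0 - v_inf) * exp(-t / tau), purely to illustrate
# how a characteristic homogenization time tau could be extracted from
# in-line velocimetry data.

def velocity_model(t, v_inf, v0, tau):
    return v_inf + (v0 - v_inf) * np.exp(-t / tau)

# Synthetic monitoring data (minutes vs m/s) standing in for real measurements.
t = np.linspace(0.0, 60.0, 61)
rng = np.random.default_rng(1)
v_meas = velocity_model(t, 1498.0, 1492.0, 12.0) + rng.normal(0.0, 0.05, t.size)

(v_inf_fit, v0_fit, tau_fit), _ = curve_fit(velocity_model, t, v_meas,
                                            p0=(v_meas[-1], v_meas[0], 10.0))
print(f"characteristic homogenization time ~ {tau_fit:.1f} min")
```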

  6. A Cognitive Tool for Teaching the Addition/Subtraction of Common Fractions: A Model of Affordances

    ERIC Educational Resources Information Center

    Kong, Siu Cheung; Kwok, Lam For

    2005-01-01

    The aim of this research is to devise a cognitive tool for meeting the diverse needs of learners for comprehending new procedural knowledge. A model of affordances on teaching fraction equivalence for developing procedural knowledge for adding/subtracting fractions with unlike denominators was derived from the results of a case study of an initial…

  7. Magnetic optical sensor particles: a flexible analytical tool for microfluidic devices.

    PubMed

    Ungerböck, Birgit; Fellinger, Siegfried; Sulzer, Philipp; Abel, Tobias; Mayr, Torsten

    2014-05-21

    In this study we evaluate magnetic optical sensor particles (MOSePs) with incorporated sensing functionalities regarding their applicability in microfluidic devices. MOSePs can be separated from the surrounding solution to form in situ sensor spots within microfluidic channels, while read-out is accomplished outside the chip. These magnetic sensor spots exhibit benefits of sensor layers (high brightness and convenient usage) combined with the advantages of dispersed sensor particles (ease of integration). The accumulation characteristics of MOSePs with different diameters were investigated as well as the in situ sensor spot stability at varying flow rates. Magnetic sensor spots were stable at flow rates specific to microfluidic applications. Furthermore, MOSePs were optimized regarding fiber optic and imaging read-out systems, and different referencing schemes were critically discussed on the example of oxygen sensors. While the fiber optic sensing system delivered precise and accurate results for measurement in microfluidic channels, limitations due to analyte consumption were found for microscopic oxygen imaging. A compensation strategy is provided, which utilizes simple pre-conditioning by exposure to light. Finally, new application possibilities were addressed, being enabled by the use of MOSePs. They can be used for microscopic oxygen imaging in any chip with optically transparent covers, can serve as flexible sensor spots to monitor enzymatic activity or can be applied to form fixed sensor spots inside microfluidic structures, which would be inaccessible to integration of sensor layers.

  8. Tools for the quantitative analysis of sedimentation boundaries detected by fluorescence optical analytical ultracentrifugation.

    PubMed

    Zhao, Huaying; Casillas, Ernesto; Shroff, Hari; Patterson, George H; Schuck, Peter

    2013-01-01

    Fluorescence optical detection in sedimentation velocity analytical ultracentrifugation allows the study of macromolecules at nanomolar concentrations and below. This has significant promise, for example, for the study of systems of high-affinity protein interactions. Here we describe adaptations of the direct boundary modeling analysis approach implemented in the software SEDFIT that were developed to accommodate unique characteristics of the confocal fluorescence detection system. These include spatial gradients of signal intensity due to scanner movements out of the plane of rotation, temporal intensity drifts due to instability of the laser and fluorophores, and masking of the finite excitation and detection cone by the sample holder. In an extensive series of experiments with enhanced green fluorescent protein ranging from low nanomolar to low micromolar concentrations, we show that the experimental data provide sufficient information to determine the parameters required for first-order approximation of the impact of these effects on the recorded data. Systematic deviations of fluorescence optical sedimentation velocity data analyzed using conventional sedimentation models developed for absorbance and interference optics are largely removed after these adaptations, resulting in excellent fits that highlight the high precision of fluorescence sedimentation velocity data, thus allowing a more detailed quantitative interpretation of the signal boundaries that is otherwise not possible for this system.

  9. An analytical tool-box for comprehensive biochemical, structural and transcriptome evaluation of oral biofilms mediated by mutans streptococci.

    PubMed

    Klein, Marlise I; Xiao, Jin; Heydorn, Arne; Koo, Hyun

    2011-01-25

    Biofilms are highly dynamic, organized and structured communities of microbial cells enmeshed in an extracellular matrix of variable density and composition (1, 2). In general, biofilms develop from initial microbial attachment on a surface followed by formation of cell clusters (or microcolonies) and further development and stabilization of the microcolonies, which occur in a complex extracellular matrix. The majority of biofilm matrices harbor exopolysaccharides (EPS), and dental biofilms are no exception; especially those associated with caries disease, which are mostly mediated by mutans streptococci (3). The EPS are synthesized by microorganisms (S. mutans, a key contributor) by means of extracellular enzymes, such as glucosyltransferases using sucrose primarily as substrate (3). Studies of biofilms formed on tooth surfaces are particularly challenging owing to their constant exposure to environmental challenges associated with complex diet-host-microbial interactions occurring in the oral cavity. Better understanding of the dynamic changes of the structural organization and composition of the matrix, physiology and transcriptome/proteome profile of biofilm-cells in response to these complex interactions would further advance the current knowledge of how oral biofilms modulate pathogenicity. Therefore, we have developed an analytical tool-box to facilitate biofilm analysis at structural, biochemical and molecular levels by combining commonly available and novel techniques with custom-made software for data analysis. Standard analytical (colorimetric assays, RT-qPCR and microarrays) and novel fluorescence techniques (for simultaneous labeling of bacteria and EPS) were integrated with specific software for data analysis to address the complex nature of oral biofilm research. The tool-box is comprised of 4 distinct but interconnected steps (Figure 1): 1) Bioassays, 2) Raw Data Input, 3) Data Processing, and 4) Data Analysis. We used our in vitro biofilm model and

  10. Photography by Cameras Integrated in Smartphones as a Tool for Analytical Chemistry Represented by an Butyrylcholinesterase Activity Assay.

    PubMed

    Pohanka, Miroslav

    2015-06-11

    Smartphones are popular devices frequently equipped with sensitive sensors and great computational ability. Despite the widespread availability of smartphones, practical uses in analytical chemistry are limited, though some papers have proposed promising applications. In the present paper, a smartphone is used as a tool for the determination of cholinesterasemia, i.e., the determination of the biochemical marker butyrylcholinesterase (BChE). The work demonstrates the suitability of a smartphone-integrated camera for analytical purposes. Paper strips soaked with indoxylacetate were used for the determination of BChE activity, while the standard Ellman's assay was used as a reference measurement. In the smartphone-based assay, BChE converted indoxylacetate to indigo blue and the coloration was photographed using the phone's integrated camera. An RGB color model was analyzed and color values for the individual color channels were determined. The assay was verified using plasma samples and samples containing pure BChE, and validated against Ellman's assay. The smartphone assay proved to be reliable and applicable for routine diagnoses where BChE serves as a marker (liver function tests, some poisonings, etc.), and it is expected to be of practical use because of the relevance of its results.
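
    The color read-out step lends itself to a short sketch. The following code averages the RGB channels over a region of interest of a strip photograph using Pillow and NumPy; the file name, region coordinates and the mapping to activity are assumptions for illustration, not details of the published assay.

```python
import numpy as np
from PIL import Image

# Minimal sketch of the RGB read-out step described above: average each color
# channel over a region of interest on the photographed indicator strip. The
# file name, ROI coordinates and any calibration are illustrative assumptions,
# not details of the published assay.

def mean_rgb(image_path, roi):
    """Return the mean (R, G, B) values over roi = (left, upper, right, lower)."""
    img = Image.open(image_path).convert("RGB")
    pixels = np.asarray(img.crop(roi), dtype=float)
    return pixels.reshape(-1, 3).mean(axis=0)

if __name__ == "__main__":
    r, g, b = mean_rgb("strip_photo.jpg", roi=(100, 100, 300, 300))
    # In the assay, indoxylacetate conversion to indigo blue darkens the strip;
    # a channel value (or a ratio such as B/R) would then be mapped to BChE
    # activity via a calibration curve prepared from standards.
    print(f"mean channel values: R={r:.1f}, G={g:.1f}, B={b:.1f}")
```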

  11. The management and exploitation of naturally light-emitting bacteria as a flexible analytical tool: A tutorial.

    PubMed

    Bolelli, L; Ferri, E N; Girotti, S

    2016-08-31

    Conventional detection of toxic contaminants on surfaces, in food, and in the environment takes time. Current analytical approaches to chemical detection can be of limited utility due to long detection times, high costs, and the need for a laboratory and trained personnel. A non-specific but easy, rapid, and inexpensive screening test can be useful to quickly classify a specimen as toxic or non-toxic, so that prompt appropriate measures can be taken exactly where required. Bioluminescent bacteria-based tests meet all these characteristics. Bioluminescence methods are extremely attractive because of their high sensitivity, speed, ease of implementation, and statistical significance. They are usually sensitive enough to detect the majority of pollutants toxic to humans and mammals. This tutorial provides practical guidelines for isolating, cultivating, and exploiting marine bioluminescent bacteria as a simple and versatile analytical tool. Although mostly applied to aqueous-phase samples and organic extracts, the test can also be conducted directly on soil and sediment samples so as to reflect the true toxicity due to the bioavailable fraction. Because tests can be performed with freeze-dried cell preparations, they could make a major contribution to field screening activity. They can be easily conducted in a mobile environmental laboratory and may be adaptable to miniaturized field instruments and field test kits. PMID:27506340

  13. Towards a green analytical laboratory: microextraction techniques as a useful tool for the monitoring of polluted soils

    NASA Astrophysics Data System (ADS)

    Lopez-Garcia, Ignacio; Viñas, Pilar; Campillo, Natalia; Hernandez Cordoba, Manuel; Perez Sirvent, Carmen

    2016-04-01

    Microextraction techniques are a valuable tool in the analytical laboratory, since they allow sensitive measurements of pollutants to be carried out with easily available instrumentation. There is a large number of such procedures, involving miniaturized liquid-liquid or liquid-solid extractions, with the common denominator of using very low amounts (only a few microliters) of organic solvents, or even none. Since minimal amounts of reagents are involved, and the generation of residues is consequently minimized, the approach falls within the concept of Green Analytical Chemistry. This general methodology is useful for both inorganic and organic pollutants. Thus, low amounts of metallic ions can be measured without the need for ICP-MS, since this instrument can be replaced by a simple AAS spectrometer, which is commonly present in any laboratory and involves low acquisition and maintenance costs. When dealing with organic pollutants, the microextracts obtained can be introduced into liquid or gas chromatographs equipped with common detectors, and there is no need for the most sophisticated and expensive mass spectrometers. This communication reports an overview of the advantages of such a methodology and gives examples of the determination of some particular contaminants in soil and water samples. The authors are grateful to the Comunidad Autónoma de la Región de Murcia, Spain (Fundación Séneca, 19888/GERM/15) for financial support.

  14. Twenty-one years of microemulsion electrokinetic chromatography (1991-2012): a powerful analytical tool.

    PubMed

    Yang, Hua; Ding, Yao; Cao, Jun; Li, Ping

    2013-05-01

    Microemulsion electrokinetic chromatography (MEEKC) is a CE separation technique which utilizes buffered microemulsions as the separation media. In the past two decades, MEEKC has blossomed into a powerful separation technique for the analysis of a wide range of compounds. Pseudostationary phase composition is critical to successful resolution in EKC, and several variables can be optimized, including surfactant/co-surfactant/oil type and concentration, buffer content, and pH. Additionally, MEEKC coupled with online sample preconcentration approaches can significantly improve detection sensitivity. This review comprehensively describes the development of MEEKC over the period 1991 to 2012. Areas covered include basic theory, microemulsion composition, methods for improving resolution and enhancing sensitivity, detection techniques, and applications of MEEKC. PMID:23463608

  15. Femtosecond pulse shaping as analytic tool in mass spectrometry of complex polyatomic systems

    NASA Astrophysics Data System (ADS)

    Laarmann, Tim; Shchatsinin, Ihar; Singh, Pushkar; Zhavoronkov, Nickolai; Schulz, Claus Peter; Hertel, Ingolf Volker

    2008-04-01

    An additional dimension to mass spectrometric studies on the building blocks of proteins is discussed in this paper. The present approach is based on tailored femtosecond laser pulses, using the concept of strong-field pulse shaping in an adaptive feedback loop. We show that control strategies making use of the coherent properties of the electromagnetic wave allow one to break pre-selected backbone bonds in amino acid complexes that may be regarded as peptide model systems. Studies on different chromophores, such as phenylalanine and alanine, while keeping the backbone structure unchanged, elucidate the effect of the excitation dynamics on the relaxation pathways. The observation of protonated species in the corresponding mass spectra indicates that optimal control of ultrafast laser pulses may even be useful for studying intramolecular reactions such as hydrogen or proton transfer in particular cases. This opens new perspectives for biophysical and biochemical research, since such photochemical reactions are suggested to explain, e.g., the photostability of DNA.

  16. Analytical speciation as a tool to assess arsenic behaviour in soils polluted by mining.

    PubMed

    Ruiz-Chancho, M J; López-Sánchez, J F; Rubio, R

    2007-01-01

    A study is performed to evaluate the occurrence of arsenic in polluted soils using acidic extractions and liquid chromatography-hydride generation-atomic fluorescence spectrometry (LC-HG-AFS) for speciation analysis. Seven soil samples were collected in an abandoned area polluted by mining in the Eastern Pyrenees (Spain), and two uncontaminated soils were taken for reference purposes. Moreover, the total arsenic content is evaluated in two different sieved fractions in order to obtain information on the possible particle-size-dependent association of arsenic with soil components. Soil samples were extracted with both phosphoric and ascorbic acids and the stabilities of the extracted species were studied. The arsenic species were determined by LC-HG-AFS. In addition, the ability of soil grinding to effect species change is also assessed. Arsenite and arsenate were found in the polluted soils, but only arsenate was found in the unpolluted soils. The quality of the results was assessed through a mass balance calculation and by analysing two soil Certified Reference Materials. Valuable information regarding arsenic occurrence in the studied soils is obtained from the speciation results. The presence of arsenite in the extracts can be attributed to arsenopyrite residues, whereas the presence of arsenate indicates release from weathered material. PMID:17171341

  17. Tactics for modeling multiple salivary analyte data in relation to behavior problems: Additive, ratio, and interaction effects.

    PubMed

    Chen, Frances R; Raine, Adrian; Granger, Douglas A

    2015-01-01

    Individual differences in the psychobiology of the stress response have been linked to behavior problems in youth, yet most research has focused on single signaling molecules released by either the hypothalamic-pituitary-adrenal axis or the autonomic nervous system. As our understanding of biobehavioral relationships develops, it is clear that multiple signals from the biological stress systems work in coordination to affect behavior problems. Questions are raised as to whether coordinated effects should be statistically represented as ratio or interaction terms. We address this knowledge gap by providing a theoretical overview of the concepts and rationales and illustrating the analytical tactics. Salivary samples collected from 446 youth aged 11-12 were assayed for salivary alpha-amylase (sAA), dehydroepiandrosterone-sulfate (DHEA-s) and cortisol. The coordinated effect of DHEA-s and cortisol, and that of sAA and cortisol, on externalizing and internalizing problems (Child Behavior Checklist) were tested with the ratio and the interaction approaches using multi-group path analysis. Findings consistent with previous studies include a positive association between the cortisol/DHEA-s ratio and internalizing problems, and a negative association between cortisol and externalizing problems conditional on low levels of sAA. This study highlights the importance of matching the analytical strategy with the research hypothesis when integrating salivary bioscience into research on behavior problems. Recommendations are made for investigating multiple salivary analytes in relation to behavior problems. PMID:25462892
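
    The distinction between the ratio and the interaction tactics can be made concrete with a small sketch. The data below are synthetic and ordinary least squares stands in for the authors' multi-group path analysis; the sketch only shows how the two kinds of terms are built.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative contrast of the two tactics discussed above: a ratio term versus
# an interaction term. Synthetic values stand in for the salivary measures, and
# ordinary least squares stands in for the authors' multi-group path analysis;
# the point is only to show how the two kinds of terms are constructed.

rng = np.random.default_rng(42)
n = 446
df = pd.DataFrame({
    "cortisol": rng.lognormal(mean=-1.0, sigma=0.4, size=n),
    "dhea_s":   rng.lognormal(mean=0.5,  sigma=0.4, size=n),
    "saa":      rng.lognormal(mean=3.0,  sigma=0.5, size=n),
})
df["internalizing"] = 0.8 * df["cortisol"] / df["dhea_s"] + rng.normal(0.0, 1.0, n)

# Tactic 1: ratio approach -- a single composite predictor.
df["cort_dhea_ratio"] = df["cortisol"] / df["dhea_s"]
ratio_fit = smf.ols("internalizing ~ cort_dhea_ratio", data=df).fit()

# Tactic 2: interaction approach -- both main effects plus their product.
interaction_fit = smf.ols("internalizing ~ cortisol * dhea_s", data=df).fit()

print(ratio_fit.params, "\n")
print(interaction_fit.params)
```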

  19. The backscatter electron signal as an additional tool for phase segmentation in electron backscatter diffraction.

    PubMed

    Payton, E J; Nolze, G

    2013-08-01

    The advent of simultaneous energy dispersive X-ray spectroscopy (EDS) data collection has vastly improved the phase separation capabilities of electron backscatter diffraction (EBSD) mapping. A major problem remains, however, in distinguishing between multiple cubic phases in a specimen, especially when the compositions of the phases are similar or their particle sizes are small, because the EDS interaction volume is much larger than that of EBSD and the EDS spectra collected during spatial mapping are generally noisy due to time limitations and the need to minimize sample drift. The backscatter electron (BSE) signal is very sensitive to the local composition due to its atomic number (Z) dependence. BSE imaging is investigated as a complementary tool to EDS to assist phase segmentation and identification in EBSD, through examination of specimens of meteorite, Cu dross, and steel oxidation layers. The results demonstrate that the simultaneous acquisition of EBSD patterns, EDS spectra, and the BSE signal can provide new potential for advancing multiphase material characterization in the scanning electron microscope. PMID:23575349
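
    One hedged way to exploit the extra BSE channel is to cluster pixels in a joint BSE/EDS feature space. The sketch below uses synthetic maps and k-means purely for illustration; it is not the segmentation procedure of the paper.

```python
import numpy as np
from sklearn.cluster import KMeans

# Illustrative sketch of the idea above: combine the BSE intensity map with an
# EDS channel (here, assumed Fe counts) as a joint feature space and segment
# phases by clustering. Array shapes, channel choice and the number of phases
# are assumptions; real EBSD/EDS/BSE maps come from the acquisition software.

rng = np.random.default_rng(0)
h, w = 128, 128
phase_truth = (rng.random((h, w)) > 0.5).astype(int)           # two cubic phases
bse = 0.4 + 0.3 * phase_truth + rng.normal(0, 0.05, (h, w))    # Z-contrast signal
eds_fe = 50 + 30 * phase_truth + rng.normal(0, 15, (h, w))     # noisy EDS counts

features = np.column_stack([bse.ravel(), (eds_fe / eds_fe.max()).ravel()])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
segmented = labels.reshape(h, w)

agreement = max(np.mean(segmented == phase_truth), np.mean(segmented != phase_truth))
print(f"fraction of pixels assigned consistently with the synthetic truth: {agreement:.2f}")
```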

  20. The role of methanol addition to water samples in reducing analyte adsorption and matrix effects in liquid chromatography-tandem mass spectrometry.

    PubMed

    Li, Wei; Liu, Yucan; Duan, Jinming; Saint, Christopher P; Mulcahy, Dennis

    2015-04-10

    Liquid chromatography-tandem mass spectrometry (LC-MS/MS) analysis coupled simply with water filtering before injection has proven to be a simple, economic and time-saving method for analyzing trace-level organic pollutants in aqueous environments. However, the linearity, precision and detection limits of such methods for late-eluting analytes were found to be much poorer than for early-eluting ones due to adsorption of the analytes in the operating system, such as sample vial, flow path and sample loop, creating problems in quantitative analysis. Addition of methanol (MeOH) into water samples as a modifier was shown to be effective in alleviating or even eliminating the negative effect on signal intensity for the late-eluting analytes and at the same time being able to reduce certain matrix effects for real water samples. Based on the maximum detection signal intensity obtained on desorption of the analytes with MeOH addition, the ratio of the detection signal intensity without addition of MeOH to the maximum intensity can be used to evaluate the effectiveness of methanol addition. Accordingly, the values of <50%, 50-80%, 80-120% could be used to indicate strong, medium and no effects, respectively. Based on this concept, an external matrix-matched calibration method with the addition of MeOH has been successfully established for analyzing fifteen pesticides with diverse physico-chemical properties in surface and groundwater with good linearity (r²: 0.9929-0.9996), precision (intra-day relative standard deviation (RSD): 1.4-10.7%, inter-day RSD: 1.5-9.4%), accuracy (76.9-126.7%) and low limits of detection (0.003-0.028 μg/L).
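
    The <50% / 50-80% / 80-120% criterion quoted above translates directly into a few lines of code; the peak areas in the example calls are invented for illustration.

```python
# Minimal sketch of the evaluation criterion described above: the ratio of the
# detection signal without methanol addition to the maximum signal obtained on
# desorption with methanol classifies the adsorption effect for each analyte.

def adsorption_effect(signal_no_meoh: float, signal_max_with_meoh: float) -> str:
    """Classify analyte adsorption per the <50% / 50-80% / 80-120% criterion."""
    ratio = 100.0 * signal_no_meoh / signal_max_with_meoh
    if ratio < 50:
        return "strong adsorption effect"
    if ratio < 80:
        return "medium adsorption effect"
    if ratio <= 120:
        return "no significant adsorption effect"
    return "ratio above 120% - check for signal enhancement or measurement error"

# Illustrative peak areas (arbitrary units) for a late- and an early-eluting analyte.
print(adsorption_effect(3.1e4, 9.5e4))   # strong effect
print(adsorption_effect(8.8e4, 9.5e4))   # no significant effect
```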

  1. Alerting strategies in computerized physician order entry: a novel use of a dashboard-style analytics tool in a children's hospital.

    PubMed

    Reynolds, George; Boyer, Dean; Mackey, Kevin; Povondra, Lynne; Cummings, Allana

    2008-01-01

    Utilizing a commercially available business analytics tool offering dashboard-style graphical indicators and a data warehouse strategy, we have developed an interactive, web-based platform that allows near-real-time analysis of CPOE adoption by hospital area and practitioner specialty. Clinical Decision Support (CDS) metrics include the percentage of alerts that result in a change in clinician decision-making. This tool facilitates adjustments in alert limits in order to reduce alert fatigue.

  2. VISAD: an interactive and visual analytical tool for the detection of behavioral anomalies in maritime traffic data

    NASA Astrophysics Data System (ADS)

    Riveiro, Maria; Falkman, Göran; Ziemke, Tom; Warston, Håkan

    2009-05-01

    Monitoring the surveillance of large sea areas normally involves the analysis of huge quantities of heterogeneous data from multiple sources (radars, cameras, automatic identification systems, reports, etc.). The rapid identification of anomalous behavior or any threat activity in the data is an important objective for enabling homeland security. While it is worth acknowledging that many existing mining applications support identification of anomalous behavior, autonomous anomaly detection systems are rarely used in the real world. There are two main reasons: (1) the detection of anomalous behavior is normally not a well-defined and structured problem and therefore, automatic data mining approaches do not work well and (2) the difficulties that these systems have regarding the representation and employment of the prior knowledge that the users bring to their tasks. In order to overcome these limitations, we believe that human involvement in the entire discovery process is crucial. Using a visual analytics process model as a framework, we present VISAD: an interactive, visual knowledge discovery tool for supporting the detection and identification of anomalous behavior in maritime traffic data. VISAD supports the insertion of human expert knowledge in (1) the preparation of the system, (2) the establishment of the normal picture and (3) in the actual detection of rare events. For each of these three modules, VISAD implements different layers of data mining, visualization and interaction techniques. Thus, the detection procedure becomes transparent to the user, which increases his/her confidence and trust in the system and overall, in the whole discovery process.

  3. Polar Bears or People?: How Framing Can Provide a Useful Analytic Tool to Understand & Improve Climate Change Communication in Classrooms

    NASA Astrophysics Data System (ADS)

    Busch, K. C.

    2014-12-01

    Not only will young adults bear the brunt of climate change's effects, they are also the ones who will be required to take action - to mitigate and to adapt. The Next Generation Science Standards include climate change, ensuring the topic will be covered in U.S. science classrooms in the near future. Additionally, school is a primary source of information about climate change for young adults. The larger question, though, is how can the teaching of climate change be done in such a way as to ascribe agency - a willingness to act - to students? Framing - as both a theory and an analytic method - has been used to understand how language in the media can affect the audience's intention to act. Frames function as a two-way filter, affecting both the message sent and the message received. This study adapted both the theory and the analytic methods of framing, applying them to teachers in the classroom to answer the research question: How do teachers frame climate change in the classroom? To answer this question, twenty-five lessons from seven teachers were analyzed using semiotic discourse analysis methods. It was found that the teachers' frames overlapped to form two distinct discourses: a Science Discourse and a Social Discourse. The Science Discourse, which was dominant, can be summarized as: Climate change is a current scientific problem that will have profound global effects on the Earth's physical systems. The Social Discourse, used much less often, can be summarized as: Climate change is a future social issue because it will have negative impacts at the local level on people. While it may not be surprising that the Science Discourse was most often heard in these science classrooms, it is possibly problematic if it were the only discourse used. The research literature on framing indicates that the frames found in the Science Discourse - global scale, scientific statistics and facts, and impact on the Earth's systems - are not likely to inspire action-taking. This

  4. Recent Additions in the Modeling Capabilities of an Open-Source Wave Energy Converter Design Tool: Preprint

    SciTech Connect

    Tom, N.; Lawson, M.; Yu, Y. H.

    2015-04-20

    WEC-Sim is a mid-fidelity numerical tool for modeling wave energy conversion devices. The code uses the MATLAB SimMechanics package to solve multibody dynamics and models wave interactions using hydrodynamic coefficients derived from frequency-domain boundary-element methods. This paper presents the new modeling features introduced in the latest release of WEC-Sim. The first feature discussed is the conversion of the fluid memory kernel to a state-space form. This enhancement offers a substantial computational benefit after the hydrodynamic body-to-body coefficients are introduced, as the number of interactions increases exponentially with each additional body. Additional features include the ability to calculate the wave-excitation forces based on the instantaneous incident wave angle, allowing the device to weathervane, as well as the ability to import a user-defined wave elevation time series. A review of the hydrodynamic theory for each feature is provided and the successful implementation is verified using test cases.
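
    The computational point about the fluid memory kernel can be illustrated with a toy example. The sketch below (plain Python, not WEC-Sim code) replaces the radiation convolution for an illustrative exponential kernel with its exact one-state state-space realization; the kernel, time step and input velocity are made-up values.

      import numpy as np

      # Minimal sketch: replacing the radiation-force convolution integral with an
      # equivalent state-space model.  For the illustrative kernel K(t) = k0*exp(-a*t),
      # the convolution y(t) = \int K(t-s) u(s) ds is exactly reproduced by the
      # one-state system  x' = -a*x + u,  y = k0*x.

      dt, T = 0.01, 20.0
      t = np.arange(0.0, T, dt)
      u = np.sin(0.8 * t)                      # body velocity (illustrative input)
      k0, a = 2.0, 0.5
      K = k0 * np.exp(-a * t)                  # fluid-memory (retardation) kernel

      # Direct convolution: O(N^2) work over the whole history at every step.
      y_conv = np.convolve(K, u)[:len(t)] * dt

      # State-space realization: O(N) work, one state update per time step.
      x, y_ss = 0.0, np.zeros_like(t)
      for i in range(1, len(t)):
          x += dt * (-a * x + u[i - 1])        # forward-Euler state update
          y_ss[i] = k0 * x

      print("max |difference| =", np.max(np.abs(y_conv - y_ss)))   # small, O(dt)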

  5. Development of reliable analytical tools for evaluating the influence of reductive winemaking on the quality of Lugana wines.

    PubMed

    Mattivi, Fulvio; Fedrizzi, Bruno; Zenato, Alberto; Tiefenthaler, Paolo; Tempesta, Silvano; Perenzoni, Daniele; Cantarella, Paolo; Simeoni, Federico; Vrhovsek, Urska

    2012-06-30

    This paper presents methods for the definition of important analytical tools, such as the development of sensitive and rapid methods for analysing reduced and oxidised glutathione (GSH and GSSG), hydroxycinnamic acids (HCA), bound thiols (GSH-3MH and Cys-3MH) and free thiols (3MH and 3MHA), and their first application to evaluate the effect of reductive winemaking on the composition of Lugana juices and wines. Lugana is a traditional white wine from the Lake Garda region (Italy), produced using a local grape variety, Trebbiano di Lugana. An innovative winemaking procedure based on preliminary cooling of grape berries followed by crushing in an inert environment was implemented and explored on a winery scale. The effects of these procedures on hydroxycinnamic acids, GSH, GSSG, free and bound thiols and flavanols content were investigated. The juices and wines produced using different protocols were examined. Moreover, wines aged in tanks for 1, 2 and 3 months were analysed. The high level of GSH found in Lugana grapes, which can act as a natural antioxidant and be preserved in must and young wines, thus reducing the need of exogenous antioxidants, was particularly interesting. Moreover, it was clear that polyphenol concentrations (hydroxycinnamic acids and catechins) were strongly influenced by winemaking and pressing conditions, which required fine tuning of pressing. Above-threshold levels of 3-mercaptohexan-1-ol (3MH) and 3-mercaptohexyl acetate (3MHA) were found in the wines and changed according to the winemaking procedure applied. Interestingly, the evolution during the first three months also varied depending on the procedure adopted. Organic synthesis of cysteine and glutathione conjugates was carried out and juices and wines were subjected to LC-MS/MS analysis. These two molecules appeared to be strongly affected by the winemaking procedure, but did not show any significant change during the first 3 months of post-bottling ageing. This supports the theory

  6. Structural and compositional changes of dissolved organic matter upon solid-phase extraction tracked by multiple analytical tools.

    PubMed

    Chen, Meilian; Kim, Sunghwan; Park, Jae-Eun; Jung, Heon-Jae; Hur, Jin

    2016-09-01

    Although PPL-based solid-phase extraction (SPE) has been widely used before dissolved organic matter (DOM) analyses via advanced measurements such as ultrahigh resolution Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR-MS), much is still unknown about the structural and compositional changes in the DOM pool through SPE. In this study, selected DOM from various sources were tested to elucidate the differences before and after SPE utilizing multiple analytical tools including fluorescence spectroscopy, FT-ICR-MS, and size exclusion chromatography with organic carbon detection (SEC-OCD). The changes in specific UV absorbance indicated a decrease in aromaticity after SPE, suggesting a preferential exclusion of aromatic DOM structures, which was also confirmed by the substantial reduction of fluorescent DOM (FDOM). Furthermore, SEC-OCD results exhibited very low recoveries (1-9%) for the biopolymer fraction, implying that PPL needs to be used cautiously as an SPE sorbent material for treating high-molecular-weight compounds (i.e., polysaccharides, proteins, and amino sugars). A careful examination via FT-ICR-MS revealed that the formulas lost by SPE may all be DOM source-dependent. Nevertheless, the dominant missing compound groups were identified as the tannins group with high O/C ratios (>0.7), lignins/carboxyl-rich alicyclic molecules (CRAM), aliphatics with high H/C (>1.5), and heteroatomic formulas, all of which were dominated by pseudo-analogous molecular formula families differing in methylene (-CH2) units. Our findings shed new light on potential changes in the compound composition and molecular weight of DOM upon SPE, implying that precautions are needed for data interpretation. Graphical Abstract: Tracking the characteristics of DOM from various origins upon PPL-based SPE utilizing EEM-PARAFAC, SEC-OCD, and FT-ICR-MS.
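
    The compound-group assignments mentioned in this record follow simple elemental-ratio rules, which the hedged Python sketch below illustrates; only the tannin (O/C > 0.7) and aliphatic (H/C > 1.5) cut-offs come from the abstract, while the lignin/CRAM window and the example formulas are illustrative assumptions.

      # Minimal van Krevelen-style grouping of assigned molecular formulas.
      # Only the tannin (O/C > 0.7) and aliphatic (H/C > 1.5) cut-offs come from the
      # abstract; the CRAM/lignin window used here is a common but illustrative choice.

      def classify_formula(C, H, O, N=0, S=0):
          """Rough compound-group assignment from the elemental counts of one formula."""
          oc, hc = O / C, H / C
          if N > 0 or S > 0:
              return "heteroatomic (CHON/CHOS)"
          if oc > 0.7:
              return "tannin-like (high O/C)"
          if hc > 1.5:
              return "aliphatic (high H/C)"
          if 0.3 <= oc <= 0.7 and 0.7 <= hc <= 1.5:
              return "lignin/CRAM-like"
          return "other"

      # Example elemental compositions (C, H, O[, N]) - invented for illustration.
      for formula in [(19, 14, 10), (20, 38, 2), (15, 12, 11), (12, 19, 6, 1)]:
          print(formula, "->", classify_formula(*formula))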

  8. Additional disturbances as a beneficial tool for restoration of post-mining sites: a multi-taxa approach.

    PubMed

    Řehounková, Klára; Čížek, Lukáš; Řehounek, Jiří; Šebelíková, Lenka; Tropek, Robert; Lencová, Kamila; Bogusch, Petr; Marhoul, Pavel; Máca, Jan

    2016-07-01

    Open interior sands represent a highly threatened habitat in Europe. In recent times, their associated organisms have often found secondary refuges outside their natural habitats, mainly in sand pits. We investigated the effects of different restoration approaches, i.e. spontaneous succession without additional disturbances, spontaneous succession with additional disturbances caused by recreational activities, and forestry reclamation, on the diversity and conservation values of spiders, beetles, flies, bees and wasps, orthopterans and vascular plants in a large sand pit in the Czech Republic, Central Europe. Out of 406 species recorded in total, 112 were classified as open sand specialists and 71 as threatened. The sites restored through spontaneous succession with additional disturbances hosted the largest proportion of open sand specialists and threatened species. The forestry reclamations, in contrast, hosted few such species. The sites with spontaneous succession without disturbances represent a transition between these two approaches. While restoration through spontaneous succession favours biodiversity in contrast to forestry reclamation, additional disturbances are necessary to maintain early successional habitats essential for threatened species and open sand specialists. Therefore, recreational activities seem to be an economically efficient restoration tool that will also benefit biodiversity in sand pits.

  10. Life cycle assessment as an analytical tool in strategic environmental assessment. Lessons learned from a case study on municipal energy planning in Sweden

    SciTech Connect

    Bjoerklund, Anna

    2012-01-15

    Life cycle assessment (LCA) is explored as an analytical tool in strategic environmental assessment (SEA), illustrated by a case where a previously developed SEA process was applied to municipal energy planning in Sweden. The process integrated decision-making tools for scenario planning, public participation and environmental assessment. This article describes the use of LCA for environmental assessment in this context, with focus on methodology and practical experiences. While LCA provides a systematic framework for the environmental assessment and a wider systems perspective than what is required in SEA, LCA cannot address all aspects of environmental impact required, and therefore needs to be complemented by other tools. The integration of LCA with tools for public participation and scenario planning posed certain methodological challenges, but provided an innovative approach to designing the scope of the environmental assessment and defining and assessing alternatives. - Research highlights: LCA was explored as an analytical tool in an SEA process for municipal energy planning. The process also integrated LCA with scenario planning and public participation. Benefits of using LCA were a systematic framework and a wider systems perspective. Integration of the tools required some methodological challenges to be solved. This proved an innovative approach to defining alternatives and the scope of assessment.

  11. An analytical approach to the problem of inverse optimization with additive objective functions: an application to human prehension

    PubMed Central

    Pesin, Yakov B.; Niu, Xun; Latash, Mark L.

    2010-01-01

    We consider the problem of what is being optimized in human actions with respect to various aspects of human movements and different motor tasks. From the mathematical point of view this problem consists of finding an unknown objective function given the values at which it reaches its minimum. This problem is called the inverse optimization problem. Until now the main approach to this problem has been the cut-and-try method, which consists of introducing an objective function and checking how it reflects the experimental data. Using this approach, different objective functions have been proposed for the same motor action. In the current paper we focus on inverse optimization problems with additive objective functions and linear constraints. Such problems are typical in human movement science. The problem of muscle (or finger) force sharing is an example. For such problems we obtain sufficient conditions for uniqueness and propose a method for determining the objective functions. To illustrate our method we analyze the problem of force sharing among the fingers in a grasping task. We estimate the objective function from the experimental data and show that it can predict the force-sharing pattern for a vast range of external forces and torques applied to the grasped object. The resulting objective function is quadratic with essentially non-zero linear terms. PMID:19902213
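
    The forward version of the force-sharing problem described here has a closed-form solution, sketched below in Python for a hypothetical additive quadratic objective with non-zero linear terms and a single total-force constraint; the coefficients are invented and the snippet is not the authors' estimation procedure.

      import numpy as np

      # Illustrative forward problem: given an additive quadratic objective
      #   J = sum_i (a_i*f_i**2 + b_i*f_i)
      # and the linear constraint sum_i f_i = F_total, the Lagrange conditions give
      #   f_i = (lam - b_i) / (2*a_i),  with lam fixed by the constraint.

      a = np.array([1.0, 0.8, 1.5, 2.0])      # per-finger quadratic coefficients (made up)
      b = np.array([0.2, -0.1, 0.4, 0.0])     # essentially non-zero linear terms
      F_total = 20.0                           # total normal force to be shared (N)

      lam = (F_total + np.sum(b / (2 * a))) / np.sum(1 / (2 * a))
      f = (lam - b) / (2 * a)

      print("force sharing:", np.round(f, 2), " sum =", round(float(f.sum()), 2))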

  12. Variance decomposition: a tool enabling strategic improvement of the precision of analytical recovery and concentration estimates associated with microorganism enumeration methods.

    PubMed

    Schmidt, P J; Emelko, M B; Thompson, M E

    2014-05-15

    Concentrations of particular types of microorganisms are commonly measured in various waters, yet the accuracy and precision of reported microorganism concentration values are often questioned due to the imperfect analytical recovery of quantitative microbiological methods and the considerable variation among fully replicated measurements. The random error in analytical recovery estimates and unbiased concentration estimates may be attributable to several sources, and knowing the relative contribution from each source can facilitate strategic design of experiments to yield more precise data or provide an acceptable level of information with fewer data. Herein, variance decomposition using the law of total variance is applied to previously published probabilistic models to explore the relative contributions of various sources of random error and to develop tools to aid experimental design. This work focuses upon enumeration-based methods with imperfect analytical recovery (such as enumeration of Cryptosporidium oocysts), but the results also yield insights about plating methods and microbial methods in general. Using two hypothetical analytical recovery profiles, the variance decomposition method is used to explore 1) the design of an experiment to quantify variation in analytical recovery (including the size and precision of seeding suspensions and the number of samples), and 2) the design of an experiment to estimate a single microorganism concentration (including sample volume, effects of improving analytical recovery, and replication). In one illustrative example, a strategically designed analytical recovery experiment with 6 seeded samples would provide as much information as an alternative experiment with 15 seeded samples. Several examples of diminishing returns are illustrated to show that efforts to reduce error in analytical recovery and concentration estimates can have negligible effect if they are directed at trivial error sources.
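
    The law of total variance at the core of this approach can be checked numerically. The Python sketch below simulates an enumeration method with imperfect, variable analytical recovery (Poisson organism count, Beta-distributed recovery, Binomial observation) and compares the total variance with the sum of its two components; the distributions and parameters are illustrative, not the published model.

      import numpy as np

      # Monte Carlo sketch of variance decomposition for an enumeration-based method
      # with imperfect, variable analytical recovery.  Observed count X | (n, p) ~
      # Binomial(n, p), with n ~ Poisson(c*V) organisms in the sample and recovery
      # p ~ Beta(alpha, beta).  The law of total variance gives
      #   Var(X) = E[ Var(X | n, p) ] + Var( E[X | n, p] ).

      rng = np.random.default_rng(1)
      c, V = 2.0, 10.0                   # concentration (per L) and sample volume (L)
      alpha, beta = 8.0, 2.0             # recovery distribution (mean 0.8), illustrative

      n = rng.poisson(c * V, size=200_000)
      p = rng.beta(alpha, beta, size=n.size)
      x = rng.binomial(n, p)

      total_var = x.var()
      within = (n * p * (1 - p)).mean()     # E[Var(X | n, p)]
      between = (n * p).var()               # Var(E[X | n, p])
      print(f"Var(X) = {total_var:.2f}  vs  decomposition = {within + between:.2f}")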

  13. Cisapride a green analytical reagent for rapid and sensitive determination of bromate in drinking water, bread and flour additives by oxidative coupling spectrophotometric methods.

    PubMed

    Al Okab, Riyad Ahmed

    2013-02-15

    Green analytical methods using Cisapride (CPE) as a green analytical reagent were investigated in this work. Rapid, simple, and sensitive spectrophotometric methods for the determination of bromate in water samples, bread and flour additives were developed. The proposed methods are based on the oxidative coupling between phenoxazine and Cisapride in the presence of bromate to form a red colored product with λmax at 520 nm. Phenoxazine and Cisapride and their reaction products were found to be environmentally friendly under the optimum experimental conditions. The method obeys Beer's law in the concentration range 0.11-4.00 µg ml(-1) with a molar absorptivity of 1.41 × 10(4) L mol(-1) cm(-1). All variables have been optimized and the presented reaction sequences were applied to the analysis of bromate in water, bread and flour additive samples. The performance of these methods was evaluated in terms of Student's t-test and the variance-ratio F-test to establish the significance of the proposed methods relative to the reference method. The use of pharmaceutical drugs as reagents at low concentrations enables some unique green chemical analyses.
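
    As a worked illustration of the Beer-Lambert relation behind the reported molar absorptivity, the snippet below converts a bromate mass concentration into a predicted absorbance for a 1 cm cell; the concentration and path length are made-up example values, and the concentration unit is assumed to be µg per mL.

      # Worked Beer-Lambert illustration using the molar absorptivity quoted above
      # (1.41e4 L mol^-1 cm^-1 at 520 nm); the bromate concentration and path length
      # below are made-up example values.

      epsilon = 1.41e4          # L mol^-1 cm^-1
      path_cm = 1.0             # cuvette path length
      M_bromate = 127.9         # g mol^-1 (BrO3-)

      c_mass_ug_per_ml = 2.0                            # within the reported 0.11-4.00 range
      c_molar = c_mass_ug_per_ml * 1e-3 / M_bromate     # ug/mL -> g/L -> mol/L
      absorbance = epsilon * c_molar * path_cm
      print(f"predicted absorbance ~ {absorbance:.2f}")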

  14. Cisapride a green analytical reagent for rapid and sensitive determination of bromate in drinking water, bread and flour additives by oxidative coupling spectrophotometric methods

    NASA Astrophysics Data System (ADS)

    Al Okab, Riyad Ahmed

    2013-02-01

    Green analytical methods using Cisapride (CPE) as a green analytical reagent were investigated in this work. Rapid, simple, and sensitive spectrophotometric methods for the determination of bromate in water samples, bread and flour additives were developed. The proposed methods are based on the oxidative coupling between phenoxazine and Cisapride in the presence of bromate to form a red colored product with λmax at 520 nm. Phenoxazine and Cisapride and their reaction products were found to be environmentally friendly under the optimum experimental conditions. The method obeys Beer's law in the concentration range 0.11-4.00 µg ml-1 with a molar absorptivity of 1.41 × 104 L mol-1 cm-1. All variables have been optimized and the presented reaction sequences were applied to the analysis of bromate in water, bread and flour additive samples. The performance of these methods was evaluated in terms of Student's t-test and the variance-ratio F-test to establish the significance of the proposed methods relative to the reference method. The use of pharmaceutical drugs as reagents at low concentrations enables some unique green chemical analyses.

  15. Examination of the capability of the laser-induced breakdown spectroscopy (LIBS) technique as the emerging laser-based analytical tool for analyzing trace elements in coal

    NASA Astrophysics Data System (ADS)

    Idris, N.; Ramli, M.; Mahidin, Hedwig, R.; Lie, Z. S.; Kurniawan, K. H.

    2014-09-01

    Due to its advantages over conventional analytical tools, the laser-induced breakdown spectroscopy (LIBS) technique is becoming an emerging analytical tool and is expected to be a future star among analytical techniques. This technique is based on the use of optical emission from the laser-induced plasma for spectrochemical analysis of the constituents and content of the sampled object. The capability of this technique is examined for the analysis of trace elements in a coal sample. Coal is a difficult sample to analyze due to its complex chemical composition and physical properties. Coal inherently contains trace elements, including heavy metals; thus its mining, beneficiation and utilization pose hazards to the environment and to human beings. The LIBS apparatus used was composed of a laser system (Nd-YAG: Quanta Ray; LAB SERIES; 1,064 nm; 500 mJ; 8 ns) and an optical detector (McPherson model 2061; 1,000 mm focal length; f/8.6 Czerny-Turner) equipped with an Andor I*Star intensified CCD of 1024×256 pixels. The emitted laser was focused onto the coal sample with a +250 mm focusing lens. The plasma emission was collected by an optical fiber and sent to the spectrograph. The coal samples were taken from the Province of Aceh. As a result, several trace elements, including heavy metals (As, Mn, Pb), were clearly observed, demonstrating the potential of the LIBS technique for analyzing trace elements in coal.

  16. The modified ultrasound pattern sum score mUPSS as additional diagnostic tool for genetically distinct hereditary neuropathies.

    PubMed

    Grimm, Alexander; Rasenack, Maria; Athanasopoulou, Ioanna M; Dammeier, Nele Maria; Lipski, Christina; Wolking, Stefan; Vittore, Debora; Décard, Bernhard F; Axer, Hubertus

    2016-02-01

    The objective of this study is to evaluate the nerve ultrasound characteristics in genetically distinct inherited neuropathies, the value of the modified ultrasound pattern sum score (mUPSS) to differentiate between the subtypes and the correlation of ultrasound with nerve conduction studies (NCS), disease duration and severity. All patients underwent a standardized neurological examination, ultrasound, and NCS. In addition, genetic testing was performed. Consequently, mUPSS was applied, which is a sum-score of cross-sectional areas (CSA) at predefined anatomical points in different nerves. 31 patients were included (10xCharcot-Marie-Tooth (CMT)1a, 3xCMT1b, 3xCMTX, 9xCMT2, 6xHNPP [Hereditary neuropathy with liability to pressure palsies]). Generalized, homogeneous nerve enlargement and significantly increased UPS scores emphasized the diagnosis of demyelinating neuropathy, particularly CMT1a and CMT1b. The amount of enlargement did not depend on disease duration, symptom severity, height and weight. In CMTX the nerves were enlarged, as well, however, only in the roots and lower limbs, most prominent in men. In CMT2 no significant enlargement was detectable. In HNPP the CSA values were increased at entrapped sites, and not elsewhere. However, a distinction from CMT1, which also showed enlarged CSA values at entrapment sites, was only possible by calculating the entrapment ratios and entrapment score. The mUPSS allowed distinction between CMT1a (increased UPS scores, entrapment ratios <1.0) and HNPP (low UPS scores, entrapment ratios >1.4), while CMT1b and CMTX showed intermediate UPS types and entrapment ratios <1.0. Although based on few cases, ultrasound revealed consistent and homogeneous nerve alteration in certain inherited neuropathies. The modified UPSS is a quantitative tool, which may provide useful information for diagnosis, differentiation and follow-up evaluation in addition to NCS and molecular testing.
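
    A minimal sketch of the kind of entrapment-ratio calculation described above, assuming the ratio is simply the nerve cross-sectional area at an entrapment site divided by that at a reference site in the same nerve; the measurement sites and CSA values are invented and this is not the published mUPSS scoring protocol.

      # Illustrative entrapment-ratio computation (not the published mUPSS protocol):
      # CSA at a typical entrapment site divided by CSA of the same nerve at a
      # non-entrapment reference site.  Per the abstract, ratios > 1.4 point toward
      # HNPP-like focal enlargement, ratios < 1.0 toward generalized enlargement (CMT1).

      def entrapment_ratio(csa_entrapment_mm2, csa_reference_mm2):
          return csa_entrapment_mm2 / csa_reference_mm2

      median_wrist, median_forearm = 14.0, 12.5     # made-up cross-sectional areas (mm^2)
      ratio = entrapment_ratio(median_wrist, median_forearm)
      label = "HNPP-like" if ratio > 1.4 else ("CMT1-like pattern" if ratio < 1.0 else "intermediate")
      print(f"entrapment ratio = {ratio:.2f} -> {label}")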

  17. A Serious Videogame as an Additional Therapy Tool for Training Emotional Regulation and Impulsivity Control in Severe Gambling Disorder

    PubMed Central

    Tárrega, Salomé; Castro-Carreras, Laia; Fernández-Aranda, Fernando; Granero, Roser; Giner-Bartolomé, Cristina; Aymamí, Neus; Gómez-Peña, Mónica; Santamaría, Juan J.; Forcano, Laura; Steward, Trevor; Menchón, José M.; Jiménez-Murcia, Susana

    2015-01-01

    Background: Gambling disorder (GD) is characterized by a significant lack of self-control and is associated with impulsivity-related personality traits. It is also linked to deficits in emotional regulation and frequently co-occurs with anxiety and depression symptoms. There is also evidence that emotional dysregulation may play a mediatory role between GD and psychopathological symptomatology. Few studies have reported the outcomes of psychological interventions that specifically address these underlying processes. Objectives: To assess the utility of the Playmancer platform, a serious video game, as an additional therapy tool in a CBT intervention for GD, and to estimate pre-post changes in measures of impulsivity, anger expression and psychopathological symptomatology. Method: The sample comprised a single group of 16 male treatment-seeking individuals with severe GD diagnosis. Therapy intervention consisted of 16 group weekly CBT sessions and, concurrently, 10 additional weekly sessions of a serious video game. Pre-post treatment scores on South Oaks Gambling Screen (SOGS), Barratt Impulsiveness Scale (BIS-11), I7 Impulsiveness Questionnaire (I7), State-Trait Anger Expression Inventory 2 (STAXI-2), Symptom Checklist-Revised (SCL-90-R), State-Trait Anxiety Inventory (STAI-S-T), and Novelty Seeking from the Temperament and Character Inventory-Revised (TCI-R) were compared. Results: After the intervention, significant changes were observed in several measures of impulsivity, anger expression and other psychopathological symptoms. Dropout and relapse rates during treatment were similar to those described in the literature for CBT. Conclusion: Complementing CBT interventions for GD with a specific therapy approach like a serious video game might be helpful in addressing certain underlying factors which are usually difficult to change, including impulsivity and anger expression. PMID:26617550

  18. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    Title 41 (Public Contracts and Property Management), Federal Management Regulation, Real Property, Part 102-80 - Safety and Environmental Management, Accident and Fire Prevention: What analytical and empirical tools should be used to support the life safety equivalency evaluation? (CFR revision of 2013-07-01.)

  19. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    Title 41 (Public Contracts and Property Management), Federal Management Regulation, Real Property, Part 102-80 - Safety and Environmental Management, Accident and Fire Prevention: What analytical and empirical tools should be used to support the life safety equivalency evaluation? (CFR revision of 2014-01-01.)

  20. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    Title 41 (Public Contracts and Property Management), Federal Management Regulation, Real Property, Part 102-80 - Safety and Environmental Management, Accident and Fire Prevention: What analytical and empirical tools should be used to support the life safety equivalency evaluation? (CFR revision of 2012-01-01.)

  1. Additive technology of soluble mold tooling for embedded devices in composite structures: A study on manufactured tolerances

    NASA Astrophysics Data System (ADS)

    Roy, Madhuparna

    Composite textiles have found widespread use and advantages in various industries and applications. The constant demand for high quality products and services requires companies to minimize their manufacturing costs, and delivery time in order to compete in general and niche marketplaces. Advanced manufacturing methods aim to provide economical methods of mold production. Creation of molding and tooling options for advanced composites encompasses a large portion of the fabrication time, making it a costly process and restraining factor. This research discusses a preliminary investigation into the use of soluble polymer compounds and additive manufacturing to fabricate soluble molds. These molds suffer from dimensional errors due to several factors, which have also been characterized. The basic soluble mold of a composite is 3D printed to meet the desired dimensions and geometry of holistic structures or spliced components. The time taken to dissolve the mold depends on the rate of agitation of the solvent. This process is steered towards enabling the implantation of optoelectronic devices within the composite to provide sensing capability for structural health monitoring. The shape deviation of the 3D printed mold is also studied and compared to its original dimensions to optimize the dimensional quality to produce dimensionally accurate parts. Mechanical tests were performed on compact tension (CT) resin samples prepared from these 3D printed molds and revealed crack propagation towards an embedded intact optical fiber.

  2. Demonstration of the Recent Additions in Modeling Capabilities for the WEC-Sim Wave Energy Converter Design Tool: Preprint

    SciTech Connect

    Tom, N.; Lawson, M.; Yu, Y. H.

    2015-03-01

    WEC-Sim is a mid-fidelity numerical tool for modeling wave energy conversion (WEC) devices. The code uses the MATLAB SimMechanics package to solve the multi-body dynamics and models the wave interactions using hydrodynamic coefficients derived from frequency domain boundary element methods. In this paper, the new modeling features introduced in the latest release of WEC-Sim will be presented. The first feature discussed is the conversion of the fluid memory kernel to a state-space approximation that provides significant gains in computational speed. The benefit of the state-space calculation becomes even greater after the hydrodynamic body-to-body coefficients are introduced, as the number of interactions increases exponentially with the number of floating bodies. The final feature discussed is the capability to add Morison elements to provide additional hydrodynamic damping and inertia. This is generally used as a tuning feature, because performance is highly dependent on the chosen coefficients. In this paper, a review of the hydrodynamic theory for each of the features is provided and successful implementation is verified using test cases.
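
    The Morison-element feature adds a drag-plus-inertia force of the standard Morison form; the sketch below evaluates that force per unit length of a cylinder for made-up coefficients and flow kinematics, and is not WEC-Sim code or its default settings.

      import numpy as np

      # Minimal sketch of the Morison-type force that a Morison element contributes
      # (per unit length of a cylinder); Cd, Cm and the flow kinematics are made up.

      rho, D = 1025.0, 0.5                    # seawater density (kg/m^3), diameter (m)
      Cd, Cm = 1.0, 2.0                       # drag and inertia coefficients
      A, V = D, np.pi * D**2 / 4              # projected area and volume per unit length

      t = np.linspace(0.0, 10.0, 1001)
      u = 0.6 * np.sin(0.8 * t)               # relative fluid velocity (m/s)
      du = np.gradient(u, t)                  # fluid acceleration (m/s^2)

      F = 0.5 * rho * Cd * A * np.abs(u) * u + rho * Cm * V * du   # N per metre
      print(f"peak Morison force ~ {np.max(np.abs(F)):.1f} N/m")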

  3. Building Adoption of Visual Analytics Software

    SciTech Connect

    Chinchor, Nancy; Cook, Kristin A.; Scholtz, Jean

    2012-01-05

    Adoption of technology is always difficult. Issues such as having the infrastructure necessary to support the technology, training for users, integrating the technology into current processes and tools, and having the time, managerial support, and necessary funds need to be addressed. In addition to these issues, the adoption of visual analytics tools presents specific challenges that need to be addressed. This paper discusses technology adoption challenges and approaches for visual analytics technologies.

  4. Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    SciTech Connect

    Xu, Tengfang; Flapper, Joris; Ke, Jing; Kramer, Klaas; Sathaye, Jayant

    2012-02-01

    The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry – including four dairy processes – cheese, fluid milk, butter, and milk powder.

  5. ANALYTICAL TOOL INTERFACE FOR LANDSCAPE ASSESSMENTS (ATIILA): AN ARCVIEW EXTENSION FOR THE ANALYSIS OF LANDSCAPE PATTERNS, COMPOSITION, AND STRUCTURE

    EPA Science Inventory

    Environmental management practices are trending away from simple, local- scale assessments toward complex, multiple-stressor regional assessments. Landscape ecology provides the theory behind these assessments while geographic information systems (GIS) supply the tools to impleme...

  6. Analytical Microscopy

    SciTech Connect

    Not Available

    2006-06-01

    In the Analytical Microscopy group, within the National Center for Photovoltaic's Measurements and Characterization Division, we combine two complementary areas of analytical microscopy--electron microscopy and proximal-probe techniques--and use a variety of state-of-the-art imaging and analytical tools. We also design and build custom instrumentation and develop novel techniques that provide unique capabilities for studying materials and devices. In our work, we collaborate with you to solve materials- and device-related R&D problems. This sheet summarizes the uses and features of four major tools: transmission electron microscopy, scanning electron microscopy, the dual-beam focused-ion-beam workstation, and scanning probe microscopy.

  7. Extensions of the Johnson-Neyman Technique to Linear Models with Curvilinear Effects: Derivations and Analytical Tools

    ERIC Educational Resources Information Center

    Miller, Jason W.; Stromeyer, William R.; Schwieterman, Matthew A.

    2013-01-01

    The past decade has witnessed renewed interest in the use of the Johnson-Neyman (J-N) technique for calculating the regions of significance for the simple slope of a focal predictor on an outcome variable across the range of a second, continuous independent variable. Although tools have been developed to apply this technique to probe 2- and 3-way…
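
    For the plain linear-interaction case (the article extends this to curvilinear effects), the Johnson-Neyman boundaries can be computed directly, as in the hedged Python sketch below; the simulated data, coefficients and 95% level are illustrative assumptions.

      import numpy as np
      from scipy import stats

      # Minimal Johnson-Neyman sketch for a linear interaction: find the moderator
      # values z at which the simple slope of x, b1 + b3*z, stops being significantly
      # different from zero.

      rng = np.random.default_rng(0)
      n = 300
      x, z = rng.normal(size=n), rng.normal(size=n)
      y = 0.2 + 0.3 * x + 0.1 * z + 0.25 * x * z + rng.normal(scale=1.0, size=n)

      X = np.column_stack([np.ones(n), x, z, x * z])
      beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
      dof = n - X.shape[1]
      sigma2 = res[0] / dof
      cov = sigma2 * np.linalg.inv(X.T @ X)

      b1, b3 = beta[1], beta[3]
      v11, v33, v13 = cov[1, 1], cov[3, 3], cov[1, 3]
      tcrit = stats.t.ppf(0.975, dof)

      # Solve (b1 + b3*z)^2 = tcrit^2 * (v11 + 2*z*v13 + z^2*v33), a quadratic in z.
      a = b3**2 - tcrit**2 * v33
      b = 2 * (b1 * b3 - tcrit**2 * v13)
      c = b1**2 - tcrit**2 * v11
      roots = np.roots([a, b, c])
      print("J-N boundaries for z:", np.sort(roots.real))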

  8. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...(Alternative Approaches to Life Safety, NFPA 101A) should be used to support the life safety equivalency evaluation. If fire modeling is... (41 CFR 102-80.120, Federal Management Regulation, Real Property, Part 102-80 - Safety and Environmental Management, Accident and Fire Prevention; CFR revision of 2010-07-01.)

  9. Exploration and classification of chromatographic fingerprints as additional tool for identification and quality control of several Artemisia species.

    PubMed

    Alaerts, Goedele; Pieters, Sigrid; Logie, Hans; Van Erps, Jürgen; Merino-Arévalo, Maria; Dejaegher, Bieke; Smeyers-Verbeke, Johanna; Vander Heyden, Yvan

    2014-07-01

    The World Health Organization accepts chromatographic fingerprints as a tool for identification and quality control of herbal medicines. This is the first study in which the distinction, identification and quality control of four different Artemisia species, i.e. Artemisia vulgaris, A. absinthium, A. annua and A. capillaris samples, is performed based on the evaluation of entire chromatographic fingerprint profiles developed with identical experimental conditions. High-Performance Liquid Chromatography (HPLC) with Diode Array Detection (DAD) was used to develop the fingerprints. Application of factorial designs leads to methanol/water (80:20 (v/v)) as the best extraction solvent for the pulverised plant material and to a shaking bath for 30 min as extraction method. Further, so-called screening, optimisation and fine-tuning phases were performed during fingerprint development. Most information about the different Artemisia species, i.e. the highest number of separated peaks in the fingerprint, was acquired on four coupled Chromolith columns (100 mm × 4.6 mm I.D.). Trifluoroacetic acid 0.05% (v/v) was used as mobile-phase additive in a stepwise linear methanol/water gradient, i.e. 5, 34, 41, 72 and 95% (v/v) methanol at 0, 9, 30, 44 and 51 min, where the last mobile phase composition was kept isocratic till 60 min. One detection wavelength was selected to perform data analysis. The lowest similarity between the fingerprints of the four species was present at 214 nm. The HPLC/DAD method was applied on 199 herbal samples of the four Artemisia species, resulting in 357 fingerprints. The within- and between-day variation of the entire method, as well as the quality control fingerprints obtained during routine analysis, were found acceptable. The distinction of these Artemisia species was evaluated based on the entire chromatographic profiles, developed by a shared method, and visualised in score plots by means of the Principal Component Analysis (PCA) exploratory data

  10. Peak-bridges due to in-column analyte transformations as a new tool for establishing molecular connectivities by comprehensive two-dimensional gas chromatography-mass spectrometry.

    PubMed

    Filippi, Jean-Jacques; Cocolo, Nicolas; Meierhenrich, Uwe J

    2015-02-27

    Comprehensive two-dimensional gas chromatography-mass spectrometry (GC×GC-MS) has been shown to permit unprecedented chromatographic resolution of volatile analytes encompassing various families of organic compounds. However, peak identification based on retention time, two-dimensional mapping, and mass spectrometric fragmentation alone is not yet a straightforward task. The possibility to establish molecular links between constituents is of crucial importance to understand the overall chemistry of any sample, especially in natural extracts where biogenetically related isomeric structures are often abundant. We here present a new way of using GC×GC that allows searching for those molecular connectivities. Analytical investigations of essential oil constituents by means of GC×GC-MS made it possible to observe in real time the thermally induced transformations of various sesquiterpenic derivatives. These transformations generated a series of well-defined two-dimensional peak bridges within the 2D-chromatograms connecting parent and daughter molecules, thus allowing a clear scheme of structural relationships between the different constituents to be built. GC×GC-MS appears here as a tool for investigating chromatographic phenomena and analyte transformations that could not be understood with conventional GC-MS alone. PMID:25622519

  11. Single-cell MALDI-MS as an analytical tool for studying intrapopulation metabolic heterogeneity of unicellular organisms.

    PubMed

    Amantonico, Andrea; Urban, Pawel L; Fagerer, Stephan R; Balabin, Roman M; Zenobi, Renato

    2010-09-01

    Heterogeneity is a characteristic feature of all populations of living organisms. Here we make an attempt to validate a single-cell mass spectrometric method for detection of changes in metabolite levels occurring in populations of unicellular organisms. Selected metabolites involved in central metabolism (ADP, ATP, GTP, and UDP-Glucose) could readily be detected in single cells of Closterium acerosum by means of negative-mode matrix-assisted laser desorption/ionization (MALDI) mass spectrometry (MS). The analytical capabilities of this approach were characterized using standard compounds. The method was then used to study populations of individual cells with different levels of the chosen metabolites. With principal component analysis and support vector machine algorithms, it was possible to achieve a clear separation of individual C. acerosum cells in different metabolic states. This study demonstrates the suitability of mass spectrometric analysis of metabolites in single cells to measure cell-population heterogeneity.
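
    A minimal sketch of the chemometric step only (PCA followed by a support vector machine), run on simulated metabolite-intensity vectors rather than the real MALDI-MS data; the class means, sample sizes and the linear kernel are illustrative assumptions.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.svm import SVC
      from sklearn.pipeline import make_pipeline
      from sklearn.model_selection import cross_val_score

      # Sketch of PCA + SVM classification of single cells in two metabolic states,
      # using simulated intensities for four metabolites (the study measured ADP,
      # ATP, GTP and UDP-glucose in C. acerosum cells).

      rng = np.random.default_rng(0)
      n_per_state = 40
      state_a = rng.normal(loc=[1.0, 0.8, 0.5, 1.2], scale=0.2, size=(n_per_state, 4))
      state_b = rng.normal(loc=[0.6, 1.3, 0.7, 0.9], scale=0.2, size=(n_per_state, 4))
      X = np.vstack([state_a, state_b])                # metabolite intensity vectors
      y = np.array([0] * n_per_state + [1] * n_per_state)

      clf = make_pipeline(PCA(n_components=2), SVC(kernel="linear"))
      scores = cross_val_score(clf, X, y, cv=5)
      print("cross-validated accuracy:", scores.mean().round(2))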

  12. Comparative study between univariate spectrophotometry and multivariate calibration as analytical tools for simultaneous quantitation of Moexipril and Hydrochlorothiazide

    NASA Astrophysics Data System (ADS)

    Tawakkol, Shereen M.; Farouk, M.; Elaziz, Omar Abd; Hemdan, A.; Shehata, Mostafa A.

    2014-12-01

    Three simple, accurate, reproducible, and selective methods have been developed and subsequently validated for the simultaneous determination of Moexipril (MOX) and Hydrochlorothiazide (HCTZ) in pharmaceutical dosage form. The first method is the new extended ratio subtraction method (EXRSM) coupled to ratio subtraction method (RSM) for determination of both drugs in commercial dosage form. The second and third methods are multivariate calibration which include Principal Component Regression (PCR) and Partial Least Squares (PLSs). A detailed validation of the methods was performed following the ICH guidelines and the standard curves were found to be linear in the range of 10-60 and 2-30 for MOX and HCTZ in EXRSM method, respectively, with well accepted mean correlation coefficient for each analyte. The intra-day and inter-day precision and accuracy results were well within the acceptable limits.
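
    A hedged sketch of a PLS calibration of the kind described, using simulated two-component mixture spectra in place of the real MOX/HCTZ data; the pure-component spectra, concentration ranges and noise level are invented.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      # Sketch of a multivariate calibration (spectra -> concentrations) with PLS,
      # on simulated two-component mixture spectra rather than real data.

      rng = np.random.default_rng(1)
      wl = np.linspace(220, 350, 200)                        # wavelengths (nm)
      s1 = np.exp(-0.5 * ((wl - 260) / 12) ** 2)             # pure spectrum, component 1
      s2 = np.exp(-0.5 * ((wl - 300) / 18) ** 2)             # pure spectrum, component 2

      C = rng.uniform([10, 2], [60, 30], size=(30, 2))       # training concentrations
      A = C @ np.vstack([s1, s2]) + rng.normal(scale=0.01, size=(30, wl.size))

      pls = PLSRegression(n_components=2).fit(A, C)
      unknown = np.array([[35.0, 15.0]]) @ np.vstack([s1, s2])
      print("predicted concentrations:", pls.predict(unknown).round(1))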

  13. Comparative study between univariate spectrophotometry and multivariate calibration as analytical tools for simultaneous quantitation of Moexipril and Hydrochlorothiazide.

    PubMed

    Tawakkol, Shereen M; Farouk, M; Elaziz, Omar Abd; Hemdan, A; Shehata, Mostafa A

    2014-12-10

    Three simple, accurate, reproducible, and selective methods have been developed and subsequently validated for the simultaneous determination of Moexipril (MOX) and Hydrochlorothiazide (HCTZ) in pharmaceutical dosage form. The first method is the new extended ratio subtraction method (EXRSM) coupled to ratio subtraction method (RSM) for determination of both drugs in commercial dosage form. The second and third methods are multivariate calibration which include Principal Component Regression (PCR) and Partial Least Squares (PLSs). A detailed validation of the methods was performed following the ICH guidelines and the standard curves were found to be linear in the range of 10-60 and 2-30 for MOX and HCTZ in EXRSM method, respectively, with well accepted mean correlation coefficient for each analyte. The intra-day and inter-day precision and accuracy results were well within the acceptable limits.

  14. In situ protein secondary structure determination in ice: Raman spectroscopy-based process analytical tool for frozen storage of biopharmaceuticals.

    PubMed

    Roessl, Ulrich; Leitgeb, Stefan; Pieters, Sigrid; De Beer, Thomas; Nidetzky, Bernd

    2014-08-01

    A Raman spectroscopy-based method for in situ monitoring of secondary structural composition of proteins during frozen and thawed storage was developed. A set of reference proteins with different α-helix and β-sheet compositions was used for calibration and validation in a chemometric approach. Reference secondary structures were quantified with circular dichroism spectroscopy in the liquid state. Partial least squares regression models were established that enable estimation of secondary structure content from Raman spectra. Quantitative secondary structure determination in ice was accomplished for the first time and correlation with existing (qualitative) protein structural data from the frozen state was achieved. The method can be used in the presence of common stabilizing agents and is applicable in an industrial freezer setup. Raman spectroscopy represents a powerful, noninvasive, and flexibly applicable tool for protein stability monitoring during frozen storage.

  15. Temperature-controlled micro-TLC: a versatile green chemistry and fast analytical tool for separation and preliminary screening of steroids fraction from biological and environmental samples.

    PubMed

    Zarzycki, Paweł K; Slączka, Magdalena M; Zarzycka, Magdalena B; Bartoszuk, Małgorzata A; Włodarczyk, Elżbieta; Baran, Michał J

    2011-11-01

    This paper is a continuation of our previous research focusing on the development of micro-TLC methodology under temperature-controlled conditions. The main goal of the present paper is to demonstrate the separation and detection capability of the micro-TLC technique involving simple analytical protocols without multi-step sample pre-purification. One of the advantages of planar chromatography over its column counterpart is that each TLC run can be performed using a previously unused stationary phase. Therefore, it is possible to fractionate or separate complex samples characterized by heavy biological matrix loading. In the present studies, components of interest, mainly steroids, were isolated from biological samples like fish bile using single pre-treatment steps involving direct organic liquid extraction and/or deproteinization by a freeze-drying method. Low-molecular-mass compounds with polarity ranging from estetrol to progesterone derived from the environmental samples (lake water, untreated and treated sewage waters) were concentrated using optimized solid-phase extraction (SPE). Specific band patterns for samples derived from surface water of the Middle Pomerania region in northern Poland can easily be observed on the obtained micro-TLC chromatograms. This approach can be useful as a simple and inexpensive complementary method for fast control and screening of treated sewage water discharged by municipal wastewater treatment plants. Moreover, our experimental results show the potential of micro-TLC as an efficient tool for retention measurements of a wide range of steroids under reversed-phase (RP) chromatographic conditions. These data can be used for further optimization of SPE or HPLC systems working under RP conditions. Furthermore, we also demonstrated that the micro-TLC-based analytical approach can be applied as an effective method for internal standard (IS) substance searches. Generally, the described methodology can be applied for fast fractionation or screening of the

  17. Dynamic 3D visual analytic tools: a method for maintaining situational awareness during high tempo warfare or mass casualty operations

    NASA Astrophysics Data System (ADS)

    Lizotte, Todd E.

    2010-04-01

    Maintaining Situational Awareness (SA) is crucial to the success of high tempo operations, such as war fighting and mass casualty events (bioterrorism, natural disasters). Modern computer and software applications attempt to provide command and control managers with situational awareness via the collection, integration, interrogation and display of vast amounts of analytic data in real-time from a multitude of data sources and formats [1]. At what point do the data volume and displays begin to erode the hierarchical distributive intelligence, command and control structure of the operation taking place? In many cases, people tasked with making decisions have insufficient experience in SA of high tempo operations and become easily overwhelmed as vast amounts of data are displayed in real-time while an operation unfolds. In these situations, where data is plentiful and the relevance of the data changes rapidly, there is a chance for individuals to fixate on those data sources with which they are most familiar. If these individuals fall into this type of pitfall, they will exclude other data that might be just as important to the success of the operation. To counter these issues, it is important that computer and software applications provide a means for prompting users to take notice of adverse conditions or trends that are critical to the operation. This paper will discuss a new method of displaying data, called a Crisis ViewTM, that monitors critical variables that are dynamically changing and allows preset thresholds to be created to prompt the user when decisions need to be made and when adverse or positive trends are detected. The new method will be explained in basic terms, with examples of its attributes and how it can be implemented.
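
    A toy sketch of the alerting idea described above (not the Crisis View product): watch one critical variable and prompt the user when it crosses a preset threshold or when its short-term trend turns adverse; the function, thresholds and data are invented.

      import numpy as np

      # Toy threshold-and-trend alerting on one critical variable.

      def check(values, threshold, trend_window=5, trend_limit=2.0):
          alerts = []
          if values[-1] > threshold:
              alerts.append(f"value {values[-1]:.1f} exceeds threshold {threshold}")
          recent = values[-trend_window:]
          slope = np.polyfit(np.arange(len(recent)), recent, 1)[0]   # per-update trend
          if slope > trend_limit:
              alerts.append(f"adverse trend: +{slope:.1f} per update")
          return alerts

      casualties_per_hour = [3, 4, 4, 6, 9, 14, 22]        # made-up incoming data
      for msg in check(casualties_per_hour, threshold=20):
          print("ALERT:", msg)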

  18. Analytical electron microscopy and focused ion beam: complementary tool for the imaging of copper sorption onto iron oxide aggregates.

    PubMed

    Mavrocordatos, D; Steiner, M; Boller, M

    2003-04-01

    Nanometre-scale electron spectroscopic imaging has been applied to characterize the operation of a copper filtration plant in environmental science. Copper washed off from roofs and roads is considered to be a major contributor to diffuse copper pollution of urban environments. A special adsorber system has been suggested to control the diffusion of copper fluxes by retaining Cu with a granulated iron hydroxide. The adsorber was tested over an 18-month period on facade runoff. The concentration range of Cu in the runoff water was measured between 10 and 1000 p.p.m. and could be reduced by between 96% and 99% in the adsorption ditch. Before the analysis of the adsorber, the suspended material from the inflow was ultracentrifuged onto TEM grids and analysed by energy-filtered transmission electron microscopy (EFTEM). Copper was found either as small precipitates 5-20 nm in size or adsorbed onto organic and inorganic particles. This Cu represents approximately 30% of the total dissolved Cu, measured by atomic emission spectrometry. To locate where the copper sorption takes place within the adsorber, the granulated iron oxide was analysed by analytical electron microscopy after exposure to the roof run-off water. A section of the granulated iron hydroxide was prepared by focused ion beam milling. The thickness of the lamina was reduced to 100 nm and analysed by EFTEM. The combination of these two techniques allowed us to observe the diffusion of Cu into the aggregate of Fe. Elemental maps of Fe and Cu revealed that copper was not only present at the surface of the granules but was also sorbed onto the fine particles inside the adsorber.

  19. Near infra red spectroscopy as a multivariate process analytical tool for predicting pharmaceutical co-crystal concentration.

    PubMed

    Wood, Clive; Alwati, Abdolati; Halsey, Sheelagh; Gough, Tim; Brown, Elaine; Kelly, Adrian; Paradkar, Anant

    2016-09-10

    The use of near infra red spectroscopy to predict the concentration of two pharmaceutical co-crystals; 1:1 ibuprofen-nicotinamide (IBU-NIC) and 1:1 carbamazepine-nicotinamide (CBZ-NIC) has been evaluated. A partial least squares (PLS) regression model was developed for both co-crystal pairs using sets of standard samples to create calibration and validation data sets with which to build and validate the models. Parameters such as the root mean square error of calibration (RMSEC), root mean square error of prediction (RMSEP) and correlation coefficient were used to assess the accuracy and linearity of the models. Accurate PLS regression models were created for both co-crystal pairs which can be used to predict the co-crystal concentration in a powder mixture of the co-crystal and the active pharmaceutical ingredient (API). The IBU-NIC model had smaller errors than the CBZ-NIC model, possibly due to the complex CBZ-NIC spectra which could reflect the different arrangement of hydrogen bonding associated with the co-crystal compared to the IBU-NIC co-crystal. These results suggest that NIR spectroscopy can be used as a PAT tool during a variety of pharmaceutical co-crystal manufacturing methods and the presented data will facilitate future offline and in-line NIR studies involving pharmaceutical co-crystals. PMID:27429366
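
    The model-quality metrics named in this record (RMSEC, RMSEP, correlation coefficient) are simple to compute once predicted and reference concentrations are available, as the sketch below shows; all numbers are made up and are not the IBU-NIC or CBZ-NIC results.

      import numpy as np

      # Illustrative computation of RMSEC, RMSEP and the correlation coefficient
      # from predicted vs. reference co-crystal concentrations (made-up numbers).

      def rmse(y_true, y_pred):
          return float(np.sqrt(np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2)))

      cal_true = [0, 20, 40, 60, 80, 100]          # calibration set (% co-crystal)
      cal_pred = [1.2, 19.0, 41.5, 58.8, 80.9, 99.1]
      val_true = [10, 30, 50, 70, 90]              # independent validation set
      val_pred = [11.8, 28.5, 52.0, 68.2, 91.5]

      print("RMSEC =", round(rmse(cal_true, cal_pred), 2))
      print("RMSEP =", round(rmse(val_true, val_pred), 2))
      print("r (validation) =", round(float(np.corrcoef(val_true, val_pred)[0, 1]), 4))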

  1. Spatial and Temporal Oxygen Dynamics in Macrofaunal Burrows in Sediments: A Review of Analytical Tools and Observational Evidence

    PubMed Central

    Satoh, Hisashi; Okabe, Satoshi

    2013-01-01

    The availability of benthic O2 plays a crucial role in benthic microbial communities and regulates many important biogeochemical processes. Burrowing activities of macrobenthos in the sediment significantly affect O2 distribution and its spatial and temporal dynamics in burrows, followed by alterations of sediment microbiology. Consequently, numerous research groups have investigated O2 dynamics in macrofaunal burrows. The introduction of powerful tools, such as microsensors and planar optodes, to sediment analysis has greatly enhanced our ability to measure O2 dynamics in burrows at high spatial and temporal resolution with minimal disturbance of the physical structure of the sediment. In this review, we summarize recent studies of O2-concentration measurements in burrows with O2 microsensors and O2 planar optodes. This manuscript mainly focuses on the fundamentals of O2 microsensors and O2 planar optodes, and their application in the direct measurement of the spatial and temporal dynamics of O2 concentrations in burrows, which have not previously been reviewed, and will be a useful supplement to recent literature reviews on O2 dynamics in macrofaunal burrows. PMID:23594972

  2. Adult stem cells: simply a tool for regenerative medicine or an additional piece in the puzzle of human aging?

    PubMed

    Tollervey, James R; Lunyak, Victoria V

    2011-12-15

    Adult stem cells have taken center stage in current research related to regenerative medicine and pharmacogenomic studies seeking new therapeutic interventions. As we learn more about these cells, it is becoming apparent that the next big leap in our understanding of adult stem cell biology and adult stem cell aging will depend on the integration of approaches from various disciplines. Major advances and technological breakthroughs at the crossroad of fields such as biomaterials, genomics, epigenomics, and proteomics will enable the design of better tools to model human diseases, and warrant safe usage of adult stem cells in the clinic.

  4. Neurophysiological analytics for all! Free open-source software tools for documenting, analyzing, visualizing, and sharing using electronic notebooks.

    PubMed

    Rosenberg, David M; Horn, Charles C

    2016-08-01

    Neurophysiology requires an extensive workflow of information analysis routines, which often includes incompatible proprietary software, introducing limitations based on financial costs, transfer of data between platforms, and the ability to share. An ecosystem of free open-source software exists to fill these gaps, including thousands of analysis and plotting packages written in Python and R, which can be implemented in a sharable and reproducible format, such as the Jupyter electronic notebook. This tool chain can largely replace current routines by importing data, producing analyses, and generating publication-quality graphics. An electronic notebook like Jupyter allows these analyses, along with documentation of procedures, to display locally or remotely in an internet browser, which can be saved as an HTML, PDF, or other file format for sharing with team members and the scientific community. The present report illustrates these methods using data from electrophysiological recordings of the musk shrew vagus-a model system to investigate gut-brain communication, for example, in cancer chemotherapy-induced emesis. We show methods for spike sorting (including statistical validation), spike train analysis, and analysis of compound action potentials in notebooks. Raw data and code are available from notebooks in data supplements or from an executable online version, which replicates all analyses without installing software-an implementation of reproducible research. This demonstrates the promise of combining disparate analyses into one platform, along with the ease of sharing this work. In an age of diverse, high-throughput computational workflows, this methodology can increase efficiency, transparency, and the collaborative potential of neurophysiological research. PMID:27098025
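
    The notebook-style workflow described above can be illustrated with a short Python cell performing threshold spike detection and simple spike-train summaries; the trace is synthetic and the robust-threshold convention and all numbers are illustrative assumptions, not the authors' sorting pipeline.

```python
# Toy spike-detection/spike-train sketch in the spirit of a Jupyter notebook cell.
import numpy as np
from scipy.signal import find_peaks

fs = 20_000                                    # sampling rate (Hz)
t = np.arange(0, 2.0, 1 / fs)                  # 2 s of "recording"
rng = np.random.default_rng(1)
trace = rng.normal(scale=5e-6, size=t.size)    # baseline noise (volts)
for st in rng.uniform(0.05, 1.9, 40):          # add brief deflections as fake spikes
    idx = int(st * fs)
    trace[idx:idx + 20] += 40e-6 * np.exp(-np.arange(20) / 5)

threshold = 5 * np.median(np.abs(trace)) / 0.6745        # robust (MAD-based) threshold
peaks, _ = find_peaks(trace, height=threshold, distance=int(0.002 * fs))
isi = np.diff(peaks) / fs                                 # inter-spike intervals (s)
print(f"{peaks.size} spikes detected, mean rate {peaks.size / t[-1]:.1f} Hz, "
      f"median ISI {np.median(isi) * 1e3:.1f} ms")
```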

  6. Indirect additive manufacturing as an elegant tool for the production of self-supporting low density gelatin scaffolds.

    PubMed

    Van Hoorick, Jasper; Declercq, Heidi; De Muynck, Amelie; Houben, Annemie; Van Hoorebeke, Luc; Cornelissen, Ria; Van Erps, Jürgen; Thienpont, Hugo; Dubruel, Peter; Van Vlierberghe, Sandra

    2015-10-01

    The present work describes for the first time the production of self-supporting low gelatin density (<10 w/v%) porous scaffolds using methacrylamide-modified gelatin as an extracellular matrix mimicking component. As porous scaffolds starting from low gelatin concentrations cannot be realized with the conventional additive manufacturing techniques in the absence of additives, we applied an indirect fused deposition modelling approach. To realize this, we have printed a sacrificial polyester scaffold which supported the hydrogel material during UV crosslinking, thereby preventing hydrogel structure collapse. After complete curing, the polyester scaffold was selectively dissolved leaving behind a porous, interconnective low density gelatin scaffold. Scaffold structural analysis indicated the success of the selected indirect additive manufacturing approach. Physico-chemical testing revealed scaffold properties (mechanical, degradation, swelling) to depend on the applied gelatin concentration and methacrylamide content. Preliminary biocompatibility studies revealed the cell-interactive and biocompatible properties of the materials developed. PMID:26411443

  7. Computer image analysis: an additional tool for the identification of processed poultry and mammal protein containing bones.

    PubMed

    Pinotti, L; Fearn, T; Gulalp, S; Campagnoli, A; Ottoboni, M; Baldi, A; Cheli, F; Savoini, G; Dell'Orto, V

    2013-01-01

    The aims of this study were (1) to evaluate the potential of image analysis measurements, in combination with the official analytical methods for the detection of constituents of animal origin in feedstuffs, to distinguish between poultry and mammals; and (2) to identify possible markers that can be used in routine analysis. For this purpose, 14 mammal and seven poultry samples and a total of 1081 bone fragment lacunae were analysed by combining the microscopic methods with computer image analysis. The distributions of 30 different measured size and shape variables of bone lacunae were studied both within and between the two zoological classes. In all cases a considerable overlap between classes meant that classification of individual lacunae was problematic, though a clear separation in the means did allow successful classification of samples on the basis of averages. The variables most useful for classification were those related to size, lacuna area for example. The approach shows considerable promise but will need further study using a larger number of samples with a wider range.
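
    The sample-level classification idea above (individual lacunae overlap, but per-sample averages separate) can be sketched in a few lines; the class means, feature set and LDA classifier below are illustrative assumptions, not the study's actual measurements or procedure.

```python
# Sketch of classifying samples from per-lacuna measurements via per-sample averages.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(2)
samples, labels = [], []
for klass, mean_area in [("mammal", 55.0), ("poultry", 40.0)]:    # arbitrary area units
    for _ in range(10):                                            # 10 samples per class
        lacunae = rng.normal(loc=mean_area, scale=15.0, size=50)   # 50 lacunae per sample
        samples.append([lacunae.mean(), lacunae.std()])            # per-sample summary features
        labels.append(klass)

X, y = np.array(samples), np.array(labels)
clf = LinearDiscriminantAnalysis().fit(X, y)
print("training accuracy on sample averages:", clf.score(X, y))
```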

  8. Health Heritage© a web-based tool for the collection and assessment of family health history: initial user experience and analytic validity.

    PubMed

    Cohn, W F; Ropka, M E; Pelletier, S L; Barrett, J R; Kinzie, M B; Harrison, M B; Liu, Z; Miesfeldt, S; Tucker, A L; Worrall, B B; Gibson, J; Mullins, I M; Elward, K S; Franko, J; Guterbock, T M; Knaus, W A

    2010-01-01

    A detailed family health history is currently the most potentially useful tool for diagnosis and risk assessment in clinical genetics. We developed and evaluated the usability and analytic validity of a patient-driven web-based family health history collection and analysis tool. Health Heritage(©) guides users through the collection of their family health history by relative, generates a pedigree, completes risk assessment, stratification, and recommendations for 89 conditions. We compared the performance of Health Heritage to that of Usual Care using a nonrandomized cohort trial of 109 volunteers. We contrasted the completeness and sensitivity of family health history collection and risk assessments derived from Health Heritage and Usual Care to those obtained by genetic counselors and genetic assessment teams. Nearly half (42%) of the Health Heritage participants reported discovery of health risks; 63% found the information easy to understand and 56% indicated it would change their health behavior. Health Heritage consistently outperformed Usual Care in the completeness and accuracy of family health history collection, identifying 60% of the elevated risk conditions specified by the genetic team versus 24% identified by Usual Care. Health Heritage also had greater sensitivity than Usual Care when comparing the identification of risks. These results suggest a strong role for automated family health history collection and risk assessment and underscore the potential of these data to serve as the foundation for comprehensive, cost-effective personalized genomic medicine. PMID:20424421

  10. Nematode.net update 2011: addition of data sets and tools featuring next-generation sequencing data.

    PubMed

    Martin, John; Abubucker, Sahar; Heizer, Esley; Taylor, Christina M; Mitreva, Makedonka

    2012-01-01

    Nematode.net (http://nematode.net) has been a publicly available resource for studying nematodes for over a decade. In the past 3 years, we reorganized Nematode.net to provide more user-friendly navigation through the site, a necessity due to the explosion of data from next-generation sequencing platforms. Organism-centric portals containing dynamically generated data are available for over 56 different nematode species. Next-generation data has been added to the various data-mining portals hosted, including NemaBLAST and NemaBrowse. The NemaPath metabolic pathway viewer builds associations using KOs, rather than ECs to provide more accurate and fine-grained descriptions of proteins. Two new features for data analysis and comparative genomics have been added to the site. NemaSNP enables the user to perform population genetics studies in various nematode populations using next-generation sequencing data. HelmCoP (Helminth Control and Prevention) as an independent component of Nematode.net provides an integrated resource for storage, annotation and comparative genomics of helminth genomes to aid in learning more about nematode genomes, as well as drug, pesticide, vaccine and drug target discovery. With this update, Nematode.net will continue to realize its original goal to disseminate diverse bioinformatic data sets and provide analysis tools to the broad scientific community in a useful and user-friendly manner.

  11. Challenges for Visual Analytics

    SciTech Connect

    Thomas, James J.; Kielman, Joseph

    2009-09-23

    Visual analytics has seen unprecedented growth in its first five years of mainstream existence. Great progress has been made in a short time, yet great challenges must be met in the next decade to provide new technologies that will be widely accepted by societies throughout the world. This paper sets the stage for some of those challenges in an effort to provide the stimulus for the research, both basic and applied, to address and exceed the envisioned potential for visual analytics technologies. We start with a brief summary of the initial challenges, followed by a discussion of the initial driving domains and applications, as well as additional applications and domains that have been a part of recent rapid expansion of visual analytics usage. We look at the common characteristics of several tools illustrating emerging visual analytics technologies, and conclude with the top ten challenges for the field of study. We encourage feedback and collaborative participation by members of the research community, the wide array of user communities, and private industry.

  12. Effect of Ti Addition on Carbide Modification and the Microscopic Simulation of Impact Toughness in High-Carbon Cr-V Tool Steels

    NASA Astrophysics Data System (ADS)

    Cho, Ki Sub; Kim, Sang Il; Park, Sung Soo; Choi, Won Suk; Moon, Hee Kwon; Kwon, Hoon

    2016-01-01

    In D7 tool steel, which contains high levels of primary carbides, the influence of carbide modification by Ti addition was quantitatively analyzed. Considering the Griffith-Irwin energy criterion for crack growth, the impact energy was evaluated by substituting a microscopic factor of the normalized number density of carbides cracked during hardness indentation tests for the crack length. The impact energy was enhanced with Ti addition because Ti reduced and refined the primary M7C3 carbide phase of elongated morphology, reducing the probability of crack generation.
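
    For reference, the classical Griffith-Irwin relations invoked above are reproduced below; this is textbook background only and not the paper's exact expression (the paper substitutes the normalized density of carbides cracked during indentation for the crack length a).

```latex
% Griffith fracture stress for a crack of length a, and the Irwin energy-release criterion
\sigma_f = \sqrt{\frac{2 E \gamma_s}{\pi a}}, \qquad
G = \frac{K_I^{2}}{E'} \;\geq\; G_c = 2\gamma_s
```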

  13. The Monte Carlo method as a tool for statistical characterisation of differential and additive phase shifting algorithms

    NASA Astrophysics Data System (ADS)

    Miranda, M.; Dorrío, B. V.; Blanco, J.; Diz-Bugarín, J.; Ribas, F.

    2011-01-01

    Several metrological applications base their measurement principle on the phase sum or difference between two patterns, one original s(r,φ) and another modified t(r,φ+Δφ). Additive or differential phase shifting algorithms directly recover the sum 2φ+Δφ or the difference Δφ of phases without requiring prior calculation of the individual phases. These algorithms can be constructed, for example, from a suitable combination of known phase shifting algorithms. Little has been written on the design, analysis and error compensation of these new two-stage algorithms. Previously we have used computer simulation to study, in a linear approach or with a filter process in reciprocal space, the response of several families of them to the main error sources. In this work we present an error analysis that uses Monte Carlo simulation to achieve results in good agreement with those obtained with spatial and temporal methods.
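
    The Monte Carlo machinery referred to above can be illustrated on the ordinary four-step phase-shifting algorithm (a single-stage algorithm, not the two-stage additive/differential schemes studied in the paper); the fringe parameters and noise level below are invented.

```python
# Monte Carlo error propagation for a 4-step phase-shifting algorithm.
import numpy as np

rng = np.random.default_rng(3)
true_phase = 0.7            # rad
A, B = 1.0, 0.5             # background and modulation
sigma = 0.01                # additive intensity noise (1% of background)
n_trials = 100_000

shifts = np.array([0, np.pi / 2, np.pi, 3 * np.pi / 2])
I = A + B * np.cos(true_phase + shifts)                      # ideal fringe intensities
noisy = I + rng.normal(scale=sigma, size=(n_trials, 4))      # one row per MC trial

# tan(phi) = (I4 - I2) / (I1 - I3)
est = np.arctan2(noisy[:, 3] - noisy[:, 1], noisy[:, 0] - noisy[:, 2])
err = np.angle(np.exp(1j * (est - true_phase)))              # wrap to [-pi, pi]
print(f"bias = {err.mean():.2e} rad, std = {err.std():.2e} rad")
```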

  14. Generalized Additive Models Used to Predict Species Abundance in the Gulf of Mexico: An Ecosystem Modeling Tool

    PubMed Central

    Drexler, Michael; Ainsworth, Cameron H.

    2013-01-01

    Spatially explicit ecosystem models of all types require an initial allocation of biomass, often in areas where fisheries independent abundance estimates do not exist. A generalized additive modelling (GAM) approach is used to describe the abundance of 40 species groups (i.e. functional groups) across the Gulf of Mexico (GoM) using a large fisheries independent data set (SEAMAP) and climate scale oceanographic conditions. Predictor variables included in the model are chlorophyll a, sediment type, dissolved oxygen, temperature, and depth. Despite the presence of a large number of zeros in the data, a single GAM using a negative binomial distribution was suitable to make predictions of abundance for multiple functional groups. We present an example case study using pink shrimp (Farfantepenaeus duorarum) and compare the results to known distributions. The model successfully predicts the known areas of high abundance in the GoM, including those areas where no data were input into the model fitting. Overall, the model reliably captures areas of high and low abundance for the large majority of functional groups observed in SEAMAP. The result of this method allows for the objective setting of spatial distributions for numerous functional groups across a modeling domain, even where abundance data may not exist. PMID:23691223

  15. Analytical tools in accelerator physics

    SciTech Connect

    Litvinenko, V.N.

    2010-09-01

    This paper is a sub-set of my lectures presented in the Accelerator Physics course (USPAS, Santa Rosa, California, January 14-25, 2008). It is based on notes I wrote during the period from 1976 to 1979 in Novosibirsk. Only a few copies (in Russian) were distributed to my colleagues at the Novosibirsk Institute of Nuclear Physics. The goal of these notes is a complete description starting from an arbitrary reference orbit, with explicit expressions for the 4-potential and the accelerator Hamiltonian, and finishing with a parameterization in action and angle variables. To a large degree the notes follow the logic developed in Theory of Cyclic Particle Accelerators by A.A. Kolomensky and A.N. Lebedev [Kolomensky], but go beyond the book in a number of directions. One unusual feature is the use of matrix functions and the Sylvester formula for calculating the matrices of arbitrary elements. Teaching the USPAS course motivated me to translate a significant part of my notes into English. I also included some introductory material following the Classical Theory of Fields by L.D. Landau and E.M. Lifshitz [Landau]. A large number of short notes covering various techniques are placed in the Appendices.
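
    The matrix-function machinery mentioned above can be illustrated with a small numpy sketch: Sylvester's formula for a diagonalizable matrix with distinct eigenvalues, checked against scipy's matrix exponential. The example matrix is arbitrary and not taken from the lecture notes.

```python
# Matrix functions via the spectral decomposition and via Sylvester's formula.
import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, 1.0],
              [-2.0, -0.5]])                    # arbitrary example matrix (distinct eigenvalues)
lam, V = np.linalg.eig(A)

# Spectral form: f(A) = V diag(f(lam)) V^-1 (valid for diagonalizable A)
exp_spectral = (V @ np.diag(np.exp(lam)) @ np.linalg.inv(V)).real

# Sylvester's formula: f(A) = sum_i f(lam_i) * prod_{j!=i} (A - lam_j I)/(lam_i - lam_j)
n = A.shape[0]
exp_sylvester = np.zeros((n, n), dtype=complex)
for i in range(n):
    cov = np.eye(n, dtype=complex)              # Frobenius covariant for eigenvalue i
    for j in range(n):
        if j != i:
            cov = cov @ (A - lam[j] * np.eye(n)) / (lam[i] - lam[j])
    exp_sylvester += np.exp(lam[i]) * cov

print(np.allclose(exp_spectral, expm(A)), np.allclose(exp_sylvester.real, expm(A)))  # True True
```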

  16. Flow-through cross-polarized imaging as a new tool to overcome the analytical sensitivity challenges of a low-dose crystalline compound in a lipid matrix.

    PubMed

    Adler, Camille; Schönenberger, Monica; Teleki, Alexandra; Leuenberger, Bruno; Kuentz, Martin

    2015-11-10

    Assessing the physical state of a low-dose active compound in a solid lipid or polymer matrix is analytically challenging, especially if the matrix exhibits some crystallinity. The aim of this study was first to compare the ability of current methods to detect the presence of a crystalline model compound in lipid matrices. Subsequently, a new technique was introduced and evaluated because of sensitivity issues that were encountered with current methods. The new technique is a flow-through version of cross-polarized imaging in transmission mode. The tested lipid-based solid dispersions (SDs) consisted of β-carotene (BC) as a model compound, and of Gelucire 50/13 or Geleol mono- and diglycerides as lipid matrices. The solid dispersions were analyzed by (hyper) differential scanning calorimetry (DSC), X-ray powder diffraction (XRPD), and microscopic techniques including atomic force microscopy (AFM). DSC and XRPD could analyze crystalline BC at concentrations as low as 3% (w/w) in the formulations. However, with microscopic techniques crystalline particles were detected at significantly lower concentrations of even 0.5% (w/w) BC. A flow-through cross-polarized imaging technique was introduced that combines the advantage of analyzing a larger sample size with high sensitivity of microscopy. Crystals were detected easily in samples containing even less than 0.2% (w/w) BC. Moreover, the new tool enabled approximation of the kinetic BC solubility in the crystalline lipid matrices. As a conclusion, the flow-through cross-polarized imaging technique has the potential to become an indispensable tool for characterizing low-dose crystalline compounds in a lipid or polymer matrix of solid dispersions.

  17. Determination of Unknown Concentrations of Sodium Acetate Using the Method of Standard Addition and Proton NMR: An Experiment for the Undergraduate Analytical Chemistry Laboratory

    ERIC Educational Resources Information Center

    Rajabzadeh, Massy

    2012-01-01

    In this experiment, students learn how to find the unknown concentration of sodium acetate using both the graphical treatment of standard addition and the standard addition equation. In the graphical treatment of standard addition, the peak area of the methyl peak in each of the sodium acetate standard solutions is found by integration using…
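
    The graphical treatment of standard addition described above amounts to a short calculation: fit peak area against added standard and read the unknown concentration from the magnitude of the x-intercept; the peak areas and concentrations below are invented for illustration.

```python
# Graphical standard-addition sketch (illustrative numbers, not the experiment's data).
import numpy as np

added = np.array([0.0, 10.0, 20.0, 30.0, 40.0])      # added sodium acetate (mM)
area = np.array([12.1, 20.0, 27.8, 36.2, 44.1])      # integrated 1H NMR methyl peak area (a.u.)

slope, intercept = np.polyfit(added, area, 1)
c_unknown = intercept / slope                         # magnitude of the x-intercept
print(f"estimated unknown concentration ~ {c_unknown:.1f} mM")
```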

  18. Effect of Percent Relative Humidity, Moisture Content, and Compression Force on Light-Induced Fluorescence (LIF) Response as a Process Analytical Tool.

    PubMed

    Shah, Ishan G; Stagner, William C

    2016-08-01

    The effect of percent relative humidity (16-84% RH), moisture content (4.2-6.5% w/w MC), and compression force (4.9-44.1 kN CF) on the light-induced fluorescence (LIF) response of 10% w/w active pharmaceutical ingredient (API) compacts is reported. The fluorescent response was evaluated using two separate central composite designs of experiments. The effect of % RH and CF on the LIF signal was highly significant, with an adjusted R² = 0.9436 and p < 0.0001. Percent relative humidity (p = 0.0022), CF (p < 0.0001), and % RH² (p = 0.0237) were statistically significant factors affecting the LIF response. The effects of MC and CF on LIF response were also statistically significant, with a p value <0.0001 and an adjusted R² value of 0.9874. The LIF response was highly impacted by MC (p < 0.0001), CF (p < 0.0001), and MC² (p = 0.0022). At 10% w/w API, increased % RH, MC, and CF led to a nonlinear decrease in LIF response. The derived quadratic model equations explained more than 94% of the data. Awareness of these effects on LIF response is critical when implementing LIF as a process analytical tool. PMID:27435199
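
    As an illustration of the quadratic (response-surface) models reported above, the sketch below fits LIF ~ RH + CF + RH² by least squares; the data, coefficients and noise are synthetic and only the model form follows the abstract.

```python
# Fitting a quadratic response-surface model to synthetic LIF data.
import numpy as np

rng = np.random.default_rng(4)
rh = rng.uniform(16, 84, 30)           # % relative humidity
cf = rng.uniform(4.9, 44.1, 30)        # compression force (kN)
lif = 100 - 0.2 * rh - 0.8 * cf - 0.002 * rh**2 + rng.normal(scale=1.0, size=30)

X = np.column_stack([np.ones_like(rh), rh, cf, rh**2])   # intercept, RH, CF, RH^2
coef, *_ = np.linalg.lstsq(X, lif, rcond=None)
pred = X @ coef
r2 = 1 - np.sum((lif - pred) ** 2) / np.sum((lif - lif.mean()) ** 2)
print("coefficients:", np.round(coef, 4), " R^2 =", round(r2, 3))
```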

  19. Evaluation methodology for comparing memory and communication of analytic processes in visual analytics

    SciTech Connect

    Ragan, Eric D; Goodall, John R

    2014-01-01

    Provenance tools can help capture and represent the history of analytic processes. In addition to supporting analytic performance, provenance tools can be used to support memory of the process and communication of the steps to others. Objective evaluation methods are needed to evaluate how well provenance tools support analysts' memory and communication of analytic processes. In this paper, we present several methods for the evaluation of process memory, and we discuss the advantages and limitations of each. We discuss methods for determining a baseline process for comparison, and we describe various methods that can be used to elicit process recall, step ordering, and time estimations. Additionally, we discuss methods for conducting quantitative and qualitative analyses of process memory. By organizing possible memory evaluation methods and providing a meta-analysis of the potential benefits and drawbacks of different approaches, this paper can inform study design and encourage objective evaluation of process memory and communication.

  20. [A need to implement new tools for diagnosing tobacco-addiction syndrome and readiness/motivation to quit smoking in the working-age population in Poland].

    PubMed

    Broszkiewicz, Marzenna; Drygas, Wojciech

    2016-01-01

    High rates of tobacco use are still observed in the working-age population in Poland. The present level of state tobacco control has been achieved through adopting legal regulations and population-based interventions. In Poland, a sufficient contribution of health professionals to the diagnosis of tobacco-addiction syndrome (TAS) and the application of the 5A's (ask, advise, assess, assist, arrange follow-up) brief intervention has not been confirmed by explicit research results. Systemic solutions within the health care system (professional control, specialist health care, health professional training and reference centres) have not yet been elaborated. The tools for diagnosing tobacco dependence and motivation to quit smoking, developed over 30 years ago and recommended by experts to be used in clinical and research practice, have not met the current addiction criteria. In this paper, tools other than those previously recommended - tests developed in the first decade of the 21st century (including the Cigarette Dependence Scale and the Nicotine Dependence Syndrome Scale), reflecting modern concepts of nicotine dependence - are presented. In the literature on readiness/motivation to change health behaviors, a new approach dominates. Motivational interviewing (MI) by Miller and Rollnick concentrates on the smoking person and his or her internal motivation. Motivational interviewing is recommended by the World Health Organization as the 5R's (relevance, risks, rewards, roadblocks, repetition) brief motivational advice, addressed to tobacco users who are unwilling to make a quit attempt. In Poland, new research studies on the implementation of new diagnostic tools and the updating of binding guidelines should be undertaken to strengthen primary health care in treating tobacco dependence, and to incorporate MI and the 5R's into training in TAS diagnosis and treatment addressed to health professionals. PMID:27044722

  2. The Pabst's method: an effective and low-budget tool for the forensic comparison of opaque thermoplastics--part 1: Additional discrimination of black electrical tapes.

    PubMed

    Henning, Siegfried; Schönberger, Torsten; Simmross, Ulrich

    2013-12-10

    For many years now, Pabst's micro-press has been used in German forensic science laboratories as a valuable addition to methods of comparative analysis of plastic trace evidence. However, it is as yet hardly known in laboratories outside of Germany. The principal reproducibility is demonstrated by a homogeneity check of a raw backing material of defined origin. The illustrated results of a proficiency test emphasise the applicability of the Pabst method for forensic comparisons. The discrimination power of the Pabst method was tested by taking 90 black PVC-backings provided by the FBI Laboratory, i.e. those that could not be discriminated by standard methods. In this way further discriminations could be achieved. In the following, the Pabst method is therefore introduced as a straightforward, inexpensive and useful tool.

  3. Testing microtaphofacies as an analytic tool for integrated facies and sedimentological analysis using Lower Miocene mixed carbonate/siliciclastic sediments from the North Alpine Foreland Basin

    NASA Astrophysics Data System (ADS)

    Nebelsick, James; Bieg, Ulrich

    2010-05-01

    Taphonomic studies have mostly concentrated on the investigation and quantification of isolated macroscopic faunal and floral elements. Carbonate rocks, in contrast to isolated macroscopic objects, have rarely been specifically addressed in terms of taphonomic features, although many aspects of microfacies analysis are directly related to the preservation of constituent biogenic components. There is thus a high potential for analyzing and quantifying taphonomic features in carbonate rocks (microtaphofacies), not least as an additional tool for facies analysis. Analyzing the role of taphonomy in carbonate environments can help determine how different skeletal architectures through time and evolving synecological relationships (bioerosion and encrustation) have influenced carbonate environments and their preservation in the rock record. This pilot study analyses the microtaphofacies of a Lower Miocene, shallow-water, mixed carbonate-siliciclastic environment from the North Alpine Foreland Basin (Molasse Sea) of southern Germany. The sediments range from biogenic bryomol carbonates to pure siliciclastics. This allows environmental interpretations to be made not only with respect to biogenic composition (dominated by bivalves, gastropods, bryozoans and barnacles), but also with respect to siliciclastic grain characteristics and sedimentary features. Facies interpretation is relatively straightforward, with a somewhat varied nearshore facies distribution dominated by carbonates which grade into higher-energy, siliciclastic offshore sediments. Taphonomic features are assessed along this gradient with respect to total component composition as well as by following the trajectories of individual component types. The results are interpreted with respect to biogenic production, fragmentation, abrasion and transport.

  4. A novel ion-pairing chromatographic method for the simultaneous determination of both nicarbazin components in feed additives: chemometric tools for improving the optimization and validation.

    PubMed

    De Zan, María M; Teglia, Carla M; Robles, Juan C; Goicoechea, Héctor C

    2011-07-15

    The development, optimization and validation of an ion-pairing high performance liquid chromatography method for the simultaneous determination of both nicarbazin (NIC) components: 4,4'-dinitrocarbanilide (DNC) and 2-hydroxy-4,6-dimethylpyrimidine (HDP) in bulk materials and feed additives are described. An experimental design was used for the optimization of the chromatographic system. Four variables, including mobile phase composition and oven temperature, were analyzed through a central composite design exploring their contribution to analyte separation. Five responses: peak resolutions, HDP capacity factor, HDP tailing and analysis time, were modelled by using the response surface methodology and were optimized simultaneously by implementing the desirability function. The optimum conditions resulted in a mobile phase consisting of 10.0 mmol L(-1) of 1-heptanesulfonate, 20.0 mmol L(-1) of sodium acetate, pH=3.30 buffer and acetonitrile in a gradient system at a flow rate of 1.00 mL min(-1). The column was an Inertsil ODS-3 (4.6 mm×150 mm, 5 μm particle size) at 40.0°C. Detection was performed at 300 nm by a diode array detector. The validation results of the method indicated a high selectivity and good precision characteristics, with RSD less than 1.0% for both components, both in intra and inter-assay precision studies. Linearity was proved for a range of 32.0-50.0 μg mL(-1) of NIC in sample solution. The recovery, studied at three different fortification levels, varied from 98.0% to 101.4% for HDP and from 99.1% to 100.2% for DNC. The applicability of the method was demonstrated by determining DNC and HDP content in raw materials and commercial formulations used for coccidiosis prevention. Assay results on real samples showed that considerable differences in molecular ratio DNC:HDP exist among them.
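
    The desirability-function step mentioned above can be sketched in a Derringer-style way: scale each response to [0, 1], combine by geometric mean, and pick the condition with the highest overall desirability. The response models, targets and factor ranges below are invented for illustration and are not those of the published optimization.

```python
# Derringer-style desirability sketch over a grid of candidate conditions.
import numpy as np

def d_larger_is_better(y, low, high):
    return np.clip((y - low) / (high - low), 0, 1)

def d_smaller_is_better(y, low, high):
    return np.clip((high - y) / (high - low), 0, 1)

rng = np.random.default_rng(5)
conditions = rng.uniform(size=(50, 2))                  # e.g. scaled organic fraction and temperature
resolution = 1.0 + 2.5 * conditions[:, 0]               # invented response models
run_time = 25 - 10 * conditions[:, 0] + 5 * conditions[:, 1]

D = np.sqrt(d_larger_is_better(resolution, 1.5, 3.0) *
            d_smaller_is_better(run_time, 10, 25))      # geometric mean of desirabilities
best = int(np.argmax(D))
print("best condition (scaled factors):", np.round(conditions[best], 3), " D =", round(float(D[best]), 3))
```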

  5. Quality Indicators for Learning Analytics

    ERIC Educational Resources Information Center

    Scheffel, Maren; Drachsler, Hendrik; Stoyanov, Slavi; Specht, Marcus

    2014-01-01

    This article proposes a framework of quality indicators for learning analytics that aims to standardise the evaluation of learning analytics tools and to provide a means to capture evidence for the impact of learning analytics on educational practices in a standardised manner. The criteria of the framework and its quality indicators are based on…

  6. Native Mascots and Ethnic Fraud in Higher Education: Using Tribal Critical Race Theory and the Interest Convergence Principle as an Analytic Tool

    ERIC Educational Resources Information Center

    Castagno, Angelina E.; Lee, Stacey J.

    2007-01-01

    This article examines one university's policies regarding Native mascots and ethnic fraud through a Tribal Critical Race Theory analytic lens. Using the principle of interest convergence, we argue that institutions of higher education allow and even work actively towards a particular form or level of diversity, but they do not extend it far…

  7. Using Maps in Web Analytics to Evaluate the Impact of Web-Based Extension Programs

    ERIC Educational Resources Information Center

    Veregin, Howard

    2015-01-01

    Maps can be a valuable addition to the Web analytics toolbox for Extension programs that use the Web to disseminate information. Extension professionals use Web analytics tools to evaluate program impacts. Maps add a unique perspective through visualization and analysis of geographic patterns and their relationships to other variables. Maps can…

  8. Visual Analytics Technology Transition Progress

    SciTech Connect

    Scholtz, Jean; Cook, Kristin A.; Whiting, Mark A.; Lemon, Douglas K.; Greenblatt, Howard

    2009-09-23

    The authors provide a description of the transition process for visual analytic tools and contrast this with the transition process for more traditional software tools. This paper takes this into account and describes a user-oriented approach to technology transition including a discussion of key factors that should be considered and adapted to each situation. The progress made in transitioning visual analytic tools in the past five years is described and the challenges that remain are enumerated.

  9. Phase II Fort Ord Landfill Demonstration Task 8 - Refinement of In-line Instrumental Analytical Tools to Evaluate their Operational Utility and Regulatory Acceptance

    SciTech Connect

    Daley, P F

    2006-04-03

    The overall objective of this project is the continued development, installation, and testing of continuous water sampling and analysis technologies for application to on-site monitoring of groundwater treatment systems and remediation sites. In a previous project, an on-line analytical system (OLAS) for multistream water sampling was installed at the Fort Ord Operable Unit 2 Groundwater Treatment System, with the objective of developing a simplified analytical method for detection of Compounds of Concern at that plant, and continuous sampling of up to twelve locations in the treatment system, from raw influent waters to treated effluent. Earlier implementations of the water sampling and processing system (Analytical Sampling and Analysis Platform, A A+RT, Milpitas, CA) depended on off-line integrators that produced paper plots of chromatograms, and sent summary tables to a host computer for archiving. We developed a basic LabVIEW (National Instruments, Inc., Austin, TX) based gas chromatography control and data acquisition system that was the foundation for further development and integration with the ASAP system. Advantages of this integration include electronic archiving of all raw chromatographic data, and a flexible programming environment to support development of improved ASAP operation and automated reporting. The initial goals of integrating the preexisting LabVIEW chromatography control system with the ASAP, and demonstration of a simplified, site-specific analytical method were successfully achieved. However, although the principal objective of this system was assembly of an analytical system that would allow plant operators an up-to-the-minute view of the plant's performance, several obstacles remained. Data reduction with the base LabVIEW system was limited to peak detection and simple tabular output, patterned after commercial chromatography integrators, with compound retention times and peak areas. Preparation of calibration curves, method detection
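
    The peak detection and tabulation of retention times and peak areas described above can be illustrated outside LabVIEW with a short scipy sketch on a synthetic chromatogram; the peak shapes, thresholds and retention times are invented and do not reproduce the OLAS/ASAP implementation.

```python
# Peak detection and area integration on a synthetic chromatogram.
import numpy as np
from scipy.signal import find_peaks, peak_widths

t = np.linspace(0, 10, 4000)                                   # retention time (min)
signal = (1.2 * np.exp(-((t - 3.1) / 0.08) ** 2) +             # two Gaussian "compounds"
          0.6 * np.exp(-((t - 6.4) / 0.10) ** 2) +
          np.random.default_rng(6).normal(scale=0.005, size=t.size))

peaks, _ = find_peaks(signal, height=0.05, prominence=0.05)
widths, heights, left, right = peak_widths(signal, peaks, rel_height=0.99)
for p, l, r in zip(peaks, left, right):
    area = np.trapz(signal[int(l):int(r) + 1], t[int(l):int(r) + 1])
    print(f"RT = {t[p]:.2f} min, area = {area:.3f}")
```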

  10. Version 1.00 programmer`s tools used in constructing the INEL RML/analytical radiochemistry sample tracking database and its user interface

    SciTech Connect

    Femec, D.A.

    1995-09-01

    This report describes two code-generating tools used to speed design and implementation of relational databases and user interfaces: CREATE-SCHEMA and BUILD-SCREEN. CREATE-SCHEMA produces the SQL commands that actually create and define the database. BUILD-SCREEN takes templates for data entry screens and generates the screen management system routine calls to display the desired screen. Both tools also generate the related FORTRAN declaration statements and precompiled SQL calls. Included with this report is the source code for a number of FORTRAN routines and functions used by the user interface. This code is broadly applicable to a number of different databases.
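
    In the spirit of CREATE-SCHEMA (but not its actual code or template syntax), a toy generator that turns a table specification into SQL DDL looks like the sketch below; the table and column names are invented.

```python
# A toy schema generator: table specification -> SQL CREATE TABLE statements.
TABLES = {
    "sample": [("sample_id", "INTEGER PRIMARY KEY"),
               ("customer", "VARCHAR(40) NOT NULL"),
               ("received_date", "DATE")],
    "analysis": [("analysis_id", "INTEGER PRIMARY KEY"),
                 ("sample_id", "INTEGER REFERENCES sample(sample_id)"),
                 ("nuclide", "VARCHAR(10)"),
                 ("activity_bq", "REAL")],
}

def create_schema(tables: dict) -> str:
    stmts = []
    for name, cols in tables.items():
        body = ",\n    ".join(f"{col} {typ}" for col, typ in cols)
        stmts.append(f"CREATE TABLE {name} (\n    {body}\n);")
    return "\n\n".join(stmts)

print(create_schema(TABLES))
```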

  11. Evaluating analytic and risk assessment tools to estimate sediment and nutrients losses from agricultural lands in the southern region of the USA

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Non-point source pollution from agricultural fields is a critical problem associated with water quality impairment in the USA and a low-oxygen environment in the Gulf of Mexico. The use, development and enhancement of qualitative and quantitative models or tools for assessing agricultural runoff qua...

  12. Comparative study between univariate spectrophotometry and multivariate calibration as analytical tools for quantitation of Benazepril alone and in combination with Amlodipine.

    PubMed

    Farouk, M; Elaziz, Omar Abd; Tawakkol, Shereen M; Hemdan, A; Shehata, Mostafa A

    2014-04-01

    Four simple, accurate, reproducible, and selective methods have been developed and subsequently validated for the determination of Benazepril (BENZ) alone and in combination with Amlodipine (AML) in a pharmaceutical dosage form. The first method is pH-induced difference spectrophotometry, where BENZ can be measured in the presence of AML as it shows maximum absorption at 237 nm and 241 nm in 0.1 N HCl and 0.1 N NaOH, respectively, while AML shows no wavelength shift in either solvent. The second method is the new Extended Ratio Subtraction Method (EXRSM) coupled to the Ratio Subtraction Method (RSM) for the determination of both drugs in a commercial dosage form. The third and fourth methods are multivariate calibrations, namely Principal Component Regression (PCR) and Partial Least Squares (PLS). A detailed validation of the methods was performed following the ICH guidelines; the standard curves were found to be linear in the range of 2-30 μg/mL for BENZ in the difference and extended ratio subtraction spectrophotometric methods, and 5-30 μg/mL for AML in the EXRSM method, with well-accepted mean correlation coefficients for each analyte. The intra-day and inter-day precision and accuracy results were well within the acceptable limits.

  13. The color of complexes and UV-vis spectroscopy as an analytical tool of Alfred Werner's group at the University of Zurich.

    PubMed

    Fox, Thomas; Berke, Heinz

    2014-01-01

    Two PhD theses (Alexander Gordienko, 1912; Johannes Angerstein, 1914) and a dissertation in partial fulfillment of a PhD thesis (H. S. French, Zurich, 1914) are reviewed that deal with hitherto unpublished UV-vis spectroscopy work of coordination compounds in the group of Alfred Werner. The method of measurement of UV-vis spectra at Alfred Werner's time is described in detail. Examples of spectra of complexes are given, which were partly interpreted in terms of structure (cis ↔ trans configuration, counting number of bands for structural relationships, and shift of general spectral features by consecutive replacement of ligands). A more complete interpretation of spectra was hampered at Alfred Werner's time by the lack of a light absorption theory and a correct theory of electron excitation, and the lack of a ligand field theory for coordination compounds. The experimentally difficult data acquisitions and the difficult spectral interpretations might have been reasons why this method did not experience a breakthrough in Alfred Werner's group to play a more prominent role as an important analytical method. Nevertheless the application of UV-vis spectroscopy on coordination compounds was unique and novel, and witnesses Alfred Werner's great aptitude and keenness to always try and go beyond conventional practice. PMID:24983805

  14. In-line and real-time process monitoring of a freeze drying process using Raman and NIR spectroscopy as complementary process analytical technology (PAT) tools.

    PubMed

    De Beer, T R M; Vercruysse, P; Burggraeve, A; Quinten, T; Ouyang, J; Zhang, X; Vervaet, C; Remon, J P; Baeyens, W R G

    2009-09-01

    The aim of the present study was to examine the complementary properties of Raman and near infrared (NIR) spectroscopy as PAT tools for the fast, noninvasive, nondestructive and in-line process monitoring of a freeze drying process. Therefore, Raman and NIR probes were built in the freeze dryer chamber, allowing simultaneous process monitoring. A 5% (w/v) mannitol solution was used as model for freeze drying. Raman and NIR spectra were continuously collected during freeze drying (one Raman and NIR spectrum/min) and the spectra were analyzed using principal component analysis (PCA) and multivariate curve resolution (MCR). Raman spectroscopy was able to supply information about (i) the mannitol solid state throughout the entire process, (ii) the endpoint of freezing (endpoint of mannitol crystallization), and (iii) several physical and chemical phenomena occurring during the process (onset of ice nucleation, onset of mannitol crystallization). NIR spectroscopy proved to be a more sensitive tool to monitor the critical aspects during drying: (i) endpoint of ice sublimation and (ii) monitoring the release of hydrate water during storage. Furthermore, via NIR spectroscopy some Raman observations were confirmed: start of ice nucleation, end of mannitol crystallization and solid state characteristics of the end product. When Raman and NIR monitoring were performed on the same vial, the Raman signal was saturated during the freezing step caused by reflected NIR light reaching the Raman detector. Therefore, NIR and Raman measurements were done on a different vial. Also the importance of the position of the probes (Raman probe above the vial and NIR probe at the bottom of the sidewall of the vial) in order to obtain all required critical information is outlined. Combining Raman and NIR spectroscopy for the simultaneous monitoring of freeze drying allows monitoring almost all critical freeze drying process aspects. Both techniques do not only complement each other, they also
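
    As a rough illustration of how multivariate models such as PCA can flag process transitions in in-line spectra, the sketch below tracks the first principal-component score over simulated spectra; the band positions, noise and endpoint logic are synthetic stand-ins, not the authors' PCA/MCR models.

```python
# PCA score trajectory on simulated in-line spectra: PC1 tracks a solution -> crystal transition.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(7)
n_spectra, n_channels = 120, 300                 # one spectrum per minute, 300 channels
chan = np.arange(n_channels)
peak_a = np.exp(-((chan - 100) / 8.0) ** 2)      # "solution" band
peak_b = np.exp(-((chan - 200) / 8.0) ** 2)      # "crystal" band

frac = np.clip((np.arange(n_spectra) - 60) / 20.0, 0, 1)   # transition between spectrum 60 and 80
spectra = np.outer(1 - frac, peak_a) + np.outer(frac, peak_b)
spectra += rng.normal(scale=0.01, size=spectra.shape)

scores = PCA(n_components=1).fit_transform(spectra).ravel()
mid = (scores.max() + scores.min()) / 2
crossing = int(np.argmax(scores > mid if scores[-1] > scores[0] else scores < mid))
print("PC1 crosses its midpoint (transition roughly half complete) at spectrum index:", crossing)
```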

  15. An analytical framework and tool ('InteRa') for integrating the informal recycling sector in waste and resource management systems in developing countries.

    PubMed

    Velis, Costas A; Wilson, David C; Rocca, Ondina; Smith, Stephen R; Mavropoulos, Antonis; Cheeseman, Chris R

    2012-09-01

    In low- and middle-income developing countries, the informal (collection and) recycling sector (here abbreviated IRS) is an important, but often unrecognised, part of a city's solid waste and resources management system. Recent evidence shows recycling rates of 20-30% achieved by IRS systems, reducing collection and disposal costs. They play a vital role in the value chain by reprocessing waste into secondary raw materials, providing a livelihood to around 0.5% of urban populations. However, persisting factual and perceived problems are associated with IRS (waste-picking): occupational and public health and safety (H&S), child labour, uncontrolled pollution, untaxed activities, crime and political collusion. Increasingly, incorporating IRS as a legitimate stakeholder and functional part of solid waste management (SWM) is attempted, further building recycling rates in an affordable way while also addressing the negatives. Based on a literature review and a practitioner's workshop, here we develop a systematic framework--or typology--for classifying and analysing possible interventions to promote the integration of IRS in a city's SWM system. Three primary interfaces are identified: between the IRS and the SWM system, the materials and value chain, and society as a whole; underlain by a fourth, which is focused on organisation and empowerment. To maximise the potential for success, IRS integration/inclusion/formalisation initiatives should consider all four categories in a balanced way and pay increased attention to their interdependencies, which are central to success, including specific actions, such as the IRS having access to source separated waste. A novel rapid evaluation and visualisation tool is presented--integration radar (diagram) or InterRa--aimed at illustrating the degree to which a planned or existing intervention considers each of the four categories. The tool is further demonstrated by application to 10 cases around the world, including a step
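
    A four-axis radar plot of the kind described above can be drawn with matplotlib as sketched below; the category scores are hypothetical and the published InteRa indicators and scoring scheme are not reproduced here.

```python
# Four-axis "integration radar" sketch with hypothetical scores.
import numpy as np
import matplotlib.pyplot as plt

categories = ["SWM system", "Materials & value chain", "Society", "Organisation & empowerment"]
scores = [0.7, 0.4, 0.5, 0.6]                    # hypothetical 0-1 ratings of an intervention

angles = np.linspace(0, 2 * np.pi, len(categories), endpoint=False)
angles = np.concatenate([angles, angles[:1]])    # close the polygon
values = scores + scores[:1]

ax = plt.subplot(projection="polar")
ax.plot(angles, values, marker="o")
ax.fill(angles, values, alpha=0.2)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(categories, fontsize=8)
ax.set_ylim(0, 1)
plt.tight_layout()
plt.savefig("integration_radar.png", dpi=150)
```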

  16. Understanding Education Involving Geovisual Analytics

    ERIC Educational Resources Information Center

    Stenliden, Linnea

    2013-01-01

    Handling the vast amounts of data and information available in contemporary society is a challenge. Geovisual Analytics provides technology designed to increase the effectiveness of information interpretation and analytical task solving. To date, little attention has been paid to the role such tools can play in education and to the extent to which…

  17. Analytical testing

    NASA Technical Reports Server (NTRS)

    Flannelly, W. G.; Fabunmi, J. A.; Nagy, E. J.

    1981-01-01

    Analytical methods for combining flight acceleration and strain data with shake test mobility data to predict the effects of structural changes on flight vibrations and strains are presented. This integration of structural dynamic analysis with flight performance is referred to as analytical testing. The objective of this methodology is to analytically estimate the results of flight testing contemplated structural changes with minimum flying and change trials. The category of changes to the aircraft includes mass, stiffness, absorbers, isolators, and active suppressors. Examples of applying the analytical testing methodology using flight test and shake test data measured on an AH-1G helicopter are included. The techniques and procedures for vibration testing and modal analysis are also described.

  18. Analytical tools for identification of non-intentionally added substances (NIAS) coming from polyurethane adhesives in multilayer packaging materials and their migration into food simulants.

    PubMed

    Félix, Juliana S; Isella, Francesca; Bosetti, Osvaldo; Nerín, Cristina

    2012-07-01

    Adhesives used in food packaging to glue different materials can provide several substances as potential migrants, and the identification of potential migrants and migration tests are required to assess safety in the use of adhesives. Solid-phase microextraction in headspace mode and gas chromatography coupled to mass spectrometry (HS-SPME-GC-MS) and ChemSpider and SciFinder databases were used as powerful tools to identify the potential migrants in the polyurethane (PU) adhesives and also in the individual plastic films (polyethylene terephthalate, polyamide, polypropylene, polyethylene, and polyethylene/ethyl vinyl alcohol). Migration tests were carried out by using Tenax(®) and isooctane as food simulants, and the migrants were analyzed by gas chromatography coupled to mass spectrometry. More than 63 volatile and semivolatile compounds considered as potential migrants were detected either in the adhesives or in the films. Migration tests showed two non-intentionally added substances (NIAS) coming from PU adhesives that migrated through the laminates into Tenax(®) and into isooctane. Identification of these NIAS was achieved through their mass spectra, and 1,6-dioxacyclododecane-7,12-dione and 1,4,7-trioxacyclotridecane-8,13-dione were confirmed. Caprolactam migrated into isooctane, and its origin was the external plastic film in the multilayer, demonstrating real diffusion through the multilayer structure. Comparison of the migration values between the simulants and conditions will be shown and discussed. PMID:22526644

  20. Number series of atoms, interatomic bonds and interface bonds defining zinc-blende nanocrystals as function of size, shape and surface orientation: Analytic tools to interpret solid state spectroscopy data

    NASA Astrophysics Data System (ADS)

    König, Dirk

    2016-08-01

    Semiconductor nanocrystals (NCs) experience stress and charge transfer by embedding materials or ligands and impurity atoms. In return, the environment of NCs experiences a NC stress response which may lead to matrix deformation and propagated strain. Up to now, there is no universal gauge to evaluate the stress impact on NCs and their response as a function of NC size d_NC. I deduce geometrical number series as analytical tools to obtain the number of NC atoms N_NC(d_NC[i]), bonds between NC atoms N_bnd(d_NC[i]) and interface bonds N_IF(d_NC[i]) for seven high symmetry zinc-blende (zb) NCs with low-index faceting: {001} cubes, {111} octahedra, {110} dodecahedra, {001}-{111} pyramids, {111} tetrahedra, {111}-{001} quatrodecahedra and {001}-{111} quadrodecahedra. The fundamental insights into NC structures revealed here allow for major advancements in data interpretation and understanding of zb- and diamond-lattice based nanomaterials. The analytical number series can serve as a standard procedure for stress evaluation in solid state spectroscopy due to their deterministic nature, easy use and general applicability over a wide range of spectroscopy methods as well as NC sizes, forms and materials.

  1. A GC/MS validated method for the nanomolar range determination of succinylacetone in amniotic fluid and plasma: an analytical tool for tyrosinemia type I.

    PubMed

    Cyr, Denis; Giguère, Robert; Villain, Gaëlle; Lemieux, Bernard; Drouin, Régen

    2006-02-17

    A sensitive and accurate stable isotope dilution GC/MS assay was developed and validated for the quantification of succinylacetone (SA) in plasma and amniotic fluid (AF). SA is pathognomonic for tyrosinemia type I, a genetic disorder caused by a reduced activity of fumarylacetoacetate hydrolase (FAH). In untreated patients, SA can easily be measured in plasma and urine because the expected concentrations are in the micromol/L range. Due to a founder effect, the province of Quebec has an unusually high prevalence of tyrosinemia type I; hence, the quantification of SA in AF or in the plasma of treated patients in the nmol/L range becomes very useful. The method utilizes 13C5-SA as an internal standard and a three-step sample treatment consisting of oximation, solvent extraction and TMCS derivatization. The assay was validated by recording the ion intensities of m/z 620 for SA and m/z 625 for the internal standard in order to demonstrate the precision of measurements, the linearity of the method, the limits of quantification and detection (LOQ and LOD), specificity, accuracy, and metabolite stability. Values for the intra-day assays ranged from 0.2 to 3.2%, while values for the inter-day assays ranged from 1.9 to 5.6%, confirming that the method has good precision. A calibration plot using SA detected by GC/MS gave excellent linearity with a correlation coefficient of 0.999 over the injected concentration range of 5-2000 nmol/L. LOQ and LOD were 3 and 1 nmol/L, respectively. The usefulness of this method was demonstrated by SA quantification in an AF sample of an affected fetus and in plasma of patients treated with NTBC. The results demonstrate that this novel GC/MS method may be a valuable tool for metabolic evaluation and clinical use.
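
    The record reports calibration linearity (r = 0.999 over 5-2000 nmol/L) and an LOQ/LOD of 3 and 1 nmol/L. As a hedged illustration of how such isotope-dilution calibration figures can be derived, the sketch below fits a line to hypothetical area-ratio data (m/z 620 analyte over m/z 625 internal standard) and estimates detection limits from the residual standard deviation; the concentrations and ratios are invented, and the 3.3s/10s convention used here is one common choice, not necessarily the authors'.

```python
import numpy as np

# Hypothetical calibration data: SA concentration (nmol/L) and
# peak-area ratio m/z 620 (SA) / m/z 625 (13C5-SA internal standard)
conc  = np.array([5, 10, 50, 100, 500, 1000, 2000], dtype=float)
ratio = np.array([0.012, 0.024, 0.121, 0.239, 1.21, 2.40, 4.82])

slope, intercept = np.polyfit(conc, ratio, 1)
pred = slope * conc + intercept
resid_sd = np.sqrt(np.sum((ratio - pred) ** 2) / (len(conc) - 2))
r = np.corrcoef(conc, ratio)[0, 1]

lod = 3.3 * resid_sd / slope    # one common convention (ICH-style), assumed here
loq = 10.0 * resid_sd / slope

print(f"r = {r:.4f}, LOD ~ {lod:.1f} nmol/L, LOQ ~ {loq:.1f} nmol/L")
```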

  2. Comparative study between derivative spectrophotometry and multivariate calibration as analytical tools applied for the simultaneous quantitation of Amlodipine, Valsartan and Hydrochlorothiazide

    NASA Astrophysics Data System (ADS)

    Darwish, Hany W.; Hassan, Said A.; Salem, Maissa Y.; El-Zeany, Badr A.

    2013-09-01

    Four simple, accurate and specific methods were developed and validated for the simultaneous estimation of Amlodipine (AML), Valsartan (VAL) and Hydrochlorothiazide (HCT) in commercial tablets. The derivative spectrophotometric methods include Derivative Ratio Zero Crossing (DRZC) and Double Divisor Ratio Spectra-Derivative Spectrophotometry (DDRS-DS) methods, while the multivariate calibrations used are Principal Component Regression (PCR) and Partial Least Squares (PLS). The proposed methods were applied successfully in the determination of the drugs in laboratory-prepared mixtures and in commercial pharmaceutical preparations. The validity of the proposed methods was assessed using the standard addition technique. The linearity of the proposed methods was investigated over the ranges of 2-32, 4-44 and 2-20 μg/mL for AML, VAL and HCT, respectively.

  3. Comparative study between derivative spectrophotometry and multivariate calibration as analytical tools applied for the simultaneous quantitation of Amlodipine, Valsartan and Hydrochlorothiazide.

    PubMed

    Darwish, Hany W; Hassan, Said A; Salem, Maissa Y; El-Zeany, Badr A

    2013-09-01

    Four simple, accurate and specific methods were developed and validated for the simultaneous estimation of Amlodipine (AML), Valsartan (VAL) and Hydrochlorothiazide (HCT) in commercial tablets. The derivative spectrophotometric methods include Derivative Ratio Zero Crossing (DRZC) and Double Divisor Ratio Spectra-Derivative Spectrophotometry (DDRS-DS) methods, while the multivariate calibrations used are Principal Component Regression (PCR) and Partial Least Squares (PLS). The proposed methods were applied successfully in the determination of the drugs in laboratory-prepared mixtures and in commercial pharmaceutical preparations. The validity of the proposed methods was assessed using the standard addition technique. The linearity of the proposed methods was investigated over the ranges of 2-32, 4-44 and 2-20 μg/mL for AML, VAL and HCT, respectively.
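
    Neither the recorded spectra nor the calibration design are reproduced in these records. As a purely illustrative sketch of the multivariate-calibration side (PLS), the snippet below builds synthetic three-component absorption spectra spanning the stated linearity ranges and fits a scikit-learn PLS model; the band positions, noise level, and mixture design are assumptions, not the published method.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
wl = np.linspace(200, 400, 201)                     # wavelengths, nm (illustrative)

def band(center, width):                            # Gaussian absorption band
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

# Hypothetical pure-component spectra standing in for AML, VAL and HCT
pure = np.vstack([band(238, 18), band(250, 22), band(271, 15)])

# 20 calibration mixtures within the reported linearity ranges (ug/mL)
C = rng.uniform([2, 4, 2], [32, 44, 20], size=(20, 3))
A = C @ pure + rng.normal(0, 0.002, (20, len(wl)))  # Beer-Lambert mixing + noise

pls = PLSRegression(n_components=3).fit(A, C)

# Predict an unseen "laboratory-prepared mixture"
c_true = np.array([[10.0, 20.0, 8.0]])
a_test = c_true @ pure + rng.normal(0, 0.002, (1, len(wl)))
print(pls.predict(a_test).round(2), "vs true", c_true)
```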

  4. FPI: FM Success through Analytics

    ERIC Educational Resources Information Center

    Hickling, Duane

    2013-01-01

    The APPA Facilities Performance Indicators (FPI) is perhaps one of the most powerful analytical tools that institutional facilities professionals have at their disposal. It is a diagnostic facilities performance management tool that addresses the essential questions that facilities executives must answer to effectively perform their roles. It…

  5. Analytical Technology

    SciTech Connect

    Goheen, Steven C.

    2001-07-01

    Characterizing environmental samples has been exhaustively addressed in the literature for most analytes of environmental concern. One of the weak areas of environmental analytical chemistry is that of radionuclides and samples contaminated with radionuclides. The analysis of samples containing high levels of radionuclides can be far more complex than that of non-radioactive samples. This chapter addresses the analysis of samples with a wide range of radioactivity. The other areas of characterization examined in this chapter are the hazardous components of mixed waste, and special analytes often associated with radioactive materials. Characterizing mixed waste is often similar to characterizing waste components in non-radioactive materials. The largest differences are in associated safety precautions to minimize exposure to dangerous levels of radioactivity. One must attempt to keep radiological dose as low as reasonably achievable (ALARA). This chapter outlines recommended procedures to safely and accurately characterize regulated components of radioactive samples.

  6. VA²: A Visual Analytics Approach for // Evaluating Visual Analytics Applications.

    PubMed

    Blascheck, Tanja; John, Markus; Kurzhals, Kuno; Koch, Steffen; Ertl, Thomas

    2016-01-01

    Evaluation has become a fundamental part of visualization research and researchers have employed many approaches from the field of human-computer interaction like measures of task performance, thinking aloud protocols, and analysis of interaction logs. Recently, eye tracking has also become popular to analyze visual strategies of users in this context. This has added another modality and more data, which requires special visualization techniques to analyze this data. However, only few approaches exist that aim at an integrated analysis of multiple concurrent evaluation procedures. The variety, complexity, and sheer amount of such coupled multi-source data streams require a visual analytics approach. Our approach provides a highly interactive visualization environment to display and analyze thinking aloud, interaction, and eye movement data in close relation. Automatic pattern finding algorithms allow an efficient exploratory search and support the reasoning process to derive common eye-interaction-thinking patterns between participants. In addition, our tool equips researchers with mechanisms for searching and verifying expected usage patterns. We apply our approach to a user study involving a visual analytics application and we discuss insights gained from this joint analysis. We anticipate our approach to be applicable to other combinations of evaluation techniques and a broad class of visualization applications.

  7. Determining the Efficacy of Magnetic Susceptibility as an Analytical Tool in the Middle Devonian Gas Bearing Shale of Taylor County, West Virginia

    NASA Astrophysics Data System (ADS)

    Baird, John

    The accuracy of the magnetic susceptibility of the whole rock was within the same order of magnitude as the other methods, and the accuracy of the magnetic susceptibility of the isolated kerogen component was an order of magnitude higher. In addition, evidence was found that links the magnetic susceptibility of kerogen within the two units to the composition of the kerogen. Vitrinite reflectance data confirm that variations in the magnetic susceptibility of the kerogen were not caused by variations in maturity. A very strong logarithmic relationship was found between the magnetic susceptibility of kerogen and the weight percent present. Using the hypothesis that variations in the amount of organic material present are linked to episodic algal blooms, it was concluded that the organic material supplied by these blooms significantly lowered the magnetic susceptibility of the organic sediment supplied during the normal habitat of the basin.
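
    The record reports a strong logarithmic relationship between kerogen magnetic susceptibility and its weight percent, without reproducing the data. A minimal sketch of how such a relationship could be quantified is shown below; the data points, units, and fit quality are invented for illustration only.

```python
import numpy as np

# Hypothetical (kerogen weight %, mass magnetic susceptibility) pairs; the study's
# actual measurements are not reproduced in the abstract.
wt_pct = np.array([2.0, 4.0, 6.0, 9.0, 12.0, 15.0])
chi    = np.array([-0.8, -1.6, -2.1, -2.6, -2.9, -3.2])   # arbitrary units

# Fit chi = a*ln(wt%) + b, i.e. a straight line against the log-transformed abundance
a, b = np.polyfit(np.log(wt_pct), chi, 1)
pred = a * np.log(wt_pct) + b
r2 = 1 - np.sum((chi - pred) ** 2) / np.sum((chi - chi.mean()) ** 2)
print(f"chi ~ {a:.2f}*ln(wt%) + {b:.2f},  R^2 = {r2:.3f}")
```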

  8. Green Chemistry Metrics with Special Reference to Green Analytical Chemistry.

    PubMed

    Tobiszewski, Marek; Marć, Mariusz; Gałuszka, Agnieszka; Namieśnik, Jacek

    2015-01-01

    The concept of green chemistry is widely recognized in chemical laboratories. To properly measure an environmental impact of chemical processes, dedicated assessment tools are required. This paper summarizes the current state of knowledge in the field of development of green chemistry and green analytical chemistry metrics. The diverse methods used for evaluation of the greenness of organic synthesis, such as eco-footprint, E-Factor, EATOS, and Eco-Scale are described. Both the well-established and recently developed green analytical chemistry metrics, including NEMI labeling and analytical Eco-scale, are presented. Additionally, this paper focuses on the possibility of the use of multivariate statistics in evaluation of environmental impact of analytical procedures. All the above metrics are compared and discussed in terms of their advantages and disadvantages. The current needs and future perspectives in green chemistry metrics are also discussed. PMID:26076112

  9. New Analytical Monographs on TCM Herbal Drugs for Quality Proof.

    PubMed

    Wagner, Hildebert; Bauer, Rudolf; Melchart, Dieter

    2016-01-01

    Regardless of specific national drug regulations there is an international consensus that all TCM drugs must meet stipulated high quality standards focusing on authentication, identification and chemical composition. In addition, the safety of all TCM drugs prescribed by physicians has to be guaranteed. During the 25-year history of the TCM hospital Bad Kötzting, 171 TCM drugs underwent an analytical quality proof including thin-layer as well as high-pressure liquid chromatography. From now on, mass spectrometry will also be available as an analytical tool. The findings are compiled and already published in three volumes of analytical monographs. One more volume will be published shortly, and a fifth volume is in preparation. The main issues of the analytical procedure for TCM drugs, such as authenticity, botanical nomenclature, variability of plant species and parts, as well as processing, are pointed out, and possible ways to overcome them are sketched.

  10. Green Chemistry Metrics with Special Reference to Green Analytical Chemistry.

    PubMed

    Tobiszewski, Marek; Marć, Mariusz; Gałuszka, Agnieszka; Namieśnik, Jacek

    2015-06-12

    The concept of green chemistry is widely recognized in chemical laboratories. To properly measure an environmental impact of chemical processes, dedicated assessment tools are required. This paper summarizes the current state of knowledge in the field of development of green chemistry and green analytical chemistry metrics. The diverse methods used for evaluation of the greenness of organic synthesis, such as eco-footprint, E-Factor, EATOS, and Eco-Scale are described. Both the well-established and recently developed green analytical chemistry metrics, including NEMI labeling and analytical Eco-scale, are presented. Additionally, this paper focuses on the possibility of the use of multivariate statistics in evaluation of environmental impact of analytical procedures. All the above metrics are compared and discussed in terms of their advantages and disadvantages. The current needs and future perspectives in green chemistry metrics are also discussed.
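
    One of the metrics mentioned, the analytical Eco-Scale, scores a procedure as 100 minus penalty points accumulated for reagents, energy, occupational hazard, and waste. The sketch below is a minimal, hedged illustration of that bookkeeping; the penalty values are placeholders, not the published tables.

```python
# Analytical Eco-Scale style scoring: 100 minus penalty points. The specific
# penalty values below are illustrative placeholders, not the published tables.
def eco_scale(reagent_penalties, energy, occupational_hazard, waste):
    total_penalty = sum(reagent_penalties) + energy + occupational_hazard + waste
    return max(0, 100 - total_penalty)

# Hypothetical HPLC procedure: two solvents, moderate energy use, sealed process, some waste
score = eco_scale(reagent_penalties=[4, 6], energy=2, occupational_hazard=0, waste=5)
print(score)   # scores above roughly 75 are described in the literature as excellent green analyses
```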

  11. New Analytical Monographs on TCM Herbal Drugs for Quality Proof.

    PubMed

    Wagner, Hildebert; Bauer, Rudolf; Melchart, Dieter

    2016-01-01

    Regardless of specific national drug regulations there is an international consensus that all TCM drugs must meet stipulated high quality standards focusing on authentication, identification and chemical composition. In addition, the safety of all TCM drugs prescribed by physicians has to be guaranteed. During the 25-year history of the TCM hospital Bad Kötzting, 171 TCM drugs underwent an analytical quality proof including thin-layer as well as high-pressure liquid chromatography. From now on, mass spectrometry will also be available as an analytical tool. The findings are compiled and already published in three volumes of analytical monographs. One more volume will be published shortly, and a fifth volume is in preparation. The main issues of the analytical procedure for TCM drugs, such as authenticity, botanical nomenclature, variability of plant species and parts, as well as processing, are pointed out, and possible ways to overcome them are sketched. PMID:27271998

  12. Magnetic resonance imaging: A potential tool in assessing the addition of hyperthermia to neoadjuvant therapy in patients with locally advanced breast cancer

    PubMed Central

    CRACIUNESCU, OANA I.; THRALL, DONALD E.; VUJASKOVIC, ZELJKO; DEWHIRST, MARK W.

    2010-01-01

    The poor overall survival for patients with locally advanced breast cancers has led over the past decade to the introduction of numerous neoadjuvant combined therapy regimens to down-stage the disease before surgery. At the same time, more evidence suggests the need for treatment individualisation with a wide variety of new targets for cancer therapeutics and also multi-modality therapies. In this context, early determination of whether the patient will fail to respond can enable the use of alternative therapies that can be more beneficial. The purpose of this review is to examine the potential role of magnetic resonance imaging (MRI) in early prediction of treatment response and prognosis of overall survival in locally advanced breast cancer patients enrolled on multi-modality therapy trials that include hyperthermia. The material is organised with a review of dynamic contrast (DCE)-MRI and diffusion weighted (DW)-MRI for characterisation of phenomenological parameters of tumour physiology and their potential role in estimating therapy response. Most of the work published in this field has focused on responses to neoadjuvant chemotherapy regimens alone, so the emphasis will be there; however, the available data that involve the addition of hyperthermia to the regimen will also be discussed. The review will also include future directions, such as the potential use of MRI imaging techniques in establishing the role of hyperthermia alone in modifying the breast tumour microenvironment, together with specific challenges related to performing such studies. PMID:20849258

  13. Analytical sedimentology

    SciTech Connect

    Lewis, D.W. (Dept. of Geology); McConchie, D.M. (Centre for Coastal Management)

    1994-01-01

    Both a self-instruction manual and a "cookbook" guide to field and laboratory analytical procedures, this book provides an essential reference for non-specialists. With a minimum of mathematics and virtually no theory, it introduces practitioners to easy, inexpensive options for sample collection and preparation, data acquisition, analytic protocols, result interpretation and verification techniques. This step-by-step guide considers the advantages and limitations of different procedures, discusses safety and troubleshooting, and explains support skills like mapping, photography and report writing. It also offers managers, off-site engineers and others using sediments data a quick course in commissioning studies and making the most of the reports. This manual will answer the growing needs of practitioners in the field, either alone or accompanied by Practical Sedimentology, which surveys the science of sedimentology and provides a basic overview of the principles behind the applications.

  14. Epilepsy analytic system with cloud computing.

    PubMed

    Shen, Chia-Ping; Zhou, Weizhi; Lin, Feng-Seng; Sung, Hsiao-Ya; Lam, Yan-Yu; Chen, Wei; Lin, Jeng-Wei; Pan, Ming-Kai; Chiu, Ming-Jang; Lai, Feipei

    2013-01-01

    Biomedical data analytic systems have played an important role in clinical diagnosis for several decades. Analyzing these big data to provide decision support for physicians is now an emerging research area. This paper presents a parallelized web-based tool with a cloud computing service architecture to analyze epilepsy. Several modern analytic functions, namely wavelet transform, genetic algorithm (GA), and support vector machine (SVM), are cascaded in the system. To demonstrate the effectiveness of the system, it has been verified with two kinds of electroencephalography (EEG) data: short-term EEG and long-term EEG. The results reveal that our approach achieves a total classification accuracy higher than 90%. In addition, the entire training time is accelerated by a factor of about 4.66, and the prediction time also meets real-time requirements.
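
    As a rough sketch of the wavelet-plus-SVM portion of such a pipeline (the GA-based optimization step and the cloud parallelization are omitted), the snippet below extracts wavelet sub-band energies from synthetic EEG epochs and cross-validates an SVM classifier. The sampling rate, wavelet choice, epoch construction, and all signal parameters are illustrative assumptions, not the published configuration.

```python
import numpy as np
import pywt
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
fs, n_samp = 256, 1024                      # sampling rate and epoch length (illustrative)

def epoch(seizure):
    t = np.arange(n_samp) / fs
    x = rng.normal(0, 1, n_samp)            # background EEG stand-in
    if seizure:
        x += 3 * np.sin(2 * np.pi * 5 * t)  # crude spike-and-wave-like rhythm
    return x

def wavelet_features(x):
    coeffs = pywt.wavedec(x, 'db4', level=4)                           # discrete wavelet decomposition
    return np.array([np.log(np.sum(c ** 2) + 1e-12) for c in coeffs])  # sub-band energies

X = np.array([wavelet_features(epoch(s)) for s in ([0] * 50 + [1] * 50)])
y = np.array([0] * 50 + [1] * 50)

clf = SVC(kernel='rbf', C=1.0, gamma='scale')
print(cross_val_score(clf, X, y, cv=5).mean())   # cross-validated classification accuracy
```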

  15. Uncertainty profiles for the validation of analytical methods.

    PubMed

    Saffaj, T; Ihssane, B

    2011-09-15

    This article aims to expose a new global strategy for the validation of analytical methods and the estimation of measurement uncertainty. Our purpose is to give researchers in the field of analytical chemistry access to a powerful tool for the evaluation of quantitative analytical procedures. Indeed, the proposed strategy facilitates analytical validation by providing a decision tool based on the uncertainty profile and the β-content tolerance interval. Equally important, this approach allows a good estimate of measurement uncertainty by using the validation data, without recourse to additional experiments. In the reported example, we confirmed the applicability of this new strategy for the validation of a chromatographic bioanalytical method and the good estimate of the measurement uncertainty without any extra effort or additional experiments. A comparative study with the SFSTP approach showed that both strategies selected the same calibration functions. The holistic character of the measurement uncertainty compared to the total error was influenced by our choice of the uncertainty profile. Nevertheless, we think that adopting the uncertainty profile in the validation stage controls the risk of using the analytical method in the routine phase.
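
    The decision rule rests on a β-content tolerance interval computed from validation data. As a hedged illustration (not the authors' exact computation), the sketch below uses Howe's classical approximation for a two-sided β-content, γ-confidence tolerance interval; the recovery values are invented and the acceptance limits are only an example.

```python
import numpy as np
from scipy.stats import norm, chi2

def beta_content_tolerance(x, content=0.90, confidence=0.95):
    """Two-sided beta-content tolerance interval via Howe's approximation."""
    x = np.asarray(x, dtype=float)
    n, df = len(x), len(x) - 1
    z = norm.ppf((1 + content) / 2)
    k = z * np.sqrt(df * (1 + 1 / n) / chi2.ppf(1 - confidence, df))
    m, s = x.mean(), x.std(ddof=1)
    return m - k * s, m + k * s

# Hypothetical recoveries (%) from one validation concentration level
recoveries = [98.1, 101.3, 99.4, 100.8, 97.9, 100.2, 99.0, 101.1]
low, high = beta_content_tolerance(recoveries, content=0.90, confidence=0.95)
print(f"[{low:.1f}%, {high:.1f}%] compared against acceptance limits, e.g. 95-105%")
```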

  16. Informetrics: Exploring Databases as Analytical Tools.

    ERIC Educational Resources Information Center

    Wormell, Irene

    1998-01-01

    Advanced online search facilities and information retrieval techniques have increased the potential of bibliometric research. Discusses three case studies carried out by the Centre for Informetric Studies at the Royal School of Library Science (Denmark) on the internationality of international journals, informetric analyses on the World Wide Web,…

  17. Nanomaterials as Analytical Tools for Genosensors

    PubMed Central

    Abu-Salah, Khalid M.; Alrokyan, Salman A.; Khan, Muhammad Naziruddin; Ansari, Anees Ahmad

    2010-01-01

    Nanomaterials are being increasingly used for the development of electrochemical DNA biosensors, due to the unique electrocatalytic properties found in nanoscale materials. They offer excellent prospects for interfacing biological recognition events with electronic signal transduction and for designing a new generation of bioelectronic devices exhibiting novel functions. In particular, nanomaterials such as noble metal nanoparticles (Au, Pt), carbon nanotubes (CNTs), magnetic nanoparticles, quantum dots and metal oxide nanoparticles have been actively investigated for their applications in DNA biosensors, which have become a new interdisciplinary frontier between biological detection and material science. In this article, we address some of the main advances in this field over the past few years, discussing the issues and challenges with the aim of stimulating a broader interest in developing nanomaterial-based biosensors and improving their applications in disease diagnosis and food safety examination. PMID:22315580

  18. An Eight-Eyed Version of Hawkins and Shohet's Clinical Supervision Model: The Addition of the Cognitive Analytic Therapy Concept of the "Observing Eye/I" as the "Observing Us"

    ERIC Educational Resources Information Center

    Darongkamas, Jurai; John, Christopher; Walker, Mark James

    2014-01-01

    This paper proposes incorporating the concept of the "observing eye/I", from cognitive analytic therapy (CAT), to Hawkins and Shohet's seven modes of supervision, comprising their transtheoretical model of supervision. Each mode is described alongside explicit examples relating to CAT. This modification using a key idea from CAT (in…

  19. Correlated Raman micro-spectroscopy and scanning electron microscopy analyses of flame retardants in environmental samples: a micro-analytical tool for probing chemical composition, origin and spatial distribution.

    PubMed

    Ghosal, Sutapa; Wagner, Jeff

    2013-07-01

    We present correlated application of two micro-analytical techniques: scanning electron microscopy/energy dispersive X-ray spectroscopy (SEM/EDS) and Raman micro-spectroscopy (RMS) for the non-invasive characterization and molecular identification of flame retardants (FRs) in environmental dusts and consumer products. The SEM/EDS-RMS technique offers correlated, morphological, molecular, spatial distribution and semi-quantitative elemental concentration information at the individual particle level with micrometer spatial resolution and minimal sample preparation. The presented methodology uses SEM/EDS analyses for rapid detection of particles containing FR specific elements as potential indicators of FR presence in a sample followed by correlated RMS analyses of the same particles for characterization of the FR sub-regions and surrounding matrices. The spatially resolved characterization enabled by this approach provides insights into the distributional heterogeneity as well as potential transfer and exposure mechanisms for FRs in the environment that is typically not available through traditional FR analysis. We have used this methodology to reveal a heterogeneous distribution of highly concentrated deca-BDE particles in environmental dust, sometimes in association with identifiable consumer materials. The observed coexistence of deca-BDE with consumer material in dust is strongly indicative of its release into the environment via weathering/abrasion of consumer products. Ingestion of such enriched FR particles in dust represents a potential for instantaneous exposure to high FR concentrations. Therefore, correlated SEM/RMS analysis offers a novel investigative tool for addressing an area of important environmental concern.

  20. Predictive analytics can support the ACO model.

    PubMed

    Bradley, Paul

    2012-04-01

    Predictive analytics can be used to rapidly spot hard-to-identify opportunities to better manage care--a key tool in accountable care. When considering analytics models, healthcare providers should: Make value-based care a priority and act on information from analytics models. Create a road map that includes achievable steps, rather than major endeavors. Set long-term expectations and recognize that the effectiveness of an analytics program takes time, unlike revenue cycle initiatives that may show a quick return.

  1. GRIPPING TOOL

    DOEpatents

    Sandrock, R.J.

    1961-12-12

    A self-actuated gripping tool is described for transferring fuel elements and the like into reactors and other inaccessible locations. The tool will grasp or release the load only when properly positioned for this purpose. In addition, the load cannot be released except when unsupported by the tool, so that jarring or contact will not bring about accidental release of the load. The gripping members or jaws of the device are cam-actuated by an axially slidable shaft which has two lockable positions. A spring urges the shaft into one position and a solenoid is provided to overcome the spring and move it into the other position. The weight of the tool operates a sleeve to lock the shaft in its existing position. Only when the cable supporting the tool is slack is the device capable of being actuated either to grasp or release its load. (AEC)

  2. Analytical toxicology.

    PubMed

    Flanagan, R J; Widdop, B; Ramsey, J D; Loveland, M

    1988-09-01

    1. Major advances in analytical toxicology followed the introduction of spectroscopic and chromatographic techniques in the 1940s and early 1950s and thin layer chromatography remains important together with some spectrophotometric and other tests. However, gas- and high performance-liquid chromatography together with a variety of immunoassay techniques are now widely used. 2. The scope and complexity of forensic and clinical toxicology continues to increase, although the compounds for which emergency analyses are needed to guide therapy are few. Exclusion of the presence of hypnotic drugs can be important in suspected 'brain death' cases. 3. Screening for drugs of abuse has assumed greater importance not only for the management of the habituated patient, but also in 'pre-employment' and 'employment' screening. The detection of illicit drug administration in sport is also an area of increasing importance. 4. In industrial toxicology, the range of compounds for which blood or urine measurements (so called 'biological monitoring') can indicate the degree of exposure is increasing. The monitoring of environmental contaminants (lead, chlorinated pesticides) in biological samples has also proved valuable. 5. In the near future a consensus as to the units of measurement to be used is urgently required and more emphasis will be placed on interpretation, especially as regards possible behavioural effects of drugs or other poisons. Despite many advances in analytical techniques there remains a need for reliable, simple tests to detect poisons for use in smaller hospital and other laboratories.

  3. Competing on analytics.

    PubMed

    Davenport, Thomas H

    2006-01-01

    We all know the power of the killer app. It's not just a support tool; it's a strategic weapon. Companies questing for killer apps generally focus all their firepower on the one area that promises to create the greatest competitive advantage. But a new breed of organization has upped the stakes: Amazon, Harrah's, Capital One, and the Boston Red Sox have all dominated their fields by deploying industrial-strength analytics across a wide variety of activities. At a time when firms in many industries offer similar products and use comparable technologies, business processes are among the few remaining points of differentiation--and analytics competitors wring every last drop of value from those processes. Employees hired for their expertise with numbers or trained to recognize their importance are armed with the best evidence and the best quantitative tools. As a result, they make the best decisions. In companies that compete on analytics, senior executives make it clear--from the top down--that analytics is central to strategy. Such organizations launch multiple initiatives involving complex data and statistical analysis, and quantitative activity is managed at the enterprise (not departmental) level. In this article, Professor Thomas H. Davenport lays out the characteristics and practices of these statistical masters and describes some of the very substantial changes other companies must undergo in order to compete on quantitative turf. As one would expect, the transformation requires a significant investment in technology, the accumulation of massive stores of data, and the formulation of company-wide strategies for managing the data. But, at least as important, it also requires executives' vocal, unswerving commitment and willingness to change the way employees think, work, and are treated.

  4. Control/structure interaction conceptual design tool

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.

    1990-01-01

    The JPL Control/Structure Interaction Program is developing new analytical methods for designing micro-precision spacecraft with controlled structures. One of these, the Conceptual Design Tool, will illustrate innovative new approaches to the integration of multi-disciplinary analysis and design methods. The tool will be used to demonstrate homogeneity of presentation, uniform data representation across analytical methods, and integrated systems modeling. The tool differs from current 'integrated systems' that support design teams most notably in its support for the new CSI multi-disciplinary engineer. The design tool will utilize a three dimensional solid model of the spacecraft under design as the central data organization metaphor. Various analytical methods, such as finite element structural analysis, control system analysis, and mechanical configuration layout, will store and retrieve data from a hierarchical, object oriented data structure that supports assemblies of components with associated data and algorithms. In addition to managing numerical model data, the tool will assist the designer in organizing, stating, and tracking system requirements.
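
    The abstract describes a hierarchical, object-oriented data structure that holds assemblies of components with associated data and algorithms. The sketch below is a minimal illustration of that idea; the class names, fields, and example numbers are invented and do not reflect the JPL tool's actual schema.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class Component:
    name: str
    data: Dict[str, float] = field(default_factory=dict)   # e.g. mass, stiffness
    analyses: Dict[str, Callable[["Component"], float]] = field(default_factory=dict)

@dataclass
class Assembly:
    name: str
    children: List["Component | Assembly"] = field(default_factory=list)

    def total(self, key: str) -> float:
        """Roll a numeric property up through the hierarchy (e.g. total mass)."""
        out = 0.0
        for c in self.children:
            out += c.total(key) if isinstance(c, Assembly) else c.data.get(key, 0.0)
        return out

# Hypothetical spacecraft model with a nested sub-assembly
bus = Assembly("spacecraft", [
    Assembly("optical bench", [Component("mirror", {"mass": 12.0}),
                               Component("actuator", {"mass": 1.5})]),
    Component("reaction wheel", {"mass": 4.2}),
])
print(bus.total("mass"))   # 17.7
```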

  5. Food additives

    PubMed Central

    Spencer, Michael

    1974-01-01

    Food additives are discussed from the food technology point of view. The reasons for their use are summarized: (1) to protect food from chemical and microbiological attack; (2) to even out seasonal supplies; (3) to improve their eating quality; (4) to improve their nutritional value. The various types of food additives are considered, e.g. colours, flavours, emulsifiers, bread and flour additives, preservatives, and nutritional additives. The paper concludes with consideration of those circumstances in which the use of additives is (a) justified and (b) unjustified. PMID:4467857

  6. Numerical and Analytical Design of Functionally Graded Piezoelectric Transducers

    NASA Astrophysics Data System (ADS)

    Rubio, Wilfredo Montealegre; Buiochi, Flavio; Adamowski, Julio C.; Silva, Emílio Carlos Nelli

    2008-02-01

    This paper presents analytical and finite element methods to model broadband transducers with a graded piezoelectric parameter. The application of the FGM (Functionally Graded Materials) concept to piezoelectric transducer design allows the design of composite transducers without an interface between materials (e.g. piezoelectric ceramic and backing material), due to the continuous change of property values. Thus, large improvements can be achieved in their performance characteristics, mainly in generating short-time-waveform ultrasonic pulses. Nevertheless, recent research on functionally graded piezoelectric transducers shows a lack of studies that compare the numerical and analytical approaches used in their design. In this work, analytical and numerical models of FGM piezoelectric transducers are developed to analyze the effects of piezoelectric material gradation, specifically in ultrasonic applications. In addition, results using FGM piezoelectric transducers are compared with non-FGM piezoelectric transducers. We concluded that the developed modeling techniques are accurate, providing a useful tool for designing FGM piezoelectric transducers.

  7. Specificity is key to successful application of analytics.

    PubMed

    Costello, David; Kaldenberg, Dennis

    2015-02-01

    More data do not necessarily equate to better analytics. Choosing the right analytics tools and applying them to specific areas leads to better results. A Midwest hospital used single-point metrics to identify underperforming facilities and drive improvements.

  8. Collaborative Analytical Toolbox version 1.0

    SciTech Connect

    2008-08-21

    The purpose of the Collaborative Analytical Toolbox (CAT) is to provide a comprehensive, enabling, collaborative problem solving environment that enables users to more effectively apply and improve their analytical and problem solving capabilities. CAT is a software framework for integrating other tools and data sources. It includes a set of core services for collaboration and information exploration and analysis, and a framework that facilitates quickly integrating new ideas, techniques, and tools with existing data sources.

  9. Demonstrating Success: Web Analytics and Continuous Improvement

    ERIC Educational Resources Information Center

    Loftus, Wayne

    2012-01-01

    As free and low-cost Web analytics tools become more sophisticated, libraries' approach to user analysis can become more nuanced and precise. Tracking appropriate metrics with a well-formulated analytics program can inform design decisions, demonstrate the degree to which those decisions have succeeded, and thereby inform the next iteration in the…

  10. Analytics for Metabolic Engineering.

    PubMed

    Petzold, Christopher J; Chan, Leanne Jade G; Nhan, Melissa; Adams, Paul D

    2015-01-01

    Realizing the promise of metabolic engineering has been slowed by challenges related to moving beyond proof-of-concept examples to robust and economically viable systems. Key to advancing metabolic engineering beyond trial-and-error research is access to parts with well-defined performance metrics that can be readily applied in vastly different contexts with predictable effects. As the field now stands, research depends greatly on analytical tools that assay target molecules, transcripts, proteins, and metabolites across different hosts and pathways. Screening technologies yield specific information for many thousands of strain variants, while deep omics analysis provides a systems-level view of the cell factory. Efforts focused on a combination of these analyses yield quantitative information of dynamic processes between parts and the host chassis that drive the next engineering steps. Overall, the data generated from these types of assays aid better decision-making at the design and strain construction stages to speed progress in metabolic engineering research.

  11. Phosphazene additives

    DOEpatents

    Harrup, Mason K; Rollins, Harry W

    2013-11-26

    An additive comprising a phosphazene compound that has at least two reactive functional groups and at least one capping functional group bonded to phosphorus atoms of the phosphazene compound. One of the at least two reactive functional groups is configured to react with cellulose and the other of the at least two reactive functional groups is configured to react with a resin, such as an amine resin or a polycarboxylic acid resin. The at least one capping functional group is selected from the group consisting of a short chain ether group, an alkoxy group, or an aryloxy group. Also disclosed are an additive-resin admixture, a method of treating a wood product, and a wood product.

  12. Potlining Additives

    SciTech Connect

    Rudolf Keller

    2004-08-10

    In this project, a concept to improve the performance of aluminum production cells by introducing potlining additives was examined and tested. Boron oxide was added to cathode blocks, and titanium was dissolved in the metal pool; this resulted in the formation of titanium diboride and caused the molten aluminum to wet the carbonaceous cathode surface. Such wetting reportedly leads to operational improvements and extended cell life. In addition, boron oxide suppresses cyanide formation. This final report presents and discusses the results of this project. Substantial economic benefits for the practical implementation of the technology are projected, especially for modern cells with graphitized blocks. For example, with an energy savings of about 5% and an increase in pot life from 1500 to 2500 days, a cost savings of $ 0.023 per pound of aluminum produced is projected for a 200 kA pot.

  13. Process Recovery after CaO Addition Due to Granule Formation in a CSTR Co-Digester—A Tool to Influence the Composition of the Microbial Community and Stabilize the Process?

    PubMed Central

    Liebrich, Marietta; Kleyböcker, Anne; Kasina, Monika; Miethling-Graff, Rona; Kassahun, Andrea; Würdemann, Hilke

    2016-01-01

    The composition, structure and function of granules formed during process recovery with calcium oxide in a laboratory-scale fermenter fed with sewage sludge and rapeseed oil were studied. In the course of over-acidification and successful process recovery, only minor changes were observed in the bacterial community of the digestate, while granules appeared during recovery. Fluorescence microscopic analysis of the granules showed a close spatial relationship between calcium and oil and/or long chain fatty acids. This finding further substantiated the hypothesis that calcium precipitated with carbon of organic origin and reduced the negative effects of overloading with oil. Furthermore, the enrichment of phosphate minerals in the granules was shown, and molecular biological analyses detected polyphosphate-accumulating organisms as well as methanogenic archaea in the core. Organisms related to Methanoculleus receptaculi were detected in the inner zones of a granule, whereas they were present in the digestate only after process recovery. This finding indicated more favorable microhabitats inside the granules that supported process recovery. Thus, the granule formation triggered by calcium oxide addition served as a tool to influence the composition of the microbial community and to stabilize the process after overloading with oil.

  14. Process Recovery after CaO Addition Due to Granule Formation in a CSTR Co-Digester-A Tool to Influence the Composition of the Microbial Community and Stabilize the Process?

    PubMed

    Liebrich, Marietta; Kleyböcker, Anne; Kasina, Monika; Miethling-Graff, Rona; Kassahun, Andrea; Würdemann, Hilke

    2016-03-17

    The composition, structure and function of granules formed during process recovery with calcium oxide in a laboratory-scale fermenter fed with sewage sludge and rapeseed oil were studied. In the course of over-acidification and successful process recovery, only minor changes were observed in the bacterial community of the digestate, while granules appeared during recovery. Fluorescence microscopic analysis of the granules showed a close spatial relationship between calcium and oil and/or long chain fatty acids. This finding further substantiated the hypothesis that calcium precipitated with carbon of organic origin and reduced the negative effects of overloading with oil. Furthermore, the enrichment of phosphate minerals in the granules was shown, and molecular biological analyses detected polyphosphate-accumulating organisms as well as methanogenic archaea in the core. Organisms related to Methanoculleus receptaculi were detected in the inner zones of a granule, whereas they were present in the digestate only after process recovery. This finding indicated more favorable microhabitats inside the granules that supported process recovery. Thus, the granule formation triggered by calcium oxide addition served as a tool to influence the composition of the microbial community and to stabilize the process after overloading with oil.

  15. Process Recovery after CaO Addition Due to Granule Formation in a CSTR Co-Digester-A Tool to Influence the Composition of the Microbial Community and Stabilize the Process?

    PubMed

    Liebrich, Marietta; Kleyböcker, Anne; Kasina, Monika; Miethling-Graff, Rona; Kassahun, Andrea; Würdemann, Hilke

    2016-01-01

    The composition, structure and function of granules formed during process recovery with calcium oxide in a laboratory-scale fermenter fed with sewage sludge and rapeseed oil were studied. In the course of over-acidification and successful process recovery, only minor changes were observed in the bacterial community of the digestate, while granules appeared during recovery. Fluorescence microscopic analysis of the granules showed a close spatial relationship between calcium and oil and/or long chain fatty acids. This finding further substantiated the hypothesis that calcium precipitated with carbon of organic origin and reduced the negative effects of overloading with oil. Furthermore, the enrichment of phosphate minerals in the granules was shown, and molecular biological analyses detected polyphosphate-accumulating organisms as well as methanogenic archaea in the core. Organisms related to Methanoculleus receptaculi were detected in the inner zones of a granule, whereas they were present in the digestate only after process recovery. This finding indicated more favorable microhabitats inside the granules that supported process recovery. Thus, the granule formation triggered by calcium oxide addition served as a tool to influence the composition of the microbial community and to stabilize the process after overloading with oil. PMID:27681911

  16. Process Recovery after CaO Addition Due to Granule Formation in a CSTR Co-Digester—A Tool to Influence the Composition of the Microbial Community and Stabilize the Process?

    PubMed Central

    Liebrich, Marietta; Kleyböcker, Anne; Kasina, Monika; Miethling-Graff, Rona; Kassahun, Andrea; Würdemann, Hilke

    2016-01-01

    The composition, structure and function of granules formed during process recovery with calcium oxide in a laboratory-scale fermenter fed with sewage sludge and rapeseed oil were studied. In the course of over-acidification and successful process recovery, only minor changes were observed in the bacterial community of the digestate, while granules appeared during recovery. Fluorescence microscopic analysis of the granules showed a close spatial relationship between calcium and oil and/or long chain fatty acids. This finding further substantiated the hypothesis that calcium precipitated with carbon of organic origin and reduced the negative effects of overloading with oil. Furthermore, the enrichment of phosphate minerals in the granules was shown, and molecular biological analyses detected polyphosphate-accumulating organisms as well as methanogenic archaea in the core. Organisms related to Methanoculleus receptaculi were detected in the inner zones of a granule, whereas they were present in the digestate only after process recovery. This finding indicated more favorable microhabitats inside the granules that supported process recovery. Thus, the granule formation triggered by calcium oxide addition served as a tool to influence the composition of the microbial community and to stabilize the process after overloading with oil. PMID:27681911

  17. NC CATCH: Advancing Public Health Analytics.

    PubMed

    Studnicki, James; Fisher, John W; Eichelberger, Christopher; Bridger, Colleen; Angelon-Gaetz, Kim; Nelson, Debi

    2010-01-01

    The North Carolina Comprehensive Assessment for Tracking Community Health (NC CATCH) is a Web-based analytical system deployed to local public health units and their community partners. The system has the following characteristics: a flexible, powerful online analytic processing (OLAP) interface; multiple sources of multidimensional, event-level data fully conformed to common definitions in a data warehouse structure; enabled utilization of available decision support software tools; analytic capabilities distributed and optimized locally with centralized technical infrastructure; two levels of access differentiated by the user (anonymous versus registered) and by the analytical flexibility (Community Profile versus Design Phase); and an emphasis on user training and feedback. The ability of local public health units to engage in outcomes-based performance measurement will be influenced by continuing access to event-level data, developments in evidence-based practice for improving population health, and the application of information technology-based analytic tools and methods.
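
    The system's OLAP interface slices event-level records conformed to common definitions. As a loose illustration of that style of query (not the NC CATCH implementation), the sketch below aggregates invented event-level records by county, year, and cause using pandas.

```python
import pandas as pd

# Invented event-level records; column names and values are illustrative only.
events = pd.DataFrame({
    "county": ["A", "A", "B", "B", "B", "C"],
    "year":   [2008, 2009, 2008, 2009, 2009, 2009],
    "cause":  ["injury", "cardiac", "injury", "injury", "cardiac", "cardiac"],
    "deaths": [3, 5, 2, 4, 6, 1],
})

# "Dice" the cube: deaths by county and year, then a slice for one cause
cube = events.pivot_table(index="county", columns="year",
                          values="deaths", aggfunc="sum", fill_value=0)
print(cube)
print(events[events.cause == "cardiac"].groupby("county")["deaths"].sum())
```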

  18. Big Data Analytics in Healthcare

    PubMed Central

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S. M. Reza; Navidi, Fatemeh; Beard, Daniel A.; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined. PMID:26229957

  19. Big Data Analytics in Healthcare.

    PubMed

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  20. Analytical Chemistry of Nitric Oxide

    PubMed Central

    Hetrick, Evan M.

    2013-01-01

    Nitric oxide (NO) is the focus of intense research, owing primarily to its wide-ranging biological and physiological actions. A requirement for understanding its origin, activity, and regulation is the need for accurate and precise measurement techniques. Unfortunately, analytical assays for monitoring NO are challenged by NO’s unique chemical and physical properties, including its reactivity, rapid diffusion, and short half-life. Moreover, NO concentrations may span pM to µM in physiological milieu, requiring techniques with wide dynamic response ranges. Despite such challenges, many analytical techniques have emerged for the detection of NO. Herein, we review the most common spectroscopic and electrochemical methods, with special focus on the fundamentals behind each technique and approaches that have been coupled with modern analytical measurement tools or exploited to create novel NO sensors. PMID:20636069

  1. An Analysis of Earth Science Data Analytics Use Cases

    NASA Technical Reports Server (NTRS)

    Shie, Chung-Lin; Kempler, Steve

    2014-01-01

    The increase in the number, volume, and sources of globally available Earth science data measurements and datasets has afforded Earth scientists and applications researchers unprecedented opportunities to study our Earth in ever more sophisticated ways. In fact, the NASA Earth Observing System Data Information System (EOSDIS) archives have doubled from 2007 to 2014, to 9.1 PB (Ramapriyan, 2009; and https://earthdata.nasa.gov/about/system-performance). In addition, other US agency and international programs, field experiments, ground stations, and citizen scientists provide a plethora of additional sources for studying Earth. Co-analyzing huge amounts of heterogeneous data to glean out unobvious information is a daunting task. Earth science data analytics (ESDA) is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations, and other useful information. It can include Data Preparation, Data Reduction, and Data Analysis. Through work associated with the Earth Science Information Partners (ESIP) Federation, a collection of Earth science data analytics use cases has been collected and analyzed for the purpose of extracting the types of Earth science data analytics employed and the requirements for data analytics tools and techniques yet to be implemented, based on use case needs. The ESIP-generated use case template, the ESDA use cases, use case types, and a preliminary use case analysis (this is a work in progress) will be presented.

  2. Analytical SAR-GMTI principles

    NASA Astrophysics Data System (ADS)

    Soumekh, Mehrdad; Majumder, Uttam K.; Barnes, Christopher; Sobota, David; Minardi, Michael

    2016-05-01

    This paper provides analytical principles to relate the signature of a moving target to parameters in a SAR system. Our objective is to establish analytical tools that could predict the shift and smearing of a moving target in a subaperture SAR image. Hence, a user could identify the system parameters such as the coherent processing interval for a subaperture that is suitable to localize the signature of a moving target for detection, tracking and geolocating the moving target. The paper begins by outlining two well-known SAR data collection methods to detect moving targets. One uses a scanning beam in the azimuth domain with a relatively high PRF to separate the moving targets and the stationary background (clutter); this is also known as Doppler Beam Sharpening. The other scheme uses two receivers along the track to null the clutter and, thus, provide GMTI. We also present results on implementing our SAR-GMTI analytical principles for the anticipated shift and smearing of a moving target in a simulated code. The code would provide a tool for the user to change the SAR system and moving target parameters, and predict the properties of a moving target signature in a subaperture SAR image for a scene that is composed of both stationary and moving targets. Hence, the SAR simulation and imaging code could be used to demonstrate the validity and accuracy of the above analytical principles to predict the properties of a moving target signature in a subaperture SAR image.
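
    The abstract does not reproduce its analytical relations. As a hedged sketch of the textbook quantities such principles build on, the snippet below evaluates the classical Doppler shift from radial target motion, the resulting azimuth displacement of roughly R*v_r/V_p in the focused image, and the subaperture azimuth resolution; the numbers and the simple broadside-stripmap geometry are illustrative assumptions, not the paper's full model.

```python
import numpy as np

c     = 3.0e8
f0    = 10.0e9                 # X-band carrier frequency (Hz), illustrative
lam   = c / f0
V_p   = 150.0                  # platform speed (m/s), illustrative
R     = 20.0e3                 # slant range (m), illustrative
v_r   = 5.0                    # target radial (line-of-sight) velocity (m/s)
T_cpi = 0.5                    # subaperture coherent processing interval (s)

f_d  = 2.0 * v_r / lam                   # Doppler shift from radial motion (Hz)
dx   = R * v_r / V_p                     # apparent azimuth displacement (m)
dres = lam * R / (2.0 * V_p * T_cpi)     # subaperture azimuth resolution (m)

print(f"Doppler shift {f_d:.1f} Hz, azimuth shift {dx:.0f} m, "
      f"subaperture resolution {dres:.2f} m")
```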

  3. The challenge of big data in public health: an opportunity for visual analytics.

    PubMed

    Ola, Oluwakemi; Sedig, Kamran

    2014-01-01

    Public health (PH) data can generally be characterized as big data. The efficient and effective use of this data determines the extent to which PH stakeholders can sufficiently address societal health concerns as they engage in a variety of work activities. As stakeholders interact with data, they engage in various cognitive activities such as analytical reasoning, decision-making, interpreting, and problem solving. Performing these activities with big data is a challenge for the unaided mind as stakeholders encounter obstacles relating to the data's volume, variety, velocity, and veracity. Such being the case, computer-based information tools are needed to support PH stakeholders. Unfortunately, while existing computational tools are beneficial in addressing certain work activities, they fall short in supporting cognitive activities that involve working with large, heterogeneous, and complex bodies of data. This paper presents visual analytics (VA) tools, a nascent category of computational tools that integrate data analytics with interactive visualizations, to facilitate the performance of cognitive activities involving big data. Historically, PH has lagged behind other sectors in embracing new computational technology. In this paper, we discuss the role that VA tools can play in addressing the challenges presented by big data. In doing so, we demonstrate the potential benefit of incorporating VA tools into PH practice, in addition to highlighting the need for further systematic and focused research.

  4. The Challenge of Big Data in Public Health: An Opportunity for Visual Analytics

    PubMed Central

    Ola, Oluwakemi; Sedig, Kamran

    2014-01-01

    Public health (PH) data can generally be characterized as big data. The efficient and effective use of this data determines the extent to which PH stakeholders can sufficiently address societal health concerns as they engage in a variety of work activities. As stakeholders interact with data, they engage in various cognitive activities such as analytical reasoning, decision-making, interpreting, and problem solving. Performing these activities with big data is a challenge for the unaided mind as stakeholders encounter obstacles relating to the data’s volume, variety, velocity, and veracity. Such being the case, computer-based information tools are needed to support PH stakeholders. Unfortunately, while existing computational tools are beneficial in addressing certain work activities, they fall short in supporting cognitive activities that involve working with large, heterogeneous, and complex bodies of data. This paper presents visual analytics (VA) tools, a nascent category of computational tools that integrate data analytics with interactive visualizations, to facilitate the performance of cognitive activities involving big data. Historically, PH has lagged behind other sectors in embracing new computational technology. In this paper, we discuss the role that VA tools can play in addressing the challenges presented by big data. In doing so, we demonstrate the potential benefit of incorporating VA tools into PH practice, in addition to highlighting the need for further systematic and focused research. PMID:24678376

  5. Analytics for Education

    ERIC Educational Resources Information Center

    MacNeill, Sheila; Campbell, Lorna M.; Hawksey, Martin

    2014-01-01

    This article presents an overview of the development and use of analytics in the context of education. Using Buckingham Shum's three levels of analytics, the authors present a critical analysis of current developments in the domain of learning analytics, and contrast the potential value of analytics research and development with real world…

  6. Let's Talk... Analytics

    ERIC Educational Resources Information Center

    Oblinger, Diana G.

    2012-01-01

    Talk about analytics seems to be everywhere. Everyone is talking about analytics. Yet even with all the talk, many in higher education have questions about--and objections to--using analytics in colleges and universities. In this article, the author explores the use of analytics in, and all around, higher education. (Contains 1 note.)

  7. Nutritional Lipidomics: Molecular Metabolism, Analytics, and Diagnostics

    PubMed Central

    Smilowitz, Jennifer T.; Zivkovic, Angela M.; Wan, Yu-Jui Yvonne; Watkins, Steve M.; Nording, Malin L.; Hammock, Bruce D.; German, J. Bruce

    2013-01-01

    The field of lipidomics is providing nutritional science a more comprehensive view of lipid intermediates. Lipidomics research takes advantage of the increase in accuracy and sensitivity of mass detection of mass spectrometry with new bioinformatics toolsets to characterize the structures and abundances of complex lipids. Yet, translating lipidomics to practice via nutritional interventions is still in its infancy. No single instrumentation platform is able to solve the varying analytical challenges of the different molecular lipid species. Biochemical pathways of lipid metabolism remain incomplete and the tools to map lipid compositional data to pathways are still being assembled. Biology itself is dauntingly complex and simply separating biological structures remains a key challenge to lipidomics. Nonetheless, the strategy of combining tandem analytical methods to perform the sensitive, high-throughput, quantitative and comprehensive analysis of lipid metabolites of very large numbers of molecules is poised to drive the field forward rapidly. Among the next steps for nutrition to understand the changes in structures, compositions and function of lipid biomolecules in response to diet is to describe their distribution within discrete functional compartments: lipoproteins. Additionally, lipidomics must tackle the task of assigning the functions of lipids as signaling molecules, nutrient sensors, and intermediates of metabolic pathways. PMID:23818328

  8. Nutritional lipidomics: molecular metabolism, analytics, and diagnostics.

    PubMed

    Smilowitz, Jennifer T; Zivkovic, Angela M; Wan, Yu-Jui Yvonne; Watkins, Steve M; Nording, Malin L; Hammock, Bruce D; German, J Bruce

    2013-08-01

    The field of lipidomics is providing nutritional science a more comprehensive view of lipid intermediates. Lipidomics research takes advantage of the increase in accuracy and sensitivity of mass detection of MS with new bioinformatics toolsets to characterize the structures and abundances of complex lipids. Yet, translating lipidomics to practice via nutritional interventions is still in its infancy. No single instrumentation platform is able to solve the varying analytical challenges of the different molecular lipid species. Biochemical pathways of lipid metabolism remain incomplete and the tools to map lipid compositional data to pathways are still being assembled. Biology itself is dauntingly complex and simply separating biological structures remains a key challenge to lipidomics. Nonetheless, the strategy of combining tandem analytical methods to perform the sensitive, high-throughput, quantitative, and comprehensive analysis of lipid metabolites of very large numbers of molecules is poised to drive the field forward rapidly. Among the next steps for nutrition to understand the changes in structures, compositions, and function of lipid biomolecules in response to diet is to describe their distribution within discrete functional compartments: lipoproteins. Additionally, lipidomics must tackle the task of assigning the functions of lipids as signaling molecules, nutrient sensors, and intermediates of metabolic pathways.

  9. MATLAB/Simulink analytic radar modeling environment

    NASA Astrophysics Data System (ADS)

    Esken, Bruce L.; Clayton, Brian L.

    2001-09-01

    Analytic radar models are simulations based on abstract representations of the radar, the RF environment through which radar signals are propagated, and the reflections produced by targets, clutter and multipath. These models have traditionally been developed in FORTRAN and have evolved over the last 20 years into efficient and well-accepted codes. However, current models are limited in two primary areas. First, by the nature of algorithm-based analytical models, they can be difficult for non-programmers to understand and equally difficult to modify or extend. Second, there is strong interest in re-using these models to support higher-level weapon system and mission level simulations. To address these issues, a model development approach has been demonstrated which utilizes the MATLAB/Simulink graphical development environment. Because the MATLAB/Simulink environment graphically represents model algorithms - thus providing visibility into the model - algorithms can be easily analyzed and modified by engineers and analysts with limited software skills. In addition, software tools have been created that provide for the automatic code generation of C++ objects. These objects are created with well-defined interfaces enabling them to be used by modeling architectures external to the MATLAB/Simulink environment. The approach utilized is generic and can be extended to other engineering fields.
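
    As a rough illustration of what such an abstract, analytic representation computes at its core, the sketch below evaluates the classic single-target radar range equation in Python. The function name, parameter values, and default losses are assumptions made for illustration; this is not code generated from, or contained in, the model described above.

        import math

        def received_snr_db(pt_w, gain_db, wavelength_m, rcs_m2, range_m,
                            noise_figure_db=5.0, bandwidth_hz=1e6, losses_db=3.0):
            """Single-pulse SNR from the classic radar range equation (illustrative only)."""
            k = 1.380649e-23   # Boltzmann constant [J/K]
            t0 = 290.0         # reference noise temperature [K]
            g = 10 ** (gain_db / 10)
            losses = 10 ** (losses_db / 10)
            noise_w = k * t0 * bandwidth_hz * 10 ** (noise_figure_db / 10)
            pr_w = (pt_w * g**2 * wavelength_m**2 * rcs_m2) / ((4 * math.pi)**3 * range_m**4 * losses)
            return 10 * math.log10(pr_w / noise_w)

        # Example: 1 kW peak power, 30 dB antenna gain, X-band (3 cm), 1 m^2 target at 20 km
        print(round(received_snr_db(1e3, 30.0, 0.03, 1.0, 20e3), 1), "dB")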

  10. A review of the analytical simulation of aircraft crash dynamics

    NASA Technical Reports Server (NTRS)

    Fasanella, Edwin L.; Carden, Huey D.; Boitnott, Richard L.; Hayduk, Robert J.

    1990-01-01

    A large number of full scale tests of general aviation aircraft, helicopters, and one unique air-to-ground controlled impact of a transport aircraft were performed. Additionally, research was conducted on seat dynamic performance, load-limiting seats, load-limiting subfloor designs, and emergency locator transmitters (ELTs). Computer programs were developed to provide designers with methods for predicting accelerations, velocities, and displacements of collapsing structure and for estimating the human response to crash loads. The results of full scale aircraft and component tests were used to verify and guide the development of analytical simulation tools and to demonstrate impact load attenuating concepts. Analytical simulation of metal and composite aircraft crash dynamics is addressed. Finite element models are examined to determine their degree of corroboration by experimental data and to reveal deficiencies requiring further development.

  11. Multimedia Analysis plus Visual Analytics = Multimedia Analytics

    SciTech Connect

    Chinchor, Nancy; Thomas, James J.; Wong, Pak C.; Christel, Michael; Ribarsky, Martin W.

    2010-10-01

    Multimedia analysis has focused on images, video, and to some extent audio and has made progress in single channels excluding text. Visual analytics has focused on the user interaction with data during the analytic process plus the fundamental mathematics and has continued to treat text as did its precursor, information visualization. The general problem we address in this tutorial is the combining of multimedia analysis and visual analytics to deal with multimedia information gathered from different sources, with different goals or objectives, and containing all media types and combinations in common usage.

  12. Analytical Challenges in Biotechnology.

    ERIC Educational Resources Information Center

    Glajch, Joseph L.

    1986-01-01

    Highlights five major analytical areas (electrophoresis, immunoassay, chromatographic separations, protein and DNA sequencing, and molecular structures determination) and discusses how analytical chemistry could further improve these techniques and thereby have a major impact on biotechnology. (JN)

  13. Analytics for Metabolic Engineering

    PubMed Central

    Petzold, Christopher J.; Chan, Leanne Jade G.; Nhan, Melissa; Adams, Paul D.

    2015-01-01

    Realizing the promise of metabolic engineering has been slowed by challenges related to moving beyond proof-of-concept examples to robust and economically viable systems. Key to advancing metabolic engineering beyond trial-and-error research is access to parts with well-defined performance metrics that can be readily applied in vastly different contexts with predictable effects. As the field now stands, research depends greatly on analytical tools that assay target molecules, transcripts, proteins, and metabolites across different hosts and pathways. Screening technologies yield specific information for many thousands of strain variants, while deep omics analysis provides a systems-level view of the cell factory. Efforts focused on a combination of these analyses yield quantitative information of dynamic processes between parts and the host chassis that drive the next engineering steps. Overall, the data generated from these types of assays aid better decision-making at the design and strain construction stages to speed progress in metabolic engineering research. PMID:26442249

  14. Shot Planning and Analysis Tools

    SciTech Connect

    Casey, A; Beeler, R; Conder, A; Fallejo, R; Flegel, M; Hutton, M; Jancaitis, K; Lakamsani, V; Potter, D; Reisdorf, S; Tappero, J; Whitman, P; Carr, W; Liao, Z

    2011-07-25

    Shot planning and analysis tools (SPLAT) integrate components necessary to help achieve a high over-all operational efficiency of the National Ignition Facility (NIF) by combining near and long-term shot planning, final optics demand and supply loops, target diagnostics planning, and target fabrication requirements. Currently, the SPLAT project is comprised of two primary tool suites for shot planning and optics demand. The shot planning component provides a web-based interface to selecting and building a sequence of proposed shots for the NIF. These shot sequences, or 'lanes' as they are referred to by shot planners, provide for planning both near-term shots in the Facility and long-term 'campaigns' in the months and years to come. The shot planning capabilities integrate with the Configuration Management Tool (CMT) for experiment details and the NIF calendar for availability. Future enhancements will additionally integrate with target diagnostics planning and target fabrication requirements tools. The optics demand component is built upon predictive modelling of maintenance requirements on the final optics as a result of the proposed shots assembled during shot planning. The predictive models integrate energetics from a Laser Performance Operations Model (LPOM), the status of the deployed optics as provided by the online Final Optics Inspection system, and physics-based mathematical 'rules' that predict optic flaw growth and new flaw initiations. These models are then run on an analytical cluster comprised of forty-eight Linux-based compute nodes. Results from the predictive models are used to produce decision-support reports in the areas of optics inspection planning, optics maintenance exchanges, and optics beam blocker placement advisories. Over time, the SPLAT project will evolve to provide a variety of decision-support and operation optimization tools.

  15. Deriving Earth Science Data Analytics Requirements

    NASA Technical Reports Server (NTRS)

    Kempler, Steven J.

    2015-01-01

    Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to be able to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  16. Analyticity without Differentiability

    ERIC Educational Resources Information Center

    Kirillova, Evgenia; Spindler, Karlheinz

    2008-01-01

    In this article we derive all salient properties of analytic functions, including the analytic version of the inverse function theorem, using only the most elementary convergence properties of series. Not even the notion of differentiability is required to do so. Instead, analytical arguments are replaced by combinatorial arguments exhibiting…
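
    For reference, the series-based definition on which such a treatment rests can be stated as follows (a standard textbook definition, not a quotation from the article): a function f is analytic at x_0 if there exist coefficients a_n and a radius r > 0 such that

        f(x) = \sum_{n=0}^{\infty} a_n (x - x_0)^n \qquad \text{for all } |x - x_0| < r,

    with the series converging on that interval; all further properties are then derived from manipulations of such series rather than from differentiability.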

  17. SNL software manual for the ACS Data Analytics Project.

    SciTech Connect

    Stearley, Jon R.; McLendon, William Clarence, III; Rodrigues, Arun F.; Williams, Aaron S.; Hooper, Russell Warren; Robinson, David Gerald; Stickland, Michael G.

    2011-10-01

    In the ACS Data Analytics Project (also known as 'YumYum'), a supercomputer is modeled as a graph of components and dependencies, jobs and faults are simulated, and component fault rates are estimated using the graph structure and job pass/fail outcomes. This report documents the successful completion of all SNL deliverables and tasks, describes the software written by SNL for the project, and presents the data it generates. Readers should understand what the software tools are, how they fit together, and how to use them to reproduce the presented data and additional experiments as desired. The SNL YumYum tools provide the novel simulation and inference capabilities desired by ACS. SNL also developed and implemented a new algorithm, which provides faster estimates, at finer component granularity, on arbitrary directed acyclic graphs.
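
    A minimal sketch of the general idea (components as graph nodes, fault rates estimated from job pass/fail outcomes) is given below. The dependency graph, job data, and the naive per-component estimator are invented for illustration; this is not the YumYum algorithm or the faster inference algorithm the report describes.

        from collections import defaultdict

        # Toy dependency graph: each compute node depends on a switch and a filesystem.
        dependencies = {
            "node1": ["switch_a", "fs"],
            "node2": ["switch_a", "fs"],
            "node3": ["switch_b", "fs"],
        }

        def components_used(node):
            """A job touches the node it ran on plus everything that node depends on."""
            return [node] + dependencies[node]

        # Simulated job outcomes: (node the job ran on, job passed?)
        jobs = [("node1", True), ("node1", False), ("node2", True),
                ("node3", False), ("node3", False), ("node2", True)]

        uses, fails = defaultdict(int), defaultdict(int)
        for node, passed in jobs:
            for comp in components_used(node):
                uses[comp] += 1
                if not passed:
                    fails[comp] += 1

        for comp in sorted(uses):
            print(f"{comp}: naive fault rate {fails[comp] / uses[comp]:.2f} "
                  f"({fails[comp]} of {uses[comp]} jobs failed)")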

  18. Immediate tool incorporation processes determine human motor planning with tools.

    PubMed

    Ganesh, G; Yoshioka, T; Osu, R; Ikegami, T

    2014-01-01

    Human dexterity with tools is believed to stem from our ability to incorporate and use tools as parts of our body. However tool incorporation, evident as extensions in our body representation and peri-personal space, has been observed predominantly after extended tool exposures and does not explain our immediate motor behaviours when we change tools. Here we utilize two novel experiments to elucidate the presence of additional immediate tool incorporation effects that determine motor planning with tools. Interestingly, tools were observed to immediately induce a trial-by-trial, tool length dependent shortening of the perceived limb lengths, opposite to observations of elongations after extended tool use. Our results thus exhibit that tools induce a dual effect on our body representation; an immediate shortening that critically affects motor planning with a new tool, and the slow elongation, probably a consequence of skill related changes in sensory-motor mappings with the repeated use of the tool. PMID:25077612

  19. A graph algebra for scalable visual analytics.

    PubMed

    Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V

    2012-01-01

    Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide these tools' creation. Increased data demands on computing require redesigning VA tools to consider performance and reliability in the context of analysis of exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data. PMID:24806630
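
    To make the flavour of such atomic operators concrete, the sketch below implements a selection operator (induced subgraph on a node predicate) and an aggregation operator (collapsing nodes that share an attribute value) with networkx. The attribute names and exact operator semantics are assumptions for illustration and do not reproduce the algebra as formally defined in the paper.

        import networkx as nx

        G = nx.Graph()
        G.add_nodes_from([
            ("a", {"dept": "sales", "score": 3}),
            ("b", {"dept": "sales", "score": 7}),
            ("c", {"dept": "eng",   "score": 5}),
            ("d", {"dept": "eng",   "score": 9}),
        ])
        G.add_edges_from([("a", "b"), ("b", "c"), ("c", "d"), ("a", "d")])

        def select(graph, predicate):
            """Selection: induced subgraph on the nodes whose attributes satisfy a predicate."""
            keep = [n for n, data in graph.nodes(data=True) if predicate(data)]
            return graph.subgraph(keep).copy()

        def aggregate(graph, key):
            """Aggregation: collapse nodes sharing an attribute value into supernodes."""
            agg = nx.Graph()
            for _, data in graph.nodes(data=True):
                agg.add_node(data[key])
            for u, v in graph.edges():
                gu, gv = graph.nodes[u][key], graph.nodes[v][key]
                if gu != gv:
                    agg.add_edge(gu, gv)
            return agg

        print(select(G, lambda d: d["score"] > 4).nodes())   # high-score subgraph
        print(aggregate(G, "dept").edges())                  # department-level view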

  20. A graph algebra for scalable visual analytics.

    PubMed

    Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V

    2012-01-01

    Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide these tools' creation. Increased data demands on computing require redesigning VA tools to consider performance and reliability in the context of analysis of exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data.

  1. ARVO-CL: The OpenCL version of the ARVO package — An efficient tool for computing the accessible surface area and the excluded volume of proteins via analytical equations

    NASA Astrophysics Data System (ADS)

    Buša, Ján; Hayryan, Shura; Wu, Ming-Chya; Buša, Ján; Hu, Chin-Kun

    2012-11-01

    Introduction of Graphical Processing Units (GPUs) and computing using GPUs in recent years opened possibilities for simple parallelization of programs. In this update, we present the modernized version of program ARVO [J. Buša, J. Dzurina, E. Hayryan, S. Hayryan, C.-K. Hu, J. Plavka, I. Pokorný, J. Skivánek, M.-C. Wu, Comput. Phys. Comm. 165 (2005) 59]. The whole package has been rewritten in the C language and parallelized using OpenCL. Some new tricks have been added to the algorithm in order to save memory, which is much needed for efficient usage of graphics cards. A new tool called ‘input_structure’ was added for conversion of pdb files into files suitable for work with the C and OpenCL version of ARVO.
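
    As a rough sketch of the kind of conversion a helper such as 'input_structure' performs, the snippet below reads ATOM/HETATM records from a PDB file and emits x, y, z coordinates plus an atomic radius enlarged by a solvent probe. The radius table is a small illustrative subset, and the exact input format expected by ARVO is not reproduced here.

        # Column positions follow the standard PDB fixed-width format.
        VDW_RADII = {"C": 1.70, "N": 1.55, "O": 1.52, "S": 1.80, "H": 1.20}

        def atoms_from_pdb(path, probe=1.4):
            """Yield (x, y, z, radius + probe) for each ATOM/HETATM record."""
            with open(path) as handle:
                for line in handle:
                    if not line.startswith(("ATOM", "HETATM")):
                        continue
                    x = float(line[30:38])
                    y = float(line[38:46])
                    z = float(line[46:54])
                    element = line[76:78].strip() or line[12:16].strip()[0]
                    yield x, y, z, VDW_RADII.get(element, 1.70) + probe

        # Usage (the file name is hypothetical):
        # for atom in atoms_from_pdb("protein.pdb"):
        #     print("%8.3f %8.3f %8.3f %6.3f" % atom)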

  2. Analytical Chemistry in Russia.

    PubMed

    Zolotov, Yuri

    2016-09-01

    Research in Russian analytical chemistry (AC) is carried out on a significant scale, and the analytical service solves practical tasks of geological survey, environmental protection, medicine, industry, agriculture, etc. The education system trains highly skilled professionals in AC. The development, and especially the manufacturing, of analytical instruments should be improved; even so, there are several good domestic instruments, and others satisfy some requirements. Russian AC has rather good historical roots.

  3. Science Update: Analytical Chemistry.

    ERIC Educational Resources Information Center

    Worthy, Ward

    1980-01-01

    Briefly discusses new instrumentation in the field of analytical chemistry. Advances in liquid chromatography, photoacoustic spectroscopy, the use of lasers, and mass spectrometry are also discussed. (CS)

  4. Graphical Contingency Analysis Tool

    SciTech Connect

    2010-03-02

    GCA is a visual analytic tool for power grid contingency analysis to provide more decision support for power grid operations. GCA allows power grid operators to quickly gain situational awareness of the power grid by converting large amounts of operational data to the graphic domain with a color-contoured map; identify system trends and foresee and discern emergencies by performing trending analysis; identify the relationships between system configurations and affected assets by conducting clustering analysis; and identify the best action by interactively evaluating candidate actions.

  5. Analytical Aspects of the Implementation of Biomarkers in Clinical Transplantation.

    PubMed

    Shipkova, Maria; López, Olga Millán; Picard, Nicolas; Noceti, Ofelia; Sommerer, Claudia; Christians, Uwe; Wieland, Eberhard

    2016-04-01

    In response to the urgent need for new reliable biomarkers to complement the guidance of the immunosuppressive therapy, a huge number of biomarker candidates to be implemented in clinical practice have been introduced to the transplant community. This includes a diverse range of molecules with very different molecular weights, chemical and physical properties, ex vivo stabilities, in vivo kinetic behaviors, and levels of similarity to other molecules, etc. In addition, a large body of different analytical techniques and assay protocols can be used to measure biomarkers. Sometimes, a complex software-based data evaluation is a prerequisite for appropriate interpretation of the results and for their reporting. Although some analytical procedures are of great value for research purposes, they may be too complex for implementation in a clinical setting. Whereas the proof of "fitness for purpose" is appropriate for validation of biomarker assays used in exploratory drug development studies, a higher level of analytical validation must be achieved and eventually advanced analytical performance might be necessary before diagnostic application in transplantation medicine. A high level of consistency of results between laboratories and between methods (if applicable) should be obtained and maintained to make biomarkers effective instruments in support of therapeutic decisions. This overview focuses on preanalytical and analytical aspects to be considered for the implementation of new biomarkers for adjusting immunosuppression in a clinical setting and highlights critical points to be addressed on the way to make them suitable as diagnostic tools. These include but are not limited to appropriate method validation, standardization, education, automation, and commercialization.

  6. Lynx: a knowledge base and an analytical workbench for integrative medicine.

    PubMed

    Sulakhe, Dinanath; Xie, Bingqing; Taylor, Andrew; D'Souza, Mark; Balasubramanian, Sandhya; Hashemifar, Somaye; White, Steven; Dave, Utpal J; Agam, Gady; Xu, Jinbo; Wang, Sheng; Gilliam, T Conrad; Maltsev, Natalia

    2016-01-01

    Lynx (http://lynx.ci.uchicago.edu) is a web-based database and a knowledge extraction engine. It supports annotation and analysis of high-throughput experimental data and generation of weighted hypotheses regarding genes and molecular mechanisms contributing to human phenotypes or conditions of interest. Since the last release, the Lynx knowledge base (LynxKB) has been periodically updated with the latest versions of the existing databases and supplemented with additional information from public databases. These additions have enriched the data annotations provided by Lynx and improved the performance of Lynx analytical tools. Moreover, the Lynx analytical workbench has been supplemented with new tools for reconstruction of co-expression networks and feature-and-network-based prioritization of genetic factors and molecular mechanisms. These developments facilitate the extraction of meaningful knowledge from experimental data and LynxKB. The Service Oriented Architecture provides public access to LynxKB and its analytical tools via user-friendly web services and interfaces.

  7. Lynx: a knowledge base and an analytical workbench for integrative medicine.

    PubMed

    Sulakhe, Dinanath; Xie, Bingqing; Taylor, Andrew; D'Souza, Mark; Balasubramanian, Sandhya; Hashemifar, Somaye; White, Steven; Dave, Utpal J; Agam, Gady; Xu, Jinbo; Wang, Sheng; Gilliam, T Conrad; Maltsev, Natalia

    2016-01-01

    Lynx (http://lynx.ci.uchicago.edu) is a web-based database and a knowledge extraction engine. It supports annotation and analysis of high-throughput experimental data and generation of weighted hypotheses regarding genes and molecular mechanisms contributing to human phenotypes or conditions of interest. Since the last release, the Lynx knowledge base (LynxKB) has been periodically updated with the latest versions of the existing databases and supplemented with additional information from public databases. These additions have enriched the data annotations provided by Lynx and improved the performance of Lynx analytical tools. Moreover, the Lynx analytical workbench has been supplemented with new tools for reconstruction of co-expression networks and feature-and-network-based prioritization of genetic factors and molecular mechanisms. These developments facilitate the extraction of meaningful knowledge from experimental data and LynxKB. The Service Oriented Architecture provides public access to LynxKB and its analytical tools via user-friendly web services and interfaces. PMID:26590263

  8. Standardization guide for construction and use of MORT-type analytic trees

    SciTech Connect

    Buys, J.R.

    1992-02-01

    Since the introduction of MORT (Management Oversight and Risk Tree) technology as a tool for evaluating the success or failure of safety management systems, there has been a proliferation of analytic trees throughout US Department of Energy (DOE) and its contractor organizations. Standard "fault tree" symbols have generally been used in logic diagram or tree construction, but new or revised symbols have also been adopted by various analysts. Additionally, a variety of numbering systems have been used for event identification. The consequent lack of standardization has caused some difficulties in interpreting the trees and following their logic. This guide seeks to correct this problem by providing a standardized system for construction and use of analytic trees. Future publications of the DOE System Safety Development Center (SSDC) will adhere to this guide. It is recommended that other DOE organizations and contractors also adopt this system to achieve intra-DOE uniformity in analytic tree construction.
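
    The sketch below shows, in miniature, what a MORT-type analytic tree with AND/OR gates and a hierarchical event-numbering scheme looks like when represented and evaluated programmatically. The event names, numbering convention, and evaluation rule are invented for illustration and do not reproduce the SSDC standard defined in the guide.

        class Event:
            def __init__(self, number, label, gate=None, children=None, state=True):
                self.number, self.label = number, label
                self.gate, self.children = gate, children or []
                self.state = state  # for basic events: True = adequate, False = failed

            def adequate(self):
                if not self.children:
                    return self.state
                results = [child.adequate() for child in self.children]
                return all(results) if self.gate == "AND" else any(results)

        tree = Event("1", "Injury prevented", "AND", [
            Event("1.1", "Hazard controlled", "OR", [
                Event("1.1.1", "Barrier in place", state=False),
                Event("1.1.2", "Procedure followed", state=True),
            ]),
            Event("1.2", "Management oversight adequate", state=True),
        ])
        print("Top event adequate:", tree.adequate())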

  9. Learning Analytics Considered Harmful

    ERIC Educational Resources Information Center

    Dringus, Laurie P.

    2012-01-01

    This essay is written to present a prospective stance on how learning analytics, as a core evaluative approach, must help instructors uncover the important trends and evidence of quality learner data in the online course. A critique is presented of strategic and tactical issues of learning analytics. The approach to the critique is taken through…

  10. Validating Analytical Methods

    ERIC Educational Resources Information Center

    Ember, Lois R.

    1977-01-01

    The procedures utilized by the Association of Official Analytical Chemists (AOAC) to develop, evaluate, and validate analytical methods for the analysis of chemical pollutants are detailed. Methods validated by AOAC are used by the EPA and FDA in their enforcement programs and are granted preferential treatment by the courts. (BT)

  11. Analytical mass spectrometry

    SciTech Connect

    Not Available

    1990-01-01

    This 43rd Annual Summer Symposium on Analytical Chemistry was held July 24--27, 1990 at Oak Ridge, TN and contained sessions on the following topics: Fundamentals of Analytical Mass Spectrometry (MS), MS in the National Laboratories, Lasers and Fourier Transform Methods, Future of MS, New Ionization and LC/MS Methods, and an extra session. (WET)

  12. Analytical mass spectrometry. Abstracts

    SciTech Connect

    Not Available

    1990-12-31

    This 43rd Annual Summer Symposium on Analytical Chemistry was held July 24--27, 1990 at Oak Ridge, TN and contained sessions on the following topics: Fundamentals of Analytical Mass Spectrometry (MS), MS in the National Laboratories, Lasers and Fourier Transform Methods, Future of MS, New Ionization and LC/MS Methods, and an extra session. (WET)

  13. Extreme Scale Visual Analytics

    SciTech Connect

    Wong, Pak C.; Shen, Han-Wei; Pascucci, Valerio

    2012-05-08

    Extreme-scale visual analytics (VA) is about applying VA to extreme-scale data. The articles in this special issue examine advances related to extreme-scale VA problems, their analytical and computational challenges, and their real-world applications.

  14. Signals: Applying Academic Analytics

    ERIC Educational Resources Information Center

    Arnold, Kimberly E.

    2010-01-01

    Academic analytics helps address the public's desire for institutional accountability with regard to student success, given the widespread concern over the cost of higher education and the difficult economic and budgetary conditions prevailing worldwide. Purdue University's Signals project applies the principles of analytics widely used in…

  15. Teaching the Analytical Life

    ERIC Educational Resources Information Center

    Jackson, Brian

    2010-01-01

    Using a survey of 138 writing programs, I argue that we must be more explicit about what we think students should get out of analysis to make it more likely that students will transfer their analytical skills to different settings. To ensure our students take analytical skills with them at the end of the semester, we must simplify the task we…

  16. Analytical Aspects of Hydrogen Exchange Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Engen, John R.; Wales, Thomas E.

    2015-07-01

    This article reviews the analytical aspects of measuring hydrogen exchange by mass spectrometry (HX MS). We describe the nature of analytical selectivity in hydrogen exchange, then review the analytical tools required to accomplish fragmentation, separation, and the mass spectrometry measurements under restrictive exchange quench conditions. In contrast to analytical quantitation that relies on measurements of peak intensity or area, quantitation in HX MS depends on measuring a mass change with respect to an undeuterated or deuterated control, resulting in a value between zero and the maximum amount of deuterium that can be incorporated. Reliable quantitation is a function of experimental fidelity and to achieve high measurement reproducibility, a large number of experimental variables must be controlled during sample preparation and analysis. The method also reports on important qualitative aspects of the sample, including conformational heterogeneity and population dynamics.
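
    One commonly used form of this control-based quantitation (stated here as a standard relation, not quoted from the article) expresses relative deuterium uptake at exchange time t as

        D(t) = \frac{m(t) - m_{0\%}}{m_{100\%} - m_{0\%}} \times N,

    where m(t) is the measured centroid mass, m_{0%} and m_{100%} are the undeuterated and fully deuterated control masses, and N is the number of exchangeable backbone amide hydrogens.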

  17. Quo vadis, analytical chemistry?

    PubMed

    Valcárcel, Miguel

    2016-01-01

    This paper presents an open, personal, fresh approach to the future of Analytical Chemistry in the context of the deep changes Science and Technology are anticipated to experience. Its main aim is to challenge young analytical chemists because the future of our scientific discipline is in their hands. A description of not completely accurate overall conceptions of our discipline, both past and present, to be avoided is followed by a flexible, integral definition of Analytical Chemistry and its cornerstones (viz., aims and objectives, quality trade-offs, the third basic analytical reference, the information hierarchy, social responsibility, independent research, transfer of knowledge and technology, interfaces to other scientific-technical disciplines, and well-oriented education). Obsolete paradigms, and more accurate general and specific ones that can be expected to provide the framework for our discipline in the coming years, are described. Finally, the three possible responses of analytical chemists to the proposed changes in our discipline are discussed. PMID:26631024

  18. Quo vadis, analytical chemistry?

    PubMed

    Valcárcel, Miguel

    2016-01-01

    This paper presents an open, personal, fresh approach to the future of Analytical Chemistry in the context of the deep changes Science and Technology are anticipated to experience. Its main aim is to challenge young analytical chemists because the future of our scientific discipline is in their hands. A description of not completely accurate overall conceptions of our discipline, both past and present, to be avoided is followed by a flexible, integral definition of Analytical Chemistry and its cornerstones (viz., aims and objectives, quality trade-offs, the third basic analytical reference, the information hierarchy, social responsibility, independent research, transfer of knowledge and technology, interfaces to other scientific-technical disciplines, and well-oriented education). Obsolete paradigms, and more accurate general and specific ones that can be expected to provide the framework for our discipline in the coming years, are described. Finally, the three possible responses of analytical chemists to the proposed changes in our discipline are discussed.

  19. OOTW Force Design Tools

    SciTech Connect

    Bell, R.E.; Hartley, D.S.III; Packard, S.L.

    1999-05-01

    This report documents refined requirements for tools to aid the process of force design in Operations Other Than War (OOTWs). It recommends actions for the creation of one tool and work on other tools relating to mission planning. It also identifies the governmental agencies and commands with interests in each tool, from whom should come the user advisory groups overseeing the respective tool development activities. The understanding of OOTWs and their analytical support requirements has matured to the point where action can be taken in three areas: force design, collaborative analysis, and impact analysis. While the nature of the action and the length of time before complete results can be expected depends on the area, in each case the action should begin immediately. Force design for OOTWs is not a technically difficult process. Like force design for combat operations, it is a process of matching the capabilities of forces against the specified and implied tasks of the operation, considering the constraints of logistics, transport and force availabilities. However, there is a critical difference that restricts the usefulness of combat force design tools for OOTWs: the combat tools are built to infer non-combat capability requirements from combat capability requirements and cannot reverse the direction of the inference, as is required for OOTWs. Recently, OOTWs have played a larger role in force assessment, system effectiveness and tradeoff analysis, and concept and doctrine development and analysis. In the first Quadrennial Defense Review (QDR), each of the Services created its own OOTW force design tool. Unfortunately, the tools address different parts of the problem and do not coordinate the use of competing capabilities. These tools satisfied the immediate requirements of the QDR, but do not provide a long-term cost-effective solution.

  20. Advances in analytical technologies for environmental protection and public safety.

    PubMed

    Sadik, O A; Wanekaya, A K; Andreescu, S

    2004-06-01

    Due to the increased threats of chemical and biological agents of injury by terrorist organizations, a significant effort is underway to develop tools that can be used to detect and effectively combat chemical and biochemical toxins. In addition to the right mix of policies and training of medical personnel on how to recognize symptoms of biochemical warfare agents, the major success in combating terrorism still lies in the prevention, early detection and the efficient and timely response using reliable analytical technologies and powerful therapies for minimizing the effects in the event of an attack. The public and regulatory agencies expect reliable methodologies and devices for public security. Today's systems are too bulky or slow to meet the "detect-to-warn" needs for first responders such as soldiers and medical personnel. This paper presents the challenges in monitoring technologies for warfare agents and other toxins. It provides an overview of how advances in environmental analytical methodologies could be adapted to design reliable sensors for public safety and environmental surveillance. The paths to designing sensors that meet the needs of today's measurement challenges are analyzed using examples of novel sensors, autonomous cell-based toxicity monitoring, 'Lab-on-a-Chip' devices and conventional environmental analytical techniques. Finally, in order to ensure that the public and legal authorities are provided with quality data to make informed decisions, guidelines are provided for assessing data quality and quality assurance using the United States Environmental Protection Agency (US-EPA) methodologies. PMID:15173903

  1. New and emerging analytical techniques for marine biotechnology.

    PubMed

    Burgess, J Grant

    2012-02-01

    Marine biotechnology is the industrial, medical or environmental application of biological resources from the sea. Since the marine environment is the most biologically and chemically diverse habitat on the planet, marine biotechnology has, in recent years delivered a growing number of major therapeutic products, industrial and environmental applications and analytical tools. These range from the use of a snail toxin to develop a pain control drug, metabolites from a sea squirt to develop an anti-cancer therapeutic, and marine enzymes to remove bacterial biofilms. In addition, well known and broadly used analytical techniques are derived from marine molecules or enzymes, including green fluorescence protein gene tagging methods and heat resistant polymerases used in the polymerase chain reaction. Advances in bacterial identification, metabolic profiling and physical handling of cells are being revolutionised by techniques such as mass spectrometric analysis of bacterial proteins. Advances in instrumentation and a combination of these physical advances with progress in proteomics and bioinformatics are accelerating our ability to harness biology for commercial gain. Single cell Raman spectroscopy and microfluidics are two emerging techniques which are also discussed elsewhere in this issue. In this review, we provide a brief survey and update of the most powerful and rapidly growing analytical techniques as used in marine biotechnology, together with some promising examples of less well known earlier stage methods which may make a bigger impact in the future.

  2. Percussion tool

    DOEpatents

    Reed, Teddy R.

    2006-11-28

    A percussion tool is described which includes a housing mounting a tool bit; a reciprocally moveable hammer borne by the housing and which is operable to repeatedly strike the tool bit; and a reciprocally moveable piston enclosed within the hammer and which imparts reciprocal movement to the reciprocally moveable hammer.

  3. FORTRAN tools

    NASA Technical Reports Server (NTRS)

    Presser, L.

    1978-01-01

    An integrated set of FORTRAN tools that are commercially available is described. The basic purpose of various tools is summarized and their economic impact highlighted. The areas addressed by these tools include: code auditing, error detection, program portability, program instrumentation, documentation, clerical aids, and quality assurance.

  4. Molecular tools for chemical biotechnology

    PubMed Central

    Galanie, Stephanie; Siddiqui, Michael S.; Smolke, Christina D.

    2013-01-01

    Biotechnological production of high value chemical products increasingly involves engineering in vivo multi-enzyme pathways and host metabolism. Recent approaches to these engineering objectives have made use of molecular tools to advance de novo pathway identification, tunable enzyme expression, and rapid pathway construction. Molecular tools also enable optimization of single enzymes and entire genomes through diversity generation and screening, whole cell analytics, and synthetic metabolic control networks. In this review, we focus on advanced molecular tools and their applications to engineered pathways in host organisms, highlighting the degree to which each tool is generalizable. PMID:23528237

  5. Additive-free digital microfluidics.

    PubMed

    Freire, Sergio L S; Tanner, Brendan

    2013-07-16

    Digital microfluidics, a technique for manipulation of droplets, is becoming increasingly important for the development of miniaturized platforms for laboratory processes. Despite the enthusiasm, droplet motion is frequently hindered by the adsorption of proteins or other analytes onto surfaces. Current approaches to minimize this unwanted surface fouling involve the addition of extra species to the droplet or its surroundings, which might be problematic depending on the droplet content. Here, a new strategy is introduced to move droplets containing cells and other analytes on solid substrates, without extra moieties; in particular, droplets with bovine serum albumin could be moved at a concentration 2000 times higher than previously reported (without additives). This capability is achieved by using a soot-based superamphiphobic surface combined with a new device geometry, which favors droplet rolling. Contrasting with electrowetting, wetting forces are not required for droplet motion.

  6. Analytical Spectroscopy Using Modular Systems

    NASA Astrophysics Data System (ADS)

    Patterson, Brian M.; Danielson, Neil D.; Lorigan, Gary A.; Sommer, André J.

    2003-12-01

    This article describes the development of three analytical spectroscopy experiments that compare the determination of salicylic acid (SA) content in aspirin tablets. The experiments are based on UV vis, fluorescence, and Raman spectroscopies and utilize modular spectroscopic components. Students assemble their own instruments, optimize them with respect to signal-to-noise, generate calibration curves, determine the SA content in retail aspirin tablets, and assign features in the respective spectra to functional groups within the active material. Using this approach in the discovery-based setting, the students gain invaluable insight into method-specific parameters, such as instrumental components, sample preparation, and analytical capability. In addition, the students learn the fundamentals of fiber optics and signal processing using the low-cost CCD based spectroscopic components.
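
    A minimal sketch of the calibration-curve step the students carry out is given below: fit instrument response against standard concentrations, then invert the fit for a tablet extract. All numerical values are invented for illustration and do not come from the article.

        import numpy as np

        std_conc = np.array([0.0, 2.0, 4.0, 6.0, 8.0])       # SA standards, mg/L
        signal   = np.array([0.01, 0.21, 0.40, 0.62, 0.80])   # instrument response (a.u.)

        slope, intercept = np.polyfit(std_conc, signal, 1)     # linear calibration fit
        r = np.corrcoef(std_conc, signal)[0, 1]

        sample_signal = 0.47                                   # response of a tablet extract
        sample_conc = (sample_signal - intercept) / slope      # invert the calibration

        print(f"calibration: y = {slope:.4f} x + {intercept:.4f}  (r = {r:.4f})")
        print(f"estimated SA in extract: {sample_conc:.2f} mg/L")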

  7. Analytical laboratory quality audits

    SciTech Connect

    Kelley, William D.

    2001-06-11

    Analytical Laboratory Quality Audits are designed to improve laboratory performance. The success of the audit, as for many activities, is based on adequate preparation, precise performance, well documented and insightful reporting, and productive follow-up. Adequate preparation starts with definition of the purpose, scope, and authority for the audit and the primary standards against which the laboratory quality program will be tested. The scope and technical processes involved lead to determining the needed audit team resources. Contact is made with the auditee and a formal audit plan is developed, approved and sent to the auditee laboratory management. Review of the auditee's quality manual, key procedures and historical information during preparation leads to better checklist development and more efficient and effective use of the limited time for data gathering during the audit itself. The audit begins with the opening meeting that sets the stage for the interactions between the audit team and the laboratory staff. Arrangements are worked out for the necessary interviews and examination of processes and records. The information developed during the audit is recorded on the checklists. Laboratory management is kept informed of issues during the audit so there are no surprises at the closing meeting. The audit report documents whether the management control systems are effective. In addition to findings of nonconformance, positive reinforcement of exemplary practices provides balance and fairness. Audit closure begins with receipt and evaluation of proposed corrective actions from the nonconformances identified in the audit report. After corrective actions are accepted, their implementation is verified. Upon closure of the corrective actions, the audit is officially closed.

  8. Enzymes in Analytical Chemistry.

    ERIC Educational Resources Information Center

    Fishman, Myer M.

    1980-01-01

    Presents tabular information concerning recent research in the field of enzymes in analytic chemistry, with methods, substrate or reaction catalyzed, assay, comments and references listed. The table refers to 128 references. Also listed are 13 general citations. (CS)

  9. Investigation into the phenomena affecting the retention behavior of basic analytes in chaotropic chromatography: Joint effects of the most relevant chromatographic factors and analytes' molecular properties.

    PubMed

    Čolović, Jelena; Kalinić, Marko; Vemić, Ana; Erić, Slavica; Malenović, Anđelija

    2015-12-18

    The aim of this study was to systematically investigate the phenomena affecting the retention behavior of structurally diverse basic drugs in ion-interaction chromatographic systems with chaotropic additives. To this end, the influence of three factors was studied: pH value of the aqueous phase, concentration of sodium hexafluorophosphate, and content of acetonitrile in the mobile phase. Mobile phase pH was found to affect the thermodynamic equilibria in the studied system beyond its effects on the analytes' ionization state. Specifically, increasing pH from 2 to 4 led to longer retention times, even with analytes which remain completely protonated. An explanation for this phenomenon was sought by studying the adsorption behavior of acetonitrile and chaotropic additive onto stationary phase. It was shown that the magnitude of the developed surface potential, which significantly affects retention - increases with pH, and that this can be attributed to the larger surface excess of acetonitrile. To study how analytes' structural properties influence their retention, quantitative structure-retention modeling was performed next. A support vector machine regression model was developed, relating mobile phase constituents and structural descriptors with retention data. While the ETA_EtaP_B_RC and XlogP can be considered as molecular descriptors which describe factors affecting retention in any RP-HPLC system, TDB9p and RDF45p are molecular descriptors which account for spatial arrangement of polarizable atoms and they can clearly relate to analytes' behavior on the stationary phase surface, where the electrostatic potential develops. Complementarity of analytes' structure with that of the electric double layer can be seen as a key factor influencing their retention behavior. Structural diversity of analytes and good predictive capabilities over a range of experimental conditions make the established model a useful tool in predicting retention behavior in the studied
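
    The sketch below illustrates the type of support vector machine regression model described above, with mobile-phase factors and molecular descriptors as inputs and the retention factor as the response. The feature values, hyperparameters, and retention data are invented; this is not the published model.

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVR

        # columns: pH, chaotrope conc. (mM), acetonitrile (%), XLogP, polarizability descriptor
        X = np.array([
            [2.0, 10, 20, 1.2, 0.8],
            [2.0, 50, 20, 1.2, 0.8],
            [4.0, 50, 20, 1.2, 0.8],
            [2.0, 10, 30, 2.5, 1.1],
            [4.0, 50, 30, 2.5, 1.1],
            [3.0, 30, 25, 0.9, 0.6],
        ])
        k = np.array([2.1, 3.4, 4.0, 1.5, 2.6, 1.8])   # retention factors (invented)

        model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
        model.fit(X, k)
        print(model.predict([[3.0, 40, 25, 1.8, 0.9]]))  # predicted retention factor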

  10. Extreme Scale Visual Analytics

    SciTech Connect

    Steed, Chad A; Potok, Thomas E; Pullum, Laura L; Ramanathan, Arvind; Shipman, Galen M; Thornton, Peter E; Potok, Thomas E

    2013-01-01

    Given the scale and complexity of today's data, visual analytics is rapidly becoming a necessity rather than an option for comprehensive exploratory analysis. In this paper, we provide an overview of three applications of visual analytics for addressing the challenges of analyzing climate, text streams, and biosurveillance data. These systems feature varying levels of interaction and high performance computing technology integration to permit exploratory analysis of large and complex data of global significance.

  11. Liquid chromatography coupled to different atmospheric pressure ionization sources-quadrupole-time-of-flight mass spectrometry and post-column addition of metal salt solutions as a powerful tool for the metabolic profiling of Fusarium oxysporum.

    PubMed

    Cirigliano, Adriana M; Rodriguez, M Alejandra; Gagliano, M Laura; Bertinetti, Brenda V; Godeas, Alicia M; Cabrera, Gabriela M

    2016-03-25

    Fusarium oxysporum L11 is a non-pathogenic soil-borne fungal strain that yielded an extract that showed antifungal activity against phytopathogens. In this study, reversed-phase high-performance liquid chromatography (RP-HPLC) coupled to different atmospheric pressure ionization sources-quadrupole-time-of-flight mass spectrometry (API-QTOF-MS) was applied for the comprehensive profiling of the metabolites from the extract. The employed sources were electrospray (ESI), atmospheric pressure chemical ionization (APCI) and atmospheric pressure photoionization (APPI). Post-column addition of metal solutions of Ca, Cu and Zn(II) was also tested using ESI. A total of 137 compounds were identified or tentatively identified by matching their accurate mass signals, suggested molecular formulae and MS/MS analysis with previously reported data. Some compounds were isolated and identified by NMR. The extract was rich in cyclic peptides like cyclosporins, diketopiperazines and sansalvamides, most of which were new, and are reported here for the first time. The use of post-column addition of metals resulted in a useful strategy for the discrimination of compound classes since specific adducts were observed for the different compound families. This technique also allowed the screening for compounds with metal binding properties. Thus, the applied methodology is a useful choice for the metabolic profiling of extracts and also for the selection of metabolites with potential biological activities related to interactions with metal ions.

  12. Liquid chromatography coupled to different atmospheric pressure ionization sources-quadrupole-time-of-flight mass spectrometry and post-column addition of metal salt solutions as a powerful tool for the metabolic profiling of Fusarium oxysporum.

    PubMed

    Cirigliano, Adriana M; Rodriguez, M Alejandra; Gagliano, M Laura; Bertinetti, Brenda V; Godeas, Alicia M; Cabrera, Gabriela M

    2016-03-25

    Fusarium oxysporum L11 is a non-pathogenic soil-borne fungal strain that yielded an extract that showed antifungal activity against phytopathogens. In this study, reversed-phase high-performance liquid chromatography (RP-HPLC) coupled to different atmospheric pressure ionization sources-quadrupole-time-of-flight mass spectrometry (API-QTOF-MS) was applied for the comprehensive profiling of the metabolites from the extract. The employed sources were electrospray (ESI), atmospheric pressure chemical ionization (APCI) and atmospheric pressure photoionization (APPI). Post-column addition of metal solutions of Ca, Cu and Zn(II) was also tested using ESI. A total of 137 compounds were identified or tentatively identified by matching their accurate mass signals, suggested molecular formulae and MS/MS analysis with previously reported data. Some compounds were isolated and identified by NMR. The extract was rich in cyclic peptides like cyclosporins, diketopiperazines and sansalvamides, most of which were new, and are reported here for the first time. The use of post-column addition of metals resulted in a useful strategy for the discrimination of compound classes since specific adducts were observed for the different compound families. This technique also allowed the screening for compounds with metal binding properties. Thus, the applied methodology is a useful choice for the metabolic profiling of extracts and also for the selection of metabolites with potential biological activities related to interactions with metal ions. PMID:26655791

  13. Web GIS in practice IX: a demonstration of geospatial visual analytics using Microsoft Live Labs Pivot technology and WHO mortality data.

    PubMed

    Kamel Boulos, Maged N; Viangteeravat, Teeradache; Anyanwu, Matthew N; Ra Nagisetty, Venkateswara; Kuscu, Emin

    2011-01-01

    The goal of visual analytics is to facilitate the discourse between the user and the data by providing dynamic displays and versatile visual interaction opportunities with the data that can support analytical reasoning and the exploration of data from multiple user-customisable aspects. This paper introduces geospatial visual analytics, a specialised subtype of visual analytics, and provides pointers to a number of learning resources about the subject, as well as some examples of human health, surveillance, emergency management and epidemiology-related geospatial visual analytics applications and examples of free software tools that readers can experiment with, such as Google Public Data Explorer. The authors also present a practical demonstration of geospatial visual analytics using partial data for 35 countries from a publicly available World Health Organization (WHO) mortality dataset and Microsoft Live Labs Pivot technology, a free, general purpose visual analytics tool that offers a fresh way to visually browse and arrange massive amounts of data and images online and also supports geographic and temporal classifications of datasets featuring geospatial and temporal components. Interested readers can download a Zip archive (included with the manuscript as an additional file) containing all files, modules and library functions used to deploy the WHO mortality data Pivot collection described in this paper.

  14. FT-Raman and chemometric tools for rapid determination of quality parameters in milk powder: Classification of samples for the presence of lactose and fraud detection by addition of maltodextrin.

    PubMed

    Rodrigues Júnior, Paulo Henrique; de Sá Oliveira, Kamila; de Almeida, Carlos Eduardo Rocha; De Oliveira, Luiz Fernando Cappa; Stephani, Rodrigo; Pinto, Michele da Silva; de Carvalho, Antônio Fernandes; Perrone, Ítalo Tuler

    2016-04-01

    FT-Raman spectroscopy has been explored as a quick screening method to evaluate the presence of lactose and identify milk powder samples adulterated with maltodextrin (2.5-50% w/w). Raman measurements can easily differentiate samples of milk powder, without the need for sample preparation, while traditional quality control methods, including high performance liquid chromatography, are cumbersome and slow. FT-Raman spectra were obtained from samples of whole lactose and low-lactose milk powder, both without and with addition of maltodextrin. Differences were observed between the spectra involved in identifying samples with low lactose content, as well as adulterated samples. Exploratory data analysis using Raman spectroscopy and multivariate analysis was also developed to classify samples with PCA and PLS-DA. The PLS-DA models obtained allowed all samples to be correctly classified. These results demonstrate the utility of FT-Raman spectroscopy in combination with chemometrics to assess the quality of milk powder.
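
    A small sketch of a PLS-DA classification of the kind described above is given below: spectra are regressed onto dummy-coded class labels with partial least squares, and each sample is assigned to the class with the largest predicted response. The spectra are synthetic and the model settings are assumptions, not those used in the study.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        n_per_class, n_points = 10, 200
        pure = rng.random(n_points)
        adulterated = pure + 0.3 * rng.random(n_points)         # extra "maltodextrin" bands

        X = np.vstack([pure + 0.02 * rng.standard_normal((n_per_class, n_points)),
                       adulterated + 0.02 * rng.standard_normal((n_per_class, n_points))])
        classes = np.array([0] * n_per_class + [1] * n_per_class)
        Y = np.eye(2)[classes]                                  # dummy-coded class matrix

        pls = PLSRegression(n_components=2).fit(X, Y)
        predicted = pls.predict(X).argmax(axis=1)               # assign to largest response
        print("training accuracy:", (predicted == classes).mean())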

  15. Ootw Tool Requirements in Relation to JWARS

    SciTech Connect

    Hartley III, D.S.; Packard, S.L.

    1998-01-01

    This document reports the results of the Office of the Secretary of Defense/Program Analysis & Evaluation (OSD/PA&E) sponsored project to identify how Operations Other Than War (OOTW) tool requirements relate to the Joint Warfare Simulation (JWARS) and, more generally, to joint analytical modeling and simulation (M&S) requirements. It includes recommendations about which OOTW tools (and functionality within tools) should be included in JWARS, which should be managed as joint analytical modeling and simulation (M&S) tools, and which should be left for independent development.

  16. Network Analysis Tools: from biological networks to clusters and pathways.

    PubMed

    Brohée, Sylvain; Faust, Karoline; Lima-Mendez, Gipsi; Vanderstocken, Gilles; van Helden, Jacques

    2008-01-01

    Network Analysis Tools (NeAT) is a suite of computer tools that integrate various algorithms for the analysis of biological networks: comparison between graphs, between clusters, or between graphs and clusters; network randomization; analysis of degree distribution; network-based clustering and path finding. The tools are interconnected to enable a stepwise analysis of the network through a complete analytical workflow. In this protocol, we present a typical case of utilization, where the tasks above are combined to decipher a protein-protein interaction network retrieved from the STRING database. The results returned by NeAT are typically subnetworks, networks enriched with additional information (i.e., clusters or paths) or tables displaying statistics. Typical networks comprising several thousands of nodes and arcs can be analyzed within a few minutes. The complete protocol can be read and executed in approximately 1 h.
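
    Two steps of such a workflow, degree-distribution analysis and path finding, are sketched below using networkx on a toy interaction network. The edges are invented and the code runs locally rather than through the NeAT web services; the simple clustering shown is only a stand-in for the network-based clustering algorithms NeAT provides.

        import networkx as nx

        # Toy protein-protein interaction network (edges invented for illustration).
        G = nx.Graph([("A", "B"), ("B", "C"), ("C", "D"), ("B", "D"),
                      ("D", "E"), ("E", "F"), ("C", "F")])

        # Degree distribution: how many nodes have each degree.
        for degree, count in enumerate(nx.degree_histogram(G)):
            if count:
                print(f"degree {degree}: {count} node(s)")

        # Path finding between two proteins of interest.
        print("shortest path A -> F:", nx.shortest_path(G, "A", "F"))

        # Crude stand-in for clustering: connected components of a degree-thresholded subgraph.
        core = G.subgraph([n for n, d in G.degree() if d >= 2])
        print("clusters:", [sorted(c) for c in nx.connected_components(core)])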

  17. Developing Guidelines for Assessing Visual Analytics Environments

    SciTech Connect

    Scholtz, Jean

    2011-07-01

    In this paper, we develop guidelines for evaluating visual analytic environments based on a synthesis of reviews for the entries to the 2009 Visual Analytics Science and Technology (VAST) Symposium Challenge and from a user study with professional intelligence analysts. By analyzing the 2009 VAST Challenge reviews we gained a better understanding of what is important to our reviewers, both visualization researchers and professional analysts. We also report on a small user study with professional analysts to determine the important factors that they use in evaluating visual analysis systems. We then looked at guidelines developed by researchers in various domains and synthesized these into an initial set for use by others in the community. In a second part of the user study, we looked at guidelines for a new aspect of visual analytic systems – the generation of reports. Future visual analytic systems have been challenged to help analysts generate their reports. In our study we worked with analysts to understand the criteria they used to evaluate the quality of analytic reports. We propose that this knowledge will be useful as researchers look at systems to automate some of the report generation. Based on these efforts, we produced some initial guidelines for evaluating visual analytic environments and for evaluating analytic reports. It is important to understand that these guidelines are initial drafts and are limited in scope because of the type of tasks for which the visual analytic systems used in the studies in this paper were designed. More research and refinement is needed by the Visual Analytics Community to provide additional evaluation guidelines for different types of visual analytic environments.

  18. MassyTools: A High-Throughput Targeted Data Processing Tool for Relative Quantitation and Quality Control Developed for Glycomic and Glycoproteomic MALDI-MS.

    PubMed

    Jansen, Bas C; Reiding, Karli R; Bondt, Albert; Hipgrave Ederveen, Agnes L; Palmblad, Magnus; Falck, David; Wuhrer, Manfred

    2015-12-01

    The study of N-linked glycosylation has long been complicated by a lack of bioinformatics tools. In particular, there is still a lack of fast and robust data processing tools for targeted (relative) quantitation. We have developed modular, high-throughput data processing software, MassyTools, that is capable of calibrating spectra, extracting data, and performing quality control calculations based on a user-defined list of glycan or glycopeptide compositions. Typical examples of output include relative areas after background subtraction, isotopic pattern-based quality scores, spectral quality scores, and signal-to-noise ratios. We demonstrated MassyTools' performance on MALDI-TOF-MS glycan and glycopeptide data from different samples. MassyTools yielded better calibration than the commercial software flexAnalysis, generally showing 2-fold better ppm errors after internal calibration. Relative quantitation using MassyTools and flexAnalysis gave similar results, yielding a relative standard deviation (RSD) of the main glycan of ~6%. However, MassyTools yielded 2- to 5-fold lower RSD values for low-abundant analytes than flexAnalysis. Additionally, feature curation based on the computed quality criteria improved the data quality. In conclusion, we show that MassyTools is a robust automated data processing tool for high-throughput, high-performance glycosylation analysis. The package is released under the Apache 2.0 license and is freely available on GitHub ( https://github.com/Tarskin/MassyTools ).
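    To make the quantities mentioned above concrete, the Python sketch below computes a background-subtracted peak area and a signal-to-noise ratio for one targeted m/z window; it illustrates the general idea only and is not MassyTools code, and the m/z values and intensities are invented.
      # Illustrative sketch (not MassyTools): background-subtracted area and S/N
      # for one targeted window in a hypothetical MALDI spectrum.
      import numpy as np
      mz = np.array([1485.3, 1485.5, 1485.7, 1486.0, 1486.5, 1487.0])
      intensity = np.array([120.0, 150.0, 9500.0, 4100.0, 1300.0, 140.0])
      window = (mz > 1485.4) & (mz < 1486.8)        # analyte isotope envelope
      background = np.median(intensity[~window])    # crude local background estimate
      noise = np.std(intensity[~window])            # crude noise estimate
      area = np.sum(intensity[window] - background) # background-subtracted area
      snr = (intensity[window].max() - background) / noise
      print(f"area={area:.1f}  S/N={snr:.1f}")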

  19. Enterprise integration: A tool's perspective

    SciTech Connect

    Polito, J.; Jones, A.; Grant, H.

    1993-06-01

    The advent of sophisticated automation equipment and computer hardware and software is changing the way manufacturing is carried out. To compete in the global marketplace, manufacturing companies must integrate these new technologies into their factories. In addition, they must integrate the planning, control, and data management methodologies needed to make effective use of these technologies. This paper provides an overview of recent approaches to achieving this enterprise integration. It then describes, using simulation as a particular example, a new tool's perspective of enterprise integration.

  20. VCAT: Visual Crosswalk Analysis Tool

    SciTech Connect

    Cleland, Timothy J.; Forslund, David W.; Cleland, Catherine A.

    2012-08-31

    VCAT is a knowledge modeling and analysis tool. It was synthesized from ideas in functional analysis, business process modeling, and complex network science. VCAT discovers synergies by analyzing natural language descriptions. Specifically, it creates visual analytic perspectives that capture intended organization structures, then overlays the serendipitous relationships that point to potential synergies within an organization or across multiple organizations.

  1. Big Data Visual Analytics for Exploratory Earth System Simulation Analysis

    SciTech Connect

    Steed, Chad A.; Ricciuto, Daniel M.; Shipman, Galen M.; Smith, Brian E.; Thornton, Peter E.; Wang, Dali; Shi, Xiaoying; Williams, Dean N.

    2013-12-01

    Rapid increases in high performance computing are feeding the development of larger and more complex data sets in climate research, which sets the stage for so-called big data analysis challenges. However, conventional climate analysis techniques are inadequate in dealing with the complexities of today's data. In this paper, we describe and demonstrate a visual analytics system, called the Exploratory Data analysis ENvironment (EDEN), with specific application to the analysis of complex earth system simulation data sets. EDEN represents the type of interactive visual analysis tools that are necessary to transform data into insight, thereby improving critical comprehension of earth system processes. In addition to providing an overview of EDEN, we describe real-world studies using both point ensembles and global Community Land Model Version 4 (CLM4) simulations.

  2. A Visual Analytics Approach for Correlation, Classification, and Regression Analysis

    SciTech Connect

    Steed, Chad A; SwanII, J. Edward; Fitzpatrick, Patrick J.; Jankun-Kelly, T.J.

    2013-01-01

    New approaches that combine the strengths of humans and machines are necessary to equip analysts with the proper tools for exploring today's increasingly complex, multivariate data sets. In this paper, a visual data mining framework, called the Multidimensional Data eXplorer (MDX), is described that addresses the challenges of today's data by combining automated statistical analytics with a highly interactive parallel coordinates based canvas. In addition to several intuitive interaction capabilities, this framework offers a rich set of graphical statistical indicators, interactive regression analysis, visual correlation mining, automated axis arrangements and filtering, and data classification techniques. This chapter provides a detailed description of the system as well as a discussion of key design aspects and critical feedback from domain experts.
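    As a loose illustration of pairing simple statistical analytics with a parallel coordinates view, which is the combination MDX is built around, the Python sketch below uses pandas and matplotlib; it is not the MDX system, and the file name and column names are hypothetical.
      # Illustrative sketch (not MDX): correlation mining plus a parallel coordinates plot.
      import pandas as pd
      import matplotlib.pyplot as plt
      from pandas.plotting import parallel_coordinates
      df = pd.read_csv("ensemble.csv")                     # hypothetical numeric data set
      print(df.select_dtypes("number").corr())             # simple correlation mining
      # Bin one variable into classes so lines can be colored, then draw the canvas
      df["group"] = pd.qcut(df["response"], 3, labels=["low", "mid", "high"])
      parallel_coordinates(df, "group", alpha=0.3)
      plt.show()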

  3. A Visual Analytics Approach for Correlation, Classification, and Regression Analysis

    SciTech Connect

    Steed, Chad A; SwanII, J. Edward; Fitzpatrick, Patrick J.; Jankun-Kelly, T.J.

    2012-02-01

    New approaches that combine the strengths of humans and machines are necessary to equip analysts with the proper tools for exploring today's increasingly complex, multivariate data sets. In this paper, a novel visual data mining framework, called the Multidimensional Data eXplorer (MDX), is described that addresses the challenges of today's data by combining automated statistical analytics with a highly interactive parallel coordinates based canvas. In addition to several intuitive interaction capabilities, this framework offers a rich set of graphical statistical indicators, interactive regression analysis, visual correlation mining, automated axis arrangements and filtering, and data classification techniques. The current work provides a detailed description of the system as well as a discussion of key design aspects and critical feedback from domain experts.

  4. Omics Tools

    SciTech Connect

    Schaumberg, Andrew

    2012-12-21

    The Omics Tools package provides several small trivial tools for work in genomics. This single portable package, the “omics.jar” file, is a toolbox that works in any Java-based environment, including PCs, Macs, and supercomputers. The number of tools is expected to grow. One tool (called cmsearch.hadoop or cmsearch.local) calls the external cmsearch program to predict non-coding RNA in a genome. The cmsearch program is part of the third-party Infernal package. Omics Tools does not contain Infernal. Infernal may be installed separately. The cmsearch.hadoop subtool requires Apache Hadoop and runs on a supercomputer, though cmsearch.local does not and runs on a server. Omics Tools does not contain Hadoop. Hadoop may be installed separately. The other tools (cmgbk, cmgff, fastats, pal, randgrp, randgrpr, randsub) do not interface with third-party tools. Omics Tools is written in the Java and Scala programming languages. Invoking the “help” command shows the currently available tools, as shown below:
      schaumbe@gpint06:~/proj/omics$ java -jar omics.jar help
      Known commands are:
      cmgbk : compare cmsearch and GenBank Infernal hits
      cmgff : compare hits among two GFF (version 3) files
      cmsearch.hadoop : find Infernal hits in a genome, on your supercomputer
      cmsearch.local : find Infernal hits in a genome, on your workstation
      fastats : FASTA stats, e.g. # bases, GC content
      pal : stem-loop motif detection by palindromic sequence search (code stub)
      randgrp : random subsample without replacement, of groups
      randgrpr : random subsample with replacement, of groups (fast)
      randsub : random subsample without replacement, of file lines
      For more help regarding a particular command, use: java -jar omics.jar command help
      Usage: java -jar omics.jar command args

  5. Omics Tools

    2012-12-21

    The Omics Tools package provides several small trivial tools for work in genomics. This single portable package, the “omics.jar” file, is a toolbox that works in any Java-based environment, including PCs, Macs, and supercomputers. The number of tools is expected to grow. One tool (called cmsearch.hadoop or cmsearch.local) calls the external cmsearch program to predict non-coding RNA in a genome. The cmsearch program is part of the third-party Infernal package. Omics Tools does not contain Infernal. Infernal may be installed separately. The cmsearch.hadoop subtool requires Apache Hadoop and runs on a supercomputer, though cmsearch.local does not and runs on a server. Omics Tools does not contain Hadoop. Hadoop may be installed separately. The other tools (cmgbk, cmgff, fastats, pal, randgrp, randgrpr, randsub) do not interface with third-party tools. Omics Tools is written in the Java and Scala programming languages. Invoking the “help” command shows the currently available tools, as shown below:
      schaumbe@gpint06:~/proj/omics$ java -jar omics.jar help
      Known commands are:
      cmgbk : compare cmsearch and GenBank Infernal hits
      cmgff : compare hits among two GFF (version 3) files
      cmsearch.hadoop : find Infernal hits in a genome, on your supercomputer
      cmsearch.local : find Infernal hits in a genome, on your workstation
      fastats : FASTA stats, e.g. # bases, GC content
      pal : stem-loop motif detection by palindromic sequence search (code stub)
      randgrp : random subsample without replacement, of groups
      randgrpr : random subsample with replacement, of groups (fast)
      randsub : random subsample without replacement, of file lines
      For more help regarding a particular command, use: java -jar omics.jar command help
      Usage: java -jar omics.jar command args

  6. Tool use by aquatic animals

    PubMed Central

    Mann, Janet; Patterson, Eric M.

    2013-01-01

    Tool-use research has focused primarily on land-based animals, with less consideration given to aquatic animals and the environmental challenges and conditions they face. Here, we review aquatic tool use and examine the contributing ecological, physiological, cognitive and social factors. Tool use among aquatic animals is rare but taxonomically diverse, occurring in fish, cephalopods, mammals, crabs, urchins and possibly gastropods. While additional research is required, the scarcity of tool use can likely be attributed to the characteristics of aquatic habitats, which are generally not conducive to tool use. Nonetheless, studying tool use by aquatic animals provides insights into the conditions that promote and inhibit tool-use behaviour across biomes. Like land-based tool users, aquatic animals tend to find tools on the substrate and use tools during foraging. However, unlike on land, tool users in water often use other animals (and their products) and water itself as a tool. Among sea otters and dolphins, the two aquatic tool users studied in greatest detail, some individuals specialize in tool use, which is vertically socially transmitted, possibly because of their long dependency periods. In all, the contrasts between aquatic- and land-based tool users enlighten our understanding of the adaptive value of tool-use behaviour. PMID:24101631

  7. Advances in analytical chemistry

    NASA Technical Reports Server (NTRS)

    Arendale, W. F.; Congo, Richard T.; Nielsen, Bruce J.

    1991-01-01

    Implementation of computer programs based on multivariate statistical algorithms makes it possible to obtain reliable information from long data vectors that contain large amounts of extraneous information, for example, noise and/or analytes that we do not wish to control. Three examples are described. Each of these applications requires the use of techniques characteristic of modern analytical chemistry. The first example, using a quantitative or analytical model, describes the determination of the acid dissociation constant for 2,2'-pyridyl thiophene using archived data. The second example describes an investigation to determine the active biocidal species of iodine in aqueous solutions. The third example is taken from a research program directed toward advanced fiber-optic chemical sensors. The second and third examples require heuristic or empirical models.

  8. Competing on talent analytics.

    PubMed

    Davenport, Thomas H; Harris, Jeanne; Shapiro, Jeremy

    2010-10-01

    Do investments in your employees actually affect workforce performance? Who are your top performers? How can you empower and motivate other employees to excel? Leading-edge companies such as Google, Best Buy, Procter & Gamble, and Sysco use sophisticated data-collection technology and analysis to answer these questions, leveraging a range of analytics to improve the way they attract and retain talent, connect their employee data to business performance, differentiate themselves from competitors, and more. The authors present the six key ways in which companies track, analyze, and use data about their people, ranging from a simple baseline of metrics to monitor the organization's overall health to custom modeling for predicting future head count depending on various "what if" scenarios. They go on to show that companies competing on talent analytics manage data and technology at an enterprise level, support what analytical leaders do, choose realistic targets for analysis, and hire analysts with strong interpersonal skills as well as broad expertise.

  9. Metabolomics and diabetes: analytical and computational approaches.

    PubMed

    Sas, Kelli M; Karnovsky, Alla; Michailidis, George; Pennathur, Subramaniam

    2015-03-01

    Diabetes is characterized by altered metabolism of key molecules and regulatory pathways. The phenotypic expression of diabetes and associated complications encompasses complex interactions between genetic, environmental, and tissue-specific factors that require an integrated understanding of perturbations in the network of genes, proteins, and metabolites. Metabolomics attempts to systematically identify and quantitate small molecule metabolites from biological systems. The recent rapid development of a variety of analytical platforms based on mass spectrometry and nuclear magnetic resonance has enabled identification of complex metabolic phenotypes. Continued development of bioinformatics and analytical strategies has facilitated the discovery of causal links in understanding the pathophysiology of diabetes and its complications. Here, we summarize the metabolomics workflow, including analytical, statistical, and computational tools, highlight recent applications of metabolomics in diabetes research, and discuss the challenges in the field. PMID:25713200

  10. Metabolomics and Diabetes: Analytical and Computational Approaches

    PubMed Central

    Sas, Kelli M.; Karnovsky, Alla; Michailidis, George

    2015-01-01

    Diabetes is characterized by altered metabolism of key molecules and regulatory pathways. The phenotypic expression of diabetes and associated complications encompasses complex interactions between genetic, environmental, and tissue-specific factors that require an integrated understanding of perturbations in the network of genes, proteins, and metabolites. Metabolomics attempts to systematically identify and quantitate small molecule metabolites from biological systems. The recent rapid development of a variety of analytical platforms based on mass spectrometry and nuclear magnetic resonance has enabled identification of complex metabolic phenotypes. Continued development of bioinformatics and analytical strategies has facilitated the discovery of causal links in understanding the pathophysiology of diabetes and its complications. Here, we summarize the metabolomics workflow, including analytical, statistical, and computational tools, highlight recent applications of metabolomics in diabetes research, and discuss the challenges in the field. PMID:25713200

  11. Analytical pervaporation: a key technique in the enological laboratory.

    PubMed

    Luque de Castro, Maria D; Luque-García, Jose L; Mataix, Eva

    2003-01-01

    This paper reviews the use of analytical pervaporation (defined as the integration of 2 different analytical separation principles, evaporation and gas diffusion, in a single micromodule) coupled to flow-injection manifolds for the determination of analytes of interest in enology; the review discusses the advantages that these techniques can provide in wine analytical laboratories. Special attention is given to methods that enable the determination of either of 2 volatile analytes, or of one volatile analyte and one nonvolatile analyte by taking advantage of the versatility of the designed approaches. In a comparison of these methods with the official and/or standard methods, the results showed good agreement. In addition, the new methods offer improvements in linear determination range, quantitation limit, precision, rapidity, and potential for full automation. Thus, this review demonstrates that although the old technologies used in wine analytical laboratories may be supported by official and standard methods, they should be replaced by properly validated, new, and automated technologies.

  12. Frontiers in analytical chemistry

    SciTech Connect

    Amato, I.

    1988-12-15

    Doing more with less was the modus operandi of R. Buckminster Fuller, the late science genius and inventor of such things as the geodesic dome. In late September, chemists described their own version of this maxim--learning more chemistry from less material and in less time--in a symposium titled Frontiers in Analytical Chemistry at the 196th National Meeting of the American Chemical Society in Los Angeles. Symposium organizer Allen J. Bard of the University of Texas at Austin assembled six speakers, himself among them, to survey pretty widely different areas of analytical chemistry.

  13. Monitoring the analytic surface.

    PubMed

    Spence, D P; Mayes, L C; Dahl, H

    1994-01-01

    How do we listen during an analytic hour? Systematic analysis of the speech patterns of one patient (Mrs. C.) strongly suggests that the clustering of shared pronouns (e.g., you/me) represents an important aspect of the analytic surface, preconsciously sensed by the analyst and used by him to determine when to intervene. Sensitivity to these patterns increases over the course of treatment, and in a final block of 10 hours shows a striking degree of contingent responsivity: specific utterances by the patient are consistently echoed by the analyst's interventions. PMID:8182248

  14. Jupiter Environment Tool

    NASA Technical Reports Server (NTRS)

    Sturm, Erick J.; Monahue, Kenneth M.; Biehl, James P.; Kokorowski, Michael; Ngalande, Cedrick; Boedeker, Jordan

    2012-01-01

    The Jupiter Environment Tool (JET) is a custom UI plug-in for STK that provides an interface to Jupiter environment models for visualization and analysis. Users can visualize the different magnetic field models of Jupiter through various rendering methods, which are fully integrated within STK's 3D Window. This allows users to take snapshots and make animations of their scenarios with magnetic field visualizations. Analytical data can be accessed in the form of custom vectors. Given these custom vectors, users have access to magnetic field data in custom reports, graphs, access constraints, coverage analysis, and anywhere else vectors are used within STK.

  15. Graphical Contingency Analysis Tool

    2010-03-02

    GCA is a visual analytic tool for power grid contingency analysis that provides decision support for power grid operations. GCA allows power grid operators to quickly gain situational awareness of the power grid by converting large amounts of operational data to the graphic domain with a color-contoured map; identify system trends and foresee and discern emergencies by performing trending analysis; identify the relationships between system configurations and affected assets by conducting clustering analysis; and identify the best action by interactively evaluating candidate actions.

  16. Big data analytics workflow management for eScience

    NASA Astrophysics Data System (ADS)

    Fiore, Sandro; D'Anca, Alessandro; Palazzo, Cosimo; Elia, Donatello; Mariello, Andrea; Nassisi, Paola; Aloisio, Giovanni

    2015-04-01

    In many domains such as climate and astrophysics, scientific data is often n-dimensional and requires tools that support specialized data types and primitives if it is to be properly stored, accessed, analysed and visualized. Currently, scientific data analytics relies on domain-specific software and libraries providing a huge set of operators and functionalities. However, most of these software packages fail at large scale since they: (i) are desktop based, rely on local computing capabilities and need the data locally; (ii) cannot benefit from available multicore/parallel machines since they are based on sequential codes; (iii) do not provide declarative languages to express scientific data analysis tasks; and (iv) do not provide newer or more scalable storage models to better support the data multidimensionality. Additionally, most of them: (v) are domain-specific, which also means they support a limited set of data formats, and (vi) do not provide workflow support to enable the construction, execution and monitoring of more complex "experiments". The Ophidia project aims at facing most of the challenges highlighted above by providing a big data analytics framework for eScience. Ophidia provides several parallel operators to manipulate large datasets. Some relevant examples include: (i) data sub-setting (slicing and dicing), (ii) data aggregation, (iii) array-based primitives (the same operator applies to all the implemented UDF extensions), (iv) data cube duplication, (v) data cube pivoting, (vi) NetCDF import and export. Metadata operators are available too. Additionally, the Ophidia framework provides array-based primitives to perform data sub-setting, data aggregation (i.e. max, min, avg), array concatenation, algebraic expressions and predicate evaluation on large arrays of scientific data. Bit-oriented plugins have also been implemented to manage binary data cubes. Defining processing chains and workflows with tens or hundreds of data analytics operators is the
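    The primitives listed above (NetCDF import, data sub-setting, aggregation, export) can be illustrated on a single workstation with a short Python/xarray sketch; this is not the Ophidia framework or its operators, and the file and variable names are hypothetical.
      # Single-node illustration (not Ophidia) of sub-setting and aggregation primitives.
      import xarray as xr
      ds = xr.open_dataset("tas_daily.nc")                         # NetCDF import
      subset = ds["tas"].sel(lat=slice(30, 60), lon=slice(0, 40))  # slicing and dicing
      monthly_max = subset.resample(time="1MS").max()              # aggregation (max)
      zonal_mean = subset.mean(dim="lon")                          # aggregation (avg)
      monthly_max.to_netcdf("tas_subset_max.nc")                   # NetCDF export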

  17. Monitoring automotive oil degradation: analytical tools and onboard sensing technologies.

    PubMed

    Mujahid, Adnan; Dickert, Franz L

    2012-09-01

    Engine oil experiences a number of thermal and oxidative phases that yield acidic products in the matrix consequently leading to degradation of the base oil. Generally, oil oxidation is a complex process and difficult to elucidate; however, the degradation pathways can be defined for almost every type of oil because they mainly depend on the mechanical status and operating conditions. The exact time of oil change is nonetheless difficult to predict, but it is of great interest from an economic and ecological point of view. In order to make a quick and accurate decision about oil changes, onboard assessment of oil quality is highly desirable. For this purpose, a variety of physical and chemical sensors have been proposed along with spectroscopic strategies. We present a critical review of all these approaches and of recent developments to analyze the exact lifetime of automotive engine oil. Apart from their potential for degradation monitoring, their limitations and future perspectives have also been investigated. PMID:22752447

  18. Monitoring automotive oil degradation: analytical tools and onboard sensing technologies.

    PubMed

    Mujahid, Adnan; Dickert, Franz L

    2012-09-01

    Engine oil experiences a number of thermal and oxidative phases that yield acidic products in the matrix consequently leading to degradation of the base oil. Generally, oil oxidation is a complex process and difficult to elucidate; however, the degradation pathways can be defined for almost every type of oil because they mainly depend on the mechanical status and operating conditions. The exact time of oil change is nonetheless difficult to predict, but it is of great interest from an economic and ecological point of view. In order to make a quick and accurate decision about oil changes, onboard assessment of oil quality is highly desirable. For this purpose, a variety of physical and chemical sensors have been proposed along with spectroscopic strategies. We present a critical review of all these approaches and of recent developments to analyze the exact lifetime of automotive engine oil. Apart from their potential for degradation monitoring, their limitations and future perspectives have also been investigated.

  19. Horses for courses: analytical tools to explore planetary boundaries

    NASA Astrophysics Data System (ADS)

    van Vuuren, D. P.; Lucas, P. L.; Häyhä, T.; Cornell, S. E.; Stafford-Smith, M.

    2015-09-01

    There is a need for further integrated research on developing a set of sustainable development objectives, based on the proposed framework of planetary boundaries indicators. The relevant research questions are divided in this paper into four key categories, related to (1) the underlying processes and the selection of key indicators, (2) understanding the impacts of different exposure levels and the influence of connections between different types of impacts, (3) a better understanding of different response strategies, and (4) the available options to implement changes. Clearly, different categories of scientific disciplines and associated models exist that can contribute to the necessary analysis, noting that the distinctions between them are fuzzy. In the paper, we indicate both how different models relate to the four categories of questions and how further insights can be obtained by connecting the different disciplines (without necessarily fully integrating them). Research on integration can support planetary boundary quantification in a credible way, linking human drivers and social and biophysical impacts.

  20. Horses for courses: analytical tools to explore planetary boundaries

    NASA Astrophysics Data System (ADS)

    van Vuuren, Detlef P.; Lucas, Paul L.; Häyhä, Tiina; Cornell, Sarah E.; Stafford-Smith, Mark

    2016-03-01

    There is a need for more integrated research on sustainable development and global environmental change. In this paper, we focus on the planetary boundaries framework to provide a systematic categorization of key research questions in relation to avoiding severe global environmental degradation. The four categories of key questions are those that relate to (1) the underlying processes and selection of key indicators for planetary boundaries, (2) understanding the impacts of environmental pressure and connections between different types of impacts, (3) better understanding of different response strategies to avoid further degradation, and (4) the available instruments to implement such strategies. Clearly, different categories of scientific disciplines and associated model types exist that can accommodate answering these questions. We identify the strengths and weaknesses of different research areas in relation to the question categories, focusing specifically on different types of models. We argue that more interdisciplinary research is needed to increase our understanding by better linking human drivers and social and biophysical impacts. This requires better collaboration between relevant disciplines (associated with the model types), either by exchanging information or by fully linking or integrating them. As fully integrated models can become too complex, the appropriate type of model (the racehorse) should be applied for answering the target research question (the race course).

  1. Feminism as an Analytic Tool for the Study of Science.

    ERIC Educational Resources Information Center

    Keller, Evelyn Fox

    1983-01-01

    It is proposed that a feminist perspective on the scientific enterprise provides the basis for a psychosociology of scientific knowledge, an understanding of the ways in which psychodynamics and social norms interact in the construction and acceptance of claims to scientific knowledge. (MSE)

  2. ANALYTICAL TOOLS INTERFACE FOR LANDSCAPE ASSESSMENTS (ATTILA) USER MANUAL

    EPA Science Inventory

    ATtILA is an ArcView extension that allows users to easily calculate many common landscape metrics. GIS expertise is not required, but some experience with ArcView is recommended. Four metric groups are currently included in ATtILA: landscape characteristics, riparian characteris...

  3. Giardia/giardiasis - a perspective on diagnostic and analytical tools.

    PubMed

    Koehler, Anson V; Jex, Aaron R; Haydon, Shane R; Stevens, Melita A; Gasser, Robin B

    2014-01-01

    Giardiasis is a gastrointestinal disease of humans and other animals caused by species of parasitic protists of the genus Giardia. This disease is transmitted mainly via the faecal-oral route (e.g., in water or food) and is of socioeconomic importance worldwide. The accurate detection and genetic characterisation of the different species and population variants (usually referred to as assemblages and/or sub-assemblages) of Giardia are central to understanding their transmission patterns and host spectra. The present article provides a background on Giardia and giardiasis, and reviews some key techniques employed for the identification and genetic characterisation of Giardia in biological samples, the diagnosis of infection and the analysis of genetic variation within and among species of Giardia. Advances in molecular techniques provide a solid basis for investigating the systematics, population genetics, ecology and epidemiology of Giardia species and genotypes as well as the prevention and control of giardiasis.

  4. Immunoassay as an analytical tool in agricultural biotechnology.

    PubMed

    Grothaus, G David; Bandla, Murali; Currier, Thomas; Giroux, Randal; Jenkins, G Ronald; Lipp, Markus; Shan, Guomin; Stave, James W; Pantella, Virginia

    2006-01-01

    Immunoassays for biotechnology engineered proteins are used by AgBiotech companies at numerous points in product development and by feed and food suppliers for compliance and contractual purposes. Although AgBiotech companies use the technology during product development and seed production, other stakeholders from the food and feed supply chains, such as commodity, food, and feed companies, as well as third-party diagnostic testing companies, also rely on immunoassays for a number of purposes. The primary use of immunoassays is to verify the presence or absence of genetically modified (GM) material in a product or to quantify the amount of GM material present in a product. This article describes the fundamental elements of GM analysis using immunoassays and especially its application to the testing of grains. The 2 most commonly used formats are lateral flow devices (LFD) and plate-based enzyme-linked immunosorbent assays (ELISA). The main applications of both formats are discussed in general, and the benefits and drawbacks are discussed in detail. The document highlights the many areas to which attention must be paid in order to produce reliable test results. These include sample preparation, method validation, choice of appropriate reference materials, and biological and instrumental sources of error. The article also discusses issues related to the analysis of different matrixes and the effects they may have on the accuracy of the immunoassays. PMID:16915826

  5. SERS as analytical tool for detection of bacteria

    NASA Astrophysics Data System (ADS)

    Cialla, Dana; Rösch, Petra; Möller, Robert; Popp, Jürgen

    2007-07-01

    The detection of single bacteria should be improved by lowering the acquisition time via the application of SERS (surface enhanced Raman spectroscopy). Nano structured colloids or surfaces consisting of gold or silver can be used as SERS active substrates. However, for biological applications mostly gold is used as SERS active substrate since silver is toxic for bacterial cells. Furthermore, the application of gold as a SERS-active substrate allows the usage of Raman excitation wavelengths in the red part of the electromagnetic spectrum. For the SERS investigations on bacteria different colloids (purchased and self prepared, preaggregated and non-aggregated) are chosen as SERS active substrates. The application of different gold colloids under gently mixing conditions to prevent the bacterial damage allowed the recording of reproducible SERS spectra of bacteria. The SERS spectra of B. pumilus are dominated by contributions of ingredients of the outer cell wall, e.g. the peptidoglycan layer. SEM images of the coated bacteria demonstrate the incomplete adsorption most probably due to variations within the binding affinities between different outer cell components and the gold colloids.

  6. Developing SABRE as an analytical tool in NMR

    NASA Astrophysics Data System (ADS)

    Lloyd, Lyrelle Stacey

    Work presented in this thesis centres around the application of the new hyperpolarisation technique, SABRE, within nuclear magnetic resonance spectroscopy, focusing on optimisation of the technique to characterise small organic molecules. While pyridine was employed as a model substrate, studies on a range of molecules, including substituted pyridines, quinolines, thiazoles and indoles, are detailed. Initial investigations explored how the properties of the SABRE catalyst affect the extent of polarisation transfer exhibited. The most important of these properties proved to be the rate constants for loss of pyridine and hydrides as these define the contact time of pyridine with the parahydrogen derived hydride ligands in the metal template. The effect of changing the temperature, solvent or concentration of substrate or catalyst is rationalised. For instance, the catalyst ICy(a) exhibits relatively slow ligand exchange rates and increasing the temperature during hyperpolarisation increases the observed signal enhancements. These studies have revealed that a second polarisation transfer template can be used with SABRE in which two substrate molecules are bound. This allows the possibility of investigation of larger substrates which might otherwise be too sterically encumbered to bind. Another significant advance relates to the first demonstration that SABRE can be used in conjunction with an automated system designed with Bruker allowing the acquisition of scan averaged, phase cycled and traditional 2D spectra. The system also allowed investigations into the effect of the polarisation transfer field and application of that knowledge to collect single-scan 13C data for characterisation. The successful acquisition of 1H NOESY, 1H-1H COSY, 1H-13C 2D and ultrafast 1H-1H COSY NMR sequences is detailed for a 10 mM concentration sample, with 1H data collected for a 1 mM sample. A range of studies which aim to demonstrate the applicability of SABRE to the characterisation of small molecules and pharmaceuticals have been conducted.

  7. Analytical Services Management System

    SciTech Connect

    Church, Shane; Nigbor, Mike; Hillman, Daniel

    2005-03-30

    Analytical Services Management System (ASMS) provides sample management services. Sample management includes sample planning for analytical requests, sample tracking for shipping and receiving by the laboratory, receipt of the analytical data deliverable, processing the deliverable and payment of the laboratory conducting the analyses. ASMS is a web-based application that provides the ability to manage these activities at multiple locations for different customers. ASMS provides for the assignment of single to multiple samples for standard chemical and radiochemical analyses. ASMS is a flexible system which allows the users to request analyses by line item code. Line item codes are selected based on the Basic Ordering Agreement (BOA) format for contracting with participating laboratories. ASMS also allows contracting with non-BOA laboratories using a similar line item code contracting format for their services. ASMS allows sample and analysis tracking from sample planning and collection in the field through sample shipment, laboratory sample receipt, laboratory analysis and submittal of the requested analyses, electronic data transfer, and payment of the laboratories for the completed analyses. The software, when in operation, contains business-sensitive material that is used as a principal portion of the Kaiser Analytical Management Services business model. The software version provided is the most recent version; however, the copy of the application does not contain business-sensitive data from the associated Oracle tables such as contract information or price per line item code.

  8. Analytical Services Management System

    2005-03-30

    Analytical Services Management System (ASMS) provides sample management services. Sample management includes sample planning for analytical requests, sample tracking for shipping and receiving by the laboratory, receipt of the analytical data deliverable, processing the deliverable and payment of the laboratory conducting the analyses. ASMS is a web-based application that provides the ability to manage these activities at multiple locations for different customers. ASMS provides for the assignment of single to multiple samples for standard chemical and radiochemical analyses. ASMS is a flexible system which allows the users to request analyses by line item code. Line item codes are selected based on the Basic Ordering Agreement (BOA) format for contracting with participating laboratories. ASMS also allows contracting with non-BOA laboratories using a similar line item code contracting format for their services. ASMS allows sample and analysis tracking from sample planning and collection in the field through sample shipment, laboratory sample receipt, laboratory analysis and submittal of the requested analyses, electronic data transfer, and payment of the laboratories for the completed analyses. The software, when in operation, contains business-sensitive material that is used as a principal portion of the Kaiser Analytical Management Services business model. The software version provided is the most recent version; however, the copy of the application does not contain business-sensitive data from the associated Oracle tables such as contract information or price per line item code.

  9. Analytical Chemistry Laboratory

    NASA Technical Reports Server (NTRS)

    Anderson, Mark

    2013-01-01

    The Analytical Chemistry and Material Development Group maintains a capability in chemical analysis, materials R&D, failure analysis, and contamination control. The uniquely qualified staff and facility support the needs of flight projects, science instrument development and various technical tasks, as well as Cal Tech.

  10. Analytics: Changing the Conversation

    ERIC Educational Resources Information Center

    Oblinger, Diana G.

    2013-01-01

    In this third and concluding discussion on analytics, the author notes that we live in an information culture. We are accustomed to having information instantly available and accessible, along with feedback and recommendations. We want to know what people think and like (or dislike). We want to know how we compare with "others like me."…

  11. Summary of NDE of Additive Manufacturing Efforts in NASA

    NASA Technical Reports Server (NTRS)

    Waller, Jess; Saulsberry, Regor; Parker, Bradford; Hodges, Kenneth; Burke, Eric; Taminger, Karen

    2014-01-01

    (1) General Rationale for Additive Manufacturing (AM): (a) Operate under a 'design-to-constraint' paradigm, make parts too complicated to fabricate otherwise, (b) Reduce weight by 20 percent with monolithic parts, (c) Reduce waste (green manufacturing), (d) Eliminate reliance on Original Equipment Manufacturers for critical spares, and (e) Extend life of in-service parts by innovative repair methods; (2) NASA OSMA NDE of AM State-of-the-Discipline Report; (3) Overview of NASA AM Efforts at Various Centers: (a) Analytical Tools, (b) Ground-Based Fabrication, (c) Space-Based Fabrication, and (d) Center Activity Summaries; (4) Overview of NASA NDE data to date on AM parts; and (5) Gap Analysis/Recommendations for NDE of AM.

  12. Benchmarking analytical calculations of proton doses in heterogeneous matter

    SciTech Connect

    Ciangaru, George; Polf, Jerimy C.; Bues, Martin; Smith, Alfred R.

    2005-12-15

    A proton dose computational algorithm, performing an analytical superposition of infinitely narrow proton beamlets (ASPB) is introduced. The algorithm uses the standard pencil beam technique of laterally distributing the central axis broad beam doses according to the Moliere scattering theory extended to slablike varying density media. The purpose of this study was to determine the accuracy of our computational tool by comparing it with experimental and Monte Carlo (MC) simulation data as benchmarks. In the tests, parallel wide beams of protons were scattered in water phantoms containing embedded air and bone materials with simple geometrical forms and spatial dimensions of a few centimeters. For homogeneous water and bone phantoms, the proton doses we calculated with the ASPB algorithm were found very comparable to experimental and MC data. For layered bone slab inhomogeneity in water, the comparison between our analytical calculation and the MC simulation showed reasonable agreement, even when the inhomogeneity was placed at the Bragg peak depth. There also was reasonable agreement for the parallelepiped bone block inhomogeneity placed at various depths, except for cases in which the bone was located in the region of the Bragg peak, when discrepancies were as large as more than 10%. When the inhomogeneity was in the form of abutting air-bone slabs, discrepancies of as much as 8% occurred in the lateral dose profiles on the air cavity side of the phantom. Additionally, the analytical depth-dose calculations disagreed with the MC calculations within 3% of the Bragg peak dose, at the entry and midway depths in the phantom. The distal depth-dose 20%-80% fall-off widths and ranges calculated with our algorithm and the MC simulation were generally within 0.1 cm of agreement. The analytical lateral-dose profile calculations showed smaller (by less than 0.1 cm) 20%-80% penumbra widths and shorter fall-off tails than did those calculated by the MC simulations. Overall
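    The superposition idea behind a pencil-beam calculation of this kind can be sketched in a few lines of Python; this toy example is not the authors' ASPB algorithm, and the field size, central-axis dose, and lateral spread values are invented.
      # Toy pencil-beam superposition: a broad beam as a sum of narrow beamlets whose
      # central-axis dose is spread laterally with a depth-dependent Gaussian.
      import numpy as np
      x = np.linspace(-5.0, 5.0, 201)           # lateral positions (cm) at one depth
      beamlet_x = np.arange(-2.0, 2.01, 0.1)    # beamlet positions across a 4 cm field
      central_axis_dose = 1.0                   # relative central-axis dose at this depth
      sigma = 0.4                               # lateral spread (cm) from scattering theory
      dose = np.zeros_like(x)
      for xb in beamlet_x:
          dose += central_axis_dose * np.exp(-(x - xb) ** 2 / (2 * sigma ** 2))
      dose /= dose.max()                        # normalized broad-beam lateral profile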

  13. Benchmarking analytical calculations of proton doses in heterogeneous matter.

    PubMed

    Ciangaru, George; Polf, Jerimy C; Bues, Martin; Smith, Alfred R

    2005-12-01

    A proton dose computational algorithm, performing an analytical superposition of infinitely narrow proton beamlets (ASPB) is introduced. The algorithm uses the standard pencil beam technique of laterally distributing the central axis broad beam doses according to the Moliere scattering theory extended to slablike varying density media. The purpose of this study was to determine the accuracy of our computational tool by comparing it with experimental and Monte Carlo (MC) simulation data as benchmarks. In the tests, parallel wide beams of protons were scattered in water phantoms containing embedded air and bone materials with simple geometrical forms and spatial dimensions of a few centimeters. For homogeneous water and bone phantoms, the proton doses we calculated with the ASPB algorithm were found very comparable to experimental and MC data. For layered bone slab inhomogeneity in water, the comparison between our analytical calculation and the MC simulation showed reasonable agreement, even when the inhomogeneity was placed at the Bragg peak depth. There also was reasonable agreement for the parallelepiped bone block inhomogeneity placed at various depths, except for cases in which the bone was located in the region of the Bragg peak, when discrepancies were as large as more than 10%. When the inhomogeneity was in the form of abutting air-bone slabs, discrepancies of as much as 8% occurred in the lateral dose profiles on the air cavity side of the phantom. Additionally, the analytical depth-dose calculations disagreed with the MC calculations within 3% of the Bragg peak dose, at the entry and midway depths in the phantom. The distal depth-dose 20%-80% fall-off widths and ranges calculated with our algorithm and the MC simulation were generally within 0.1 cm of agreement. The analytical lateral-dose profile calculations showed smaller (by less than 0.1 cm) 20%-80% penumbra widths and shorter fall-off tails than did those calculated by the MC simulations. Overall

  14. Predictive Data Tools Find Uses in Schools

    ERIC Educational Resources Information Center

    Sparks, Sarah D.

    2011-01-01

    The use of analytic tools to predict student performance is exploding in higher education, and experts say the tools show even more promise for K-12 schools, in everything from teacher placement to dropout prevention. Use of such statistical techniques is hindered in precollegiate schools, however, by a lack of researchers trained to help…

  15. Tool to Prioritize Energy Efficiency Investments

    SciTech Connect

    Farese, Philip; Gelman, Rachel; Hendron, Robert

    2012-08-01

    To provide analytic support of the U.S. Department of Energy's Office of the Building Technology Program (BTP), NREL developed a Microsoft Excel-based tool to provide an open and objective comparison of the hundreds of investment opportunities available to BTP. This tool uses established methodologies to evaluate the energy savings and cost of those savings.

  16. Imaging and Analytics: The changing face of Medical Imaging

    NASA Astrophysics Data System (ADS)

    Foo, Thomas

    There have been significant technological advances in imaging capability over the past 40 years. Medical imaging capabilities have developed rapidly, along with technology development in computational processing speed and miniaturization. Moving to all-digital, the number of images that are acquired in a routine clinical examination has increased dramatically from under 50 images in the early days of CT and MRI to more than 500-1000 images today. The staggering number of images that are routinely acquired poses significant challenges for clinicians to interpret the data and to correctly identify the clinical problem. Although the time provided to render a clinical finding has not substantially changed, the amount of data available for interpretation has grown exponentially. In addition, the image quality (spatial resolution) and information content (physiologically-dependent image contrast) has also increased significantly with advances in medical imaging technology. On its current trajectory, medical imaging in the traditional sense is unsustainable. To assist in filtering and extracting the most relevant data elements from medical imaging, image analytics will have a much larger role. Automated image segmentation, generation of parametric image maps, and clinical decision support tools will be needed and developed apace to allow the clinician to manage, extract and utilize only the information that will help improve diagnostic accuracy and sensitivity. As medical imaging devices continue to improve in spatial resolution, functional and anatomical information content, image/data analytics will be more ubiquitous and integral to medical imaging capability.

  17. PV Hourly Simulation Tool

    SciTech Connect

    Dean, Jesse; Metzger, Ian

    2010-12-31

    This software requires inputs of simple general building characteristics and usage information to calculate the energy and cost benefits of solar PV. This tool conducts a complex hourly simulation of solar PV based primarily on the area available on the rooftop. It uses a simplified efficiency calculation method and real panel characteristics. It includes a detailed rate structure to account for time-of-use rates, on-peak and off-peak pricing, and multiple rate seasons. This tool includes the option for advanced system design inputs if they are known. This tool calculates energy savings, demand reduction, cost savings, incentives and building life cycle costs, including simple payback, discounted payback, net-present value, and savings-to-investment ratio. In addition, this tool displays the environmental benefits of a project.
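    The life cycle cost metrics named above can be illustrated with a short Python sketch for a hypothetical project with constant annual savings; this is not the NREL tool, and the cost, savings, and discount rate values are invented.
      # Illustrative life cycle cost metrics for a hypothetical PV project.
      import numpy as np
      cost = 20000.0            # installed cost after incentives ($)
      annual_savings = 2500.0   # energy plus demand cost savings ($/yr)
      rate = 0.03               # real discount rate
      years = 25                # analysis period
      simple_payback = cost / annual_savings
      discounted = annual_savings / (1 + rate) ** np.arange(1, years + 1)
      npv = discounted.sum() - cost
      discounted_payback = int(np.argmax(np.cumsum(discounted) >= cost)) + 1  # first year recovered
      sir = discounted.sum() / cost                                           # savings-to-investment ratio
      print(simple_payback, discounted_payback, npv, sir)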

  18. PV Hourly Simulation Tool

    2010-12-31

    This software requires inputs of simple general building characteristics and usage information to calculate the energy and cost benefits of solar PV. This tool conducts a complex hourly simulation of solar PV based primarily on the area available on the rooftop. It uses a simplified efficiency calculation method and real panel characteristics. It includes a detailed rate structure to account for time-of-use rates, on-peak and off-peak pricing, and multiple rate seasons. This tool includes the option for advanced system design inputs if they are known. This tool calculates energy savings, demand reduction, cost savings, incentives and building life cycle costs, including simple payback, discounted payback, net-present value, and savings-to-investment ratio. In addition, this tool displays the environmental benefits of a project.

  19. Analytical caustic surfaces

    NASA Technical Reports Server (NTRS)

    Schmidt, R. F.

    1987-01-01

    This document discusses the determination of caustic surfaces in terms of rays, reflectors, and wavefronts. Analytical caustics are obtained as a family of lines, a set of points, and several types of equations for geometries encountered in optics and microwave applications. Standard methods of differential geometry are applied under different approaches: directly to reflector surfaces, and alternatively, to wavefronts, to obtain analytical caustics of two sheets or branches. Gauss/Seidel aberrations are introduced into the wavefront approach, forcing the retention of all three coefficients of both the first- and the second-fundamental forms of differential geometry. An existing method for obtaining caustic surfaces through exploitation of the singularities in flux density is examined, and several constant-intensity contour maps are developed using only the intrinsic Gaussian, mean, and normal curvatures of the reflector. Numerous references are provided for extending the material of the present document to the morphologies of caustics and their associated diffraction patterns.

  20. Requirements for Predictive Analytics

    SciTech Connect

    Troy Hiltbrand

    2012-03-01

    It is important to have a clear understanding of how traditional Business Intelligence (BI) and analytics are different and how they fit together in optimizing organizational decision making. With traditional BI, activities are focused primarily on providing context to enhance a known set of information through aggregation, data cleansing and delivery mechanisms. As these organizations mature their BI ecosystems, they achieve a clearer picture of the key performance indicators signaling the relative health of their operations. Organizations that embark on activities surrounding predictive analytics and data mining go beyond simply presenting the data in a manner that will allow decision makers to have a complete context around the information. These organizations generate models based on known information and then apply other organizational data against these models to reveal unknown information.
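    The pattern described in the last sentence, fitting a model on records where the outcome is known and then scoring other organizational data, can be sketched briefly in Python with scikit-learn; the file names, feature columns, and outcome column below are hypothetical.
      # Illustrative sketch of predictive analytics layered on top of existing BI data.
      import pandas as pd
      from sklearn.ensemble import GradientBoostingClassifier
      known = pd.read_csv("history.csv")     # records where the outcome is already known
      new = pd.read_csv("current.csv")       # records to score against the model
      features = ["tenure_months", "orders_per_year", "support_tickets"]
      model = GradientBoostingClassifier().fit(known[features], known["churned"])
      new["churn_probability"] = model.predict_proba(new[features])[:, 1]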

  1. Analytic ICF Hohlraum Energetics

    SciTech Connect

    Rosen, M D; Hammer, J

    2003-08-27

    We apply recent analytic solutions to the radiation diffusion equation to problems of interest for ICF hohlraums. The solutions provide quantitative values for absorbed energy which are of use for generating a desired radiation temperature vs. time within the hohlraum. Comparison of supersonic and subsonic solutions (heat front velocity faster or slower, respectively, than the speed of sound in the x-ray heated material) suggests that there may be some advantage in using high Z metallic foams as hohlraum wall material to reduce hydrodynamic losses, and hence, net absorbed energy by the walls. Analytic and numerical calculations suggest that the loss per unit area might be reduced {approx} 20% through use of foam hohlraum walls. Reduced hydrodynamic motion of the wall material may also reduce symmetry swings, as found for heavy ion targets.

  2. Nuclear analytical chemistry

    SciTech Connect

    Brune, D.; Forkman, B.; Persson, B.

    1984-01-01

    This book covers the general theories and techniques of nuclear chemical analysis, directed at applications in analytical chemistry, nuclear medicine, radiophysics, agriculture, environmental sciences, geological exploration, industrial process control, etc. The main principles of nuclear physics and nuclear detection on which the analysis is based are briefly outlined. An attempt is made to emphasise the fundamentals of activation analysis, detection and activation methods, as well as their applications. The book provides guidance in analytical chemistry, agriculture, environmental and biomedical sciences, etc. The contents include: the nuclear periodic system; nuclear decay; nuclear reactions; nuclear radiation sources; interaction of radiation with matter; principles of radiation detectors; nuclear electronics; statistical methods and spectral analysis; methods of radiation detection; neutron activation analysis; charged particle activation analysis; photon activation analysis; sample preparation and chemical separation; nuclear chemical analysis in biological and medical research; the use of nuclear chemical analysis in the field of criminology; nuclear chemical analysis in environmental sciences, geology and mineral exploration; and radiation protection.

  3. Analytical applications of aptamers

    NASA Astrophysics Data System (ADS)

    Tombelli, S.; Minunni, M.; Mascini, M.

    2007-05-01

    Aptamers are single-stranded DNA or RNA ligands which can be selected for different targets starting from a library of molecules containing randomly created sequences. Aptamers have been selected to bind very different targets, from proteins to small organic dyes. Aptamers are proposed as alternatives to antibodies as biorecognition elements in analytical devices with ever-increasing frequency, in order to satisfy the demand for quick, cheap, simple and highly reproducible analytical devices, especially for protein detection in the medical field or for the detection of smaller molecules in environmental and food analysis. In our recent experience, DNA and RNA aptamers, specific for three different proteins (Tat, IgE and thrombin), have been exploited as bio-recognition elements to develop specific biosensors (aptasensors). These recognition elements have been coupled to piezoelectric quartz crystals and surface plasmon resonance (SPR) devices as transducers, where the aptamers have been immobilized on the gold surface of the crystal electrodes or on SPR chips, respectively.

  4. Analytic holographic superconductor

    NASA Astrophysics Data System (ADS)

    Herzog, Christopher P.

    2010-06-01

    We investigate a holographic superconductor that admits an analytic treatment near the phase transition. In the dual 3+1-dimensional field theory, the phase transition occurs when a scalar operator of scaling dimension two gets a vacuum expectation value. We calculate current-current correlation functions along with the speed of second sound near the critical temperature. We also make some remarks about critical exponents. An analytic treatment is possible because an underlying Heun equation describing the zero mode of the phase transition has a polynomial solution. Amusingly, the treatment here may generalize for an order parameter with any integer spin, and we propose a Lagrangian for a spin-two holographic superconductor.

  5. Avatars in Analytical Gaming

    SciTech Connect

    Cowell, Andrew J.; Cowell, Amanda K.

    2009-08-29

    This paper discusses the design and use of anthropomorphic computer characters as nonplayer characters (NPCs) within analytical games. These new environments allow avatars to play a central role in supporting training and education goals instead of playing the supporting cast role. This new ‘science’ of gaming, driven by high-powered but inexpensive computers, dedicated graphics processors and realistic game engines, enables game developers to create learning and training opportunities on par with expensive real-world training scenarios. However, care and attention must be placed on how avatars are represented and thus perceived. A taxonomy of non-verbal behavior is presented and its application to analytical gaming discussed.

  6. Industrial Analytics Corporation

    SciTech Connect

    Industrial Analytics Corporation

    2004-01-30

    The lost foam casting process is sensitive to the properties of the EPS patterns used for the casting operation. In this project Industrial Analytics Corporation (IAC) has developed a new low voltage x-ray instrument for x-ray radiography of very low mass EPS patterns. IAC has also developed a transmitted visible light method for characterizing the properties of EPS patterns. The systems developed are also applicable to other low density materials including graphite foams.

  7. Management Tools

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Manugistics, Inc. (formerly AVYX, Inc.) has introduced a new programming language for IBM and IBM-compatible computers called TREES-pls. It is a resource management tool originating from the space shuttle program that can be used in such applications as scheduling, resource allocation, project control, information management, and artificial intelligence. Manugistics, Inc. was looking for a flexible tool that could be applied to many problems with minimal adaptation. Among the non-government markets are aerospace, other manufacturing, transportation, health care, food and beverage, and professional services.

  8. Integrated Array/Metadata Analytics

    NASA Astrophysics Data System (ADS)

    Misev, Dimitar; Baumann, Peter

    2015-04-01

    Data comes in various forms and types, and integration usually presents a problem that is often simply ignored and solved with ad-hoc solutions. Multidimensional arrays are a ubiquitous data type that we find at the core of virtually all science and engineering domains, as sensor, model, image, and statistics data. Naturally, arrays are richly described by and intertwined with additional metadata (alphanumeric relational data, XML, JSON, etc.). Database systems, however, a fundamental building block of what we call "Big Data", lack adequate support for modelling and expressing these array data/metadata relationships. Array analytics is hence quite primitive, or absent altogether, in modern relational DBMSs. Recognizing this, we extended SQL with a new SQL/MDA part seamlessly integrating multidimensional array analytics into the standard database query language. We demonstrate the benefits of SQL/MDA with real-world examples executed in ASQLDB, an open-source mediator system based on HSQLDB and rasdaman, that already implements SQL/MDA.

  9. Transcutaneous Measurement of Blood Analyte Concentration Using Raman Spectroscopy

    NASA Astrophysics Data System (ADS)

    Barman, Ishan; Singh, Gajendra P.; Dasari, Ramachandra R.; Feld, Michael S.

    2008-11-01

    Diabetes mellitus is a chronic disorder, affecting nearly 200 million people worldwide. Acute complications, such as hypoglycemia, cardiovascular disease and retinal damage, may occur if the disease is not adequately controlled. As diabetes has no known cure, tight control of glucose levels is critical for the prevention of such complications. Given the necessity for regular monitoring of blood glucose, development of non-invasive glucose detection devices is essential to improve the quality of life in diabetic patients. The commercially available glucose sensors measure the interstitial fluid glucose by electrochemical detection. However, these sensors have severe limitations, primarily related to their invasive nature and lack of stability. This necessitates the development of a truly non-invasive glucose detection technique. NIR Raman Spectroscopy, which combines the substantial penetration depth of NIR light with the excellent chemical specificity of Raman spectroscopy, provides an excellent tool to meet the challenges involved. Additionally, it enables simultaneous determination of multiple blood analytes. Our laboratory has pioneered the use of Raman spectroscopy for blood analytes' detection in biological media. The preliminary success of our non-invasive glucose measurements both in vitro (such as in serum and blood) and in vivo has provided the foundation for the development of feasible clinical systems. However, successful application of this technology still faces a few hurdles, highlighted by the problems of tissue luminescence and selection of appropriate reference concentration. In this article we explore possible avenues to overcome these challenges so that prospective prediction accuracy of blood analytes can be brought to clinically acceptable levels.

  10. Visual Analytics: How Much Visualization and How Much Analytics?

    SciTech Connect

    Keim, Daniel; Mansmann, Florian; Thomas, James J.

    2009-12-16

    The term Visual Analytics has been around for almost five years now, but there are still ongoing discussions about what it actually is and, in particular, what is new about it. The core of our view on Visual Analytics is the new enabling and accessible analytic reasoning interactions supported by the combination of automated and visual analytics. In this paper, we outline the scope of Visual Analytics using two problem classes and three methodological classes in order to work out the need for and purpose of Visual Analytics. The respective methods are explained, together with examples of analytic reasoning interaction, leading to a glimpse into the future of how Visual Analytics methods will enable us to go beyond what is possible when separately using the two methods.

  11. Geometric reasoning about assembly tools

    SciTech Connect

    Wilson, R.H.

    1997-01-01

    Planning for assembly requires reasoning about various tools used by humans, robots, or other automation to manipulate, attach, and test parts and subassemblies. This paper presents a general framework to represent and reason about geometric accessibility issues for a wide variety of such assembly tools. Central to the framework is a use volume encoding a minimum space that must be free in an assembly state to apply a given tool, and placement constraints on where that volume must be placed relative to the parts on which the tool acts. Determining whether a tool can be applied in a given assembly state is then reduced to an instance of the FINDPLACE problem. In addition, the author presents more efficient methods to integrate the framework into assembly planning. For tools that are applied either before or after their target parts are mated, one method pre-processes a single tool application for all possible states of assembly of a product in polynomial time, reducing all later state-tool queries to evaluations of a simple expression. For tools applied after their target parts are mated, a complementary method guarantees polynomial-time assembly planning. The author presents a wide variety of tools that can be described adequately using the approach, and surveys tool catalogs to determine coverage of standard tools. Finally, the author describes an implementation of the approach in an assembly planning system and experiments with a library of over one hundred manual and robotic tools and several complex assemblies.

  12. An analytic model for the Phobos surface

    NASA Technical Reports Server (NTRS)

    Duxbury, Thomas C.

    1991-01-01

    Analytic expressions are derived to model the surface topography and the normal to the surface of Phobos. The analytic expressions comprise a spherical harmonic expansion for the global figure of Phobos, augmented by additional terms for the large crater Stickney and other craters. Over 300 craters were measured in more than 100 Viking Orbiter images to produce the model. In general, the largest craters were measured since they have a significant effect on topography. The topographic model derived has a global spatial and topographic accuracy ranging from about 100 m in areas having the highest resolution and convergent, stereo coverage, up to 500 m in the poorest areas.
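
    As an aside for readers unfamiliar with such models, the general shape of a spherical-harmonic topography expansion with local crater corrections is sketched below in LaTeX; the degree, coefficients, and crater terms actually fitted by Duxbury are not reproduced here, and the symbols are illustrative assumptions.

        % Generic spherical-harmonic topography model with local crater terms
        % (illustrative form only; not the fitted Phobos coefficients).
        r(\phi, \lambda) \;=\; \sum_{l=0}^{L} \sum_{m=0}^{l}
            \left[ C_{lm} \cos(m\lambda) + S_{lm} \sin(m\lambda) \right] P_{lm}(\sin\phi)
            \;+\; \sum_{j} \Delta r_j(\phi, \lambda)

    Here P_lm are associated Legendre functions, C_lm and S_lm are fitted coefficients describing the global figure, and each Δr_j is a local term for Stickney or another measured crater.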

  13. An analytic model for the PHOBOS surface

    NASA Astrophysics Data System (ADS)

    Duxbury, T. C.

    1991-02-01

    Analytic expressions are derived to model the surface topography and the normal to the surface of Phobos. The analytic expressions comprise a spherical harmonic expansion for the global figure of Phobos, augmented by additional terms for the large crater Stickney and other craters. Over 300 craters were measured in more than 100 Viking Orbiter images to produce the model. In general, the largest craters were measured since they have a significant effect on topography. The topographic model derived has a global spatial and topographic accuracy ranging from about 100 m in areas having the highest resolution and convergent, stereo coverage, up to 500 m in the poorest areas.

  14. Efficient evaluation of analytic Fukui functions.

    PubMed

    Flores-Moreno, Roberto; Melin, Junia; Ortiz, J V; Merino, Gabriel

    2008-12-14

    An efficient method for the analytic evaluation of Fukui functions is proposed. Working equations are derived and numerical results are used to validate the method on a medium-sized set of molecules. In addition to the obvious advantages of analytic differentiation, the proposed method is efficient enough to be considered a practical alternative to the finite difference formulation used routinely. The reliability of the approximations used here is demonstrated and discussed. Problems found in other methods for prediction of electrophilic centers are corrected automatically when using the new method.

  15. Forecasting Hotspots-A Predictive Analytics Approach.

    PubMed

    Maciejewski, R; Hafen, R; Rudolph, S; Larew, S G; Mitchell, M A; Cleveland, W S; Ebert, D S

    2011-04-01

    Current visual analytics systems provide users with the means to explore trends in their data. Linked views and interactive displays provide insight into correlations among people, events, and places in space and time. Analysts search for events of interest through statistical tools linked to visual displays, drill down into the data, and form hypotheses based upon the available information. However, current systems stop short of predicting events. In spatiotemporal data, analysts are searching for regions of space and time with unusually high incidences of events (hotspots). In the cases where hotspots are found, analysts would like to predict how these regions may grow in order to plan resource allocation and preventative measures. Furthermore, analysts would also like to predict where future hotspots may occur. To facilitate such forecasting, we have created a predictive visual analytics toolkit that provides analysts with linked spatiotemporal and statistical analytic views. Our system models spatiotemporal events through the combination of kernel density estimation for event distribution and seasonal trend decomposition by loess smoothing for temporal predictions. We provide analysts with estimates of error in our modeling, along with spatial and temporal alerts to indicate the occurrence of statistically significant hotspots. Spatial data are distributed based on a modeling of previous event locations, thereby maintaining a temporal coherence with past events. Such tools allow analysts to perform real-time hypothesis testing, plan intervention strategies, and allocate resources to correspond to perceived threats.
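
    A minimal Python sketch of the kernel density estimation step described above is given below; the event coordinates, grid, and percentile threshold are illustrative assumptions, and the seasonal-trend (loess) temporal component of the toolkit is not reproduced.

        # Hotspot detection by kernel density estimation over synthetic event locations.
        # Cells whose density falls in the top 5% are flagged as candidate hotspots.
        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(0)
        events = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(500, 2))  # synthetic (x, y) event locations

        kde = gaussian_kde(events.T)                       # fit a 2-D KDE to the event locations
        xs, ys = np.meshgrid(np.linspace(-2, 2, 100), np.linspace(-2, 2, 100))
        density = kde(np.vstack([xs.ravel(), ys.ravel()])).reshape(xs.shape)

        threshold = np.percentile(density, 95)             # illustrative hotspot threshold
        hotspot_mask = density >= threshold
        print("candidate hotspot grid cells:", int(hotspot_mask.sum()))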

  16. Functional Interfaces Constructed by Controlled/Living Radical Polymerization for Analytical Chemistry.

    PubMed

    Wang, Huai-Song; Song, Min; Hang, Tai-Jun

    2016-02-10

    The high-value applications of functional polymers in analytical science generally require well-defined interfaces, including precisely synthesized molecular architectures and compositions. Controlled/living radical polymerization (CRP) has been developed as a versatile and powerful tool for the preparation of polymers with narrow molecular weight distributions and predetermined molecular weights. Among the CRP system, atom transfer radical polymerization (ATRP) and reversible addition-fragmentation chain transfer (RAFT) are well-used to develop new materials for analytical science, such as surface-modified core-shell particles, monoliths, MIP micro- or nanospheres, fluorescent nanoparticles, and multifunctional materials. In this review, we summarize the emerging functional interfaces constructed by RAFT and ATRP for applications in analytical science. Various polymers with precisely controlled architectures including homopolymers, block copolymers, molecular imprinted copolymers, and grafted copolymers were synthesized by CRP methods for molecular separation, retention, or sensing. We expect that the CRP methods will become the most popular technique for preparing functional polymers that can be broadly applied in analytical chemistry.

  17. Functional Interfaces Constructed by Controlled/Living Radical Polymerization for Analytical Chemistry.

    PubMed

    Wang, Huai-Song; Song, Min; Hang, Tai-Jun

    2016-02-10

    The high-value applications of functional polymers in analytical science generally require well-defined interfaces, including precisely synthesized molecular architectures and compositions. Controlled/living radical polymerization (CRP) has been developed as a versatile and powerful tool for the preparation of polymers with narrow molecular weight distributions and predetermined molecular weights. Among the CRP system, atom transfer radical polymerization (ATRP) and reversible addition-fragmentation chain transfer (RAFT) are well-used to develop new materials for analytical science, such as surface-modified core-shell particles, monoliths, MIP micro- or nanospheres, fluorescent nanoparticles, and multifunctional materials. In this review, we summarize the emerging functional interfaces constructed by RAFT and ATRP for applications in analytical science. Various polymers with precisely controlled architectures including homopolymers, block copolymers, molecular imprinted copolymers, and grafted copolymers were synthesized by CRP methods for molecular separation, retention, or sensing. We expect that the CRP methods will become the most popular technique for preparing functional polymers that can be broadly applied in analytical chemistry. PMID:26785308

  18. Extension of the standard addition method by blank addition.

    PubMed

    Steliopoulos, Panagiotis

    2015-01-01

    Standard addition involves adding varying amounts of the analyte to sample portions of fixed mass or fixed volume and submitting those portions to the sample preparation procedure. After measuring the final extract solutions, the observed signals are linearly regressed on the spiked amounts. The original unknown amount is estimated by the opposite of the abscissa intercept of the fitted straight line [1]. A limitation of this method is that only data points with abscissa values equal to and greater than zero are available, so that there is no information on whether linearity holds below the spiking level zero. An approach to overcome this limitation is introduced. Standard addition is combined with blank addition, where blank addition means that defined mixtures of blank matrix and sample material are subjected to sample preparation to give final extract solutions. Equations are presented to estimate the original unknown amount and to calculate the 1-2α confidence interval about this estimate using the combined data set.
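
    The classical standard-addition estimate described above (the opposite of the abscissa intercept of the signal-versus-spike regression) can be sketched in a few lines of Python; the signal values below are invented for illustration, and the blank-addition equations of the paper are not reproduced.

        # Standard addition: regress signal on spiked amount and estimate the
        # original analyte amount from the fitted line's abscissa intercept.
        import numpy as np

        spiked = np.array([0.0, 1.0, 2.0, 4.0])       # added analyte amounts (illustrative units)
        signal = np.array([2.1, 4.0, 6.2, 10.1])      # measured responses (invented values)

        slope, intercept = np.polyfit(spiked, signal, 1)
        x0 = intercept / slope                        # opposite of the abscissa intercept
        print(f"estimated original amount: {x0:.2f}")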

  19. Extension of the standard addition method by blank addition

    PubMed Central

    Steliopoulos, Panagiotis

    2015-01-01

    Standard addition involves adding varying amounts of the analyte to sample portions of fixed mass or fixed volume and submitting those portions to the sample preparation procedure. After measuring the final extract solutions, the observed signals are linearly regressed on the spiked amounts. The original unknown amount is estimated by the opposite of the abscissa intercept of the fitted straight line [1]. A limitation of this method is that only data points with abscissa values equal to and greater than zero are available, so that there is no information on whether linearity holds below the spiking level zero. An approach to overcome this limitation is introduced. Standard addition is combined with blank addition, where blank addition means that defined mixtures of blank matrix and sample material are subjected to sample preparation to give final extract solutions. Equations are presented to estimate the original unknown amount and to calculate the 1-2α confidence interval about this estimate using the combined data set. PMID:26844210

  20. MERRA Analytic Services

    NASA Astrophysics Data System (ADS)

    Schnase, J. L.; Duffy, D. Q.; McInerney, M. A.; Tamkin, G. S.; Thompson, J. H.; Gill, R.; Grieg, C. M.

    2012-12-01

    MERRA Analytic Services (MERRA/AS) is a cyberinfrastructure resource for developing and evaluating a new generation of climate data analysis capabilities. MERRA/AS supports OBS4MIP activities by reducing the time spent in the preparation of Modern Era Retrospective-Analysis for Research and Applications (MERRA) data used in data-model intercomparison. It also provides a testbed for experimental development of high-performance analytics. MERRA/AS is a cloud-based service built around the Virtual Climate Data Server (vCDS) technology that is currently used by the NASA Center for Climate Simulation (NCCS) to deliver Intergovernmental Panel on Climate Change (IPCC) data to the Earth System Grid Federation (ESGF). Crucial to its effectiveness, MERRA/AS's servers will use a workflow-generated realizable object capability to perform analyses over the MERRA data using the MapReduce approach to parallel storage-based computation. The results produced by these operations will be stored by the vCDS, which will also be able to host code sets for those who wish to explore the use of MapReduce for more advanced analytics. While the work described here will focus on the MERRA collection, these technologies can be used to publish other reanalysis, observational, and ancillary OBS4MIP data to ESGF and, importantly, offer an architectural approach to climate data services that can be generalized to applications and customers beyond the traditional climate research community. In this presentation, we describe our approach, experiences, lessons learned, and plans for the future. [Figure: (A) MERRA/AS software stack. (B) Example MERRA/AS interfaces.]

  1. Proficiency analytical testing program

    SciTech Connect

    Groff, J.H.; Schlecht, P.C.

    1994-03-01

    The Proficiency Analytical Testing (PAT) Program is a collaborative effort of the American Industrial Hygiene Association (AIHA) and researchers at the Centers for Disease Control and Prevention (CDC), National Institute for Occupational Safety and Health (NIOSH). The PAT Program provides quality control reference samples to over 1400 occupational health and environmental laboratories in over 15 countries. Although one objective of the PAT Program is to evaluate the analytical ability of participating laboratories, the primary objective is to assist these laboratories in improving their laboratory performance. Each calendar quarter (designated a round), samples are mailed to participating laboratories and the data are analyzed to evaluate laboratory performance on a series of analyses. Each mailing and subsequent data analysis is completed in time for participants to obtain repeat samples and to correct analytical problems before the next calendar quarter starts. The PAT Program currently includes four sets of samples. A mixture of 3 of the 4 possible metals, and 3 of the 15 possible organic solvents are rotated for each round. Laboratories are evaluated for each analysis by comparing their reported results against an acceptable performance limit for each PAT Program sample the laboratory analyzes. Reference laboratories are preselected to provide the performance limits for each sample. These reference laboratories must meet the following criteria: (1) the laboratory was rated proficient in the last PAT evaluation of all the contaminants in the Program; and (2) the laboratory, if located in the United States, is AIHA accredited. Data are acceptable if they fall within the performance limits. Laboratories are rated based upon performance in the PAT Program over the last year (i.e., four calendar quarters), as well as on individual contaminant performance and overall performance. 1 ref., 3 tabs.

  2. Proficiency analytical testing program

    SciTech Connect

    Schlecht, P.C.; Groff, J.H.

    1994-06-01

    The Proficiency Analytical Testing (PAT) Program is a collaborative effort of the American Industrial Hygiene Association (AIHA) and researchers at the Centers for Disease Control and Prevention, National Institute for Occupational Safety and Health (NIOSH). The PAT Program provides quality control reference samples to over 1400 occupational health and environmental laboratories in over 15 countries. Although one objective of the PAT Program is to evaluate the analytical ability of participating laboratories, the primary objective is to assist these laboratories in improving their laboratory performance. Each calendar quarter (designated a round), samples are mailed to participating laboratories and the data are analyzed to evaluate laboratory performance on a series of analyses. Each mailing and subsequent data analysis is completed in time for participants to obtain repeat samples and to correct analytical problems before the next calendar quarter starts. The PAT Program currently includes four sets of samples. A mixture of 3 of the 4 possible metals, and 3 of the 15 possible organic solvents are rotated for each round. Laboratories are evaluated for each analysis by comparing their reported results against an acceptable performance limit for each PAT Program sample the laboratory analyses. Reference laboratories are preselected to provide the performance limits for each sample. These reference laboratories must meet the following criteria: (1) the laboratory was rated proficient in the last PAT evaluation of all the contaminants in the Program; and (2) the laboratory, if located in the United States, is AIHA accredited. Data are acceptable if they fall within the performance limits. Laboratories are rated based upon performance in the PAT Program over the last year (i.e., four calendar quarters), as well as on individual contaminant performance and overall performance. 1 ref., 3 tabs.

  3. Analytical chemistry of nickel.

    PubMed

    Stoeppler, M

    1984-01-01

    Analytical chemists are faced with nickel contents in environmental and biological materials ranging from the mg/kg down to the ng/kg level. Sampling and sample treatment have to be performed with great care at lower levels, and this also applies to enrichment and separation procedures. The classical determination methods formerly used have been replaced almost entirely by different forms of atomic absorption spectrometry. Electroanalytical methods are also of increasing importance and at present provide the most sensitive approach. Despite the powerful methods available, achieving reliable results is still a challenge for the analyst requiring proper quality control measures.

  4. Automation of analytical isotachophoresis

    NASA Technical Reports Server (NTRS)

    Thormann, Wolfgang

    1985-01-01

    The basic features of automation of analytical isotachophoresis (ITP) are reviewed. Experimental setups consisting of narrow bore tubes which are self-stabilized against thermal convection are considered. Sample detection in free solution is discussed, listing the detector systems presently used or expected to be of potential use in the near future. The combination of a universal detector measuring the evolution of ITP zone structures with detector systems specific to desired components is proposed as a concept of an automated chemical analyzer based on ITP. Possible miniaturization of such an instrument by means of microlithographic techniques is discussed.

  5. Thermodynamics of Gas Turbine Cycles with Analytic Derivatives in OpenMDAO

    NASA Technical Reports Server (NTRS)

    Gray, Justin; Chin, Jeffrey; Hearn, Tristan; Hendricks, Eric; Lavelle, Thomas; Martins, Joaquim R. R. A.

    2016-01-01

    A new equilibrium thermodynamics analysis tool was built based on the CEA method using the OpenMDAO framework. The new tool provides forward and adjoint analytic derivatives for use with gradient-based optimization algorithms. The new tool was validated against the original CEA code to ensure an accurate analysis, and the analytic derivatives were validated against finite-difference approximations. Performance comparisons between analytic and finite-difference methods showed a significant speed advantage for the analytic methods. To further test the new analysis tool, a sample optimization was performed to find the optimal air-fuel equivalence ratio that maximizes combustion temperature for a range of different pressures. Collectively, the results demonstrate the viability of the new tool to serve as the thermodynamic backbone for future work on a full propulsion modeling tool.
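
    Validating analytic derivatives against finite-difference approximations, as mentioned above, follows a standard pattern; a minimal Python sketch is shown below, with a stand-in function rather than the CEA equilibrium model.

        # Compare an analytic derivative with a central finite-difference estimate.
        import numpy as np

        def f(x):
            return np.sin(x) * np.exp(-0.1 * x)           # stand-in model output

        def dfdx_analytic(x):
            return np.exp(-0.1 * x) * (np.cos(x) - 0.1 * np.sin(x))

        x, h = 1.3, 1e-6
        dfdx_fd = (f(x + h) - f(x - h)) / (2.0 * h)       # central difference
        rel_err = abs(dfdx_fd - dfdx_analytic(x)) / abs(dfdx_analytic(x))
        print(f"analytic: {dfdx_analytic(x):.8f}  finite diff: {dfdx_fd:.8f}  rel. err: {rel_err:.2e}")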

  6. Analytical quality--what should we be aiming for?

    PubMed

    Sikaris, Ken

    2008-08-01

    ISO 15189 5.5.1 "The laboratory shall use examination procedures, ... which meet the needs of the users of laboratory services and are appropriate for the examinations. Requirements for analytical quality include: understanding the analytical goal; seeking an assay that fulfills those goals; establishing your own performance with that assay; setting warning and action limits for your assay; applying quality control tools to every important step.

  7. Semi-analytic technique for analyzing mode-locked lasers

    SciTech Connect

    Usechak, N.G.; Agrawal, G.P.

    2005-03-21

    A semi-analytic tool is developed for investigating pulse dynamics in mode-locked lasers. It provides a set of rate equations for pulse energy, width, and chirp, whose solutions predict how these pulse parameters evolve from one round trip to the next and how they approach their final steady-state values. An actively mode-locked laser is investigated using this technique and the results are in excellent agreement with numerical simulations and previous analytical studies.

  8. Risk analytics for hedge funds

    NASA Astrophysics Data System (ADS)

    Pareek, Ankur

    2005-05-01

    The rapid growth of the hedge fund industry presents significant business opportunity for institutional investors, particularly in the form of portfolio diversification. To facilitate this, there is a need to develop a new set of risk analytics for investments consisting of hedge funds, with the ultimate aim of creating transparency in risk measurement without compromising the proprietary investment strategies of hedge funds. As well documented in the literature, the use of dynamic, options-like strategies by most hedge funds makes their returns highly non-normal, with fat tails and high kurtosis, thus rendering Value at Risk (VaR) and other mean-variance analysis methods unsuitable for hedge fund risk quantification. This paper looks at some unique concerns for hedge fund risk management and will concentrate in particular on two approaches from the physical world to model the non-linearities and dynamic correlations in hedge fund portfolio returns: Self-Organized Criticality (SOC) and Random Matrix Theory (RMT). Random Matrix Theory analyzes the correlation matrix between different hedge fund styles and filters random noise from genuine correlations arising from interactions within the system. As seen in the results of the portfolio risk analysis, it leads to better portfolio risk forecastability and thus to optimum allocation of resources to different hedge fund styles. The results also prove the efficacy of self-organized criticality and implied portfolio correlation as tools for risk management and style selection for portfolios of hedge funds, being particularly effective during non-linear market crashes.
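
    The Random Matrix Theory filtering idea can be illustrated with a short Python sketch: eigenvalues of an empirical correlation matrix are compared with the Marchenko-Pastur bounds expected for pure noise, and only eigenvalues outside that band are treated as genuine correlation structure. The return series below are synthetic, not hedge fund data.

        # Compare correlation-matrix eigenvalues against the Marchenko-Pastur noise band.
        import numpy as np

        rng = np.random.default_rng(1)
        T, N = 500, 20                                    # observations x fund styles (illustrative)
        returns = rng.standard_normal((T, N))

        corr = np.corrcoef(returns, rowvar=False)
        eigvals = np.linalg.eigvalsh(corr)

        q = N / T                                         # aspect ratio
        lam_min, lam_max = (1 - np.sqrt(q)) ** 2, (1 + np.sqrt(q)) ** 2
        signal = eigvals[eigvals > lam_max]               # eigenvalues above the noise band
        print("noise band:", (round(lam_min, 3), round(lam_max, 3)))
        print("eigenvalues carrying genuine structure:", signal)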

  9. Downhole tool

    SciTech Connect

    Hall, David R.; Muradov, Andrei; Pixton, David S.; Dahlgren, Scott Steven; Briscoe, Michael A.

    2007-03-20

    A double shouldered downhole tool connection comprises box and pin connections having mating threads intermediate mating primary and secondary shoulders. The connection further comprises a secondary shoulder component retained in the box connection intermediate a floating component and the primary shoulders. The secondary shoulder component and the pin connection cooperate to transfer a portion of makeup load to the box connection. The downhole tool may be selected from the group consisting of drill pipe, drill collars, production pipe, and reamers. The floating component may be selected from the group consisting of electronics modules, generators, gyroscopes, power sources, and stators. The secondary shoulder component may comprise an interface to the box connection selected from the group consisting of radial grooves, axial grooves, tapered grooves, radial protrusions, axial protrusions, tapered protrusions, shoulders, and threads.

  10. Cordless Tool

    NASA Technical Reports Server (NTRS)

    2001-01-01

    Apollo-era technology spurred the development of cordless products that we take for granted every day. In the 1960s, NASA asked Black & Decker to develop a special drill that would be powerful enough to cut through hard layers of the lunar surface and be lightweight, compact, and able to operate under its own power source, allowing Apollo astronauts to collect lunar samples further away from the Lunar Excursion Module. In response, Black & Decker developed a computer program that analyzed and optimized drill motor operations. From their analysis, engineers were able to design a motor that was powerful yet required minimal battery power to operate. Since those first days of cordless products, Black & Decker has continued to refine this technology and now sells its rechargeable products worldwide (e.g., the Dustbuster, cordless tools for home and industrial use, and medical tools).

  11. Visual analytics in medical education: impacting analytical reasoning and decision making for quality improvement.

    PubMed

    Vaitsis, Christos; Nilsson, Gunnar; Zary, Nabil

    2015-01-01

    The medical curriculum is the main tool representing the entire undergraduate medical education. Due to its complexity and multilayered structure, it is of limited use to teachers in medical education for quality improvement purposes. In this study we evaluated three visualizations of curriculum data from a pilot course, using teachers from an undergraduate medical program and applying visual analytics methods. We found that visual analytics can be used to positively impact analytical reasoning and decision making in medical education through the realization of variables capable of enhancing human perception and cognition of complex curriculum data. The positive results derived from our evaluation of a medical curriculum, albeit at a small scale, signify the need to expand this method to an entire medical curriculum. As our approach sustains low levels of complexity, it opens a new promising direction in medical education informatics research.

  12. The GNEMRE Dendro Tool.

    SciTech Connect

    Merchant, Bion John

    2007-10-01

    The GNEMRE Dendro Tool provides a previously unrealized analysis capability in the field of nuclear explosion monitoring. Dendro Tool allows analysts to quickly and easily determine the similarity between seismic events using the waveform time-series for each of the events to compute cross-correlation values. Events can then be categorized into clusters of similar events. This analysis technique can be used to characterize historical archives of seismic events in order to determine many of the unique sources that are present. In addition, the source of any new events can be quickly identified simply by comparing the new event to the historical set.
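
    The core similarity measure described above, waveform cross-correlation, can be sketched in Python as follows; the waveforms and the clustering threshold are illustrative assumptions, not Dendro Tool's actual implementation.

        # Similarity of two seismic waveforms via maximum normalized cross-correlation.
        import numpy as np

        def max_normalized_xcorr(a, b):
            a = (a - a.mean()) / (a.std() * len(a))
            b = (b - b.mean()) / b.std()
            return float(np.max(np.correlate(a, b, mode="full")))

        t = np.linspace(0.0, 1.0, 500)
        event1 = np.sin(2 * np.pi * 12 * t) * np.exp(-3 * t)            # synthetic waveform
        event2 = np.roll(event1, 25) + 0.05 * np.random.default_rng(2).standard_normal(t.size)

        similarity = max_normalized_xcorr(event1, event2)
        print(f"max normalized cross-correlation: {similarity:.3f}")
        print("same cluster" if similarity > 0.8 else "different clusters")  # illustrative cutoff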

  13. CMS tracker visualization tools

    NASA Astrophysics Data System (ADS)

    Mennea, M. S.; Osborne, I.; Regano, A.; Zito, G.

    2005-08-01

    This document will review the design considerations, implementations and performance of the CMS Tracker Visualization tools. In view of the great complexity of this sub-detector (more than 50 million channels organized in 16540 modules, each of which is a complete detector), the standard CMS visualization tools (IGUANA and IGUANACMS), which provide basic 3D capabilities and integration within the CMS framework, respectively, have been complemented with additional 2D graphics objects. Based on the experience acquired using this software to debug and understand both hardware and software during the construction phase, we propose possible future improvements to cope with online monitoring and event analysis during data taking.

  14. Automated analytical microarrays: a critical review.

    PubMed

    Seidel, Michael; Niessner, Reinhard

    2008-07-01

    Microarrays provide a powerful analytical tool for the simultaneous detection of multiple analytes in a single experiment. The specific affinity reaction of nucleic acids (hybridization) and of antibodies towards antigens is the most common bioanalytical method for generating multiplexed quantitative results. Nucleic acid-based analysis is restricted to the detection of cells and viruses. Antibodies are more universal biomolecular receptors that selectively bind small molecules such as pesticides, small toxins, and pharmaceuticals, as well as biopolymers (e.g. toxins, allergens) and complex biological structures like bacterial cells and viruses. By producing an appropriate antibody, the corresponding antigenic analyte can be detected on a multiplexed immunoanalytical microarray. Food and water analysis, along with clinical diagnostics, constitute potential application fields for multiplexed analysis. Diverse fluorescence, chemiluminescence, electrochemical, and label-free microarray readout systems have been developed in the last decade. Some of them are constructed as flow-through microarrays by combination with a fluidic system. Microarrays have the potential to become widely accepted as a system for analytical applications, provided that robust and validated results on fully automated platforms are successfully generated. This review gives an overview of the current research on microarrays with the focus on automated systems and quantitative multiplexed applications.

  15. Evaluation of clinical chemistry analytes from a single mouse using diluted plasma: effective way to reduce the number of animals in toxicity studies.

    PubMed

    Goyal, Vinod Kumar; Pandey, Santosh Kumar; Kakade, Somesh; Nirogi, Ramakrishna

    2016-10-01

    Clinical chemistry is an essential analytical tool in many areas of research, drug assessment and development, and in the evaluation of general health. A certain amount of blood is required to evaluate all blood analytes. In experiments where mice are used, it is difficult to measure all analytes due to the small amount of blood that can be obtained from a single animal. To overcome this problem, separate cohorts of animals are used in toxicity studies for hematology and biochemistry analysis. This requires the use of extra animals and additional resources, and interpretation of results derived from these different animals can be unreliable. This study was undertaken to explore the possibility of using diluted plasma for measuring various biochemistry analytes. Plasma from mice was diluted 3-, 5- and 10-fold with Water for Injection, and various biochemistry analytes were analyzed using an automated analyzer. Results of diluted and undiluted plasma from the same mouse were compared. Most of the analytes from the diluted plasma were found to be well within the ranges of the undiluted plasma, except for sodium, potassium and chloride. Diluting plasma to analyze some analytes also freed up undiluted plasma for analyzing electrolytes. In conclusion, in order to obtain reliable and interpretable data from a single mouse it is worthwhile considering diluting the plasma, which should reduce the number of animals used in an experiment.
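
    The dilution arithmetic implied above is straightforward; a minimal Python sketch, with invented values, is:

        # Scale a concentration measured in diluted plasma back to the undiluted estimate.
        dilution_factor = 5            # e.g., plasma diluted 5-fold with Water for Injection
        measured = 24.6                # concentration measured in the diluted sample (illustrative)

        estimated_undiluted = measured * dilution_factor
        print(f"estimated undiluted concentration: {estimated_undiluted}")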

  16. Tools to assess tissue quality.

    PubMed

    Neumeister, Veronique M

    2014-03-01

    Biospecimen science has recognized the importance of tissue quality for accurate molecular and biomarker analysis and efforts are made to standardize tissue procurement, processing and storage conditions of tissue samples. At the same time the field has emphasized the lack of standardization of processes between different laboratories, the variability inherent in the analytical phase and the lack of control over the pre-analytical phase of tissue processing. The problem extends back into tissue samples in biorepositories, which are often decades old and where documentation about tissue processing might not be available. This review highlights pre-analytical variations in tissue handling, processing, fixation and storage and emphasizes the effects of these variables on nucleic acids and proteins in harvested tissue. Finally current tools for quality control regarding molecular or biomarker analysis are summarized and discussed.

  17. Analytical and Computational Aspects of Collaborative Optimization

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia M.; Lewis, Robert Michael

    2000-01-01

    Bilevel problem formulations have received considerable attention as an approach to multidisciplinary optimization in engineering. We examine the analytical and computational properties of one such approach, collaborative optimization. The resulting system-level optimization problems suffer from inherent computational difficulties due to the bilevel nature of the method. Most notably, it is impossible to characterize and hence identify solutions of the system-level problems because the standard first-order conditions for solutions of constrained optimization problems do not hold. The analytical features of the system-level problem make it difficult to apply conventional nonlinear programming algorithms. Simple examples illustrate the analysis and the algorithmic consequences for optimization methods. We conclude with additional observations on the practical implications of the analytical and computational properties of collaborative optimization.

  18. The analytic renormalization group

    NASA Astrophysics Data System (ADS)

    Ferrari, Frank

    2016-08-01

    Finite temperature Euclidean two-point functions in quantum mechanics or quantum field theory are characterized by a discrete set of Fourier coefficients G_k, k ∈ Z, associated with the Matsubara frequencies ν_k = 2πk/β. We show that analyticity implies that the coefficients G_k must satisfy an infinite number of model-independent linear equations that we write down explicitly. In particular, we construct "Analytic Renormalization Group" linear maps A_μ which, for any choice of cut-off μ, allow one to express the low energy Fourier coefficients for |ν_k| < μ (with the possible exception of the zero mode G_0), together with the real-time correlators and spectral functions, in terms of the high energy Fourier coefficients for |ν_k| ≥ μ. Operating a simple numerical algorithm, we show that the exact universal linear constraints on G_k can be used to systematically improve any random approximate data set obtained, for example, from Monte-Carlo simulations. Our results are illustrated on several explicit examples.

  19. Infrared Spectroscopy as a Chemical Fingerprinting Tool

    NASA Technical Reports Server (NTRS)

    Huff, Timothy L.

    2003-01-01

    Infrared (IR) spectroscopy is a powerful analytical tool in the chemical fingerprinting of materials. Any sample material that will interact with infrared light produces a spectrum and, although normally associated with organic materials, inorganic compounds may also be infrared active. The technique is rapid, reproducible and usually non-invasive to the sample. That it is non-invasive allows for additional characterization of the original material using other analytical techniques, including thermal analysis and Raman spectroscopic techniques. With the appropriate accessories, the technique can be used to examine samples in liquid, solid or gas phase. Both aqueous and non-aqueous free-flowing solutions can be analyzed, as can viscous liquids such as heavy oils and greases. Solid samples of varying sizes and shapes may also be examined, and with the addition of microscopic IR (microspectroscopy) capabilities, minute materials such as single fibers and threads may be analyzed. With the addition of appropriate software, microspectroscopy can be used for automated discrete point or compositional surface area mapping, with the latter providing a means to record changes in the chemical composition of a material surface over a defined area. Due to the ability to characterize gaseous samples, IR spectroscopy can also be coupled with thermal processes such as thermogravimetric (TG) analyses to provide both thermal and chemical data in a single run. In this configuration, solids (or liquids) heated in a TG analyzer undergo decomposition, with the evolving gases directed into the IR spectrometer. Thus, information is provided on the thermal properties of a material and the order in which its chemical constituents are broken down during incremental heating. Specific examples of these varied applications will be cited, with data interpretation and method limitations further discussed.

  20. The forensic validity of visual analytics

    NASA Astrophysics Data System (ADS)

    Erbacher, Robert F.

    2008-01-01

    The wider use of visualization and visual analytics in wide-ranging fields has led to the need for visual analytics capabilities to be legally admissible, especially when applied to digital forensics. This brings the need to consider legal implications when performing visual analytics, an issue not traditionally examined in visualization and visual analytics techniques and research. While digital data is generally admissible under the Federal Rules of Evidence [10][21], a comprehensive validation of the digital evidence is considered prudent. A comprehensive validation requires validation of the digital data under rules for authentication, hearsay, best evidence rule, and privilege. Additional issues with digital data arise when exploring digital data related to admissibility and the validity of what information was examined, to what extent, and whether the analysis process was sufficiently covered by a search warrant. For instance, a search warrant generally covers very narrow requirements as to what law enforcement is allowed to examine and acquire during an investigation. When searching a hard drive for child pornography, how admissible is evidence of an unrelated crime, e.g. drug dealing? This is further complicated by the concept of "in plain view": when performing an analysis of a hard drive, what would be considered "in plain view"? The purpose of this paper is to discuss the issues of digital forensics and the related issues as they apply to visual analytics and identify how visual analytics techniques fit into the digital forensics analysis process, how visual analytics techniques can improve the legal admissibility of digital data, and identify what research is needed to further improve this process. The goal of this paper is to open up consideration of legal ramifications among the visualization community; the author is not a lawyer and the discussions are not meant to be inclusive of all differences in laws between states and

  1. Knowledge-analytics synergy in Clinical Decision Support.

    PubMed

    Slonim, Noam; Carmeli, Boaz; Goldsteen, Abigail; Keller, Oliver; Kent, Carmel; Rinott, Ruty

    2012-01-01

    Clinical Decision Support (CDS) systems hold tremendous potential for improving patient care. Most existing systems are knowledge-based tools that rely on relatively simple rules. More recent approaches rely on analytics techniques to automatically mine EHR data to reveal meaningful insights. Here, we propose the Knowledge-Analytics Synergy paradigm for CDS, in which we synergistically combine existing relevant knowledge with analytics applied to EHR data. We propose a framework for implementing such a paradigm and demonstrate its principles over real-world clinical and genomic data of hypertensive patients.

  2. TACT: The Action Computation Tool

    NASA Astrophysics Data System (ADS)

    Sanders, Jason L.; Binney, James

    2015-12-01

    The Action Computation Tool (TACT) tests methods for estimating actions, angles and frequencies of orbits in both axisymmetric and triaxial potentials, including general spherical potentials, analytic potentials (Isochrone and Harmonic oscillator), axisymmetric Stackel fudge, average generating function from orbit (AvGF), and others. It is written in C++; code is provided to compile the routines into a Python library. TM (ascl:1512.014) and LAPACK are required to access some features.

  3. Normality in analytical psychology.

    PubMed

    Myers, Steve

    2013-12-01

    Although C.G. Jung's interest in normality wavered throughout his career, it was one of the areas he identified in later life as worthy of further research. He began his career using a definition of normality which would have been the target of Foucault's criticism, had Foucault chosen to review Jung's work. However, Jung then evolved his thinking to a standpoint that was more aligned to Foucault's own. Thereafter, the post Jungian concept of normality has remained relatively undeveloped by comparison with psychoanalysis and mainstream psychology. Jung's disjecta membra on the subject suggest that, in contemporary analytical psychology, too much focus is placed on the process of individuation to the neglect of applications that consider collective processes. Also, there is potential for useful research and development into the nature of conflict between individuals and societies, and how normal people typically develop in relation to the spectrum between individuation and collectivity.

  4. Analytic pion form factor

    NASA Astrophysics Data System (ADS)

    Lomon, Earle L.; Pacetti, Simone

    2016-09-01

    The pion electromagnetic form factor and two-pion production in electron-positron collisions are simultaneously fitted by a vector dominance model evolving to perturbative QCD at large momentum transfer. This model was previously successful in simultaneously fitting the nucleon electromagnetic form factors (spacelike region) and the electromagnetic production of nucleon-antinucleon pairs (timelike region). For this pion case, dispersion relations are used to produce the analytic connection of the spacelike and timelike regions. The fit to all the data is good, especially for the newer sets of timelike data. The description of high-q² data, in the timelike region, requires one more meson with ρ quantum numbers than listed in the 2014 Particle Data Group review.

  5. ANALYTIC MODELING OF STARSHADES

    SciTech Connect

    Cash, Webster

    2011-09-01

    External occulters, otherwise known as starshades, have been proposed as a solution to one of the highest priority yet technically vexing problems facing astrophysics: the direct imaging and characterization of terrestrial planets around other stars. New apodization functions, developed over the past few years, now enable starshades of just a few tens of meters diameter to occult central stars so efficiently that the orbiting exoplanets can be revealed and other high-contrast imaging challenges addressed. In this paper, an analytic approach to the analysis of these apodization functions is presented. It is used to develop a tolerance analysis suitable for use in designing practical starshades. The results provide a mathematical basis for understanding starshades and a quantitative approach to setting tolerances.

  6. VERDE Analytic Modules

    2008-01-15

    The Verde Analytic Modules permit the user to ingest openly available data feeds about phenomenology (storm tracks, wind, precipitation, earthquakes, wildfires, and similar natural and manmade power grid disruptions) and forecast power outages, restoration times, customers outaged, and key facilities that will lose power. Damage areas are predicted using historic damage criteria of the affected area. The modules use a cellular automata approach to estimating the distribution circuits assigned to geo-located substations. Population estimates served within the service areas are located within 1 km grid cells and converted to customer counts by conversion through demographic estimation of households and commercial firms within the population cells. Restoration times are estimated by agent-based simulation of restoration crews working according to utility published prioritization calibrated by historic performance.

  7. VERDE Analytic Modules

    SciTech Connect

    2008-01-15

    The Verde Analytic Modules permit the user to ingest openly available data feeds about phenomenology (storm tracks, wind, precipitation, earthquakes, wildfires, and similar natural and manmade power grid disruptions) and forecast power outages, restoration times, customers outaged, and key facilities that will lose power. Damage areas are predicted using historic damage criteria of the affected area. The modules use a cellular automata approach to estimating the distribution circuits assigned to geo-located substations. Population estimates served within the service areas are located within 1 km grid cells and converted to customer counts by conversion through demographic estimation of households and commercial firms within the population cells. Restoration times are estimated by agent-based simulation of restoration crews working according to utility published prioritization calibrated by historic performance.

  8. Normality in Analytical Psychology

    PubMed Central

    Myers, Steve

    2013-01-01

    Although C.G. Jung’s interest in normality wavered throughout his career, it was one of the areas he identified in later life as worthy of further research. He began his career using a definition of normality which would have been the target of Foucault’s criticism, had Foucault chosen to review Jung’s work. However, Jung then evolved his thinking to a standpoint that was more aligned to Foucault’s own. Thereafter, the post Jungian concept of normality has remained relatively undeveloped by comparison with psychoanalysis and mainstream psychology. Jung’s disjecta membra on the subject suggest that, in contemporary analytical psychology, too much focus is placed on the process of individuation to the neglect of applications that consider collective processes. Also, there is potential for useful research and development into the nature of conflict between individuals and societies, and how normal people typically develop in relation to the spectrum between individuation and collectivity. PMID:25379262

  9. Automation of statistical analysis in the WIPP hazardous waste facility permit for analytical results from characterization

    SciTech Connect

    Shokes, T.; Einerson, J.

    2007-07-01

    program is being developed in conjunction with a Microsoft Access database. The database acts as a central repository for all of the input and output data, records the list of analytes, and provides a graphical user interface for the end-user. The additional benefit to this tool is the incorporation of other calculations. One calculation is the compilation of the volatile organic compound data from 55-gallon drums with rigid liners that will be compacted and placed into a 100-gallon drum. Another analysis addresses the total concentration of flammable VOCs for transportation requirements. Development of such software tools improves the productivity of the groups that perform these functions. (authors)

  10. Time-domain Raman analytical forward solvers.

    PubMed

    Martelli, Fabrizio; Binzoni, Tiziano; Sekar, Sanathana Konugolu Venkata; Farina, Andrea; Cavalieri, Stefano; Pifferi, Antonio

    2016-09-01

    A set of time-domain analytical forward solvers for Raman signals detected from homogeneous diffusive media is presented. The time-domain solvers have been developed for two geometries: the parallelepiped and the finite cylinder. The potential presence of a background fluorescence emission, contaminating the Raman signal, has also been taken into account. All the solvers have been obtained as solutions of the time dependent diffusion equation. The validation of the solvers has been performed by means of comparisons with the results of "gold standard" Monte Carlo simulations. These forward solvers provide an accurate tool to explore the information content encoded in the time-resolved Raman measurements. PMID:27607645

  11. PAUSE: Predictive Analytics Using SPARQL-Endpoints

    SciTech Connect

    Sukumar, Sreenivas R; Ainsworth, Keela; Bond, Nathaniel

    2014-07-11

    This invention relates to the medical industry and more specifically to methods of predicting risks. With the impetus towards personalized and evidence-based medicine, the need for a framework to analyze/interpret quantitative measurements (blood work, toxicology, etc.) with qualitative descriptions (specialist reports after reading images, bio-medical knowledgebase, etc.) to predict diagnostic risks is fast emerging. We describe a software solution that leverages hardware for scalable in-memory analytics and applies next-generation semantic query tools on medical data.

  12. ICDA: a platform for Intelligent Care Delivery Analytics.

    PubMed

    Gotz, David; Stavropoulos, Harry; Sun, Jimeng; Wang, Fei

    2012-01-01

    The identification of high-risk patients is a critical component in improving patient outcomes and managing costs. This paper describes the Intelligent Care Delivery Analytics platform (ICDA), a system which enables risk assessment analytics that process large collections of dynamic electronic medical data to identify at-risk patients. ICDA works by ingesting large volumes of data into a common data model, then orchestrating a collection of analytics that identify at-risk patients. It also provides an interactive environment through which users can access and review the analytics results. In addition, ICDA provides APIs via which analytics results can be retrieved to surface in external applications. A detailed review of ICDA's architecture is provided. Descriptions of four use cases are included to illustrate ICDA's application within two different data environments. These use cases showcase the system's flexibility and exemplify the types of analytics it enables. PMID:23304296

  13. ICDA: A Platform for Intelligent Care Delivery Analytics

    PubMed Central

    Gotz, David; Stavropoulos, Harry; Sun, Jimeng; Wang, Fei

    2012-01-01

    The identification of high-risk patients is a critical component in improving patient outcomes and managing costs. This paper describes the Intelligent Care Delivery Analytics platform (ICDA), a system which enables risk assessment analytics that process large collections of dynamic electronic medical data to identify at-risk patients. ICDA works by ingesting large volumes of data into a common data model, then orchestrating a collection of analytics that identify at-risk patients. It also provides an interactive environment through which users can access and review the analytics results. In addition, ICDA provides APIs via which analytics results can be retrieved to surface in external applications. A detailed review of ICDA’s architecture is provided. Descriptions of four use cases are included to illustrate ICDA’s application within two different data environments. These use cases showcase the system’s flexibility and exemplify the types of analytics it enables. PMID:23304296

  14. Analytical Chemistry Laboratory progress report for FY 1989

    SciTech Connect

    Green, D.W.; Heinrich, R.R.; Graczyk, D.G.; Lindahl, P.C.; Erickson, M.D.

    1989-12-01

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year 1989 (October 1988 through September 1989). The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. In addition, the ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques.

  15. ICDA: a platform for Intelligent Care Delivery Analytics.

    PubMed

    Gotz, David; Stavropoulos, Harry; Sun, Jimeng; Wang, Fei

    2012-01-01

    The identification of high-risk patients is a critical component in improving patient outcomes and managing costs. This paper describes the Intelligent Care Delivery Analytics platform (ICDA), a system which enables risk assessment analytics that process large collections of dynamic electronic medical data to identify at-risk patients. ICDA works by ingesting large volumes of data into a common data model, then orchestrating a collection of analytics that identify at-risk patients. It also provides an interactive environment through which users can access and review the analytics results. In addition, ICDA provides APIs via which analytics results can be retrieved to surface in external applications. A detailed review of ICDA's architecture is provided. Descriptions of four use cases are included to illustrate ICDA's application within two different data environments. These use cases showcase the system's flexibility and exemplify the types of analytics it enables.

  16. Analytical Chemistry Laboratory progress report for FY 1991

    SciTech Connect

    Green, D.W.; Heinrich, R.R.; Graczyk, D.G.; Lindahl, P.C.; Boparai, A.S.

    1991-12-01

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year 1991 (October 1990 through September 1991). This is the eighth annual report for the ACL. The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. In addition, the ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques.

  17. Maximum likelihood molecular clock comb: analytic solutions.

    PubMed

    Chor, Benny; Khetan, Amit; Snir, Sagi

    2006-04-01

    Maximum likelihood (ML) is increasingly used as an optimality criterion for selecting evolutionary trees, but finding the global optimum is a hard computational task. Because no general analytic solution is known, numeric techniques such as hill climbing or expectation maximization (EM), are used in order to find optimal parameters for a given tree. So far, analytic solutions were derived only for the simplest model--three taxa, two state characters, under a molecular clock. Four taxa rooted trees have two topologies--the fork (two subtrees with two leaves each) and the comb (one subtree with three leaves, the other with a single leaf). In a previous work, we devised a closed form analytic solution for the ML molecular clock fork. In this work, we extend the state of the art in the area of analytic solutions ML trees to the family of all four taxa trees under the molecular clock assumption. The change from the fork topology to the comb incurs a major increase in the complexity of the underlying algebraic system and requires novel techniques and approaches. We combine the ultrametric properties of molecular clock trees with the Hadamard conjugation to derive a number of topology dependent identities. Employing these identities, we substantially simplify the system of polynomial equations. We finally use tools from algebraic geometry (e.g., Gröbner bases, ideal saturation, resultants) and employ symbolic algebra software to obtain analytic solutions for the comb. We show that in contrast to the fork, the comb has no closed form solutions (expressed by radicals in the input data). In general, four taxa trees can have multiple ML points. In contrast, we can now prove that under the molecular clock assumption, the comb has a unique (local and global) ML point. (Such uniqueness was previously shown for the fork.).
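
    The abstract names Gröbner bases computed with symbolic algebra software as the key simplification step. The toy example below shows that step in sympy on an unrelated two-variable system (not the paper's molecular-clock equations): a lexicographic Gröbner basis eliminates variables so solutions can be read off one variable at a time.

        # Toy illustration of the symbolic-algebra step named above: a Groebner
        # basis computation that simplifies a small polynomial system.
        from sympy import symbols, groebner

        x, y = symbols("x y")
        polys = [x**2 + y**2 - 1, x - y**2]          # hypothetical system
        gb = groebner(polys, x, y, order="lex")      # lexicographic elimination order
        print(gb)   # the last basis element is univariate in y, so y can be solved first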

  18. Algal functional annotation tool

    SciTech Connect

    Lopez, D.; Casero, D.; Cokus, S. J.; Merchant, S. S.; Pellegrini, M.

    2012-07-01

    The Algal Functional Annotation Tool is a web-based comprehensive analysis suite integrating annotation data from several pathway, ontology, and protein family databases. The current version provides annotation for the model alga Chlamydomonas reinhardtii, and in the future will include additional genomes. The site allows users to interpret large gene lists by identifying associated functional terms, and their enrichment. Additionally, expression data for several experimental conditions were compiled and analyzed to provide an expression-based enrichment search. A tool to search for functionally-related genes based on gene expression across these conditions is also provided. Other features include dynamic visualization of genes on KEGG pathway maps and batch gene identifier conversion.
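
    As a hedged illustration of the kind of term-enrichment calculation such annotation tools perform (not the tool's actual code), the sketch below computes a hypergeometric over-representation p-value for a single pathway term in a user-supplied gene list; all counts are hypothetical.

        # Hypergeometric enrichment test for one functional term (hypothetical numbers).
        from scipy.stats import hypergeom

        M = 15000   # genes in the genome annotation
        n = 120     # genes annotated with the pathway term
        N = 300     # genes in the user's gene list
        k = 12      # list genes carrying the term

        # P(X >= k) when drawing N genes without replacement from M, n of which carry the term
        p_value = hypergeom.sf(k - 1, M, n, N)
        print(f"enrichment p-value: {p_value:.3e}")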

  19. Tool Gear: Infrastructure for Parallel Tools

    SciTech Connect

    May, J; Gyllenhaal, J

    2003-04-17

    Tool Gear is a software infrastructure for developing performance analysis and other tools. Unlike existing integrated toolkits, which focus on providing a suite of capabilities, Tool Gear is designed to help tool developers create new tools quickly. It combines dynamic instrumentation capabilities with an efficient database and a sophisticated and extensible graphical user interface. This paper describes the design of Tool Gear and presents examples of tools that have been built with it.

  20. Analytic integrable systems: Analytic normalization and embedding flows

    NASA Astrophysics Data System (ADS)

    Zhang, Xiang

    In this paper we mainly study the existence of analytic normalization and the normal form of finite dimensional complete analytic integrable dynamical systems. In more detail, we prove that any complete analytic integrable diffeomorphism F(x)=Bx+f(x) in (C^n,0) with B having eigenvalues not of modulus 1 and f(x)=O(|x|^2) is locally analytically conjugate to its normal form. Meanwhile, we also prove that any complete analytic integrable differential system ẋ=Ax+f(x) in (C^n,0) with A having nonzero eigenvalues and f(x)=O(|x|^2) is locally analytically conjugate to its normal form. Furthermore we prove that any complete analytic integrable diffeomorphism defined on an analytic manifold can be embedded in a complete analytic integrable flow. We note that parts of our results improve those of Moser in J. Moser, The analytic invariants of an area-preserving mapping near a hyperbolic fixed point, Comm. Pure Appl. Math. 9 (1956) 673-692 and of Poincaré in H. Poincaré, Sur l'intégration des équations différentielles du premier ordre et du premier degré, II, Rend. Circ. Mat. Palermo 11 (1897) 193-239. These results also improve the ones in Xiang Zhang, Analytic normalization of analytic integrable systems and the embedding flows, J. Differential Equations 244 (2008) 1080-1092 in the sense that the linear part of the systems can be nonhyperbolic, and the one in N.T. Zung, Convergence versus integrability in Poincaré-Dulac normal form, Math. Res. Lett. 9 (2002) 217-228 in the way that our paper presents the concrete expression of the normal form in a restricted case.
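
    For readers unfamiliar with the terminology, the following is the general meaning of "locally analytically conjugate to its normal form" in the notation of the abstract; it is a standard definition, not a restatement of the paper's theorem.

        % There exists a near-identity analytic change of coordinates \Phi such that
        \Phi(x) = x + O(|x|^{2}), \qquad \Phi \circ F \circ \Phi^{-1} = N,
        % where N(x) = Bx + g(x) is the (Poincare-Dulac) normal form, i.e. g contains
        % only terms resonant with respect to the eigenvalues of B.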

  1. Non-commutative tools for topological insulators

    NASA Astrophysics Data System (ADS)

    Prodan, Emil

    2010-06-01

    This paper reviews several analytic tools for the field of topological insulators, developed with the aid of non-commutative calculus and geometry. The set of tools includes bulk topological invariants defined directly in the thermodynamic limit and in the presence of disorder, whose robustness is shown to have nontrivial physical consequences for the bulk states. The set of tools also includes a general relation between the current of an observable and its edge index, a relation that can be used to investigate the robustness of the edge states against disorder. The paper focuses on the motivations behind creating such tools and on how to use them.

  2. Visual Analytics Methodology for Eye Movement Studies.

    PubMed

    Andrienko, G; Andrienko, N; Burch, M; Weiskopf, D

    2012-12-01

    Eye movement analysis is gaining popularity as a tool for evaluation of visual displays and interfaces. However, the existing methods and tools for analyzing eye movements and scanpaths are limited in terms of the tasks they can support and effectiveness for large data and data with high variation. We have performed an extensive empirical evaluation of a broad range of visual analytics methods used in analysis of geographic movement data. The methods have been tested for the applicability to eye tracking data and the capability to extract useful knowledge about users' viewing behaviors. This allowed us to select the suitable methods and match them to possible analysis tasks they can support. The paper describes how the methods work in application to eye tracking data and provides guidelines for method selection depending on the analysis tasks.

  3. Inspection of the Math Model Tools for On-Orbit Assessment of Impact Damage Report

    NASA Technical Reports Server (NTRS)

    Harris, Charles E.; Raju, Ivatury S.; Piascik, Robert S.; KramerWhite, Julie A.; Labbe, Steve G.; Rotter, Hank A.

    2007-01-01

    In Spring of 2005, the NASA Engineering Safety Center (NESC) was engaged by the Space Shuttle Program (SSP) to peer review the suite of analytical tools being developed to support the determination of impact and damage tolerance of the Orbiter Thermal Protection Systems (TPS). The NESC formed an independent review team with the core disciplines of materials, flight sciences, structures, mechanical analysis, and thermal analysis. The Math Model Tools reviewed included damage prediction and stress analysis, aeroheating analysis, and thermal analysis tools. Some tools are physics-based and other tools are empirically derived. Each tool was created for a specific use and timeframe, including certification and real-time pre-launch assessments. In addition, the tools are used together in an integrated strategy for assessing the ramifications of impact damage to tile and RCC. The NESC team conducted a peer review of the engineering data package for each Math Model Tool. This report contains a summary of the team's observations and recommendations from these reviews.

  4. ESIP Earth Sciences Data Analytics (ESDA) Cluster - Work in Progress

    NASA Technical Reports Server (NTRS)

    Kempler, Steven

    2015-01-01

    The purpose of this poster is to promote a common understanding of the usefulness of, and activities that pertain to, data analytics and, more broadly, the data scientist; to facilitate collaborations to better understand the cross-usage of heterogeneous datasets and to provide accommodating data analytics expertise, now and as needs evolve; and to identify gaps that, once filled, will further collaborative activities. Objectives: provide a forum for academic discussions that gives ESIP members a better understanding of the various aspects of Earth science data analytics; bring in guest speakers to describe external efforts and further teach us about the broader use of data analytics; and perform activities that compile use cases generated from specific community needs to cross-analyze heterogeneous data, compile sources of analytics tools (in particular, to satisfy the needs of the above data users), examine gaps between needs and sources, examine gaps between needs and community expertise, document the specific data analytics expertise needed to perform Earth science data analytics, and seek graduate data analytics / data science student internship opportunities.

  5. Evaluation of analytical performance based on partial order methodology.

    PubMed

    Carlsen, Lars; Bruggemann, Rainer; Kenessova, Olga; Erzhigitov, Erkin

    2015-01-01

    Classical measurements of performance are typically based on linear scales. However, in analytical chemistry a simple scale may not be sufficient to analyze analytical performance appropriately. Here partial order methodology can be helpful. Within the context described here, partial order analysis can be seen as an ordinal analysis of data matrices, especially to simplify the relative comparison of objects based on their data profiles (the ordered set of values an object has). Hence, partial order methodology offers a unique possibility to evaluate analytical performance. In the present work, data as provided by laboratories through interlaboratory comparisons or proficiency testing are used as an illustrative example. However, the presented scheme is likewise applicable to the comparison of analytical methods or simply as a tool for optimization of an analytical method. The methodology can be applied without presumptions or pretreatment of the analytical data provided, in order to evaluate analytical performance taking all indicators into account simultaneously and thus elucidating a "distance" from the true value. In the present illustrative example it is assumed that the laboratories analyze a given sample several times and subsequently report the mean value, the standard deviation, and the skewness, which are used simultaneously for the evaluation of analytical performance. The analyses lead to information concerning (1) a partial ordering of the laboratories, subsequently (2) a "distance" to the reference laboratory, and (3) a classification based on the concept of "peculiar points".
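
    The core ordinal comparison described above can be made concrete with a small, hedged sketch: one laboratory is ranked above another only if it is at least as close to the reference on every indicator and strictly closer on at least one; otherwise the two are incomparable. The indicator values below are hypothetical, not data from the paper.

        # Partial-order dominance comparison over hypothetical laboratory profiles.
        labs = {
            "Lab1": (0.2, 0.05, 0.10),   # (|mean - true value|, std dev, |skewness|)
            "Lab2": (0.3, 0.07, 0.20),
            "Lab3": (0.1, 0.09, 0.05),
        }

        def dominates(a, b):
            # a is at least as good as b on every indicator and better on at least one
            return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

        for name_a, prof_a in labs.items():
            for name_b, prof_b in labs.items():
                if name_a != name_b and dominates(prof_a, prof_b):
                    print(f"{name_a} is ranked above {name_b}")
        # Here Lab1 is ranked above Lab2, while Lab1 and Lab3 remain incomparable,
        # which is the typical partial-order outcome the abstract alludes to.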

  6. Technology Tools

    ERIC Educational Resources Information Center

    Kennedy, Mike

    2005-01-01

    Personal computers (PCs) have transformed the way teachers teach, students learn, and school operations are conducted. However, the addition of PCs is not the only technological advancement that can help education institutions run more productively. The progress that has made computers smaller, faster and cheaper also has led to the availability…

  7. Limitless Tools

    ERIC Educational Resources Information Center

    Erickson, Paul W.

    2010-01-01

    With the rushing in of new technologies, facilities must be more flexible and adaptable to a variety of learning approaches. As personalized learning plans emerge with technology, new designs make learning possible anywhere at any time. In addition, the change from print to Web-based materials is creating an environment that focuses on…

  8. Analytic sequential methods for detecting network intrusions

    NASA Astrophysics Data System (ADS)

    Chen, Xinjia; Walker, Ernest

    2014-05-01

    In this paper, we propose an analytic sequential method for detecting port-scan attackers, which routinely perform random "portscans" of IP addresses to find vulnerable servers to compromise. In addition to rigorously controlling the probability of falsely implicating benign remote hosts as malicious, our method performs significantly faster than other current solutions. We have developed explicit formulae for quick determination of the parameters of the new detection algorithm.
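
    The sketch below illustrates the general sequential-testing idea (a classical sequential probability ratio test on connection outcomes); it is not the authors' specific detector or their formulae. A benign host mostly contacts addresses that respond, while a scanner mostly hits addresses that do not; the thresholds bound the false-alarm and miss rates. All probabilities are hypothetical.

        # Minimal SPRT-style classifier for remote hosts (hypothetical parameters).
        import math

        p0, p1 = 0.8, 0.2          # P(connection succeeds) for benign vs scanning hosts
        alpha, beta = 0.01, 0.01   # target false-positive and false-negative rates
        A = math.log((1 - beta) / alpha)   # declare "scanner" once the LLR exceeds A
        B = math.log(beta / (1 - alpha))   # declare "benign" once the LLR drops below B

        def classify(outcomes):
            """outcomes: sequence of booleans, True = connection attempt succeeded."""
            llr, n = 0.0, 0
            for ok in outcomes:
                n += 1
                llr += math.log(p1 / p0) if ok else math.log((1 - p1) / (1 - p0))
                if llr >= A:
                    return "scanner", n
                if llr <= B:
                    return "benign", n
            return "undecided", n

        print(classify([False, False, False, False]))   # repeated failed probes -> scanner
        print(classify([True, True, True, True]))       # normal traffic -> benign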

  9. [Food additives and healthiness].

    PubMed

    Heinonen, Marina

    2014-01-01

    Additives are used for improving food structure or preventing its spoilage, for example. Many substances used as additives are also naturally present in food. The safety of additives is evaluated according to commonly agreed principles. If high concentrations of an additive cause adverse health effects for humans, a limit of acceptable daily intake (ADI) is set for it. An additive is a risk only when ADI is exceeded. The healthiness of food is measured on the basis of nutrient density and scientifically proven effects.

  10. Hanford transuranic analytical capability

    SciTech Connect

    McVey, C.B.

    1995-02-24

    With the current DOE focus on ER/WM programs, an increase in the quantity of waste samples that requires detailed analysis is forecasted. One of the prime areas of growth is the demand for DOE environmental protocol analyses of TRU waste samples. Currently there is no laboratory capacity to support analysis of TRU waste samples in excess of 200 nCi/gm. This study recommends that an interim solution be undertaken to provide these services. By adding two glove boxes in room 11A of 222S the interim waste analytical needs can be met for a period of four to five years or until a front end facility is erected at or near the 222-S facility. The yearly average of samples is projected to be approximately 600 samples. The figure has changed significantly due to budget changes and has been downgraded from 10,000 samples to the 600 level. Until these budget and sample projection changes become firmer, a long term option is not recommended at this time. A revision to this document is recommended by March 1996 to review the long term option and sample projections.

  11. IBM’s Health Analytics and Clinical Decision Support

    PubMed Central

    Sun, J.; Knoop, S.; Shabo, A.; Carmeli, B.; Sow, D.; Syed-Mahmood, T.; Rapp, W.

    2014-01-01

    Summary Objectives This survey explores the role of big data and health analytics developed by IBM in supporting the transformation of healthcare by augmenting evidence-based decision-making. Methods Some problems in healthcare and strategies for change are described. It is argued that change requires better decisions, which, in turn, require better use of the many kinds of healthcare information. Analytic resources that address each of the information challenges are described. Examples of the role of each of the resources are given. Results There are powerful analytic tools that utilize the various kinds of big data in healthcare to help clinicians make more personalized, evidenced-based decisions. Such resources can extract relevant information and provide insights that clinicians can use to make evidence-supported decisions. There are early suggestions that these resources have clinical value. As with all analytic tools, they are limited by the amount and quality of data. Conclusion Big data is an inevitable part of the future of healthcare. There is a compelling need to manage and use big data to make better decisions to support the transformation of healthcare to the personalized, evidence-supported model of the future. Cognitive computing resources are necessary to manage the challenges in employing big data in healthcare. Such tools have been and are being developed. The analytic resources, themselves, do not drive, but support healthcare transformation. PMID:25123736

  12. Design and Implementation of a Learning Analytics Toolkit for Teachers

    ERIC Educational Resources Information Center

    Dyckhoff, Anna Lea; Zielke, Dennis; Bultmann, Mareike; Chatti, Mohamed Amine; Schroeder, Ulrik

    2012-01-01

    Learning Analytics can provide powerful tools for teachers in order to support them in the iterative process of improving the effectiveness of their courses and to collaterally enhance their students' performance. In this paper, we present the theoretical background, design, implementation, and evaluation details of eLAT, a Learning Analytics…

  13. Challenges of Using Learning Analytics Techniques to Support Mobile Learning

    ERIC Educational Resources Information Center

    Arrigo, Marco; Fulantelli, Giovanni; Taibi, Davide

    2015-01-01

    Evaluation of Mobile Learning remains an open research issue, especially as regards the activities that take place outside the classroom. In this context, Learning Analytics can provide answers, and offer the appropriate tools to enhance Mobile Learning experiences. In this poster we introduce a task-interaction framework, using learning analytics…

  14. HPC Analytics Support. Requirements for Uncertainty Quantification Benchmarks

    SciTech Connect

    Paulson, Patrick R.; Purohit, Sumit; Rodriguez, Luke R.

    2015-05-01

    This report outlines techniques for extending benchmark generation products so they support uncertainty quantification by benchmarked systems. We describe how uncertainty quantification requirements can be presented to candidate analytical tools supporting SPARQL. We describe benchmark data sets for evaluating uncertainty quantification, as well as an approach for using our benchmark generator to produce these benchmark data sets.
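
    As a hedged illustration of presenting uncertainty-annotated data to a SPARQL-capable tool (in the spirit of the benchmarks described above, not their actual content), the sketch below builds a tiny RDF graph with confidence literals and filters on them; the vocabulary (ex:value, ex:confidence) is hypothetical.

        # Querying uncertainty-annotated triples with rdflib (hypothetical vocabulary).
        from rdflib import Graph, Literal, Namespace
        from rdflib.namespace import XSD

        EX = Namespace("http://example.org/")
        g = Graph()
        g.add((EX.reading1, EX.value, Literal(42.0, datatype=XSD.double)))
        g.add((EX.reading1, EX.confidence, Literal(0.70, datatype=XSD.double)))
        g.add((EX.reading2, EX.value, Literal(17.0, datatype=XSD.double)))
        g.add((EX.reading2, EX.confidence, Literal(0.99, datatype=XSD.double)))

        # Retrieve only measurements whose attached confidence exceeds a threshold.
        q = """
        PREFIX ex: <http://example.org/>
        SELECT ?reading ?value ?conf
        WHERE { ?reading ex:value ?value ; ex:confidence ?conf .
                FILTER(?conf >= 0.9) }
        """
        for row in g.query(q):
            print(row.reading, row.value, row.conf)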

  15. Transforming labor-management practices through real-time analytics.

    PubMed

    Nippert, Kathye Habig; Graves, Brian

    2012-06-01

    Catholic Health Partners (CHP) decentralized productivity management, giving its regional executives greater control over their productivity tools and data. CHP retained centralized management of its benchmarking and analytics and created an enterprise database with standardized information. CHP's stakeholders shared accountability and accepted greater responsibility for labor-management decisions.

  16. Making advanced analytics work for you.

    PubMed

    Barton, Dominic; Court, David

    2012-10-01

    Senior leaders who write off the move toward big data as a lot of big talk are making, well, a big mistake. So argue McKinsey's Barton and Court, who worked with dozens of companies to figure out how to translate advanced analytics into nuts-and-bolts practices that affect daily operations on the front lines. The authors offer a useful guide for leaders and managers who want to take a deliberative approach to big data, but who also want to get started now. First, companies must identify the right data for their business, seek to acquire the information creatively from diverse sources, and secure the necessary IT support. Second, they need to build analytics models that are tightly focused on improving performance, making the models only as complex as business goals demand. Third, and most important, companies must transform their capabilities and culture so that the analytical results can be implemented from the C-suite to the front lines. That means developing simple tools that everyone in the organization can understand and teaching people why the data really matter. Embracing big data is as much about changing mind-sets as it is about crunching numbers. Executed with the right care and flexibility, this cultural shift could have payoffs that are, well, bigger than you expect.

  17. Swift Science Analysis Tools

    NASA Astrophysics Data System (ADS)

    Marshall, F. E.; Swift Team Team

    2003-05-01

    Swift is an autonomous, multiwavelength observatory selected by NASA to study gamma-ray bursts (GRBs) and their afterglows. Its Burst Alert Telescope (BAT) is a large coded mask instrument that will image GRBs in the 15 to 150 keV band. The X-ray Telescope (XRT) focuses X-rays in the 0.2 to 10 keV band onto CCDs, and the co-aligned Ultra-Violet/Optical Telescope (UVOT) has filters and grisms for low-resolution spectroscopy. The Swift team is developing mission-specific tools for processing the telemetry into FITS files and for calibrating and selecting the data for further analysis with such mission-independent tools as XIMAGE and XSPEC. The FTOOLS-based suite of tools will be released to the community before launch with additional updates after launch. Documentation for the tools and standard recipes for their use will be available on the Swift Science Center (SSC) Web site (http://swiftsc.gsfc.nasa.gov), and the SSC will provide user assistance with an e-mail help desk. After the verification phase of the mission, all data will be available to the community as soon as it is processed in the Swift Data Center (SDC). Once all the data for an observation is available, the data will be transferred to the HEASARC and data centers in England and Italy. The data can then be searched and accessed using standard tools such as Browse. Before this transfer the quick-look data will be available on an ftp site at the SDC. The SSC will also provide documentation and simulation tools in support of the Swift Guest Investigator program.
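
    The mission-independent part of such a pipeline can be sketched with astropy rather than the Swift FTOOLS themselves; the file name, the EVENTS extension, and the ENERGY column below are assumptions for illustration only.

        # Opening a calibrated FITS event file and selecting the BAT imaging band.
        from astropy.io import fits

        with fits.open("swift_events.fits") as hdul:   # hypothetical file
            hdul.info()                                # list the header/data units
            events = hdul["EVENTS"].data               # assumes an EVENTS table HDU
            # keep only events in the 15-150 keV band mentioned above
            band = events[(events["ENERGY"] >= 15.0) & (events["ENERGY"] <= 150.0)]
            print(len(band), "events in the 15-150 keV band")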

  18. Mechanical properties of additively manufactured octagonal honeycombs.

    PubMed

    Hedayati, R; Sadighi, M; Mohammadi-Aghdam, M; Zadpoor, A A

    2016-12-01

    Honeycomb structures have found numerous applications as structural and biomedical materials due to their favourable properties such as low weight, high stiffness, and porosity. Application of additive manufacturing and 3D printing techniques allows for manufacturing of honeycombs with arbitrary shape and wall thickness, opening the way for optimizing the mechanical and physical properties for specific applications. In this study, the mechanical properties of honeycomb structures with a new geometry, called octagonal honeycomb, were investigated using analytical, numerical, and experimental approaches. An additive manufacturing technique, namely fused deposition modelling, was used to fabricate the honeycomb from polylactic acid (PLA). The honeycomb structures were then mechanically tested under compression and the mechanical properties of the structures were determined. In addition, the Euler-Bernoulli and Timoshenko beam theories were used for deriving analytical relationships for elastic modulus, yield stress, Poisson's ratio, and buckling stress of this new design of honeycomb structures. Finite element models were also created to analyse the mechanical behaviour of the honeycombs computationally. The analytical solutions obtained using Timoshenko beam theory were close to computational results in terms of elastic modulus, Poisson's ratio and yield stress, especially for relative densities smaller than 25%. The analytical solutions based on Timoshenko beam theory and the computational results were in good agreement with experimental observations. Finally, the elastic properties of the proposed honeycomb structure were compared to those of other honeycomb structures such as square, triangular, hexagonal, mixed, diamond, and Kagome. The octagonal honeycomb showed yield stress and elastic modulus values very close to those of regular hexagonal honeycombs and lower than the other considered honeycombs. PMID:27612831
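
    For reference, the comparison case mentioned above (the regular hexagonal honeycomb) has a well-known classical in-plane stiffness relation due to Gibson and Ashby; it is quoted here only as background and is not the paper's octagonal-honeycomb result.

        % In-plane Young's modulus of a hexagonal honeycomb relative to the solid material:
        \frac{E^{*}}{E_{s}}
          = \left(\frac{t}{l}\right)^{3}
            \frac{\cos\theta}{\left(h/l + \sin\theta\right)\sin^{2}\theta}
          \;\approx\; 2.3\left(\frac{t}{l}\right)^{3}
          \quad (\theta = 30^{\circ},\ h = l),
        % with t the wall thickness, l the inclined wall length, and E_s the solid modulus.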

  19. Mechanical properties of additively manufactured octagonal honeycombs.

    PubMed

    Hedayati, R; Sadighi, M; Mohammadi-Aghdam, M; Zadpoor, A A

    2016-12-01

    Honeycomb structures have found numerous applications as structural and biomedical materials due to their favourable properties such as low weight, high stiffness, and porosity. Application of additive manufacturing and 3D printing techniques allows for manufacturing of honeycombs with arbitrary shape and wall thickness, opening the way for optimizing the mechanical and physical properties for specific applications. In this study, the mechanical properties of honeycomb structures with a new geometry, called octagonal honeycomb, were investigated using analytical, numerical, and experimental approaches. An additive manufacturing technique, namely fused deposition modelling, was used to fabricate the honeycomb from polylactic acid (PLA). The honeycomb structures were then mechanically tested under compression and the mechanical properties of the structures were determined. In addition, the Euler-Bernoulli and Timoshenko beam theories were used for deriving analytical relationships for elastic modulus, yield stress, Poisson's ratio, and buckling stress of this new design of honeycomb structures. Finite element models were also created to analyse the mechanical behaviour of the honeycombs computationally. The analytical solutions obtained using Timoshenko beam theory were close to computational results in terms of elastic modulus, Poisson's ratio and yield stress, especially for relative densities smaller than 25%. The analytical solutions based on Timoshenko beam theory and the computational results were in good agreement with experimental observations. Finally, the elastic properties of the proposed honeycomb structure were compared to those of other honeycomb structures such as square, triangular, hexagonal, mixed, diamond, and Kagome. The octagonal honeycomb showed yield stress and elastic modulus values very close to those of regular hexagonal honeycombs and lower than the other considered honeycombs.

  20. Unexpected Analyte Oxidation during Desorption Electrospray Ionization - Mass Spectrometry

    SciTech Connect

    Pasilis, Sofie P; Kertesz, Vilmos; Van Berkel, Gary J

    2008-01-01

    During the analysis of surface-spotted analytes using desorption electrospray ionization mass spectrometry (DESI-MS), abundant ions are sometimes observed that appear to be the result of oxygen addition reactions. In this investigation, the effect of sample aging, the ambient lab environment, spray voltage, analyte surface concentration, and surface type on this oxidative modification of spotted analytes, exemplified by tamoxifen and reserpine, during analysis by desorption electrospray ionization mass spectrometry was studied. Simple exposure of the samples to air and to ambient lighting increased the extent of oxidation. Increased spray voltage also led to increased analyte oxidation, possibly as a result of oxidative species formed electrochemically at the emitter electrode or in the gas phase by discharge processes. These oxidative species are carried by the spray and impinge on and react with the sampled analyte during desorption/ionization. The relative abundance of oxidized species was more significant for analysis of deposited analyte having a relatively low surface concentration. Increasing spray solvent flow rate and addition of hydroquinone as a redox buffer to the spray solvent were found to decrease, but not entirely eliminate, analyte oxidation during analysis. The major parameters that both minimize and maximize analyte oxidation were identified and DESI-MS operational recommendations to avoid these unwanted reactions are suggested.

  1. Analytical model of internally coupled ears.

    PubMed

    Vossen, Christine; Christensen-Dalsgaard, Jakob; van Hemmen, J Leo

    2010-08-01

    Lizards and many birds possess a specialized hearing mechanism: internally coupled ears where the tympanic membranes connect through a large mouth cavity so that the vibrations of the tympanic membranes influence each other. This coupling enhances the phase differences and creates amplitude differences in the tympanic membrane vibrations. Both cues show strong directionality. The work presented herein sets out the derivation of a three dimensional analytical model of internally coupled ears that allows for calculation of a complete vibration profile of the membranes. The analytical model additionally provides the opportunity to incorporate the effect of the asymmetrically attached columella, which leads to the activation of higher membrane vibration modes. Incorporating this effect, the analytical model can explain measurements taken from the tympanic membrane of a living lizard, for example, data demonstrating an asymmetrical spatial pattern of membrane vibration. As the analytical calculations show, the internally coupled ears increase the directional response, appearing in large directional internal amplitude differences (iAD) and in large internal time differences (iTD). Numerical simulations of the eigenfunctions in an exemplary, realistically reconstructed mouth cavity further estimate the effects of its complex geometry.

  2. Analytical model of internally coupled ears.

    PubMed

    Vossen, Christine; Christensen-Dalsgaard, Jakob; van Hemmen, J Leo

    2010-08-01

    Lizards and many birds possess a specialized hearing mechanism: internally coupled ears where the tympanic membranes connect through a large mouth cavity so that the vibrations of the tympanic membranes influence each other. This coupling enhances the phase differences and creates amplitude differences in the tympanic membrane vibrations. Both cues show strong directionality. The work presented herein sets out the derivation of a three dimensional analytical model of internally coupled ears that allows for calculation of a complete vibration profile of the membranes. The analytical model additionally provides the opportunity to incorporate the effect of the asymmetrically attached columella, which leads to the activation of higher membrane vibration modes. Incorporating this effect, the analytical model can explain measurements taken from the tympanic membrane of a living lizard, for example, data demonstrating an asymmetrical spatial pattern of membrane vibration. As the analytical calculations show, the internally coupled ears increase the directional response, appearing in large directional internal amplitude differences (iAD) and in large internal time differences (iTD). Numerical simulations of the eigenfunctions in an exemplary, realistically reconstructed mouth cavity further estimate the effects of its complex geometry. PMID:20707461

  3. Analytical advances in pharmaceutical impurity profiling.

    PubMed

    Holm, René; Elder, David P

    2016-05-25

    Impurities will be present in all drug substances and drug products, i.e. nothing is 100% pure if one looks in enough depth. The current regulatory guidance on impurities accepts this, and for drug products with a dose of less than 2 g/day, identification of impurities is set at the 0.1% level and above (ICH Q3B(R2), 2006). For some impurities this is a simple undertaking, as generally available analytical techniques can address the prevailing analytical challenges; whereas for others this may be much more challenging, requiring more sophisticated analytical approaches. The present review provides an insight into current development of analytical techniques to investigate and quantify impurities in drug substances and drug products, with a discussion of progress particularly within the field of chromatography to ensure separation and quantification of related impurities. Further, a section is devoted to the identification of classical impurities, but in addition, inorganic (metal residues) and solid-state impurities are also discussed. Risk control strategies for pharmaceutical impurities, aligned with several of the ICH guidelines, are also discussed.

  4. Tools for the study of dynamical spacetimes

    NASA Astrophysics Data System (ADS)

    Zhang, Fan

    This thesis covers a range of topics in numerical and analytical relativity, centered around introducing tools and methodologies for the study of dynamical spacetimes. The scope of the studies is limited to classical (as opposed to quantum) vacuum spacetimes described by Einstein's general theory of relativity. The numerical works presented here are carried out within the Spectral Einstein Code (SpEC) infrastructure, while analytical calculations extensively utilize Wolfram's Mathematica program. We begin by examining highly dynamical spacetimes such as binary black hole mergers, which can be investigated using numerical simulations. However, there are difficulties in interpreting the output of such simulations. One difficulty stems from the lack of a canonical coordinate system (henceforth referred to as gauge freedom) and tetrad, against which quantities such as Newman-Penrose Psi4 (usually interpreted as the gravitational wave part of curvature) should be measured. We tackle this problem in Chapter 2 by introducing a set of geometrically motivated coordinates that are independent of the simulation gauge choice, as well as a quasi-Kinnersley tetrad, also invariant under gauge changes in addition to being optimally suited to the task of gravitational wave extraction. Another difficulty arises from the need to condense the overwhelming amount of data generated by the numerical simulations. In order to extract physical information in a succinct and transparent manner, one may define a version of gravitational field lines and field strength using spatial projections of the Weyl curvature tensor. Introduction, investigation and utilization of these quantities will constitute the main content in Chapters 3 through 6. For the last two chapters, we turn to the analytical study of a simpler dynamical spacetime, namely a perturbed Kerr black hole. We will introduce in Chapter 7 a new analytical approximation to the quasi-normal mode (QNM) frequencies, and relate various

  5. Tools for Authentication

    SciTech Connect

    White, G

    2008-07-09

    Many recent Non-proliferation and Arms Control software projects include a software authentication component. In this context, 'authentication' is defined as determining that a software package performs only its intended purpose and performs that purpose correctly and reliably over many years. In addition to visual inspection by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs both to aid the visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary, and have limited extensibility. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool must be based on a complete language compiler infrastructure, that is, one that can parse and digest the full language through its standard grammar. ROSE is precisely such a compiler infrastructure developed within DOE. ROSE is a robust source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C, C++, and FORTRAN. This year, it has been extended to support the automated analysis of binaries. We continue to extend ROSE to address a number of security-specific requirements and apply it to software authentication for Non-proliferation and Arms Control projects. We will give an update on the status of our work.
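
    The kind of automated check described above can be sketched, in a hedged way, with Python's ast module rather than the ROSE compiler infrastructure: parse a source file and highlight constructs (here, eval/exec/compile calls) that a reviewer may want to inspect by hand. This is only an analogy to the approach, not the project's tooling.

        # Flag suspicious call constructs in Python source using the standard ast module.
        import ast

        SUSPICIOUS_CALLS = {"eval", "exec", "compile"}

        def flag_suspicious(source: str, filename: str = "<input>") -> None:
            tree = ast.parse(source, filename=filename)
            for node in ast.walk(tree):
                if (isinstance(node, ast.Call)
                        and isinstance(node.func, ast.Name)
                        and node.func.id in SUSPICIOUS_CALLS):
                    print(f"{filename}:{node.lineno}: call to {node.func.id}()")

        flag_suspicious("x = eval(input())\nprint(x)\n", "example.py")
        # -> example.py:1: call to eval()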

  6. The Case for Assessment Analytics

    ERIC Educational Resources Information Center

    Ellis, Cath

    2013-01-01

    Learning analytics is a relatively new field of inquiry and its precise meaning is both contested and fluid (Johnson, Smith, Willis, Levine & Haywood, 2011; LAK, n.d.). Ferguson (2012) suggests that the best working definition is that offered by the first Learning Analytics and Knowledge (LAK) conference: "the measurement, collection,…

  7. [Photonic crystals for analytical chemistry].

    PubMed

    Chen, Yi; Li, Jincheng

    2009-09-01

    Photonic crystals, originally created to control the transmission of light, have found increasing value in the field of analytical chemistry and are likely to become a hot research area soon. This review is hence composed, focusing on their analytical chemistry-oriented applications, including especially their use in chromatography and in capillary- and chip-based electrophoresis.

  8. Information Theory in Analytical Chemistry.

    ERIC Educational Resources Information Center

    Eckschlager, Karel; Stepanek, Vladimir

    1982-01-01

    Discusses information theory in analytical practice. Topics include information quantities; ways of obtaining formulas for the amount of information in structural, qualitative, and trace analyses; and information measures in comparing and optimizing analytical methods and procedures. Includes tables outlining applications of information theory to…

  9. Current practices in clinical analytics: a hospital survey report.

    PubMed

    Womack, Dana M; Kennedy, Rosemary; Bria, Bill

    2012-01-01

    Clinical analytics must become a pervasive activity in healthcare settings to achieve the global vision for timely, effective, equitable, and excellent care. Global adoption of the Electronic Health Record (EHR) has increased the volume of data available for performance measurement and healthcare organizational capacity for continuous quality improvement. However, EHR adoption does not automatically result in optimal use of clinical data for performance improvement. In order to understand organizational factors related to use of data for clinical analytics, a survey was conducted of hospitals and hospital-based clinics. The survey revealed sub-optimal use of data captured as a byproduct of care delivery, the need for tools and methodologies to assist with data analytics, and the need for disciplined organizational structure and strategies. Informatics nurse professionals are well-positioned to lead analytical efforts and serve as a catalyst in their facility's transformations into a data-driven organization.

  10. Paper-based inkjet-printed microfluidic analytical devices.

    PubMed

    Yamada, Kentaro; Henares, Terence G; Suzuki, Koji; Citterio, Daniel

    2015-04-27

    Rapid, precise, and reproducible deposition of a broad variety of functional materials, including analytical assay reagents and biomolecules, has made inkjet printing an effective tool for the fabrication of microanalytical devices. A ubiquitous office device as simple as a standard desktop printer with its multiple ink cartridges can be used for this purpose. This Review discusses the combination of inkjet printing technology with paper as a printing substrate for the fabrication of microfluidic paper-based analytical devices (μPADs), which have developed into a fast-growing new field in analytical chemistry. After introducing the fundamentals of μPADs and inkjet printing, it touches on topics such as the microfluidic patterning of paper, tailored arrangement of materials, and functionalities achievable exclusively by the inkjet deposition of analytical assay components, before concluding with an outlook on future perspectives.

  11. Big data and high-performance analytics in structural health monitoring for bridge management

    NASA Astrophysics Data System (ADS)

    Alampalli, Sharada; Alampalli, Sandeep; Ettouney, Mohammed

    2016-04-01

    Structural Health Monitoring (SHM) can be a vital tool for effective bridge management. Combining large data sets from multiple sources to create a data-driven decision-making framework is crucial for the success of SHM. This paper presents a big data analytics framework that combines multiple data sets correlated with functional relatedness to convert data into actionable information that empowers risk-based decision-making. The integrated data environment incorporates near real-time streams of semi-structured data from remote sensors, historical visual inspection data, and observations from structural analysis models to monitor, assess, and manage risks associated with aging bridge inventories. Accelerated processing of the data sets is made possible by four technologies: cloud computing, relational database processing, support from a NoSQL database, and in-memory analytics. The framework is being validated on a railroad corridor that can be subjected to multiple hazards. The framework enables computation of reliability indices for critical bridge components and individual bridge spans. In addition, the framework includes a risk-based decision-making process that enumerates the costs and consequences of poor bridge performance at the span and network levels when rail networks are exposed to natural hazard events such as floods and earthquakes. Big data and high-performance analytics enable insights that assist bridge owners in addressing problems faster.
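
    As a hedged illustration of the component reliability index the framework reports, the sketch below uses the classical first-order expression for independent, normally distributed resistance and load effect; the numbers are hypothetical and are not taken from the paper.

        # First-order reliability index for one bridge member (hypothetical values).
        import math

        mu_R, sigma_R = 1200.0, 120.0   # member capacity, e.g. kN
        mu_S, sigma_S = 800.0, 160.0    # load effect inferred from monitoring data, e.g. kN

        beta = (mu_R - mu_S) / math.sqrt(sigma_R**2 + sigma_S**2)   # reliability index
        p_f = 0.5 * math.erfc(beta / math.sqrt(2))                  # implied failure probability
        print(f"reliability index beta = {beta:.2f}, failure probability = {p_f:.3e}")
        # With these numbers beta = 2.0, i.e. a failure probability of about 2.3e-2.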

  12. Analytical solution of the simplified spherical harmonics equations in spherical turbid media

    NASA Astrophysics Data System (ADS)

    Edjlali, Ehsan; Bérubé-Lauzière, Yves

    2016-10-01

    We present for the first time an analytical solution for the simplified spherical harmonics equations (so-called SPN equations) in the case of a steady-state isotropic point source inside a spherical homogeneous absorbing and scattering medium. The SPN equations provide a reliable approximation to the radiative transfer equation for describing light transport inside turbid media. The SPN equations consist of a set of coupled partial differential equations and the eigen method is used to obtain a set of decoupled equations, each resembling the heat equation in the Laplace domain. The equations are solved for the realistic partial reflection boundary conditions accounting for the difference in refractive indices between the turbid medium and its environment (air) as occurs in practical cases of interest in biomedical optics. Specifically, we provide the complete solution methodology for the SP3, which is readily applicable to higher orders as well, and also give results for the SP5. This computationally easy to obtain solution is investigated for different optical properties of the turbid medium. For validation, the solution is also compared to the analytical solution of the diffusion equation and to gold standard Monte Carlo simulation results. The SP3 and SP5 analytical solutions prove to be in good agreement with the Monte Carlo results. This work provides an additional tool for validating numerical solutions of the SPN equations for curved geometries.
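
    The diffusion-equation benchmark mentioned above has a standard closed form for a steady-state isotropic point source of power S in an infinite homogeneous medium; it is quoted here only as the limiting comparison expression (the paper's geometry is a bounded sphere with partial-reflection boundary conditions, so its solution differs).

        % Fluence rate of an isotropic point source in the diffusion approximation:
        \Phi(r) = \frac{S}{4\pi D r}\, e^{-\mu_{\mathrm{eff}} r},
        \qquad
        D = \frac{1}{3\left(\mu_a + \mu_s'\right)},
        \qquad
        \mu_{\mathrm{eff}} = \sqrt{\mu_a / D},
        % with \mu_a the absorption coefficient and \mu_s' the reduced scattering coefficient.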

  13. METABOLOMICS AS A DIAGNOSTIC TOOL FOR SMALL FISH TOXICOLOGY RESEARCH

    EPA Science Inventory

    Metabolomics involves the application of advanced analytical and statistical tools to profile changes in levels of endogenous metabolites in tissues and biofluids resulting from disease onset or stress. While certain metabolites are being specifically targeted in these studies, w...

  14. Visual analytics for multimodal social network analysis: a design study with social scientists.

    PubMed

    Ghani, Sohaib; Kwon, Bum Chul; Lee, Seungyoon; Yi, Ji Soo; Elmqvist, Niklas

    2013-12-01

    Social network analysis (SNA) is becoming increasingly concerned not only with actors and their relations, but also with distinguishing between different types of such entities. For example, social scientists may want to investigate asymmetric relations in organizations with strict chains of command, or incorporate non-actors such as conferences and projects when analyzing coauthorship patterns. Multimodal social networks are those where actors and relations belong to different types, or modes, and multimodal social network analysis (mSNA) is accordingly SNA for such networks. In this paper, we present a design study that we conducted with several social scientist collaborators on how to support mSNA using visual analytics tools. Based on an open-ended, formative design process, we devised a visual representation called parallel node-link bands (PNLBs) that splits modes into separate bands and renders connections between adjacent ones, similar to the list view in Jigsaw. We then used the tool in a qualitative evaluation involving five social scientists whose feedback informed a second design phase that incorporated additional network metrics. Finally, we conducted a second qualitative evaluation with our social scientist collaborators that provided further insights on the utility of the PNLBs representation and the potential of visual analytics for mSNA. PMID:24051769

  15. ASSESS (Analytic System and Software for Evaluating Safeguards and Security) update: Current status and future developments

    SciTech Connect

    Al-Ayat, R.A. ); Cousins, T.D. ); Hoover, E.R. )

    1990-07-15

    The Analytic System and Software for Evaluating Safeguards and Security (ASSESS) has been released for use by DOE field offices and their contractors. In October, 1989, we offered a prototype workshop to selected representatives of the DOE community. Based on the prototype results, we held the first training workshop at the Central Training Academy in January, 1990. Four additional workshops are scheduled for FY 1990. ASSESS is a state-of-the-art analytical tool for management to conduct integrated evaluation of safeguards systems at facilities handling nuclear materials. Currently, ASSESS focuses on the threat of theft/diversion of special nuclear material by insiders, outsiders, and a special form of insider/outsider collusion. ASSESS also includes a neutralization module. Development of the tool is continuing. Plans are underway to expand the capabilities of ASSESS to evaluate against violent insiders, to validate the databases, to expand the neutralization module, and to assist in demonstrating compliance with DOE Material Control and Accountability (MC&A) Order 5633.3. These new capabilities include the ability to: compute a weighted average for performance capability against a spectrum of insider adversaries; conduct defense-in-depth analyses; and analyze against protracted theft scenarios. As they become available, these capabilities will be incorporated in our training program. ASSESS is being developed jointly by Lawrence Livermore and Sandia National Laboratories under the sponsorship of the Department of Energy (DOE) Office of Safeguards and Security.

  16. Chemiluminescence microarrays in analytical chemistry: a critical review.

    PubMed

    Seidel, Michael; Niessner, Reinhard

    2014-09-01

    Multi-analyte immunoassays on microarrays and on multiplex DNA microarrays have been described for quantitative analysis of small organic molecules (e.g., antibiotics, drugs of abuse, small molecule toxins), proteins (e.g., antibodies or protein toxins), and microorganisms, viruses, and eukaryotic cells. In analytical chemistry, multi-analyte detection by use of analytical microarrays has become an innovative research topic because of the possibility of generating several sets of quantitative data for different analyte classes in a short time. Chemiluminescence (CL) microarrays are powerful tools for rapid multiplex analysis of complex matrices. A wide range of applications for CL microarrays is described in the literature dealing with analytical microarrays. The motivation for this review is to summarize the current state of CL-based analytical microarrays. Combining analysis of different compound classes on CL microarrays reduces analysis time, cost of reagents, and use of laboratory space. Applications are discussed, with examples from food safety, water safety, environmental monitoring, diagnostics, forensics, toxicology, and biosecurity. The potential and limitations of research on multiplex analysis by use of CL microarrays are discussed in this review.

  17. Polyimide processing additives

    NASA Technical Reports Server (NTRS)

    Fletcher, James C. (Inventor); Pratt, J. Richard (Inventor); St.clair, Terry L. (Inventor); Stoakley, Diane M. (Inventor); Burks, Harold D. (Inventor)

    1992-01-01

    A process for preparing polyimides having enhanced melt flow properties is described. The process consists of heating a mixture of a high molecular weight poly-(amic acid) or polyimide with a low molecular weight amic acid or imide additive in the range of 0.05 to 15 percent by weight of additive. The polyimide powders so obtained show improved processability, as evidenced by lower melt viscosity by capillary rheometry. Likewise, films prepared from mixtures of polymers with additives show improved processability with earlier onset of stretching by TMA.

  18. Polyimide processing additives

    NASA Technical Reports Server (NTRS)

    Pratt, J. Richard (Inventor); St.clair, Terry L. (Inventor); Stoakley, Diane M. (Inventor); Burks, Harold D. (Inventor)

    1993-01-01

    A process for preparing polyimides having enhanced melt flow properties is described. The process consists of heating a mixture of a high molecular weight poly-(amic acid) or polyimide with a low molecular weight amic acid or imide additive in the range of 0.05 to 15 percent by weight of the additive. The polyimide powders so obtained show improved processability, as evidenced by lower melt viscosity by capillary rheometry. Likewise, films prepared from mixtures of polymers with additives show improved processability with earlier onset of stretching by TMA.

  19. Visual Analytics for Law Enforcement: Deploying a Service-Oriented Analytic Framework for Web-based Visualization

    SciTech Connect

    Dowson, Scott T.; Bruce, Joseph R.; Best, Daniel M.; Riensche, Roderick M.; Franklin, Lyndsey; Pike, William A.

    2009-04-14

    This paper presents key components of the Law Enforcement Information Framework (LEIF) that provides communications, situational awareness, and visual analytics tools in a service-oriented architecture supporting web-based desktop and handheld device users. LEIF simplifies interfaces and visualizations of well-established visual analytical techniques to improve usability. Advanced analytics capability is maintained by enhancing the underlying processing to support the new interface. LEIF development is driven by real-world user feedback gathered through deployments at three operational law enforcement organizations in the US. LEIF incorporates a robust information ingest pipeline supporting a wide variety of information formats. LEIF also insulates interface and analytical components from information sources making it easier to adapt the framework for many different data repositories.

  20. Automated Standard Hazard Tool

    NASA Technical Reports Server (NTRS)

    Stebler, Shane

    2014-01-01

    The current system used to generate standard hazard reports is considered cumbersome and iterative. This study defines a structure for this system's process in a clear, algorithmic way so that standard hazard reports and basic hazard analysis may be completed using a centralized, web-based computer application. To accomplish this task, a test server is used to host a prototype of the tool during development. The prototype is configured to integrate easily into NASA's current server systems with minimal alteration. Additionally, the tool is easily updated and provides NASA with a system that may grow to accommodate future requirements and, possibly, different applications. Results of this project's success are outlined in positive, subjective reviews completed by payload providers and NASA Safety and Mission Assurance personnel. Ideally, this prototype will increase interest in the concept of standard hazard automation and lead to the full-scale production of a user-ready application.

  1. Avionics System Architecture Tool

    NASA Technical Reports Server (NTRS)

    Chau, Savio; Hall, Ronald; Traylor, Marcus; Whitfield, Adrian

    2005-01-01

    Avionics System Architecture Tool (ASAT) is a computer program intended for use during the avionics-system-architecture design phase of the process of designing a spacecraft for a specific mission. ASAT enables simulation of the dynamics of the command-and-data-handling functions of the spacecraft avionics in the scenarios in which the spacecraft is expected to operate. ASAT is built upon I-Logix Statemate MAGNUM, providing a complement of dynamic system modeling tools, including a graphical user interface (GUI), model-checking capabilities, and a simulation engine. ASAT augments this with a library of predefined avionics components and additional software to support building and analyzing avionics hardware architectures using these components.

  2. Health informatics and analytics - building a program to integrate business analytics across clinical and administrative disciplines.

    PubMed

    Tremblay, Monica Chiarini; Deckard, Gloria J; Klein, Richard

    2016-07-01

    Health care organizations must develop integrated health information systems to respond to the numerous government mandates driving the movement toward reimbursement models emphasizing value-based and accountable care. Success in this transition requires integrated data analytics, supported by the combination of health informatics, interoperability, business process design, and advanced decision support tools. This case study presents the development of a master's level cross- and multidisciplinary informatics program offered through a business school. The program provides students from diverse backgrounds with the knowledge, leadership, and practical application skills of health informatics, information systems, and data analytics that bridge the interests of clinical and nonclinical professionals. This case presents the actions taken and challenges encountered in navigating intra-university politics, specifying curriculum, recruiting the requisite interdisciplinary faculty, innovating the educational format, managing students with diverse educational and professional backgrounds, and balancing multiple accreditation agencies. PMID:27274022

  4. Additional Types of Neuropathy

    MedlinePlus

    Charcot's Joint, also called neuropathic arthropathy, ... can stop bone destruction and aid healing. Cranial neuropathy affects the 12 pairs of nerves ...

  5. Food Additives and Hyperkinesis

    ERIC Educational Resources Information Center

    Wender, Ester H.

    1977-01-01

    The hypothesis that food additives are causally associated with hyperkinesis and learning disabilities in children is reviewed, and available data are summarized. Available from: American Medical Association 535 North Dearborn Street Chicago, Illinois 60610. (JG)

  6. Smog control fuel additives

    SciTech Connect

    Lundby, W.

    1993-06-29

    A method is described of controlling, reducing, or eliminating ozone and related smog resulting from photochemical reactions between ozone and automotive or industrial gases, comprising the addition of iodine or compounds of iodine to hydrocarbon-base fuels prior to or during combustion in an amount of about 1 part iodine per 240 to 10,000,000 parts fuel, by weight, to be accomplished by: (a) the addition of these inhibitors during or after the refining or manufacturing process of liquid fuels; (b) the production of these inhibitors for addition into fuel tanks, such as automotive or industrial tanks; or (c) the addition of these inhibitors into combustion chambers of equipment utilizing solid fuels for the purpose of reducing ozone.

  7. Analytical Chemistry Laboratory progress report for FY 1985

    SciTech Connect

    Green, D.W.; Heinrich, R.R.; Jensen, K.J.

    1985-12-01

    The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of technical support services to the scientific and engineering programs at ANL. In addition, ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques. The purpose of this report is to summarize the technical and administrative activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year 1985 (October 1984 through September 1985). This is the second annual report for the ACL. 4 figs., 1 tab.

  8. THE FUTURE OF SUSTAINABLE MANAGEMENT APPROACHES AND REVITALIZATION TOOLS-ELECTRONIC (SMARTE): 2006-2010

    EPA Science Inventory

    SMARTe is being developed to give stakeholders information resources, analytical tools, communication strategies, and a decision analysis approach to be able to make better decisions regarding future uses of property. The development of the communication tools and decision analys...

  9. Visualizing Flow of Uncertainty through Analytical Processes.

    PubMed

    Wu, Yingcai; Yuan, Guo-Xun; Ma, Kwan-Liu

    2012-12-01

    Uncertainty can arise in any stage of a visual analytics process, especially in data-intensive applications with a sequence of data transformations. Additionally, throughout the process of multidimensional, multivariate data analysis, uncertainty due to data transformation and integration may split, merge, increase, or decrease. This dynamic characteristic along with other features of uncertainty pose a great challenge to effective uncertainty-aware visualization. This paper presents a new framework for modeling uncertainty and characterizing the evolution of the uncertainty information through analytical processes. Based on the framework, we have designed a visual metaphor called uncertainty flow to visually and intuitively summarize how uncertainty information propagates over the whole analysis pipeline. Our system allows analysts to interact with and analyze the uncertainty information at different levels of detail. Three experiments were conducted to demonstrate the effectiveness and intuitiveness of our design.
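
    As an assumption-labeled illustration of the idea that uncertainty splits, merges, and grows across pipeline stages, the sketch below carries an uncertainty estimate alongside each value through two transformations and a merge. It assumes independent Gaussian errors combined in quadrature, which is only one possible propagation model and not necessarily the framework the paper proposes.

        # Toy model of uncertainty propagating through an analysis pipeline
        # (assumes independent errors combined in quadrature; illustrative only).
        import math

        def transform(value, sigma, scale):
            """Linear transformation: the uncertainty scales with the data."""
            return value * scale, abs(scale) * sigma

        def merge(values_sigmas):
            """Merge branches by averaging; uncertainties combine in quadrature."""
            n = len(values_sigmas)
            mean = sum(v for v, _ in values_sigmas) / n
            sigma = math.sqrt(sum(s ** 2 for _, s in values_sigmas)) / n
            return mean, sigma

        # A two-branch pipeline: transform each branch, then merge them.
        a = transform(10.0, 0.5, scale=2.0)   # branch 1
        b = transform(12.0, 0.8, scale=1.5)   # branch 2
        merged = merge([a, b])
        print(f"merged value = {merged[0]:.2f} +/- {merged[1]:.2f}")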

  10. An insight-based longitudinal study of visual analytics.

    PubMed

    Saraiya, Purvi; North, Chris; Lam, Vy; Duca, Karen A

    2006-01-01

    Visualization tools are typically evaluated in controlled studies that observe the short-term usage of these tools by participants on preselected data sets and benchmark tasks. Though such studies provide useful suggestions, they miss the long-term usage of the tools. A longitudinal study of a bioinformatics data set analysis is reported here. The main focus of this work is to capture the entire analysis process that an analyst goes through from a raw data set to the insights sought from the data. The study provides interesting observations about the use of visual representations and interaction mechanisms provided by the tools, and also about the process of insight generation in general. This deepens our understanding of visual analytics, guides visualization developers in creating more effective visualization tools in terms of user requirements, and guides evaluators in designing future studies that are more representative of insights sought by users from their data sets.

  11. Process Mapping: Tools, Techniques, & Critical Success Factors.

    ERIC Educational Resources Information Center

    Kalman, Howard K.

    2002-01-01

    Explains process mapping as an analytical tool and a process intervention that performance technologists can use to improve human performance by reducing error variance. Highlights include benefits of process mapping; and critical success factors, including organizational readiness, time commitment by participants, and the availability of a…

  12. Prioritizing pesticide compounds for analytical methods development

    USGS Publications Warehouse

    Norman, Julia E.; Kuivila, Kathryn M.; Nowell, Lisa H.

    2012-01-01

    compounds are high priority as new analytes. The objective for analytical methods development is to design an integrated analytical strategy that includes as many of the Tier 1 pesticide compounds as possible in a relatively few, cost-effective methods. More than 60 percent of the Tier 1 compounds are high priority because they are anticipated to be present at concentrations approaching levels that could be of concern to human health or aquatic life in surface water or groundwater. An additional 17 percent of Tier 1 compounds were frequently detected in monitoring studies, but either were not measured at levels potentially relevant to humans or aquatic organisms, or do not have benchmarks available with which to compare concentrations. The remaining 21 percent are pesticide degradates that were included because their parent pesticides were in Tier 1. Tier 1 pesticide compounds for water span all major pesticide use groups and a diverse range of chemical classes, with herbicides and their degradates composing half of compounds. Many of the high priority pesticide compounds also are in several national regulatory programs for water, including those that are regulated in drinking water by the U.S. Environmental Protection Agency under the Safe Drinking Water Act and those that are on the latest Contaminant Candidate List. For sediment, a total of 175 pesticide compounds were classified as Tier 1 and, thus, are high priority for inclusion in analytical methods available for monitoring and studies. More than 60 percent of these compounds are included in some USGS analytical method; however, some are spread across several research methods that are expensive to perform, and monitoring data are not extensive for many compounds. The remaining Tier 1 compounds for sediment are high priority as new analytes. The objective for analytical methods development for sediment is to enhance an existing analytical method that currently includes nearly half of the pesticide compounds in Tier 1
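
    The abstract describes a rule-based screening in which compounds reach Tier 1 because measured concentrations approach health or aquatic-life benchmarks, because they are frequently detected, or because their parent pesticide is already Tier 1. The Python sketch below mimics that kind of screening; the thresholds, field names, and example values are hypothetical and are not the USGS classification criteria.

        # Illustrative tier screening for pesticide compounds (hypothetical
        # thresholds; not the actual USGS prioritization rules).
        def classify(compound):
            """Return a priority tier from simple screening attributes."""
            near_benchmark = (compound["benchmark_ug_L"] is not None and
                              compound["max_conc_ug_L"] >= 0.1 * compound["benchmark_ug_L"])
            if near_benchmark:
                return "Tier 1: approaches a level of potential concern"
            if compound["detection_freq"] >= 0.10:
                return "Tier 1: frequently detected (or no benchmark available)"
            if compound["parent_in_tier1"]:
                return "Tier 1: degradate of a Tier 1 parent"
            return "Lower tier"

        example = {"name": "example herbicide", "max_conc_ug_L": 2.0,
                   "benchmark_ug_L": 3.0, "detection_freq": 0.45,
                   "parent_in_tier1": False}
        print(example["name"], "->", classify(example))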

  13. Design intent tool: User guide

    SciTech Connect

    Mills, Evan; Abell, Daniel; Bell, Geoffrey; Faludi, Jeremy; Greenberg, Steve; Hitchcock, Rob; Piette, Mary Ann; Sartor, Dalei; Stum, Karl

    2002-08-23

    This database tool provides a structured approach to recording design decisions that impact a facility's design intent in areas such as energy efficiency. Owners and designers alike can plan, monitor, and verify that a facility's design intent is being met during each stage of the design process. Additionally, the Tool gives commissioning agents, facility operators, and future owners and renovators an understanding of how the building and its subsystems are intended to operate, and thus a basis to track and benchmark performance.
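
    As a rough sketch of the kind of structured record such a tool stores, the following Python fragment defines a design-intent entry with a measurable objective and a verification status that a commissioning agent can later fill in. The field names and example values are hypothetical, not the Tool's actual schema.

        # Hypothetical structure for a design-intent record tracked through
        # design, commissioning, and operation (illustrative only).
        from dataclasses import dataclass

        @dataclass
        class DesignIntentRecord:
            system: str            # e.g., "Lighting - open office"
            objective: str         # measurable statement of design intent
            design_value: float
            units: str
            verified: bool = False
            verification_note: str = ""

        record = DesignIntentRecord(
            system="Lighting - open office",
            objective="Installed lighting power density",
            design_value=0.8,
            units="W/ft2",
        )

        # A commissioning agent later records the verification outcome.
        record.verified = True
        record.verification_note = "Field measurement of 0.78 W/ft2 meets design intent."
        print(record)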

  14. A pilot bridging data integration and analytics: BioMediator and R?

    PubMed

    Jeng, S; Wang, K; Barbero, J; Brinkley, J; Tarczy-Hornoch, P

    2005-01-01

    Biological research today involves aggregating and analyzing large amounts of data from disparate sources. Tools such as the University of Washington's BioMediator system integrate heterogeneous data. Analytic packages such as the R environment have a rich set of tools to analyze biomedical research data. Our pilot project bridged data integration and analytics in a general way by successfully incorporating the BioMediator system into the R platform for specific analyses on neurophysiologic research data.

  15. Bibliometric mapping: eight decades of analytical chemistry, with special focus on the use of mass spectrometry.

    PubMed

    Waaijer, Cathelijn J F; Palmblad, Magnus

    2015-01-01

    In this Feature we use automatic bibliometric mapping tools to visualize the history of analytical chemistry from the 1920s until the present. In particular, we have focused on the application of mass spectrometry in different fields. The analysis shows major shifts in research focus and use of mass spectrometry. We conclude by discussing the application of bibliometric mapping and visualization tools in analytical chemists' research.
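
    The Feature relies on dedicated bibliometric mapping software; purely as an illustration of the core step behind many such maps, the sketch below counts keyword co-occurrences across publication records. The record structure and keywords are invented for the example.

        # Minimal sketch of term co-occurrence counting, a common building block
        # of bibliometric maps (illustrative; not the tools used in the Feature).
        from collections import Counter
        from itertools import combinations

        records = [
            {"year": 1995, "keywords": ["mass spectrometry", "electrospray", "proteins"]},
            {"year": 2012, "keywords": ["mass spectrometry", "proteomics", "imaging"]},
        ]

        def cooccurrence(recs):
            counts = Counter()
            for rec in recs:
                for pair in combinations(sorted(set(rec["keywords"])), 2):
                    counts[pair] += 1
            return counts

        for pair, n in cooccurrence(records).most_common(3):
            print(pair, n)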

  16. Airport vulnerability assessment: an analytical approach

    NASA Astrophysics Data System (ADS)

    Lazarick, Richard T.

    1998-12-01

    The Airport Vulnerability Assessment Project (AVAP) is the direct result of congressional funding of recommendation 3.13 of the White House Commission on Aviation Safety and Security. This project takes a new approach to the assessment of U.S. commercial airports. AVAP uses automation, analytical methods, and tools to evaluate vulnerability and risk and to analyze costs and benefits in a more quantitative manner. This paper addresses both the process used to conduct this program and a generalized look at the results achieved for the initial airport assessments. The process description covers the acquisition approach, the project structure, and a review of the various methodologies and tools being used by the seven performing organizations (Abacus Technology, Battelle, CTI, Lockwood Greene, Naval Facilities Engineering Service Center, SAIC, and Science & Engineering Associates). The tools described include ASSESS, SAM, RiskWatch, CASRAP, and AVAT. Included in the process is the utilization of an advisory panel made up predominantly of experts from the National Laboratories (Sandia, Oak Ridge, Argonne, and Brookhaven). The results portion addresses the findings and products resulting from the initial airport assessments. High-level (unrestricted) summaries of the results are presented, along with initial trends in commonly recommended security improvements (countermeasures). Opportunities for the application of optics technology are identified.
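
    The abstract mentions quantitative risk and cost/benefit analysis without giving the formulation. A common textbook formulation, shown below only as an assumed illustration and not as AVAP's documented methodology or any of the named tools, scores risk as threat x vulnerability x consequence and ranks countermeasures by risk reduction per dollar of cost.

        # Generic risk and cost/benefit screening (an assumed textbook
        # formulation, not the specific AVAP methodology).
        def risk(threat, vulnerability, consequence):
            """Expected annual loss from 0-1 likelihood factors and a dollar consequence."""
            return threat * vulnerability * consequence

        def benefit_cost(baseline_risk, residual_risk, annual_cost):
            """Risk reduction bought per dollar of countermeasure cost."""
            return (baseline_risk - residual_risk) / annual_cost

        baseline = risk(threat=0.3, vulnerability=0.6, consequence=5_000_000)
        with_cm = risk(threat=0.3, vulnerability=0.2, consequence=5_000_000)
        print(f"baseline expected loss: ${baseline:,.0f}/yr")
        print(f"benefit/cost ratio    : {benefit_cost(baseline, with_cm, annual_cost=150_000):.1f}")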

  17. System analysis: Developing tools for the future

    SciTech Connect

    De Jong, K.; Clever, J.; Draper, J.V.; Davies, B.; Lonks, A.

    1996-02-01

    This report introduces and evaluates system analysis tools that were developed, or are under development, for the Robotics Technology Development Program (RTDP). Additionally, it discusses system analysis work completed with these tools toward a system analysis of the retrieval of waste from underground storage tanks on the Hanford Reservation near Richland, Washington. The tools developed and evaluated include a mixture of commercially available tools adapted to RTDP requirements and some tools developed in-house. The tools covered in this report include a Process Diagramming Tool, a Cost Modeling Tool, an Amortization Modeling Tool, a graphical simulation linked to the Cost Modeling Tool, a decision assistance tool, and a system thinking tool. Additionally, the importance of performance testing to the RTDP is discussed, along with the results of the testing executed. Further, the results of the Tank Waste Retrieval (TWR) System Diagram, the TWR Operations Cost Model, and the TWR Amortization Model are presented, and the implications of the results are discussed. Finally, the RTDP system analysis tools are assessed and some recommendations are made regarding continuing development of the tools and process.
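
    The report names a Cost Modeling Tool and an Amortization Modeling Tool without detailing them here. As an assumption-labeled example of the kind of calculation an amortization model performs, the sketch below applies the standard capital-recovery formula; the dollar figures, rate, and lifetime are invented for illustration.

        # Minimal capital-recovery (amortization) calculation of the kind such a
        # tool might perform (standard engineering-economics formula; the actual
        # RTDP models are not described in this abstract).
        def annualized_cost(capital, rate, years, annual_operating=0.0):
            """Capital recovery factor times capital cost, plus yearly operating cost."""
            crf = rate * (1 + rate) ** years / ((1 + rate) ** years - 1)
            return capital * crf + annual_operating

        # Example: $2.5M retrieval system, 7% discount rate, 10-year life.
        print(f"${annualized_cost(2_500_000, 0.07, 10, annual_operating=300_000):,.0f} per year")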

  18. Additive Manufacturing Infrared Inspection

    NASA Technical Reports Server (NTRS)

    Gaddy, Darrell

    2014-01-01

    Additive manufacturing is a rapid prototyping technology that allows parts to be built in a series of thin layers from plastics, ceramics, and metallics. Metallic additive manufacturing is an emerging form of rapid prototyping that allows complex structures to be built using various metallic powders. Significant time and cost savings have also been observed using metallic additive manufacturing compared with traditional techniques. Development of the metallic additive manufacturing technology has advanced significantly over the last decade, although many of the techniques to inspect parts made from these processes have not advanced significantly or have limitations. Several external geometry inspection techniques exist, such as Coordinate Measurement Machines (CMM), laser scanners, structured light scanning systems, or even traditional calipers and gages. All of the aforementioned techniques are limited to external geometry and contours or must use a contact probe to inspect limited internal dimensions. This presentation will document the development of a process for a real-time dimensional inspection technique and digital quality record of the additive manufacturing process using infrared camera imaging and processing techniques.

  19. Hanford performance evaluation program for Hanford site analytical services

    SciTech Connect

    Markel, L.P.

    1995-09-01

    The U.S. Department of Energy (DOE) Order 5700.6C, Quality Assurance, and Title 10 of the Code of Federal Regulations, Part 830.120, Quality Assurance Requirements, state that it is the responsibility of DOE contractors to ensure that "quality is achieved and maintained by those who have been assigned the responsibility for performing the work." The Hanford Analytical Services Quality Assurance Plan (HASQAP) is designed to meet the needs of the Richland Operations Office (RL) for maintaining a consistent level of quality for the analytical chemistry services provided by contractor and commercial analytical laboratory operations. Therefore, analytical services supporting Hanford environmental monitoring, environmental restoration, and waste management shall meet appropriate quality standards. This performance evaluation program will monitor the quality standards of all analytical laboratories supporting the Hanford Site, including on-site and off-site laboratories. The monitoring and evaluation of laboratory performance can be completed by the use of several tools. This program will discuss the tools that will be utilized for laboratory performance evaluations. Revision 0 will primarily focus on presently available programs using readily available performance evaluation materials provided by DOE, EPA, or commercial sources. Discussion of project-specific PE materials and evaluations is provided in Section 9.0 and Appendix A.
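
    Performance evaluation programs commonly score a laboratory's result on a blind PE sample against a reference value, for example with a z-score. The sketch below shows that generic convention only; the acceptance limits are typical interlaboratory-study values and are not HASQAP's specific criteria.

        # Generic z-score evaluation of a laboratory's result on a performance
        # evaluation sample (a common convention; not HASQAP's specific criteria).
        def z_score(reported, reference, sigma):
            return (reported - reference) / sigma

        def rating(z):
            if abs(z) <= 2:
                return "satisfactory"
            if abs(z) <= 3:
                return "questionable"
            return "unsatisfactory"

        z = z_score(reported=48.7, reference=50.0, sigma=1.5)
        print(f"z = {z:.2f} -> {rating(z)}")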

  20. Sharpening the health policy analytical rapier: Comment on "The politics and analytics of health policy"

    PubMed Central

    Powell, Martin

    2014-01-01

    This commentary on the Editorial ‘The politics and analytics of health policy’ by Professor Calum Paton focuses on two issues. First, it points to the unclear links between ideas, ideology, values, and discourse on the one hand and policy on the other, and warns that discourse is often a poor guide to enacted policy. Second, it suggests that realism, particularly ‘programme theory’, offers useful tools for health policy analysis. ‘Market reform’ cannot be reduced to a simple ‘four legs good, two legs bad’ verdict, and programme theory might suggest that certain mechanisms may be good for one outcome in a particular context, but bad for another. PMID:24847488