Science.gov

Sample records for additional analytical tools

  1. Analytic tools for information warfare

    SciTech Connect

    Vandewart, R.L.; Craft, R.L.

    1996-05-01

    Information warfare and system surety (tradeoffs between system functionality, security, safety, reliability, cost, usability) have many mechanisms in common. Sandia's experience has shown that an information system must be assessed from a system perspective in order to adequately identify and mitigate the risks present in the system. While some tools are available to help in this work, the process is largely manual. An integrated, extensible set of assessment tools would help the surety analyst. This paper describes one approach to surety assessment used at Sandia, identifies the difficulties in this process, and proposes a set of features desirable in an automated environment to support this process.

  2. Analytical Web Tool for CERES Products

    NASA Astrophysics Data System (ADS)

    Mitrescu, C.; Chu, C.; Doelling, D.

    2012-12-01

    The CERES project provides the community with climate-quality observed TOA fluxes, consistent cloud properties, and computed profile and surface fluxes. The 11-year data set has proved invaluable to the remote sensing and climate modeling communities for studies of annual global mean energy, meridional heat transport, consistent clouds and fluxes, and climate trends. Moreover, a broader audience interested in Earth's radiative properties, such as green energy, health, and environmental companies, has shown interest in CERES derived products. A few years ago, the CERES team started developing a new web-based Ordering Tool tailored to this wide diversity of users. Recognizing the potential that web-2.0 technologies offer for both Quality Control (QC) and scientific data visualization and manipulation, the CERES team began introducing a series of specialized functions that address these needs. In an attractive, easy-to-use, modern web-based format, the Ordering Tool added the following analytical functions: i) 1-D histograms that display the distribution of a data field to identify outliers, useful for QC purposes; ii) an "anomaly" map that shows the regional differences between the current month and the climatological monthly mean; iii) a 2-D histogram that can identify potential problems with the data (i.e., a QC function) or provide a global view of trends and/or correlations between various CERES flux, cloud, aerosol, and atmospheric properties. The large volume and diversity of data, together with the on-the-fly execution, were the main challenges that had to be tackled. Depending on the application, the execution is done either on the browser side or on the server side with the help of auxiliary files. Additional challenges came from the use of various open-source applications, the multitude of CERES products, and the seamless transition from previous development. For the future, we plan on expanding the analytical capabilities of the
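
    The "anomaly" function above reduces to a simple field difference. A minimal sketch of that computation in Python/NumPy follows; the array shapes, values, and function names are illustrative assumptions, not part of the CERES Ordering Tool.

        # Sketch: an "anomaly" map as described above -- the difference between
        # the current month and the climatological monthly mean. All data and
        # names here are toy placeholders, not CERES products.
        import numpy as np

        def monthly_climatology(fields):
            """Mean over same-calendar-month fields from different years."""
            return np.mean(np.stack(fields), axis=0)

        def anomaly_map(current, climatology):
            """Regional differences: current month minus climatological mean."""
            return current - climatology

        # Toy data: eleven "Julys" of TOA flux on a coarse 18 x 36 grid.
        rng = np.random.default_rng(0)
        julys = [240.0 + 5.0 * rng.standard_normal((18, 36)) for _ in range(11)]
        anom = anomaly_map(julys[-1], monthly_climatology(julys[:-1]))
        print("largest absolute anomaly (W/m^2):", float(np.abs(anom).max()))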

  3. Additive manufacturing of tools for lapping glass

    NASA Astrophysics Data System (ADS)

    Williams, Wesley B.

    2013-09-01

    Additive manufacturing technologies have the ability to directly produce parts with complex geometries without the need for secondary processes, tooling, or fixtures. This ability was used to produce concave lapping tools with a VFlash 3D printer from 3D Systems. The lapping tools were first designed in Creo Parametric with a defined constant radius and radial groove pattern. The models were converted to stereolithography files, which the VFlash used in building the parts, layer by layer, from a UV-curable resin. The tools were rotated at 60 rpm and used with 120 grit and 220 grit silicon carbide lapping paste to lap 0.750" diameter fused silica workpieces. The samples developed a matte appearance on the lapped surface that started as a ring at the edge of the workpiece and expanded to the center, indicating that as material was removed, the workpiece radius was beginning to match the tool radius. The workpieces were then cleaned and lapped on a second tool (with equivalent geometry) using a 3000 grit aluminum oxide (corundum) lapping paste until a near-specular surface was achieved. By using lapping tools that have been additively manufactured, fused silica workpieces can be lapped to approach a specified convex geometry. This approach may enable more rapid lapping of near net shape workpieces that minimizes the material removal required by subsequent polishing. This research may also enable development of new lapping tool geometries and groove patterns for improved loose-abrasive finishing.

  4. Aptamers: molecular tools for analytical applications.

    PubMed

    Mairal, Teresa; Ozalp, Veli Cengiz; Lozano Sánchez, Pablo; Mir, Mònica; Katakis, Ioanis; O'Sullivan, Ciara K

    2008-02-01

    Aptamers are artificial nucleic acid ligands, specifically generated against certain targets, such as amino acids, drugs, proteins or other molecules. In nature they exist as nucleic acid based genetic regulatory elements called riboswitches. Artificial aptamers are isolated from combinatorial libraries of synthetic nucleic acids via an in vitro iterative process of adsorption, recovery and reamplification known as systematic evolution of ligands by exponential enrichment (SELEX). Thanks to their unique characteristics and chemical structure, aptamers are ideal candidates for use in analytical devices and techniques. Recent progress in aptamer selection and in the incorporation of aptamers into molecular beacon structures will ensure their application in functional and quantitative proteomics and high-throughput screening for drug discovery, as well as in various analytical applications. The properties of aptamers, as well as recent developments in improved, time-efficient methods for their selection and stabilization, are outlined. The use of these powerful molecular tools for analysis and the advantages they offer over existing affinity biocomponents are discussed. Finally, the evolving use of aptamers in specific analytical applications such as chromatography, ELISA-type assays, biosensors and affinity PCR, as well as current avenues of research and future perspectives, concludes this review. PMID:17581746

  5. Cryogenic Propellant Feed System Analytical Tool Development

    NASA Technical Reports Server (NTRS)

    Lusby, Brian S.; Miranda, Bruno M.; Collins, Jacob A.

    2011-01-01

    The Propulsion Systems Branch at NASA's Lyndon B. Johnson Space Center (JSC) has developed a parametric analytical tool to address the need to rapidly predict heat leak into propellant distribution lines based on insulation type, installation technique, line supports, penetrations, and instrumentation. The Propellant Feed System Analytical Tool (PFSAT) will also determine the optimum orifice diameter for an optional thermodynamic vent system (TVS) to counteract heat leak into the feed line and ensure temperature constraints at the end of the feed line are met. PFSAT was developed primarily in Fortran 90 because of its number-crunching power and its ability to directly access the real fluid property subroutines in the Reference Fluid Thermodynamic and Transport Properties (REFPROP) database developed by NIST. A Microsoft Excel front end was implemented because of its convenient portability among a wide variety of potential users and its support for a user-friendly graphical user interface (GUI) developed in Visual Basic for Applications (VBA). The focus of PFSAT is on-orbit reaction control and orbital maneuvering systems, but it may also be used to predict heat leak into ground-based transfer lines. PFSAT is expected to be used for rapid initial design of cryogenic propellant distribution lines and thermodynamic vent systems. Once validated, PFSAT will support concept trades for a variety of cryogenic fluid transfer systems on spacecraft, including planetary landers, transfer vehicles, and propellant depots, as well as surface-based transfer systems. The details of the development of PFSAT, its user interface, and the program structure will be presented.
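
    For readers unfamiliar with the kind of calculation PFSAT automates, a minimal sketch of a first-order insulation heat-leak estimate follows. It is a textbook cylindrical-conduction formula with illustrative numbers, not PFSAT's actual model, which also treats supports, penetrations, instrumentation, and REFPROP fluid properties.

        # Sketch: conduction heat leak through cylindrical line insulation,
        # Q = 2*pi*k*L*(T_hot - T_cold) / ln(r_out / r_in).
        # All values are illustrative, not PFSAT data.
        import math

        def insulation_heat_leak(k, length, r_in, r_out, t_hot, t_cold):
            """Heat leak (W) through a cylindrical insulation shell."""
            return 2.0 * math.pi * k * length * (t_hot - t_cold) / math.log(r_out / r_in)

        # 5 m of line, MLI-like effective conductivity 1e-4 W/(m*K),
        # 25 mm inner / 50 mm outer radius, 300 K ambient to 90 K cryogen.
        q = insulation_heat_leak(k=1e-4, length=5.0, r_in=0.025, r_out=0.050,
                                 t_hot=300.0, t_cold=90.0)
        print(f"estimated heat leak: {q:.3f} W")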

  6. ANALYTICAL TOOL DEVELOPMENT FOR AFTERTREATMENT SUB-SYSTEMS INTEGRATION

    SciTech Connect

    Bolton, B; Fan, A; Goney, K; Pavlova-MacKinnon, Z; Sisken, K; Zhang, H

    2003-08-24

    The stringent emissions standards of 2007 and beyond require complex engine, aftertreatment, and vehicle systems with a high degree of sub-system interaction and flexible control solutions. This necessitates a system-based approach to technology development, in addition to individual sub-system optimization. Analytical tools can provide an effective means to evaluate and develop such complex technology interactions, as well as to understand phenomena that are either too expensive or impossible to study with conventional experimental means. The analytical effort can also guide experimental development and thus lead to efficient utilization of available experimental resources. A suite of analytical models has been developed to represent PM and NOx aftertreatment sub-systems. These models range from computationally inexpensive zero-dimensional models for real-time control applications to CFD-based, multi-dimensional models with detailed temporal and spatial resolution. Such models, in conjunction with well-established engine modeling tools such as engine cycle simulation, engine controls modeling, CFD models of non-combusting and combusting flow, and vehicle models, provide a comprehensive analytical toolbox for complete engine, aftertreatment, and vehicle sub-system development and system integration applications. However, the fidelity of aftertreatment models and their application going forward is limited by the lack of fundamental kinetic data.
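
    As an illustration of the zero-dimensional end of the model range described above, here is a minimal sketch of a perfectly mixed catalyst brick with first-order conversion; the Arrhenius constants and operating values are invented placeholders, not calibrated kinetic data.

        # Sketch: zero-dimensional catalyst model with first-order conversion,
        # dc/dt = (c_in - c)/tau - k(T)*c, integrated with explicit Euler.
        # Rate constants are illustrative placeholders.
        import math

        def step(c, c_in, tau, temp, dt, a=1.0e7, ea=5.0e4, r_gas=8.314):
            k = a * math.exp(-ea / (r_gas * temp))      # Arrhenius rate, 1/s
            return c + dt * ((c_in - c) / tau - k * c)

        c = c_in = 400e-6                               # inlet NOx mole fraction
        tau, temp, dt = 0.05, 550.0, 1e-3               # residence time (s), K, s
        for _ in range(1000):                           # simulate one second
            c = step(c, c_in, tau, temp, dt)
        print(f"steady outlet/inlet ratio: {c / c_in:.2f}")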

  7. A Qualitative Evaluation of Evolution of a Learning Analytics Tool

    ERIC Educational Resources Information Center

    Ali, Liaqat; Hatala, Marek; Gasevic, Dragan; Jovanovic, Jelena

    2012-01-01

    LOCO-Analyst is a learning analytics tool we developed to provide educators with feedback on students' learning activities and performance. Evaluation of the first version of the tool led to the enhancement of the tool's data visualization, user interface, and supported feedback types. The second evaluation of the improved tool allowed us to see…

  8. ANALYTICAL TOOLS FOR GROUNDWATER POLLUTION ASSESSMENT

    EPA Science Inventory

    This paper deals with the development of analytical screening-exposure models (indices) and their potential application to regulate the use of hazardous chemicals and the design of groundwater buffer strips. The indices describe the leaching of solutes below the root zone (mass f...

  9. Analytical Tools Interface for Landscape Assessments

    EPA Science Inventory

    Environmental management practices are trending away from simple, local-scale assessments toward complex, multiple-stressor regional assessments. Landscape ecology provides the theory behind these assessments while geographic information systems (GIS) supply the tools to implemen...

  10. Analytical Tools for Space Suit Design

    NASA Technical Reports Server (NTRS)

    Aitchison, Lindsay

    2011-01-01

    As indicated by the implementation of multiple small project teams within the agency, NASA is adopting a lean approach to hardware development that emphasizes quick product realization and rapid response to shifting program and agency goals. Over the past two decades, space suit design has been evolutionary in approach, with emphasis on building prototypes and then testing with the largest practical range of subjects possible. The results of these efforts show continuous improvement but make scaled design and performance predictions almost impossible with limited budgets and little time. Thus, in an effort to start changing the way NASA approaches space suit design and analysis, the Advanced Space Suit group has initiated the development of an integrated design and analysis tool. It is a multi-year, if not decadal, development effort that, when fully implemented, is envisioned to generate analysis of any given space suit architecture or, conversely, predictions of ideal space suit architectures given specific mission parameters. The master tool will exchange information to and from a set of five sub-tool groups in order to generate the desired output. The basic functions of each sub-tool group, the initial relationships between the sub-tools, and a comparison to state-of-the-art software and tools are discussed.

  11. ANALYTICAL TOOLS INTERFACE FOR LANDSCAPE ASSESSMENTS (ATTILA)

    EPA Science Inventory

    The Landscape Ecology Branch, in cooperation with U.S. EPA Region 4 and TVA, has developed a user-friendly interface to facilitate work with Geographic Information Systems (GIS). GIS have become a powerful tool in the field of landscape ecology. A common application of GIS is the gener...

  12. Guidance for the Design and Adoption of Analytic Tools.

    SciTech Connect

    Bandlow, Alisa

    2015-12-01

    The goal is to make software developers aware of common issues that can impede the adoption of analytic tools. This paper provides a summary of guidelines, lessons learned and existing research to explain what is currently known about what analysts want and how to better understand what tools they do and don't need.

  13. Active controls: A look at analytical methods and associated tools

    NASA Technical Reports Server (NTRS)

    Newsom, J. R.; Adams, W. M., Jr.; Mukhopadhyay, V.; Tiffany, S. H.; Abel, I.

    1984-01-01

    A review of analytical methods and associated tools for active controls analysis and design problems is presented. Approaches employed to develop mathematical models suitable for control system analysis and/or design are discussed. Significant efforts have been expended to develop tools to generate the models from the standpoint of control system designers' needs and develop the tools necessary to analyze and design active control systems. Representative examples of these tools are discussed. Examples where results from the methods and tools have been compared with experimental data are also presented. Finally, a perspective on future trends in analysis and design methods is presented.

  14. Analytical Tools for Cloudscope Ice Measurement

    NASA Technical Reports Server (NTRS)

    Arnott, W. Patrick

    1998-01-01

    The cloudscope is a ground or aircraft instrument for viewing ice crystals impacted on a sapphire window. It is essentially a simple optical microscope with an attached compact CCD video camera whose output is recorded on a Hi-8 mm video cassette recorder equipped with digital time and date recording capability. In aircraft operation the window is at a stagnation point of the flow, so adiabatic compression heats the window to sublimate the ice crystals so that later impacting crystals can be imaged as well. A film heater is used for ground-based operation to provide sublimation, and it can also be used to provide extra heat for aircraft operation. The compact video camera can be focused manually by the operator, and a beam splitter and miniature bulb combination provides illumination for night operation. Several shutter speeds are available to accommodate daytime illumination conditions in direct sunlight. The video images can be used directly to qualitatively assess the crystal content of cirrus clouds and contrails. Quantitative size spectra are obtained with the tools described in this report. Selected portions of the video images are digitized using a PCI-bus frame grabber to form a short movie segment or stack using NIH (National Institutes of Health) Image software with custom macros developed at DRI. The stack can be Fourier-transform filtered with custom, easy-to-design filters to reduce the most objectionable video artifacts. Particle quantification of each slice of the stack is performed using digital image analysis. Data recorded for each particle include particle number and centroid, frame number in the stack, particle area, perimeter, equivalent ellipse maximum and minimum radii, ellipse angle, and pixel number. Each valid particle in the stack is stamped with a unique number. This output can be used to obtain a semiquantitative appreciation of the crystal content. The particle information becomes the raw input for a subsequent program (FORTRAN) that
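
    The per-particle measurements listed above map directly onto standard image-analysis libraries. A minimal sketch follows, using scikit-image region properties in place of the NIH Image macros; the synthetic frame and threshold are assumptions for illustration.

        # Sketch: per-frame particle quantification -- threshold, label
        # connected particles, and record area, perimeter, centroid and
        # equivalent-ellipse axes (cf. the NIH Image macros described above).
        import numpy as np
        from skimage import measure

        rng = np.random.default_rng(1)
        frame = rng.random((128, 128))          # synthetic video frame
        frame[30:40, 50:70] = 2.0               # one bright "ice crystal"

        labels = measure.label(frame > 1.5)     # fixed threshold for the toy frame
        for p in measure.regionprops(labels):
            print("particle", p.label,
                  "area", p.area,
                  "perimeter", round(p.perimeter, 1),
                  "centroid", tuple(round(c, 1) for c in p.centroid),
                  "ellipse axes", round(p.major_axis_length, 1),
                  round(p.minor_axis_length, 1))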

  15. Using Visual Analytics Tool for Improving Data Comprehension

    ERIC Educational Resources Information Center

    Géryk, Jan

    2015-01-01

    The efficacy of animated data visualizations in comparison with static data visualizations is still inconclusive. Some research suggests that the failure to find benefits of animation may relate to the way animations are constructed and perceived. In this paper, we present a visual analytics (VA) tool which makes use of enhanced animated…

  16. Analytical Modelling Of Milling For Tool Design And Selection

    SciTech Connect

    Fontaine, M.; Devillez, A.; Dudzinski, D.

    2007-05-17

    This paper presents an efficient analytical model which allows a wide range of milling operations to be simulated. A geometrical description of common end mills and of their engagement in the workpiece material is proposed. The internal radius of the rounded part of the tool envelope is used to define the considered type of mill. The cutting edge position is described for a constant-lead helix and for a constant local helix angle. A thermomechanical approach to oblique cutting is applied to predict the forces acting on the tool, and these results are compared with experimental data obtained from milling tests on a 42CrMo4 steel for three classical types of mills. The influence of some of the tool's geometrical parameters on the predicted cutting forces is presented in order to propose optimisation criteria for the design and selection of cutting tools.
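
    To make the force-prediction idea concrete, here is a minimal mechanistic sketch of the instantaneous tangential force on one tooth of an end mill. It is a deliberately simplified stand-in for the paper's thermomechanical oblique-cutting model; the specific cutting pressure and feed values are invented.

        # Sketch: mechanistic tangential cutting force in end milling,
        # F_t(phi) = K_t * a_p * h(phi), with chip thickness h = f_z * sin(phi).
        # K_t, f_z and a_p are illustrative, not fitted to 42CrMo4 tests.
        import math

        def tangential_force(phi, f_z=0.1, a_p=2.0, k_t=1800.0):
            """F_t (N) at engagement angle phi (rad); f_z in mm/tooth,
            a_p in mm, k_t in N/mm^2 (specific cutting pressure)."""
            h = f_z * math.sin(phi)             # instantaneous chip thickness, mm
            return k_t * a_p * h if h > 0.0 else 0.0

        for deg in (0, 30, 60, 90, 120, 150, 180):
            f_t = tangential_force(math.radians(deg))
            print(f"phi = {deg:3d} deg   F_t = {f_t:6.1f} N")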

  17. Adequacy of surface analytical tools for studying the tribology of ceramics

    NASA Technical Reports Server (NTRS)

    Sliney, H. E.

    1986-01-01

    Surface analytical tools are very beneficial in tribological studies of ceramics. Traditional methods of optical microscopy, XRD, XRF, and SEM should be combined with newer surface-sensitive techniques, especially AES and XPS. ISS and SIMS can also be useful in providing additional composition details. Tunneling microscopy and electron energy loss spectroscopy are lesser-known techniques that may also prove useful.

  18. Visual-Analytics Tools for Analyzing Polymer Conformational Dynamics

    NASA Astrophysics Data System (ADS)

    Thakur, Sidharth; Tallury, Syamal; Pasquinelli, Melissa

    2010-03-01

    The goal of this work is to supplement existing methods for analyzing spatial-temporal dynamics of polymer conformations derived from molecular dynamics simulations by adapting standard visual-analytics tools. We intend to use these tools to quantify conformational dynamics and chemical characteristics at interfacial domains, and correlate this information to the macroscopic properties of a material. Our approach employs numerical measures of similarities and provides matrix- and graph-based representations of the similarity relationships for the polymer structures. We will discuss some numerical measures that encapsulate geometric and spatial attributes of polymer molecular configurations. These methods supply information on global and local relationships between polymer conformations, which can be used to inspect important characteristics of stable and persistent polymer conformations in specific environments. Initially, we have applied these tools to investigate the interface in polymer nanocomposites between a polymer matrix and carbon nanotube reinforcements and to correlate this information to the macroscopic properties of the material. The results indicate that our visual-analytic approach can be used to compare spatial dynamics of rigid and non-rigid polymers and properties of families of related polymers.

  19. Enabling analytical and Modeling Tools for Enhanced Disease Surveillance

    SciTech Connect

    Dawn K. Manley

    2003-04-01

    Early detection, identification, and warning are essential to minimize casualties from a biological attack. For covert attacks, sick people are likely to provide the first indication of an attack. An enhanced medical surveillance system that synthesizes distributed health indicator information and rapidly analyzes the information can dramatically increase the number of lives saved. Current surveillance methods to detect both biological attacks and natural outbreaks are hindered by factors such as distributed ownership of information, incompatible data storage and analysis programs, and patient privacy concerns. Moreover, because data are not widely shared, few data mining algorithms have been tested on and applied to diverse health indicator data. This project addressed both integration of multiple data sources and development and integration of analytical tools for rapid detection of disease outbreaks. As a first prototype, we developed an application to query and display distributed patient records. This application incorporated need-to-know access control and incorporated data from standard commercial databases. We developed and tested two different algorithms for outbreak recognition. The first is a pattern recognition technique that searches for space-time data clusters that may signal a disease outbreak. The second is a genetic algorithm to design and train neural networks (GANN) that we applied toward disease forecasting. We tested these algorithms against influenza, respiratory illness, and Dengue Fever data. Through this LDRD in combination with other internal funding, we delivered a distributed simulation capability to synthesize disparate information and models for earlier recognition and improved decision-making in the event of a biological attack. The architecture incorporates user feedback and control so that a user's decision inputs can impact the scenario outcome as well as integrated security and role-based access-control for communicating between
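
    As a much simpler stand-in for the space-time cluster and GANN detectors described above, the sketch below flags days whose syndrome counts exceed a rolling baseline; window length, threshold, and the count series are all illustrative.

        # Sketch: rolling z-score alarm on daily syndrome counts -- a toy
        # baseline detector, far simpler than the space-time cluster search
        # or GANN forecasting developed in the project.
        import statistics

        def alarms(counts, window=14, threshold=3.0):
            """Indices of days exceeding mean + threshold*stdev of the
            preceding `window` days."""
            flagged = []
            for i in range(window, len(counts)):
                base = counts[i - window:i]
                mu, sd = statistics.mean(base), statistics.stdev(base)
                if sd > 0 and counts[i] > mu + threshold * sd:
                    flagged.append(i)
            return flagged

        daily = [12, 9, 11, 10, 13, 8, 12, 11, 10, 9, 12, 11, 10, 13,  # baseline
                 11, 12, 10, 38, 41, 45]                               # outbreak
        print("alarms on days:", alarms(daily))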

  20. Ultrafast 2D NMR: an emerging tool in analytical spectroscopy.

    PubMed

    Giraudeau, Patrick; Frydman, Lucio

    2014-01-01

    Two-dimensional nuclear magnetic resonance (2D NMR) spectroscopy is widely used in chemical and biochemical analyses. Multidimensional NMR is also witnessing increased use in quantitative and metabolic screening applications. Conventional 2D NMR experiments, however, are affected by inherently long acquisition durations, arising from their need to sample the frequencies involved along their indirect domains in an incremented, scan-by-scan nature. A decade ago, a so-called ultrafast (UF) approach was proposed, capable of delivering arbitrary 2D NMR spectra involving any kind of homo- or heteronuclear correlation, in a single scan. During the intervening years, the performance of this subsecond 2D NMR methodology has been greatly improved, and UF 2D NMR is rapidly becoming a powerful analytical tool experiencing an expanded scope of applications. This review summarizes the principles and main developments that have contributed to the success of this approach and focuses on applications that have been recently demonstrated in various areas of analytical chemistry--from the real-time monitoring of chemical and biochemical processes, to extensions in hyphenated techniques and in quantitative applications. PMID:25014342
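
    The bottleneck the ultrafast approach removes is the scan-by-scan incrementation of the indirect evolution time t1. The sketch below builds a toy two-frequency 2D FID one t1 increment at a time and Fourier transforms it; all frequencies and decay constants are illustrative.

        # Sketch: conventional 2D NMR acquisition -- one scan per t1
        # increment -- for a toy signal at (40 Hz, 90 Hz), followed by a
        # 2D Fourier transform. Parameters are illustrative only.
        import numpy as np

        n1, n2, dt = 100, 200, 1e-3              # t1 increments, t2 points, dwell (s)
        f1, f2 = 40.0, 90.0                      # indirect / direct frequencies (Hz)
        t2 = np.arange(n2) * dt
        fid = np.zeros((n1, n2), dtype=complex)
        for i in range(n1):                      # one "scan" per t1 value
            t1 = i * dt
            fid[i] = (np.exp(2j * np.pi * f1 * t1)
                      * np.exp(2j * np.pi * f2 * t2)
                      * np.exp(-(t1 + t2) / 0.1))    # simple decay
        spec = np.abs(np.fft.fft2(fid))
        i1, i2 = np.unravel_index(spec.argmax(), spec.shape)
        print("peak at (%.0f Hz, %.0f Hz)" % (np.fft.fftfreq(n1, dt)[i1],
                                              np.fft.fftfreq(n2, dt)[i2]))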

  21. Network Analytical Tool for Monitoring Global Food Safety Highlights China

    PubMed Central

    Nepusz, Tamás; Petróczi, Andrea; Naughton, Declan P.

    2009-01-01

    Background: The Beijing Declaration on food safety and security was signed by over fifty countries with the aim of developing comprehensive programs for monitoring food safety and security on behalf of their citizens. Currently, comprehensive systems for food safety and security are absent in many countries, and the systems that are in place have been developed on different principles, allowing poor opportunities for integration. Methodology/Principal Findings: We have developed a user-friendly analytical tool based on network approaches for instant customized analysis of food alert patterns in the European dataset from the Rapid Alert System for Food and Feed. Data taken from alert logs between January 2003 and August 2008 were processed using network analysis to i) capture complexity, ii) analyze trends, and iii) predict possible effects of interventions by identifying patterns of reporting activities between countries. The detector and transgressor relationships are readily identifiable between countries, which are ranked using i) Google's PageRank algorithm and ii) the HITS algorithm of Kleinberg. The program identifies Iran, China and Turkey as the transgressors with the largest number of alerts. However, when characterized by impact, counting the transgressor index and the number of countries involved, China predominates as a transgressor country. Conclusions/Significance: This study reports the first development of a network analysis approach to inform countries on their transgressor and detector profiles as a user-friendly aid for the adoption of the Beijing Declaration. The ability to instantly access the country-specific components of the several thousand annual reports will enable each country to identify the major transgressors and detectors within its trading network. Moreover, the tool can be used to monitor trading countries for improved detector/transgressor ratios. PMID:19688088
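
    A minimal sketch of the two ranking algorithms named above, applied to a toy detector-to-transgressor alert graph with networkx, follows; the countries and alert counts are invented for illustration.

        # Sketch: ranking countries in a toy detector -> transgressor alert
        # network with PageRank and HITS. Edge weights = number of alerts;
        # all data are invented.
        import networkx as nx

        g = nx.DiGraph()
        alerts = [("DE", "CN", 40), ("UK", "CN", 35), ("IT", "IR", 30),
                  ("DE", "TR", 25), ("FR", "CN", 20), ("UK", "IR", 15)]
        for detector, transgressor, n in alerts:
            g.add_edge(detector, transgressor, weight=n)

        pagerank = nx.pagerank(g, weight="weight")
        hubs, authorities = nx.hits(g)   # hubs ~ detectors, authorities ~ transgressors
        for c in sorted(g, key=pagerank.get, reverse=True):
            print(f"{c}: pagerank={pagerank[c]:.3f}  authority={authorities[c]:.3f}")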

  22. Investigating Analytic Tools for e-Book Design in Early Literacy Learning

    ERIC Educational Resources Information Center

    Roskos, Kathleen; Brueck, Jeremy; Widman, Sarah

    2009-01-01

    Toward the goal of better e-book design to support early literacy learning, this study investigates analytic tools for examining design qualities of e-books for young children. Three research-based analytic tools related to e-book design were applied to a mixed genre collection of 50 e-books from popular online sites. Tool performance varied…

  23. Enhancing simulation of efficiency with analytical tools. [combining computer simulation and analytical techniques for cost reduction]

    NASA Technical Reports Server (NTRS)

    Seltzer, S. M.

    1974-01-01

    Some means of combining both computer simulation and analytical techniques are indicated in order to mutually enhance their efficiency as design tools and to motivate those involved in engineering design to consider using such combinations. While the idea is not new, heavy reliance on computers often seems to overshadow the potential utility of analytical tools. Although the example used is drawn from the area of dynamics and control, the principles espoused are applicable to other fields. In the example, the parameter plane stability analysis technique is described briefly and extended beyond that reported in the literature to increase its utility (through a simple set of recursive formulas) and its applicability (through the portrayal of the effect of varying the sampling period of the computer). The numerical values that were rapidly selected by analysis were found to be correct for the hybrid computer simulation for which they were needed. This obviated the need for cut-and-try methods to choose the numerical values, thereby saving both time and computer utilization.
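
    The sketch below is a brute-force cousin of the parameter-plane idea: it scans gain and sampling period and marks where all roots of a discrete characteristic polynomial lie inside the unit circle. The plant (a zero-order hold on 1/(s(s+1))) is a standard textbook example, not the system from the paper.

        # Sketch: stability region over (gain K, sampling period T) for a
        # ZOH discretization of G(s) = 1/(s(s+1)) with proportional gain K.
        # Characteristic polynomial: z^2 + c1*z + c0 (standard textbook result).
        import numpy as np

        def stable(k, t):
            a = np.exp(-t)
            c1 = k * (t - 1.0 + a) - (1.0 + a)
            c0 = a + k * (1.0 - a - t * a)
            return bool(np.all(np.abs(np.roots([1.0, c1, c0])) < 1.0))

        for t in (0.1, 0.5, 1.0, 2.0):   # longer sampling period -> smaller stable range
            ks = [k for k in np.arange(0.1, 6.0, 0.1) if stable(k, t)]
            print(f"T = {t:3.1f} s: stable for K up to ~{max(ks):.1f}" if ks
                  else f"T = {t:3.1f} s: no stable K in grid")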

  24. Analytical tools for the analysis of fire debris. A review: 2008-2015.

    PubMed

    Martín-Alberca, Carlos; Ortega-Ojeda, Fernando Ernesto; García-Ruiz, Carmen

    2016-07-20

    The analysis of fire debris evidence can offer crucial information to a forensic investigation when, for instance, there is suspicion of the intentional use of ignitable liquids to initiate a fire. Although the evidence analysis in the laboratory is mainly conducted by a handful of well-established methodologies, during the last eight years several authors have proposed noteworthy improvements on these methodologies, suggesting interesting new approaches. This review critically outlines the most up-to-date and suitable tools for the analysis and interpretation of fire debris evidence. The survey of analytical tools covers works published in the 2008-2015 period. It includes sources of consensus-classified reference samples, current standard procedures, new proposals for sample extraction and analysis, and the most novel statistical tools. In addition, this review provides relevant knowledge on the distortion effects of the ignitable liquid chemical fingerprints, which have to be considered during interpretation of results. PMID:27251852

  25. MATRIICES - Mass Analytical Tool for Reactions in Interstellar ICES

    NASA Astrophysics Data System (ADS)

    Isokoski, K.; Bossa, J. B.; Linnartz, H.

    2011-05-01

    The formation of the complex organic molecules (COMs) observed in the inter- and circumstellar medium (ISCM) is driven by a complex chemical network yet to be fully characterized. Interstellar dust grains and the surrounding ice mantles, subject to atom bombardment, UV irradiation, and thermal processing, are believed to provide catalytic sites for such chemistry. However, the solid-state chemical processes and the level of complexity reachable under astronomical conditions remain poorly understood. The conventional laboratory techniques used to characterize solid-state reaction pathways - RAIRS (Reflection Absorption IR Spectroscopy) and TPD (Temperature-Programmed Desorption) - are suitable for the analysis of reactions in ices made of relatively small molecules. For more complex ices comprising a series of different components, as relevant to the interstellar medium, spectral overlap prohibits unambiguous identification of reaction schemes, and these techniques start to fail. Therefore, we have constructed a new and innovative experimental setup for the study of complex interstellar ices, featuring a highly sensitive and unambiguous detection method. MATRIICES (Mass Analytical Tool for Reactions in Interstellar ICES) combines laser ablation with a molecular beam experiment and time-of-flight mass spectrometry (LA-TOF-MS) to sample and analyze ice analogues in situ, at native temperatures, under clean ultra-high-vacuum conditions. The method allows direct sampling and analysis of the ice constituents in real time, using a pulsed UV ablation laser (355-nm Nd:YAG) to vaporize the products in a MALDI-TOF-like detection scheme. The ablated material is caught in a synchronously pulsed molecular beam of inert carrier gas (He) from a supersonic valve and analysed in a reflectron time-of-flight mass spectrometer. The detection limit of the method is expected to exceed that of regular surface techniques substantially. The ultimate goal is to fully

  26. Factors Influencing Beliefs for Adoption of a Learning Analytics Tool: An Empirical Study

    ERIC Educational Resources Information Center

    Ali, Liaqat; Asadi, Mohsen; Gasevic, Dragan; Jovanovic, Jelena; Hatala, Marek

    2013-01-01

    Present research and development offer various learning analytics tools providing insights into different aspects of learning processes. Adoption of a specific tool for practice is based on how its learning analytics are perceived by educators to support their pedagogical and organizational goals. In this paper, we propose and empirically validate…

  27. Supporting student nurses in practice with additional online communication tools.

    PubMed

    Morley, Dawn A

    2014-01-01

    Student nurses' potential isolation and difficulties of learning on placement have been well documented and, despite attempts to make placement learning more effective, evidence indicates the continuing schism between formal learning at university and situated learning on placement. First year student nurses, entering placement for the first time, are particularly vulnerable to the vagaries of practice. During 2012 two first year student nurse seminar groups (52 students) were voluntarily recruited for a mixed method study to determine the usage of additional online communication support mechanisms (Facebook, wiki, an email group and traditional methods of support using individual email or phone) while undertaking their first five week clinical placement. The study explores the possibility of strengthening clinical learning and support by promoting the use of Web 2.0 support groups for student nurses. Results indicate a high level of interactivity in both peer and academic support in the use of Facebook and a high level of interactivity in one wiki group. Students' qualitative comments voice an appreciation of being able to access university and peer support whilst working individually on placement. Recommendations from the study challenge universities to use online communication tools already familiar to students to complement the support mechanisms that exist for practice learning. This is tempered by recognition of the responsibility of academics to ensure their students are aware of safe and effective online communication. PMID:23871299

  28. Individual Development and Latent Groups: Analytical Tools for Interpreting Heterogeneity

    ERIC Educational Resources Information Center

    Thomas, H.; Dahlin, M.P.

    2005-01-01

    Individual differences in development or growth are typically handled under conventional analytical approaches by blocking on the variables thought to contribute to variation, such as sex or age. But such approaches fail when the differences are attributable to latent characteristics (i.e., variables not directly observable beforehand) within the…

  29. A results-based process for evaluation of diverse visual analytics tools

    NASA Astrophysics Data System (ADS)

    Rubin, Gary; Berger, David H.

    2013-05-01

    With the pervasiveness of still and full-motion imagery in commercial and military applications, the need to ingest and analyze these media has grown rapidly in recent years. Additionally, video hosting and live camera websites provide a near real-time view of our changing world with unprecedented spatial coverage. To take advantage of these controlled and crowd-sourced opportunities, sophisticated visual analytics (VA) tools are required to accurately and efficiently convert raw imagery into usable information. Whether investing in VA products or evaluating algorithms for potential development, it is important for stakeholders to understand the capabilities and limitations of visual analytics tools. Visual analytics algorithms are being applied to problems related to Intelligence, Surveillance, and Reconnaissance (ISR), facility security, and public safety monitoring, to name a few. The diversity of requirements means that a one-size-fits-all approach to performance assessment will not work. We present a process for evaluating the efficacy of algorithms in real-world conditions, thereby allowing users and developers of video analytics software to understand software capabilities and identify potential shortcomings. The results-based approach described in this paper uses an analysis of end-user requirements and Concept of Operations (CONOPS) to define Measures of Effectiveness (MOEs), test data requirements, and evaluation strategies. We define metrics that individually do not fully characterize a system, but when used together, are a powerful way to reveal both strengths and weaknesses. We provide examples of data products, such as heatmaps, performance maps, detection timelines, and rank-based probability-of-detection curves.
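
    One of the metrics named above, a rank-based probability-of-detection curve, is easy to state precisely: for each rank cutoff N, the fraction of ground-truth targets in the detector's top-N results. A minimal sketch with synthetic scores follows.

        # Sketch: rank-based probability-of-detection curve from a list of
        # (score, is_target) detections. Scores and labels are synthetic.
        def pd_at_rank(scored, n_targets):
            """Yield (rank, P(detect)) with detections sorted by score."""
            ranked = sorted(scored, key=lambda s: s[0], reverse=True)
            found = 0
            for rank, (_, is_target) in enumerate(ranked, start=1):
                found += is_target
                yield rank, found / n_targets

        detections = [(0.95, 1), (0.90, 0), (0.85, 1), (0.70, 1),
                      (0.60, 0), (0.40, 1), (0.30, 0), (0.20, 0)]
        for rank, pd in pd_at_rank(detections, n_targets=4):
            print(f"top-{rank}: P(detect) = {pd:.2f}")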

  30. Analytical tools for characterizing biopharmaceuticals and the implications for biosimilars

    PubMed Central

    Berkowitz, Steven A.; Engen, John R.; Mazzeo, Jeffrey R.; Jones, Graham B.

    2013-01-01

    Biologics such as monoclonal antibodies are much more complex than small-molecule drugs, which raises challenging questions for the development and regulatory evaluation of follow-on versions of such biopharmaceutical products (also known as biosimilars) and their clinical use once patent protection for the pioneering biologic has expired. With the recent introduction of regulatory pathways for follow-on versions of complex biologics, the role of analytical technologies in comparing biosimilars with the corresponding reference product is attracting substantial interest in establishing the development requirements for biosimilars. Here, we discuss the current state of the art in analytical technologies to assess three characteristics of protein biopharmaceuticals that regulatory authorities have identified as being important in development strategies for biosimilars: post-translational modifications, three-dimensional structures and protein aggregation. PMID:22743980

  31. Analytical tools for the analysis of β-carotene and its degradation products.

    PubMed

    Stutz, H; Bresgen, N; Eckl, P M

    2015-05-01

    β-Carotene, the precursor of vitamin A, possesses pronounced radical scavenging properties. This has centered attention on β-carotene dietary supplementation in healthcare as well as in the therapy of degenerative disorders and several cancer types. However, two intervention trials with β-carotene have revealed adverse effects on two proband groups, that is, cigarette smokers and asbestos-exposed workers. Besides other causative reasons, the detrimental effects observed have been related to the oxidation products of β-carotene. Their generation originates in the polyene structure of β-carotene that is beneficial for radical scavenging, but is also prone to oxidation. Depending on the dominant degradation mechanism, bond cleavage might occur either randomly or at defined positions of the conjugated electron system, resulting in a diversity of cleavage products (CPs). Due to their instability and hydrophobicity, the handling of standards and real samples containing β-carotene and related CPs requires preventive measures during specimen preparation, analyte extraction, and final analysis, to avoid artificial degradation and to preserve the initial analyte portfolio. This review critically discusses different preparation strategies of standards and treatment solutions, and also addresses their protection from oxidation. Additionally, in vitro oxidation strategies for the generation of oxidative model compounds are surveyed. Extraction methods are discussed for volatile and non-volatile CPs individually. Gas chromatography (GC), (ultra)high performance liquid chromatography ((U)HPLC), and capillary electrochromatography (CEC) are reviewed as analytical tools for final analyte analysis. For identity confirmation of analytes, mass spectrometry (MS) is indispensable, and the appropriate ionization principles are comprehensively discussed. The final sections cover analysis of real samples and aspects of quality assurance, namely matrix effects and method validation.

  32. Single cell analytic tools for drug discovery and development

    PubMed Central

    Heath, James R.; Ribas, Antoni; Mischel, Paul S.

    2016-01-01

    The genetic, functional, or compositional heterogeneity of healthy and diseased tissues presents major challenges in drug discovery and development [1-3]. In cancers, heterogeneity may be essential for tumor stability [4], but its precise role in tumor biology is poorly resolved. This challenges the design of accurate disease models for use in drug development, and can confound the interpretation of biomarker levels, and of patient responses to specific therapies. The complex nature of heterogeneous tissues has motivated the development of tools for single cell genomic, transcriptomic, and multiplex proteomic analysis. We review these tools, assess their advantages and limitations, and explore their potential applications in drug discovery and development. PMID:26669673

  33. ANALYTICAL TOOLS INTERFACE FOR LANDSCAPE ASSESSMENTS (ATTILA) ARCVIEW EXTENSION

    EPA Science Inventory

    Geographic Information Systems (GIS) have become a powerful tool in the field of landscape ecology. A common application of GIS is the generation of landscape metrics, which are quantitative measurements of the status or potential health of an area (e.g. ecological region, waters...

  34. Analytic hierarchy process (AHP) as a tool in asset allocation

    NASA Astrophysics Data System (ADS)

    Zainol Abidin, Siti Nazifah; Mohd Jaffar, Maheran

    2013-04-01

    Allocating capital across different assets is the best way to balance risk and reward, and can prevent the loss of large amounts of money. The aim of this paper is therefore to help investors make wise asset-allocation decisions. This paper proposes modifying and adapting the Analytic Hierarchy Process (AHP) model. The AHP model is widely used in various fields of study related to decision making. The results of the case studies show that the proposed model can categorize stocks and determine the portion of capital to invest in each. Hence, it can assist investors in the decision-making process and reduce the risk of loss in stock market investment.
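
    The core AHP computation is compact enough to sketch: priority weights are the principal eigenvector of a pairwise-comparison matrix, checked with a consistency ratio. The three-asset comparison values below are illustrative, not from the paper's case studies.

        # Sketch: AHP priority weights and consistency ratio for a toy
        # 3-asset comparison (Saaty 1-9 scale); values are illustrative.
        import numpy as np

        a = np.array([[1.0, 3.0, 5.0],      # stocks vs bonds vs cash
                      [1/3, 1.0, 3.0],
                      [1/5, 1/3, 1.0]])

        eigvals, eigvecs = np.linalg.eig(a)
        i = int(np.argmax(eigvals.real))
        w = np.abs(eigvecs[:, i].real)
        w /= w.sum()                           # priority weights, sum to 1

        n = a.shape[0]
        ci = (eigvals.real[i] - n) / (n - 1)   # consistency index
        cr = ci / 0.58                         # Saaty random index for n = 3
        print("weights (stocks, bonds, cash):", np.round(w, 3))
        print(f"consistency ratio: {cr:.3f} (acceptable below 0.10)")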

  35. Using decision analytic methods to assess the utility of family history tools.

    PubMed

    Tyagi, Anupam; Morris, Jill

    2003-02-01

    Family history may be a useful tool for identifying people at increased risk of disease and for developing targeted interventions for individuals at higher-than-average risk. This article addresses the issue of how to examine the utility of a family history tool for public health and preventive medicine. We propose the use of a decision analytic framework for the assessment of a family history tool and outline the major elements of a decision analytic approach, including analytic perspective, costs, outcome measurements, and data needed to assess the value of a family history tool. We describe the use of sensitivity analysis to address uncertainty in parameter values and imperfect information. To illustrate the use of decision analytic methods to assess the value of family history, we present an example analysis based on using family history of colorectal cancer to improve rates of colorectal cancer screening. PMID:12568827
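
    A skeleton of the kind of decision-analytic comparison the article describes, expected cost of family-history-targeted screening versus screening everyone, with a one-way sensitivity sweep, is sketched below. Every probability and cost is an invented placeholder, not data from the article.

        # Sketch: expected cost per person of two strategies -- screen only
        # the family-history (high-risk) group vs. screen everyone.
        # All numbers are illustrative placeholders.
        def expected_cost(p_high, screen_all, cost_screen=200.0,
                          cost_cancer=50_000.0, p_cancer_high=0.06,
                          p_cancer_low=0.02, screen_benefit=0.5):
            def arm(p_cancer, screened):
                cost = cost_screen if screened else 0.0
                risk = p_cancer * (screen_benefit if screened else 1.0)
                return cost + risk * cost_cancer
            high = arm(p_cancer_high, True)          # high-risk group always screened
            low = arm(p_cancer_low, screen_all)
            return p_high * high + (1.0 - p_high) * low

        for p in (0.05, 0.10, 0.20):                 # one-way sensitivity analysis
            print(f"p(high risk) = {p:.2f}:"
                  f"  targeted = {expected_cost(p, False):7.0f}"
                  f"  screen-all = {expected_cost(p, True):7.0f}")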

  36. Distributed Engine Control Empirical/Analytical Verification Tools

    NASA Technical Reports Server (NTRS)

    DeCastro, Jonathan; Hettler, Eric; Yedavalli, Rama; Mitra, Sayan

    2013-01-01

    NASA's vision for an intelligent engine will be realized with the development of a truly distributed control system featuring highly reliable, modular, and dependable components capable of both surviving the harsh engine operating environment and providing decentralized functionality. A set of control system verification tools was developed and applied to a C-MAPSS40K engine model, and metrics were established to assess the stability and performance of these control systems on the same platform. A software tool was developed that allows designers to easily assemble a distributed control system in software and immediately assess the overall impacts of the system on the target (simulated) platform, allowing control system designers to converge rapidly on acceptable architectures with consideration of all required hardware elements. The software developed in this program will be installed on a distributed hardware-in-the-loop (DHIL) simulation tool to assist NASA and the Distributed Engine Control Working Group (DECWG) in integrating distributed engine control system (DCS) components onto existing and next-generation engines. The distributed engine control simulator blockset for MATLAB/Simulink and the hardware simulator provide the capability to simulate virtual subcomponents, as well as to swap in actual subcomponents for hardware-in-the-loop (HIL) analysis. Subcomponents can be the communication network, smart sensor or actuator nodes, or a centralized control system. The distributed engine control blockset for MATLAB/Simulink is a software development tool. The software includes an engine simulation, a communication network simulation, control algorithms, and analysis algorithms set up in a modular environment for rapid simulation of different network architectures; the hardware consists of an embedded device running parts of the C-MAPSS engine simulator and controlled through Simulink. The distributed engine control simulation, evaluation, and analysis technology provides unique

  37. Experimental and analytical tools for evaluation of Stirling engine rod seal behavior

    NASA Technical Reports Server (NTRS)

    Krauter, A. I.; Cheng, H. S.

    1979-01-01

    The first year of a two year experimental and analytical program is reported. The program is directed at the elastohydrodynamic behavior of sliding elastomeric rod seals for the Stirling engine. During the year, experimental and analytical tools were developed for evaluating seal leakage, seal friction, and the fluid film thickness at the seal/cylinder interface.

  38. Raman microspectroscopy: a powerful analytic and imaging tool in petrology and geochemistry

    NASA Astrophysics Data System (ADS)

    Beyssac, O.

    2013-12-01

    Raman microspectroscopy is a vibrational spectroscopy technique based on the inelastic scattering of light interacting with molecules. The technique has benefited from recent developments in spectral and spatial resolution as well as in sensitivity, which have made it widely used in the Geosciences. A very attractive aspect of Raman spectroscopy is that it does not require any complex sample preparation. In addition, Raman imaging is now a routine and reliable technique, which makes it competitive with SEM-EDS for mineral mapping, for instance. Raman microspectroscopy is complementary to SEM, EMP, SIMS... as it can provide information not only on mineral chemistry but above all on mineral structure. It is, for instance, the best in situ technique to distinguish mineral polymorphs. In addition, the sensitivity of RM to mineral structure is extremely useful for studying accessory minerals such as oxides or sulphides, as well as graphitic carbons. A brief presentation of the analytical capabilities of modern Raman spectroscopy will be given. Recent applications of RM to petrological and geochemical problems, including Raman imaging, will then be reviewed. The advantages and disadvantages of this technique compared to other micro-analytical tools will be discussed.

  39. Practical applications of surface analytic tools in tribology

    NASA Technical Reports Server (NTRS)

    Ferrante, J.

    1980-01-01

    Many of the currently widely used tools available for surface analysis are described. Those which have the highest applicability for giving elemental and/or compound analysis for problems of interest in tribology, and which are truly surface sensitive (that is, probe less than 10 atomic layers), are presented. The latter group is evaluated in detail in terms of strengths and weaknesses. Emphasis is placed on post facto analysis of experiments performed under 'real' conditions (e.g., in air with lubricants). It is further indicated that such equipment could be used for screening and quality control.

  40. Electron Microscopy: an Analytical Tool for Solid State Physicists

    NASA Astrophysics Data System (ADS)

    van Tendeloo, Gustaaf

    2013-03-01

    For too long the electron microscope has been considered "a big magnifying glass." Modern electron microscopy, however, has evolved into an analytical technique able to provide quantitative data on structure, composition, chemical bonding and magnetic properties. Using lens-corrected instruments it is now possible to determine atom shifts at interfaces with a precision of a few picometers; chemical diffusion at these interfaces can be imaged down to the atomic scale. The chemical nature of surface atoms can be visualized and even the bonding state of the elements (e.g. Mn2+ versus Mn3+) can be detected on an atomic scale. Electron microscopy is in principle a projection technique, but the final dream is to obtain atomic information on materials in three dimensions. We will show that this is no longer a dream and that it is possible using advanced microscopy. We will show evidence of determining the valence change Ce4+ versus Ce3+ at the surface of a CeO2 nanocrystal, the atomic shifts at the interface between LaAlO3 and SrTiO3, and the 3D relaxation of a Au nanocrystal.

  41. Process models: analytical tools for managing industrial energy systems

    SciTech Connect

    Howe, S O; Pilati, D A; Balzer, C; Sparrow, F T

    1980-01-01

    How the process models developed at BNL are used to analyze industrial energy systems is described and illustrated. Following a brief overview of the industry modeling program, the general methodology of process modeling is discussed. The discussion highlights the important concepts, contents, inputs, and outputs of a typical process model. A model of the US pulp and paper industry is then discussed as a specific application of process modeling methodology. Applications addressed with the case study results include projections of energy demand, conservation technology assessment, energy-related tax policies, and sensitivity analysis. A subsequent discussion of these results supports the conclusion that industry process models are versatile and powerful tools for managing industrial energy systems.

  42. Cryogenic analytical tools for LHe distribution system design

    SciTech Connect

    Johnson, R.R.

    1983-07-29

    The two computer programs presented in this paper are both fundamentally general in that they could be applied to other magnet systems. In addition to MFTF-B analyses, these programs will be used on current and future GDC superconducting magnet projects. Future extended capabilities will include transient heating and flow conditions for THERMOSIPHON and multiple magnet quench features for MAGPRS.

  43. categoryCompare, an analytical tool based on feature annotations

    PubMed Central

    Flight, Robert M.; Harrison, Benjamin J.; Mohammad, Fahim; Bunge, Mary B.; Moon, Lawrence D. F.; Petruska, Jeffrey C.; Rouchka, Eric C.

    2014-01-01

    Assessment of high-throughput omics data initially focuses on relative or raw levels of a particular feature, such as an expression value for a transcript, protein, or metabolite. At a second level, analyses of annotations, including known or predicted functions and associations of each individual feature, attempt to distill biological context. Most currently available comparative- and meta-analysis methods are dependent on the availability of identical features across data sets, and concentrate on determining features that are differentially expressed across experiments, some of which may be considered “biomarkers.” The heterogeneity of measurement platforms and the inherent variability of biological systems confound the search for robust biomarkers indicative of a particular condition. In many instances, however, multiple data sets show involvement of common biological processes or signaling pathways, even though individual features are not commonly measured or differentially expressed between them. We developed a methodology, categoryCompare, for cross-platform and cross-sample comparison of high-throughput data at the annotation level. We assessed the utility of the approach using hypothetical data, as well as by determining similarities and differences in the sets of processes in two instances: (1) denervated skin vs. denervated muscle, and (2) colon from Crohn's disease vs. colon from ulcerative colitis (UC). The hypothetical data showed that in many cases comparing annotations gave superior results to comparing only at the gene level. Improved analytical results depended as well on the number of genes included in the annotation term, the amount of noise in relation to the number of genes expressed in unenriched annotation categories, and the specific method by which samples are combined. In the skin vs. muscle denervation comparison, the tissues demonstrated markedly different responses. The Crohn's vs. UC comparison showed gross similarities in inflammatory
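
    The annotation-level comparison can be illustrated compactly: enrich annotation terms per experiment with a hypergeometric test, then compare the significant term sets. The sketch below is a generic Python re-expression of that idea, not the Bioconductor package itself; gene counts are invented.

        # Sketch: compare two experiments at the annotation level -- per-list
        # hypergeometric enrichment, then set operations on significant terms.
        # Universe size, term sizes and hit counts are invented.
        from scipy.stats import hypergeom

        universe = 10_000
        term_sizes = {"inflammation": 150, "axon growth": 90, "lipid metabolism": 200}

        def enriched(hits, list_size, term_size, alpha=0.05):
            # P(X >= hits) for X ~ Hypergeometric(universe, term_size, list_size)
            return hypergeom.sf(hits - 1, universe, term_size, list_size) < alpha

        exp1 = {"inflammation": 12, "axon growth": 2, "lipid metabolism": 3}
        exp2 = {"inflammation": 9, "axon growth": 8, "lipid metabolism": 4}
        sig1 = {t for t, h in exp1.items() if enriched(h, 300, term_sizes[t])}
        sig2 = {t for t, h in exp2.items() if enriched(h, 250, term_sizes[t])}
        print("shared processes:", sig1 & sig2)
        print("experiment-specific:", sig1 ^ sig2)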

  44. More Analytical Tools for Fluids Management in Space

    NASA Astrophysics Data System (ADS)

    Weislogel, Mark

    Continued advances were made during the 2000-2010 decade in the analysis of a class of capillary-driven flows relevant to materials processing and fluids management aboard spacecraft. The class of flows addressed concerns combined forced and spontaneous capillary flows in complex containers with interior edges. Such flows are commonplace in space-based fluid systems and arise from the particular container geometry and wetting properties of the system. Important applications for this work include low-g liquid fill and/or purge operations and passive fluid phase separation operations, where the container (i.e. fuel tank, water processor, etc.) geometry possesses interior edges, and where quantitative information on fluid location, transients, flow rates, and stability is critical. Examples include the storage and handling of liquid propellants and cryogens, water conditioning for life support, fluid phase-change thermal systems, materials processing in the liquid state, and on-orbit biofluids processing, among others. For a growing number of important problems, closed-form expressions for transient three-dimensional flows are possible that, as design tools, replace difficult, time-consuming, and rarely performed numerical calculations. An overview of a selection of solutions in hand is presented, with example problems solved. NASA drop tower, low-g aircraft, and ISS flight experiment results are employed where practical to buttress the theoretical findings. The current review builds on a similar review presented at COSPAR, 2002, for the approximate decade 1990-2000.

  8. Laser: a Tool for Optimization and Enhancement of Analytical Methods

    SciTech Connect

    Preisler, Jan

    1997-01-01

    In this work, we use lasers to enhance the possibilities of laser desorption methods and to optimize a coating procedure for capillary electrophoresis (CE). We use several different instrumental arrangements to characterize matrix-assisted laser desorption (MALD) at atmospheric pressure and in vacuum. In imaging mode, a 488-nm argon-ion laser beam is deflected by two acousto-optic deflectors to scan, via absorption, plumes desorbed at atmospheric pressure. All absorbing species, including neutral molecules, are monitored. Interesting features are observed, e.g. differences between the initial plume and subsequent plumes desorbed from the same spot, or the formation of two plumes from one laser shot. Total plume absorbance can be correlated with the acoustic signal generated by the desorption event. A model equation for the plume velocity as a function of time is proposed. Alternatively, the use of a static laser beam for observation enables reliable determination of plume velocities even when they are very high. Static scattering detection reveals the negative influence of particle spallation on the MS signal. Ion formation during MALD was monitored using 193-nm light to photodissociate a portion of the insulin ion plume. These results define the optimal conditions for desorbing analytes from matrices, as opposed to achieving a compromise between efficient desorption and efficient ionization as is practiced in mass spectrometry. In the CE experiment, we examined changes in a poly(ethylene oxide) (PEO) coating by continuously monitoring the electroosmotic flow (EOF) in a fused-silica capillary during electrophoresis. An imaging CCD camera was used to follow the motion of a fluorescent neutral marker zone, excited by a 488-nm Ar-ion laser, along the length of the capillary. The PEO coating was shown to reduce the velocity of EOF by more than an order of magnitude compared to a bare capillary at pH 7.0. The coating protocol was important, especially at an intermediate pH of 7.7. The increase of p

  9. Non invasive ventilation as an additional tool for exercise training.

    PubMed

    Ambrosino, Nicolino; Cigni, Paolo

    2015-01-01

    Recently, there has been increasing interest in the use of non invasive ventilation (NIV) to increase exercise capacity. In individuals with COPD, NIV during exercise reduces dyspnoea and increases exercise tolerance. Different modalities of mechanical ventilation have been used non-invasively as a tool to increase exercise tolerance in COPD, heart failure and lung and thoracic restrictive diseases. Inspiratory support provides symptomatic benefit by unloading the ventilatory muscles, whereas Continuous Positive Airway Pressure (CPAP) counterbalances the intrinsic positive end-expiratory pressure in COPD patients. Severe stable COPD patients undergoing home nocturnal NIV and daytime exercise training showed some benefits. Furthermore, it has been reported that in chronic hypercapnic COPD under long-term ventilatory support, NIV can also be administered during walking. Despite these results, the role of NIV as a routine component of pulmonary rehabilitation is still to be defined. PMID:25874110

  10. The RESET tephra database and associated analytical tools

    NASA Astrophysics Data System (ADS)

    Bronk Ramsey, Christopher; Housley, Rupert A.; Lane, Christine S.; Smith, Victoria C.; Pollard, A. Mark

    2015-06-01

    An open-access database has been set up to support the research project studying the 'Response of Humans to Abrupt Environmental Transitions' (RESET). The main methodology underlying this project was to use tephra layers to tie together and synchronise the chronologies of stratigraphic records at archaeological and environmental sites. The database has information on occurrences, and chemical compositions, of glass shards from tephra and cryptotephra deposits found across Europe. The data includes both information from the RESET project itself and from the published literature. With over 12,000 major element analyses and over 3000 trace element analyses on glass shards, relevant to 80 late Quaternary eruptions, the RESET project has generated an important archive of data. When added to the published information, the database described here has a total of more than 22,000 major element analyses and nearly 4000 trace element analyses on glass from over 240 eruptions. In addition to the database and its associated data, new methods of data analysis for assessing correlations have been developed as part of the project. In particular an approach using multi-dimensional kernel density estimates to evaluate the likelihood of tephra compositions matching is described here and tested on data generated as part of the RESET project.
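
    A minimal sketch of the kernel-density matching idea is shown below, fitting a multi-dimensional Gaussian KDE to reference glass-shard analyses and evaluating the likelihood of a candidate composition; the compositions are synthetic and the RESET implementation is more elaborate.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Reference glass-shard compositions for one eruption: rows are shards,
# columns are major-element oxides (e.g. SiO2, Al2O3, FeO, CaO) in wt%.
rng = np.random.default_rng(0)
reference = rng.multivariate_normal(
    mean=[72.0, 13.5, 2.8, 1.4],
    cov=np.diag([0.5, 0.2, 0.1, 0.05]),
    size=60,
)

# Fit a multi-dimensional kernel density estimate to the reference analyses.
kde = gaussian_kde(reference.T)

# Candidate shard from an unknown cryptotephra layer.
candidate = np.array([71.8, 13.6, 2.9, 1.35])

# Likelihood of the candidate under the reference distribution; comparing
# this value across eruptions ranks candidate correlations.
print(kde(candidate[:, None])[0])
```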

  11. Ecotoxicity on a stick: A novel analytical tool for predicting the ecotoxicity of petroleum contaminated samples

    SciTech Connect

    Parkerton, T.F.; Stone, M.A.

    1995-12-31

    Hydrocarbons generally elicit toxicity via a nonpolar narcotic mechanism. Recent research suggests that chemicals acting by this mode invoke ecotoxicity when the molar concentration in organism lipids exceeds a critical threshold. Since the ecotoxicity of nonpolar narcotic mixtures appears to be additive, the ecotoxicity of hydrocarbon mixtures depends upon: (1) the partitioning of the individual hydrocarbons comprising the mixture from the environment to lipids, and (2) the total molar sum of the constituent hydrocarbons in lipids. These insights have led previous investigators to advance the concept of biomimetic extraction as a novel tool for assessing potential narcosis-type or baseline ecotoxicity in aqueous samples. Drawing from this earlier work, the authors have developed a method to quantify Bioavailable Petroleum Hydrocarbons (BPHs) in hydrocarbon-contaminated aqueous and soil/sediment samples. A sample is equilibrated with a solid-phase microextraction (SPME) fiber that serves as a surrogate for organism lipids. The total moles of hydrocarbons that partition to the SPME fiber are then quantified using a simple GC/FID procedure. Research conducted to support the development and initial validation of this method will be presented. Results suggest that BPH analyses provide a promising, cost-effective approach for predicting the ecotoxicity of environmental samples contaminated with hydrocarbon mixtures. Consequently, BPH analyses may provide a valuable analytical screening tool for ecotoxicity assessment in product and effluent testing, environmental monitoring and site remediation applications.

  12. 33 CFR 385.33 - Revisions to models and analytical tools.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 3 2010-07-01 2010-07-01 false Revisions to models and... Incorporating New Information Into the Plan § 385.33 Revisions to models and analytical tools. (a) In carrying... Management District, and other non-Federal sponsors shall rely on the best available science including...

  13. 33 CFR 385.33 - Revisions to models and analytical tools.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 3 2011-07-01 2011-07-01 false Revisions to models and... Incorporating New Information Into the Plan § 385.33 Revisions to models and analytical tools. (a) In carrying... Management District, and other non-Federal sponsors shall rely on the best available science including...

  14. 33 CFR 385.33 - Revisions to models and analytical tools.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 3 2013-07-01 2013-07-01 false Revisions to models and... Incorporating New Information Into the Plan § 385.33 Revisions to models and analytical tools. (a) In carrying... Management District, and other non-Federal sponsors shall rely on the best available science including...

  15. 33 CFR 385.33 - Revisions to models and analytical tools.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 3 2014-07-01 2014-07-01 false Revisions to models and... Incorporating New Information Into the Plan § 385.33 Revisions to models and analytical tools. (a) In carrying... Management District, and other non-Federal sponsors shall rely on the best available science including...

  16. The Use of Economic Analytical Tools in Quantifying and Measuring Educational Benefits and Costs.

    ERIC Educational Resources Information Center

    Holleman, I. Thomas, Jr.

    The general objective of this study was to devise quantitative guidelines that school officials can accurately follow in using benefit-cost analysis, cost-effectiveness analysis, ratio analysis, and other similar economic analytical tools in their particular local situations. Specifically, the objectives were to determine guidelines for the…

  17. 33 CFR 385.33 - Revisions to models and analytical tools.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... ARMY, DEPARTMENT OF DEFENSE PROGRAMMATIC REGULATIONS FOR THE COMPREHENSIVE EVERGLADES RESTORATION PLAN Incorporating New Information Into the Plan § 385.33 Revisions to models and analytical tools. (a) In carrying out their responsibilities for implementing the Plan, the Corps of Engineers, the South Florida...

  18. Social Capital: An Analytical Tool for Exploring Lifelong Learning and Community Development. CRLRA Discussion Paper.

    ERIC Educational Resources Information Center

    Kilpatrick, Sue; Field, John; Falk, Ian

    The possibility of using the concept of social capital as an analytical tool for exploring lifelong learning and community development was examined. The following were among the topics considered: (1) differences between definitions of the concept of social capital that are based on collective benefit and those that define social capital as a…

  19. Understanding Emotions as Situated, Embodied, and Fissured: Thinking with Theory to Create an Analytical Tool

    ERIC Educational Resources Information Center

    Kuby, Candace R.

    2014-01-01

    An emerging theoretical perspective is that emotions are a verb or something we do in relation to others. Studies that demonstrate ways to analyze emotions from a performative stance are scarce. In this article, a new analytical tool is introduced; a critical performative analysis of emotion (CPAE) that draws upon three theoretical perspectives:…

  20. (Re)braiding to Tell: Using "Trenzas" as a Metaphorical-Analytical Tool in Qualitative Research

    ERIC Educational Resources Information Center

    Quiñones, Sandra

    2016-01-01

    Metaphors can be used in qualitative research to illuminate the meanings of participant experiences and examine phenomena from insightful and creative perspectives. The purpose of this paper is to illustrate how I utilized "trenzas" (braids) as a metaphorical and analytical tool for understanding the experiences and perspectives of…

  1. Designing a Collaborative Visual Analytics Tool for Social and Technological Change Prediction.

    SciTech Connect

    Wong, Pak C.; Leung, Lai-Yung R.; Lu, Ning; Scott, Michael J.; Mackey, Patrick S.; Foote, Harlan P.; Correia, James; Taylor, Zachary T.; Xu, Jianhua; Unwin, Stephen D.; Sanfilippo, Antonio P.

    2009-09-01

    We describe our ongoing efforts to design and develop a collaborative visual analytics tool to interactively model social and technological change of our society in a future setting. The work involves an interdisciplinary team of scientists from atmospheric physics, electrical engineering, building engineering, social sciences, economics, public policy, and national security. The goal of the collaborative tool is to predict the impact of global climate change on U.S. power grids and its implications for society and national security. These future scenarios provide the critical assessment and information necessary for policymakers and stakeholders to help formulate a coherent, unified strategy toward shaping a safe and secure society. The paper introduces the problem background and related work, explains the motivation and rationale behind our design approach, presents our collaborative visual analytics tool and usage examples, and finally shares the development challenges and lessons learned from our investigation.

  2. Volume, Variety and Veracity of Big Data Analytics in NASA's Giovanni Tool

    NASA Astrophysics Data System (ADS)

    Lynnes, C.; Hegde, M.; Smit, C.; Pan, J.; Bryant, K.; Chidambaram, C.; Zhao, P.

    2013-12-01

    Earth Observation data have posed challenges to NASA users ever since the launch of several satellites around the turn of the century, generating volumes now measured in petabytes, a growth further increased by models assimilating the satellite data. One important approach to bringing Big Data analytic capabilities to bear on this Volume has been the provision of server-side analysis capabilities. For instance, the Geospatial Interactive Online Visualization ANd aNalysis (Giovanni) tool provides a web interface to large volumes of gridded data from several EOSDIS data centers. Giovanni's main objective is to allow the user to explore its data holdings using various forms of visualization and data summarization or aggregation algorithms, examining statistics and pictures for the overall data while eventually acquiring only the most useful data. Much of the preprocessing and data reduction can thus take place on the server, delivering manageable information quantities to the user. In addition to Volume, Giovanni uses open standards to tackle the Variety aspect of Big Data, incorporating data stored in several formats from several data centers and making them available in a uniform data format and structure to both the Giovanni algorithms and the end user. The Veracity aspect of Big Data, perhaps the stickiest of wickets, is addressed through features that enable reproducibility (provenance and URL-driven workflows) and by a Help Desk staffed by scientists with expertise in the science data.

  3. The impact of layer thickness on the performance of additively manufactured lapping tools

    NASA Astrophysics Data System (ADS)

    Williams, Wesley B.

    2015-10-01

    Lower cost additive manufacturing (AM) machines which have emerged in recent years are capable of producing tools, jigs, and fixtures that are useful in optical fabrication. In particular, AM tooling has been shown to be useful in lapping glass workpieces. Various AM machines are distinguished by the processes, materials, build times, and build resolution they provide. This research investigates the impact of varied build resolution (specifically layer resolution) on the lapping performance of tools built using the stereolithographic assembly (SLA) process in 50 μm and 100 μm layer thicknesses with a methacrylate photopolymer resin on a high resolution desktop printer. As with previous work, the lapping tools were shown to remove workpiece material during the lapping process, but the tools themselves also experienced significant wear on the order of 2-3 times the mass loss of the glass workpieces. The tool wear rates for the 100 μm and 50 μm layer tools were comparable, but the 50 μm layer tool was 74% more effective at removing material from the glass workpiece, which is attributed to some abrasive particles being trapped in the coarser surface of the 100 μm layer tooling and not being available to interact with the glass workpiece. Considering the tool wear, these additively manufactured tools are most appropriate for prototype tooling where the low cost (<$45) and quick turnaround make them attractive when compared to a machined tool.

  4. Hasse diagram as a green analytical metrics tool: ranking of methods for benzo[a]pyrene determination in sediments.

    PubMed

    Bigus, Paulina; Tsakovski, Stefan; Simeonov, Vasil; Namieśnik, Jacek; Tobiszewski, Marek

    2016-05-01

    This study presents an application of the Hasse diagram technique (HDT) as an assessment tool to select the most appropriate analytical procedures according to their greenness or the best analytical performance. The dataset consists of analytical procedures for benzo[a]pyrene determination in sediment samples, described by 11 variables concerning their greenness and analytical performance. Two analyses with the HDT were performed: the first with metrological variables and the second with "green" variables as input data. The two HDT analyses ranked different analytical procedures as the most valuable, suggesting that green analytical chemistry is not in accordance with metrology when benzo[a]pyrene in sediment samples is determined. The HDT can be used as a good decision-support tool to choose the proper analytical procedure with respect to green analytical chemistry principles and analytical performance merits. PMID:27038058
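
    At its core, the HDT builds a dominance partial order over the procedures and keeps only its cover relations as the edges of the diagram. A small sketch with hypothetical scores (criteria oriented so that higher is always better, e.g. inverted solvent consumption):

```python
from itertools import permutations

# Hypothetical scores for four analytical procedures on three criteria.
scores = {
    "GC-MS":    (3, 2, 5),
    "HPLC-FLD": (4, 4, 2),
    "SPME-GC":  (5, 3, 4),
    "LLE-HPLC": (2, 1, 1),
}

def dominates(a, b):
    """a >= b on every criterion and strictly better on at least one."""
    return all(x >= y for x, y in zip(scores[a], scores[b])) and scores[a] != scores[b]

# Order relation: all dominance pairs.
order = {(a, b) for a, b in permutations(scores, 2) if dominates(a, b)}

# Cover relation (edges of the Hasse diagram): drop pairs implied by transitivity.
hasse = {(a, b) for (a, b) in order
         if not any((a, c) in order and (c, b) in order for c in scores)}

print(sorted(hasse))
# Maximal (non-dominated) procedures sit at the top of the diagram.
print([p for p in scores if not any((q, p) in order for q in scores)])
```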

  5. Analytical and Empirical Modeling of Wear and Forces of CBN Tool in Hard Turning - A Review

    NASA Astrophysics Data System (ADS)

    Patel, Vallabh Dahyabhai; Gandhi, Anishkumar Hasmukhlal

    2016-06-01

    Machining of steel with hardness above 45 HRC (Rockwell C hardness) is referred to as hard turning. There are numerous models that should be scrutinized and implemented to obtain optimum performance in hard turning. Various models of hard turning with cubic boron nitride tools have been reviewed, in an attempt to identify appropriate empirical and analytical models. Validation of the steady-state flank and crater wear model, Usui's wear model, forces due to oblique cutting theory, the extended Lee and Shaffer's force model, chip formation, and progressive flank wear are depicted in this review paper. Effort has been made to understand the relationship between tool wear and tool force under different cutting conditions and tool geometries, so that the appropriate model can be selected according to user requirements in hard turning.

  6. An integrated analytic tool and knowledge-based system approach to aerospace electric power system control

    NASA Astrophysics Data System (ADS)

    Owens, William R.; Henderson, Eric; Gandikota, Kapal

    1986-10-01

    Future aerospace electric power systems require new control methods because of increasing power system complexity, demands for power system management, greater system size and heightened reliability requirements. To meet these requirements, a combination of electric power system analytic tools and knowledge-based systems is proposed. The continual improvement in microelectronic performance has made it possible to envision the application of sophisticated electric power system analysis tools to aerospace vehicles. These tools have been successfully used in the measurement and control of large terrestrial electric power systems. Among these tools is state estimation, which has three main benefits. The estimator builds a reliable database for the system structure and states. Security assessment and contingency evaluation also require a state estimator. Finally, the estimator, combined with modern control theory, will improve power system control and stability. Bad data detection, as an adjunct to state estimation, identifies defective sensors and communications channels. Validated data from the analytic tools are supplied to a number of knowledge-based systems. These systems will be responsible for the control, protection, and optimization of the electric power system.
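
    As a minimal illustration of the state-estimation and bad-data-detection ideas described above (not code from the proposed system; the measurement model and sensor values are hypothetical), a weighted-least-squares estimate for a linearized measurement model z = Hx + e can be computed directly:

```python
import numpy as np

# Hypothetical linearized measurement model: three sensors observing two states.
H = np.array([[1.0,  0.0],
              [0.0,  1.0],
              [1.0, -1.0]])
sigma = np.array([0.01, 0.01, 0.02])   # sensor standard deviations
W = np.diag(1.0 / sigma**2)            # weights = inverse measurement variances
z = np.array([1.02, 0.97, 0.06])       # measurements (e.g. per-unit values)

# Weighted least squares: x_hat = (H^T W H)^-1 H^T W z
x_hat = np.linalg.solve(H.T @ W @ H, H.T @ W @ z)
residuals = z - H @ x_hat

print("state estimate:", x_hat.round(4))
# Bad-data detection: flag measurements with large normalized residuals.
print("suspect sensors:", np.where(np.abs(residuals / sigma) > 3)[0])
```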

  7. Development of computer-based analytical tool for assessing physical protection system

    NASA Astrophysics Data System (ADS)

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool offers a practical way to evaluate likely threat scenarios. Several tools, such as EASI and SAPE, are readily available; however, for our research purposes it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool that uses a network methodology to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response functions. The tool can identify the most critical path and quantify the probability of effectiveness of the system as a performance measure.
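
    The interruption logic that tools such as EASI formalize can be illustrated with a simplified deterministic sketch (actual EASI treats delays and response times as random variables; the path elements, probabilities, and times below are hypothetical):

```python
# Simplified EASI-style path analysis: each element on an adversary path has
# a detection probability and a delay time; interruption requires the first
# detection to occur while the remaining delay still exceeds the response time.
path = [
    # (name, P_detection, delay_seconds)
    ("fence",      0.5,  30),
    ("door",       0.7,  60),
    ("vault_wall", 0.9, 300),
]
response_time = 240  # seconds, assumed guard-force response time

p_no_detect_so_far = 1.0
p_interrupt = 0.0
for i, (name, p_d, _) in enumerate(path):
    remaining_delay = sum(d for _, _, d in path[i:])
    p_first_detect_here = p_no_detect_so_far * p_d
    if remaining_delay >= response_time:
        p_interrupt += p_first_detect_here
    p_no_detect_so_far *= 1.0 - p_d

print(f"P(interruption) = {p_interrupt:.3f}")
```

    Running the same calculation over every path through the facility network and taking the minimum identifies the most critical (weakest) path.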

  8. Investigation of helium addition for laser-induced plasma spectroscopy of pure gas phase systems: Analyte interactions and signal enhancement

    NASA Astrophysics Data System (ADS)

    Henry, C. A.; Diwakar, P. K.; Hahn, D. W.

    2007-12-01

    The role of helium addition on the analyte signal enhancement in laser-induced breakdown spectroscopy for analysis of pure gaseous systems was examined using carbon and hydrogen atomic emission lines. Increased analyte response, as measured by peak-to-base and signal-to-noise ratios, was observed with increasing helium addition, with maximum enhancement approaching a factor of 7. Additional measurements revealed a significant decrease in plasma electron density with increasing helium addition. To explore the mechanisms of analyte signal enhancement, the helium emission lines were also examined and found to be effectively quenched with nitrogen addition. In consideration of the data, it is concluded that the role of metastable helium is not as important as the overall changes in plasma properties, namely electron density and laser-plasma coupling. Helium addition is concluded to affect the electron density via Penning ionization, as well as to play a role in the initial plasma breakdown processes.

  9. Analytical optimal controls for the state constrained addition and removal of cryoprotective agents

    PubMed Central

    Chicone, Carmen C.; Critser, John K.

    2014-01-01

    Cryobiology is a field with enormous scientific, financial and even cultural impact. Successful cryopreservation of cells and tissues depends on the equilibration of these materials with high concentrations of permeating cryoprotective agents (CPAs) such as glycerol or 1,2-propylene glycol. Because cells and tissues are exposed to highly anisosmotic conditions, the resulting gradients cause large volume fluctuations that have been shown to damage cells and tissues. On the other hand, there is evidence that toxicity at these high chemical concentrations is time dependent, and therefore it is ideal to minimize exposure time as well. Because solute and solvent flux is governed by a system of ordinary differential equations, CPA addition and removal from cells is an ideal context for the application of optimal control theory. Recently, we presented a mathematical synthesis of the optimal controls for the ODE system commonly used in cryobiology in the absence of state constraints, and showed that controls defined by this synthesis were optimal. Here we define the appropriate model, analytically extend the previous theory to one encompassing state constraints, and, as an example, apply this to the critical and clinically important cell type of human oocytes, where current methodologies are either difficult to implement or have very limited success rates. We show that an enormous increase in equilibration efficiency can be achieved under the new protocols when compared to classic protocols, potentially allowing a greatly increased survival rate for human oocytes, and pointing to a direction for the cryopreservation of many other cell types. PMID:22527943

  10. Analytical Design Package (ADP2): A computer aided engineering tool for aircraft transparency design

    NASA Technical Reports Server (NTRS)

    Wuerer, J. E.; Gran, M.; Held, T. W.

    1994-01-01

    The Analytical Design Package (ADP2) is being developed as a part of the Air Force Frameless Transparency Program (FTP). ADP2 is an integrated design tool consisting of existing analysis codes and Computer Aided Engineering (CAE) software. The objective of the ADP2 is to develop and confirm an integrated design methodology for frameless transparencies, related aircraft interfaces, and their corresponding tooling. The application of this methodology will generate high confidence for achieving a qualified part prior to mold fabrication. ADP2 is a customized integration of analysis codes, CAE software, and material databases. The primary CAE integration tool for the ADP2 is P3/PATRAN, a commercial-off-the-shelf (COTS) software tool. The open architecture of P3/PATRAN allows customized installations with different applications modules for specific site requirements. Integration of material databases allows the engineer to select a material, and those material properties are automatically called into the relevant analysis code. The ADP2 materials database will be composed of four independent schemas: CAE Design, Processing, Testing, and Logistics Support. The design of ADP2 places major emphasis on the seamless integration of CAE and analysis modules with a single intuitive graphical interface. This tool is being designed to serve and be used by an entire project team, i.e., analysts, designers, materials experts, and managers. The final version of the software will be delivered to the Air Force in Jan. 1994. The Analytical Design Package (ADP2) will then be ready for transfer to industry. The package will be capable of a wide range of design and manufacturing applications.

  11. ATTIRE (analytical tools for thermal infrared engineering): A sensor simulation and modeling package

    NASA Technical Reports Server (NTRS)

    Jaggi, S.

    1993-01-01

    The Advanced Sensor Development Laboratory (ASDL) at the Stennis Space Center develops, maintains and calibrates remote sensing instruments for the National Aeronautics & Space Administration (NASA). To perform system design trade-offs and analyses, and to establish system parameters, ASDL has developed a software package for the analytical simulation of sensor systems. This package, called 'Analytical Tools for Thermal InfraRed Engineering' (ATTIRE), simulates the various components of a sensor system. The software allows each subsystem of the sensor to be analyzed independently for its performance. These performance parameters are then integrated to obtain system-level information such as Signal-to-Noise Ratio (SNR), Noise Equivalent Radiance (NER), Noise Equivalent Temperature Difference (NETD) etc. This paper describes the uses of the package and the physics used to derive the performance parameters.
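
    One such system-level figure of merit, NETD, can be sketched as the noise-equivalent radiance divided by the derivative of band-integrated blackbody radiance with respect to temperature. This is a generic radiometric calculation, not ATTIRE code; the band limits and NER value below are assumed for illustration.

```python
import numpy as np

# Physical constants (SI)
h, c, k = 6.626e-34, 2.998e8, 1.381e-23

def band_radiance(T, lam_lo=10.5e-6, lam_hi=11.5e-6, n=200):
    """Blackbody radiance integrated over a thermal-IR band [W m^-2 sr^-1]."""
    lam = np.linspace(lam_lo, lam_hi, n)
    spectral = 2 * h * c**2 / lam**5 / (np.exp(h * c / (lam * k * T)) - 1.0)
    return np.trapz(spectral, lam)

T = 300.0    # scene temperature, K
NER = 5e-3   # assumed noise-equivalent radiance, W m^-2 sr^-1

# Numerical derivative of band radiance with respect to temperature.
dL_dT = band_radiance(T + 0.5) - band_radiance(T - 0.5)
NETD = NER / dL_dT
print(f"dL/dT = {dL_dT:.3f} W m^-2 sr^-1 K^-1, NETD = {NETD * 1000:.1f} mK")
```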

  12. Ion mobility spectrometry as a high-throughput analytical tool in occupational pyrethroid exposure.

    PubMed

    Armenta, S; Blanco, M

    2012-08-01

    The capabilities of ion mobility spectrometry (IMS) as a high-throughput and green analytical tool for occupational health and safety control have been demonstrated, using pyrethroids as model compounds. The method used for dermal and inhalation exposure assessment is based on passive pyrethroid sampling using Teflon membranes, direct thermal extraction of the pyrethroids, and measurement of the vaporized analytes by IMS without reagent or solvent consumption. The IMS signatures of the studied synthetic pyrethroids under atmospheric-pressure chemical ionization have been obtained by investigating the negative product ions formed. The main advantages of the proposed procedure are the limits of detection obtained, ranging from 0.08 to 5 ng; the simplicity of measurement; the lack of sample treatment and, therefore, of solvent consumption and waste generation; and the speed of analysis. PMID:22159370

  13. Generalized net analyte signal standard addition as a novel method for simultaneous determination: application in spectrophotometric determination of some pesticides.

    PubMed

    Asadpour-Zeynali, Karim; Saeb, Elhameh; Vallipour, Javad; Bamorowat, Mehdi

    2014-01-01

    Simultaneous spectrophotometric determination of three neonicotinoid insecticides (acetamiprid, imidacloprid, and thiamethoxam) in some binary and ternary synthetic mixtures was investigated using a novel method named the generalized net analyte signal standard addition method (GNASSAM). For this purpose, standard addition was performed using a single standard solution consisting of a mixture of standards of all analytes. Savings in time and in the amount of material used are among the advantages of this method. All determinations showed appropriate applicability of the method, with less than 5% error. The method may be applied to linearly dependent data in the presence of known interferents. The GNASSAM combines the advantages of both the generalized standard addition method and the net analyte signal; therefore, it may be a proper alternative to some other multivariate methods. PMID:24672886
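
    The core computation behind net-analyte-signal standard addition can be sketched in a few lines: project each measured spectrum onto the subspace orthogonal to the known interferent, then extrapolate the standard-addition line of the projected signal back to zero. The spectra and concentrations below are simulated; this illustrates the concept, not the published algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)
wl = np.linspace(230, 320, 90)                      # wavelengths, nm
s_analyte = np.exp(-0.5 * ((wl - 265) / 12) ** 2)   # pure analyte spectrum
s_interf  = np.exp(-0.5 * ((wl - 285) / 15) ** 2)   # known interferent spectrum

# Projector onto the space orthogonal to the interferent spectrum.
S = s_interf[:, None]
P_orth = np.eye(len(wl)) - S @ np.linalg.pinv(S)

def nas_norm(spectrum):
    """Signed norm of the net analyte signal: the part unique to the analyte."""
    v = P_orth @ spectrum
    return np.sign(v @ (P_orth @ s_analyte)) * np.linalg.norm(v)

# Simulated standard additions to a sample with unknown analyte conc. 2.0 (a.u.)
added = np.array([0.0, 1.0, 2.0, 3.0])
spectra = [(2.0 + a) * s_analyte + 1.5 * s_interf
           + rng.normal(0, 0.002, wl.size) for a in added]

# Linear fit of ||NAS|| vs added concentration; extrapolate to zero signal.
slope, intercept = np.polyfit(added, [nas_norm(r) for r in spectra], 1)
print(f"estimated concentration = {intercept / slope:.2f}")
```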

  14. Material Development for Tooling Applications Using Big Area Additive Manufacturing (BAAM)

    SciTech Connect

    Duty, Chad E.; Drye, Tom; Franc, Alan

    2015-03-01

    Techmer Engineered Solutions (TES) is working with Oak Ridge National Laboratory (ORNL) to develop materials and evaluate their use for ORNL s recently developed Big Area Additive Manufacturing (BAAM) system for tooling applications. The first phase of the project established the performance of some commercially available polymer compositions deposited with the BAAM system. Carbon fiber reinforced ABS demonstrated a tensile strength of nearly 10 ksi, which is sufficient for a number of low temperature tooling applications.

  15. Rhodobase, a meta-analytical tool for reconstructing gene regulatory networks in a model photosynthetic bacterium.

    PubMed

    Moskvin, Oleg V; Bolotin, Dmitry; Wang, Andrew; Ivanov, Pavel S; Gomelsky, Mark

    2011-02-01

    We present Rhodobase, a web-based meta-analytical tool for the analysis of transcriptional regulation in a model anoxygenic photosynthetic bacterium, Rhodobacter sphaeroides. The gene association meta-analysis is based on pooled data from 100 R. sphaeroides whole-genome DNA microarrays. Gene-centric regulatory networks were visualized using the StarNet approach (Jupiter, D.C., VanBuren, V., 2008. A visual data mining tool that facilitates reconstruction of transcription regulatory networks. PLoS ONE 3, e1717) with several modifications. We developed a means to identify and visualize operons and superoperons. We designed a framework for the cross-genome search for transcription factor binding sites that takes into account the high GC content and oligonucleotide usage profile characteristic of the R. sphaeroides genome. To facilitate reconstruction of directional relationships between co-regulated genes, we screened upstream sequences (-400 to +20 bp from start codons) of all genes for putative binding sites of bacterial transcription factors using a self-optimizing search method developed here. To test the performance of the meta-analysis tools and transcription factor site predictions, we reconstructed selected nodes of the R. sphaeroides transcription-factor-centric regulatory matrix. The test revealed regulatory relationships that correlate well with experimentally derived data. The database of transcriptional profile correlations, the network visualization engine and the optimized search engine for transcription factor binding site analysis are available at http://rhodobase.org. PMID:21070832

  16. High temperature dielectric constant measurement - another analytical tool for ceramic studies?

    SciTech Connect

    Hutcheon, R.M.; Hayward, P.; Alexander, S.B.

    1995-12-31

    The automation of a high-temperature (1400 °C), microwave-frequency, dielectric constant measurement system has dramatically increased the reproducibility and detail of data. One can now consider using the technique as a standard tool for analytical studies of low-conductivity ceramics and glasses. Simultaneous temperature and frequency scanning dielectric analyses (SDA) yield the temperature-dependent complex dielectric constant. The real part of the dielectric constant is especially sensitive to small changes in the distance and distribution of neighboring ions or atoms, while the absorptive part is strongly dependent on the position and population of electron/hole conduction bands, which are sensitive to impurity concentrations in the ceramic. SDA measurements on a few specific materials will be compared with standard differential thermal analysis (DTA) results and an attempt will be made to demonstrate the utility of both the common and complementary aspects of the techniques.

  17. Electron Tomography: A Three-Dimensional Analytic Tool for Hard and Soft Materials Research.

    PubMed

    Ercius, Peter; Alaidi, Osama; Rames, Matthew J; Ren, Gang

    2015-10-14

    Three-dimensional (3D) structural analysis is essential to understand the relationship between the structure and function of an object. Many analytical techniques, such as X-ray diffraction, neutron spectroscopy, and electron microscopy imaging, are used to provide structural information. Transmission electron microscopy (TEM), one of the most popular analytic tools, has been widely used for structural analysis in both physical and biological sciences for many decades, in which 3D objects are projected into two-dimensional (2D) images. In many cases, 2D-projection images are insufficient to understand the relationship between the 3D structure and the function of nanoscale objects. Electron tomography (ET) is a technique that retrieves 3D structural information from a tilt series of 2D projections, and is gradually becoming a mature technology with sub-nanometer resolution. Distinct methods to overcome sample-based limitations have been separately developed in both physical and biological science, although they share some basic concepts of ET. This review discusses the common basis for 3D characterization, and specifies difficulties and solutions regarding both hard and soft materials research. It is hoped that novel solutions based on current state-of-the-art techniques for advanced applications in hybrid matter systems can be motivated. PMID:26087941

  18. STRMDEPL08 - An Extended Version of STRMDEPL with Additional Analytical Solutions to Calculate Streamflow Depletion by Nearby Pumping Wells

    USGS Publications Warehouse

    Reeves, Howard W.

    2008-01-01

    STRMDEPL, a one-dimensional model using two analytical solutions to calculate streamflow depletion by a nearby pumping well, was extended to account for two additional analytical solutions. The extended program is named STRMDEPL08. The original program incorporated solutions for a stream that fully penetrates the aquifer with and without streambed resistance to ground-water flow. The modified program includes solutions for a partially penetrating stream with streambed resistance and for a stream in an aquitard subjected to pumping from an underlying leaky aquifer. The code also was modified to allow the user to input pumping variations at other than 1-day intervals. The modified code is shown to correctly evaluate the analytical solutions and to provide correct results for half-day time intervals.
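
    One classical member of this family of solutions, the Glover-Balmer result for a fully penetrating stream without streambed resistance, is compact enough to sketch directly; the parameter values below are illustrative only and this is not the STRMDEPL08 code.

```python
import numpy as np
from scipy.special import erfc

def depletion_fraction(t_days, d_ft, T_ft2d, S):
    """Glover-Balmer solution: fraction of pumping supplied by stream capture
    for a fully penetrating stream without streambed resistance.
    d: well-to-stream distance, T: transmissivity, S: storativity."""
    t = np.asarray(t_days, dtype=float)
    return erfc(np.sqrt(S * d_ft**2 / (4.0 * T_ft2d * t)))

# Example: well 1,000 ft from the stream, T = 5,000 ft^2/d, S = 0.2
for t in (1, 10, 30, 90, 365):
    frac = depletion_fraction(t, d_ft=1000.0, T_ft2d=5000.0, S=0.2)
    print(f"t = {t:4d} d : {frac:.2%} of pumping comes from the stream")
```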

  19. Additives

    NASA Technical Reports Server (NTRS)

    Smalheer, C. V.

    1973-01-01

    The chemistry of lubricant additives is discussed to show what the additives are chemically and what functions they perform in the lubrication of various kinds of equipment. Current theories regarding the mode of action of lubricant additives are presented. The additive groups discussed include the following: (1) detergents and dispersants, (2) corrosion inhibitors, (3) antioxidants, (4) viscosity index improvers, (5) pour point depressants, and (6) antifouling agents.

  20. Stable isotope ratio analysis: A potential analytical tool for the authentication of South African lamb meat.

    PubMed

    Erasmus, Sara Wilhelmina; Muller, Magdalena; van der Rijst, Marieta; Hoffman, Louwrens Christiaan

    2016-02-01

    Stable isotope ratios (¹³C/¹²C and ¹⁵N/¹⁴N) of South African Dorper lambs from farms with different vegetation types were measured by isotope ratio mass spectrometry (IRMS), to evaluate it as a tool for the authentication of origin and feeding regime. Homogenised and defatted meat of the Longissimus lumborum (LL) muscle of lambs from seven different farms was assessed. The δ¹³C values were affected by the origin of the meat, mainly reflecting the diet. The Rûens and Free State farms had the lowest (p ⩽ 0.05) δ¹⁵N values, followed by the Northern Cape farms, with Hantam Karoo/Calvinia having the highest δ¹⁵N values. Discriminant analysis of the δ¹³C and δ¹⁵N differences showed promising results for the use of IRMS as a reliable analytical tool for lamb meat authentication. The results suggest that diet, linked to origin, is an important factor to consider regarding region-of-origin classification for South African lamb. PMID:26304440
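
    A hedged sketch of the discriminant-analysis step, with entirely synthetic (δ¹³C, δ¹⁵N) values standing in for the IRMS measurements and hypothetical region means:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Hypothetical (d13C, d15N) means for lamb meat from three regions; real
# values would come from IRMS measurements as in the study.
rng = np.random.default_rng(2)
regions = {"Ruens": (-26.5, 4.0), "Free State": (-24.0, 4.5),
           "Hantam Karoo": (-22.5, 9.0)}
X = np.vstack([rng.normal(mu, 0.6, size=(20, 2)) for mu in regions.values()])
y = np.repeat(list(regions), 20)

# Fit LDA on the two isotope ratios and classify an unknown sample.
lda = LinearDiscriminantAnalysis().fit(X, y)
print("training accuracy:", lda.score(X, y))
print("predicted origin:", lda.predict([[-23.0, 8.5]])[0])
```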

  1. Lowering the Barrier to Reproducible Research by Publishing Provenance from Common Analytical Tools

    NASA Astrophysics Data System (ADS)

    Jones, M. B.; Slaughter, P.; Walker, L.; Jones, C. S.; Missier, P.; Ludäscher, B.; Cao, Y.; McPhillips, T.; Schildhauer, M.

    2015-12-01

    Scientific provenance describes the authenticity, origin, and processing history of research products and promotes scientific transparency by detailing the steps in computational workflows that produce derived products. These products include papers, findings, input data, software products to perform computations, and derived data and visualizations. The geosciences community values this type of information, and, at least theoretically, strives to base conclusions on computationally replicable findings. In practice, capturing detailed provenance is laborious and thus has been a low priority; beyond a lab notebook describing methods and results, few researchers capture and preserve detailed records of scientific provenance. We have built tools for capturing and publishing provenance that integrate into analytical environments that are in widespread use by geoscientists (R and Matlab). These tools lower the barrier to provenance generation by automating capture of critical information as researchers prepare data for analysis, develop, test, and execute models, and create visualizations. The 'recordr' library in R and the `matlab-dataone` library in Matlab provide shared functions to capture provenance with minimal changes to normal working procedures. Researchers can capture both scripted and interactive sessions, tag and manage these executions as they iterate over analyses, and then prune and publish provenance metadata and derived products to the DataONE federation of archival repositories. Provenance traces conform to the ProvONE model extension of W3C PROV, enabling interoperability across tools and languages. The capture system supports fine-grained versioning of science products and provenance traces. By assigning global identifiers such as DOIs, researchers can cite the computational processes used to reach findings. And, finally, DataONE has built a web portal to search, browse, and clearly display provenance relationships between input data, the software

  2. Freeform manufacturing of a progressive addition lens by use of a voice coil fast tool servo

    NASA Astrophysics Data System (ADS)

    Li, Yi Yu; Chen, Jiao Jie; Feng, Hai Hua; Li, Chaohong; Qu, Jia; Chen, Hao

    2014-08-01

    The back surface of a progressive addition lens (PAL) is a non-rotationally symmetric freeform surface. The local radius varies progressively from the far zone to the near zone along the intermediate zone to give the addition power. A numerical simulation method is used to calculate the discrete points on the freeform surface in polar coordinates and to generate the data files containing the diamond tool tip trajectory for surface machining. The PAL is fabricated on a self-developed single-point diamond turning machine with a voice coil fast tool servo. The polished freeform surface profile, measured by a 3-axis coordinate measuring machine, shows little deviation from the simulation result. The surface power and cylinder of the fabricated PAL are also measured for comparison with the theoretical design.

  3. H-point standard additions method for simultaneous determination of sulfamethoxazole and trimethoprim in pharmaceutical formulations and biological fluids with simultaneous addition of two analytes

    NASA Astrophysics Data System (ADS)

    Givianrad, M. H.; Saber-Tehrani, M.; Aberoomand-Azar, P.; Mohagheghian, M.

    2011-03-01

    The applicability of the H-point standard additions method (HPSAM) to resolving the overlapping spectra of sulfamethoxazole and trimethoprim is verified by UV-vis spectrophotometry. The results show that the H-point standard additions method with simultaneous addition of both analytes is suitable for the simultaneous determination of sulfamethoxazole and trimethoprim in aqueous media. Applying the method showed that the two drugs could be determined simultaneously with concentration ratios of sulfamethoxazole to trimethoprim varying from 1:18 to 16:1 in the mixed samples. The limits of detection were 0.58 and 0.37 μmol L⁻¹ for sulfamethoxazole and trimethoprim, respectively, and the mean calculated RSDs (%) in synthetic mixtures were 1.63 and 2.01 for SMX and TMP, respectively. The proposed method has been successfully applied to the simultaneous determination of sulfamethoxazole and trimethoprim in synthetic mixtures, pharmaceutical formulations and biological fluid samples.

  4. Applying decision-making tools to national e-waste recycling policy: an example of Analytic Hierarchy Process.

    PubMed

    Lin, Chun-Hsu; Wen, Lihchyi; Tsai, Yue-Mi

    2010-05-01

    As policy making is in essence a process of discussion, decision-making tools have in many cases been proposed to resolve differences of opinion among the parties involved. In our project, which sought to promote a country's recycling performance, we used the Analytic Hierarchy Process (AHP) to evaluate the possibilities and determine the priority for adding new mandatory recycled wastes, also referred to as Due Recycled Wastes, from candidate waste appliances. The evaluation process started with the collection of data based on telephone interviews and field investigations to understand the behavior of consumers as well as their overall opinions regarding the disposal of certain waste appliances. With these data serving as background information, the research team then implemented the Analytic Hierarchy Process using information that formed an incomplete hierarchy structure in order to determine the priority for recycling. Since the number of objects to be evaluated exceeded the number that the AHP researchers had suggested, we reclassified the objects into four groups and added one more level of pair-wise comparisons, which substantially reduced the inconsistency in the judgments of the AHP participants. The project proved to be a flexible and achievable application of AHP to the environmental policy-making process. In addition, based on the project's outcomes as a whole, the research team drew conclusions regarding the government's need to take back 15 of the items evaluated, and suggested instruments that could be used or recycling regulations that could be changed in the future. Further analysis of the top three items recommended for recycling by the evaluation, namely Compact Disks, Cellular Phones and Computer Keyboards, was then conducted to clarify their concrete feasibility. After the trial period for recycling ordered by the Taiwan Environmental Protection Administration, only Computer
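
    The core AHP computation, deriving a priority vector from a pairwise comparison matrix and checking its consistency, is straightforward to sketch; the comparison values below are hypothetical, not the project's data.

```python
import numpy as np

# Pairwise comparison matrix for four candidate waste items (Saaty 1-9 scale);
# entry (i, j) says how strongly item i is preferred over item j.
A = np.array([
    [1.0,  3.0,  5.0, 7.0],
    [1/3,  1.0,  3.0, 5.0],
    [1/5,  1/3,  1.0, 3.0],
    [1/7,  1/5,  1/3, 1.0],
])

# Priority vector = principal eigenvector of A, normalised to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
i = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, i].real)
w /= w.sum()

# Consistency ratio (CR < 0.1 is the usual acceptance threshold).
n = A.shape[0]
lam_max = eigvals.real[i]
CI = (lam_max - n) / (n - 1)
RI = {3: 0.58, 4: 0.90, 5: 1.12}[n]   # Saaty's random index
print("priorities:", np.round(w, 3), " CR =", round(CI / RI, 3))
```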

  5. Experimental model and analytic solution for real-time observation of vehicle's additional steer angle

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaolong; Li, Liang; Pan, Deng; Cao, Chengmao; Song, Jian

    2014-03-01

    Current research on real-time observation of the vehicle roll steer angle and compliance steer angle (together referred to as the additional steer angle in this paper) mainly employs linear vehicle dynamic models, in which only the lateral acceleration of the vehicle body is considered. The observation accuracy of this method cannot meet the requirements of real-time vehicle stability control, especially under extreme driving conditions. This paper explores a solution based on experimental methods. First, a multi-body dynamic model of a passenger car is built in the ADAMS/Car software, and its dynamic accuracy is verified against the same vehicle's roadway data from steady-state circular tests. Based on this simulation platform, several factors influencing the additional steer angle under different driving conditions are quantitatively analyzed. An ɛ-SVR algorithm is then employed to build the additional steer angle prediction model, whose input vectors mainly comprise the sensor information of a standard electronic stability control (ESC) system. Typical slalom tests and FMVSS 126 tests are used to run simulations, train the model, and test its generalization performance. The test results show that the influence of lateral acceleration on the additional steer angle is greatest (magnitudes up to 1°), followed by longitudinal acceleration/deceleration and road wave amplitude (magnitudes up to 0.3°). Moreover, both the prediction accuracy and the real-time performance of the model meet the control requirements of ESC. This research expands the accurate observation methods for the additional steer angle under extreme driving conditions.
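
    A minimal sketch of such an ε-SVR prediction model, using synthetic stand-ins for the ESC sensor channels and target described above; the feature choices, ranges, and relationship are assumptions, not the paper's data set.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Hypothetical training data: rows are time steps of ESC sensor signals
# [lateral accel (m/s^2), longitudinal accel (m/s^2), yaw rate (rad/s),
#  steering-wheel angle (rad)]; target is the additional steer angle (deg).
rng = np.random.default_rng(3)
X = rng.uniform([-8, -6, -1, -3], [8, 6, 1, 3], size=(500, 4))
y = 0.12 * X[:, 0] + 0.03 * X[:, 1] + rng.normal(0, 0.02, 500)  # synthetic

# epsilon-SVR with an RBF kernel; scaling the inputs first is standard practice.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", epsilon=0.01, C=10.0))
model.fit(X, y)
print("additional steer angle:", model.predict([[6.0, -2.0, 0.3, 1.0]])[0], "deg")
```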

  6. Applying stable isotopes to examine food-web structure: an overview of analytical tools.

    PubMed

    Layman, Craig A; Araujo, Marcio S; Boucek, Ross; Hammerschlag-Peyer, Caroline M; Harrison, Elizabeth; Jud, Zachary R; Matich, Philip; Rosenblatt, Adam E; Vaudo, Jeremy J; Yeager, Lauren A; Post, David M; Bearhop, Stuart

    2012-08-01

    Stable isotope analysis has emerged as one of the primary means for examining the structure and dynamics of food webs, and numerous analytical approaches are now commonly used in the field. Techniques range from simple, qualitative inferences based on the isotopic niche, to Bayesian mixing models that can be used to characterize food-web structure at multiple hierarchical levels. We provide a comprehensive review of these techniques, and thus a single reference source to help identify the most useful approaches to apply to a given data set. We structure the review around four general questions: (1) what is the trophic position of an organism in a food web?; (2) which resource pools support consumers?; (3) what additional information does relative position of consumers in isotopic space reveal about food-web structure?; and (4) what is the degree of trophic variability at the intrapopulation level? For each general question, we detail different approaches that have been applied, discussing the strengths and weaknesses of each. We conclude with a set of suggestions that transcend individual analytical approaches, and provide guidance for future applications in the field. PMID:22051097
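
    Questions (1) and (2) above reduce, in their simplest forms, to one-line calculations; a sketch with assumed enrichment factors and end-member values (commonly used defaults, not values from the review):

```python
# (1) Trophic position from d15N relative to a baseline organism.
d15N_consumer, d15N_base = 12.4, 5.2
trophic_enrichment = 3.4      # per-mil enrichment per trophic level (assumed)
baseline_level = 2            # e.g. the baseline is a primary consumer
TP = baseline_level + (d15N_consumer - d15N_base) / trophic_enrichment
print(f"trophic position = {TP:.2f}")

# (2) Two-source linear mixing model from d13C: fraction of diet from
# source 1 when the consumer's signature lies between two end-members.
d13C_consumer, d13C_src1, d13C_src2 = -18.0, -14.0, -24.0
f1 = (d13C_consumer - d13C_src2) / (d13C_src1 - d13C_src2)
print(f"fraction of diet from source 1 = {f1:.2f}")
```

    The Bayesian mixing models discussed in the review generalize calculation (2) to many sources and propagate uncertainty, but the mass-balance logic is the same.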

  7. Net analyte signal standard addition method for simultaneous determination of sulphadiazine and trimethoprim in bovine milk and veterinary medicines.

    PubMed

    Hajian, Reza; Mousavi, Esmat; Shams, Nafiseh

    2013-06-01

    The net analyte signal standard addition method has been used for the simultaneous spectrophotometric determination of sulphadiazine and trimethoprim in bovine milk and veterinary medicines. The method combines the advantages of the standard addition method with the net analyte signal concept, which enables the extraction of information concerning a certain analyte from spectra of multi-component mixtures. The method has several advantages: it uses the full spectrum, it does not require separate calibration and prediction steps, and only a few measurements are required for the determination. Cloud point extraction, based on the phenomenon of solubilisation, was used to extract sulphadiazine and trimethoprim from bovine milk; it relies on the induction of micellar organised media, using Triton X-100 as the extraction solvent. At the optimum conditions, the norm of the NAS vectors increased linearly with concentration in the range of 1.0-150.0 μmol L⁻¹ for both sulphadiazine and trimethoprim. The limits of detection (LOD) for sulphadiazine and trimethoprim were 0.86 and 0.92 μmol L⁻¹, respectively. PMID:23411170

  8. Net analyte signal standard addition method (NASSAM) as a novel spectrofluorimetric and spectrophotometric technique for simultaneous determination, application to assay of melatonin and pyridoxine

    NASA Astrophysics Data System (ADS)

    Asadpour-Zeynali, Karim; Bastami, Mohammad

    2010-02-01

    In this work a new modification of the standard addition method, called the "net analyte signal standard addition method" (NASSAM), is presented for simultaneous spectrofluorimetric and spectrophotometric analysis. The proposed method combines the advantages of the standard addition method with those of the net analyte signal concept. The method can be applied to the determination of an analyte in the presence of known interferents. In contrast to the H-point standard additions method, the accuracy of the predictions does not depend on the shapes of the analyte and interferent spectra. The method was successfully applied to the simultaneous spectrofluorimetric and spectrophotometric determination of pyridoxine (PY) and melatonin (MT) in synthetic mixtures and in a pharmaceutical formulation.

  9. Common plants as alternative analytical tools to monitor heavy metals in soil

    PubMed Central

    2012-01-01

    Background: Herbaceous plants are common vegetal species generally exposed, for a limited period of time, to bioavailable environmental pollutants. Heavy metal contamination is the most common form of environmental pollution. Herbaceous plants have never been used as natural bioindicators of environmental pollution, in particular to monitor the amount of heavy metals in soil. In this study, we aimed to assess the usefulness of three herbaceous plants (Plantago major L., Taraxacum officinale L. and Urtica dioica L.) and one leguminous plant (Trifolium pratense L.) as alternative indicators for evaluating soil pollution by heavy metals. Results: We employed Inductively Coupled Plasma Atomic Emission Spectroscopy (ICP-AES) to assess the concentrations of selected heavy metals (Cu, Zn, Mn, Pb, Cr and Pd) in soil and plants, and statistical analyses to describe the linear correlation between the accumulation of some heavy metals and the selected vegetal species. We found that the leaves of Taraxacum officinale L. and Trifolium pratense L. can accumulate Cu in a linearly dependent manner, with Urtica dioica L. representing the vegetal species accumulating the highest fraction of Pb. Conclusions: In this study we demonstrated that common plants can be used as an alternative analytical tool for monitoring selected heavy metals in soil. PMID:22594441

  10. Sugar Maple Pigments Through the Fall and the Role of Anthocyanin as an Analytical Tool

    NASA Astrophysics Data System (ADS)

    Lindgren, E.; Rock, B.; Middleton, E.; Aber, J.

    2008-12-01

    Sugar maple habitat is projected to almost disappear in future climate scenarios. In fact, many institutions state that these trees are already in decline. Being able to detect sugar maple health could prove to be a useful analytical tool for monitoring changes in phenology. Anthocyanin, a red pigment found in sugar maples, is thought to be a universal indicator of plant stress. It is very prominent in the spring during the first flush of leaves, as well as in the fall as leaves senesce. Determining an anthocyanin index that could be used with satellite systems would provide a greater understanding of tree phenology and the distribution of plant stress, both over large areas and over time. The utilization of anthocyanin for one of its functions, prevention of oxidative stress, may fluctuate in response to changing climatic conditions that occur during senescence or vary from year to year. By monitoring changes in pigment levels and antioxidant capacity through the fall, one may be able to draw conclusions about the ability to detect anthocyanin remotely from space-based systems, and possibly determine a more specific function for anthocyanin during fall senescence. These results could then be applied to track changes in tree stress.

  11. Raman-in-SEM, a multimodal and multiscale analytical tool: performance for materials and expertise.

    PubMed

    Wille, Guillaume; Bourrat, Xavier; Maubec, Nicolas; Lahfid, Abdeltif

    2014-12-01

    The availability of Raman spectroscopy in a powerful analytical scanning electron microscope (SEM) allows morphological, elemental, chemical, physical and electronic analysis without moving the sample between instruments. This paper documents the metrological performance of the SEMSCA commercial Raman interface operated in a low-vacuum SEM. It provides multiscale and multimodal analyses such as Raman/EDS, Raman/cathodoluminescence or Raman/STEM (STEM: scanning transmission electron microscopy), as well as Raman spectroscopy on nanomaterials. Since Raman spectroscopy in a SEM can be influenced by several SEM-related phenomena, this paper first presents a comparison of this new tool with a conventional micro-Raman spectrometer. Some possible artefacts are then documented, which are due to electron beam-induced contamination or to the contribution of cathodoluminescence to the Raman spectra, especially with geological samples. These effects are easily overcome by changing or adapting the Raman spectrometer and SEM settings and methodology. The adverse effect of cathodoluminescence is eliminated by using an SEM beam shutter during Raman acquisition; conversely, the interface provides the ability to record the cathodoluminescence (CL) spectrum of a phase. In a second part, this study highlights the value and efficiency of the coupling for characterizing micrometric phases at the same point. This multimodal approach is illustrated with various issues encountered in the geosciences. PMID:25016590

  12. Topological data analysis: A promising big data exploration tool in biology, analytical chemistry and physical chemistry.

    PubMed

    Offroy, Marc; Duponchel, Ludovic

    2016-03-01

    An important feature of experimental science is that data of various kinds are being produced at an unprecedented rate. This is mainly due to the development of new instrumental concepts and experimental methodologies. It is also clear that the nature of acquired data is significantly different. Indeed, in every area of science, data take the form of ever bigger tables, where all but a few of the columns (i.e., variables) turn out to be irrelevant to the questions of interest, and further we do not necessarily know which coordinates are the interesting ones. Big data in our labs of biology, analytical chemistry or physical chemistry is a future that might be closer than any of us suppose. It is in this sense that new tools have to be developed in order to explore and valorize such data sets. Topological data analysis (TDA) is one of these. It was developed recently by topologists who discovered that topological concepts could be useful for data analysis. The main objective of this paper is to answer the question of why topology is well suited for the analysis of big data sets in many areas, and even more efficient than conventional data analysis methods. Raman analysis of single bacteria provides a good opportunity to demonstrate the potential of TDA for the exploration of various spectroscopic data sets under different experimental conditions (high noise level, with/without spectral preprocessing, wavelength shift, different spectral resolutions, missing data). PMID:26873463
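
    As a concrete taste of TDA's most elementary output, the sketch below computes zero-dimensional persistence (component birth/death radii) of a point cloud with a union-find pass over sorted pairwise distances. This is a self-contained illustration of the idea, not code from the paper.

      import numpy as np
      from scipy.spatial.distance import pdist, squareform

      def zero_dim_persistence(points):
          """0-dimensional persistent homology of a Euclidean point cloud.
          Growing balls around the points merge connected components; each merge
          at radius r kills a component born at radius 0, giving a (0, r) bar."""
          n = len(points)
          dist = squareform(pdist(points))
          edges = sorted((dist[i, j], i, j) for i in range(n) for j in range(i + 1, n))
          parent = list(range(n))

          def find(i):  # union-find with path compression
              while parent[i] != i:
                  parent[i] = parent[parent[i]]
                  i = parent[i]
              return i

          bars = []
          for r, i, j in edges:
              ri, rj = find(i), find(j)
              if ri != rj:
                  parent[ri] = rj
                  bars.append((0.0, r))
          bars.append((0.0, float("inf")))  # one component survives forever
          return bars

      bars = zero_dim_persistence(np.random.rand(30, 2))

    Long-lived bars (large death radii) indicate well-separated groups in the data, e.g., distinct clusters of bacterial spectra.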

  13. An analytical tool that quantifies cellular morphology changes from three-dimensional fluorescence images.

    PubMed

    Haass-Koffler, Carolina L; Naeemuddin, Mohammad; Bartlett, Selena E

    2012-01-01

    The most common software analysis tools available for measuring fluorescence images are for two-dimensional (2D) data that rely on manual settings for inclusion and exclusion of data points, and on computer-aided pattern recognition to support the interpretation and findings of the analysis. It has become increasingly important to be able to measure fluorescence images constructed from three-dimensional (3D) datasets in order to capture the complexity of cellular dynamics and understand the basis of cellular plasticity within biological systems. Sophisticated microscopy instruments have permitted the visualization of 3D fluorescence images through the acquisition of multispectral fluorescence images and powerful analytical software that reconstructs the images from confocal stacks, providing a 3D representation of the collected 2D images. Advanced design-based stereology methods have progressed beyond the approximations and assumptions of the original model-based stereology, even in complex tissue sections. Despite these scientific advances in microscopy, a need remains for an automated analytic method that fully exploits the intrinsic 3D data to allow for the analysis and quantification of the complex changes in cell morphology, protein localization and receptor trafficking. Current techniques available to quantify fluorescence images include MetaMorph (Molecular Devices, Sunnyvale, CA) and ImageJ (NIH), which provide manual analysis. Imaris (Andor Technology, Belfast, Northern Ireland) software provides the feature MeasurementPro, which allows the manual creation of measurement points that can be placed in a volume image or drawn on a series of 2D slices to create a 3D object. This method is useful for single-click point measurements to measure a line distance between two objects or to create a polygon that encloses a region of interest, but it is difficult to apply to complex cellular network structures. Filament Tracer (Andor) allows automatic

  14. Basic Conceptual Systems (BCSs)--Tools for Analytic Coding, Thinking and Learning: A Concept Teaching Curriculum in Norway

    ERIC Educational Resources Information Center

    Hansen, Andreas

    2009-01-01

    The role of basic conceptual systems (for example, colour, shape, size, position, direction, number, pattern, etc.) as psychological tools for analytic coding, thinking and learning is emphasised, and a proposal for a teaching order of BCSs in kindergarten and primary school is introduced. The first part of this article briefly explains the main aspects…

  15. Big Data Geo-Analytical Tool Development for Spatial Analysis Uncertainty Visualization and Quantification Needs

    NASA Astrophysics Data System (ADS)

    Rose, K.; Bauer, J. R.; Baker, D. V.

    2015-12-01

    As big data computing capabilities are increasingly paired with spatial analytical tools and approaches, there is a need to ensure that the uncertainty associated with the datasets used in these analyses is adequately incorporated and portrayed in results. Often the products of spatial analyses, big data and otherwise, are developed using discontinuous, sparse, and often point-driven data to represent continuous phenomena. Results from these analyses are generally presented without clear explanations of the uncertainty associated with the interpolated values. The Variable Grid Method (VGM) offers users a flexible approach designed for application to a variety of analyses where there is a need to study, evaluate, and analyze spatial trends and patterns while maintaining connection to, and communicating, the uncertainty in the underlying spatial datasets. The VGM outputs a simultaneous visualization of the spatial data analyses and a quantification of the underlying uncertainties, which can be calculated using data related to sample density, sample variance, interpolation error, or uncertainty derived from multiple simulations. In this presentation we will show how we are utilizing Hadoop to store and perform spatial analysis through the development of custom Spark and MapReduce applications that incorporate ESRI Hadoop libraries. The team will present custom 'Big Data' geospatial applications that run on the Hadoop cluster and integrate ESRI ArcMap with the team's probabilistic VGM approach. The VGM-Hadoop tool has been specially built as a multi-step MapReduce application running on the Hadoop cluster for the purpose of data reduction. This reduction is accomplished by generating multi-resolution, non-overlapping, attributed topology that is then further processed using ESRI's geostatistical analyst to convey a probabilistic model of a chosen study region. Finally, we will share our approach for implementation of data reduction and topology generation
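
    A toy sketch of the data-reduction idea as we read the abstract (recursive refinement of grid cells until each holds few samples, with per-cell variance kept as an uncertainty attribute); the function, thresholds and cell attributes are invented for illustration and are not NETL's implementation:

      import numpy as np

      def variable_grid(points, values, max_pts=16, min_size=1.0):
          """Quadtree-style reduction: split a square cell while it holds more
          than max_pts samples, then store each leaf's mean, variance and count
          so the uncertainty travels with the reduced representation."""
          cells = []

          def recurse(x0, y0, size, idx):
              if len(idx) <= max_pts or size <= min_size:
                  vals = values[idx]
                  cells.append({"x0": x0, "y0": y0, "size": size,
                                "mean": float(vals.mean()) if len(idx) else np.nan,
                                "var": float(vals.var()) if len(idx) else np.nan,
                                "n": len(idx)})
                  return
              half = size / 2.0
              for dx in (0.0, half):
                  for dy in (0.0, half):
                      sub = [i for i in idx
                             if x0 + dx <= points[i, 0] < x0 + dx + half
                             and y0 + dy <= points[i, 1] < y0 + dy + half]
                      recurse(x0 + dx, y0 + dy, half, sub)

          # Assume a square domain for brevity; the epsilon keeps boundary points inside.
          extent = float(points.max() - points.min()) + 1e-9
          recurse(points[:, 0].min(), points[:, 1].min(), extent,
                  list(range(len(points))))
          return cells

    Dense, low-variance regions end up as small leaves; sparse regions stay coarse, which is one simple way to make interpolation uncertainty visible in the output grid.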

  16. Antifungal proteins from moulds: analytical tools and potential application to dry-ripened foods.

    PubMed

    Delgado, Josué; Owens, Rebecca A; Doyle, Sean; Asensio, Miguel A; Núñez, Félix

    2016-08-01

    Moulds growing on the surface of dry-ripened foods contribute to their sensory qualities, but some of them are able to produce mycotoxins that pose a hazard to consumers. Small cysteine-rich antifungal proteins (AFPs) from moulds are highly stable to pH and proteolysis and exhibit a broad inhibition spectrum against filamentous fungi, providing new opportunities to control hazardous moulds in fermented foods. The analytical tools for characterizing the cellular targets and affected pathways are reviewed. Strategies currently employed to study these mechanisms of action include 'omics' approaches that have come to the forefront in recent years, developing in tandem with genome sequencing of relevant organisms. These techniques contribute to a better understanding of the response of moulds to AFPs, allowing the design of complementary strategies to maximize their effect or overcome the limitations of using AFPs on foods. AFPs alter chitin biosynthesis, and some fungi react by inducing the cell wall integrity (CWI) pathway. However, moulds able to increase the chitin content of the cell wall by upregulating proteins in either the CWI or the calmodulin-calcineurin signalling pathway will resist AFPs. Similarly, AFPs increase the intracellular levels of reactive oxygen species (ROS), and moulds that increase the G-protein complex β subunit CpcB and/or enzymes that efficiently produce glutathione may evade apoptosis. Unknown aspects that need to be addressed include the interaction with mycotoxin production by less sensitive toxigenic moulds. However, significant steps have been taken to encourage the use of AFPs in intermediate-moisture foods, particularly for mould-ripened cheese and meat products. PMID:27394712

  17. Soft x-ray microscopy - a powerful analytical tool to image magnetism down to fundamental length and time scales

    SciTech Connect

    Fischer, Peter

    2008-08-01

    The magnetic properties of low-dimensional solid state matter are of the utmost interest both scientifically and technologically. In addition to the charge of the electron, which is the basis of current electronics, taking the spin degree of freedom into account opens a new avenue for future spintronics applications. Progress towards a better physical understanding of the mechanisms and principles involved, as well as potential applications of nanomagnetic devices, can only be achieved with advanced analytical tools. Soft X-ray microscopy, providing a spatial resolution towards 10 nm, a time resolution currently in the sub-ns regime, and inherent elemental sensitivity, is a very promising technique for that. This article reviews the recent achievements of magnetic soft X-ray microscopy by selected examples of spin torque phenomena, stochastic behavior on the nanoscale, and spin dynamics in magnetic nanopatterns. The future potential with regard to addressing fundamental magnetic length and time scales, e.g., imaging femtosecond spin dynamics at upcoming X-ray sources, is pointed out.

  18. Evaluation of Exogenous siRNA Addition as a Metabolic Engineering Tool for Modifying Biopharmaceuticals

    PubMed Central

    Tummala, Seshu; Titus, Michael; Wilson, Lee; Wang, Chunhua; Ciatto, Carlo; Foster, Donald; Szabo, Zoltan; Guttman, Andras; Li, Chen; Bettencourt, Brian; Jayaraman, Muthuswamy; Deroot, Jack; Thill, Greg; Kocisko, David; Pollard, Stuart; Charisse, Klaus; Kuchimanchi, Satya; Hinkle, Greg; Milstein, Stuart; Myers, Rachel; Wu, Shiaw-Lin; Karger, Barry; Rossomando, Anthony

    2012-01-01

    Traditional metabolic engineering approaches, including homologous recombination, zinc finger nucleases, and short hairpin RNA (shRNA), have previously been employed to generate biologics with specific characteristics that improve efficacy, potency, and safety. An alternative approach is to exogenously add soluble small interfering RNA (siRNA) duplexes, formulated with a cationic lipid, directly to cells grown in shake flasks or bioreactors. This approach has the following potential advantages: no cell line development required, the ability to tailor mRNA silencing by adjusting siRNA concentration, simultaneous silencing of multiple target genes, and potential temporal control of downregulation of target gene expression. In this study, we demonstrate proof of concept of the siRNA feeding approach as a metabolic engineering tool in the context of increasing monoclonal antibody afucosylation. First, potent siRNA duplexes targeting fut8 and gmds were dosed into shake flasks with cells that express an anti-CD20 monoclonal antibody. Dose-response studies demonstrated the ability to titrate the silencing effect. Furthermore, siRNA addition resulted in no deleterious effects on cell growth, final protein titer, or specific productivity. In bioreactors, antibodies produced by cells following siRNA treatment exhibited improved functional characteristics compared to antibodies from untreated cells, including increased levels of afucosylation (63%), a 17-fold improvement in FcγRIIIa binding, and an increase in specific cell lysis by up to 30%, as determined in an ADCC assay. In addition, standard purification procedures effectively cleared the exogenously added siRNA and transfection agent. Moreover, no differences were observed when other key product quality structural attributes were compared to untreated controls. These results establish that exogenous addition of siRNA represents a potentially novel metabolic engineering tool to improve biopharmaceutical function and

  19. Non-invasive tools for measuring metabolism and biophysical analyte transport: self-referencing physiological sensing.

    PubMed

    McLamore, Eric S; Porterfield, D Marshall

    2011-11-01

    Biophysical phenomena related to cellular biochemistry and transport are spatially and temporally dynamic, and are directly involved in the regulation of physiology at the sub-cellular to tissue spatial scale. Real-time monitoring of transmembrane transport provides information about the physiology and viability of cells, tissues, and organisms. Combining information learned from real-time transport studies with genomics and proteomics allows us to better understand the functional and mechanistic aspects of cellular and sub-cellular systems. To accomplish this, ultrasensitive sensing technologies are required to probe this functional realm of biological systems with high temporal and spatial resolution. In addition to ongoing research aimed at developing new and enhanced sensors (e.g., increased sensitivity, enhanced analyte selectivity, reduced response time, and novel microfabrication approaches), work over the last few decades has advanced sensor utility through new sensing modalities that extend and enhance the data recorded by sensors. A microsensor technique based on phase-sensitive detection of real-time biophysical transport is reviewed here. The self-referencing technique converts non-invasive extracellular concentration sensors into dynamic flux sensors for measuring transport from the membrane to the tissue scale. In this tutorial review, we discuss the use of self-referencing micro/nanosensors for measuring the physiological activity of living cells and tissues in agricultural, environmental, and biomedical applications, in a manner comprehensible to any scientist or engineer. PMID:21761069
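
    At its core, the self-referencing calculation turns a concentration difference measured between the two poles of the sensor oscillation into a flux via Fick's first law. A minimal sketch (our construction; the function name and units are illustrative):

      def self_referencing_flux(c_near, c_far, dx_cm, d_cm2_per_s):
          """Fick's first law, J = -D * dC/dx, applied to the two oscillation
          poles. Concentrations in mol/cm^3, excursion dx in cm, diffusion
          coefficient D in cm^2/s; returns flux in mol cm^-2 s^-1, with the
          sign giving the direction of net transport along the excursion axis."""
          return -d_cm2_per_s * (c_far - c_near) / dx_cm

    Oscillating the same electrode between the two positions is what cancels slow sensor drift and converts a static concentration reading into a dynamic flux measurement.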

  20. The Broad Application of Data Science and Analytics: Essential Tools for the Liberal Arts Graduate

    ERIC Educational Resources Information Center

    Cárdenas-Navia, Isabel; Fitzgerald, Brian K.

    2015-01-01

    New technologies and data science are transforming a wide range of organizations into analytics-intensive enterprises. Despite the resulting demand for graduates with experience in the application of analytics, though, undergraduate education has been slow to change. The academic and policy communities have engaged in a decade-long conversation…

  1. Repurposing mainstream CNC machine tools for laser-based additive manufacturing

    NASA Astrophysics Data System (ADS)

    Jones, Jason B.

    2016-04-01

    The advent of laser technology has been a key enabler for industrial 3D printing, known as Additive Manufacturing (AM). Despite its commercial success and unique technical capabilities, laser-based AM systems are not yet able to produce parts with the same accuracy and surface finish as CNC machining. To enable the geometry and material freedoms afforded by AM, yet achieve the precision and productivity of CNC machining, hybrid combinations of these two processes have started to gain traction. To achieve the benefits of combined processing, laser technology has been integrated into mainstream CNC machines - effectively repurposing them as hybrid manufacturing platforms. This paper reviews how this engineering challenge has prompted beam delivery innovations to allow automated changeover between laser processing and machining, using standard CNC tool changers. Handling laser-processing heads using the tool changer also enables automated change over between different types of laser processing heads, further expanding the breadth of laser processing flexibility in a hybrid CNC. This paper highlights the development, challenges and future impact of hybrid CNCs on laser processing.

  2. Analytic solution to leading order coupled DGLAP evolution equations: A new perturbative QCD tool

    NASA Astrophysics Data System (ADS)

    Block, Martin M.; Durand, Loyal; Ha, Phuoc; McKay, Douglas W.

    2011-03-01

    We have analytically solved the LO perturbative QCD singlet DGLAP equations [V. N. Gribov and L. N. Lipatov, Sov. J. Nucl. Phys. 15, 438 (1972); G. Altarelli and G. Parisi, Nucl. Phys. B126, 298 (1977); Y. L. Dokshitzer, Sov. Phys. JETP 46, 641 (1977)] using Laplace transform techniques. Newly developed, highly accurate numerical inverse Laplace transform algorithms [M. M. Block, Eur. Phys. J. C 65, 1 (2010); M. M. Block, Eur. Phys. J. C 68, 683 (2010)] allow us to write fully decoupled solutions for the singlet structure function Fs(x,Q2) and the gluon distribution G(x,Q2) as Fs(x,Q2) = 𝓕s(Fs0(x0), G0(x0)) and G(x,Q2) = 𝒢(Fs0(x0), G0(x0)), where the x0 are the Bjorken x values at Q02. Here 𝓕s and 𝒢 are known functions (found using the LO DGLAP splitting functions) of the initial boundary conditions Fs0(x) ≡ Fs(x,Q02) and G0(x) ≡ G(x,Q02), i.e., the chosen starting functions at the virtuality Q02. For both G(x) and Fs(x), we are able to either devolve or evolve each separately and rapidly, with very high numerical accuracy, a computational fractional precision of O(10^-9). Armed with this powerful new tool in the perturbative QCD arsenal, we compare our numerical results from the above equations with the published MSTW2008 and CTEQ6L LO gluon and singlet Fs distributions [A. D. Martin, W. J. Stirling, R. S. Thorne, and G. Watt, Eur. Phys. J. C 63, 189 (2009)], starting from their initial values at Q02 = 1 GeV2 and 1.69 GeV2, respectively, using their choice of αs(Q2). This allows an important independent check on the accuracies of their evolution codes and, therefore, the computational accuracies of their published parton distributions. Our method completely decouples the two LO distributions, at the same time guaranteeing that both G and Fs satisfy the singlet coupled DGLAP equations. It also allows one to easily obtain the effects of
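
    Schematically (our notation, not the authors'): with v ≡ ln(1/x), a Laplace transform in v turns the LO splitting-function convolutions into products, so the coupled integro-differential equations become a linear 2x2 ODE system in ln Q^2,

      \frac{\partial \hat{F}_s(s,Q^2)}{\partial \ln Q^2}
        = \frac{\alpha_s(Q^2)}{4\pi}\left[\Phi_{qq}(s)\,\hat{F}_s + \Phi_{qg}(s)\,\hat{G}\right],
      \qquad
      \frac{\partial \hat{G}(s,Q^2)}{\partial \ln Q^2}
        = \frac{\alpha_s(Q^2)}{4\pi}\left[\Phi_{gq}(s)\,\hat{F}_s + \Phi_{gg}(s)\,\hat{G}\right],

    where the Φ's are Laplace transforms of the LO splitting functions. The linear system is solvable in closed form, and the x-space solutions quoted above are then recovered by numerical inverse Laplace transformation.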

  3. Algorithms for optimized maximum entropy and diagnostic tools for analytic continuation

    NASA Astrophysics Data System (ADS)

    Bergeron, Dominic; Tremblay, A.-M. S.

    2016-08-01

    Analytic continuation of numerical data obtained in imaginary time or frequency has become an essential part of many branches of quantum computational physics. It is, however, an ill-conditioned procedure and thus a hard numerical problem. The maximum-entropy approach, based on Bayesian inference, is the most widely used method to tackle that problem. Although the approach is well established and among the most reliable and efficient ones, useful developments of the method and of its implementation are still possible. In addition, while a few free software implementations are available, a well-documented, optimized, general purpose, and user-friendly software dedicated to that specific task is still lacking. Here we analyze all aspects of the implementation that are critical for accuracy and speed and present a highly optimized approach to maximum entropy. Original algorithmic and conceptual contributions include (1) numerical approximations that yield a computational complexity that is almost independent of temperature and spectrum shape (including sharp Drude peaks in broad background, for example) while ensuring quantitative accuracy of the result whenever precision of the data is sufficient, (2) a robust method of choosing the entropy weight α that follows from a simple consistency condition of the approach and the observation that information- and noise-fitting regimes can be identified clearly from the behavior of χ2 with respect to α, and (3) several diagnostics to assess the reliability of the result. Benchmarks with test spectral functions of different complexity and an example with an actual physical simulation are presented. Our implementation, which covers most typical cases for fermions, bosons, and response functions, is available as an open source, user-friendly software.
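
    For orientation, the standard maximum-entropy setup the abstract refers to (textbook form, not lifted from the paper): the imaginary-time data and the spectrum are linked by a kernel, and one maximizes an entropy term balanced against the misfit,

      G(\tau) = \int d\omega\, \frac{e^{-\tau\omega}}{1+e^{-\beta\omega}}\, A(\omega)
      \quad\text{(fermions)},
      \qquad
      Q[A] = \alpha S[A] - \tfrac{1}{2}\chi^2[A],
      \quad
      S[A] = \int d\omega \left[A(\omega) - m(\omega) - A(\omega)\ln\frac{A(\omega)}{m(\omega)}\right],

    with m(ω) the default model; the entropy weight α is the quantity fixed by the consistency condition described in point (2) above.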

  4. Multi-site study of additive genetic effects on fractional anisotropy of cerebral white matter: comparing meta and mega analytical approaches for data pooling

    PubMed Central

    Kochunov, Peter; Jahanshad, Neda; Sprooten, Emma; Nichols, Thomas E.; Mandl, René C.; Almasy, Laura; Booth, Tom; Brouwer, Rachel M.; Curran, Joanne E.; de Zubicaray, Greig I.; Dimitrova, Rali; Duggirala, Ravi; Fox, Peter T.; Hong, L. Elliot; Landman, Bennett A.; Lemaitre, Hervé; Lopez, Lorna; Martin, Nicholas G.; McMahon, Katie L.; Mitchell, Braxton D.; Olvera, Rene L.; Peterson, Charles P.; Starr, John M.; Sussmann, Jessika E.; Toga, Arthur W.; Wardlaw, Joanna M.; Wright, Margaret J.; Wright, Susan N.; Bastin, Mark E.; McIntosh, Andrew M.; Boomsma, Dorret I.; Kahn, René S.; den Braber, Anouk; de Geus, Eco JC; Deary, Ian J.; Hulshoff Pol, Hilleke E.; Williamson, Douglas E.; Blangero, John; van ’t Ent, Dennis; Thompson, Paul M.; Glahn, David C.

    2014-01-01

    Combining datasets across independent studies can boost statistical power by increasing the number of observations and can achieve more accurate estimates of effect sizes. This is especially important for genetic studies, where a large number of observations is required to obtain sufficient power to detect and replicate genetic effects. There is a need to develop and evaluate methods for joint analysis of rich datasets collected in imaging genetics studies. The ENIGMA-DTI consortium is developing and evaluating approaches for obtaining pooled estimates of heritability through meta- and mega-genetic analytical approaches, to estimate the general additive genetic contributions to the intersubject variance in fractional anisotropy (FA) measured from diffusion tensor imaging (DTI). We used the ENIGMA-DTI data harmonization protocol for uniform processing of DTI data from multiple sites. We evaluated this protocol in five family-based cohorts providing data from a total of 2248 children and adults (ages 9–85) collected with various imaging protocols. We used the imaging genetics analysis tool SOLAR-Eclipse to combine twin and family data from Dutch, Australian and Mexican-American cohorts into one large “mega-family”. We showed that heritability estimates may vary from one cohort to another. We used two meta-analytical (sample-size-weighted and standard-error-weighted) approaches and a mega-genetic analysis to calculate heritability estimates across populations. We performed leave-one-out analysis of the joint estimates of heritability, removing a different cohort each time to understand the variability of the estimates. Overall, meta- and mega-genetic analyses of heritability produced robust estimates of heritability. PMID:24657781
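
    The two meta-analytical estimators named here have the familiar weighted-average forms (generic notation; the consortium's exact implementation details may differ):

      \hat{h}^2_{N} = \frac{\sum_i N_i\,\hat{h}^2_i}{\sum_i N_i},
      \qquad
      \hat{h}^2_{SE} = \frac{\sum_i \hat{h}^2_i / SE_i^2}{\sum_i 1/SE_i^2},

    where ĥ²_i, N_i and SE_i are the heritability estimate, sample size and standard error of cohort i. The mega-genetic analysis instead pools the raw twin and family data into a single variance-component model, as with the "mega-family" described above.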

  5. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data

    PubMed Central

    Ben-Ari Fuchs, Shani; Lieder, Iris; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-01-01

    Abstract Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from “data-to-knowledge-to-innovation,” a crucial missing step in the current era is, however, our limited understanding of biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are the gene set enrichment tools. This study reports on GeneAnalytics™ (geneanalytics.genecards.org), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs the LifeMap Science's GeneCards suite, including the GeneCards®—the human gene database; the MalaCards—the human diseases database; and the PathCards—the biological pathways database. Expression-based analysis in GeneAnalytics relies on the LifeMap Discovery®—the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling an advanced matching algorithm for gene–tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell “cards” in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analyses and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics

  6. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data.

    PubMed

    Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-03-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial missing step in the current era is, however, our limited understanding of biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are the gene set enrichment tools. This study reports on GeneAnalytics™ (geneanalytics.genecards.org), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs the LifeMap Science's GeneCards suite, including the GeneCards®--the human gene database; the MalaCards--the human diseases database; and the PathCards--the biological pathways database. Expression-based analysis in GeneAnalytics relies on the LifeMap Discovery®--the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling an advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analyses and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics

  7. Rapid Quantification of Melamine in Different Brands/Types of Milk Powders Using Standard Addition Net Analyte Signal and Near-Infrared Spectroscopy.

    PubMed

    Tang, Bang-Cheng; Cai, Chen-Bo; Shi, Wei; Xu, Lu

    2016-01-01

    Multivariate calibration (MVC) and near-infrared (NIR) spectroscopy have demonstrated potential for rapid analysis of melamine in various dairy products. However, the practical application of ordinary MVC can be largely restricted because the prediction of a new sample from an uncalibrated batch would be subject to a significant bias due to the matrix effect. In this study, the feasibility of using NIR spectroscopy and the standard addition (SA) net analyte signal (NAS) method (SANAS) for rapid quantification of melamine in different brands/types of milk powders was investigated. In SANAS, the NAS vector of melamine in an unknown sample, as well as in a series of samples spiked with melamine standards, was calculated, and the Euclidean norms of the standard-addition series were then used to build a straightforward univariate regression model. The analysis results of 10 different brands/types of milk powders with melamine levels of 0–0.12% (w/w) indicate that SANAS obtained accurate results, with root mean squared error of prediction (RMSEP) values ranging from 0.0012 to 0.0029. An additional advantage of NAS is the ability to visualize and control possible unwanted variations during standard addition. The proposed method will provide a practically useful tool for rapid and nondestructive quantification of melamine in different brands/types of milk powders. PMID:27525154
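
    A minimal sketch of the SANAS arithmetic as we read the abstract (projection of each spectrum orthogonal to the matrix/background space, then a univariate standard-addition line); the interferent-spectra matrix and all names are illustrative assumptions, not the authors' code:

      import numpy as np

      def sanas_predict(spectra, added, interferents):
          """spectra: (m, p) rows = unknown sample plus its standard additions;
          added: (m,) added melamine concentrations, with added[0] = 0 for the
          unknown; interferents: (k, p) spectra spanning the matrix space."""
          Z = interferents.T                                 # (p, k)
          P = np.eye(Z.shape[0]) - Z @ np.linalg.pinv(Z)     # orthogonal projector
          nas_norm = np.linalg.norm(spectra @ P, axis=1)     # Euclidean norm of each NAS
          slope, intercept = np.polyfit(added, nas_norm, 1)  # univariate SA line
          return intercept / slope                           # classical SA extrapolation

    The final line is the usual standard-addition extrapolation: the signal at zero addition divided by the response per unit concentration gives the unknown concentration, with the matrix effect absorbed into the in-sample calibration.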

  8. Rapid Quantification of Melamine in Different Brands/Types of Milk Powders Using Standard Addition Net Analyte Signal and Near-Infrared Spectroscopy

    PubMed Central

    2016-01-01

    Multivariate calibration (MVC) and near-infrared (NIR) spectroscopy have demonstrated potential for rapid analysis of melamine in various dairy products. However, the practical application of ordinary MVC can be largely restricted because the prediction of a new sample from an uncalibrated batch would be subject to a significant bias due to the matrix effect. In this study, the feasibility of using NIR spectroscopy and the standard addition (SA) net analyte signal (NAS) method (SANAS) for rapid quantification of melamine in different brands/types of milk powders was investigated. In SANAS, the NAS vector of melamine in an unknown sample, as well as in a series of samples spiked with melamine standards, was calculated, and the Euclidean norms of the standard-addition series were then used to build a straightforward univariate regression model. The analysis results of 10 different brands/types of milk powders with melamine levels of 0–0.12% (w/w) indicate that SANAS obtained accurate results, with root mean squared error of prediction (RMSEP) values ranging from 0.0012 to 0.0029. An additional advantage of NAS is the ability to visualize and control possible unwanted variations during standard addition. The proposed method will provide a practically useful tool for rapid and nondestructive quantification of melamine in different brands/types of milk powders. PMID:27525154

  9. Selecting the Right Tool: Comparison of the Analytical Performance of Infrared Attenuated Total Reflection Accessories.

    PubMed

    Schädle, Thomas; Mizaikoff, Boris

    2016-06-01

    The analytical performance of four commercially available infrared attenuated total reflection (IR-ATR) accessories with various ATR waveguide materials has been analyzed and evaluated using acetate, CO2, and CO3(2-) solutions. Calibration functions have been established to determine and compare analytically relevant parameters such as sensitivity, signal-to-noise ratio (SNR), and efficiency. The obtained parameters were further analyzed to support conclusions on the differences in performance of the individual IR-ATR accessories. PMID:27091901

  10. Usefulness of anterior uveitis as an additional tool for diagnosing incomplete Kawasaki disease

    PubMed Central

    Lee, Kyu Jin; Kim, Hyo Jin; Kim, Min Jae; Yoon, Ji Hong; Lee, Eun Jung; Lee, Jae Young; Oh, Jin Hee; Lee, Soon Ju; Lee, Kyung Yil

    2016-01-01

    Purpose There are no specific tests for diagnosing Kawasaki disease (KD). Additional diagnostic criteria are needed to prevent the delayed diagnosis of incomplete Kawasaki disease (IKD). This study compared the frequency of coronary artery lesions (CALs) in IKD patients with and without anterior uveitis (AU) and elucidated whether the finding of AU supported the diagnosis of IKD. Methods This study enrolled patients diagnosed with IKD at The Catholic University of Korea, Uijeongbu St. Mary's Hospital from January 2010 to December 2014. The patients were divided into 2 groups: group 1 included patients with IKD having AU; and group 2 included patients with IKD without AU. We analyzed the demographic and clinical data (age, gender, duration of fever, and the number of diagnostic criteria), laboratory results, and echocardiographic findings. Results Of 111 patients with IKD, 41 had uveitis (36.9%, group 1) and 70 did not (63.1%, group 2). Patients in group 1 had received a diagnosis and treatment earlier, and had fewer CALs (3 of 41, 7.3%) than those in group 2 (20 of 70, 28.6%) (P=0.008). All 3 patients with CALs in group 1 had coronary dilatation, while patients with CALs in group 2 had CALs ranging from coronary dilatation to giant aneurysm. Conclusion The diagnosis of IKD is challenging but can be supported by the presence of features such as AU. Group 1 had a lower risk of coronary artery disease than group 2. Therefore, the presence of AU is helpful in the early diagnosis and treatment of IKD and can be used as an additional diagnostic tool. PMID:27186227

  11. An Analytic Tool to Investigate the Effect of Binder on the Sensitivity of HMX-Based Plastic Bonded Explosives in the Skid Test

    SciTech Connect

    D.W. Hayden

    2005-02-01

    This project will develop an analytical tool to calculate the performance of HMX-based PBXs in the skid test. The skid test is used as a means to measure sensitivity for large charges in handling situations. Each series of skid tests requires dozens of drops of large billets. It is proposed that the reaction (or lack of one) of PBXs in the skid test is governed by the mechanical properties of the binder. If true, one might be able to develop an analytical tool to estimate skid-test behavior for new PBX formulations. Others over the past 50 years have tried to develop similar models. This project will research and summarize the works of others and couple the work of three into an analytical tool that can be run on a PC to calculate the drop height of HMX-based PBXs. Detonation due to dropping a billet is argued to be a dynamic thermal event. To avoid detonation, the heat created by friction at impact must be conducted into the charge or the target faster than the chemical kinetics can create additional energy. The methodology will involve numerically solving the Frank-Kamenetskii equation in one dimension. The analytical problem needs to be bounded in terms of how much heat is introduced to the billet and for how long. Assuming an inelastic collision with no rebound, the billet will be in contact with the target for a short duration determined by the equations of motion. For the purposes of the calculations, it will be assumed that if a detonation is to occur, it will transpire within that time. The surface temperature will be raised according to the friction created, using the equations of motion of dropping the billet on a rigid surface. The study will connect the works of Charles Anderson, Alan Randolph, Larry Hatler, Alfonse Popolato, and Charles Mader into a single PC-based analytic tool. Anderson's equations of motion will be used to calculate the temperature rise upon impact, the time this temperature is maintained (contact time) will be obtained from the work of
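
    The governing balance the abstract describes, in the time-dependent one-dimensional Frank-Kamenetskii form (standard notation; the boundary treatment sketched here is our reading, and the project's exact formulation may differ):

      \rho c_p \frac{\partial T}{\partial t}
        = \kappa \frac{\partial^2 T}{\partial x^2} + \rho Q Z\, e^{-E/RT},

    where ρ, c_p and κ are the density, heat capacity and thermal conductivity of the PBX, and Q, Z and E the heat of reaction, pre-exponential factor and activation energy. The frictional temperature rise enters as a surface boundary condition held for the computed contact time; a runaway of the Arrhenius source term within that time signals detonation, while conduction outpacing the source signals a safe drop.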

  12. The Efficacy of Violence Prediction: A Meta-Analytic Comparison of Nine Risk Assessment Tools

    ERIC Educational Resources Information Center

    Yang, Min; Wong, Stephen C. P.; Coid, Jeremy

    2010-01-01

    Actuarial risk assessment tools are used extensively to predict future violence, but previous studies comparing their predictive accuracies have produced inconsistent findings as a result of various methodological issues. We conducted meta-analyses of the effect sizes of 9 commonly used risk assessment tools and their subscales to compare their…

  13. The Blackbird Whistling or Just After? Vygotsky's Tool and Sign as an Analytic for Writing

    ERIC Educational Resources Information Center

    Imbrenda, Jon-Philip

    2016-01-01

    Based on Vygotsky's theory of the interplay of the tool and sign functions of language, this study presents a textual analysis of a corpus of student-authored texts to illuminate aspects of development evidenced through the dialectical tension of tool and sign. Data were drawn from a series of reflective memos I authored during a seminar for new…

  14. NASTRAN as an analytical research tool for composite mechanics and composite structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Sinclair, J. H.; Sullivan, T. L.

    1976-01-01

    Selected examples are described in which NASTRAN is used as an analysis research tool for composite mechanics and for composite structural components. The examples were selected to illustrate the importance of using NASTRAN as an analysis tool in this rapidly advancing field.

  15. Analytical ultracentrifugation: A versatile tool for the characterisation of macromolecular complexes in solution.

    PubMed

    Patel, Trushar R; Winzor, Donald J; Scott, David J

    2016-02-15

    Analytical ultracentrifugation, an early technique developed for characterizing quantitatively the solution properties of macromolecules, remains a powerful aid to structural biologists in their quest to understand the formation of biologically important protein complexes at the molecular level. Treatment of the basic tenets of the sedimentation velocity and sedimentation equilibrium variants of analytical ultracentrifugation is followed by considerations of the roles that it, in conjunction with other physicochemical procedures, has played in resolving problems encountered in the delineation of complex formation for three biological systems - the cytoplasmic dynein complex, mitogen-activated protein kinase (ERK2) self-interaction, and the terminal catalytic complex in selenocysteine synthesis. PMID:26555086
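
    For context, the textbook relation underlying the sedimentation velocity variant (not taken from the paper itself) is the Svedberg equation,

      s = \frac{M(1 - \bar{v}\rho)}{N_A f} = \frac{M D (1 - \bar{v}\rho)}{RT},

    relating the sedimentation coefficient s to the molar mass M, partial specific volume v̄, solvent density ρ, frictional coefficient f and diffusion coefficient D. Sedimentation equilibrium instead fits the exponential concentration gradient that develops when sedimentation and diffusion balance, giving mass and association constants without hydrodynamic assumptions.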

  16. Benefits and limitations of using decision analytic tools to assess uncertainty and prioritize Landscape Conservation Cooperative information needs

    USGS Publications Warehouse

    Post van der Burg, Max; Cullinane Thomas, Catherine; Holcombe, Tracy R.; Nelson, Richard D.

    2016-01-01

    The Landscape Conservation Cooperatives (LCCs) are a network of partnerships throughout North America that are tasked with integrating science and management to support more effective delivery of conservation at a landscape scale. In order to achieve this integration, some LCCs have adopted the approach of providing their partners with better scientific information in an effort to facilitate more effective and coordinated conservation decisions. Taking this approach has led many LCCs to begin funding research to provide the information for improved decision making. To ensure that funding goes to research projects with the highest likelihood of leading to more integrated broad-scale conservation, some LCCs have also developed approaches for prioritizing which information needs will be of most benefit to their partnerships. We describe two case studies in which decision analytic tools were used to quantitatively assess the relative importance of information for decisions made by partners in the Plains and Prairie Potholes LCC. The results of the case studies point toward a few valuable lessons in terms of using these tools with LCCs. Decision analytic tools tend to help shift focus away from research-oriented discussions and toward discussions about how information is used in making better decisions. However, many technical experts do not have enough knowledge about decision-making contexts to fully inform the latter type of discussion. When assessed in the right decision context, however, decision analyses can point out where uncertainties actually affect optimal decisions and where they do not. This helps technical experts understand that not all research is valuable in improving decision making. But perhaps most importantly, our results suggest that decision analytic tools may be more useful for LCCs as a way of developing integrated objectives for coordinating partner decisions across the landscape, rather than simply ranking research priorities.

  17. The Facial Aesthetic index: An additional tool for assessing treatment need

    PubMed Central

    Sundareswaran, Shobha; Ramakrishnan, Ranjith

    2016-01-01

    Objectives: Facial aesthetics, a major consideration in orthodontic diagnosis and treatment planning, may not be judged correctly and completely by simply analyzing dental occlusion or osseous structures. Despite this importance, there is no index to guarantee availability of treatment or to prioritize patients based on their soft tissue treatment needs. Individuals having well-aligned teeth but unaesthetic convex profiles are not included for treatment under current malocclusion indices. The aim of this investigation is to develop an aesthetic index based on facial profiles which could be used as an additional tool alongside malocclusion indices. Materials and Methods: A chart showing typical facial profile changes due to underlying malocclusions was generated by soft tissue manipulation of standardized profile photographs of a well-balanced male and female face. A panel of 62 orthodontists judged the profile photographs of 100 patients with different soft tissue patterns to assess profile variations and treatment need. The index was later tested in a cross-section of the school population. Statistical analysis was done using the “irr” package of the R environment, version 2.15.1. Results: The index exhibited very good reliability in determining profile variations (Fleiss kappa 0.866, P < 0.001), excellent reproducibility (kappa 0.9078), and high sensitivity and specificity (95.7%). Testing in the population yielded excellent agreement among orthodontists (kappa 0.9286). Conclusions: A new Facial Aesthetic index, based on the patient's soft tissue profile requirements, is proposed, which can complement existing indices to ensure treatment for those in need. PMID:27127752

  18. The use of analytical surface tools in the fundamental study of wear. [atomic nature of wear

    NASA Technical Reports Server (NTRS)

    Buckley, D. H.

    1977-01-01

    Various techniques and surface tools available for the study of the atomic nature of the wear of materials are reviewed. These include chemical etching, x-ray diffraction, electron diffraction, scanning electron microscopy, low-energy electron diffraction, Auger emission spectroscopy analysis, electron spectroscopy for chemical analysis, field ion microscopy, and the atom probe. Properties of the surface and wear surface regions which affect wear, such as surface energy, crystal structure, crystallographic orientation, mode of dislocation behavior, and cohesive binding, are discussed. A number of mechanisms involved in the generation of wear particles are identified with the aid of the aforementioned tools.

  19. The use of analytical surface tools in the fundamental study of wear

    NASA Technical Reports Server (NTRS)

    Buckley, D. H.

    1977-01-01

    This paper reviews the various techniques and surface tools available for the study of the atomic nature of the wear of materials. These include chemical etching, X-ray diffraction, electron diffraction, scanning electron microscopy, low-energy electron diffraction, Auger emission spectroscopy analysis, electron spectroscopy for chemical analysis, field ion microscopy, and the atom probe. Properties of the surface and wear surface regions which affect wear, such as surface energy, crystal structure, crystallographic orientation, mode of dislocation behavior, and cohesive binding, are discussed. A number of mechanisms involved in the generation of wear particles are identified with the aid of the aforementioned tools.

  20. Infrared Spectroscopy as a Versatile Analytical Tool for the Quantitative Determination of Antioxidants in Agricultural Products, Foods and Plants

    PubMed Central

    Cozzolino, Daniel

    2015-01-01

    Spectroscopic methods provide very useful qualitative and quantitative information about the biochemistry and chemistry of antioxidants. Near infrared (NIR) and mid infrared (MIR) spectroscopy are considered powerful, fast, accurate and non-destructive analytical tools that can be regarded as a replacement for traditional chemical analysis. In recent years, several reports can be found in the literature demonstrating the usefulness of these methods in the analysis of antioxidants in different organic matrices. This article reviews recent applications of infrared (NIR and MIR) spectroscopy in the analysis of antioxidant compounds in a wide range of samples such as agricultural products, foods and plants. PMID:26783838

  1. INTRODUCTION TO ATTILA (ANALYTICAL TOOLS INTERFACE FOR LANDSCAPE ASSESSMENTS): AN ARCVIEW EXTENSION

    EPA Science Inventory

    Geographic Information Systems (GIS) have become a powerful tool in the field of landscape ecology. A common application of GIS is the generation of landscape indicators, which are quantitative measurements of the status or potential health of an area (e.g. ecological region, wat...

  2. Analytical Tools To Distinguish the Effects of Localization Error, Confinement, and Medium Elasticity on the Velocity Autocorrelation Function

    PubMed Central

    Weber, Stephanie C.; Thompson, Michael A.; Moerner, W.E.; Spakowitz, Andrew J.; Theriot, Julie A.

    2012-01-01

    Single particle tracking is a powerful technique for investigating the dynamic behavior of biological molecules. However, many of the analytical tools are prone to generate results that can lead to mistaken interpretations of the underlying transport process. Here, we explore the effects of localization error and confinement on the velocity autocorrelation function, Cυ. We show that calculation of Cυ across a range of discretizations can distinguish the effects of localization error, confinement, and medium elasticity. Thus, under certain regimes, Cυ can be used as a diagnostic tool to identify the underlying mechanism of anomalous diffusion. Finally, we apply our analysis to experimental data sets of chromosomal loci and RNA-protein particles in Escherichia coli. PMID:22713559
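
    A minimal sketch of the quantity under study (our construction, not the authors' code): estimate velocities from a 1-D track at a chosen discretization δ, then form the normalized velocity autocorrelation. Scanning δ is what separates localization error from confinement and medium elasticity, as described above.

      import numpy as np

      def velocity_autocorrelation(x, dt, delta, max_lag):
          """x: positions sampled every dt seconds; delta: number of frames over
          which velocities are estimated; returns lag times and C_v(tau)/C_v(0).
          Assumes max_lag < len(x) - delta."""
          v = (x[delta:] - x[:-delta]) / (delta * dt)   # finite-difference velocities
          lags = np.arange(max_lag + 1)
          c = np.array([np.mean(v[k:] * v[:len(v) - k]) for k in lags])
          return lags * dt, c / c[0]

    Pure localization noise produces a sharp negative dip at one frame that shrinks as delta grows, whereas genuine confinement or an elastic medium leaves negative correlations that persist across discretizations.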

  3. A Multi-facetted Visual Analytics Tool for Exploratory Analysis of Human Brain and Function Datasets

    PubMed Central

    Angulo, Diego A.; Schneider, Cyril; Oliver, James H.; Charpak, Nathalie; Hernandez, Jose T.

    2016-01-01

    Brain research typically requires large amounts of data from different sources, often of a different nature. The use of different software tools adapted to the nature of each data source can make research work cumbersome and time consuming. It follows that data is often not used to its fullest potential, thus limiting exploratory analysis. This paper presents an ancillary software tool called BRAVIZ that integrates interactive visualization with real-time statistical analyses, facilitating access to multi-facetted neuroscience data and automating many cumbersome and error-prone tasks required to explore such data. Rather than relying on abstract numerical indicators, BRAVIZ emphasizes brain images as the main object of the analysis process of individuals or groups. BRAVIZ facilitates exploration of trends or relationships to gain an integrated view of the phenomena studied, thus motivating discovery of new hypotheses. A case study is presented that incorporates brain structure and function outcomes together with different types of clinical data. PMID:27601990

  4. A Multi-facetted Visual Analytics Tool for Exploratory Analysis of Human Brain and Function Datasets.

    PubMed

    Angulo, Diego A; Schneider, Cyril; Oliver, James H; Charpak, Nathalie; Hernandez, Jose T

    2016-01-01

    Brain research typically requires large amounts of data from different sources, often of a different nature. The use of different software tools adapted to the nature of each data source can make research work cumbersome and time consuming. It follows that data is often not used to its fullest potential, thus limiting exploratory analysis. This paper presents an ancillary software tool called BRAVIZ that integrates interactive visualization with real-time statistical analyses, facilitating access to multi-facetted neuroscience data and automating many cumbersome and error-prone tasks required to explore such data. Rather than relying on abstract numerical indicators, BRAVIZ emphasizes brain images as the main object of the analysis process of individuals or groups. BRAVIZ facilitates exploration of trends or relationships to gain an integrated view of the phenomena studied, thus motivating discovery of new hypotheses. A case study is presented that incorporates brain structure and function outcomes together with different types of clinical data. PMID:27601990

  5. Electrochemical treatment of olive mill wastewater: treatment extent and effluent phenolic compounds monitoring using some uncommon analytical tools.

    PubMed

    Belaid, Chokri; Khadraoui, Moncef; Mseddii, Salma; Kallel, Monem; Elleuch, Boubaker; Fauvarque, Jean-François

    2013-01-01

    Problems related to industrial effluents can be divided into two parts: (1) their toxicity, associated with their chemical content, which should be removed before discharging the wastewater into the receiving media; and (2) the difficulty of characterizing and monitoring the pollution, caused by the complexity of these matrices. This investigation deals with both aspects: an electrochemical treatment method for olive mill wastewater (OMW) using platinized expanded titanium electrodes in a modified Grignard reactor for toxicity removal, as well as an exploration of the use of some specific analytical tools to monitor the elimination of phenolic compounds from the effluent. The results showed that electrochemical oxidation is able to remove/mitigate the OMW pollution. Indeed, 87% of OMW color was removed and all aromatic compounds disappeared from the solution by anodic oxidation. Moreover, the chemical oxygen demand (COD) and the total organic carbon (TOC) were reduced by 55%. On the other hand, UV-visible spectrophotometry, gas chromatography/mass spectrometry, cyclic voltammetry and 13C nuclear magnetic resonance (NMR) showed that the treatment seems to efficiently eliminate phenolic compounds from OMW. It was concluded that electrochemical oxidation in a modified Grignard reactor is a promising process for the destruction of all phenolic compounds present in OMW. Among the monitoring analytical tools applied, cyclic voltammetry and 13C NMR are among the techniques introduced for the first time to follow the progress of the OMW treatment, and they gave close insight into the disappearance of polyphenols. PMID:23586318

  6. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    SciTech Connect

    Robinson P. Khosah; Frank T. Alex

    2007-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-eighth month of development activities.

  7. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2005-08-01

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its thirty-sixth month of development activities.

  8. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM 2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2006-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-second month of development activities.

  9. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2003-09-01

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase 1, which is currently in progress and will take twelve months to complete, will include the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. In Phase 2, which will be completed in the second year of the project, a platform for on-line data analysis will be developed. Phase 2 will include the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its eleventh month of Phase 1.

  10. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2004-09-01

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its twenty-fourth month of development activities.

  11. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2004-02-01

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its eighteenth month of development activities.

  12. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2005-02-01

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its thirtieth month of development activities.

  13. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2003-03-13

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase 1, which is currently in progress and will take twelve months to complete, will include the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. In Phase 2, which will be completed in the second year of the project, a platform for on-line data analysis will be developed. Phase 2 will include the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its sixth month of Phase 1.

  14. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    SciTech Connect

    Robinson Khosah

    2007-07-31

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project was conducted in two phases. Phase One included the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two involved the development of a platform for on-line data analysis. Phase Two included the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now technically completed.

  15. ELISA and GC-MS as Teaching Tools in the Undergraduate Environmental Analytical Chemistry Laboratory

    NASA Astrophysics Data System (ADS)

    Wilson, Ruth I.; Mathers, Dan T.; Mabury, Scott A.; Jorgensen, Greg M.

    2000-12-01

    An undergraduate experiment for the analysis of potential water pollutants is described. Students are exposed to two complementary techniques, ELISA and GC-MS, for the analysis of a water sample containing atrazine, desethylatrazine, and simazine. Atrazine was chosen as the target analyte because of its wide usage in North America and its utility for students to predict environmental degradation products. The water sample is concentrated using solid-phase extraction for GC-MS, or diluted and analyzed using a competitive ELISA test kit for atrazine. The nature of the water sample is such that students generally find that ELISA gives an artificially high value for the concentration of atrazine. Students gain an appreciation for problems associated with measuring pollutants in the aqueous environment: sensitivity, accuracy, precision, and ease of analysis. This undergraduate laboratory provides an opportunity for students to learn several new analysis and sample preparation techniques and to critically evaluate these methods in terms of when they are most useful.

  16. Towards a minimally invasive sampling tool for high resolution tissue analytical mapping

    NASA Astrophysics Data System (ADS)

    Gottardi, R.

    2015-09-01

    Multiple spatial mapping techniques of biological tissues have been proposed over the years, but all present limitations either in terms of resolution, analytical capacity or invasiveness. Ren et al (2015 Nanotechnology 26 284001) propose in their most recent work the use of a picosecond infrared laser (PIRL) under conditions of ultrafast desorption by impulsive vibrational excitation (DIVE) to extract small amounts of cellular and molecular components, conserving their viability, structure and activity. The PIRL DIVE technique would then work as a nanobiopsy with minimal damage to the surrounding tissues, which could potentially be applied for high resolution local structural characterization of tissues in health and disease with the spatial limit determined by the laser focus.

  17. Innovative analytical tools to characterize prebiotic carbohydrates of functional food interest.

    PubMed

    Corradini, Claudio; Lantano, Claudia; Cavazza, Antonella

    2013-05-01

    Functional foods are one of the most interesting areas of research and innovation in the food industry. A functional food or functional ingredient is considered to be any food or food component that provides health benefits beyond basic nutrition. Recently, consumers have shown interest in natural bioactive compounds as functional ingredients in the diet owing to their various beneficial effects for health. Water-soluble fibers and nondigestible oligosaccharides and polysaccharides can be defined as functional food ingredients. Fructooligosaccharides (FOS) and inulin are resistant to direct metabolism by the host and reach the caecocolon, where they are used by selected groups of beneficial bacteria. Furthermore, they are able to improve physical and structural properties of food, such as hydration, oil-holding capacity, viscosity, texture, sensory characteristics, and shelf-life. This article reviews major innovative analytical developments to screen and identify FOS, inulins, and the most employed nonstarch carbohydrates added or naturally present in functional food formulations. High-performance anion-exchange chromatography with pulsed electrochemical detection (HPAEC-PED) is one of the most employed analytical techniques for the characterization of those molecules. Mass spectrometry is also of great help, in particular matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS), which is able to provide extensive information regarding the molecular weight and length profiles of oligosaccharides and polysaccharides. Moreover, MALDI-TOF-MS in combination with HPAEC-PED has been shown to be of great value for the complementary information it can provide. Some other techniques, such as NMR spectroscopy, are also discussed, with relevant examples of recent applications. A number of articles have appeared in the literature in recent years regarding the analysis of inulin, FOS, and other carbohydrates of interest in the field and

  18. Nano-Scale Secondary Ion Mass Spectrometry - A new analytical tool in biogeochemistry and soil ecology

    SciTech Connect

    Herrmann, A M; Ritz, K; Nunan, N; Clode, P L; Pett-Ridge, J; Kilburn, M R; Murphy, D V; O'Donnell, A G; Stockdale, E A

    2006-10-18

    Soils are structurally heterogeneous across a wide range of spatio-temporal scales. Consequently, external environmental conditions do not have a uniform effect throughout the soil, resulting in a large diversity of micro-habitats. It has been suggested that soil function can be studied without explicit consideration of such fine detail, but recent research has indicated that the micro-scale distribution of organisms may be of importance for a mechanistic understanding of many soil functions. Due to a lack of techniques with adequate sensitivity for data collection at appropriate scales, the question 'How important are various soil processes acting at different scales for ecological function?' is challenging to answer. The nano-scale secondary ion mass spectrometer (NanoSIMS) represents the latest generation of ion microprobes which link high-resolution microscopy with isotopic analysis. The main advantage of NanoSIMS over other secondary ion mass spectrometers is the ability to operate at high mass resolution, whilst maintaining both excellent signal transmission and spatial resolution (~50 nm). NanoSIMS has been used previously in studies focusing on presolar materials from meteorites, in material science, biology, geology and mineralogy. Recently, the potential of NanoSIMS as a new tool in the study of biophysical interfaces in soils has been demonstrated. This paper describes the principles of NanoSIMS and discusses the potential of this tool to contribute to the field of biogeochemistry and soil ecology. Practical considerations (sample size and preparation, simultaneous collection of isotopes, mass resolution, isobaric interference and quantification of the isotopes of interest) are discussed. Adequate sample preparation avoiding biases in the interpretation of NanoSIMS data due to artifacts and identification of regions of interest are of most concern in using NanoSIMS as a new tool in biogeochemistry and soil ecology. Finally, we review the areas of

  19. Mineotaur: a tool for high-content microscopy screen sharing and visual analytics.

    PubMed

    Antal, Bálint; Chessel, Anatole; Carazo Salas, Rafael E

    2015-01-01

    High-throughput/high-content microscopy-based screens are powerful tools for functional genomics, yielding intracellular information down to the level of single cells for thousands of genotypic conditions. However, accessing their data requires specialized knowledge, and most often that data is no longer analyzed after initial publication. We describe Mineotaur (http://www.mineotaur.org), an open-source, downloadable web application that allows easy online sharing and interactive visualisation of large screen datasets, facilitating their dissemination and further analysis, and enhancing their impact. PMID:26679168

  20. Web-based analytical tools for the exploration of spatial data

    NASA Astrophysics Data System (ADS)

    Anselin, Luc; Kim, Yong Wook; Syabri, Ibnu

    This paper deals with the extension of internet-based geographic information systems with functionality for exploratory spatial data analysis (ESDA). The specific focus is on methods to identify and visualize outliers in maps for rates or proportions. Three sets of methods are included: extreme value maps, smoothed rate maps and the Moran scatterplot. The implementation is carried out by means of a collection of Java classes to extend the Geotools open source mapping software toolkit. The web-based spatial analysis tools are illustrated with applications to the study of homicide rates and cancer rates in U.S. counties.
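    The Moran scatterplot mentioned above plots each region's standardized rate against its spatially lagged (neighborhood-averaged) value, and the slope of that relationship equals Moran's I. The paper's implementation is a set of Java classes on top of Geotools; the following is only a minimal NumPy sketch of the underlying computation, using a hypothetical four-region contiguity matrix.

```python
import numpy as np

def moran_scatterplot(values, W):
    """Return standardized values, their spatial lags, and Moran's I.

    values : 1-D array of rates or proportions (one per region)
    W      : binary contiguity matrix; W[i, j] = 1 if regions i and j are neighbors
    """
    z = (values - values.mean()) / values.std()   # standardize the rates
    W_row = W / W.sum(axis=1, keepdims=True)      # row-standardize the weights
    lag = W_row @ z                               # average z-score of each region's neighbors
    moran_i = (z @ lag) / (z @ z)                 # slope of lag on z
    return z, lag, moran_i

# Toy example: four regions in a line (1-2, 2-3, 3-4 adjacencies), hypothetical rates
W = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], dtype=float)
rates = np.array([2.0, 3.0, 8.0, 9.0])
z, lag, I = moran_scatterplot(rates, W)
print(I)  # positive I: similar rates cluster in space
```

    Points in the upper-right and lower-left quadrants of the (z, lag) plot mark positive spatial association; points in the other two quadrants are candidate spatial outliers, which is how such tools flag them on a map.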

  1. Public Interest Energy Research (PIER) Program Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    SciTech Connect

    Xu, Tengfang; Flapper, Joris; Ke, Jing; Kramer, Klaas; Sathaye, Jayant

    2012-02-01

    The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry - including four dairy processes - cheese, fluid milk, butter, and milk powder. The BEST-Dairy tool developed in this project provides three options for the user to benchmark each of the dairy products included in the tool, with each option differentiated based on the specific detail level of process or plant, i.e., 1) plant level; 2) process-group level, and 3) process-step level. For each detail level, the tool accounts for differences in production and other variables affecting energy use in dairy processes. The dairy products include cheese, fluid milk, butter, milk powder, etc. The BEST-Dairy tool can be applied to a wide range of dairy facilities to provide energy and water savings estimates, which are based upon comparisons with the best available reference cases that were established through reviewing information from international and national samples. We have performed and completed alpha- and beta-testing (field testing) of the BEST-Dairy tool, through which feedback from voluntary users in the U.S. dairy industry was gathered to validate and improve the tool's functionality. BEST-Dairy v1.2 was formally published in May 2011, and has been made available for free downloads from the internet (i.e., http://best-dairy.lbl.gov). A user's manual has been developed and published as the companion documentation for use with the BEST-Dairy tool. In addition, we also carried out technology transfer activities by engaging the dairy industry in the process of tool development and testing, including field testing, technical presentations, and technical assistance throughout the project. To date, users from more than ten countries in addition to those in the U.S. have downloaded the BEST-Dairy tool from the LBNL website. It is expected that the use of the BEST-Dairy tool will advance understanding of energy and water
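    The benchmarking logic described above reduces to comparing a plant's measured energy intensity with a best-available reference intensity and converting the gap into a savings estimate. The sketch below shows that arithmetic only; the function and the reference values are hypothetical and are not taken from BEST-Dairy.

```python
# Hypothetical best-practice intensities (MJ of energy per kg of product).
REFERENCE_INTENSITY = {"cheese": 2.9, "fluid milk": 1.1}

def savings_estimate(product, annual_output_kg, annual_energy_mj):
    """Plant-level benchmark: compare measured intensity against a reference case."""
    intensity = annual_energy_mj / annual_output_kg
    best = REFERENCE_INTENSITY[product]
    gap = max(intensity - best, 0.0)   # no savings claimed if already below reference
    return {"intensity_mj_per_kg": intensity,
            "benchmark_mj_per_kg": best,
            "potential_savings_mj": gap * annual_output_kg}

print(savings_estimate("fluid milk", 5_000_000, 7_500_000))
```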

  2. An analytical method on the surface residual stress for the cutting tool orientation

    NASA Astrophysics Data System (ADS)

    Li, Yueen; Zhao, Jun; Wang, Wei

    2009-12-01

    In the experiments reported here, the residual stress was measured for 8 cutting tool orientations when machining H13 die steel by high-speed machining (HSM). The measured data show that the residual stress varies periodically with the rake angle (β) and side rake angle (θ), and further study found that the cutting tool orientation is closely related to the residual stress. Since the machined surface residual stress originates from the cutting force and the axial force, a simple tool-workpiece force model can be constructed, and from it a residual stress model can be deduced that makes it feasible to calculate the magnitude of the residual stress. Because almost all of the measured residual stresses are compressive, the magnitude and direction of the compressive stress can be confirmed from the input data for H13 under HSM. Consequently, the residual stress model is the key to optimizing the rake angle (β) and side rake angle (θ) in theory, and with this theory more of the cutting mechanism can be explained.

  3. An analytical method on the surface residual stress for the cutting tool orientation

    NASA Astrophysics Data System (ADS)

    Li, Yueen; Zhao, Jun; Wang, Wei

    2010-03-01

    In the experiments reported here, the residual stress was measured for 8 cutting tool orientations when machining H13 die steel by high-speed machining (HSM). The measured data show that the residual stress varies periodically with the rake angle (β) and side rake angle (θ), and further study found that the cutting tool orientation is closely related to the residual stress. Since the machined surface residual stress originates from the cutting force and the axial force, a simple tool-workpiece force model can be constructed, and from it a residual stress model can be deduced that makes it feasible to calculate the magnitude of the residual stress. Because almost all of the measured residual stresses are compressive, the magnitude and direction of the compressive stress can be confirmed from the input data for H13 under HSM. Consequently, the residual stress model is the key to optimizing the rake angle (β) and side rake angle (θ) in theory, and with this theory more of the cutting mechanism can be explained.

  4. Study of an ultrasound-based process analytical tool for homogenization of nanoparticulate pharmaceutical vehicles.

    PubMed

    Cavegn, Martin; Douglas, Ryan; Akkermans, Guy; Kuentz, Martin

    2011-08-01

    There are currently no adequate process analyzers for nanoparticulate viscosity enhancers. This article aims to evaluate ultrasonic resonator technology as a monitoring tool for homogenization of nanoparticulate gels. Aqueous dispersions of colloidal microcrystalline cellulose (MCC) and a mixture of clay particles with xanthan gum were compared with colloidal silicon dioxide in oil. The processing was conducted using a laboratory-scale homogenizing vessel. The study investigated first the homogenization kinetics of the different systems to focus then on process factors in the case of colloidal MCC. Moreover, rheological properties were analyzed offline to assess the structure of the resulting gels. Results showed the suitability of ultrasound velocimetry to monitor the homogenization process. The obtained data were fitted using a novel heuristic model. It was possible to identify characteristic homogenization times for each formulation. The subsequent study of the process factors demonstrated that ultrasonic process analysis was as sensitive as offline rheological measurements in detecting subtle manufacturing changes. It can be concluded that the ultrasonic method was able to successfully assess homogenization of nanoparticulate viscosity enhancers. This novel technique can become a vital tool for development and production of pharmaceutical suspensions in the future. PMID:21412782
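    The abstract does not reproduce the heuristic model used to extract characteristic homogenization times. As a rough illustration of that fitting step, the sketch below assumes a first-order relaxation of the measured sound velocity toward a plateau (a hypothetical stand-in for the authors' model) and fits it with SciPy; the data are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

def velocity(t, v_inf, dv, tau):
    """Hypothetical model: velocity relaxes exponentially as the gel homogenizes."""
    return v_inf - dv * np.exp(-t / tau)   # tau ~ characteristic homogenization time

t = np.array([0, 2, 5, 10, 20, 40, 60.0])                    # minutes (synthetic)
v = np.array([1480, 1484, 1489, 1494, 1498, 1500, 1500.2])   # m/s (synthetic)

(v_inf, dv, tau), _ = curve_fit(velocity, t, v, p0=(1500, 20, 10))
print(f"characteristic homogenization time ~ {tau:.1f} min")
```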

  5. An analytical tool-box for comprehensive biochemical, structural and transcriptome evaluation of oral biofilms mediated by mutans streptococci.

    PubMed

    Klein, Marlise I; Xiao, Jin; Heydorn, Arne; Koo, Hyun

    2011-01-01

    Biofilms are highly dynamic, organized and structured communities of microbial cells enmeshed in an extracellular matrix of variable density and composition (1, 2). In general, biofilms develop from initial microbial attachment on a surface followed by formation of cell clusters (or microcolonies) and further development and stabilization of the microcolonies, which occur in a complex extracellular matrix. The majority of biofilm matrices harbor exopolysaccharides (EPS), and dental biofilms are no exception; especially those associated with caries disease, which are mostly mediated by mutans streptococci (3). The EPS are synthesized by microorganisms (S. mutans, a key contributor) by means of extracellular enzymes, such as glucosyltransferases using sucrose primarily as substrate (3). Studies of biofilms formed on tooth surfaces are particularly challenging owing to their constant exposure to environmental challenges associated with complex diet-host-microbial interactions occurring in the oral cavity. Better understanding of the dynamic changes of the structural organization and composition of the matrix, physiology and transcriptome/proteome profile of biofilm-cells in response to these complex interactions would further advance the current knowledge of how oral biofilms modulate pathogenicity. Therefore, we have developed an analytical tool-box to facilitate biofilm analysis at structural, biochemical and molecular levels by combining commonly available and novel techniques with custom-made software for data analysis. Standard analytical (colorimetric assays, RT-qPCR and microarrays) and novel fluorescence techniques (for simultaneous labeling of bacteria and EPS) were integrated with specific software for data analysis to address the complex nature of oral biofilm research. The tool-box is comprised of 4 distinct but interconnected steps (Figure 1): 1) Bioassays, 2) Raw Data Input, 3) Data Processing, and 4) Data Analysis. We used our in vitro biofilm model and

  6. Tools for the Quantitative Analysis of Sedimentation Boundaries Detected by Fluorescence Optical Analytical Ultracentrifugation

    PubMed Central

    Zhao, Huaying; Casillas, Ernesto; Shroff, Hari; Patterson, George H.; Schuck, Peter

    2013-01-01

    Fluorescence optical detection in sedimentation velocity analytical ultracentrifugation allows the study of macromolecules at nanomolar concentrations and below. This has significant promise, for example, for the study of systems of high-affinity protein interactions. Here we describe adaptations of the direct boundary modeling analysis approach implemented in the software SEDFIT that were developed to accommodate unique characteristics of the confocal fluorescence detection system. These include spatial gradients of signal intensity due to scanner movements out of the plane of rotation, temporal intensity drifts due to instability of the laser and fluorophores, and masking of the finite excitation and detection cone by the sample holder. In an extensive series of experiments with enhanced green fluorescent protein ranging from low nanomolar to low micromolar concentrations, we show that the experimental data provide sufficient information to determine the parameters required for first-order approximation of the impact of these effects on the recorded data. Systematic deviations of fluorescence optical sedimentation velocity data analyzed using conventional sedimentation models developed for absorbance and interference optics are largely removed after these adaptations, resulting in excellent fits that highlight the high precision of fluorescence sedimentation velocity data, thus allowing a more detailed quantitative interpretation of the signal boundaries that is otherwise not possible for this system. PMID:24204779

  7. DDBJ launches a new archive database with analytical tools for next-generation sequence data.

    PubMed

    Kaminuma, Eli; Mashima, Jun; Kodama, Yuichi; Gojobori, Takashi; Ogasawara, Osamu; Okubo, Kousaku; Takagi, Toshihisa; Nakamura, Yasukazu

    2010-01-01

    The DNA Data Bank of Japan (DDBJ) (http://www.ddbj.nig.ac.jp) has collected and released 1,701,110 entries/1,116,138,614 bases between July 2008 and June 2009. A few highlighted data releases from DDBJ were the complete genome sequence of an endosymbiont within protist cells in the termite gut and Cap Analysis Gene Expression tags for human and mouse deposited from the Functional Annotation of the Mammalian cDNA consortium. In this period, we started a novel user announcement service using Really Simple Syndication (RSS) to deliver a list of data released from DDBJ on a daily basis. Comprehensive visualization of a DDBJ release data was attempted by using a word cloud program. Moreover, a new archive for sequencing data from next-generation sequencers, the 'DDBJ Read Archive' (DRA), was launched. Concurrently, for read data registered in DRA, a semi-automatic annotation tool called the 'DDBJ Read Annotation Pipeline' was released as a preliminary step. The pipeline consists of two parts: basic analysis for reference genome mapping and de novo assembly and high-level analysis of structural and functional annotations. These new services will aid users' research and provide easier access to DDBJ databases. PMID:19850725

  8. Comprehensive analytical strategy for biomarker identification based on liquid chromatography coupled to mass spectrometry and new candidate confirmation tools.

    PubMed

    Mohamed, Rayane; Varesio, Emmanuel; Ivosev, Gordana; Burton, Lyle; Bonner, Ron; Hopfgartner, Gérard

    2009-09-15

    A comprehensive analytical LC-MS(/MS) platform for low-molecular-weight biomarker molecules in biological fluids is described. Two complementary retention mechanisms were used in HPLC by optimizing the chromatographic conditions for a reversed-phase column and a hydrophilic interaction chromatography column. LC separation was coupled to mass spectrometry by using an electrospray ionization operating in positive polarity mode. This strategy enables us to correctly retain and separate hydrophobic as well as polar analytes. For that purpose artificial model study samples were generated with a mixture of 38 well characterized compounds likely to be present in biofluids. The set of compounds was used as a standard aqueous mixture or was spiked into urine at different concentration levels to investigate the capability of the LC-MS(/MS) platform to detect variations across biological samples. Unsupervised data analysis by principal component analysis was performed and followed by principal component variable grouping to find correlated variables. This tool allows us to distinguish three main groups whose variables belong to (a) background ions (found in all types of samples), (b) ions distinguishing urine samples from aqueous standard and blank samples, (c) ions related to the spiked compounds. Interpretation of these groups allows us to identify and eliminate isotopes, adducts, fragments, etc. and to generate a reduced list of m/z candidates. This list is then submitted to the prototype MZSearcher software tool which simultaneously searches several lists of potential metabolites extracted from metabolomics databases (e.g., KEGG, HMDB, etc) to propose biomarker candidates. Structural confirmation of these candidates was done off-line by fraction collection followed by nanoelectrospray infusion to provide high quality MS/MS data for spectral database queries. PMID:19702294
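    A minimal sketch of the unsupervised step described above: principal component analysis of a samples-by-features intensity matrix, followed by a crude correlation-based grouping of variables. The synthetic data and the correlation threshold are illustrative only; the authors' principal component variable grouping procedure is not reproduced here.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(12, 40))   # synthetic intensity matrix: 12 samples x 40 m/z features
X[:6, :5] += 5.0                # features 0-4 co-vary in half of the samples

scores = PCA(n_components=2).fit_transform(X)   # sample scores for a 2-D overview plot

corr = np.corrcoef(X, rowvar=False)             # feature-feature correlations
group = np.flatnonzero(np.abs(corr[0]) > 0.6)   # variables correlated with feature 0
print("score matrix:", scores.shape, "| variables grouped with feature 0:", group)
```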

  9. Photography by Cameras Integrated in Smartphones as a Tool for Analytical Chemistry Represented by an Butyrylcholinesterase Activity Assay

    PubMed Central

    Pohanka, Miroslav

    2015-01-01

    Smartphones are popular devices frequently equipped with sensitive sensors and great computational ability. Despite the widespread availability of smartphones, practical uses in analytical chemistry are limited, though some papers have proposed promising applications. In the present paper, a smartphone is used as a tool for the determination of cholinesterasemia, i.e., the determination of the biochemical marker butyrylcholinesterase (BChE). The work aims to demonstrate the suitability of a smartphone-integrated camera for analytical purposes. Paper strips soaked with indoxylacetate were used for the determination of BChE activity, while the standard Ellman’s assay was used as a reference measurement. In the smartphone-based assay, BChE converted indoxylacetate to indigo blue and coloration was photographed using the phone’s integrated camera. An RGB color model was analyzed and color values for the individual color channels were determined. The assay was verified using plasma samples and samples containing pure BChE, and validated using Ellman’s assay. The smartphone assay proved to be reliable and applicable for routine diagnoses where BChE serves as a marker (liver function tests; some poisonings, etc.). It can be concluded that the assay is expected to be of practical applicability, given the relevance of the results. PMID:26110404
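    The colorimetric core of the assay reduces to averaging the RGB channel values over the reacted zone of the photographed strip and applying a calibration curve. A minimal sketch using Pillow follows; the file name, crop box and calibration constants are hypothetical.

```python
import numpy as np
from PIL import Image

def channel_means(path, box):
    """Mean R, G, B over a crop box (left, upper, right, lower) around the spot."""
    rgb = np.asarray(Image.open(path).convert("RGB").crop(box), dtype=float)
    return rgb.reshape(-1, 3).mean(axis=0)

# Hypothetical usage and calibration (constants would come from standard solutions):
# r, g, b = channel_means("strip.jpg", (100, 100, 200, 200))
# activity = 0.042 * (255 - r) - 1.3   # hypothetical linear calibration on the red channel
```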

  10. Towards a green analytical laboratory: microextraction techniques as a useful tool for the monitoring of polluted soils

    NASA Astrophysics Data System (ADS)

    Lopez-Garcia, Ignacio; Viñas, Pilar; Campillo, Natalia; Hernandez Cordoba, Manuel; Perez Sirvent, Carmen

    2016-04-01

    Microextraction techniques are a valuable tool at the analytical laboratory since they allow sensitive measurements of pollutants to be carried out by means of easily available instrumentation. There is a large number of such procedures, involving miniaturized liquid-liquid or liquid-solid extractions, with the common denominator of using very low amounts (only a few microliters) of organic solvents, or even none at all. Since minimal amounts of reagents are involved, and the generation of residues is consequently minimized, the approach falls within the concept of Green Analytical Chemistry. This general methodology is useful both for inorganic and organic pollutants. Thus, low amounts of metallic ions can be measured without the need for ICP-MS, since this instrument can be replaced by a simple AAS spectrometer, which is commonly present in any laboratory and involves low acquisition and maintenance costs. When dealing with organic pollutants, the microextracts obtained can be introduced into liquid or gas chromatographs equipped with common detectors, and there is no need for the most sophisticated and expensive mass spectrometers. This communication reports an overview of the advantages of such a methodology, and gives examples for the determination of some particular contaminants in soil and water samples. The authors are grateful to the Comunidad Autónoma de la Región de Murcia, Spain (Fundación Séneca, 19888/GERM/15) for financial support.

  11. The management and exploitation of naturally light-emitting bacteria as a flexible analytical tool: A tutorial.

    PubMed

    Bolelli, L; Ferri, E N; Girotti, S

    2016-08-31

    Conventional detection of toxic contaminants on surfaces, in food, and in the environment takes time. Current analytical approaches to chemical detection can be of limited utility due to long detection times, high costs, and the need for a laboratory and trained personnel. A non-specific but easy, rapid, and inexpensive screening test can be useful to quickly classify a specimen as toxic or non-toxic, so that appropriate measures can be taken promptly, exactly where required. The bioluminescent bacteria-based tests meet all these characteristics. Bioluminescence methods are extremely attractive because of their high sensitivity, speed, ease of implementation, and statistical significance. They are usually sensitive enough to detect the majority of pollutants toxic to humans and mammals. This tutorial provides practical guidelines for isolating, cultivating, and exploiting marine bioluminescent bacteria as a simple and versatile analytical tool. Although mostly applied to aqueous-phase samples and organic extracts, the test can also be conducted directly on soil and sediment samples so as to reflect the true toxicity due to the bioavailable fraction. Because tests can be performed with freeze-dried cell preparations, they could make a major contribution to field screening activity. They can be easily conducted in a mobile environmental laboratory and may be adaptable to miniaturized field instruments and field test kits. PMID:27506340

  12. Photography by Cameras Integrated in Smartphones as a Tool for Analytical Chemistry Represented by an Butyrylcholinesterase Activity Assay.

    PubMed

    Pohanka, Miroslav

    2015-01-01

    Smartphones are popular devices frequently equipped with sensitive sensors and great computational ability. Despite the widespread availability of smartphones, practical uses in analytical chemistry are limited, though some papers have proposed promising applications. In the present paper, a smartphone is used as a tool for the determination of cholinesterasemia, i.e., the determination of the biochemical marker butyrylcholinesterase (BChE). The work aims to demonstrate the suitability of a smartphone-integrated camera for analytical purposes. Paper strips soaked with indoxylacetate were used for the determination of BChE activity, while the standard Ellman's assay was used as a reference measurement. In the smartphone-based assay, BChE converted indoxylacetate to indigo blue and coloration was photographed using the phone's integrated camera. An RGB color model was analyzed and color values for the individual color channels were determined. The assay was verified using plasma samples and samples containing pure BChE, and validated using Ellman's assay. The smartphone assay proved to be reliable and applicable for routine diagnoses where BChE serves as a marker (liver function tests; some poisonings, etc.). It can be concluded that the assay is expected to be of practical applicability, given the relevance of the results. PMID:26110404

  13. Radcalc: An Analytical Tool for Shippers of Radioactive Material and Waste

    SciTech Connect

    Kapoor, A.K.; Stuhl, L.A.

    2008-07-01

    The U.S. Department of Energy (DOE) ships radioactive materials in support of its research and development, environmental restoration, and national defense activities. The Radcalc software program assists personnel working on behalf of DOE in packaging and transportation determinations (e.g., isotopic decay, decay heat, regulatory classification, and gas generation) for shipment of radioactive materials and waste. Radcalc performs: (1) U.S. Department of Transportation determinations and classifications (i.e., activity concentration for exempt material, Type A or B, effective A1/A2, limited quantity, low specific activity, highway route controlled quantity, fissile quantity, fissile excepted, reportable quantity, and the list of isotopes required on shipping papers); (2) DOE calculations (i.e., transuranic waste, Pu-239 equivalent curies, and fissile-gram equivalents); (3) determination of the U.S. Nuclear Regulatory Commission packaging category (i.e., Category I, II, or III); (4) dose-equivalent curie calculations; (5) radioactive decay calculations, using a novel decay methodology and a decay data library of 1,867 isotopes typical of the range of materials encountered in DOE laboratory environments; (6) hydrogen and helium gas calculations; and (7) pressure calculations. Radcalc is a validated and cost-effective tool to provide consistency, accuracy, reproducibility, timeliness, quality, compliance, and appropriate documentation to shippers of radioactive materials and waste at DOE facilities nationwide. Hundreds of shippers and engineers throughout the DOE Complex routinely use this software to automate various determinations and to validate compliance with the regulations. The effective use of software by DOE sites contributes toward minimizing risk involved in radioactive waste shipments and assuring the safety of workers and the public. (authors)
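    Of the determinations listed above, isotopic decay is the simplest to illustrate. The sketch below shows only the textbook first-order decay law A(t) = A0 * exp(-ln2 * t / t_half); it is not Radcalc's decay methodology, which draws on its own 1,867-isotope data library.

```python
import math

def decayed_activity(a0_curies, half_life_days, elapsed_days):
    """First-order radioactive decay of a single isotope."""
    return a0_curies * math.exp(-math.log(2) * elapsed_days / half_life_days)

# e.g., 10 Ci of tritium (half-life ~ 12.3 y) after 5 years -> ~7.5 Ci
print(decayed_activity(10.0, 12.3 * 365.25, 5 * 365.25))
```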

  14. Twenty-one years of microemulsion electrokinetic chromatography (1991-2012): a powerful analytical tool.

    PubMed

    Yang, Hua; Ding, Yao; Cao, Jun; Li, Ping

    2013-05-01

    Microemulsion electrokinetic chromatography (MEEKC) is a CE separation technique which utilizes buffered microemulsions as the separation media. In the past two decades, MEEKC has blossomed into a powerful separation technique for the analysis of a wide range of compounds. Pseudostationary-phase composition is critical to successful resolution in EKC, and several variables can be optimized, including surfactant/co-surfactant/oil type and concentration, buffer content, and pH. Additionally, MEEKC coupled with online sample preconcentration approaches can significantly improve detection sensitivity. This review comprehensively describes the development of MEEKC over the period 1991 to 2012. Areas covered include basic theory, microemulsion composition, methods for improving resolution and enhancing sensitivity, detection techniques, and applications of MEEKC. PMID:23463608

  15. Establishment of a reference collection of additives and an analytical handbook of reference data to support enforcement of EU regulations on food contact plastics.

    PubMed

    van Lierop, B; Castle, L; Feigenbaum, A; Ehlert, K; Boenke, A

    1998-10-01

    A collection has been made of additives that are required as analytical standards for enforcement of European Union legislation on food contact plastics. The 100 additives have been characterized by mass spectrometry, infra-red spectroscopy and proton nuclear magnetic resonance spectroscopy to provide reference spectra. Gas chromatographic retention times have been recorded to facilitate identification by retention index. This information has been further supplemented by physico-chemical data. Finally, chromatographic methods have been used to indicate the presence of any impurities in the commercial chemicals. Samples of the reference substances are available on request and the collection of spectra and other information will be made available in printed format and on-line through the Internet. This paper gives an overview of the work done to establish the reference collection and the spectral atlas, which together will assist enforcement laboratories in the characterization of plastics and the selection of analytical methods for additives that may migrate. PMID:10211194
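    Identification by retention index, as used for the reference collection above, conventionally interpolates an analyte's retention between the n-alkanes that bracket it. A minimal sketch of the isothermal Kovats formula, assuming adjusted retention times are already available:

```python
import math

def kovats_ri(t_analyte, t_n, t_n1, n_carbons):
    """Isothermal Kovats retention index from adjusted retention times.

    t_n, t_n1 : adjusted retention times of the bracketing n-alkanes with
                n_carbons and n_carbons + 1 carbon atoms, respectively.
    """
    return 100 * (n_carbons + (math.log(t_analyte) - math.log(t_n))
                  / (math.log(t_n1) - math.log(t_n)))

print(kovats_ri(6.2, 5.0, 8.0, 10))  # analyte eluting between C10 and C11 -> RI ~ 1046
```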

  16. Mixed frequency-/time-domain coherent multidimensional spectroscopy: research tool or potential analytical method?

    PubMed

    Pakoulev, Andrei V; Rickard, Mark A; Kornau, Kathryn M; Mathew, Nathan A; Yurs, Lena A; Block, Stephen B; Wright, John C

    2009-09-15

    Coherent multidimensional spectroscopy (CMDS) is now the optical analogue of nuclear magnetic resonance (NMR). Just as NMR heteronuclear multiple-quantum coherence (HMQC) methods rely on multiple quantum coherences, achieving widespread application requires that CMDS also excites multiple quantum coherences over a wide range of quantum state energies. This Account focuses on frequency-domain CMDS because these methods tune the excitation frequencies to resonance with the desired quantum states and can form multiple quantum coherences between states with very different energies. CMDS methods use multiple excitation pulses to excite multiple quantum states within their dephasing time, so their quantum mechanical phase is maintained. Coherences formed from pairs of the excited states emit coherent beams of light. The temporal ordering of the excitation pulses defines a sequence of coherences that can result in zero, single, double, or higher order coherences as required for multiple quantum coherence CMDS. Defining the temporal ordering and the excitation frequencies and spectrally resolving the output frequency also defines a particular temporal pathway for the coherences, just as an NMR pulse sequence defines an NMR method. Two dimensional contour plots through this multidimensional parameter space allow visualization of the state energies and dynamics. This Account uses nickel and rhodium chelates as models for understanding mixed frequency-/time-domain CMDS. Mixed frequency-/time-domain methods use excitation pulse widths that are comparable to the dephasing times, so multidimensional spectra are obtained by scanning the excitation frequencies, while the coherence and population dynamics are obtained by scanning the time delays. Changing the time delays changes the peaks in the 2D excitation spectra depending upon whether the pulse sequence excites zero, single, or double quantum coherences. In addition, peaks split as a result of the frequency

  17. Use of proteolytic enzymes as an additional tool for trypanosomatid identification.

    PubMed

    Santos, A L S; Abreu, C M; Alviano, C S; Soares, R M A

    2005-01-01

    The expression of proteolytic activities in the Trypanosomatidae family was explored as a potential marker to discriminate between the morphologically indistinguishable flagellates isolated from insects and plants. We have comparatively analysed the proteolytic profiles of 19 monoxenous trypanosomatids (Herpetomonas anglusteri, H. samuelpessoai, H. mariadeanei, H. roitmani, H. muscarum ingenoplastis, H. muscarum muscarum, H. megaseliae, H. dendoderi, Herpetomonas sp., Crithidia oncopelti, C. deanei, C. acanthocephali, C. harmosa, C. fasciculata, C. guilhermei, C. luciliae, Blastocrithidia culicis, Leptomonas samueli and Lept. seymouri) and 4 heteroxenous flagellates (Phytomonas serpens, P. mcgheei, Trypanosoma cruzi and Leishmania amazonensis) by in situ detection of enzyme activities on sodium dodecyl sulphate-polyacrylamide gel electrophoresis (SDS-PAGE) containing co-polymerized gelatine as substrate, in association with specific proteinase inhibitors. All 23 trypanosomatids expressed at least 1 acidic proteolytic enzyme. In addition, a characteristic and specific pattern of cell-associated metallo- and/or cysteine proteinases was observed, except for the similar profiles detected in 2 Herpetomonas (H. anglusteri and H. samuelpessoai) and 3 Crithidia (C. fasciculata, C. guilhermei and C. luciliae) species. However, these flagellates released distinct secretory proteinase profiles into the extracellular medium. These findings strongly suggest that the association of cellular and secretory proteinase pattern could represent a useful marker to help trypanosomatid identification. PMID:15700759

  18. The role of methanol addition to water samples in reducing analyte adsorption and matrix effects in liquid chromatography-tandem mass spectrometry.

    PubMed

    Li, Wei; Liu, Yucan; Duan, Jinming; Saint, Christopher P; Mulcahy, Dennis

    2015-04-10

    Liquid chromatography-tandem mass spectrometry (LC-MS/MS) analysis coupled simply with water filtering before injection has proven to be a simple, economic and time-saving method for analyzing trace-level organic pollutants in aqueous environments. However, the linearity, precision and detection limits of such methods for late-eluting analytes were found to be much poorer than for early-eluting ones due to adsorption of the analytes in the operating system, such as sample vial, flow path and sample loop, creating problems in quantitative analysis. Addition of methanol (MeOH) into water samples as a modifier was shown to be effective in alleviating or even eliminating the negative effect on signal intensity for the late-eluting analytes and at the same time being able to reduce certain matrix effects for real water samples. Based on the maximum detection signal intensity obtained on desorption of the analytes with MeOH addition, the ratio of the detection signal intensity without addition of MeOH to the maximum intensity can be used to evaluate the effectiveness of methanol addition. Accordingly, the values of <50%, 50-80%, 80-120% could be used to indicate strong, medium and no effects, respectively. Based on this concept, an external matrix-matched calibration method with the addition of MeOH has been successfully established for analyzing fifteen pesticides with diverse physico-chemical properties in surface and groundwater with good linearity (r(2): 0.9929-0.9996), precision (intra-day relative standard deviation (RSD): 1.4-10.7%, inter-day RSD: 1.5-9.4%), accuracy (76.9-126.7%) and low limits of detection (0.003-0.028 μg/L). PMID:25748540
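    The effectiveness criterion described above is a simple ratio test against the maximum signal obtained after MeOH addition. A minimal sketch, with hypothetical peak areas:

```python
def adsorption_effect(signal_no_meoh, signal_max_with_meoh):
    """Classify analyte adsorption using the signal-intensity ratio in percent."""
    ratio = 100.0 * signal_no_meoh / signal_max_with_meoh
    if ratio < 50:
        return "strong adsorption effect"
    if ratio <= 80:
        return "medium adsorption effect"
    return "no significant effect"   # the 80-120% band

print(adsorption_effect(4.2e5, 1.1e6))  # hypothetical areas -> "strong adsorption effect"
```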

  19. Feasibility model of a high reliability five-year tape transport. Volume 3: Appendices. [detailed drawing and analytical tools used in analyses

    NASA Technical Reports Server (NTRS)

    Meyers, A. P.; Davidson, W. A.; Gortowski, R. C.

    1973-01-01

    Detailed drawings of the five-year tape transport are presented. Analytical tools used in the various analyses are described. These analyses include: tape guidance, tape stress over crowned rollers, tape pack stress program, response (computer) program, and control system electronics description.

  20. NBC update: The addition of viral and fungal databases to the Naïve Bayes classification tool

    PubMed Central

    2012-01-01

    Background Classifying the fungal and viral content of a sample is an important component of analyzing microbial communities in environmental media. Therefore, a method to classify any fragment from these organisms' DNA should be implemented. Results We update the naïve Bayes classification (NBC) tool to classify reads originating from viral and fungal organisms. NBC classifies a fungal dataset similarly to Basic Local Alignment Search Tool (BLAST) and the Ribosomal Database Project (RDP) classifier. We also show NBC's similarities and differences to RDP on a fungal large subunit (LSU) ribosomal DNA dataset. For viruses in the training database, strain classification accuracy is 98%, while for those reads originating from sequences not in the database, the order-level accuracy is 78%, where order indicates the taxonomic level in the tree of life. Conclusions In addition to being competitive to other classifiers available, NBC has the potential to handle reads originating from any location in the genome. We recommend using the Bacteria/Archaea, Fungal, and Virus databases separately due to algorithmic biases towards long genomes. The tool is publicly available at: http://nbc.ece.drexel.edu. PMID:22293603
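    The idea behind NBC can be illustrated with a toy multinomial naive Bayes classifier over k-mer counts; the tool's own reference databases, k-mer length and scoring details differ and are not reproduced here.

```python
from itertools import product
import numpy as np
from sklearn.naive_bayes import MultinomialNB

K = 2
KMERS = ["".join(p) for p in product("ACGT", repeat=K)]

def kmer_counts(seq):
    """Count occurrences of every K-mer in a read."""
    return np.array([sum(seq[i:i + K] == km for i in range(len(seq) - K + 1))
                     for km in KMERS])

# Two short stand-ins for reference genomes, and a toy read to classify.
train = np.vstack([kmer_counts("ACGTACGTACGT"), kmer_counts("GGGGCCCCGGCC")])
clf = MultinomialNB().fit(train, ["organism_A", "organism_B"])
print(clf.predict([kmer_counts("ACGTACGA")]))   # -> ['organism_A']
```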

  1. Recent Additions in the Modeling Capabilities of an Open-Source Wave Energy Converter Design Tool: Preprint

    SciTech Connect

    Tom, N.; Lawson, M.; Yu, Y. H.

    2015-04-20

    WEC-Sim is a midfidelity numerical tool for modeling wave energy conversion devices. The code uses the MATLAB SimMechanics package to solve multibody dynamics and models wave interactions using hydrodynamic coefficients derived from frequency-domain boundary-element methods. This paper presents the new modeling features introduced in the latest release of WEC-Sim. The first feature discussed is the conversion of the fluid memory kernel to a state-space form. This enhancement offers a substantial computational benefit once hydrodynamic body-to-body coefficients are introduced, as the number of interactions increases exponentially with each additional body. Additional features include the ability to calculate the wave-excitation forces based on the instantaneous incident wave angle, allowing the device to weathervane, as well as the ability to import a user-defined wave elevation time series. A review of the hydrodynamic theory for each feature is provided and the successful implementation is verified using test cases.
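    The state-space feature replaces the radiation-force convolution integral F(t) = \int K(t - tau) v(tau) dtau with a linear system zdot = A z + B v, F = C z, so each interaction costs O(n) rather than O(n^2) over n time steps. The sketch below propagates a hypothetical second-order realization with forward Euler; the A, B, C matrices are illustrative, not coefficients produced by WEC-Sim.

```python
import numpy as np

# Hypothetical 2nd-order state-space fit of a fluid memory (radiation) kernel.
A = np.array([[0.0, 1.0], [-4.0, -1.0]])
B = np.array([0.0, 1.0])
C = np.array([4.0, 0.0])

dt, n = 0.01, 2000
v = np.sin(0.5 * dt * np.arange(n))   # body velocity time series
z = np.zeros(2)                       # internal memory states
F = np.empty(n)
for k in range(n):                    # forward-Euler propagation, one pass over time
    F[k] = C @ z                      # radiation force without evaluating a convolution
    z = z + dt * (A @ z + B * v[k])
print(F[-1])
```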

  2. Data-infilling in daily mean river flow records: first results using a visual analytics tool (gapIT)

    NASA Astrophysics Data System (ADS)

    Giustarini, Laura; Parisot, Olivier; Ghoniem, Mohammad; Trebs, Ivonne; Médoc, Nicolas; Faber, Olivier; Hostache, Renaud; Matgen, Patrick; Otjacques, Benoît

    2015-04-01

    Missing data in river flow records represent a loss of information and a serious drawback in water management. An incomplete time series prevents the computation of hydrological statistics and indicators. Also, records with data gaps are not suitable as input or validation data for hydrological or hydrodynamic modelling. In this work we present a visual analytics tool (gapIT), which supports experts to find the most adequate data-infilling technique for daily mean river flow records. The tool performs an automated calculation of river flow estimates using different data-infilling techniques. Donor station(s) are automatically selected based on Dynamic Time Warping, geographical proximity and upstream/downstream relationships. For each gap the tool computes several flow estimates through various data-infilling techniques, including interpolation, multiple regression, regression trees and neural networks. The visual application also allows the user to select donor station(s) other than those automatically selected. The gapIT software was applied to 24 daily time series of river discharge recorded in Luxembourg over the period 01/01/2007 - 31/12/2013. The method was validated by randomly creating artificial gaps of different lengths and positions along the entire records. Using the RMSE and the Nash-Sutcliffe (NS) coefficient as performance measures, the method is evaluated based on a comparison with the actual measured discharge values. The application of the gapIT software to artificial gaps led to satisfactory results in terms of performance indicators (NS>0.8 for more than half of the artificial gaps). A case-by-case analysis revealed that the limited number of reconstructed record gaps characterized by high RMSE values (NS<0.8) was caused by the temporary unavailability of the most appropriate donor station. On the other hand, some of the gaps characterized by a high accuracy of the reconstructed record were filled by using the data from
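    One of the infilling techniques named above, regression on a donor station, can be sketched in a few lines: fit the target-donor relationship on their overlap, then predict the target only where it is missing. Donor selection by Dynamic Time Warping is outside the scope of this sketch, and the series are synthetic.

```python
import numpy as np

def fill_from_donor(target, donor):
    """Fill NaN gaps in `target` by linear regression on a donor station."""
    ok = ~np.isnan(target) & ~np.isnan(donor)
    slope, intercept = np.polyfit(donor[ok], target[ok], 1)   # fit on the overlap
    filled = target.copy()
    gaps = np.isnan(target) & ~np.isnan(donor)
    filled[gaps] = slope * donor[gaps] + intercept
    return filled

target = np.array([1.0, 1.2, np.nan, 1.8, np.nan, 2.4])   # daily mean flow with gaps
donor  = np.array([0.9, 1.1, 1.4, 1.7, 2.0, 2.3])         # nearby donor station
print(fill_from_donor(target, donor))
```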

  3. Additional disturbances as a beneficial tool for restoration of post-mining sites: a multi-taxa approach.

    PubMed

    Řehounková, Klára; Čížek, Lukáš; Řehounek, Jiří; Šebelíková, Lenka; Tropek, Robert; Lencová, Kamila; Bogusch, Petr; Marhoul, Pavel; Máca, Jan

    2016-07-01

    Open interior sands represent a highly threatened habitat in Europe. In recent times, their associated organisms have often found secondary refuges outside their natural habitats, mainly in sand pits. We investigated the effects of different restoration approaches, i.e. spontaneous succession without additional disturbances, spontaneous succession with additional disturbances caused by recreational activities, and forestry reclamation, on the diversity and conservation values of spiders, beetles, flies, bees and wasps, orthopterans and vascular plants in a large sand pit in the Czech Republic, Central Europe. Out of 406 species recorded in total, 112 were classified as open sand specialists and 71 as threatened. The sites restored through spontaneous succession with additional disturbances hosted the largest proportion of open sand specialists and threatened species. The forestry reclamations, in contrast, hosted few such species. The sites with spontaneous succession without disturbances represent a transition between these two approaches. While restoration through spontaneous succession favours biodiversity in contrast to forestry reclamation, additional disturbances are necessary to maintain early successional habitats essential for threatened species and open sand specialists. Therefore, recreational activities seem to be an economically efficient restoration tool that will also benefit biodiversity in sand pits. PMID:27053054

  4. Polar Bears or People?: How Framing Can Provide a Useful Analytic Tool to Understand & Improve Climate Change Communication in Classrooms

    NASA Astrophysics Data System (ADS)

    Busch, K. C.

    2014-12-01

    Not only will young adults bear the brunt of climate change's effects, they are also the ones who will be required to take action - to mitigate and to adapt. The Next Generation Science Standards include climate change, ensuring the topic will be covered in U.S. science classrooms in the near future. Additionally, school is a primary source of information about climate change for young adults. The larger question, though, is how can the teaching of climate change be done in such a way as to ascribe agency - a willingness to act - to students? Framing - as both a theory and an analytic method - has been used to understand how language in the media can affect the audience's intention to act. Frames function as a two-way filter, affecting both the message sent and the message received. This study adapted both the theory and the analytic methods of framing, applying them to teachers in the classroom to answer the research question: How do teachers frame climate change in the classroom? To answer this question, twenty-five lessons from seven teachers were analyzed using semiotic discourse analysis methods. It was found that the teachers' frames overlapped to form two distinct discourses: a Science Discourse and a Social Discourse. The Science Discourse, which was dominant, can be summarized as: Climate change is a current scientific problem that will have profound global effects on the Earth's physical systems. The Social Discourse, used much less often, can be summarized as: Climate change is a future social issue because it will have negative impacts at the local level on people. While it may not be surprising that the Science Discourse was most often heard in these science classrooms, it is possibly problematic if it were the only discourse used. The research literature on framing indicates that the frames found in the Science Discourse - global scale, scientific statistics and facts, and impact on the Earth's systems - are not likely to inspire action-taking. This

  5. Structural and compositional changes of dissolved organic matter upon solid-phase extraction tracked by multiple analytical tools.

    PubMed

    Chen, Meilian; Kim, Sunghwan; Park, Jae-Eun; Jung, Heon-Jae; Hur, Jin

    2016-09-01

    Although PPL-based solid-phase extraction (SPE) has been widely used before dissolved organic matter (DOM) analyses via advanced measurements such as ultrahigh resolution Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR-MS), much is still unknown about the structural and compositional changes in the DOM pool caused by SPE. In this study, selected DOM from various sources were tested to elucidate the differences before and after SPE utilizing multiple analytical tools including fluorescence spectroscopy, FT-ICR-MS, and size exclusion chromatography with organic carbon detector (SEC-OCD). The changes in specific UV absorbance indicated a decrease in aromaticity after SPE, suggesting a preferential exclusion of aromatic DOM structures, which was also confirmed by the substantial reduction of fluorescent DOM (FDOM). Furthermore, SEC-OCD results exhibited very low recoveries (1-9 %) for the biopolymer fraction, implying that PPL sorbent materials need to be used cautiously in SPE when treating high molecular weight compounds (i.e., polysaccharides, proteins, and amino sugars). A careful examination via FT-ICR-MS revealed that the formulas lost through SPE might all be DOM source-dependent. Nevertheless, the dominant missing compound groups were identified to be the tannins group with high O/C ratios (>0.7), lignins/carboxyl-rich alicyclic molecules (CRAM), aliphatics with H/C >1.5, and heteroatomic formulas, all of which were dominated by pseudo-analogous molecular formula families differing by methylene (-CH2) units. Our findings shed new light on potential changes in the compound composition and the molecular weight of DOM upon SPE, implying that precautions are needed in data interpretation. Graphical Abstract Tracking the characteristics of DOM from various origins upon PPL-based SPE utilizing EEM-PARAFAC, SEC-OCD, and FT-ICR-MS. PMID:27387996

  6. An analytical approach to the problem of inverse optimization with additive objective functions: an application to human prehension

    PubMed Central

    Pesin, Yakov B.; Niu, Xun; Latash, Mark L.

    2010-01-01

    We consider the problem of what is being optimized in human actions with respect to various aspects of human movements and different motor tasks. From the mathematical point of view this problem consists of finding an unknown objective function given the values at which it reaches its minimum. This problem is called the inverse optimization problem. Until now the main approach to this problem has been the cut-and-try method, which consists of introducing an objective function and checking how well it reflects the experimental data. Using this approach, different objective functions have been proposed for the same motor action. In the current paper we focus on inverse optimization problems with additive objective functions and linear constraints. Such problems are typical in human movement science. The problem of muscle (or finger) force sharing is an example. For such problems we obtain sufficient conditions for uniqueness and propose a method for determining the objective functions. To illustrate our method we analyze the problem of force sharing among the fingers in a grasping task. We estimate the objective function from the experimental data and show that it can predict the force-sharing pattern for a vast range of external forces and torques applied to the grasped object. The resulting objective function is quadratic with essentially non-zero linear terms. PMID:19902213
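
    For intuition, here is the forward side of such a problem: once an additive quadratic objective with linear terms has been identified, the predicted force sharing follows from solving the KKT system of the constrained minimization. All coefficients below (weights, moment arms, task values) are invented illustrations, not the paper's estimates.

    ```python
    # Sketch: minimize sum_i (w_i f_i^2 + v_i f_i) subject to sum(f) = F and d.f = T,
    # solved via the linear KKT system of the Lagrangian.
    import numpy as np

    w = np.array([1.0, 1.2, 1.5, 2.0])    # quadratic weights per finger (illustrative)
    v = np.array([0.3, 0.1, -0.2, 0.4])   # essentially non-zero linear terms
    d = np.array([4.5, 1.5, -1.5, -4.5])  # moment arms, cm (illustrative)
    F, T = 20.0, 5.0                      # task: total force and torque

    n = len(w)
    K = np.zeros((n + 2, n + 2))
    K[:n, :n] = np.diag(2 * w)            # stationarity: 2*w*f + lam1 + lam2*d = -v
    K[:n, n] = 1.0; K[n, :n] = 1.0        # constraint sum(f) = F
    K[:n, n + 1] = d; K[n + 1, :n] = d    # constraint d.f = T
    rhs = np.concatenate([-v, [F, T]])
    forces = np.linalg.solve(K, rhs)[:n]
    ```

    The inverse problem runs the other way: given observed force vectors across many (F, T) tasks, the coefficients w and v can be estimated, for example by least squares on the same stationarity conditions.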

  7. Life cycle assessment as an analytical tool in strategic environmental assessment. Lessons learned from a case study on municipal energy planning in Sweden

    SciTech Connect

    Bjoerklund, Anna

    2012-01-15

    Life cycle assessment (LCA) is explored as an analytical tool in strategic environmental assessment (SEA), illustrated by a case where a previously developed SEA process was applied to municipal energy planning in Sweden. The process integrated decision-making tools for scenario planning, public participation and environmental assessment. This article describes the use of LCA for environmental assessment in this context, with focus on methodology and practical experiences. While LCA provides a systematic framework for the environmental assessment and a wider systems perspective than what is required in SEA, LCA cannot address all aspects of environmental impact required, and therefore needs to be complemented by other tools. The integration of LCA with tools for public participation and scenario planning posed certain methodological challenges, but provided an innovative approach to designing the scope of the environmental assessment and defining and assessing alternatives. Research highlights: ► LCA was explored as an analytical tool in an SEA process for municipal energy planning. ► The process also integrated LCA with scenario planning and public participation. ► Benefits of using LCA were a systematic framework and a wider systems perspective. ► Integration of the tools required some methodological challenges to be solved. ► This proved an innovative approach to defining alternatives and the scope of assessment.

  8. Cisapride a green analytical reagent for rapid and sensitive determination of bromate in drinking water, bread and flour additives by oxidative coupling spectrophotometric methods

    NASA Astrophysics Data System (ADS)

    Al Okab, Riyad Ahmed

    2013-02-01

    Green analytical methods using Cisapride (CPE) as a green analytical reagent were investigated in this work. Rapid, simple, and sensitive spectrophotometric methods for the determination of bromate in water samples, bread and flour additives were developed. The proposed methods are based on the oxidative coupling between phenoxazine and Cisapride in the presence of bromate to form a red colored product with λmax at 520 nm. Phenoxazine and Cisapride and their reaction products were found to be environmentally friendly under the optimum experimental conditions. The method obeys Beer's law in the concentration range 0.11-4.00 μg mL⁻¹ with a molar absorptivity of 1.41 × 10⁴ L mol⁻¹ cm⁻¹. All variables were optimized and the presented reaction sequences were applied to the analysis of bromate in water, bread and flour additive samples. The performance of these methods was evaluated in terms of Student's t-test and the variance ratio F-test to determine the significance of the proposed methods relative to the reference method. The combination of pharmaceutical drug reagents at low concentrations creates some unique green chemical analyses.
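
    The quantitation step implied here is a Beer's law calibration: absorbance at λmax is linear in concentration over the stated range, and unknowns are read off the fitted line. A minimal sketch with invented numbers (not the paper's data), taking the units from the abstract's μg mL⁻¹ range:

    ```python
    # Sketch of a Beer's law calibration: linear fit of absorbance vs. concentration.
    import numpy as np

    conc = np.array([0.5, 1.0, 2.0, 3.0, 4.0])            # standards, ug/mL (assumed)
    absorbance = np.array([0.07, 0.14, 0.28, 0.41, 0.55])  # A at 520 nm (illustrative)

    slope, intercept = np.polyfit(conc, absorbance, 1)
    unknown_abs = 0.33
    unknown_conc = (unknown_abs - intercept) / slope       # read unknown off the line
    ```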

  9. Cisapride a green analytical reagent for rapid and sensitive determination of bromate in drinking water, bread and flour additives by oxidative coupling spectrophotometric methods.

    PubMed

    Al Okab, Riyad Ahmed

    2013-02-15

    Green analytical methods using Cisapride (CPE) as a green analytical reagent were investigated in this work. Rapid, simple, and sensitive spectrophotometric methods for the determination of bromate in water samples, bread and flour additives were developed. The proposed methods are based on the oxidative coupling between phenoxazine and Cisapride in the presence of bromate to form a red colored product with λmax at 520 nm. Phenoxazine and Cisapride and their reaction products were found to be environmentally friendly under the optimum experimental conditions. The method obeys Beer's law in the concentration range 0.11-4.00 μg mL⁻¹ with a molar absorptivity of 1.41 × 10⁴ L mol⁻¹ cm⁻¹. All variables were optimized and the presented reaction sequences were applied to the analysis of bromate in water, bread and flour additive samples. The performance of these methods was evaluated in terms of Student's t-test and the variance ratio F-test to determine the significance of the proposed methods relative to the reference method. The combination of pharmaceutical drug reagents at low concentrations creates some unique green chemical analyses. PMID:23261631

  10. Examination of the capability of the laser-induced breakdown spectroscopy (LIBS) technique as the emerging laser-based analytical tool for analyzing trace elements in coal

    NASA Astrophysics Data System (ADS)

    Idris, N.; Ramli, M.; Mahidin; Hedwig, R.; Lie, Z. S.; Kurniawan, K. H.

    2014-09-01

    Owing to its advantages over conventional analytical tools, laser-induced breakdown spectroscopy (LIBS) is becoming an important emerging analytical technique. It is based on the use of optical emission from a laser-induced plasma for spectrochemical analysis of the constituents and content of the sampled object. The capability of this technique is examined for the analysis of trace elements in coal. Coal is a difficult sample to analyze due to its complex chemical composition and physical properties. Coal inherently contains trace elements, including heavy metals; thus its mining, beneficiation and utilization pose hazards to the environment and to human beings. The LIBS apparatus used was composed of a laser system (Nd-YAG: Quanta Ray; LAB SERIES; 1,064 nm; 500 mJ; 8 ns) and an optical detector (McPherson model 2061; 1,000 mm focal length; f/8.6 Czerny-Turner) equipped with an Andor I*Star intensified CCD of 1024×256 pixels. The laser beam was focused onto the coal sample with a +250 mm focusing lens. The plasma emission was collected by a fiber optic and sent to the spectrograph. The coal samples were taken from the Province of Aceh. Several trace elements including heavy metals (As, Mn, Pb) were clearly observed, demonstrating the potential of the LIBS technique for analyzing trace elements in coal.

  11. The modified ultrasound pattern sum score mUPSS as additional diagnostic tool for genetically distinct hereditary neuropathies.

    PubMed

    Grimm, Alexander; Rasenack, Maria; Athanasopoulou, Ioanna M; Dammeier, Nele Maria; Lipski, Christina; Wolking, Stefan; Vittore, Debora; Décard, Bernhard F; Axer, Hubertus

    2016-02-01

    The objective of this study is to evaluate the nerve ultrasound characteristics in genetically distinct inherited neuropathies, the value of the modified ultrasound pattern sum score (mUPSS) to differentiate between the subtypes, and the correlation of ultrasound with nerve conduction studies (NCS), disease duration and severity. All patients underwent a standardized neurological examination, ultrasound, and NCS. In addition, genetic testing was performed. Consequently, mUPSS was applied, which is a sum score of cross-sectional areas (CSA) at predefined anatomical points in different nerves. 31 patients were included (10× Charcot-Marie-Tooth (CMT) 1a, 3× CMT1b, 3× CMTX, 9× CMT2, 6× HNPP [hereditary neuropathy with liability to pressure palsies]). Generalized, homogeneous nerve enlargement and significantly increased UPS scores emphasized the diagnosis of demyelinating neuropathy, particularly CMT1a and CMT1b. The amount of enlargement did not depend on disease duration, symptom severity, height and weight. In CMTX the nerves were enlarged as well, however only in the roots and lower limbs, most prominently in men. In CMT2 no significant enlargement was detectable. In HNPP the CSA values were increased at entrapment sites, and not elsewhere. However, a distinction from CMT1, which also showed enlarged CSA values at entrapment sites, was only possible by calculating the entrapment ratios and the entrapment score. The mUPSS allowed distinction between CMT1a (increased UPS scores, entrapment ratios <1.0) and HNPP (low UPS scores, entrapment ratios >1.4), while CMT1b and CMTX showed intermediate UPS types and entrapment ratios <1.0. Although based on few cases, ultrasound revealed consistent and homogeneous nerve alteration in certain inherited neuropathies. The modified UPSS is a quantitative tool, which may provide useful information for diagnosis, differentiation and follow-up evaluation in addition to NCS and molecular testing. PMID:26559821
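
    As a rough illustration of how such a sum score and an entrapment ratio work, here is a toy computation; the sites, cutoffs, and scoring rules below are invented stand-ins, not the published mUPSS definitions.

    ```python
    # Toy sketch of a UPS-style sum score and entrapment ratio from CSA values (mm^2).
    def ups_score(csa_by_site, reference_upper_limits):
        """Score 0/1/2 per site: normal, moderately enlarged, or markedly enlarged."""
        score = 0
        for site, csa in csa_by_site.items():
            limit = reference_upper_limits[site]   # hypothetical normative limit
            if csa > 1.5 * limit:
                score += 2
            elif csa > limit:
                score += 1
        return score

    def entrapment_ratio(csa_entrapment_site, csa_adjacent_segment):
        """Ratio well above 1 flags focal enlargement at a typical entrapment site."""
        return csa_entrapment_site / csa_adjacent_segment
    ```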

  12. Urinary cortisol as an additional tool to assess the welfare of pregnant sows kept in two types of housing.

    PubMed

    Pol, Françoise; Courboulay, Valérie; Cotte, Jean-Pierre; Martrenchar, Arnaud; Hay, Magali; Mormède, Pierre

    2002-01-01

    The use of urinary cortisol (UC) as an additional tool to evaluate sows' welfare was assessed in two experiments. In a preliminary methodological experiment, the kinetics of cortisol excretion in urine were studied during an adrenocorticotropic hormone (ACTH) challenge test in 10 pregnant sows. In a second experiment, 96 primiparous sows of an experimental unit were assigned to two different housing systems: 48 animals were housed in individual pens (IP) and 48 animals in collective pens (CP) with 6 animals per pen. UC was measured at the beginning and at the end of pregnancy and compared with other welfare indicators such as behaviour or skin damage. In both experiments, UC was measured using a high pressure liquid chromatography assay. In experiment 1, UC was constant on the day before the injection of ACTH, with no variations related to circadian rhythm. It began to rise 2 h after the injection, peaked between 2 and 5 h after the injection, then returned to the basal concentration on the following day. In experiment 2, UC concentrations were not different between CP- and IP-housed sows, but they were higher in sows exhibiting the fewest stereotypies than in sows exhibiting the most stereotypies. The results of this study suggest that UC is a good indicator of acute stress, more convenient than plasma cortisol measurement since it is a non-invasive method that avoids restraint or catheterisation of the sows. They also suggest that UC could give additional information for the assessment of chronic stress and improve the evaluation of animal welfare when used in conjunction with other welfare indicators. PMID:11873815

  13. A Serious Videogame as an Additional Therapy Tool for Training Emotional Regulation and Impulsivity Control in Severe Gambling Disorder

    PubMed Central

    Tárrega, Salomé; Castro-Carreras, Laia; Fernández-Aranda, Fernando; Granero, Roser; Giner-Bartolomé, Cristina; Aymamí, Neus; Gómez-Peña, Mónica; Santamaría, Juan J.; Forcano, Laura; Steward, Trevor; Menchón, José M.; Jiménez-Murcia, Susana

    2015-01-01

    Background: Gambling disorder (GD) is characterized by a significant lack of self-control and is associated with impulsivity-related personality traits. It is also linked to deficits in emotional regulation and frequently co-occurs with anxiety and depression symptoms. There is also evidence that emotional dysregulation may play a mediatory role between GD and psychopathological symptomatology. Few studies have reported the outcomes of psychological interventions that specifically address these underlying processes. Objectives: To assess the utility of the Playmancer platform, a serious video game, as an additional therapy tool in a CBT intervention for GD, and to estimate pre-post changes in measures of impulsivity, anger expression and psychopathological symptomatology. Method: The sample comprised a single group of 16 male treatment-seeking individuals with a severe GD diagnosis. The therapy intervention consisted of 16 weekly group CBT sessions and, concurrently, 10 additional weekly sessions of a serious video game. Pre-post treatment scores on South Oaks Gambling Screen (SOGS), Barratt Impulsiveness Scale (BIS-11), I7 Impulsiveness Questionnaire (I7), State-Trait Anger Expression Inventory 2 (STAXI-2), Symptom Checklist-Revised (SCL-90-R), State-Trait Anxiety Inventory (STAI-S-T), and Novelty Seeking from the Temperament and Character Inventory-Revised (TCI-R) were compared. Results: After the intervention, significant changes were observed in several measures of impulsivity, anger expression and other psychopathological symptoms. Dropout and relapse rates during treatment were similar to those described in the literature for CBT. Conclusion: Complementing CBT interventions for GD with a specific therapy approach like a serious video game might be helpful in addressing certain underlying factors which are usually difficult to change, including impulsivity and anger expression. PMID:26617550

  14. Additive technology of soluble mold tooling for embedded devices in composite structures: A study on manufactured tolerances

    NASA Astrophysics Data System (ADS)

    Roy, Madhuparna

    Composite textiles have found widespread use and advantages in various industries and applications. The constant demand for high-quality products and services requires companies to minimize their manufacturing costs and delivery times in order to compete in general and niche marketplaces. Advanced manufacturing methods aim to provide economical means of mold production. Creation of molding and tooling options for advanced composites encompasses a large portion of the fabrication time, making it a costly process and a restraining factor. This research discusses a preliminary investigation into the use of soluble polymer compounds and additive manufacturing to fabricate soluble molds. These molds suffer from dimensional errors due to several factors, which have also been characterized. The basic soluble mold of a composite is 3D printed to meet the desired dimensions and geometry of holistic structures or spliced components. The time taken to dissolve the mold depends on the rate of agitation of the solvent. This process is steered towards enabling the implantation of optoelectronic devices within the composite to provide sensing capability for structural health monitoring. The shape deviation of the 3D printed mold is also studied and compared to its original dimensions to optimize the dimensional quality and produce dimensionally accurate parts. Mechanical tests were performed on compact tension (CT) resin samples prepared from these 3D printed molds and revealed crack propagation towards an embedded intact optical fiber.

  15. Demonstration of the Recent Additions in Modeling Capabilities for the WEC-Sim Wave Energy Converter Design Tool: Preprint

    SciTech Connect

    Tom, N.; Lawson, M.; Yu, Y. H.

    2015-03-01

    WEC-Sim is a mid-fidelity numerical tool for modeling wave energy conversion (WEC) devices. The code uses the MATLAB SimMechanics package to solve the multi-body dynamics and models the wave interactions using hydrodynamic coefficients derived from frequency-domain boundary element methods. In this paper, the new modeling features introduced in the latest release of WEC-Sim are presented. The first feature discussed is the conversion of the fluid memory kernel to a state-space approximation that provides significant gains in computational speed. The benefit of the state-space calculation becomes even greater after the hydrodynamic body-to-body coefficients are introduced, as the number of interactions increases exponentially with the number of floating bodies. The final feature discussed is the capability to add Morison elements to provide additional hydrodynamic damping and inertia. This is generally used as a tuning feature, because performance is highly dependent on the chosen coefficients. In this paper, a review of the hydrodynamic theory for each of the features is provided and successful implementation is verified using test cases.
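
    A Morison element contributes a force of the standard Morison form, an inertia term plus a quadratic drag term, with the coefficients treated as the tuning parameters the abstract describes. A minimal sketch (all numerical values illustrative):

    ```python
    # Sketch of a Morison element force: F = rho*Cm*V*a + 0.5*rho*Cd*A*u*|u|.
    RHO = 1025.0  # seawater density, kg/m^3

    def morison_force(u_rel, a_rel, Cd, Cm, area, volume, rho=RHO):
        """u_rel, a_rel: fluid velocity/acceleration relative to the body.
        Cd, Cm: drag and inertia coefficients (tuning parameters)."""
        inertia = rho * Cm * volume * a_rel
        drag = 0.5 * rho * Cd * area * u_rel * abs(u_rel)
        return inertia + drag

    # Example: a 1 m^2 / 0.5 m^3 element in a 1.2 m/s, 0.3 m/s^2 relative flow
    f = morison_force(u_rel=1.2, a_rel=0.3, Cd=1.0, Cm=1.6, area=1.0, volume=0.5)
    ```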

  16. Analytical Enantioseparation of β-Substituted-2-Phenylpropionic Acids by High-Performance Liquid Chromatography with Hydroxypropyl-β-Cyclodextrin as Chiral Mobile Phase Additive.

    PubMed

    Tong, Shengqiang; Zhang, Hu; Yan, Jizhong

    2016-04-01

    Analytical enantioseparation of five β-substituted-2-phenylpropionic acids by high-performance liquid chromatography with hydroxypropyl-β-cyclodextrin (HP-β-CD) as a chiral mobile phase additive was established in this paper, and the chromatographic retention mechanism was studied. The effects of various factors, such as the organic modifier, different ODS C18 columns and the concentration of HP-β-CD, were investigated. The chiral mobile phase was composed of methanol or acetonitrile and 0.5% triethylamine acetate buffer at pH 3.0 with 25 mmol L⁻¹ HP-β-CD added, and baseline separation could be reached for all racemates. Regarding the retention mechanism, a negative correlation was found between the concentration of HP-β-CD in the mobile phase and the retention factor at constant pH and column temperature. PMID:26755500

  17. Building Adoption of Visual Analytics Software

    SciTech Connect

    Chinchor, Nancy; Cook, Kristin A.; Scholtz, Jean

    2012-01-05

    Adoption of technology is always difficult. Issues such as having the infrastructure necessary to support the technology, training for users, integrating the technology into current processes and tools, and having the time, managerial support, and necessary funds need to be addressed. In addition to these issues, the adoption of visual analytics tools presents specific challenges that need to be addressed. This paper discusses technology adoption challenges and approaches for visual analytics technologies.

  18. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    [CFR index excerpt: Title 41, Public Contracts and Property Management; Federal Management Regulation, Real Property, Part 102-80, Safety and Environmental Management, Accident and Fire Prevention - What analytical and empirical tools should be used to support the life safety equivalency evaluation? (2011 edition)]

  19. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    [CFR index excerpt: Title 41, Public Contracts and Property Management; Federal Management Regulation, Real Property, Part 102-80, Safety and Environmental Management, Accident and Fire Prevention - What analytical and empirical tools should be used to support the life safety equivalency evaluation? (2010 edition)]

  20. Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    SciTech Connect

    Xu, Tengfang; Flapper, Joris; Ke, Jing; Kramer, Klaas; Sathaye, Jayant

    2012-02-01

    The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry – including four dairy processes – cheese, fluid milk, butter, and milk powder.

  1. ANALYTICAL TOOL INTERFACE FOR LANDSCAPE ASSESSMENTS (ATIILA): AN ARCVIEW EXTENSION FOR THE ANALYSIS OF LANDSCAPE PATTERNS, COMPOSITION, AND STRUCTURE

    EPA Science Inventory

    Environmental management practices are trending away from simple, local- scale assessments toward complex, multiple-stressor regional assessments. Landscape ecology provides the theory behind these assessments while geographic information systems (GIS) supply the tools to impleme...

  2. Analytical Microscopy

    SciTech Connect

    Not Available

    2006-06-01

    In the Analytical Microscopy group, within the National Center for Photovoltaic's Measurements and Characterization Division, we combine two complementary areas of analytical microscopy--electron microscopy and proximal-probe techniques--and use a variety of state-of-the-art imaging and analytical tools. We also design and build custom instrumentation and develop novel techniques that provide unique capabilities for studying materials and devices. In our work, we collaborate with you to solve materials- and device-related R&D problems. This sheet summarizes the uses and features of four major tools: transmission electron microscopy, scanning electron microscopy, the dual-beam focused-ion-beam workstation, and scanning probe microscopy.

  3. High-resolution continuum source electrothermal atomic absorption spectrometry — An analytical and diagnostic tool for trace analysis

    NASA Astrophysics Data System (ADS)

    Welz, Bernhard; Borges, Daniel L. G.; Lepri, Fábio G.; Vale, Maria Goreti R.; Heitmann, Uwe

    2007-09-01

    The literature about applications of high-resolution continuum source atomic absorption spectrometry (HR-CS AAS) with electrothermal atomization is reviewed. The historic development of HR-CS AAS is briefly summarized and the main advantages of this technique, mainly the 'visibility' of the spectral environment around the analytical line at high resolution and the unequaled simultaneous background correction are discussed. Simultaneous multielement CS AAS has been realized only in a very limited number of cases. The direct analysis of solid samples appears to have gained a lot from the special features of HR-CS AAS, and the examples from the literature suggest that calibration can be carried out against aqueous standards. Low-temperature losses of nickel and vanadyl porphyrins could be detected and avoided in the analysis of crude oil due to the superior background correction system. The visibility of the spectral environment around the analytical line revealed that the absorbance signal measured for phosphorus at the 213.6 nm non-resonance line without a modifier is mostly due to the PO molecule, and not to atomic phosphorus. The future possibility to apply high-resolution continuum source molecular absorption for the determination of non-metals is discussed.

  4. Extensions of the Johnson-Neyman Technique to Linear Models With Curvilinear Effects: Derivations and Analytical Tools.

    PubMed

    Miller, Jason W; Stromeyer, William R; Schwieterman, Matthew A

    2013-03-01

    The past decade has witnessed renewed interest in the use of the Johnson-Neyman (J-N) technique for calculating the regions of significance for the simple slope of a focal predictor on an outcome variable across the range of a second, continuous independent variable. Although tools have been developed to apply this technique to probe 2- and 3-way interactions in several types of linear models, this method has not been extended to include quadratic terms or more complicated models involving quadratic terms and interactions. Curvilinear relations of this type are incorporated in several theories in the social sciences. This article extends the J-N method to such linear models along with presenting freely available online tools that implement this technique as well as the traditional pick-a-point approach. Algebraic and graphical representations of the proposed J-N extension are provided. An example is presented to illustrate the use of these tools and the interpretation of findings. Issues of reliability as well as "spurious moderator" effects are discussed along with recommendations for future research. PMID:26741727
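
    In the classic two-way case the J-N boundaries come from solving (b1 + b3·z)² = t²·Var(b1 + b3·z) for z, a quadratic in z; the paper's contribution extends this machinery to quadratic terms. A minimal sketch of the classic case, with illustrative estimates and covariances standing in for real model output:

    ```python
    # Sketch: Johnson-Neyman boundaries for y = b0 + b1*x + b2*z + b3*x*z.
    import numpy as np
    from scipy import stats

    b1, b3 = 0.40, -0.25                  # focal slope and interaction (illustrative)
    var1, var3, cov13 = 0.010, 0.004, -0.002
    t = stats.t.ppf(0.975, df=96)         # critical t, alpha = .05 (df hypothetical)

    # (b1 + b3*z)^2 = t^2 * (var1 + 2*z*cov13 + z^2*var3)  ->  A*z^2 + B*z + C = 0
    A = b3**2 - t**2 * var3
    B = 2 * (b1 * b3 - t**2 * cov13)
    C = b1**2 - t**2 * var1
    boundaries = np.sort(np.roots([A, B, C]).real)
    print("Simple slope of x is significant outside/inside z in", boundaries)
    ```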

  5. Comparative study between univariate spectrophotometry and multivariate calibration as analytical tools for simultaneous quantitation of Moexipril and Hydrochlorothiazide

    NASA Astrophysics Data System (ADS)

    Tawakkol, Shereen M.; Farouk, M.; Elaziz, Omar Abd; Hemdan, A.; Shehata, Mostafa A.

    2014-12-01

    Three simple, accurate, reproducible, and selective methods have been developed and subsequently validated for the simultaneous determination of Moexipril (MOX) and Hydrochlorothiazide (HCTZ) in a pharmaceutical dosage form. The first method is the new extended ratio subtraction method (EXRSM) coupled to the ratio subtraction method (RSM) for determination of both drugs in a commercial dosage form. The second and third methods are multivariate calibration methods, namely Principal Component Regression (PCR) and Partial Least Squares (PLS). A detailed validation of the methods was performed following the ICH guidelines, and the standard curves were found to be linear in the ranges of 10-60 and 2-30 for MOX and HCTZ, respectively, in the EXRSM method, with well-accepted mean correlation coefficients for each analyte. The intra-day and inter-day precision and accuracy results were well within the acceptable limits.

  6. Dynamic 3D visual analytic tools: a method for maintaining situational awareness during high tempo warfare or mass casualty operations

    NASA Astrophysics Data System (ADS)

    Lizotte, Todd E.

    2010-04-01

    Maintaining Situational Awareness (SA) is crucial to the success of high tempo operations, such as war fighting and mass casualty events (bioterrorism, natural disasters). Modern computer and software applications attempt to provide command and control managers with situational awareness via the collection, integration, interrogation and display of vast amounts of analytic data in real-time from a multitude of data sources and formats [1]. At what point do the data volume and displays begin to erode the hierarchical distributive intelligence, command and control structure of the operation taking place? In many cases, people tasked with making decisions have insufficient experience in SA of high tempo operations and become easily overwhelmed as vast amounts of data are displayed in real-time as an operation unfolds. In these situations, where data is plentiful and the relevance of the data changes rapidly, there is a chance that individuals will fixate on the data sources with which they are most familiar. If individuals fall into this pitfall, they will exclude other data that might be just as important to the success of the operation. To counter these issues, it is important that computer and software applications provide a means for prompting their users to take notice of adverse conditions or trends that are critical to the operation. This paper will discuss a new method of displaying data, called a Crisis ViewTM, that monitors critical variables that are dynamically changing and allows preset thresholds to be created to prompt the user when decisions need to be made and when adverse or positive trends are detected. The new method will be explained in basic terms, with examples of its attributes and how it can be implemented.

  7. Near infra red spectroscopy as a multivariate process analytical tool for predicting pharmaceutical co-crystal concentration.

    PubMed

    Wood, Clive; Alwati, Abdolati; Halsey, Sheelagh; Gough, Tim; Brown, Elaine; Kelly, Adrian; Paradkar, Anant

    2016-09-10

    The use of near infra red spectroscopy to predict the concentration of two pharmaceutical co-crystals; 1:1 ibuprofen-nicotinamide (IBU-NIC) and 1:1 carbamazepine-nicotinamide (CBZ-NIC) has been evaluated. A partial least squares (PLS) regression model was developed for both co-crystal pairs using sets of standard samples to create calibration and validation data sets with which to build and validate the models. Parameters such as the root mean square error of calibration (RMSEC), root mean square error of prediction (RMSEP) and correlation coefficient were used to assess the accuracy and linearity of the models. Accurate PLS regression models were created for both co-crystal pairs which can be used to predict the co-crystal concentration in a powder mixture of the co-crystal and the active pharmaceutical ingredient (API). The IBU-NIC model had smaller errors than the CBZ-NIC model, possibly due to the complex CBZ-NIC spectra which could reflect the different arrangement of hydrogen bonding associated with the co-crystal compared to the IBU-NIC co-crystal. These results suggest that NIR spectroscopy can be used as a PAT tool during a variety of pharmaceutical co-crystal manufacturing methods and the presented data will facilitate future offline and in-line NIR studies involving pharmaceutical co-crystals. PMID:27429366
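
    A minimal sketch of the calibrate-and-validate workflow with scikit-learn, assuming spectra in X and known co-crystal concentrations in y; random arrays stand in for real NIR data here, so the fitted model is meaningless except as a template for computing RMSEC and RMSEP.

    ```python
    # Sketch of PLS calibration with the error metrics reported in the abstract.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.metrics import mean_squared_error

    rng = np.random.default_rng(0)
    X_cal, y_cal = rng.normal(size=(40, 200)), rng.uniform(0, 1, 40)  # stand-in spectra
    X_val, y_val = rng.normal(size=(15, 200)), rng.uniform(0, 1, 15)

    pls = PLSRegression(n_components=5).fit(X_cal, y_cal)
    rmsec = mean_squared_error(y_cal, pls.predict(X_cal)) ** 0.5   # calibration error
    rmsep = mean_squared_error(y_val, pls.predict(X_val)) ** 0.5   # prediction error
    ```

    In practice the number of latent variables would be chosen by cross-validation, balancing RMSEC against RMSEP to avoid overfitting the calibration set.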

  8. Spatial and Temporal Oxygen Dynamics in Macrofaunal Burrows in Sediments: A Review of Analytical Tools and Observational Evidence

    PubMed Central

    Satoh, Hisashi; Okabe, Satoshi

    2013-01-01

    The availability of benthic O2 plays a crucial role in benthic microbial communities and regulates many important biogeochemical processes. Burrowing activities of macrobenthos in the sediment significantly affect O2 distribution and its spatial and temporal dynamics in burrows, followed by alterations of sediment microbiology. Consequently, numerous research groups have investigated O2 dynamics in macrofaunal burrows. The introduction of powerful tools, such as microsensors and planar optodes, to sediment analysis has greatly enhanced our ability to measure O2 dynamics in burrows at high spatial and temporal resolution with minimal disturbance of the physical structure of the sediment. In this review, we summarize recent studies of O2-concentration measurements in burrows with O2 microsensors and O2 planar optodes. This manuscript mainly focuses on the fundamentals of O2 microsensors and O2 planar optodes, and their application in the direct measurement of the spatial and temporal dynamics of O2 concentrations in burrows, which have not previously been reviewed, and will be a useful supplement to recent literature reviews on O2 dynamics in macrofaunal burrows. PMID:23594972

  9. An emerging micro-scale immuno-analytical diagnostic tool to see the unseen. Holding promise for precision medicine and P4 medicine.

    PubMed

    Guzman, Norberto A; Guzman, Daniel E

    2016-05-15

    Over the years, analytical chemistry and immunology have contributed significantly to the field of clinical diagnosis by introducing quantitative techniques that can detect crucial and distinct chemical, biochemical and cellular biomarkers present in biosamples. Currently, quantitative two-dimensional hybrid immuno-analytical separation technologies are emerging as powerful tools for the sequential isolation, separation and detection of protein panels, including those with subtle structural changes such as variants, isoforms, peptide fragments, and post-translational modifications. One such technique to perform this challenging task is immunoaffinity capillary electrophoresis (IACE), which combines the use of antibodies and/or other affinity ligands as highly selective capture agents with the superior resolving power of capillary electrophoresis. Since affinity ligands can be polyreactive, i.e., binding and capturing more than one molecule, they may generate false positive results when tested under mono-dimensional procedures; one such application is the enzyme-linked immunosorbent assay (ELISA). IACE, on the other hand, is a two-dimensional technique that captures (isolation and enrichment), releases, separates and detects (quantification, identification and characterization) a single analyte or a panel of analytes from a sample, when coupled to one or more detectors simultaneously, without generating false positive or false negative data. This disruptive technique, capable of on-line preconcentration, results in enhanced sensitivity even in the analysis of complex matrices and may change the traditional system of testing biomarkers, yielding more accurate diagnosis of diseases, ideally before symptoms of a specific disease manifest. In this manuscript, we will present examples of the determination of biomarkers by IACE and the design of a miniaturized multi-dimensional IACE apparatus capable of improved sensitivity, specificity and throughput, with the potential of being used

  10. Neurophysiological analytics for all! Free open-source software tools for documenting, analyzing, visualizing, and sharing using electronic notebooks.

    PubMed

    Rosenberg, David M; Horn, Charles C

    2016-08-01

    Neurophysiology requires an extensive workflow of information analysis routines, which often includes incompatible proprietary software, introducing limitations based on financial costs, transfer of data between platforms, and the ability to share. An ecosystem of free open-source software exists to fill these gaps, including thousands of analysis and plotting packages written in Python and R, which can be implemented in a sharable and reproducible format, such as the Jupyter electronic notebook. This tool chain can largely replace current routines by importing data, producing analyses, and generating publication-quality graphics. An electronic notebook like Jupyter allows these analyses, along with documentation of procedures, to display locally or remotely in an internet browser, which can be saved as an HTML, PDF, or other file format for sharing with team members and the scientific community. The present report illustrates these methods using data from electrophysiological recordings of the musk shrew vagus, a model system to investigate gut-brain communication, for example in cancer chemotherapy-induced emesis. We show methods for spike sorting (including statistical validation), spike train analysis, and analysis of compound action potentials in notebooks. Raw data and code are available from notebooks in data supplements or from an executable online version, which replicates all analyses without installing software, an implementation of reproducible research. This demonstrates the promise of combining disparate analyses into one platform, along with the ease of sharing this work. In an age of diverse, high-throughput computational workflows, this methodology can increase efficiency, transparency, and the collaborative potential of neurophysiological research. PMID:27098025
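
    For a flavor of the analyses such notebooks chain together, here is a minimal threshold-based spike detection step with NumPy; the threshold rule and refractory handling are generic illustrations, not the paper's exact pipeline.

    ```python
    # Sketch of threshold-crossing spike detection on a voltage trace.
    import numpy as np

    def detect_spikes(trace, fs, thresh_sd=4.0, refractory_ms=1.0):
        """Return spike times (s) where the trace crosses a multiple of its SD."""
        thresh = thresh_sd * np.std(trace)
        # indices where the trace rises through the threshold
        crossings = np.flatnonzero((trace[1:] > thresh) & (trace[:-1] <= thresh))
        refractory = int(refractory_ms * fs / 1000)
        spikes, last = [], -refractory
        for idx in crossings:
            if idx - last >= refractory:   # suppress re-triggers within refractory
                spikes.append(idx)
                last = idx
        return np.array(spikes) / fs
    ```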

  11. Health Heritage© a web-based tool for the collection and assessment of family health history: initial user experience and analytic validity.

    PubMed

    Cohn, W F; Ropka, M E; Pelletier, S L; Barrett, J R; Kinzie, M B; Harrison, M B; Liu, Z; Miesfeldt, S; Tucker, A L; Worrall, B B; Gibson, J; Mullins, I M; Elward, K S; Franko, J; Guterbock, T M; Knaus, W A

    2010-01-01

    A detailed family health history is currently the most potentially useful tool for diagnosis and risk assessment in clinical genetics. We developed and evaluated the usability and analytic validity of a patient-driven web-based family health history collection and analysis tool. Health Heritage© guides users through the collection of their family health history by relative, generates a pedigree, and completes risk assessment, stratification, and recommendations for 89 conditions. We compared the performance of Health Heritage to that of Usual Care using a nonrandomized cohort trial of 109 volunteers. We contrasted the completeness and sensitivity of family health history collection and risk assessments derived from Health Heritage and Usual Care to those obtained by genetic counselors and genetic assessment teams. Nearly half (42%) of the Health Heritage participants reported discovery of health risks; 63% found the information easy to understand and 56% indicated it would change their health behavior. Health Heritage consistently outperformed Usual Care in the completeness and accuracy of family health history collection, identifying 60% of the elevated risk conditions specified by the genetic team versus 24% identified by Usual Care. Health Heritage also had greater sensitivity than Usual Care when comparing the identification of risks. These results suggest a strong role for automated family health history collection and risk assessment and underscore the potential of these data to serve as the foundation for comprehensive, cost-effective personalized genomic medicine. PMID:20424421

  12. Effect of Ti Addition on Carbide Modification and the Microscopic Simulation of Impact Toughness in High-Carbon Cr-V Tool Steels

    NASA Astrophysics Data System (ADS)

    Cho, Ki Sub; Kim, Sang Il; Park, Sung Soo; Choi, Won Suk; Moon, Hee Kwon; Kwon, Hoon

    2016-01-01

    In D7 tool steel, which contains high levels of primary carbides, the influence of carbide modification by Ti addition was quantitatively analyzed. Considering the Griffith-Irwin energy criterion for crack growth, the impact energy was evaluated by substituting a microscopic factor of the normalized number density of carbides cracked during hardness indentation tests for the crack length. The impact energy was enhanced with Ti addition because Ti reduced and refined the primary M7C3 carbide phase of elongated morphology, reducing the probability of crack generation.

  13. Challenges for Visual Analytics

    SciTech Connect

    Thomas, James J.; Kielman, Joseph

    2009-09-23

    Visual analytics has seen unprecedented growth in its first five years of mainstream existence. Great progress has been made in a short time, yet great challenges must be met in the next decade to provide new technologies that will be widely accepted by societies throughout the world. This paper sets the stage for some of those challenges in an effort to provide the stimulus for the research, both basic and applied, to address and exceed the envisioned potential for visual analytics technologies. We start with a brief summary of the initial challenges, followed by a discussion of the initial driving domains and applications, as well as additional applications and domains that have been a part of recent rapid expansion of visual analytics usage. We look at the common characteristics of several tools illustrating emerging visual analytics technologies, and conclude with the top ten challenges for the field of study. We encourage feedback and collaborative participation by members of the research community, the wide array of user communities, and private industry.

  14. Generalized additive models used to predict species abundance in the Gulf of Mexico: an ecosystem modeling tool.

    PubMed

    Drexler, Michael; Ainsworth, Cameron H

    2013-01-01

    Spatially explicit ecosystem models of all types require an initial allocation of biomass, often in areas where fisheries independent abundance estimates do not exist. A generalized additive modelling (GAM) approach is used to describe the abundance of 40 species groups (i.e. functional groups) across the Gulf of Mexico (GoM) using a large fisheries independent data set (SEAMAP) and climate scale oceanographic conditions. Predictor variables included in the model are chlorophyll a, sediment type, dissolved oxygen, temperature, and depth. Despite the presence of a large number of zeros in the data, a single GAM using a negative binomial distribution was suitable to make predictions of abundance for multiple functional groups. We present an example case study using pink shrimp (Farfantepenaeus duorarum) and compare the results to known distributions. The model successfully predicts the known areas of high abundance in the GoM, including those areas where no data were input into the model fitting. Overall, the model reliably captures areas of high and low abundance for the large majority of functional groups observed in SEAMAP. The result of this method allows for the objective setting of spatial distributions for numerous functional groups across a modeling domain, even where abundance data may not exist. PMID:23691223
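
    A negative-binomial model with smooth covariate effects can be sketched in Python using statsmodels' GLM with a B-spline basis; this unpenalized spline GLM is a simple stand-in for the penalized GAM used in the paper, and the file and column names below are hypothetical.

    ```python
    # Sketch: negative-binomial abundance model with smooth covariate effects.
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    df = pd.read_csv("seamap_subset.csv")   # hypothetical extract of survey data
    model = smf.glm(
        "count ~ bs(chlorophyll_a, df=4) + bs(depth, df=4) + bs(temperature, df=4)"
        " + bs(dissolved_oxygen, df=4) + C(sediment_type)",
        data=df,
        family=sm.families.NegativeBinomial(),  # handles overdispersed zero-heavy counts
    ).fit()
    print(model.summary())
    ```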

  15. Generalized Additive Models Used to Predict Species Abundance in the Gulf of Mexico: An Ecosystem Modeling Tool

    PubMed Central

    Drexler, Michael; Ainsworth, Cameron H.

    2013-01-01

    Spatially explicit ecosystem models of all types require an initial allocation of biomass, often in areas where fisheries independent abundance estimates do not exist. A generalized additive modelling (GAM) approach is used to describe the abundance of 40 species groups (i.e. functional groups) across the Gulf of Mexico (GoM) using a large fisheries independent data set (SEAMAP) and climate scale oceanographic conditions. Predictor variables included in the model are chlorophyll a, sediment type, dissolved oxygen, temperature, and depth. Despite the presence of a large number of zeros in the data, a single GAM using a negative binomial distribution was suitable to make predictions of abundance for multiple functional groups. We present an example case study using pink shrimp (Farfantepenaeus duorarum) and compare the results to known distributions. The model successfully predicts the known areas of high abundance in the GoM, including those areas where no data were input into the model fitting. Overall, the model reliably captures areas of high and low abundance for the large majority of functional groups observed in SEAMAP. The result of this method allows for the objective setting of spatial distributions for numerous functional groups across a modeling domain, even where abundance data may not exist. PMID:23691223

  16. Determination of Unknown Concentrations of Sodium Acetate Using the Method of Standard Addition and Proton NMR: An Experiment for the Undergraduate Analytical Chemistry Laboratory

    ERIC Educational Resources Information Center

    Rajabzadeh, Massy

    2012-01-01

    In this experiment, students learn how to find the unknown concentration of sodium acetate using both the graphical treatment of standard addition and the standard addition equation. In the graphical treatment of standard addition, the peak area of the methyl peak in each of the sodium acetate standard solutions is found by integration using…
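
    The graphical treatment extrapolates a line fitted to signal versus added standard; the magnitude of the x-intercept is the unknown concentration. A minimal sketch with invented numbers and assumed units:

    ```python
    # Sketch of the standard-addition calculation: fit peak area vs. added
    # concentration, then extrapolate to the x-intercept.
    import numpy as np

    added = np.array([0.0, 5.0, 10.0, 15.0, 20.0])        # added NaOAc, mM (assumed)
    peak_area = np.array([12.1, 18.0, 24.2, 29.9, 36.1])  # integrated methyl peak

    slope, intercept = np.polyfit(added, peak_area, 1)
    unknown = intercept / slope   # |x-intercept| = original unknown concentration
    ```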

  17. Analytical tools in accelerator physics

    SciTech Connect

    Litvinenko, V.N.

    2010-09-01

    This paper is a sub-set of my lectures presented in the Accelerator Physics course (USPAS, Santa Rosa, California, January 14-25, 2008). It is based on notes I wrote during the period from 1976 to 1979 in Novosibirsk. Only a few copies (in Russian) were distributed to my colleagues in the Novosibirsk Institute of Nuclear Physics. The goal of these notes is a complete description, starting from an arbitrary reference orbit, through explicit expressions for the 4-potential and the accelerator Hamiltonian, and finishing with the parameterization in terms of action and angle variables. To a large degree the notes follow the logic developed in Theory of Cyclic Particle Accelerators by A.A. Kolomensky and A.N. Lebedev [Kolomensky], but go beyond the book in a number of directions. One unusual feature is the use of matrix functions and the Sylvester formula for calculating the matrices of arbitrary elements. Teaching the USPAS course motivated me to translate a significant part of my notes into English. I also included some introductory materials following Classical Theory of Fields by L.D. Landau and E.M. Lifshitz [Landau]. A large number of short notes covering various techniques are placed in the Appendices.
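
    The Sylvester formula referred to evaluates a function of a matrix with distinct eigenvalues as f(A) = Σᵢ f(λᵢ) Πⱼ≠ᵢ (A − λⱼI)/(λᵢ − λⱼ). A small sketch, using a toy 2×2 rotation generator as the "element"; this is a generic illustration, not the notes' derivation.

    ```python
    # Sketch of the Sylvester formula for matrix functions (distinct eigenvalues).
    import numpy as np

    def sylvester(f, A):
        lam = np.linalg.eigvals(A)
        n = len(lam)
        result = np.zeros_like(A, dtype=complex)
        for i in range(n):
            P = np.eye(n, dtype=complex)          # Frobenius covariant for lam[i]
            for j in range(n):
                if j != i:
                    P = P @ (A - lam[j] * np.eye(n)) / (lam[i] - lam[j])
            result += f(lam[i]) * P
        return result

    A = np.array([[0.0, 1.0], [-1.0, 0.0]])       # toy element generator
    print(sylvester(np.exp, A).real)              # equals the rotation matrix expm(A)
    ```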

  18. Evaluation of manometric temperature measurement (MTM), a process analytical technology tool in freeze drying, part III: heat and mass transfer measurement.

    PubMed

    Tang, Xiaolin Charlie; Nail, Steven L; Pikal, Michael J

    2006-01-01

    This article evaluates the procedures for determining the vial heat transfer coefficient and the extent of primary drying through manometric temperature measurement (MTM). The vial heat transfer coefficients (Kv) were calculated from the MTM-determined temperature and resistance and compared with Kv values determined by a gravimetric method. The differences between the MTM vial heat transfer coefficients and the gravimetric values are large at low shelf temperature but smaller when higher shelf temperatures were used. The differences also became smaller at higher chamber pressure and when higher-resistance materials were being freeze-dried. In all cases, using thermal shields greatly improved the accuracy of the MTM Kv measurement. With the use of thermal shields, the thickness of the frozen layer calculated from MTM is in good agreement with values obtained gravimetrically. The heat transfer coefficient "error" is largely a direct result of the error in the dry layer resistance (i.e., the MTM-determined resistance is too low). This problem can be minimized if thermal shields are used for freeze-drying. With suitable use of thermal shields, accurate Kv values are obtained by MTM, thus allowing accurate calculations of heat and mass flow rates. The extent of primary drying can be monitored by real-time calculation of the amount of remaining ice using MTM data, thus providing a process analytical tool that greatly improves freeze-drying process design and control. PMID:17285746

  19. Effect of Percent Relative Humidity, Moisture Content, and Compression Force on Light-Induced Fluorescence (LIF) Response as a Process Analytical Tool.

    PubMed

    Shah, Ishan G; Stagner, William C

    2016-08-01

    The effect of percent relative humidity (16-84% RH), moisture content (4.2-6.5% w/w MC), and compression force (4.9-44.1 kN CF) on the light-induced fluorescence (LIF) response of 10% w/w active pharmaceutical ingredient (API) compacts is reported. The fluorescent response was evaluated using two separate central composite designs of experiments. The effect of % RH and CF on the LIF signal was highly significant, with an adjusted R² = 0.9436 and p < 0.0001. Percent relative humidity (p = 0.0022), CF (p < 0.0001), and % RH² (p = 0.0237) were statistically significant factors affecting the LIF response. The effects of MC and CF on LIF response were also statistically significant, with a p value <0.0001 and an adjusted R² value of 0.9874. The LIF response was highly impacted by MC (p < 0.0001), CF (p < 0.0001), and MC² (p = 0.0022). At 10% w/w API, increased % RH, MC, and CF led to a nonlinear decrease in LIF response. The derived quadratic model equations explained more than 94% of the variance in the data. Awareness of these effects on LIF response is critical when implementing LIF as a process analytical tool. PMID:27435199
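
    The reported model can be reproduced in form (not in numbers) with an ordinary least-squares response-surface fit over the design points; the file and column names below are hypothetical.

    ```python
    # Sketch of the quadratic response-surface fit described in the abstract:
    # LIF response vs. % RH and compression force, with a squared RH term.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("lif_ccd_runs.csv")   # hypothetical: columns lif, rh, cf
    model = smf.ols("lif ~ rh + cf + I(rh**2)", data=df).fit()
    print(model.summary())                 # adjusted R^2 and per-term p-values
    ```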

  20. Evaluation methodology for comparing memory and communication of analytic processes in visual analytics

    SciTech Connect

    Ragan, Eric D; Goodall, John R

    2014-01-01

    Provenance tools can help capture and represent the history of analytic processes. In addition to supporting analytic performance, provenance tools can be used to support memory of the process and communication of the steps to others. Objective evaluation methods are needed to evaluate how well provenance tools support analysts' memory and communication of analytic processes. In this paper, we present several methods for the evaluation of process memory, and we discuss the advantages and limitations of each. We discuss methods for determining a baseline process for comparison, and we describe various methods that can be used to elicit process recall, step ordering, and time estimations. Additionally, we discuss methods for conducting quantitative and qualitative analyses of process memory. By organizing possible memory evaluation methods and providing a meta-analysis of the potential benefits and drawbacks of different approaches, this paper can inform study design and encourage objective evaluation of process memory and communication.

  1. Development and validation of a dynamic range-extended LC-MS/MS multi-analyte method for 11 different postmortem matrices for redistribution studies applying solvent calibration and additional (13)C isotope monitoring.

    PubMed

    Staeheli, Sandra N; Poetzsch, Michael; Kraemer, Thomas; Steuer, Andrea E

    2015-11-01

    Postmortem redistribution (PMR) is one of numerous problems in postmortem toxicology making correct interpretation of measured drug concentrations difficult or even impossible. Time-dependent PMR in peripheral blood and especially in tissue samples is still under-explored. For further investigation, an easily applicable method for the simultaneous quantitation of over 80 forensically relevant compounds in 11 different postmortem matrices was to be developed and validated, overcoming the challenges of high inter-matrix and intra-matrix concentration variances. Biopsy samples (20 mg) or body fluids (20 μL) were spiked with an analyte mix and deuterated internal standards, extracted by liquid-liquid extraction, and analyzed by liquid chromatography tandem mass spectrometry (LC-MS/MS). For broad applicability, a simple solvent calibration was used. Furthermore, time-consuming dilution of high-concentration samples showing detector saturation was circumvented by two overlapping calibration curves, using ¹²C isotope monitoring for low concentrations and ¹³C isotopes for high concentrations. The method was validated according to international guidelines with modifications. Matrix effects and extraction efficiency were strongly matrix- and analyte-dependent. In general, brain and adipose tissue produced the highest matrix effects, whereas cerebrospinal fluid showed the least matrix effects. Accuracy and precision results were rather matrix-independent, with some exceptions. Despite the use of an external solvent calibration, the accuracy requirements were fulfilled for 66 to 81 % of the 83 analytes. Depending on the matrix, 75-93 % of the analytes showed intra-day precisions <20 %. The ¹²C and ¹³C calibrations gave comparable results and proved to be a useful tool for expanding the dynamic range. PMID:26396081
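
    The dynamic-range extension reduces to a branch at quantitation time: use the ¹²C calibration unless that trace saturates, then switch to the less abundant ¹³C isotope trace with its own calibration. A toy sketch, with invented calibration parameters and saturation limit:

    ```python
    # Sketch of dual-curve quantitation: 12C trace for low, 13C trace for high levels.
    def quantify(area_12c, area_13c, sat_limit=1e7,
                 cal_12c=(1.0e3, 0.0), cal_13c=(9.0e4, 0.0)):
        """cal_* are (slope, intercept) of the respective calibration lines
        (illustrative values); sat_limit is the detector saturation threshold."""
        saturated = area_12c >= sat_limit
        slope, intercept = cal_13c if saturated else cal_12c
        area = area_13c if saturated else area_12c
        return (area - intercept) / slope
    ```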

  2. [A need to implement new tools for diagnosing tobacco-addiction syndrome and readiness/motivation to quit smoking in the working-age population in Poland].

    PubMed

    Broszkiewicz, Marzenna; Drygas, Wojciech

    2016-01-01

    High rates of tobacco use are still observed in the working-age population in Poland. The present level of state tobacco control has been achieved through adopting legal regulations and population-based interventions. In Poland, a sufficient contribution of health professionals to the diagnosis of tobacco-addiction syndrome (TAS) and the application of the 5A's (ask, advise, assess, assist, arrange follow-up) brief intervention has not been confirmed by explicit research results. Systemic solutions within the health care system for professional control, specialist health care, health professional training, and reference centres have not yet been elaborated. The tools for diagnosing tobacco dependence and motivation to quit smoking, developed over 30 years ago and recommended by experts for use in clinical and research practice, do not meet current addiction criteria. In this paper, tools other than those previously recommended - tests developed in the first decade of the 21st century (including the Cigarette Dependence Scale and the Nicotine Dependence Syndrome Scale), reflecting modern concepts of nicotine dependence - are presented. In the literature on readiness/motivation to change health behaviors, a new approach dominates. Motivational interviewing (MI) by Miller and Rollnick concentrates on the smoker and his or her internal motivation. Motivational interviewing is recommended by the World Health Organization as the 5R's (relevance, risks, rewards, roadblocks, repetition) brief motivational advice, addressed to tobacco users who are unwilling to make a quit attempt. In Poland, new research on the implementation of new diagnostic tools and updates to binding guidelines should be undertaken to strengthen primary health care in treating tobacco dependence, and to incorporate MI and the 5R's into training in TAS diagnosis and treatment addressed to health professionals. PMID:27044722

  3. Testing microtaphofacies as an analytic tool for integrated facies and sedimentological analysis using Lower Miocene mixed carbonate/siliciclastic sediments from the North Alpine Foreland Basin

    NASA Astrophysics Data System (ADS)

    Nebelsick, James; Bieg, Ulrich

    2010-05-01

    Taphonomic studies have mostly concentrated on the investigation and quantification of isolated macroscopic faunal and floral elements. Carbonate rocks, in contrast to isolated macroscopic objects, have rarely been specifically addressed in terms of taphonomic features, although many aspects of microfacies analysis are directly related to the preservation of constituent biogenic components. There is thus a high potential for analyzing and quantifying taphonomic features in carbonate rocks (microtaphofacies), not least as an additional tool for facies analysis. Analyzing the role of taphonomy in carbonate environments can be used to determine how different skeletal architectures through time and evolving synecological relationships (bioerosion and encrustation) have influenced carbonate environments and their preservation in the rock record. This pilot study analyses the microtaphofacies of a Lower Miocene, shallow water, mixed carbonate-siliciclastic environment from the North Alpine Foreland Basin (Molasse Sea) of southern Germany. The sediments range from biogenic bryomol carbonates to pure siliciclastics. This allows environmental interpretation to be made not only with respect to biogenic composition (dominated by bivalves, gastropods, bryozoans and barnacles), but also with respect to siliciclastic grain characteristics and sedimentary features. Facies interpretation is relatively straightforward, with a somewhat varied nearshore facies distribution dominated by carbonates which grades into higher energy, siliciclastic offshore sediments. Taphonomic features are assessed along this gradient with respect to total component composition as well as by following the trajectories of individual component types. The results are interpreted with respect to biogenic production, fragmentation, abrasion and transport.

  4. A novel ion-pairing chromatographic method for the simultaneous determination of both nicarbazin components in feed additives: chemometric tools for improving the optimization and validation.

    PubMed

    De Zan, María M; Teglia, Carla M; Robles, Juan C; Goicoechea, Héctor C

    2011-07-15

    The development, optimization and validation of an ion-pairing high performance liquid chromatography method for the simultaneous determination of both nicarbazin (NIC) components, 4,4'-dinitrocarbanilide (DNC) and 2-hydroxy-4,6-dimethylpyrimidine (HDP), in bulk materials and feed additives are described. An experimental design was used for the optimization of the chromatographic system. Four variables, including mobile phase composition and oven temperature, were analyzed through a central composite design exploring their contribution to analyte separation. Five responses (peak resolutions, HDP capacity factor, HDP tailing, and analysis time) were modelled by using the response surface methodology and were optimized simultaneously by implementing the desirability function. The optimum conditions resulted in a mobile phase consisting of 10.0 mmol L(-1) of 1-heptanesulfonate, 20.0 mmol L(-1) of sodium acetate, pH 3.30 buffer, and acetonitrile in a gradient system at a flow rate of 1.00 mL min(-1). The column was an Inertsil ODS-3 (4.6 mm×150 mm, 5 μm particle size) at 40.0°C. Detection was performed at 300 nm by a diode array detector. The validation results indicated high selectivity and good precision, with RSD less than 1.0% for both components in both intra- and inter-assay precision studies. Linearity was proved for a range of 32.0-50.0 μg mL(-1) of NIC in sample solution. The recovery, studied at three different fortification levels, varied from 98.0 to 101.4% for HDP and from 99.1 to 100.2% for DNC. The applicability of the method was demonstrated by determining DNC and HDP content in raw materials and commercial formulations used for coccidiosis prevention. Assay results on real samples showed that considerable differences in the DNC:HDP molecular ratio exist among them. PMID:21645683
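
    The simultaneous optimization step can be illustrated with a Derringer-type desirability function: each response is mapped to [0, 1] and the overall desirability is their geometric mean. The target windows below are invented for illustration, not the paper's settings.

      # Hedged sketch of desirability scoring over several chromatographic responses.
      import numpy as np

      def d_maximize(y, lo, hi):
          return float(np.clip((y - lo) / (hi - lo), 0.0, 1.0))  # bigger is better

      def d_minimize(y, lo, hi):
          return float(np.clip((hi - y) / (hi - lo), 0.0, 1.0))  # smaller is better

      def overall_desirability(resolution, tailing, run_time):
          d = [
              d_maximize(resolution, lo=1.5, hi=4.0),  # peak resolution
              d_minimize(tailing, lo=1.0, hi=2.0),     # HDP tailing factor
              d_minimize(run_time, lo=8.0, hi=20.0),   # analysis time, min
          ]
          return float(np.prod(d)) ** (1.0 / len(d))   # geometric mean

      print(overall_desirability(resolution=3.1, tailing=1.2, run_time=12.0))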

  5. Quality Indicators for Learning Analytics

    ERIC Educational Resources Information Center

    Scheffel, Maren; Drachsler, Hendrik; Stoyanov, Slavi; Specht, Marcus

    2014-01-01

    This article proposes a framework of quality indicators for learning analytics that aims to standardise the evaluation of learning analytics tools and to provide a means to capture evidence for the impact of learning analytics on educational practices in a standardised manner. The criteria of the framework and its quality indicators are based on…

  6. Using Maps in Web Analytics to Evaluate the Impact of Web-Based Extension Programs

    ERIC Educational Resources Information Center

    Veregin, Howard

    2015-01-01

    Maps can be a valuable addition to the Web analytics toolbox for Extension programs that use the Web to disseminate information. Extension professionals use Web analytics tools to evaluate program impacts. Maps add a unique perspective through visualization and analysis of geographic patterns and their relationships to other variables. Maps can…

  7. A concentration dependent auto-relay-recognition by the same analyte: a dual fluorescence switch-on by hydrogen sulfide via Michael addition followed by reduction and staining for bio-activity.

    PubMed

    Das, Avijit Kumar; Goswami, Shyamaprosad; Dutta, Gorachand; Maity, Sibaprasad; Mandal, Tarun kanti; Khanra, Kalyani; Bhattacharyya, Nandan

    2016-01-14

    H2S is shown, for the first time, to play an extraordinary dual role due to its nucleophilicity and reducing property with our single chemosensor, PND [4-(piperidin-1-yl)naphthalene-1,2-dione]. The initial nucleophilic attack via Michael addition (at lower H2S concentrations, blue fluorescence) is followed by reduction of the 1,2-diketo functionality (at higher H2S concentrations, green fluorescence). This chemosensor, which also shows a biological response, is remarkably effective in sensing the same analyte (H2S) at its different concentrations in a relay pathway via a fluorescence "off-on-on" mechanism, which is supported by DFT calculations and cyclic voltammetry. PMID:26510406

  8. Visual Analytics Technology Transition Progress

    SciTech Connect

    Scholtz, Jean; Cook, Kristin A.; Whiting, Mark A.; Lemon, Douglas K.; Greenblatt, Howard

    2009-09-23

    The authors describe the transition process for visual analytic tools and contrast it with the transition process for more traditional software tools. The paper presents a user-oriented approach to technology transition, including a discussion of key factors that should be considered and adapted to each situation. The progress made in transitioning visual analytic tools in the past five years is described, and the challenges that remain are enumerated.

  9. Phase II Fort Ord Landfill Demonstration Task 8 - Refinement of In-line Instrumental Analytical Tools to Evaluate their Operational Utility and Regulatory Acceptance

    SciTech Connect

    Daley, P F

    2006-04-03

    The overall objective of this project is the continued development, installation, and testing of continuous water sampling and analysis technologies for application to on-site monitoring of groundwater treatment systems and remediation sites. In a previous project, an on-line analytical system (OLAS) for multistream water sampling was installed at the Fort Ord Operable Unit 2 Groundwater Treatment System, with the objective of developing a simplified analytical method for detection of Compounds of Concern at that plant, and continuous sampling of up to twelve locations in the treatment system, from raw influent waters to treated effluent. Earlier implementations of the water sampling and processing system (Analytical Sampling and Analysis Platform, ASAP; A+RT, Milpitas, CA) depended on off-line integrators that produced paper plots of chromatograms and sent summary tables to a host computer for archiving. We developed a basic LabVIEW (National Instruments, Inc., Austin, TX) based gas chromatography control and data acquisition system that was the foundation for further development and integration with the ASAP system. Advantages of this integration include electronic archiving of all raw chromatographic data and a flexible programming environment to support development of improved ASAP operation and automated reporting. The initial goals of integrating the preexisting LabVIEW chromatography control system with the ASAP, and of demonstrating a simplified, site-specific analytical method, were successfully achieved. However, although the principal objective of this system was assembly of an analytical system that would give plant operators an up-to-the-minute view of the plant's performance, several obstacles remained. Data reduction with the base LabVIEW system was limited to peak detection and simple tabular output, patterned after commercial chromatography integrators, with compound retention times and peak areas. Preparation of calibration curves, method detection
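
    Not the LabVIEW code itself, but a minimal Python sketch of the peak-detection-plus-tabular-output stage described above, run on a synthetic chromatogram; the window width and thresholds are arbitrary choices for illustration.

      # Detect peaks in a (synthetic) chromatogram and print an integrator-style
      # table of retention times and crude peak areas.
      import numpy as np
      from scipy.signal import find_peaks
      from scipy.integrate import trapezoid

      t = np.linspace(0, 10, 2000)                       # retention time, min
      rng = np.random.default_rng(0)
      signal = (1.0 * np.exp(-0.5 * ((t - 3.2) / 0.08) ** 2)
                + 0.4 * np.exp(-0.5 * ((t - 6.7) / 0.10) ** 2)
                + 0.01 * rng.standard_normal(t.size))     # baseline noise

      peaks, _ = find_peaks(signal, height=0.1, prominence=0.05)
      for i in peaks:
          win = (t > t[i] - 0.3) & (t < t[i] + 0.3)       # +/- 0.3 min window
          area = trapezoid(signal[win], t[win])           # baseline-free integration
          print(f"RT = {t[i]:.2f} min, area = {area:.3f}")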

  10. Version 1.00 programmer's tools used in constructing the INEL RML/analytical radiochemistry sample tracking database and its user interface

    SciTech Connect

    Femec, D.A.

    1995-09-01

    This report describes two code-generating tools used to speed design and implementation of relational databases and user interfaces: CREATE-SCHEMA and BUILD-SCREEN. CREATE-SCHEMA produces the SQL commands that actually create and define the database. BUILD-SCREEN takes templates for data entry screens and generates the screen management system routine calls to display the desired screen. Both tools also generate the related FORTRAN declaration statements and precompiled SQL calls. Included with this report is the source code for a number of FORTRAN routines and functions used by the user interface. This code is broadly applicable to a number of different databases.
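
    A hedged sketch of the CREATE-SCHEMA idea in Python: generate SQL DDL from a declarative table description. The table and column names are invented, and the real tool also emitted FORTRAN declarations and precompiled SQL calls, which are omitted here.

      # Minimal code generator: schema description in, CREATE TABLE statements out.
      SAMPLE_SCHEMA = {
          "sample_tracking": [                      # hypothetical table
              ("sample_id",   "INTEGER",     "PRIMARY KEY"),
              ("matrix",      "VARCHAR(32)", "NOT NULL"),
              ("received",    "DATE",        "NOT NULL"),
              ("activity_bq", "REAL",        ""),
          ]
      }

      def create_table_sql(schema: dict) -> str:
          statements = []
          for table, columns in schema.items():
              cols = ",\n  ".join(
                  f"{name} {ctype} {constraint}".rstrip()
                  for name, ctype, constraint in columns
              )
              statements.append(f"CREATE TABLE {table} (\n  {cols}\n);")
          return "\n\n".join(statements)

      print(create_table_sql(SAMPLE_SCHEMA))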

  11. Evaluating analytic and risk assessment tools to estimate sediment and nutrients losses from agricultural lands in the southern region of the USA

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Non-point source pollution from agricultural fields is a critical problem associated with water quality impairment in the USA and a low-oxygen environment in the Gulf of Mexico. The use, development and enhancement of qualitative and quantitative models or tools for assessing agricultural runoff qua...

  12. Comparison of high-resolution ultrasonic resonator technology and Raman spectroscopy as novel process analytical tools for drug quantification in self-emulsifying drug delivery systems.

    PubMed

    Stillhart, Cordula; Kuentz, Martin

    2012-02-01

    Self-emulsifying drug delivery systems (SEDDS) are complex mixtures in which drug quantification can become a challenging task. Thus, a general need exists for novel analytical methods, and a particular interest lies in techniques with the potential for process monitoring. This article compares Raman spectroscopy with high-resolution ultrasonic resonator technology (URT) for drug quantification in SEDDS. The model drugs fenofibrate, indomethacin, and probucol were quantitatively assayed in different self-emulsifying formulations. We measured ultrasound velocity and attenuation in the bulk formulation containing drug at different concentrations. The formulations were also studied by Raman spectroscopy, using both an in-line immersion probe for the bulk formulation and a multi-fiber sensor for measuring through hard-gelatin capsules that were filled with SEDDS. Each method was assessed by calculating the relative standard error of prediction (RSEP) as well as the limit of quantification (LOQ) and the mean recovery. Raman spectroscopy led to excellent calibration models for the bulk formulation as well as the capsules. The RSEP depended on the SEDDS type, with values of 1.5-3.8%, while LOQ was between 0.04 and 0.35% (w/w) for drug quantification in the bulk. Similarly, the analysis of the capsules led to RSEP of 1.9-6.5% and LOQ of 0.01-0.41% (w/w). On the other hand, ultrasound attenuation resulted in RSEP of 2.3-4.4% and LOQ of 0.1-0.6% (w/w). Moreover, ultrasound velocity provided an interesting analytical response in cases where the drug strongly affected the density or compressibility of the SEDDS. We conclude that ultrasonic resonator technology and Raman spectroscopy constitute suitable methods for drug quantification in SEDDS, which is promising for their use as process analytical technologies. PMID:22079118
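
    The figures of merit can be illustrated with a short sketch. RSEP is computed here as 100·sqrt(Σ(ŷ−y)²/Σy²); definitions vary slightly between papers, and the values below are synthetic.

      # Hedged computation of RSEP and mean recovery on made-up calibration data.
      import numpy as np

      y_ref  = np.array([0.5, 1.0, 2.0, 4.0, 8.0])      # nominal drug load, % w/w
      y_pred = np.array([0.52, 0.97, 2.05, 3.92, 8.1])  # predicted from calibration

      rsep = 100 * np.sqrt(np.sum((y_pred - y_ref) ** 2) / np.sum(y_ref ** 2))
      recovery = 100 * np.mean(y_pred / y_ref)          # mean recovery, %
      print(f"RSEP = {rsep:.1f}%, mean recovery = {recovery:.1f}%")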

  13. Comparison of Analytic and Numerical Models With Commercially Available Simulation Tools for the Prediction of Semiconductor Freeze-Out and Exhaustion

    NASA Astrophysics Data System (ADS)

    Reeves, Derek E.

    2002-09-01

    This thesis reports on three procedures, and the associated numerical results, for obtaining semiconductor majority carrier concentrations under a temperature sweep. The capability of predicting the exhaustion regime boundaries of a semiconductor is critical to understanding and exploiting the full potential of the modern integrated circuit, and an efficient and reliable method is needed to accomplish this task. Silvaco International's semiconductor simulation software was used to predict the temperature-dependent majority carrier concentration for a semiconductor cell, and comparisons with analytical and numerical MATLAB-based schemes were made. This was done for both silicon and GaAs. The simulation conditions demonstrated the effect known as bandgap narrowing.
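
    For orientation, the freeze-out/exhaustion behavior such simulations reproduce follows from charge neutrality. Below is a minimal sketch for a donor-doped semiconductor, neglecting intrinsic carriers, with representative (not thesis-specific) parameters for phosphorus-doped silicon.

      # Solve g*n^2/n1 + n - N_D = 0 for the majority carrier concentration n(T).
      import numpy as np

      K_B = 8.617e-5        # Boltzmann constant, eV/K
      NC_300 = 2.8e19       # cm^-3, Si conduction-band effective DOS at 300 K
      E_D = 0.045           # eV, donor ionization energy (P in Si)
      G = 2                 # donor degeneracy factor
      N_D = 1e16            # cm^-3, donor concentration

      def n_majority(T: float) -> float:
          nc = NC_300 * (T / 300.0) ** 1.5
          n1 = (nc / G) * np.exp(-E_D / (K_B * T))
          # positive root of the charge-neutrality quadratic
          return (n1 / (2 * G)) * (np.sqrt(1 + 4 * G * N_D / n1) - 1)

      for T in (50, 100, 200, 300):
          print(T, f"{n_majority(T):.2e}")   # approaches N_D in the exhaustion regime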

  14. Comparative study between univariate spectrophotometry and multivariate calibration as analytical tools for quantitation of Benazepril alone and in combination with Amlodipine.

    PubMed

    Farouk, M; Elaziz, Omar Abd; Tawakkol, Shereen M; Hemdan, A; Shehata, Mostafa A

    2014-04-01

    Four simple, accurate, reproducible, and selective methods have been developed and subsequently validated for the determination of Benazepril (BENZ) alone and in combination with Amlodipine (AML) in pharmaceutical dosage form. The first method is pH-induced difference spectrophotometry, where BENZ can be measured in the presence of AML as it shows maximum absorption at 237 nm and 241 nm in 0.1 N HCl and 0.1 N NaOH, respectively, while AML exhibits no wavelength shift in either solvent. The second method is the new Extended Ratio Subtraction Method (EXRSM) coupled to the Ratio Subtraction Method (RSM) for determination of both drugs in commercial dosage form. The third and fourth methods are multivariate calibrations: Principal Component Regression (PCR) and Partial Least Squares (PLS). A detailed validation of the methods was performed following the ICH guidelines; the standard curves were found to be linear in the range of 2-30 μg/mL for BENZ in the difference and extended ratio subtraction spectrophotometric methods, and 5-30 μg/mL for AML in the EXRSM method, with well-accepted mean correlation coefficients for each analyte. The intra-day and inter-day precision and accuracy results were well within the acceptable limits. PMID:24424258

  15. The color of complexes and UV-vis spectroscopy as an analytical tool of Alfred Werner's group at the University of Zurich.

    PubMed

    Fox, Thomas; Berke, Heinz

    2014-01-01

    Two PhD theses (Alexander Gordienko, 1912; Johannes Angerstein, 1914) and a dissertation in partial fulfillment of a PhD thesis (H. S. French, Zurich, 1914) are reviewed that deal with hitherto unpublished UV-vis spectroscopy work on coordination compounds in the group of Alfred Werner. The method of measuring UV-vis spectra in Alfred Werner's time is described in detail. Examples of spectra of complexes are given, which were partly interpreted in terms of structure (cis ↔ trans configuration, counting the number of bands for structural relationships, and shifts of general spectral features upon consecutive replacement of ligands). A more complete interpretation of spectra was hampered in Alfred Werner's time by the lack of a theory of light absorption, a correct theory of electron excitation, and a ligand field theory for coordination compounds. The experimentally difficult data acquisition and the difficult spectral interpretation may have been among the reasons why this method did not experience a breakthrough in Alfred Werner's group and play a more prominent role as an important analytical method. Nevertheless, the application of UV-vis spectroscopy to coordination compounds was unique and novel, and attests to Alfred Werner's great aptitude and keenness to go beyond conventional practice. PMID:24983805

  16. An analytical framework and tool ('InteRa') for integrating the informal recycling sector in waste and resource management systems in developing countries.

    PubMed

    Velis, Costas A; Wilson, David C; Rocca, Ondina; Smith, Stephen R; Mavropoulos, Antonis; Cheeseman, Chris R

    2012-09-01

    In low- and middle-income developing countries, the informal (collection and) recycling sector (here abbreviated IRS) is an important, but often unrecognised, part of a city's solid waste and resources management system. Recent evidence shows recycling rates of 20-30% achieved by IRS systems, reducing collection and disposal costs. They play a vital role in the value chain by reprocessing waste into secondary raw materials, providing a livelihood to around 0.5% of urban populations. However, persisting factual and perceived problems are associated with IRS (waste-picking): occupational and public health and safety (H&S), child labour, uncontrolled pollution, untaxed activities, crime and political collusion. Increasingly, incorporating IRS as a legitimate stakeholder and functional part of solid waste management (SWM) is attempted, further raising recycling rates in an affordable way while also addressing the negatives. Based on a literature review and a practitioners' workshop, here we develop a systematic framework - or typology - for classifying and analysing possible interventions to promote the integration of IRS in a city's SWM system. Three primary interfaces are identified: between the IRS and the SWM system, the materials and value chain, and society as a whole; these are underlain by a fourth, which is focused on organisation and empowerment. To maximise the potential for success, IRS integration/inclusion/formalisation initiatives should consider all four categories in a balanced way and pay increased attention to their interdependencies, which are central to success, including specific actions such as the IRS having access to source-separated waste. A novel rapid evaluation and visualisation tool is presented - the integration radar (diagram), or InteRa - aimed at illustrating the degree to which a planned or existing intervention considers each of the four categories. The tool is further demonstrated by application to 10 cases around the world, including a step
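
    A hedged sketch of an InteRa-style radar diagram: score an intervention on the four categories named above and draw them on a polar plot. The scores and labels are invented for illustration, not taken from the paper's case studies.

      # Draw a four-axis "integration radar" with matplotlib.
      import numpy as np
      import matplotlib.pyplot as plt

      categories = ["IRS vs. SWM system", "Materials & value chain",
                    "Society", "Organisation & empowerment"]
      scores = [0.7, 0.5, 0.3, 0.6]                   # hypothetical 0-1 ratings

      angles = np.linspace(0, 2 * np.pi, len(categories), endpoint=False)
      angles = np.concatenate([angles, angles[:1]])   # close the polygon
      values = scores + scores[:1]

      ax = plt.subplot(polar=True)
      ax.plot(angles, values, marker="o")
      ax.fill(angles, values, alpha=0.25)
      ax.set_xticks(angles[:-1])
      ax.set_xticklabels(categories, fontsize=8)
      ax.set_ylim(0, 1)
      plt.savefig("intera_radar.png", dpi=150, bbox_inches="tight")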

  17. Understanding Education Involving Geovisual Analytics

    ERIC Educational Resources Information Center

    Stenliden, Linnea

    2013-01-01

    Handling the vast amounts of data and information available in contemporary society is a challenge. Geovisual Analytics provides technology designed to increase the effectiveness of information interpretation and analytical task solving. To date, little attention has been paid to the role such tools can play in education and to the extent to which…

  18. Analytical testing

    NASA Technical Reports Server (NTRS)

    Flannelly, W. G.; Fabunmi, J. A.; Nagy, E. J.

    1981-01-01

    Analytical methods for combining flight acceleration and strain data with shake test mobility data to predict the effects of structural changes on flight vibrations and strains are presented. This integration of structural dynamic analysis with flight performance is referred to as analytical testing. The objective of this methodology is to analytically estimate the results of flight testing contemplated structural changes with minimum flying and change trials. The category of changes to the aircraft includes mass, stiffness, absorbers, isolators, and active suppressors. Examples of applying the analytical testing methodology using flight test and shake test data measured on an AH-1G helicopter are included. The techniques and procedures for vibration testing and modal analysis are also described.

  19. Analytical tools for identification of non-intentionally added substances (NIAS) coming from polyurethane adhesives in multilayer packaging materials and their migration into food simulants.

    PubMed

    Félix, Juliana S; Isella, Francesca; Bosetti, Osvaldo; Nerín, Cristina

    2012-07-01

    Adhesives used in food packaging to glue different materials can provide several substances as potential migrants, and the identification of potential migrants and migration tests are required to assess safety in the use of adhesives. Solid-phase microextraction in headspace mode and gas chromatography coupled to mass spectrometry (HS-SPME-GC-MS) and ChemSpider and SciFinder databases were used as powerful tools to identify the potential migrants in the polyurethane (PU) adhesives and also in the individual plastic films (polyethylene terephthalate, polyamide, polypropylene, polyethylene, and polyethylene/ethyl vinyl alcohol). Migration tests were carried out by using Tenax(®) and isooctane as food simulants, and the migrants were analyzed by gas chromatography coupled to mass spectrometry. More than 63 volatile and semivolatile compounds considered as potential migrants were detected either in the adhesives or in the films. Migration tests showed two non-intentionally added substances (NIAS) coming from PU adhesives that migrated through the laminates into Tenax(®) and into isooctane. Identification of these NIAS was achieved through their mass spectra, and 1,6-dioxacyclododecane-7,12-dione and 1,4,7-trioxacyclotridecane-8,13-dione were confirmed. Caprolactam migrated into isooctane, and its origin was the external plastic film in the multilayer, demonstrating real diffusion through the multilayer structure. Comparison of the migration values between the simulants and conditions will be shown and discussed. PMID:22526644

  20. The cryogenic on-orbit liquid analytical tool (COOLANT) - A computer program for evaluating the thermodynamic performance of orbital cryogen storage facilities

    NASA Technical Reports Server (NTRS)

    Taylor, W. J.; Honkonen, S. C.; Williams, G. E.; Liggett, M. W.; Tucker, S. P.

    1991-01-01

    The United States plans to establish a permanent manned presence at the Space Station Freedom in low earth orbit (LEO) and then carry out exploration of the solar system from this base. These plans may require orbital cryogenic propellant storage depots. The COOLANT program has been developed to analyze the thermodynamic performance of these depots to support design tradeoff studies. It was developed as part of the Long Term Cryogenic Storage Facility Systems Study for NASA/MSFC. This paper discusses the program structure and capabilities of the COOLANT program. In addition, the results of an analysis of a 200,000 lbm hydrogen/oxygen storage depot tankset using COOLANT are presented.

  1. Number series of atoms, interatomic bonds and interface bonds defining zinc-blende nanocrystals as function of size, shape and surface orientation: Analytic tools to interpret solid state spectroscopy data

    NASA Astrophysics Data System (ADS)

    König, Dirk

    2016-08-01

    Semiconductor nanocrystals (NCs) experience stress and charge transfer by embedding materials or ligands and impurity atoms. In return, the environment of NCs experiences a NC stress response which may lead to matrix deformation and propagated strain. Up to now, there is no universal gauge to evaluate the stress impact on NCs and their response as a function of NC size d_NC. I deduce geometrical number series as analytical tools to obtain the number of NC atoms N_NC(d_NC[i]), bonds between NC atoms N_bnd(d_NC[i]) and interface bonds N_IF(d_NC[i]) for seven high symmetry zinc-blende (zb) NCs with low-index faceting: {001} cubes, {111} octahedra, {110} dodecahedra, {001}-{111} pyramids, {111} tetrahedra, {111}-{001} quatrodecahedra and {001}-{111} quadrodecahedra. The fundamental insights into NC structures revealed here allow for major advancements in data interpretation and understanding of zb- and diamond-lattice based nanomaterials. The analytical number series can serve as a standard procedure for stress evaluation in solid state spectroscopy due to their deterministic nature, easy use and general applicability over a wide range of spectroscopy methods as well as NC sizes, forms and materials.
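
    The paper derives closed-form series; as a hedged numerical cross-check (not the paper's formulas), the counts N_NC and N_bnd for a {001} cube can be reproduced by direct lattice enumeration. Coordinates below are in units of a/4 (a = cubic lattice constant), so all positions are integers.

      # Count atoms and nearest-neighbour bonds in an n x n x n zinc-blende cube.
      from itertools import product

      def zb_cube_counts(n_cells: int):
          fcc = [(0, 0, 0), (2, 2, 0), (2, 0, 2), (0, 2, 2)]   # fcc sites per cell
          basis = [(0, 0, 0), (1, 1, 1)]                       # two-atom zb basis
          lim = 4 * n_cells
          atoms = set()
          for i, j, k in product(range(n_cells + 1), repeat=3):
              for fx, fy, fz in fcc:
                  for bx, by, bz in basis:
                      p = (4 * i + fx + bx, 4 * j + fy + by, 4 * k + fz + bz)
                      if all(0 <= c <= lim for c in p):
                          atoms.add(p)
          # nearest neighbours sit at offsets (+-1, +-1, +-1) in a/4 units
          bonds = sum(
              1
              for (x, y, z) in atoms
              for dx, dy, dz in product((-1, 1), repeat=3)
              if (x + dx, y + dy, z + dz) in atoms
          ) // 2
          return len(atoms), bonds

      for n in (1, 2, 3):
          print(n, *zb_cube_counts(n))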

  2. Comparative study between derivative spectrophotometry and multivariate calibration as analytical tools applied for the simultaneous quantitation of Amlodipine, Valsartan and Hydrochlorothiazide

    NASA Astrophysics Data System (ADS)

    Darwish, Hany W.; Hassan, Said A.; Salem, Maissa Y.; El-Zeany, Badr A.

    2013-09-01

    Four simple, accurate and specific methods were developed and validated for the simultaneous estimation of Amlodipine (AML), Valsartan (VAL) and Hydrochlorothiazide (HCT) in commercial tablets. The derivative spectrophotometric methods include the Derivative Ratio Zero Crossing (DRZC) and Double Divisor Ratio Spectra-Derivative Spectrophotometry (DDRS-DS) methods, while the multivariate calibrations used are Principal Component Regression (PCR) and Partial Least Squares (PLS). The proposed methods were applied successfully to the determination of the drugs in laboratory-prepared mixtures and in commercial pharmaceutical preparations. The validity of the proposed methods was assessed using the standard addition technique. The linearity of the proposed methods was investigated in the ranges of 2-32, 4-44 and 2-20 μg/mL for AML, VAL and HCT, respectively.

  3. Quantification of individual phenolic compounds' contribution to antioxidant capacity in apple: a novel analytical tool based on liquid chromatography with diode array, electrochemical, and charged aerosol detection.

    PubMed

    Plaza, Merichel; Kariuki, James; Turner, Charlotta

    2014-01-15

    Phenolics, particularly from apples, hold great interest because of their antioxidant properties. In the present study, the total antioxidant capacity of different apple extracts obtained by pressurized hot water extraction (PHWE) was determined by cyclic voltammetry (CV), which was compared with the conventional antioxidant assays. To measure the antioxidant capacity of individual antioxidants present in apple extracts, a novel method was developed based on high-performance liquid chromatography (HPLC) with photodiode array (DAD), electrochemical (ECD), and charged aerosol (CAD) detection. HPLC-DAD-ECD-CAD enabled rapid, qualitative, and quantitative determination of antioxidants in the apple extracts. The main advantage of using CAD was that this detector enabled quantification of a large number of phenolics using only a few standards. The results showed that phenolic acids and flavonols were mainly responsible for the total antioxidant capacity of apple extracts. In addition, protocatechuic acid, chlorogenic acid, hyperoside, an unidentified phenolic acid, and a quercetin derivative presented the highest antioxidant capacities. PMID:24345041

  4. Comparative study between derivative spectrophotometry and multivariate calibration as analytical tools applied for the simultaneous quantitation of Amlodipine, Valsartan and Hydrochlorothiazide.

    PubMed

    Darwish, Hany W; Hassan, Said A; Salem, Maissa Y; El-Zeany, Badr A

    2013-09-01

    Four simple, accurate and specific methods were developed and validated for the simultaneous estimation of Amlodipine (AML), Valsartan (VAL) and Hydrochlorothiazide (HCT) in commercial tablets. The derivative spectrophotometric methods include the Derivative Ratio Zero Crossing (DRZC) and Double Divisor Ratio Spectra-Derivative Spectrophotometry (DDRS-DS) methods, while the multivariate calibrations used are Principal Component Regression (PCR) and Partial Least Squares (PLS). The proposed methods were applied successfully to the determination of the drugs in laboratory-prepared mixtures and in commercial pharmaceutical preparations. The validity of the proposed methods was assessed using the standard addition technique. The linearity of the proposed methods was investigated in the ranges of 2-32, 4-44 and 2-20 μg/mL for AML, VAL and HCT, respectively. PMID:23727675
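
    The PLS step can be sketched on synthetic overlapping spectra, in the spirit of the three-component AML/VAL/HCT mixtures above; this is a minimal illustration, not the authors' data or code.

      # Train PLS on simulated three-component mixture spectra, then predict.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(1)
      wl = np.linspace(200, 400, 201)                      # wavelength grid, nm
      bands = np.array([np.exp(-0.5 * ((wl - c) / 12) ** 2)
                        for c in (240, 270, 310)])         # overlapping bands

      conc = rng.uniform([2, 4, 2], [32, 44, 20], size=(25, 3))   # training mixtures
      spectra = conc @ bands * 0.01
      spectra += 0.001 * rng.standard_normal(spectra.shape)       # instrument noise

      pls = PLSRegression(n_components=3).fit(spectra, conc)
      unknown = np.array([[10.0, 20.0, 8.0]]) @ bands * 0.01
      print(pls.predict(unknown))        # should recover ~[10, 20, 8]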

  5. FPI: FM Success through Analytics

    ERIC Educational Resources Information Center

    Hickling, Duane

    2013-01-01

    The APPA Facilities Performance Indicators (FPI) is perhaps one of the most powerful analytical tools that institutional facilities professionals have at their disposal. It is a diagnostic facilities performance management tool that addresses the essential questions that facilities executives must answer to effectively perform their roles. It…

  6. Analytical Searching.

    ERIC Educational Resources Information Center

    Pappas, Marjorie L.

    1995-01-01

    Discusses analytical searching, a process that enables searchers of electronic resources to develop a planned strategy by combining words or phrases with Boolean operators. Defines simple and complex searching, and describes search strategies developed with Boolean logic and truncation. Provides guidelines for teaching students analytical…

  7. Determining the Efficacy of Magnetic Susceptibility as an Analytical Tool in the Middle Devonian Gas Bearing Shale of Taylor County, West Virginia

    NASA Astrophysics Data System (ADS)

    Baird, John

    The accuracy of the magnetic susceptibility of the whole rock was within the same order of magnitude as that of the other methods, and the accuracy of the magnetic susceptibility of the isolated kerogen component was an order of magnitude higher. In addition, evidence was found linking the magnetic susceptibility of kerogen within the two units to the composition of the kerogen. Vitrinite reflectance data confirm that variations in the magnetic susceptibility of the kerogen were not caused by variations in maturity. A very strong logarithmic relationship was found between the magnetic susceptibility of kerogen and the weight percent present. Under the hypothesis that variations in the amount of organic material present are linked to episodic algal blooms, it was concluded that the organic material supplied by these blooms significantly lowered the magnetic susceptibility of the organic sediment supplied during the normal habitat of the basin.

  8. Green Chemistry Metrics with Special Reference to Green Analytical Chemistry.

    PubMed

    Tobiszewski, Marek; Marć, Mariusz; Gałuszka, Agnieszka; Namieśnik, Jacek

    2015-01-01

    The concept of green chemistry is widely recognized in chemical laboratories. To properly measure the environmental impact of chemical processes, dedicated assessment tools are required. This paper summarizes the current state of knowledge on green chemistry and green analytical chemistry metrics. The diverse methods used for evaluating the greenness of organic synthesis, such as eco-footprint, E-Factor, EATOS, and Eco-Scale, are described. Both the well-established and recently developed green analytical chemistry metrics, including NEMI labeling and the analytical Eco-Scale, are presented. Additionally, this paper focuses on the possibility of using multivariate statistics to evaluate the environmental impact of analytical procedures. All the above metrics are compared and discussed in terms of their advantages and disadvantages. The current needs and future perspectives in green chemistry metrics are also discussed. PMID:26076112
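
    Two of the metrics named above reduce to simple arithmetic, shown in the hedged sketch below: the E-Factor is mass of waste per mass of product, and the analytical Eco-Scale starts at 100 and subtracts penalty points. The penalty values here are placeholders, not the published tables.

      # Illustrative greenness metrics.
      def e_factor(total_input_kg: float, product_kg: float) -> float:
          """Mass of waste generated per kg of product."""
          return (total_input_kg - product_kg) / product_kg

      def analytical_eco_scale(penalties: list[int]) -> int:
          """Eco-Scale score: 100 minus the summed penalty points."""
          return 100 - sum(penalties)

      print(e_factor(total_input_kg=12.0, product_kg=2.0))   # 5.0 kg waste / kg product
      print(analytical_eco_scale([6, 4, 2]))                 # e.g. solvent, energy, waste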

  9. New Analytical Monographs on TCM Herbal Drugs for Quality Proof.

    PubMed

    Wagner, Hildebert; Bauer, Rudolf; Melchart, Dieter

    2016-01-01

    Regardless of specific national drug regulations, there is an international consensus that all TCM drugs must meet stipulated high quality standards focusing on authentication, identification and chemical composition. In addition, the safety of all TCM drugs prescribed by physicians has to be guaranteed. During the 25-year history of the TCM hospital Bad Kötzting, 171 TCM drugs underwent an analytical quality proof including thin-layer as well as high-pressure liquid chromatography. Mass spectrometry will now also be available as an analytical tool. The findings are compiled and already published in three volumes of analytical monographs. One more volume will be published shortly, and a fifth volume is in preparation. The main issues of the analytical procedure for TCM drugs, such as authenticity, botanical nomenclature, variability of plant species and parts, and processing, are pointed out, and possible ways to overcome them are sketched. PMID:27271998

  10. A noninvasive biomechanical treatment as an additional tool in the rehabilitation of an acute anterior cruciate ligament tear: A case report

    PubMed Central

    Elbaz, Avi; Cohen, Marc S; Debbi, Eytan M; Rath, Udi; Mor, Amit; Morag, Guy; Beer, Yiftah; Segal, Ganit

    2014-01-01

    Objectives: Conservative treatments for anterior cruciate ligament (ACL) tears may have outcomes as good as those of invasive treatments. These include muscle strengthening and neuromuscular proprioceptive exercises to improve joint stability and restore motion to the knee. The purpose of the current work was to examine the feasibility of a novel non-invasive biomechanical treatment to improve the rehabilitation process following an ACL tear. This single case report presents the effect of this therapy in a patient with a complete ACL rupture who chose not to undergo reconstructive surgery. Methods: A 29-year-old female athlete with an acute indirect injury to the knee who chose not to undergo surgery was monitored. Two days after injury the patient began AposTherapy. A unique biomechanical device was specially calibrated to the patient's feet. The therapy program, which included carrying out her daily routine while wearing the device, was then initiated. The subject underwent a gait analysis at baseline and follow-up gait analyses at weeks 1, 2, 4, 8, 12 and 26. Results: A severely abnormal gait was seen immediately after injury, including a substantial decrease in gait velocity, step length and single limb support. In addition, limb symmetry was substantially compromised following the injury. After 4 weeks of treatment, the patient had returned to normal gait values and limb symmetry had reached the normal range. Conclusions: The results of this case report suggest that this conservative biomechanical therapy may have helped this patient in her rehabilitation process. Further research is needed to determine the effect of this therapy on patients after ACL injuries. PMID:27489638

  11. An Eight-Eyed Version of Hawkins and Shohet's Clinical Supervision Model: The Addition of the Cognitive Analytic Therapy Concept of the "Observing Eye/I" as the "Observing Us"

    ERIC Educational Resources Information Center

    Darongkamas, Jurai; John, Christopher; Walker, Mark James

    2014-01-01

    This paper proposes incorporating the concept of the "observing eye/I", from cognitive analytic therapy (CAT), to Hawkins and Shohet's seven modes of supervision, comprising their transtheoretical model of supervision. Each mode is described alongside explicit examples relating to CAT. This modification using a key idea from CAT (in…

  12. Analytical sedimentology

    SciTech Connect

    Lewis, D.W. (Dept. of Geology); McConchie, D.M. (Centre for Coastal Management)

    1994-01-01

    Both a self-instruction manual and a "cookbook" guide to field and laboratory analytical procedures, this book provides an essential reference for non-specialists. With a minimum of mathematics and virtually no theory, it introduces practitioners to easy, inexpensive options for sample collection and preparation, data acquisition, analytic protocols, result interpretation and verification techniques. This step-by-step guide considers the advantages and limitations of different procedures, discusses safety and troubleshooting, and explains support skills like mapping, photography and report writing. It also offers managers, off-site engineers and others using sediments data a quick course in commissioning studies and making the most of the reports. This manual will answer the growing needs of practitioners in the field, either alone or accompanied by Practical Sedimentology, which surveys the science of sedimentology and provides a basic overview of the principles behind the applications.

  13. Informetrics: Exploring Databases as Analytical Tools.

    ERIC Educational Resources Information Center

    Wormell, Irene

    1998-01-01

    Advanced online search facilities and information retrieval techniques have increased the potential of bibliometric research. Discusses three case studies carried out by the Centre for Informetric Studies at the Royal School of Library Science (Denmark) on the internationality of international journals, informetric analyses on the World Wide Web,…

  14. Nanomaterials as Analytical Tools for Genosensors

    PubMed Central

    Abu-Salah, Khalid M.; Alrokyan, Salman A.; Khan, Muhammad Naziruddin; Ansari, Anees Ahmad

    2010-01-01

    Nanomaterials are being increasingly used for the development of electrochemical DNA biosensors, due to the unique electrocatalytic properties found in nanoscale materials. They offer excellent prospects for interfacing biological recognition events with electronic signal transduction and for designing a new generation of bioelectronic devices exhibiting novel functions. In particular, nanomaterials such as noble metal nanoparticles (Au, Pt), carbon nanotubes (CNTs), magnetic nanoparticles, quantum dots and metal oxide nanoparticles have been actively investigated for their applications in DNA biosensors, which have become a new interdisciplinary frontier between biological detection and material science. In this article, we address some of the main advances in this field over the past few years, discussing the issues and challenges with the aim of stimulating a broader interest in developing nanomaterial-based biosensors and improving their applications in disease diagnosis and food safety examination. PMID:22315580

  15. GRIPPING TOOL

    DOEpatents

    Sandrock, R.J.

    1961-12-12

    A self-actuated gripping tool is described for transferring fuel elements and the like into reactors and other inaccessible locations. The tool will grasp or release the load only when properly positioned for this purpose. In addition, the load cannot be released except when unsupported by the tool, so that jarring or contact will not bring about accidental release of the load. The gripping members or jaws of the device are cam-actuated by an axially slidable shaft which has two lockable positions. A spring urges the shaft into one position and a solenoid is provided to overcome the spring and move it into the other position. The weight of the tool operates a sleeve to lock the shaft in its existing position. Only when the cable supporting the tool is slack is the device capable of being actuated either to grasp or release its load. (AEC)

  16. The Science of Analytic Reporting

    SciTech Connect

    Chinchor, Nancy; Pike, William A.

    2009-09-23

    The challenge of visually communicating analysis results is central to the ability of visual analytics tools to support decision making and knowledge construction. The benefit of emerging visual methods will be improved through more effective exchange of the insights generated through the use of visual analytics. This paper outlines the major requirements for next-generation reporting systems in terms of eight major research needs: the development of best practices, design automation, visual rhetoric, context and audience, connecting analysis to presentation, evidence and argument, collaborative environments, and interactive and dynamic documents. It also describes an emerging technology called Active Products that introduces new techniques for analytic process capture and dissemination.

  17. Competing on analytics.

    PubMed

    Davenport, Thomas H

    2006-01-01

    We all know the power of the killer app. It's not just a support tool; it's a strategic weapon. Companies questing for killer apps generally focus all their firepower on the one area that promises to create the greatest competitive advantage. But a new breed of organization has upped the stakes: Amazon, Harrah's, Capital One, and the Boston Red Sox have all dominated their fields by deploying industrial-strength analytics across a wide variety of activities. At a time when firms in many industries offer similar products and use comparable technologies, business processes are among the few remaining points of differentiation--and analytics competitors wring every last drop of value from those processes. Employees hired for their expertise with numbers or trained to recognize their importance are armed with the best evidence and the best quantitative tools. As a result, they make the best decisions. In companies that compete on analytics, senior executives make it clear--from the top down--that analytics is central to strategy. Such organizations launch multiple initiatives involving complex data and statistical analysis, and quantitative activity is managed at the enterprise (not departmental) level. In this article, professor Thomas H. Davenport lays out the characteristics and practices of these statistical masters and describes some of the very substantial changes other companies must undergo in order to compete on quantitative turf. As one would expect, the transformation requires a significant investment in technology, the accumulation of massive stores of data, and the formulation of company-wide strategies for managing the data. But, at least as important, it also requires executives' vocal, unswerving commitment and willingness to change the way employees think, work, and are treated. PMID:16447373

  18. Food additives

    MedlinePlus

    Food additives are substances that become part of a food product when they are added during the processing or making of that food. "Direct" food additives are often added during processing to: Add nutrients ...

  19. Control/structure interaction conceptual design tool

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.

    1990-01-01

    The JPL Control/Structure Interaction Program is developing new analytical methods for designing micro-precision spacecraft with controlled structures. One of these, the Conceptual Design Tool, will illustrate innovative new approaches to the integration of multi-disciplinary analysis and design methods. The tool will be used to demonstrate homogeneity of presentation, uniform data representation across analytical methods, and integrated systems modeling. The tool differs from current 'integrated systems' that support design teams most notably in its support for the new CSI multi-disciplinary engineer. The design tool will utilize a three dimensional solid model of the spacecraft under design as the central data organization metaphor. Various analytical methods, such as finite element structural analysis, control system analysis, and mechanical configuration layout, will store and retrieve data from a hierarchical, object oriented data structure that supports assemblies of components with associated data and algorithms. In addition to managing numerical model data, the tool will assist the designer in organizing, stating, and tracking system requirements.
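
    A hedged sketch of the "assemblies of components with associated data and algorithms" idea described above: a hierarchical component tree whose nodes carry data and the analysis routines that operate on them. The names and values are invented for illustration.

      # Minimal hierarchical, object-oriented assembly structure in Python.
      from dataclasses import dataclass, field

      @dataclass
      class Component:
          name: str
          mass_kg: float = 0.0
          children: list["Component"] = field(default_factory=list)

          def total_mass(self) -> float:
              # roll an analysis result up the assembly hierarchy
              return self.mass_kg + sum(c.total_mass() for c in self.children)

      spacecraft = Component("spacecraft", children=[
          Component("optical bench", mass_kg=40.0, children=[
              Component("isolator", mass_kg=2.5)]),
          Component("reaction wheels", mass_kg=12.0),
      ])
      print(spacecraft.total_mass())   # 54.5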

  20. Food additives

    PubMed Central

    Spencer, Michael

    1974-01-01

    Food additives are discussed from the food technology point of view. The reasons for their use are summarized: (1) to protect food from chemical and microbiological attack; (2) to even out seasonal supplies; (3) to improve their eating quality; (4) to improve their nutritional value. The various types of food additives are considered, e.g. colours, flavours, emulsifiers, bread and flour additives, preservatives, and nutritional additives. The paper concludes with consideration of those circumstances in which the use of additives is (a) justified and (b) unjustified. PMID:4467857

  1. University Macro Analytic Simulation Model.

    ERIC Educational Resources Information Center

    Baron, Robert; Gulko, Warren

    The University Macro Analytic Simulation System (UMASS) has been designed as a forecasting tool to help university administrators make budgeting decisions. Alternative budgeting strategies can be tested on a computer model and an operational alternative then selected on the basis of the most desirable projected outcome. UMASS uses readily…

  2. Food additives.

    PubMed

    Berglund, F

    1978-01-01

    The use of additives in food fulfils many purposes, as shown by the index issued by the Codex Committee on Food Additives: Acids, bases and salts; Preservatives; Antioxidants and antioxidant synergists; Anticaking agents; Colours; Emulsifiers; Thickening agents; Flour-treatment agents; Extraction solvents; Carrier solvents; Flavours (synthetic); Flavour enhancers; Non-nutritive sweeteners; Processing aids; Enzyme preparations. Many additives occur naturally in foods, but this does not exclude toxicity at higher levels. Some food additives are nutrients, or even essential nutrients, e.g. NaCl. Examples are known of food additives causing toxicity in man even when used according to regulations, e.g. cobalt in beer. In other instances, poisoning has been due to carry-over, e.g. of nitrate in cheese whey used for artificial feed for infants. Poisonings also occur when a permitted substance is added at too high a level, by accident or carelessness, e.g. nitrite in fish. Finally, there are examples of hypersensitivity to food additives, e.g. to tartrazine and other food colours. The toxicological evaluation, based on animal feeding studies, may be complicated by impurities, e.g. orthotoluene-sulfonamide in saccharin; by transformation or disappearance of the additive during food processing or storage, e.g. bisulfite in raisins; by reaction products with food constituents, e.g. formation of ethylurethane from diethyl pyrocarbonate; and by metabolic transformation products, e.g. formation in the gut of cyclohexylamine from cyclamate. Metabolic end products may differ in experimental animals and in man: guanylic acid and inosinic acid are metabolized to allantoin in the rat but to uric acid in man. The magnitude of the safety margin in man of the Acceptable Daily Intake (ADI) is not identical to the "safety factor" used when calculating the ADI. The symptoms of Chinese Restaurant Syndrome, although not hazardous, furthermore illustrate that the whole ADI

  3. Collaborative Analytical Toolbox version 1.0

    2008-08-21

    The purpose of the Collaborative Analytical Toolbox (CAT) is to provide a comprehensive, collaborative problem-solving environment that enables users to more effectively apply and improve their analytical and problem-solving capabilities. CAT is a software framework for integrating other tools and data sources. It includes a set of core services for collaboration and information exploration and analysis, and a framework that facilitates quickly integrating new ideas, techniques, and tools with existing data sources.

  4. Demonstrating Success: Web Analytics and Continuous Improvement

    ERIC Educational Resources Information Center

    Loftus, Wayne

    2012-01-01

    As free and low-cost Web analytics tools become more sophisticated, libraries' approach to user analysis can become more nuanced and precise. Tracking appropriate metrics with a well-formulated analytics program can inform design decisions, demonstrate the degree to which those decisions have succeeded, and thereby inform the next iteration in the…

  5. Potlining Additives

    SciTech Connect

    Rudolf Keller

    2004-08-10

    In this project, a concept to improve the performance of aluminum production cells by introducing potlining additives was examined and tested. Boron oxide was added to cathode blocks, and titanium was dissolved in the metal pool; this resulted in the formation of titanium diboride and caused the molten aluminum to wet the carbonaceous cathode surface. Such wetting reportedly leads to operational improvements and extended cell life. In addition, boron oxide suppresses cyanide formation. This final report presents and discusses the results of this project. Substantial economic benefits for the practical implementation of the technology are projected, especially for modern cells with graphitized blocks. For example, with an energy savings of about 5% and an increase in pot life from 1500 to 2500 days, a cost savings of $0.023 per pound of aluminum produced is projected for a 200 kA pot.

  6. Phosphazene additives

    SciTech Connect

    Harrup, Mason K; Rollins, Harry W

    2013-11-26

    An additive comprising a phosphazene compound that has at least two reactive functional groups and at least one capping functional group bonded to phosphorus atoms of the phosphazene compound. One of the at least two reactive functional groups is configured to react with cellulose, and the other is configured to react with a resin, such as an amine resin or a polycarboxylic acid resin. The at least one capping functional group is selected from the group consisting of a short chain ether group, an alkoxy group, or an aryloxy group. Also disclosed are an additive-resin admixture, a method of treating a wood product, and a wood product.

  7. Visual analytics for aviation safety: A collaborative approach to sensemaking

    NASA Astrophysics Data System (ADS)

    Wade, Andrew

    Visual analytics, the "science of analytical reasoning facilitated by interactive visual interfaces", is more than just visualization. Understanding the human reasoning process is essential for designing effective visualization tools and providing correct analyses. This thesis describes the evolution, application and evaluation of a new method for studying analytical reasoning that we have labeled paired analysis. Paired analysis combines subject matter experts (SMEs) and tool experts (TEs) in an analytic dyad, here used to investigate aircraft maintenance and safety data. The method was developed and evaluated using interviews, pilot studies and analytic sessions during an internship at the Boeing Company. By enabling a collaborative approach to sensemaking that can be captured by researchers, paired analysis yielded rich data on human analytical reasoning that can be used to support analytic tool development and analyst training. Keywords: visual analytics, paired analysis, sensemaking, Boeing, collaborative analysis.

  8. Analytical Chemistry of Nitric Oxide

    PubMed Central

    Hetrick, Evan M.

    2013-01-01

    Nitric oxide (NO) is the focus of intense research, owing primarily to its wide-ranging biological and physiological actions. A requirement for understanding its origin, activity, and regulation is the need for accurate and precise measurement techniques. Unfortunately, analytical assays for monitoring NO are challenged by NO’s unique chemical and physical properties, including its reactivity, rapid diffusion, and short half-life. Moreover, NO concentrations may span pM to µM in physiological milieu, requiring techniques with wide dynamic response ranges. Despite such challenges, many analytical techniques have emerged for the detection of NO. Herein, we review the most common spectroscopic and electrochemical methods, with special focus on the fundamentals behind each technique and approaches that have been coupled with modern analytical measurement tools or exploited to create novel NO sensors. PMID:20636069

  9. Big Data Analytics in Healthcare

    PubMed Central

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S. M. Reza; Navidi, Fatemeh; Beard, Daniel A.; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined. PMID:26229957

  10. Big Data Analytics in Healthcare.

    PubMed

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined. PMID:26229957

  11. An Analysis of Earth Science Data Analytics Use Cases

    NASA Astrophysics Data System (ADS)

    Shie, C. L.; Kempler, S. J.

    2015-12-01

    The increase in the number, volume, and sources of globally available Earth science data measurements and datasets has afforded Earth scientists and applications researchers unprecedented opportunities to study our Earth in ever more sophisticated ways. In fact, the NASA Earth Observing System Data Information System (EOSDIS) archives have doubled from 2007 to 2014, to 9.1 PB (Ramapriyan, 2009; and https://earthdata.nasa.gov/about/system-performance). In addition, other US agencies, international programs, field experiments, ground stations, and citizen scientists provide a plethora of additional sources for studying Earth. Co-analyzing huge amounts of heterogeneous data to glean non-obvious information is a daunting task. Earth science data analytics (ESDA) is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations and other useful information. It can include Data Preparation, Data Reduction, and Data Analysis. Through work associated with the Earth Science Information Partners (ESIP) Federation, a collection of Earth science data analytics use cases has been gathered and analyzed for the purpose of extracting the types of Earth science data analytics employed, and requirements for data analytics tools and techniques yet to be implemented, based on use case needs. The ESIP-generated use case template, ESDA use cases, use case types, and preliminary use case analysis (a work in progress) will be presented.
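
    As a purely illustrative reading of the three ESDA stages named above, the sketch below walks toy gridded data through Data Preparation (masking fill values), Data Reduction (temporal averaging), and Data Analysis (cross-variable correlation). All names, array shapes, and values are assumptions, not part of the use-case collection.

```python
import numpy as np

def prepare(grid, fill_value=-9999.0):
    """Data Preparation: mask fill values in a gridded measurement."""
    grid = np.asarray(grid, dtype=float)
    return np.where(grid == fill_value, np.nan, grid)

def reduce_monthly(daily_stack):
    """Data Reduction: collapse a (day, lat, lon) stack to a mean map."""
    return np.nanmean(daily_stack, axis=0)

def analyze(map_a, map_b):
    """Data Analysis: correlate two co-gridded variables to expose a pattern."""
    a, b = map_a.ravel(), map_b.ravel()
    ok = ~np.isnan(a) & ~np.isnan(b)
    return np.corrcoef(a[ok], b[ok])[0, 1]

# Toy data standing in for two heterogeneous satellite products.
rng = np.random.default_rng(0)
var1 = rng.normal(size=(30, 18, 36))               # e.g., one gridded product
var2 = 0.5 * var1 + rng.normal(size=var1.shape)    # a correlated second product
var1[0, 0, 0] = -9999.0                            # one fill value to be masked

monthly1 = reduce_monthly(prepare(var1))
monthly2 = reduce_monthly(prepare(var2))
print(f"spatial correlation: {analyze(monthly1, monthly2):.2f}")
```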

  12. An Analysis of Earth Science Data Analytics Use Cases

    NASA Technical Reports Server (NTRS)

    Shie, Chung-Lin; Kempler, Steve

    2014-01-01

    The increase in the number, volume, and sources of globally available Earth science data measurements and datasets has afforded Earth scientists and applications researchers unprecedented opportunities to study our Earth in ever more sophisticated ways. In fact, the NASA Earth Observing System Data Information System (EOSDIS) archives have doubled from 2007 to 2014, to 9.1 PB (Ramapriyan, 2009; and https://earthdata.nasa.gov/about/system-performance). In addition, other US agencies, international programs, field experiments, ground stations, and citizen scientists provide a plethora of additional sources for studying Earth. Co-analyzing huge amounts of heterogeneous data to glean non-obvious information is a daunting task. Earth science data analytics (ESDA) is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations and other useful information. It can include Data Preparation, Data Reduction, and Data Analysis. Through work associated with the Earth Science Information Partners (ESIP) Federation, a collection of Earth science data analytics use cases has been gathered and analyzed for the purpose of extracting the types of Earth science data analytics employed, and requirements for data analytics tools and techniques yet to be implemented, based on use case needs. The ESIP-generated use case template, ESDA use cases, use case types, and preliminary use case analysis (a work in progress) will be presented.

  13. The Challenge of Big Data in Public Health: An Opportunity for Visual Analytics

    PubMed Central

    Ola, Oluwakemi; Sedig, Kamran

    2014-01-01

    Public health (PH) data can generally be characterized as big data. The efficient and effective use of this data determines the extent to which PH stakeholders can sufficiently address societal health concerns as they engage in a variety of work activities. As stakeholders interact with data, they engage in various cognitive activities such as analytical reasoning, decision-making, interpreting, and problem solving. Performing these activities with big data is a challenge for the unaided mind as stakeholders encounter obstacles relating to the data’s volume, variety, velocity, and veracity. Such being the case, computer-based information tools are needed to support PH stakeholders. Unfortunately, while existing computational tools are beneficial in addressing certain work activities, they fall short in supporting cognitive activities that involve working with large, heterogeneous, and complex bodies of data. This paper presents visual analytics (VA) tools, a nascent category of computational tools that integrate data analytics with interactive visualizations, to facilitate the performance of cognitive activities involving big data. Historically, PH has lagged behind other sectors in embracing new computational technology. In this paper, we discuss the role that VA tools can play in addressing the challenges presented by big data. In doing so, we demonstrate the potential benefit of incorporating VA tools into PH practice, in addition to highlighting the need for further systematic and focused research. PMID:24678376

  14. Let's Talk... Analytics

    ERIC Educational Resources Information Center

    Oblinger, Diana G.

    2012-01-01

    Talk about analytics seems to be everywhere. Everyone is talking about analytics. Yet even with all the talk, many in higher education have questions about--and objections to--using analytics in colleges and universities. In this article, the author explores the use of analytics in, and all around, higher education. (Contains 1 note.)

  15. Analytics for Education

    ERIC Educational Resources Information Center

    MacNeill, Sheila; Campbell, Lorna M.; Hawksey, Martin

    2014-01-01

    This article presents an overview of the development and use of analytics in the context of education. Using Buckingham Shum's three levels of analytics, the authors present a critical analysis of current developments in the domain of learning analytics, and contrast the potential value of analytics research and development with real world…

  16. The transfer of analytical procedures.

    PubMed

    Ermer, J; Limberger, M; Lis, K; Wätzig, H

    2013-11-01

    Analytical method transfers are certainly among the most discussed topics in the GMP regulated sector. However, they are surprisingly little regulated in detail. General information is provided by USP, WHO, and ISPE in particular. Most recently, the EU emphasized the importance of analytical transfer by including it in their draft of the revised GMP Guideline. In this article, an overview and comparison of these guidelines is provided. The key to success for method transfers is excellent communication between the sending and receiving units. In order to facilitate this communication, procedures, flow charts and checklists for responsibilities, success factors, transfer categories, the transfer plan and report, strategies in case of failed transfers, and tables with acceptance limits are provided here, together with a comprehensive glossary. Potential pitfalls are described such that they can be avoided. In order to assure an efficient and sustainable transfer of analytical procedures, a practically relevant and scientifically sound evaluation with corresponding acceptance criteria is crucial. Various strategies and statistical tools such as significance tests, absolute acceptance criteria, and equivalence tests are thoroughly described and compared in detail, giving examples. Significance tests should be avoided. The success criterion is not statistical significance, but rather analytical relevance. Depending on a risk assessment of the analytical procedure in question, statistical equivalence tests are recommended, because they include both a practically relevant acceptance limit and a direct control of the statistical risks. However, for lower-risk procedures, a simple comparison of the transfer performance parameters to absolute limits is also regarded as sufficient. PMID:23978903
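
    To make the recommended equivalence-testing idea concrete, here is a minimal sketch (not from the paper) of a two-one-sided-tests (TOST) check in its confidence-interval form: the transfer passes if the 90% confidence interval for the inter-laboratory bias lies entirely within a pre-defined acceptance limit. The data, the ±2% limit, and the equal-variance assumption are all illustrative.

```python
import numpy as np
from scipy import stats

def tost_equivalence(sending, receiving, delta, alpha=0.05):
    """TOST via confidence-interval inclusion: the units are equivalent if the
    (1 - 2*alpha) CI for the mean difference lies within [-delta, +delta]."""
    x, y = np.asarray(sending, float), np.asarray(receiving, float)
    diff = y.mean() - x.mean()
    # Pooled standard error of the difference (equal-variance assumption).
    nx, ny = len(x), len(y)
    sp2 = ((nx - 1) * x.var(ddof=1) + (ny - 1) * y.var(ddof=1)) / (nx + ny - 2)
    se = np.sqrt(sp2 * (1 / nx + 1 / ny))
    t_crit = stats.t.ppf(1 - alpha, nx + ny - 2)
    lo, hi = diff - t_crit * se, diff + t_crit * se
    return (lo, hi), (lo > -delta and hi < delta)

# Hypothetical assay recoveries (%) from sending and receiving laboratories,
# with an acceptance limit of +/- 2% chosen from a risk assessment.
sending = [99.1, 100.2, 99.8, 100.5, 99.6, 100.0]
receiving = [99.5, 100.8, 100.1, 99.9, 100.4, 100.6]
ci, equivalent = tost_equivalence(sending, receiving, delta=2.0)
print(f"90% CI for bias: ({ci[0]:.2f}, {ci[1]:.2f}) -> equivalent: {equivalent}")
```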

  17. Nutritional Lipidomics: Molecular Metabolism, Analytics, and Diagnostics

    PubMed Central

    Smilowitz, Jennifer T.; Zivkovic, Angela M.; Wan, Yu-Jui Yvonne; Watkins, Steve M.; Nording, Malin L.; Hammock, Bruce D.; German, J. Bruce

    2013-01-01

    The field of lipidomics is providing nutritional science a more comprehensive view of lipid intermediates. Lipidomics research takes advantage of the increase in accuracy and sensitivity of mass detection of mass spectrometry with new bioinformatics toolsets to characterize the structures and abundances of complex lipids. Yet, translating lipidomics to practice via nutritional interventions is still in its infancy. No single instrumentation platform is able to solve the varying analytical challenges of the different molecular lipid species. Biochemical pathways of lipid metabolism remain incomplete and the tools to map lipid compositional data to pathways are still being assembled. Biology itself is dauntingly complex and simply separating biological structures remains a key challenge to lipidomics. Nonetheless, the strategy of combining tandem analytical methods to perform the sensitive, high-throughput, quantitative and comprehensive analysis of lipid metabolites of very large numbers of molecules is poised to drive the field forward rapidly. Among the next steps for nutrition to understand the changes in structures, compositions and function of lipid biomolecules in response to diet is to describe their distribution within discrete functional compartments: lipoproteins. Additionally, lipidomics must tackle the task of assigning the functions of lipids as signaling molecules, nutrient sensors, and intermediates of metabolic pathways. PMID:23818328

  18. A review of the analytical simulation of aircraft crash dynamics

    NASA Technical Reports Server (NTRS)

    Fasanella, Edwin L.; Carden, Huey D.; Boitnott, Richard L.; Hayduk, Robert J.

    1990-01-01

    A large number of full-scale tests of general aviation aircraft, helicopters, and one unique air-to-ground controlled impact of a transport aircraft were performed. Additionally, research was conducted on seat dynamic performance, load-limiting seats, load-limiting subfloor designs, and emergency locator transmitters (ELTs). Computer programs were developed to provide designers with methods for predicting accelerations, velocities, and displacements of collapsing structure and for estimating the human response to crash loads. The results of full-scale aircraft and component tests were used to verify and guide the development of analytical simulation tools and to demonstrate impact-load-attenuating concepts. Analytical simulation of metal and composite aircraft crash dynamics is addressed. Finite element models are examined to determine their degree of corroboration by experimental data and to reveal deficiencies requiring further development.

  19. Analytical sensor redundancy assessment

    NASA Technical Reports Server (NTRS)

    Mulcare, D. B.; Downing, L. E.; Smith, M. K.

    1988-01-01

    The rationale and mechanization of sensor fault tolerance based on analytical redundancy principles are described. The concept involves the substitution of software procedures, such as an observer algorithm, to supplant additional hardware components. The observer synthesizes values of sensor states in lieu of their direct measurement. Such information can then be used, for example, to determine which of two disagreeing sensors is more correct, thus enhancing sensor fault survivability. Here a stability augmentation system is used as an example application, with required modifications being made to a quadruplex digital flight control system. The impact on software structure and the resultant revalidation effort are illustrated as well. Also, the use of an observer algorithm for wind gust filtering of the angle-of-attack sensor signal is presented.
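
    A minimal sketch of the observer idea described above, with an invented two-state plant and gain matrix (the actual flight-control system is quadruplex and far more involved): a Luenberger observer synthesizes the monitored state, and the synthesized value arbitrates between two disagreeing sensors.

```python
import numpy as np

A = np.array([[1.0, 0.1], [0.0, 0.95]])   # hypothetical plant dynamics
B = np.array([[0.0], [0.1]])
C = np.array([[1.0, 0.0]])                # the monitored sensor measures x[0]
L = np.array([[0.5], [0.3]])              # assumed observer gain (stabilizing)

def observer_step(x_hat, u, y_meas):
    """One observer update: predict, then correct with the innovation."""
    innovation = y_meas - (C @ x_hat).item()
    return A @ x_hat + B * u + L * innovation

x_true = np.array([[0.0], [1.0]])
x_hat = np.zeros((2, 1))
for k in range(50):
    u = 1.0
    y = (C @ x_true).item()     # driven by the remaining healthy measurements
    x_hat = observer_step(x_hat, u, y)
    x_true = A @ x_true + B * u

# Two disagreeing sensors: the observer's synthesized value breaks the tie.
sensor_a = (C @ x_true).item() + 0.02     # small error: likely the good sensor
sensor_b = (C @ x_true).item() + 0.80     # large error: likely the failed one
estimate = (C @ x_hat).item()
better = "A" if abs(sensor_a - estimate) < abs(sensor_b - estimate) else "B"
print(f"observer estimate {estimate:.3f}; sensor {better} judged more correct")
```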

  20. Multimedia Analysis plus Visual Analytics = Multimedia Analytics

    SciTech Connect

    Chinchor, Nancy; Thomas, James J.; Wong, Pak C.; Christel, Michael; Ribarsky, Martin W.

    2010-10-01

    Multimedia analysis has focused on images, video, and to some extent audio and has made progress in single channels excluding text. Visual analytics has focused on the user interaction with data during the analytic process plus the fundamental mathematics and has continued to treat text as did its precursor, information visualization. The general problem we address in this tutorial is the combining of multimedia analysis and visual analytics to deal with multimedia information gathered from different sources, with different goals or objectives, and containing all media types and combinations in common usage.

  1. Shot Planning and Analysis Tools

    SciTech Connect

    Casey, A; Beeler, R; Conder, A; Fallejo, R; Flegel, M; Hutton, M; Jancaitis, K; Lakamsani, V; Potter, D; Reisdorf, S; Tappero, J; Whitman, P; Carr, W; Liao, Z

    2011-07-25

    Shot planning and analysis tools (SPLAT) integrate components necessary to help achieve a high overall operational efficiency of the National Ignition Facility (NIF) by combining near- and long-term shot planning, final optics demand and supply loops, target diagnostics planning, and target fabrication requirements. Currently, the SPLAT project is comprised of two primary tool suites for shot planning and optics demand. The shot planning component provides a web-based interface for selecting and building a sequence of proposed shots for the NIF. These shot sequences, or 'lanes' as they are referred to by shot planners, provide for planning both near-term shots in the Facility and long-term 'campaigns' in the months and years to come. The shot planning capabilities integrate with the Configuration Management Tool (CMT) for experiment details and the NIF calendar for availability. Future enhancements will additionally integrate with target diagnostics planning and target fabrication requirements tools. The optics demand component is built upon predictive modelling of maintenance requirements on the final optics as a result of the proposed shots assembled during shot planning. The predictive models integrate energetics from a Laser Performance Operations Model (LPOM), the status of the deployed optics as provided by the online Final Optics Inspection system, and physics-based mathematical 'rules' that predict optic flaw growth and new flaw initiations. These models are then run on an analytical cluster comprised of forty-eight Linux-based compute nodes. Results from the predictive models are used to produce decision-support reports in the areas of optics inspection planning, optics maintenance exchanges, and optics beam blocker placement advisories. Over time, the SPLAT project will evolve to provide a variety of decision-support and operation optimization tools.
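
    As a purely hypothetical illustration of such a 'rule', the sketch below grows a single flaw exponentially with above-threshold shot fluence and flags an exchange decision; the functional form and every constant are invented, not the NIF models.

```python
import math

def grow_flaw(size_um, shot_fluences, alpha=0.35, threshold=4.0):
    """Propagate a single optic flaw size through a planned shot sequence.

    size_um: current flaw diameter in micrometers
    shot_fluences: planned per-shot fluence on the optic, J/cm^2
    """
    for fluence in shot_fluences:
        overdrive = max(0.0, fluence - threshold)  # growth only above threshold
        size_um *= math.exp(alpha * overdrive)
    return size_um

planned_lane = [3.5, 5.0, 6.2, 4.8]   # hypothetical fluences for one shot lane
final_size = grow_flaw(10.0, planned_lane)
print(f"predicted flaw size after lane: {final_size:.1f} um")
if final_size > 100.0:                # hypothetical optic-exchange limit
    print("decision support: exchange or block this optic before the lane")
```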

  2. Analytical Challenges in Biotechnology.

    ERIC Educational Resources Information Center

    Glajch, Joseph L.

    1986-01-01

    Highlights five major analytical areas (electrophoresis, immunoassay, chromatographic separations, protein and DNA sequencing, and molecular structures determination) and discusses how analytical chemistry could further improve these techniques and thereby have a major impact on biotechnology. (JN)

  3. Analytics for Metabolic Engineering

    PubMed Central

    Petzold, Christopher J.; Chan, Leanne Jade G.; Nhan, Melissa; Adams, Paul D.

    2015-01-01

    Realizing the promise of metabolic engineering has been slowed by challenges related to moving beyond proof-of-concept examples to robust and economically viable systems. Key to advancing metabolic engineering beyond trial-and-error research is access to parts with well-defined performance metrics that can be readily applied in vastly different contexts with predictable effects. As the field now stands, research depends greatly on analytical tools that assay target molecules, transcripts, proteins, and metabolites across different hosts and pathways. Screening technologies yield specific information for many thousands of strain variants, while deep omics analysis provides a systems-level view of the cell factory. Efforts focused on a combination of these analyses yield quantitative information of dynamic processes between parts and the host chassis that drive the next engineering steps. Overall, the data generated from these types of assays aid better decision-making at the design and strain construction stages to speed progress in metabolic engineering research. PMID:26442249

  4. Deriving Earth Science Data Analytics Requirements

    NASA Technical Reports Server (NTRS)

    Kempler, Steven J.

    2015-01-01

    Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, goals which are very different from those in business and which require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to be able to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  5. Analyticity without Differentiability

    ERIC Educational Resources Information Center

    Kirillova, Evgenia; Spindler, Karlheinz

    2008-01-01

    In this article we derive all salient properties of analytic functions, including the analytic version of the inverse function theorem, using only the most elementary convergence properties of series. Not even the notion of differentiability is required to do so. Instead, analytical arguments are replaced by combinatorial arguments exhibiting…
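
    For readers who want the starting point in symbols (standard material, not quoted from the article): a function is analytic at x₀ precisely when it is the sum of a convergent power series there, with radius of convergence given by the Cauchy-Hadamard formula.

```latex
\[
  f(x) \;=\; \sum_{n=0}^{\infty} a_n (x - x_0)^n
  \quad \text{for } |x - x_0| < R,
  \qquad
  \frac{1}{R} \;=\; \limsup_{n \to \infty} \sqrt[n]{|a_n|}.
\]
```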

  6. SNL software manual for the ACS Data Analytics Project.

    SciTech Connect

    Stearley, Jon R.; McLendon, William Clarence, III; Rodrigues, Arun F.; Williams, Aaron S.; Hooper, Russell Warren; Robinson, David Gerald; Stickland, Michael G.

    2011-10-01

    In the ACS Data Analytics Project (also known as 'YumYum'), a supercomputer is modeled as a graph of components and dependencies, jobs and faults are simulated, and component fault rates are estimated using the graph structure and job pass/fail outcomes. This report documents the successful completion of all SNL deliverables and tasks, describes the software written by SNL for the project, and presents the data it generates. Readers should understand what the software tools are, how they fit together, and how to use them to reproduce the presented data and additional experiments as desired. The SNL YumYum tools provide the novel simulation and inference capabilities desired by ACS. SNL also developed and implemented a new algorithm, which provides faster estimates, at finer component granularity, on arbitrary directed acyclic graphs.
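
    A toy sketch of the inference task described above; the data, component names, and the simple failure-fraction scoring are illustrative, while SNL's algorithm operates on the full dependency graph and is more sophisticated.

```python
from collections import defaultdict

# Each job depends on a set of components (nodes in the dependency graph) and
# either passes or fails; score each component by the failure fraction of the
# jobs that touched it, as a naive stand-in for the graph-based estimator.
jobs = [  # (components the job depended on, job passed?)
    ({"node1", "switch_a"}, True),
    ({"node2", "switch_a"}, False),
    ({"node2", "switch_b"}, False),
    ({"node1", "switch_b"}, True),
    ({"node2", "switch_a"}, False),
]

used = defaultdict(int)
failed = defaultdict(int)
for components, passed in jobs:
    for c in components:
        used[c] += 1
        if not passed:
            failed[c] += 1

for c in sorted(used):
    print(f"{c}: estimated fault rate {failed[c] / used[c]:.2f} "
          f"({failed[c]}/{used[c]} jobs failed)")
```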

  7. Immediate tool incorporation processes determine human motor planning with tools.

    PubMed

    Ganesh, G; Yoshioka, T; Osu, R; Ikegami, T

    2014-01-01

    Human dexterity with tools is believed to stem from our ability to incorporate and use tools as parts of our body. However, tool incorporation, evident as extensions in our body representation and peri-personal space, has been observed predominantly after extended tool exposure and does not explain our immediate motor behaviours when we change tools. Here we utilize two novel experiments to elucidate the presence of additional immediate tool incorporation effects that determine motor planning with tools. Interestingly, tools were observed to immediately induce a trial-by-trial, tool-length-dependent shortening of the perceived limb lengths, opposite to the elongations observed after extended tool use. Our results thus show that tools induce a dual effect on our body representation: an immediate shortening that critically affects motor planning with a new tool, and a slow elongation, probably a consequence of skill-related changes in sensory-motor mappings with repeated use of the tool. PMID:25077612

  8. Optimization of reversed-phase chromatography methods for peptide analytics.

    PubMed

    Khalaf, Rushd; Baur, Daniel; Pfister, David

    2015-12-18

    The analytical description and quantification of peptide solutions is an essential part of the quality control of peptide production processes and of peptide mapping techniques. Traditionally, an important tool is analytical reversed-phase liquid chromatography. In this work, we develop a model-based tool to find optimal analytical conditions in a clear, efficient and robust manner. The model, based on the Van't Hoff equation, the linear solvent strength correlation, and an analytical solution of the mass balance on a chromatographic column describing peptide retention in gradient conditions, is used to optimize the analytical-scale separation between components in a peptide mixture. The proposed tool is then applied to the design of analytical reversed-phase liquid chromatography methods for five different peptide mixtures. PMID:26620597
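
    For reference, the two standard correlations the model is stated to build on take the following textbook forms (the paper's exact parameterization may differ): the linear solvent strength (LSS) relation, where k is the retention factor, φ the organic modifier fraction, k_w the retention factor extrapolated to pure water, and S the solvent strength parameter; and the Van't Hoff equation, where β is the column phase ratio.

```latex
\[
  \ln k \;=\; \ln k_w \;-\; S\,\varphi,
  \qquad
  \ln k \;=\; -\frac{\Delta H^{\circ}}{RT} \;+\; \frac{\Delta S^{\circ}}{R}
              \;+\; \ln \beta .
\]
```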

  9. ARVO-CL: The OpenCL version of the ARVO package — An efficient tool for computing the accessible surface area and the excluded volume of proteins via analytical equations

    NASA Astrophysics Data System (ADS)

    Buša, Ján; Hayryan, Shura; Wu, Ming-Chya; Buša, Ján; Hu, Chin-Kun

    2012-11-01

    The introduction of Graphics Processing Units (GPUs) and GPU computing in recent years has opened possibilities for simple parallelization of programs. In this update, we present the modernized version of the program ARVO [J. Buša, J. Dzurina, E. Hayryan, S. Hayryan, C.-K. Hu, J. Plavka, I. Pokorný, J. Skivánek, M.-C. Wu, Comput. Phys. Comm. 165 (2005) 59]. The whole package has been rewritten in the C language and parallelized using OpenCL. Some new tricks have been added to the algorithm in order to save memory, which is much needed for efficient usage of graphics cards. A new tool called ‘input_structure’ was added for conversion of pdb files into files suitable for work with the C and OpenCL version of ARVO.

  10. Java Tool Retirement

    Atmospheric Science Data Center

    2014-05-15

    The ASDC Java Order Tool was officially retired on Wednesday, May 14, 2014. The HTML Order Tool and additional options are available from our Order Data Page. Please update all bookmarks.

  11. Lynx: a knowledge base and an analytical workbench for integrative medicine.

    PubMed

    Sulakhe, Dinanath; Xie, Bingqing; Taylor, Andrew; D'Souza, Mark; Balasubramanian, Sandhya; Hashemifar, Somaye; White, Steven; Dave, Utpal J; Agam, Gady; Xu, Jinbo; Wang, Sheng; Gilliam, T Conrad; Maltsev, Natalia

    2016-01-01

    Lynx (http://lynx.ci.uchicago.edu) is a web-based database and a knowledge extraction engine. It supports annotation and analysis of high-throughput experimental data and generation of weighted hypotheses regarding genes and molecular mechanisms contributing to human phenotypes or conditions of interest. Since the last release, the Lynx knowledge base (LynxKB) has been periodically updated with the latest versions of the existing databases and supplemented with additional information from public databases. These additions have enriched the data annotations provided by Lynx and improved the performance of Lynx analytical tools. Moreover, the Lynx analytical workbench has been supplemented with new tools for reconstruction of co-expression networks and feature-and-network-based prioritization of genetic factors and molecular mechanisms. These developments facilitate the extraction of meaningful knowledge from experimental data and LynxKB. The Service Oriented Architecture provides public access to LynxKB and its analytical tools via user-friendly web services and interfaces. PMID:26590263

  12. Science Update: Analytical Chemistry.

    ERIC Educational Resources Information Center

    Worthy, Ward

    1980-01-01

    Briefly discusses new instrumentation in the field of analytical chemistry. Advances in liquid chromatography, photoacoustic spectroscopy, the use of lasers, and mass spectrometry are also discussed. (CS)

  13. Analytical Chemistry in Industry.

    ERIC Educational Resources Information Center

    Kaiser, Mary A.; Ullman, Alan H.

    1988-01-01

    Clarifies the roles of a practicing analytical chemist in industry: quality control, methods and technique development, troubleshooting, research, and chemical analysis. Lists criteria for success in industry. (ML)

  14. Graphical Contingency Analysis Tool

    SciTech Connect

    2010-03-02

    GCA is a visual analytic tool for power grid contingency analysis that provides decision support for power grid operations. GCA allows power grid operators to quickly gain situational awareness of the power grid by converting large amounts of operational data to the graphic domain with a color-contoured map; identify system trends and foresee and discern emergencies by performing trending analysis; identify the relationships between system configurations and affected assets by conducting clustering analysis; and identify the best action by interactively evaluating candidate actions.

  15. Analytical Aspects of the Implementation of Biomarkers in Clinical Transplantation.

    PubMed

    Shipkova, Maria; López, Olga Millán; Picard, Nicolas; Noceti, Ofelia; Sommerer, Claudia; Christians, Uwe; Wieland, Eberhard

    2016-04-01

    In response to the urgent need for new reliable biomarkers to complement the guidance of the immunosuppressive therapy, a huge number of biomarker candidates to be implemented in clinical practice have been introduced to the transplant community. This includes a diverse range of molecules with very different molecular weights, chemical and physical properties, ex vivo stabilities, in vivo kinetic behaviors, and levels of similarity to other molecules, etc. In addition, a large body of different analytical techniques and assay protocols can be used to measure biomarkers. Sometimes, a complex software-based data evaluation is a prerequisite for appropriate interpretation of the results and for their reporting. Although some analytical procedures are of great value for research purposes, they may be too complex for implementation in a clinical setting. Whereas the proof of "fitness for purpose" is appropriate for validation of biomarker assays used in exploratory drug development studies, a higher level of analytical validation must be achieved and eventually advanced analytical performance might be necessary before diagnostic application in transplantation medicine. A high level of consistency of results between laboratories and between methods (if applicable) should be obtained and maintained to make biomarkers effective instruments in support of therapeutic decisions. This overview focuses on preanalytical and analytical aspects to be considered for the implementation of new biomarkers for adjusting immunosuppression in a clinical setting and highlights critical points to be addressed on the way to make them suitable as diagnostic tools. These include but are not limited to appropriate method validation, standardization, education, automation, and commercialization. PMID:26418704

  16. Standardization guide for construction and use of MORT-type analytic trees. Revision 1

    SciTech Connect

    Buys, J.R.

    1992-02-01

    Since the introduction of MORT (Management Oversight and Risk Tree) technology as a tool for evaluating the success or failure of safety management systems, there has been a proliferation of analytic trees throughout US Department of Energy (DOE) and its contractor organizations. Standard "fault tree" symbols have generally been used in logic diagram or tree construction, but new or revised symbols have also been adopted by various analysts. Additionally, a variety of numbering systems have been used for event identification. The consequent lack of standardization has caused some difficulties in interpreting the trees and following their logic. This guide seeks to correct this problem by providing a standardized system for construction and use of analytic trees. Future publications of the DOE System Safety Development Center (SSDC) will adhere to this guide. It is recommended that other DOE organizations and contractors also adopt this system to achieve intra-DOE uniformity in analytic tree construction.

  17. Standardization guide for construction and use of MORT-type analytic trees

    SciTech Connect

    Buys, J.R.

    1992-02-01

    Since the introduction of MORT (Management Oversight and Risk Tree) technology as a tool for evaluating the success or failure of safety management systems, there has been a proliferation of analytic trees throughout US Department of Energy (DOE) and its contractor organizations. Standard "fault tree" symbols have generally been used in logic diagram or tree construction, but new or revised symbols have also been adopted by various analysts. Additionally, a variety of numbering systems have been used for event identification. The consequent lack of standardization has caused some difficulties in interpreting the trees and following their logic. This guide seeks to correct this problem by providing a standardized system for construction and use of analytic trees. Future publications of the DOE System Safety Development Center (SSDC) will adhere to this guide. It is recommended that other DOE organizations and contractors also adopt this system to achieve intra-DOE uniformity in analytic tree construction.
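
    Although the guide itself concerns symbols and numbering conventions, the underlying structure is a logic tree of AND/OR gates over basic events. The sketch below is a generic, invented illustration of evaluating such a tree; it is not DOE's notation or software.

```python
def evaluate(node, events):
    """Evaluate a MORT-style logic tree against a set of occurred events."""
    kind = node[0]
    if kind == "event":
        return node[1] in events
    children = node[1]
    if kind == "and":
        return all(evaluate(c, events) for c in children)
    if kind == "or":
        return any(evaluate(c, events) for c in children)
    raise ValueError(f"unknown node type: {kind}")

# Top event: mishap occurs if a hazard exists AND any barrier failed.
tree = ("and", [
    ("event", "hazard_present"),
    ("or", [("event", "barrier_1_failed"), ("event", "barrier_2_failed")]),
])
print(evaluate(tree, {"hazard_present", "barrier_2_failed"}))  # True
print(evaluate(tree, {"hazard_present"}))                      # False
```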

  18. Shifting tools

    SciTech Connect

    Fisher, E.P.; Welch, W.R.

    1984-03-13

    An improved shifting tool connectable in a well tool string and useful to engage and position a slidable sleeve in a sliding sleeve device in a well flow conductor. The selectively profiled shifting tool keys provide better fit with and more contact area between keys and slidable sleeves. When the engaged slidable sleeve cannot be moved up and the shifting tool is not automatically disengaged, emergency disengagement means may be utilized by applying upward force to the shifting tool sufficient to shear pins and cause all keys to be cammed inwardly at both ends to completely disengage for removal of the shifting tool from the sliding sleeve device.

  19. Process Analytical Chemistry.

    ERIC Educational Resources Information Center

    Callis, James B.; And Others

    1987-01-01

    Discusses process analytical chemistry as a discipline designed to supply quantitative and qualitative information about a chemical process. Encourages academic institutions to examine this field for employment opportunities for students. Describes the five areas of process analytical chemistry, including off-line, at-line, on-line, in-line, and…

  20. Extreme Scale Visual Analytics

    SciTech Connect

    Wong, Pak C.; Shen, Han-Wei; Pascucci, Valerio

    2012-05-08

    Extreme-scale visual analytics (VA) is about applying VA to extreme-scale data. The articles in this special issue examine advances related to extreme-scale VA problems, their analytical and computational challenges, and their real-world applications.

  1. Learning Analytics Considered Harmful

    ERIC Educational Resources Information Center

    Dringus, Laurie P.

    2012-01-01

    This essay is written to present a prospective stance on how learning analytics, as a core evaluative approach, must help instructors uncover the important trends and evidence of quality learner data in the online course. A critique is presented of strategic and tactical issues of learning analytics. The approach to the critique is taken through…

  2. Analytical mass spectrometry

    SciTech Connect

    Not Available

    1990-01-01

    This 43rd Annual Summer Symposium on Analytical Chemistry was held July 24--27, 1990 at Oak Ridge, TN and contained sessions on the following topics: Fundamentals of Analytical Mass Spectrometry (MS), MS in the National Laboratories, Lasers and Fourier Transform Methods, Future of MS, New Ionization and LC/MS Methods, and an extra session. (WET)

  3. Analytical mass spectrometry. Abstracts

    SciTech Connect

    Not Available

    1990-12-31

    This 43rd Annual Summer Symposium on Analytical Chemistry was held July 24--27, 1990 at Oak Ridge, TN and contained sessions on the following topics: Fundamentals of Analytical Mass Spectrometry (MS), MS in the National Laboratories, Lasers and Fourier Transform Methods, Future of MS, New Ionization and LC/MS Methods, and an extra session. (WET)

  4. Validating Analytical Methods

    ERIC Educational Resources Information Center

    Ember, Lois R.

    1977-01-01

    The procedures utilized by the Association of Official Analytical Chemists (AOAC) to develop, evaluate, and validate analytical methods for the analysis of chemical pollutants are detailed. Methods validated by AOAC are used by the EPA and FDA in their enforcement programs and are granted preferential treatment by the courts. (BT)

  5. Teaching the Analytical Life

    ERIC Educational Resources Information Center

    Jackson, Brian

    2010-01-01

    Using a survey of 138 writing programs, I argue that we must be more explicit about what we think students should get out of analysis to make it more likely that students will transfer their analytical skills to different settings. To ensure our students take analytical skills with them at the end of the semester, we must simplify the task we…

  6. Signals: Applying Academic Analytics

    ERIC Educational Resources Information Center

    Arnold, Kimberly E.

    2010-01-01

    Academic analytics helps address the public's desire for institutional accountability with regard to student success, given the widespread concern over the cost of higher education and the difficult economic and budgetary conditions prevailing worldwide. Purdue University's Signals project applies the principles of analytics widely used in…

  7. Analytical aspects of hydrogen exchange mass spectrometry

    PubMed Central

    Engen, John R.; Wales, Thomas E.

    2016-01-01

    The analytical aspects of measuring hydrogen exchange by mass spectrometry are reviewed. The nature of analytical selectivity in hydrogen exchange is described followed by review of the analytical tools required to accomplish fragmentation, separation, and the mass spectrometry measurements under restrictive exchange quench conditions. In contrast to analytical quantitation that relies on measurements of peak intensity or area, quantitation in hydrogen exchange mass spectrometry depends on measuring a mass change with respect to an undeuterated or deuterated control, resulting in a value between zero and the maximum amount of deuterium that could be incorporated. Reliable quantitation is a function of experimental fidelity and to achieve high measurement reproducibility, a large number of experimental variables must be controlled during sample preparation and analysis. The method also reports on important qualitative aspects of the sample, including conformational heterogeneity and population dynamics. PMID:26048552
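
    Concretely, the normalized quantitation described above is usually written as follows (standard practice, not quoted from the review), where m_t is the measured mass at exchange time t, m_0% and m_100% are the undeuterated and maximally deuterated controls, and N is the number of exchangeable amide hydrogens.

```latex
\[
  D \;=\; N \cdot \frac{m_t - m_{0\%}}{m_{100\%} - m_{0\%}},
  \qquad 0 \;\le\; D \;\le\; N .
\]
```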

  8. OOTW Force Design Tools

    SciTech Connect

    Bell, R.E.; Hartley, D.S.III; Packard, S.L.

    1999-05-01

    This report documents refined requirements for tools to aid the process of force design in Operations Other Than War (OOTWs). It recommends actions for the creation of one tool and work on other tools relating to mission planning. It also identifies the governmental agencies and commands with interests in each tool, from whom should come the user advisory groups overseeing the respective tool development activities. The understanding of OOTWs and their analytical support requirements has matured to the point where action can be taken in three areas: force design, collaborative analysis, and impact analysis. While the nature of the action and the length of time before complete results can be expected depends on the area, in each case the action should begin immediately. Force design for OOTWs is not a technically difficult process. Like force design for combat operations, it is a process of matching the capabilities of forces against the specified and implied tasks of the operation, considering the constraints of logistics, transport and force availabilities. However, there is a critical difference that restricts the usefulness of combat force design tools for OOTWs: the combat tools are built to infer non-combat capability requirements from combat capability requirements and cannot reverse the direction of the inference, as is required for OOTWs. Recently, OOTWs have played a larger role in force assessment, system effectiveness and tradeoff analysis, and concept and doctrine development and analysis. In the first Quadrennial Defense Review (QDR), each of the Services created its own OOTW force design tool. Unfortunately, the tools address different parts of the problem and do not coordinate the use of competing capabilities. These tools satisfied the immediate requirements of the QDR, but do not provide a long-term cost-effective solution.

  9. Quo vadis, analytical chemistry?

    PubMed

    Valcárcel, Miguel

    2016-01-01

    This paper presents an open, personal, fresh approach to the future of Analytical Chemistry in the context of the deep changes Science and Technology are anticipated to experience. Its main aim is to challenge young analytical chemists, because the future of our scientific discipline is in their hands. A description of not completely accurate overall conceptions of our discipline, both past and present, to be avoided is followed by a flexible, integral definition of Analytical Chemistry and its cornerstones (viz., aims and objectives, quality trade-offs, the third basic analytical reference, the information hierarchy, social responsibility, independent research, transfer of knowledge and technology, interfaces to other scientific-technical disciplines, and well-oriented education). Obsolete paradigms, and the more accurate general and specific paradigms that can be expected to provide the framework for our discipline in the coming years, are described. Finally, the three possible responses of analytical chemists to the proposed changes in our discipline are discussed. PMID:26631024

  10. Advances in analytical technologies for environmental protection and public safety.

    PubMed

    Sadik, O A; Wanekaya, A K; Andreescu, S

    2004-06-01

    Due to the increased threat of chemical and biological attacks by terrorist organizations, a significant effort is underway to develop tools that can be used to detect and effectively combat chemical and biochemical toxins. In addition to the right mix of policies and training of medical personnel on how to recognize symptoms of biochemical warfare agents, the major success in combating terrorism still lies in prevention, early detection, and efficient and timely response using reliable analytical technologies and powerful therapies for minimizing the effects in the event of an attack. The public and regulatory agencies expect reliable methodologies and devices for public security. Today's systems are too bulky or slow to meet the "detect-to-warn" needs of first responders such as soldiers and medical personnel. This paper presents the challenges in monitoring technologies for warfare agents and other toxins. It provides an overview of how advances in environmental analytical methodologies could be adapted to design reliable sensors for public safety and environmental surveillance. The paths to designing sensors that meet the needs of today's measurement challenges are analyzed using examples of novel sensors, autonomous cell-based toxicity monitoring, 'Lab-on-a-Chip' devices and conventional environmental analytical techniques. Finally, in order to ensure that the public and legal authorities are provided with quality data to make informed decisions, guidelines are provided for assessing data quality and quality assurance using the United States Environmental Protection Agency (US-EPA) methodologies. PMID:15173903

  11. New and emerging analytical techniques for marine biotechnology.

    PubMed

    Burgess, J Grant

    2012-02-01

    Marine biotechnology is the industrial, medical or environmental application of biological resources from the sea. Since the marine environment is the most biologically and chemically diverse habitat on the planet, marine biotechnology has, in recent years delivered a growing number of major therapeutic products, industrial and environmental applications and analytical tools. These range from the use of a snail toxin to develop a pain control drug, metabolites from a sea squirt to develop an anti-cancer therapeutic, and marine enzymes to remove bacterial biofilms. In addition, well known and broadly used analytical techniques are derived from marine molecules or enzymes, including green fluorescence protein gene tagging methods and heat resistant polymerases used in the polymerase chain reaction. Advances in bacterial identification, metabolic profiling and physical handling of cells are being revolutionised by techniques such as mass spectrometric analysis of bacterial proteins. Advances in instrumentation and a combination of these physical advances with progress in proteomics and bioinformatics are accelerating our ability to harness biology for commercial gain. Single cell Raman spectroscopy and microfluidics are two emerging techniques which are also discussed elsewhere in this issue. In this review, we provide a brief survey and update of the most powerful and rapidly growing analytical techniques as used in marine biotechnology, together with some promising examples of less well known earlier stage methods which may make a bigger impact in the future. PMID:22265377

  12. FORTRAN tools

    NASA Technical Reports Server (NTRS)

    Presser, L.

    1978-01-01

    An integrated set of FORTRAN tools that are commercially available is described. The basic purpose of various tools is summarized and their economic impact highlighted. The areas addressed by these tools include: code auditing, error detection, program portability, program instrumentation, documentation, clerical aids, and quality assurance.

  13. Percussion tool

    SciTech Connect

    Reed, Teddy R.

    2006-11-28

    A percussion tool is described which includes a housing mounting a tool bit; a reciprocally moveable hammer borne by the housing which is operable to repeatedly strike the tool bit; and a reciprocally moveable piston enclosed within the hammer which imparts reciprocal movement to the hammer.

  14. Tool steels. 5. edition

    SciTech Connect

    Roberts, G.; Krauss, G.; Kennedy, R.

    1998-12-31

    The revision of this authoritative work contains a significant amount of new information from the past nearly two decades, presented in an entirely new outline, making this a must-have reference for engineers involved in tool-steel production, as well as in the selection and use of tool steels in metalworking and other materials manufacturing industries. The chapter on tool-steel manufacturing includes new production processes, such as electroslag refining, vacuum arc remelting, spray deposition processes (Osprey and centrifugal spray), and powder metal processing. The seven chapters covering tool-steel types in the 4th Edition have been expanded to 11 chapters covering nine main groups of tool steels as well as other types of ultrahigh-strength steels sometimes used for tooling. Each chapter discusses in detail processing, composition, and applications specific to the particular group. In addition, two chapters have been added covering surface modification and troubleshooting of production and performance problems.

  15. Design sensitivity derivatives for isoparametric elements by analytical and semi-analytical approaches

    NASA Technical Reports Server (NTRS)

    Zumwalt, Kenneth W.; El-Sayed, Mohamed E. M.

    1990-01-01

    This paper presents an analytical approach for incorporating design sensitivity calculations directly into the finite element analysis. The formulation depends on the implicit differentiation approach and requires few additional calculations to obtain the design sensitivity derivatives. In order to evaluate this approach, it is compared with the semi-analytical approach which is based on commonly used finite difference formulations. Both approaches are implemented to calculate the design sensitivities for continuum and structural isoparametric elements. To demonstrate the accuracy and robustness of the developed analytical approach compared to the semi-analytical approach, some test cases using different structural and continuum element types are presented.
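
    The implicit-differentiation step such a formulation builds on can be summarized as follows (standard finite element sensitivity analysis; the notation is generic rather than the paper's): differentiating the equilibrium equations K(b)u = f(b) with respect to a design variable b gives the displacement sensitivities, while the semi-analytical variant approximates the matrix derivatives by finite differences.

```latex
\[
  \frac{\partial \mathbf{u}}{\partial b}
    = \mathbf{K}^{-1}\!\left(\frac{\partial \mathbf{f}}{\partial b}
      - \frac{\partial \mathbf{K}}{\partial b}\,\mathbf{u}\right),
  \qquad
  \frac{\partial \mathbf{K}}{\partial b}
    \approx \frac{\mathbf{K}(b + \Delta b) - \mathbf{K}(b)}{\Delta b}.
\]
```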

  16. Analytical laboratory quality audits

    SciTech Connect

    Kelley, William D.

    2001-06-11

    Analytical Laboratory Quality Audits are designed to improve laboratory performance. The success of the audit, as for many activities, is based on adequate preparation, precise performance, well documented and insightful reporting, and productive follow-up. Adequate preparation starts with definition of the purpose, scope, and authority for the audit and the primary standards against which the laboratory quality program will be tested. The scope and technical processes involved lead to determining the needed audit team resources. Contact is made with the auditee and a formal audit plan is developed, approved and sent to the auditee laboratory management. Review of the auditee's quality manual, key procedures and historical information during preparation leads to better checklist development and more efficient and effective use of the limited time for data gathering during the audit itself. The audit begins with the opening meeting that sets the stage for the interactions between the audit team and the laboratory staff. Arrangements are worked out for the necessary interviews and examination of processes and records. The information developed during the audit is recorded on the checklists. Laboratory management is kept informed of issues during the audit so there are no surprises at the closing meeting. The audit report documents whether the management control systems are effective. In addition to findings of nonconformance, positive reinforcement of exemplary practices provides balance and fairness. Audit closure begins with receipt and evaluation of proposed corrective actions from the nonconformances identified in the audit report. After corrective actions are accepted, their implementation is verified. Upon closure of the corrective actions, the audit is officially closed.

  17. Liquid chromatography coupled to different atmospheric pressure ionization sources-quadrupole-time-of-flight mass spectrometry and post-column addition of metal salt solutions as a powerful tool for the metabolic profiling of Fusarium oxysporum.

    PubMed

    Cirigliano, Adriana M; Rodriguez, M Alejandra; Gagliano, M Laura; Bertinetti, Brenda V; Godeas, Alicia M; Cabrera, Gabriela M

    2016-03-25

    Fusarium oxysporum L11 is a non-pathogenic soil-borne fungal strain that yielded an extract that showed antifungal activity against phytopathogens. In this study, reversed-phase high-performance liquid chromatography (RP-HPLC) coupled to different atmospheric pressure ionization sources-quadrupole-time-of-flight mass spectrometry (API-QTOF-MS) was applied for the comprehensive profiling of the metabolites from the extract. The employed sources were electrospray (ESI), atmospheric pressure chemical ionization (APCI) and atmospheric pressure photoionization (APPI). Post-column addition of metal solutions of Ca, Cu and Zn(II) was also tested using ESI. A total of 137 compounds were identified or tentatively identified by matching their accurate mass signals, suggested molecular formulae and MS/MS analysis with previously reported data. Some compounds were isolated and identified by NMR. The extract was rich in cyclic peptides like cyclosporins, diketopiperazines and sansalvamides, most of which were new, and are reported here for the first time. The use of post-column addition of metals resulted in a useful strategy for the discrimination of compound classes since specific adducts were observed for the different compound families. This technique also allowed the screening for compounds with metal binding properties. Thus, the applied methodology is a useful choice for the metabolic profiling of extracts and also for the selection of metabolites with potential biological activities related to interactions with metal ions. PMID:26655791

  18. Investigation into the phenomena affecting the retention behavior of basic analytes in chaotropic chromatography: Joint effects of the most relevant chromatographic factors and analytes' molecular properties.

    PubMed

    Čolović, Jelena; Kalinić, Marko; Vemić, Ana; Erić, Slavica; Malenović, Anđelija

    2015-12-18

    The aim of this study was to systematically investigate the phenomena affecting the retention behavior of structurally diverse basic drugs in ion-interaction chromatographic systems with chaotropic additives. To this end, the influence of three factors was studied: pH value of the aqueous phase, concentration of sodium hexafluorophosphate, and content of acetonitrile in the mobile phase. Mobile phase pH was found to affect the thermodynamic equilibria in the studied system beyond its effects on the analytes' ionization state. Specifically, increasing pH from 2 to 4 led to longer retention times, even with analytes which remain completely protonated. An explanation for this phenomenon was sought by studying the adsorption behavior of acetonitrile and the chaotropic additive onto the stationary phase. It was shown that the magnitude of the developed surface potential, which significantly affects retention, increases with pH, and that this can be attributed to the larger surface excess of acetonitrile. To study how analytes' structural properties influence their retention, quantitative structure-retention modeling was performed next. A support vector machine regression model was developed, relating mobile phase constituents and structural descriptors with retention data. While ETA_EtaP_B_RC and XlogP can be considered molecular descriptors which describe factors affecting retention in any RP-HPLC system, TDB9p and RDF45p are molecular descriptors which account for the spatial arrangement of polarizable atoms, and they can clearly relate to analytes' behavior on the stationary phase surface, where the electrostatic potential develops. Complementarity of analytes' structure with that of the electric double layer can be seen as a key factor influencing their retention behavior. Structural diversity of analytes and good predictive capabilities over a range of experimental conditions make the established model a useful tool in predicting retention behavior in the studied systems.
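
    To illustrate the kind of model described above, here is a minimal scikit-learn sketch of support vector regression over mobile-phase factors and molecular descriptors. All feature values and retention factors below are invented; in the study, the descriptors (e.g., XlogP, TDB9p, RDF45p) would be computed from the analytes' structures.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Feature columns: pH, [PF6-] (mM), % MeCN, XlogP, TDB9p, RDF45p
X = np.array([
    [2.0, 10.0, 20.0, 1.2, 3.4, 0.8],
    [2.0, 30.0, 25.0, 1.2, 3.4, 0.8],
    [3.0, 20.0, 20.0, 2.1, 4.0, 1.1],
    [4.0, 10.0, 30.0, 2.1, 4.0, 1.1],
    [3.0, 30.0, 25.0, 0.7, 2.9, 0.6],
    [4.0, 20.0, 30.0, 0.7, 2.9, 0.6],
])
k = np.array([5.2, 7.9, 6.8, 3.1, 4.4, 2.2])   # invented retention factors

# Scale features, then fit an RBF support vector regression on log k.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
model.fit(X, np.log(k))

new_conditions = [[3.5, 25.0, 22.0, 1.2, 3.4, 0.8]]
print(f"predicted k: {np.exp(model.predict(new_conditions))[0]:.2f}")
```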

  19. Enzymes in Analytical Chemistry.

    ERIC Educational Resources Information Center

    Fishman, Myer M.

    1980-01-01

    Presents tabular information concerning recent research in the field of enzymes in analytic chemistry, with methods, substrate or reaction catalyzed, assay, comments and references listed. The table refers to 128 references. Also listed are 13 general citations. (CS)

  20. Analytical techniques: A compilation

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A compilation, containing articles on a number of analytical techniques for quality control engineers and laboratory workers, is presented. Data cover techniques for testing electronic, mechanical, and optical systems, nondestructive testing techniques, and gas analysis techniques.

  1. FT-Raman and chemometric tools for rapid determination of quality parameters in milk powder: Classification of samples for the presence of lactose and fraud detection by addition of maltodextrin.

    PubMed

    Rodrigues Júnior, Paulo Henrique; de Sá Oliveira, Kamila; de Almeida, Carlos Eduardo Rocha; De Oliveira, Luiz Fernando Cappa; Stephani, Rodrigo; Pinto, Michele da Silva; de Carvalho, Antônio Fernandes; Perrone, Ítalo Tuler

    2016-04-01

    FT-Raman spectroscopy has been explored as a quick screening method to evaluate the presence of lactose and to identify milk powder samples adulterated with maltodextrin (2.5-50% w/w). Raman measurements can easily differentiate samples of milk powder without the need for sample preparation, while traditional quality control methods, including high performance liquid chromatography, are cumbersome and slow. FT-Raman spectra were obtained from samples of whole-lactose and low-lactose milk powder, both without and with addition of maltodextrin. Differences were observed between the spectra involved in identifying samples with low lactose content, as well as adulterated samples. Exploratory data analysis using Raman spectroscopy and multivariate analysis was also developed to classify samples with PCA and PLS-DA. The PLS-DA models obtained allowed all samples to be correctly classified. These results demonstrate the utility of FT-Raman spectroscopy in combination with chemometrics to assess the quality of milk powder. PMID:26593531
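
    A minimal sketch of the PLS-DA classification step on synthetic stand-in spectra (nothing below comes from the paper): scikit-learn has no dedicated PLS-DA class, so the usual recipe is PLS regression against a binary class label, thresholded at 0.5.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
wavenumbers = 200
authentic = rng.normal(0.0, 1.0, size=(20, wavenumbers))
adulterated = rng.normal(0.0, 1.0, size=(20, wavenumbers))
adulterated[:, 50:60] += 2.0          # a synthetic "maltodextrin band"

X = np.vstack([authentic, adulterated])
y = np.array([0] * 20 + [1] * 20)     # 0 = authentic, 1 = adulterated

# Fit PLS regression on the binary label, then threshold the scores at 0.5.
pls = PLSRegression(n_components=2).fit(X, y)
scores = pls.predict(X).ravel()
predicted = (scores > 0.5).astype(int)
print(f"training accuracy: {(predicted == y).mean():.2f}")
```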

  2. Extreme Scale Visual Analytics

    SciTech Connect

    Steed, Chad A; Potok, Thomas E; Pullum, Laura L; Ramanathan, Arvind; Shipman, Galen M; Thornton, Peter E

    2013-01-01

    Given the scale and complexity of today's data, visual analytics is rapidly becoming a necessity rather than an option for comprehensive exploratory analysis. In this paper, we provide an overview of three applications of visual analytics for addressing the challenges of analyzing climate, text streams, and biosurveillance data. These systems feature varying levels of interaction and high performance computing technology integration to permit exploratory analysis of large and complex data of global significance.

  3. Tool Using

    PubMed Central

    Kahrs, Björn A.; Lockman, Jeffrey J.

    2014-01-01

    Research on the development of tool use in children has often emphasized the cognitive bases of this achievement, focusing on the choice of an artifact, but has largely neglected its motor foundations. However, research across diverse fields, from evolutionary anthropology to cognitive neuroscience, converges on the idea that the actions that embody tool use are also critical for understanding its ontogenesis and phylogenesis. In this article, we highlight findings across these fields to show how a deeper examination of the act of tool using can inform developmental accounts and illuminate what makes human tool use unique. PMID:25400691

  4. Ootw Tool Requirements in Relation to JWARS

    SciTech Connect

    Hartley III, D.S.; Packard, S.L.

    1998-01-01

    This document reports the results of the Office of the Secretary of Defense/Program Analysis & Evaluation (OSD/PA&E) sponsored project to identify how Operations Other Than War (OOTW) tool requirements relate to the Joint Warfare Simulation (JWARS) and, more generally, to joint analytical modeling and simulation (M&S) requirements. It includes recommendations about which OOTW tools (and functionality within tools) should be included in JWARS, which should be managed as joint analytical M&S tools, and which should be left for independent development.

  5. Enterprise integration: A tool's perspective

    SciTech Connect

    Polito, J.; Jones, A.; Grant, H.

    1993-06-01

    The advent of sophisticated automation equipment and computer hardware and software is changing the way manufacturing is carried out. To compete in the global marketplace, manufacturing companies must integrate these new technologies into their factories. In addition, they must integrate the planning, control, and data management methodologies needed to make effective use of these technologies. This paper provides an overview of recent approaches to achieving this enterprise integration. It then describes, using simulation as a particular example, a new tool's perspective of enterprise integration.

  6. Developing Guidelines for Assessing Visual Analytics Environments

    SciTech Connect

    Scholtz, Jean

    2011-07-01

    In this paper, we develop guidelines for evaluating visual analytic environments based on a synthesis of reviews of the entries to the 2009 Visual Analytics Science and Technology (VAST) Symposium Challenge and on a user study with professional intelligence analysts. By analyzing the 2009 VAST Challenge reviews we gained a better understanding of what is important to our reviewers, both visualization researchers and professional analysts. We also report on a small user study with professional analysts to determine the important factors they use in evaluating visual analysis systems. We then looked at guidelines developed by researchers in various domains and synthesized these into an initial set for use by others in the community. In a second part of the user study, we looked at guidelines for a new aspect of visual analytic systems – the generation of reports. Future visual analytic systems have been challenged to help analysts generate their reports. In our study we worked with analysts to understand the criteria they use to evaluate the quality of analytic reports. We propose that this knowledge will be useful as researchers look at systems to automate some of the report generation. Based on these efforts, we produced initial guidelines for evaluating visual analytic environments and for evaluating analytic reports. It is important to understand that these guidelines are initial drafts and are limited in scope because of the type of tasks for which the visual analytic systems used in these studies were designed. More research and refinement are needed by the visual analytics community to provide additional evaluation guidelines for different types of visual analytic environments.

  7. A Visual Analytics Approach for Correlation, Classification, and Regression Analysis

    SciTech Connect

    Steed, Chad A; SwanII, J. Edward; Fitzpatrick, Patrick J.; Jankun-Kelly, T.J.

    2012-02-01

    New approaches that combine the strengths of humans and machines are necessary to equip analysts with the proper tools for exploring today's increasingly complex, multivariate data sets. In this paper, a novel visual data mining framework, called the Multidimensional Data eXplorer (MDX), is described that addresses the challenges of today's data by combining automated statistical analytics with a highly interactive parallel-coordinates-based canvas. In addition to several intuitive interaction capabilities, this framework offers a rich set of graphical statistical indicators, interactive regression analysis, visual correlation mining, automated axis arrangements and filtering, and data classification techniques. The current work provides a detailed description of the system as well as a discussion of key design aspects and critical feedback from domain experts.
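
    MDX's canvas is built around interactive parallel coordinates. As a rough, hedged illustration of that core idea (without MDX's interactivity, statistical indicators, or automated axis arrangement), the sketch below draws a static parallel-coordinates view with pandas and matplotlib, using the classic Iris data as a stand-in multivariate set.

        # Static parallel-coordinates sketch: one polyline per observation,
        # one vertical axis per variable.  Iris stands in for real data.
        import pandas as pd
        import matplotlib.pyplot as plt
        from sklearn.datasets import load_iris

        iris = load_iris(as_frame=True)
        df = iris.frame.rename(columns={"target": "species"})
        df["species"] = df["species"].map(dict(enumerate(iris.target_names)))

        pd.plotting.parallel_coordinates(df, "species", colormap="viridis", alpha=0.4)
        plt.title("Parallel coordinates view of a multivariate data set")
        plt.tight_layout()
        plt.show()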

  8. Big Data Visual Analytics for Exploratory Earth System Simulation Analysis

    SciTech Connect

    Steed, Chad A.; Ricciuto, Daniel M.; Shipman, Galen M.; Smith, Brian E.; Thornton, Peter E.; Wang, Dali; Shi, Xiaoying; Williams, Dean N.

    2013-12-01

    Rapid increases in high performance computing are feeding the development of larger and more complex data sets in climate research, which sets the stage for so-called big data analysis challenges. However, conventional climate analysis techniques are inadequate in dealing with the complexities of today's data. In this paper, we describe and demonstrate a visual analytics system, called the Exploratory Data analysis ENvironment (EDEN), with specific application to the analysis of complex earth system simulation data sets. EDEN represents the type of interactive visual analysis tools that are necessary to transform data into insight, thereby improving critical comprehension of earth system processes. In addition to providing an overview of EDEN, we describe real-world studies using both point ensembles and global Community Land Model Version 4 (CLM4) simulations.

  9. A Visual Analytics Approach for Correlation, Classification, and Regression Analysis

    SciTech Connect

    Steed, Chad A; SwanII, J. Edward; Fitzpatrick, Patrick J.; Jankun-Kelly, T.J.

    2013-01-01

    New approaches that combine the strengths of humans and machines are necessary to equip analysts with the proper tools for exploring today's increasingly complex, multivariate data sets. In this paper, a visual data mining framework, called the Multidimensional Data eXplorer (MDX), is described that addresses the challenges of today's data by combining automated statistical analytics with a highly interactive parallel-coordinates-based canvas. In addition to several intuitive interaction capabilities, this framework offers a rich set of graphical statistical indicators, interactive regression analysis, visual correlation mining, automated axis arrangements and filtering, and data classification techniques. This chapter provides a detailed description of the system as well as a discussion of key design aspects and critical feedback from domain experts.

  10. Tool use by aquatic animals.

    PubMed

    Mann, Janet; Patterson, Eric M

    2013-11-19

    Tool-use research has focused primarily on land-based animals, with less consideration given to aquatic animals and the environmental challenges and conditions they face. Here, we review aquatic tool use and examine the contributing ecological, physiological, cognitive and social factors. Tool use among aquatic animals is rare but taxonomically diverse, occurring in fish, cephalopods, mammals, crabs, urchins and possibly gastropods. While additional research is required, the scarcity of tool use is likely attributable to the characteristics of aquatic habitats, which are generally not conducive to tool use. Nonetheless, studying tool use by aquatic animals provides insights into the conditions that promote and inhibit tool-use behaviour across biomes. Like land-based tool users, aquatic animals tend to find tools on the substrate and use tools during foraging. However, unlike on land, tool users in water often use other animals (and their products) and water itself as a tool. Among sea otters and dolphins, the two aquatic tool users studied in greatest detail, some individuals specialize in tool use, which is vertically socially transmitted, possibly because of their long dependency periods. In all, the contrasts between aquatic- and land-based tool users illuminate our understanding of the adaptive value of tool-use behaviour. PMID:24101631

  11. Tool use by aquatic animals

    PubMed Central

    Mann, Janet; Patterson, Eric M.

    2013-01-01

    Tool-use research has focused primarily on land-based animals, with less consideration given to aquatic animals and the environmental challenges and conditions they face. Here, we review aquatic tool use and examine the contributing ecological, physiological, cognitive and social factors. Tool use among aquatic animals is rare but taxonomically diverse, occurring in fish, cephalopods, mammals, crabs, urchins and possibly gastropods. While additional research is required, the scarcity of tool use is likely attributable to the characteristics of aquatic habitats, which are generally not conducive to tool use. Nonetheless, studying tool use by aquatic animals provides insights into the conditions that promote and inhibit tool-use behaviour across biomes. Like land-based tool users, aquatic animals tend to find tools on the substrate and use tools during foraging. However, unlike on land, tool users in water often use other animals (and their products) and water itself as a tool. Among sea otters and dolphins, the two aquatic tool users studied in greatest detail, some individuals specialize in tool use, which is vertically socially transmitted, possibly because of their long dependency periods. In all, the contrasts between aquatic- and land-based tool users illuminate our understanding of the adaptive value of tool-use behaviour. PMID:24101631

  12. VCAT: Visual Crosswalk Analysis Tool

    SciTech Connect

    Cleland, Timothy J.; Forslund, David W.; Cleland, Catherine A.

    2012-08-31

    VCAT is a knowledge modeling and analysis tool. It was synthesized from ideas in functional analysis, business process modeling, and complex network science. VCAT discovers synergies by analyzing natural language descriptions. Specifically, it creates visual analytic perspectives that capture intended organization structures, then overlays the serendipitous relationships that point to potential synergies within an organization or across multiple organizations.

  13. Omics Tools

    SciTech Connect

    Schaumberg, Andrew

    2012-12-21

    The Omics Tools package provides several small, trivial tools for work in genomics. This single portable package, the “omics.jar” file, is a toolbox that works in any Java-based environment, including PCs, Macs, and supercomputers. The number of tools is expected to grow. One tool (called cmsearch.hadoop or cmsearch.local) calls the external cmsearch program to predict non-coding RNA in a genome. The cmsearch program is part of the third-party Infernal package. Omics Tools does not contain Infernal; Infernal may be installed separately. The cmsearch.hadoop subtool requires Apache Hadoop and runs on a supercomputer, though cmsearch.local does not and runs on a server. Omics Tools does not contain Hadoop; Hadoop may be installed separately. The other tools (cmgbk, cmgff, fastats, pal, randgrp, randgrpr, randsub) do not interface with third-party tools. Omics Tools is written in the Java and Scala programming languages. Invoking the “help” command shows the currently available tools:

        schaumbe@gpint06:~/proj/omics$ java -jar omics.jar help
        Known commands are:
          cmgbk           : compare cmsearch and GenBank Infernal hits
          cmgff           : compare hits among two GFF (version 3) files
          cmsearch.hadoop : find Infernal hits in a genome, on your supercomputer
          cmsearch.local  : find Infernal hits in a genome, on your workstation
          fastats         : FASTA stats, e.g. # bases, GC content
          pal             : stem-loop motif detection by palindromic sequence search (code stub)
          randgrp         : random subsample without replacement, of groups
          randgrpr        : random subsample with replacement, of groups (fast)
          randsub         : random subsample without replacement, of file lines

        For more help regarding a particular command, use: java -jar omics.jar command help
        Usage: java -jar omics.jar command args

  14. Omics Tools

    2012-12-21

    The Omics Tools package provides several small, trivial tools for work in genomics. This single portable package, the “omics.jar” file, is a toolbox that works in any Java-based environment, including PCs, Macs, and supercomputers. The number of tools is expected to grow. One tool (called cmsearch.hadoop or cmsearch.local) calls the external cmsearch program to predict non-coding RNA in a genome. The cmsearch program is part of the third-party Infernal package. Omics Tools does not contain Infernal; Infernal may be installed separately. The cmsearch.hadoop subtool requires Apache Hadoop and runs on a supercomputer, though cmsearch.local does not and runs on a server. Omics Tools does not contain Hadoop; Hadoop may be installed separately. The other tools (cmgbk, cmgff, fastats, pal, randgrp, randgrpr, randsub) do not interface with third-party tools. Omics Tools is written in the Java and Scala programming languages. Invoking the “help” command shows the currently available tools:

        schaumbe@gpint06:~/proj/omics$ java -jar omics.jar help
        Known commands are:
          cmgbk           : compare cmsearch and GenBank Infernal hits
          cmgff           : compare hits among two GFF (version 3) files
          cmsearch.hadoop : find Infernal hits in a genome, on your supercomputer
          cmsearch.local  : find Infernal hits in a genome, on your workstation
          fastats         : FASTA stats, e.g. # bases, GC content
          pal             : stem-loop motif detection by palindromic sequence search (code stub)
          randgrp         : random subsample without replacement, of groups
          randgrpr        : random subsample with replacement, of groups (fast)
          randsub         : random subsample without replacement, of file lines

        For more help regarding a particular command, use: java -jar omics.jar command help
        Usage: java -jar omics.jar command args

  15. Applications of Computers and Computer Software in Teaching Analytical Chemistry.

    ERIC Educational Resources Information Center

    O'Haver, T. C.

    1991-01-01

    Some commercially available software tools that have potential applications in the analytical chemistry curriculum are surveyed and evaluated. Tools for instruction, analysis and research, and courseware development are described. A list of the software packages, the compatible hardware, and the vendor's address is included. (KR)

  16. Advances in analytical chemistry

    NASA Technical Reports Server (NTRS)

    Arendale, W. F.; Congo, Richard T.; Nielsen, Bruce J.

    1991-01-01

    Implementation of computer programs based on multivariate statistical algorithms makes it possible to obtain reliable information from long data vectors that contain large amounts of extraneous information, for example, noise and/or analytes that we do not wish to control. Three examples are described. Each of these applications requires the use of techniques characteristic of modern analytical chemistry. The first example, using a quantitative or analytical model, describes the determination of the acid dissociation constant for 2,2'-pyridyl thiophene using archived data. The second example describes an investigation to determine the active biocidal species of iodine in aqueous solutions. The third example is taken from a research program directed toward advanced fiber-optic chemical sensors. The second and third examples require heuristic or empirical models.

  17. Metabolomics and Diabetes: Analytical and Computational Approaches

    PubMed Central

    Sas, Kelli M.; Karnovsky, Alla; Michailidis, George

    2015-01-01

    Diabetes is characterized by altered metabolism of key molecules and regulatory pathways. The phenotypic expression of diabetes and associated complications encompasses complex interactions between genetic, environmental, and tissue-specific factors that require an integrated understanding of perturbations in the network of genes, proteins, and metabolites. Metabolomics attempts to systematically identify and quantitate small molecule metabolites from biological systems. The recent rapid development of a variety of analytical platforms based on mass spectrometry and nuclear magnetic resonance has enabled identification of complex metabolic phenotypes. Continued development of bioinformatics and analytical strategies has facilitated the discovery of causal links in understanding the pathophysiology of diabetes and its complications. Here, we summarize the metabolomics workflow, including analytical, statistical, and computational tools, highlight recent applications of metabolomics in diabetes research, and discuss the challenges in the field. PMID:25713200

  18. Big data analytics workflow management for eScience

    NASA Astrophysics Data System (ADS)

    Fiore, Sandro; D'Anca, Alessandro; Palazzo, Cosimo; Elia, Donatello; Mariello, Andrea; Nassisi, Paola; Aloisio, Giovanni

    2015-04-01

    In many domains such as climate and astrophysics, scientific data is often n-dimensional and requires tools that support specialized data types and primitives if it is to be properly stored, accessed, analysed and visualized. Currently, scientific data analytics relies on domain-specific software and libraries providing a huge set of operators and functionalities. However, most of these tools fail at large scale since they: (i) are desktop based, rely on local computing capabilities and need the data locally; (ii) cannot benefit from available multicore/parallel machines since they are based on sequential codes; (iii) do not provide declarative languages to express scientific data analysis tasks, and (iv) do not provide newer or more scalable storage models to better support the data multidimensionality. Additionally, most of them: (v) are domain-specific, which also means they support a limited set of data formats, and (vi) do not provide workflow support to enable the construction, execution and monitoring of more complex "experiments". The Ophidia project aims at facing most of the challenges highlighted above by providing a big data analytics framework for eScience. Ophidia provides several parallel operators to manipulate large datasets. Some relevant examples include: (i) data sub-setting (slicing and dicing), (ii) data aggregation, (iii) array-based primitives (the same operator applies to all the implemented UDF extensions), (iv) data cube duplication, (v) data cube pivoting, (vi) NetCDF import and export. Metadata operators are available too. Additionally, the Ophidia framework provides array-based primitives to perform data sub-setting, data aggregation (i.e. max, min, avg), array concatenation, algebraic expressions and predicate evaluation on large arrays of scientific data. Bit-oriented plugins have also been implemented to manage binary data cubes. Defining processing chains and workflows with tens, hundreds of data analytics operators is the
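
    A hedged sketch of what a few of these operator classes do, using plain numpy on a toy (time, lat, lon) cube as a stand-in; this illustrates the semantics of sub-setting, aggregation, concatenation, and predicate evaluation, not the Ophidia framework or its parallel execution.

        # numpy analogues of common array operators on a toy data cube,
        # e.g. 12 monthly global temperature fields (illustrative values).
        import numpy as np

        rng = np.random.default_rng(1)
        cube = 15.0 + rng.normal(0.0, 5.0, (12, 90, 180))

        subset = cube[0:3, 30:60, :]            # slicing/dicing: 3 months, mid-latitudes
        monthly_mean = cube.mean(axis=(1, 2))   # aggregation: spatial average per month
        annual_max = cube.max(axis=0)           # aggregation: per-cell maximum over time
        stacked = np.concatenate([cube, cube], axis=0)   # array concatenation
        hot_cells = cube > 25.0                 # predicate evaluation on the whole array

        print(subset.shape, monthly_mean.shape, annual_max.shape,
              stacked.shape, int(hot_cells.sum()))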

  19. Frontiers in analytical chemistry

    SciTech Connect

    Amato, I.

    1988-12-15

    Doing more with less was the modus operandi of R. Buckminster Fuller, the late science genius and inventor of such things as the geodesic dome. In late September, chemists described their own version of this maxim--learning more chemistry from less material and in less time--in a symposium titled Frontiers in Analytical Chemistry at the 196th National Meeting of the American Chemical Society in Los Angeles. Symposium organizer Allen J. Bard of the University of Texas at Austin assembled six speakers, himself among them, to survey widely different areas of analytical chemistry.

  20. Graphical Contingency Analysis Tool

    2010-03-02

    GCA is a visual analytic tool for power grid contingency analysis that provides decision support for power grid operations. GCA allows power grid operators to quickly gain situational awareness of the power grid by converting large amounts of operational data to the graphic domain with a color-contoured map; identify system trends and foresee and discern emergencies by performing trending analysis; identify the relationships between system configurations and affected assets by conducting clustering analysis; and identify the best action by interactively evaluating candidate actions.

  1. Jupiter Environment Tool

    NASA Technical Reports Server (NTRS)

    Sturm, Erick J.; Monahue, Kenneth M.; Biehl, James P.; Kokorowski, Michael; Ngalande, Cedrick; Boedeker, Jordan

    2012-01-01

    The Jupiter Environment Tool (JET) is a custom UI plug-in for STK that provides an interface to Jupiter environment models for visualization and analysis. Users can visualize the different magnetic field models of Jupiter through various rendering methods, which are fully integrated within STK's 3D Window. This allows users to take snapshots and make animations of their scenarios with magnetic field visualizations. Analytical data can be accessed in the form of custom vectors. Given these custom vectors, users have access to magnetic field data in custom reports, graphs, access constraints, coverage analysis, and anywhere else vectors are used within STK.

  2. Generalization of analytical tools for helicopter-rotor airfoils

    NASA Technical Reports Server (NTRS)

    Gibbs, E. H.

    1979-01-01

    A state-of-the-art finite difference boundary-layer program incorporated into the NYU Transonic Analysis Program is described. Some possible treatments for the trailing edge region were investigated. Findings indicate that treatment of the trailing edge region within the scope of an iterative potential-flow/boundary-layer program appears feasible.

  3. Monitoring automotive oil degradation: analytical tools and onboard sensing technologies.

    PubMed

    Mujahid, Adnan; Dickert, Franz L

    2012-09-01

    Engine oil experiences a number of thermal and oxidative phases that yield acidic products in the matrix, consequently leading to degradation of the base oil. Generally, oil oxidation is a complex process and difficult to elucidate; however, the degradation pathways can be defined for almost every type of oil because they mainly depend on the mechanical status and operating conditions. The exact time of oil change is nonetheless difficult to predict, but it is of great interest from an economic and ecological point of view. In order to make quick and accurate decisions about oil changes, onboard assessment of oil quality is highly desirable. For this purpose, a variety of physical and chemical sensors have been proposed along with spectroscopic strategies. We present a critical review of all these approaches and of recent developments to analyze the exact lifetime of automotive engine oil. Apart from their potential for degradation monitoring, their limitations and future perspectives have also been investigated. PMID:22752447

  4. The t expansion: A nonperturbative analytic tool for Hamiltonian systems

    NASA Astrophysics Data System (ADS)

    Horn, D.; Weinstein, M.

    1984-09-01

    A systematic nonperturbative scheme is developed to calculate the ground-state expectation values of arbitrary operators for any Hamiltonian system. Quantities computed in this way converge rapidly to their true expectation values. The method is based upon the use of the operator e^(-tH) to contract any trial state onto the true ground state of the Hamiltonian H. We express all expectation values in the contracted state as a power series in t, and reconstruct t → ∞ behavior by means of Padé approximants. The problem associated with factors of spatial volume is taken care of by developing a connected graph expansion for matrix elements of arbitrary operators taken between arbitrary states. We investigate Padé methods for the t series and discuss the merits of various procedures. As examples of the power of this technique we present results obtained for the Heisenberg and Ising models in 1+1 dimensions starting from simple mean-field wave functions. The improvement upon mean-field results is remarkable for the amount of effort required. The connection between our method and conventional perturbation theory is established, and a generalization of the technique which allows us to exploit off-diagonal matrix elements is introduced. The bistate procedure is used to develop a t expansion for the ground-state energy of the Ising model which is, term by term, self-dual.
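
    The resummation step above, recovering t → ∞ behavior from a short power series via Padé approximants, can be illustrated with a toy series. In this hedged sketch the rational function f(t) = (1 + 2t)/(1 + t) stands in for a contracted expectation value: its Taylor series diverges for |t| > 1, yet the [1/1] Padé approximant built from three coefficients recovers the function, and hence its t → ∞ limit of 2, exactly. (In the actual method the series coefficients would come from connected moments of H in the trial state.)

        # Pade resummation of a truncated power series, evaluated far
        # beyond its radius of convergence.  Toy coefficients, known limit.
        from scipy.interpolate import pade

        # Taylor coefficients of f(t) = (1 + 2t)/(1 + t) = 1 + t - t^2 + ...
        coeffs = [1.0, 1.0, -1.0]

        p, q = pade(coeffs, 1)            # [1/1] approximant p(t)/q(t)

        for t in (0.5, 5.0, 1e6):
            print(t, p(t) / q(t))         # tends to the exact limit f(inf) = 2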

  5. ANALYTICAL TOOLS INTERFACE FOR LANDSCAPE ASSESSMENTS (ATTILA) USER MANUAL

    EPA Science Inventory

    ATtILA is an ArcView extension that allows users to easily calculate many common landscape metrics. GIS expertise is not required, but some experience with ArcView is recommended. Four metric groups are currently included in ATtILA: landscape characteristics, riparian characteris...

  6. Immunoassay as an analytical tool in agricultural biotechnology.

    PubMed

    Grothaus, G David; Bandla, Murali; Currier, Thomas; Giroux, Randal; Jenkins, G Ronald; Lipp, Markus; Shan, Guomin; Stave, James W; Pantella, Virginia

    2006-01-01

    Immunoassays for biotechnology engineered proteins are used by AgBiotech companies at numerous points in product development and by feed and food suppliers for compliance and contractual purposes. Although AgBiotech companies use the technology during product development and seed production, other stakeholders from the food and feed supply chains, such as commodity, food, and feed companies, as well as third-party diagnostic testing companies, also rely on immunoassays for a number of purposes. The primary use of immunoassays is to verify the presence or absence of genetically modified (GM) material in a product or to quantify the amount of GM material present in a product. This article describes the fundamental elements of GM analysis using immunoassays and especially its application to the testing of grains. The 2 most commonly used formats are lateral flow devices (LFD) and plate-based enzyme-linked immunosorbent assays (ELISA). The main applications of both formats are discussed in general, and the benefits and drawbacks are discussed in detail. The document highlights the many areas to which attention must be paid in order to produce reliable test results. These include sample preparation, method validation, choice of appropriate reference materials, and biological and instrumental sources of error. The article also discusses issues related to the analysis of different matrixes and the effects they may have on the accuracy of the immunoassays. PMID:16915826

  7. Horses for courses: analytical tools to explore planetary boundaries

    NASA Astrophysics Data System (ADS)

    van Vuuren, D. P.; Lucas, P. L.; Häyhä, T.; Cornell, S. E.; Stafford-Smith, M.

    2015-09-01

    There is a need for further integrated research on developing a set of sustainable development objectives, based on the proposed framework of planetary boundaries indicators. The relevant research questions are divided in this paper into four key categories, relating to (1) the underlying processes and selection of key indicators, (2) understanding the impacts of different exposure levels and the influence of connections between different types of impacts, (3) a better understanding of different response strategies, and (4) the available options to implement changes. Clearly, different categories of scientific disciplines and associated models exist that can contribute to the necessary analysis, noting that the distinctions between them are fuzzy. In the paper, we indicate both how different models relate to the four categories of questions and how further insights can be obtained by connecting the different disciplines (without necessarily fully integrating them). Research on integration can support planetary boundary quantification in a credible way, linking human drivers and social and biophysical impacts.

  8. Horses for courses: analytical tools to explore planetary boundaries

    NASA Astrophysics Data System (ADS)

    van Vuuren, Detlef P.; Lucas, Paul L.; Häyhä, Tiina; Cornell, Sarah E.; Stafford-Smith, Mark

    2016-03-01

    There is a need for more integrated research on sustainable development and global environmental change. In this paper, we focus on the planetary boundaries framework to provide a systematic categorization of key research questions in relation to avoiding severe global environmental degradation. The four categories of key questions are those that relate to (1) the underlying processes and selection of key indicators for planetary boundaries, (2) understanding the impacts of environmental pressure and connections between different types of impacts, (3) better understanding of different response strategies to avoid further degradation, and (4) the available instruments to implement such strategies. Clearly, different categories of scientific disciplines and associated model types exist that can accommodate answering these questions. We identify the strengths and weaknesses of different research areas in relation to the question categories, focusing specifically on different types of models. We argue that more interdisciplinary research is needed to increase our understanding by better linking human drivers and social and biophysical impacts. This requires better collaboration between relevant disciplines (associated with the model types), either by exchanging information or by fully linking or integrating them. As fully integrated models can become too complex, the appropriate type of model (the racehorse) should be applied to answer the target research question (the race course).

  9. Developing SABRE as an analytical tool in NMR

    NASA Astrophysics Data System (ADS)

    Lloyd, Lyrelle Stacey

    Work presented in this thesis centres on the application of the new hyperpolarisation technique, SABRE, within nuclear magnetic resonance spectroscopy, focusing on optimisation of the technique to characterise small organic molecules. While pyridine was employed as a model substrate, studies on a range of molecules, including substituted pyridines, quinolines, thiazoles and indoles, are detailed. Initial investigations explored how the properties of the SABRE catalyst affect the extent of polarisation transfer exhibited. The most important of these properties proved to be the rate constants for loss of pyridine and hydrides, as these define the contact time of pyridine with the parahydrogen-derived hydride ligands in the metal template. The effects of changing the temperature, solvent or concentration of substrate or catalyst are rationalised. For instance, the catalyst ICy(a) exhibits relatively slow ligand exchange rates, and increasing the temperature during hyperpolarisation increases the observed signal enhancements. These studies have revealed a second polarisation transfer template that can be used with SABRE, in which two substrate molecules are bound. This allows the possibility of investigating larger substrates which might otherwise be too sterically encumbered to bind. Another significant advance relates to the first demonstration that SABRE can be used in conjunction with an automated system designed with Bruker, allowing the acquisition of scan-averaged, phase-cycled and traditional 2D spectra. The system also allowed investigations into the effect of the polarisation transfer field and application of that knowledge to collect single-scan 13C data for characterisation. The successful acquisition of 1H NOESY, 1H-1H COSY, 1H-13C 2D and ultrafast 1H-1H COSY NMR sequences is detailed for a 10 mM concentration sample, with 1H data collected for a 1 mM sample. A range of studies which aim to demonstrate the applicability of SABRE to the characterisation of small molecules and pharmaceuticals have been conducted.

  10. Summary of NDE of Additive Manufacturing Efforts in NASA

    NASA Technical Reports Server (NTRS)

    Waller, Jess; Saulsberry, Regor; Parker, Bradford; Hodges, Kenneth; Burke, Eric; Taminger, Karen

    2014-01-01

    (1) General Rationale for Additive Manufacturing (AM): (a) Operate under a 'design-to-constraint' paradigm, making parts too complicated to fabricate otherwise, (b) Reduce weight by 20 percent with monolithic parts, (c) Reduce waste (green manufacturing), (d) Eliminate reliance on Original Equipment Manufacturers for critical spares, and (e) Extend the life of in-service parts by innovative repair methods; (2) NASA OSMA NDE of AM State-of-the-Discipline Report; (3) Overview of NASA AM Efforts at Various Centers: (a) Analytical Tools, (b) Ground-Based Fabrication, (c) Space-Based Fabrication, and (d) Center Activity Summaries; (4) Overview of NASA NDE data to date on AM parts; and (5) Gap Analysis/Recommendations for NDE of AM.

  11. Drilling tool

    SciTech Connect

    Baumann, O.; Dohse, H.P.; Reibetanz, W.; Wanner, K.

    1983-09-27

    A drilling tool is disclosed which has a drilling shaft member, a crown drilling member with an annular wall provided with a plurality of cutting edges and detachably mounted on the shaft member, a center drilling member detachably mounted on the shaft member inside the crown drilling member and having a further cutting edge, and elements for limiting a drilling depth of the tool when the center drilling member is mounted on the shaft member. Thereby, the operator of the drilling tool, after drilling a guiding groove in a rock, is forced to remove the center drilling member from the drilling tool and drill further without the center drilling member, which increases the drilling efficiency.

  12. Analytical Services Management System

    SciTech Connect

    Church, Shane; Nigbor, Mike; Hillman, Daniel

    2005-03-30

    Analytical Services Management System (ASMS) provides sample management services. Sample management includes sample planning for analytical requests, sample tracking for shipping and receiving by the laboratory, receipt of the analytical data deliverable, processing the deliverable, and payment of the laboratory conducting the analyses. ASMS is a web-based application that provides the ability to manage these activities at multiple locations for different customers. ASMS provides for the assignment of single to multiple samples for standard chemical and radiochemical analyses. ASMS is a flexible system which allows users to request analyses by line item code. Line item codes are selected based on the Basic Ordering Agreement (BOA) format for contracting with participating laboratories. ASMS also allows contracting with non-BOA laboratories using a similar line item code contracting format for their services. ASMS allows sample and analysis tracking from sample planning and collection in the field through sample shipment, laboratory sample receipt, laboratory analysis and submittal of the requested analyses, electronic data transfer, and payment of the laboratories for the completed analyses. The software, when in operation, contains business-sensitive material that is used as a principal portion of the Kaiser Analytical Management Services business model. The software version provided is the most recent version; however, the copy of the application does not contain business-sensitive data from the associated Oracle tables, such as contract information or price per line item code.

  13. Analytical Chemistry Laboratory

    NASA Technical Reports Server (NTRS)

    Anderson, Mark

    2013-01-01

    The Analytical Chemistry and Material Development Group maintains a capability in chemical analysis, materials R&D, failure analysis, and contamination control. The uniquely qualified staff and facility support the needs of flight projects, science instrument development, and various technical tasks, as well as Cal Tech.

  14. Analytical Services Management System

    2005-03-30

    Analytical Services Management System (ASMS) provides sample management services. Sample management includes sample planning for analytical requests, sample tracking for shipping and receiving by the laboratory, receipt of the analytical data deliverable, processing the deliverable, and payment of the laboratory conducting the analyses. ASMS is a web-based application that provides the ability to manage these activities at multiple locations for different customers. ASMS provides for the assignment of single to multiple samples for standard chemical and radiochemical analyses. ASMS is a flexible system which allows users to request analyses by line item code. Line item codes are selected based on the Basic Ordering Agreement (BOA) format for contracting with participating laboratories. ASMS also allows contracting with non-BOA laboratories using a similar line item code contracting format for their services. ASMS allows sample and analysis tracking from sample planning and collection in the field through sample shipment, laboratory sample receipt, laboratory analysis and submittal of the requested analyses, electronic data transfer, and payment of the laboratories for the completed analyses. The software, when in operation, contains business-sensitive material that is used as a principal portion of the Kaiser Analytical Management Services business model. The software version provided is the most recent version; however, the copy of the application does not contain business-sensitive data from the associated Oracle tables, such as contract information or price per line item code.

  15. Analytics: Changing the Conversation

    ERIC Educational Resources Information Center

    Oblinger, Diana G.

    2013-01-01

    In this third and concluding discussion on analytics, the author notes that we live in an information culture. We are accustomed to having information instantly available and accessible, along with feedback and recommendations. We want to know what people think and like (or dislike). We want to know how we compare with "others like me."…

  16. Social Learning Analytics

    ERIC Educational Resources Information Center

    Buckingham Shum, Simon; Ferguson, Rebecca

    2012-01-01

    We propose that the design and implementation of effective "Social Learning Analytics (SLA)" present significant challenges and opportunities for both research and enterprise, in three important respects. The first is that the learning landscape is extraordinarily turbulent at present, in no small part due to technological drivers. Online social…

  17. Ada & the Analytical Engine.

    ERIC Educational Resources Information Center

    Freeman, Elisabeth

    1996-01-01

    Presents a brief history of Ada Byron King, Countess of Lovelace, focusing on her primary role in the development of the Analytical Engine--the world's first computer. Describes the Ada Project (TAP), a centralized World Wide Web site that serves as a clearinghouse for information related to women in computing, and provides a Web address for…

  18. Analytical Instrument Obsolescence Examined.

    ERIC Educational Resources Information Center

    Haggin, Joseph

    1982-01-01

    The threat of instrument obsolescence and tight federal budgets have conspired to threaten the existence of research analytical laboratories. Despite these and other handicaps most existing laboratories expect to keep operating in support of basic research, though there may be serious penalties in the future unless funds are forthcoming. (Author)

  19. Tool to Prioritize Energy Efficiency Investments

    SciTech Connect

    Farese, P.; Gelman, R.; Hendron, R.

    2012-08-01

    To provide analytic support to the U.S. Department of Energy's Building Technologies Program (BTP), NREL developed a Microsoft Excel-based tool that offers an open and objective comparison of the hundreds of investment opportunities available to BTP. This tool uses established methodologies to evaluate the energy savings and the cost of those savings.

  20. Predictive Data Tools Find Uses in Schools

    ERIC Educational Resources Information Center

    Sparks, Sarah D.

    2011-01-01

    The use of analytic tools to predict student performance is exploding in higher education, and experts say the tools show even more promise for K-12 schools, in everything from teacher placement to dropout prevention. Use of such statistical techniques is hindered in precollegiate schools, however, by a lack of researchers trained to help…

  1. Imaging and Analytics: The changing face of Medical Imaging

    NASA Astrophysics Data System (ADS)

    Foo, Thomas

    There have been significant technological advances in imaging capability over the past 40 years. Medical imaging capabilities have developed rapidly, along with technology development in computational processing speed and miniaturization. With the move to all-digital imaging, the number of images acquired in a routine clinical examination has increased dramatically, from under 50 in the early days of CT and MRI to more than 500-1000 today. This staggering number of images poses significant challenges for clinicians, who must interpret the data and correctly identify the clinical problem. Although the time provided to render a clinical finding has not substantially changed, the amount of data available for interpretation has grown exponentially. In addition, the image quality (spatial resolution) and information content (physiologically-dependent image contrast) have also increased significantly with advances in medical imaging technology. On its current trajectory, medical imaging in the traditional sense is unsustainable. To assist in filtering and extracting the most relevant data elements from medical imaging, image analytics will have a much larger role. Automated image segmentation, generation of parametric image maps, and clinical decision support tools will be needed and developed apace to allow the clinician to manage, extract and utilize only the information that will help improve diagnostic accuracy and sensitivity. As medical imaging devices continue to improve in spatial resolution and in functional and anatomical information content, image/data analytics will become more ubiquitous and integral to medical imaging capability.

  2. PV Hourly Simulation Tool

    SciTech Connect

    Dean, Jesse; Metzger, Ian

    2010-12-31

    This software requires inputs of simple, general building characteristics and usage information to calculate the energy and cost benefits of solar PV. The tool conducts a complex hourly simulation of solar PV based primarily on the area available on the rooftop. It uses a simplified efficiency calculation method and real panel characteristics. It includes a detailed rate structure to account for time-of-use rates, on-peak and off-peak pricing, and multiple rate seasons. The tool includes the option for advanced system design inputs if they are known. It calculates energy savings, demand reduction, cost savings, incentives, and building life-cycle costs, including simple payback, discounted payback, net present value, and savings-to-investment ratio. In addition, the tool also displays the environmental benefits of a project.
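
    The life-cycle cost metrics listed above follow standard definitions. The hedged sketch below computes them for illustrative inputs; the numbers and the 25-year horizon are examples, not tool defaults.

        # Standard life-cycle cost metrics for an energy project:
        # simple payback, discounted payback, NPV, savings-to-investment ratio.
        def lifecycle_metrics(install_cost, annual_savings, years=25, discount_rate=0.05):
            simple_payback = install_cost / annual_savings
            # Present value of each year's savings at the given discount rate.
            pv = [annual_savings / (1 + discount_rate) ** y for y in range(1, years + 1)]
            npv = sum(pv) - install_cost
            sir = sum(pv) / install_cost
            # Discounted payback: first year where cumulative PV covers the cost.
            cumulative, discounted_payback = 0.0, None
            for y, v in enumerate(pv, start=1):
                cumulative += v
                if cumulative >= install_cost:
                    discounted_payback = y
                    break
            return simple_payback, discounted_payback, npv, sir

        print(lifecycle_metrics(install_cost=20000.0, annual_savings=2500.0))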

  3. PV Hourly Simulation Tool

    2010-12-31

    This software requires inputs of simple, general building characteristics and usage information to calculate the energy and cost benefits of solar PV. The tool conducts a complex hourly simulation of solar PV based primarily on the area available on the rooftop. It uses a simplified efficiency calculation method and real panel characteristics. It includes a detailed rate structure to account for time-of-use rates, on-peak and off-peak pricing, and multiple rate seasons. The tool includes the option for advanced system design inputs if they are known. It calculates energy savings, demand reduction, cost savings, incentives, and building life-cycle costs, including simple payback, discounted payback, net present value, and savings-to-investment ratio. In addition, the tool also displays the environmental benefits of a project.

  4. Authoring Tools

    NASA Astrophysics Data System (ADS)

    Treviranus, Jutta

    Authoring tools that are accessible and that enable authors to produce accessible Web content play a critical role in web accessibility. Widespread use of authoring tools that comply with the W3C Authoring Tool Accessibility Guidelines (ATAG) would ensure that even authors who are neither knowledgeable about nor particularly motivated to produce accessible content do so by default. The principles and techniques of ATAG are discussed. Some examples of accessible authoring tools are described, including authoring-tool content management components such as TinyMCE. Considerations for creating an accessible collaborative environment are also covered. As part of providing accessible content, the debate between system-based personal optimization and one universally accessible site configuration is presented. The issues and potential solutions to address the accessibility crisis presented by the advent of rich internet applications are outlined. This challenge must be met to ensure that a large segment of the population is able to participate in the move toward the web as a two-way communication mechanism.

  5. Well bore tools

    SciTech Connect

    Burge, E.V.

    1984-08-28

    Well bore tools configured as centralizers/stabilizers, well bore reamers, and keyseat wipers, each of which includes an elongate tubular body having a generally cylindrical outer surface and a diameter approximately equal to the diameter of the borehole being drilled, are disclosed. Each tool affords an improved mode of drilling a borehole by increasing downhole directional control and stability, increasing tool wear reliability, and reducing return mud flow resistance. The outer surface of each tool has a plurality of longitudinal passages formed in pairs of upright intersecting right- and left-hand helices or spirals about the exterior of said tool and extending from one end to the other end thereof. The intersecting right- and left-hand helical or spiral channels form raised pad areas therebetween to provide 360° contiguous well bore contact by each tool for enhanced stability and efficiency. In addition, the intersecting right- and left-hand helical channels afford greater surficial engagement area while providing unobstructed return mud flow paths between each tool and the wall of the borehole. The raised pad areas may have wear-resistant surfaces which are arranged in a configuration for affording constant 360° contiguous contact with the wall of the borehole. Preferably, the wear-resistant surfaces are provided by replaceable inserts mounted in recesses in the pad areas.

  6. Analytical boron diffusivity model in silicon for thermal diffusion from boron silicate glass film

    NASA Astrophysics Data System (ADS)

    Kurachi, Ikuo; Yoshioka, Kentaro

    2015-09-01

    An analytical boron diffusivity model in silicon for thermal diffusion from a boron silicate glass (BSG) film has been proposed in terms of enhanced diffusion due to boron-silicon interstitial pair formation. The silicon interstitial generation is considered to be a result of the silicon kick-out mechanism by the diffused boron at the surface. The additional silicon interstitial generation in the bulk silicon is considered to be the dissociation of the diffused pairs. The former causes surface-boron-concentration-dependent diffusion; the latter causes local-boron-concentration-dependent diffusion. The calculated boron profiles based on the diffusivity model are confirmed to agree with the actual diffusion profiles measured by secondary ion mass spectrometry (SIMS) for a wide range of BSG boron concentrations. This analytical diffusivity model is a helpful tool for optimizing the p+ boron diffusion process in n-type solar cell manufacturing.
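
    The abstract does not give the model's equations, so the hedged sketch below only illustrates the general shape of such a computation: an explicit finite-difference solve of dC/dt = d/dx(D(C) dC/dx) with a made-up concentration-enhanced diffusivity and a fixed surface concentration standing in for the BSG source. All parameters are invented for the example and are not the paper's.

        # Generic 1D concentration-dependent diffusion, explicit scheme.
        import numpy as np

        D0, Cref = 1e-3, 5e19          # um^2/s and cm^-3, illustrative only
        def D(C):                      # enhanced diffusivity at high boron levels
            return D0 * (1.0 + C / Cref)

        nx, dx, dt, steps = 200, 0.005, 1e-3, 20000   # 1 um deep profile
        C = np.zeros(nx)
        C[0] = 1e20                    # fixed surface concentration (BSG source)

        for _ in range(steps):
            Dm = 0.5 * (D(C[1:]) + D(C[:-1]))         # diffusivity at cell interfaces
            flux = Dm * np.diff(C) / dx
            C[1:-1] += dt / dx * np.diff(flux)        # conservative update
            C[0], C[-1] = 1e20, 0.0                   # Dirichlet boundaries

        # Depth where the profile drops below a background-like level.
        print("profile depth (um):", dx * int(np.argmax(C < 1e15)))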

  7. Analytical caustic surfaces

    NASA Technical Reports Server (NTRS)

    Schmidt, R. F.

    1987-01-01

    This document discusses the determination of caustic surfaces in terms of rays, reflectors, and wavefronts. Analytical caustics are obtained as a family of lines, a set of points, and several types of equations for geometries encountered in optics and microwave applications. Standard methods of differential geometry are applied under different approaches: directly to reflector surfaces, and alternatively, to wavefronts, to obtain analytical caustics of two sheets or branches. Gauss/Seidel aberrations are introduced into the wavefront approach, forcing the retention of all three coefficients of both the first and the second fundamental forms of differential geometry. An existing method for obtaining caustic surfaces through exploitation of the singularities in flux density is examined, and several constant-intensity contour maps are developed using only the intrinsic Gaussian, mean, and normal curvatures of the reflector. Numerous references are provided for extending the material of the present document to the morphologies of caustics and their associated diffraction patterns.

  8. Requirements for Predictive Analytics

    SciTech Connect

    Troy Hiltbrand

    2012-03-01

    It is important to have a clear understanding of how traditional Business Intelligence (BI) and analytics are different and how they fit together in optimizing organizational decision making. With traditional BI, activities are focused primarily on providing context to enhance a known set of information through aggregation, data cleansing and delivery mechanisms. As these organizations mature their BI ecosystems, they achieve a clearer picture of the key performance indicators signaling the relative health of their operations. Organizations that embark on activities surrounding predictive analytics and data mining go beyond simply presenting the data in a manner that will allow decision makers to have a complete context around the information. These organizations generate models based on known information and then apply other organizational data against these models to reveal unknown information.

  9. Nuclear analytical chemistry

    SciTech Connect

    Brune, D.; Forkman, B.; Persson, B.

    1984-01-01

    This book covers the general theories and techniques of nuclear chemical analysis, directed at applications in analytical chemistry, nuclear medicine, radiophysics, agriculture, environmental sciences, geological exploration, industrial process control, etc. The main principles of nuclear physics and nuclear detection on which the analysis is based are briefly outlined. An attempt is made to emphasise the fundamentals of activation analysis, detection and activation methods, as well as their applications. The book provides guidance in analytical chemistry, agriculture, environmental and biomedical sciences, etc. The contents include: the nuclear periodic system; nuclear decay; nuclear reactions; nuclear radiation sources; interaction of radiation with matter; principles of radiation detectors; nuclear electronics; statistical methods and spectral analysis; methods of radiation detection; neutron activation analysis; charged particle activation analysis; photon activation analysis; sample preparation and chemical separation; nuclear chemical analysis in biological and medical research; the use of nuclear chemical analysis in the field of criminology; nuclear chemical analysis in environmental sciences, geology and mineral exploration; and radiation protection.

  10. Analytic holographic superconductor

    NASA Astrophysics Data System (ADS)

    Herzog, Christopher P.

    2010-06-01

    We investigate a holographic superconductor that admits an analytic treatment near the phase transition. In the dual 3+1-dimensional field theory, the phase transition occurs when a scalar operator of scaling dimension two gets a vacuum expectation value. We calculate current-current correlation functions along with the speed of second sound near the critical temperature. We also make some remarks about critical exponents. An analytic treatment is possible because an underlying Heun equation describing the zero mode of the phase transition has a polynomial solution. Amusingly, the treatment here may generalize for an order parameter with any integer spin, and we propose a Lagrangian for a spin-two holographic superconductor.

  11. Avatars in Analytical Gaming

    SciTech Connect

    Cowell, Andrew J.; Cowell, Amanda K.

    2009-08-29

    This paper discusses the design and use of anthropomorphic computer characters as non-player characters (NPCs) within analytical games. These new environments allow avatars to play a central role in supporting training and education goals instead of playing the supporting-cast role. This new 'science' of gaming, driven by high-powered but inexpensive computers, dedicated graphics processors and realistic game engines, enables game developers to create learning and training opportunities on par with expensive real-world training scenarios. However, care and attention must be paid to how avatars are represented and thus perceived. A taxonomy of non-verbal behavior is presented and its application to analytical gaming discussed.

  12. Industrial Analytics Corporation

    SciTech Connect

    Industrial Analytics Corporation

    2004-01-30

    The lost foam casting process is sensitive to the properties of the EPS patterns used for the casting operation. In this project Industrial Analytics Corporation (IAC) has developed a new low voltage x-ray instrument for x-ray radiography of very low mass EPS patterns. IAC has also developed a transmitted visible light method for characterizing the properties of EPS patterns. The systems developed are also applicable to other low density materials including graphite foams.

  13. Integrated Array/Metadata Analytics

    NASA Astrophysics Data System (ADS)

    Misev, Dimitar; Baumann, Peter

    2015-04-01

    Data comes in various forms and types, and integration usually presents a problem that is often simply ignored or solved with ad-hoc solutions. Multidimensional arrays are a ubiquitous data type that we find at the core of virtually all science and engineering domains, as sensor, model, image, and statistics data. Naturally, arrays are richly described by, and intertwined with, additional metadata (alphanumeric relational data, XML, JSON, etc.). Database systems, however, a fundamental building block of what we call "Big Data", lack adequate support for modelling and expressing these array data/metadata relationships. Array analytics is hence primitive, or absent altogether, in modern relational DBMSs. Recognizing this, we extended SQL with a new SQL/MDA part, seamlessly integrating multidimensional array analytics into the standard database query language. We demonstrate the benefits of SQL/MDA with real-world examples executed in ASQLDB, an open-source mediator system based on HSQLDB and rasdaman that already implements SQL/MDA.
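
    Standard SQL has no array primitives, which is the gap SQL/MDA fills. As a hedged stand-in, the sketch below bolts a small user-defined function onto SQLite so that a relational metadata filter and an array-slice aggregation can live in one query; the schema and the avg_slice helper are invented for the example and are not SQL/MDA syntax or the ASQLDB dialect.

        # Faking integrated array/metadata analytics with a SQLite UDF:
        # arrays stored as blobs, a metadata predicate in the same query.
        import sqlite3
        import numpy as np

        con = sqlite3.connect(":memory:")
        con.execute("CREATE TABLE sensors (id INTEGER, region TEXT, series BLOB)")
        for i, region in enumerate(["north", "south"]):
            arr = np.random.default_rng(i).normal(15.0, 2.0, 1000)
            con.execute("INSERT INTO sensors VALUES (?, ?, ?)", (i, region, arr.tobytes()))

        # UDF: average of a stored array slice -- the relational filter and the
        # array operation live in one statement, which is the point of SQL/MDA.
        con.create_function("avg_slice", 3, lambda blob, lo, hi:
                            float(np.frombuffer(blob)[lo:hi].mean()))

        for row in con.execute("SELECT id, avg_slice(series, 100, 200) "
                               "FROM sensors WHERE region = 'north'"):
            print(row)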

  14. Management Tools

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Manugistics, Inc. (formerly AVYX, Inc.) has introduced a new programming language for IBM and IBM-compatible computers called TREES-pls. It is a resource management tool originating from the space shuttle program that can be used in such applications as scheduling, resource allocation, project control, information management, and artificial intelligence. Manugistics, Inc. was looking for a flexible tool that could be applied to many problems with minimal adaptation. Among the non-government markets are aerospace, other manufacturing, transportation, health care, food and beverage, and professional services.

  15. Robot Tools

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Mecanotron, now a division of Robotics and Automation Corporation, developed a quick-change welding method called the Automatic Robotics Tool-change System (ARTS) under Marshall Space Flight Center and Rockwell International contracts. The ARTS system has six tool positions ranging from coarse sanding disks and abrasive wheels to cloth polishing wheels with motors of various horsepower. The system is used by fabricators of plastic body parts for the auto industry, by Texas Instruments for making radar domes, and for advanced composites at Aerospatiale in France.

  16. Competing on talent analytics.

    PubMed

    Davenport, Thomas H; Harris, Jeanne; Shapiro, Jeremy

    2010-10-01

    Do investments in your employees actually affect workforce performance? Who are your top performers? How can you empower and motivate other employees to excel? Leading-edge companies such as Google, Best Buy, Procter & Gamble, and Sysco use sophisticated data-collection technology and analysis to answer these questions, leveraging a range of analytics to improve the way they attract and retain talent, connect their employee data to business performance, differentiate themselves from competitors, and more. The authors present the six key ways in which companies track, analyze, and use data about their people, ranging from a simple baseline of metrics to monitor the organization's overall health to custom modeling for predicting future head count under various "what if" scenarios. They go on to show that companies competing on talent analytics manage data and technology at an enterprise level, support what analytical leaders do, choose realistic targets for analysis, and hire analysts with strong interpersonal skills as well as broad expertise. PMID:20929194

  17. Analytical Methods for Immunogenetic Population Data

    PubMed Central

    Mack, Steven J.; Gourraud, Pierre-Antoine; Single, Richard M.; Thomson, Glenys; Hollenbach, Jill A.

    2014-01-01

    In this chapter, we describe analyses commonly applied to immunogenetic population data, along with software tools that are currently available to perform those analyses. Where possible, we focus on tools that have been developed specifically for the analysis of highly polymorphic immunogenetic data. These analytical methods serve both as a means to examine the appropriateness of a dataset for testing a specific hypothesis and as a means of testing hypotheses. Rather than treating this chapter as a protocol for analyzing any population dataset, each researcher and analyst should first consider their data, the possible analyses, and any available tools in light of the hypothesis being tested. The extent to which the data and analyses are appropriate to each other should be determined before any analyses are performed. PMID:22665237
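
    As a minimal example of the kind of summary statistic such tools compute, the sketch below estimates allele frequencies and observed versus expected (Hardy-Weinberg) heterozygosity from genotype data. It is a generic illustration, not a reimplementation of any specific package discussed in the chapter; the HLA-style allele names are placeholders.

    ```python
    from collections import Counter

    def allele_stats(genotypes):
        """Allele frequencies and observed/expected heterozygosity.

        genotypes: list of (allele_a, allele_b) tuples, e.g. HLA typings.
        """
        alleles = Counter()
        het_obs = 0
        for a, b in genotypes:
            alleles[a] += 1
            alleles[b] += 1
            het_obs += a != b                 # heterozygote if alleles differ
        n = 2 * len(genotypes)                # total allele count
        freqs = {al: c / n for al, c in alleles.items()}
        het_exp = 1.0 - sum(f * f for f in freqs.values())  # expected under HWE
        return freqs, het_obs / len(genotypes), het_exp

    freqs, h_obs, h_exp = allele_stats([("A*01", "A*02"), ("A*01", "A*01"),
                                        ("A*02", "A*03"), ("A*01", "A*03")])
    ```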

  18. Transcutaneous Measurement of Blood Analyte Concentration Using Raman Spectroscopy

    NASA Astrophysics Data System (ADS)

    Barman, Ishan; Singh, Gajendra P.; Dasari, Ramachandra R.; Feld, Michael S.

    2008-11-01

    Diabetes mellitus is a chronic disorder, affecting nearly 200 million people worldwide. Acute complications, such as hypoglycemia, cardiovascular disease and retinal damage, may occur if the disease is not adequately controlled. As diabetes has no known cure, tight control of glucose levels is critical for the prevention of such complications. Given the necessity for regular monitoring of blood glucose, development of non-invasive glucose detection devices is essential to improve the quality of life of diabetic patients. Commercially available glucose sensors measure interstitial fluid glucose by electrochemical detection. However, these sensors have severe limitations, primarily related to their invasive nature and lack of stability. This necessitates the development of a truly non-invasive glucose detection technique. NIR Raman spectroscopy, which combines the substantial penetration depth of NIR light with the excellent chemical specificity of Raman spectroscopy, provides an excellent tool to meet these challenges. Additionally, it enables simultaneous determination of multiple blood analytes. Our laboratory has pioneered the use of Raman spectroscopy for blood analyte detection in biological media. The preliminary success of our non-invasive glucose measurements, both in vitro (such as in serum and blood) and in vivo, has provided the foundation for the development of feasible clinical systems. However, successful application of this technology still faces a few hurdles, highlighted by the problems of tissue luminescence and selection of an appropriate reference concentration. In this article we explore possible avenues to overcome these challenges so that prospective prediction accuracy of blood analytes can be brought to clinically acceptable levels.
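
    Multivariate calibration is central to this kind of measurement: the analyte concentration is regressed against the full spectrum rather than a single band. The sketch below uses partial least squares, a method commonly applied to Raman calibration, on synthetic spectra; it is a generic illustration and makes no claim about the authors' actual processing chain.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_samples, n_wavenumbers = 60, 500

    # Synthetic spectra: a concentration-dependent peak plus background and noise.
    conc = rng.uniform(2, 20, n_samples)                 # e.g., mM glucose
    axis = np.linspace(0, 1, n_wavenumbers)
    peak = np.exp(-((axis - 0.4) / 0.02) ** 2)
    spectra = (conc[:, None] * peak
               + 5 * np.sin(2 * np.pi * axis)            # broad background
               + rng.normal(0, 0.5, (n_samples, n_wavenumbers)))

    # Cross-validated calibration: spectrum in, concentration out.
    pls = PLSRegression(n_components=5)
    r2 = cross_val_score(pls, spectra, conc, cv=5, scoring="r2")
    print("cross-validated R^2:", r2.mean())
    ```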

  19. Geometric reasoning about assembly tools

    SciTech Connect

    Wilson, R.H.

    1997-01-01

    Planning for assembly requires reasoning about various tools used by humans, robots, or other automation to manipulate, attach, and test parts and subassemblies. This paper presents a general framework to represent and reason about geometric accessibility issues for a wide variety of such assembly tools. Central to the framework is a use volume encoding a minimum space that must be free in an assembly state to apply a given tool, and placement constraints on where that volume must be placed relative to the parts on which the tool acts. Determining whether a tool can be applied in a given assembly state is then reduced to an instance of the FINDPLACE problem. In addition, the author presents more efficient methods to integrate the framework into assembly planning. For tools that are applied either before or after their target parts are mated, one method pre-processes a single tool application for all possible states of assembly of a product in polynomial time, reducing all later state-tool queries to evaluations of a simple expression. For tools applied after their target parts are mated, a complementary method guarantees polynomial-time assembly planning. The author presents a wide variety of tools that can be described adequately using the approach, and surveys tool catalogs to determine coverage of standard tools. Finally, the author describes an implementation of the approach in an assembly planning system and experiments with a library of over one hundred manual and robotic tools and several complex assemblies.
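
    The core test the framework performs (is the tool's use volume free of obstruction in a given assembly state, at an allowed placement?) can be sketched with axis-aligned boxes. The representation below is a deliberate simplification of the paper's swept-volume geometry, for illustration only.

    ```python
    from dataclasses import dataclass

    @dataclass
    class AABB:
        lo: tuple  # (x, y, z) minimum corner
        hi: tuple  # (x, y, z) maximum corner

        def intersects(self, other):
            return all(self.lo[i] < other.hi[i] and other.lo[i] < self.hi[i]
                       for i in range(3))

    def tool_applicable(use_volume, placements, part_boxes):
        """Return a placement at which the tool's use volume is unobstructed.

        use_volume:  AABB of the space the tool needs, in tool coordinates
        placements:  candidate (dx, dy, dz) offsets allowed by the placement
                     constraints relative to the target parts
        part_boxes:  AABBs of all parts present in the assembly state
        """
        for dx, dy, dz in placements:
            moved = AABB(tuple(a + d for a, d in zip(use_volume.lo, (dx, dy, dz))),
                         tuple(a + d for a, d in zip(use_volume.hi, (dx, dy, dz))))
            if not any(moved.intersects(p) for p in part_boxes):
                return (dx, dy, dz)
        return None  # tool cannot be applied in this assembly state
    ```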

  20. Ternary complexes in analytical chemistry.

    PubMed

    Babko, A K

    1968-08-01

    Reactions between a complex AB and a third component C do not always proceed by a displacement mechanism governed by the energy difference of the chemical bonds A-B and A-C. The third component often becomes part of the complex, forming a mixed co-ordination sphere or ternary complex. The properties of this ternary complex ABC are not additive functions of the properties of AB and AC. Such reactions are important in many methods in analytical chemistry, particularly in photometric analysis, extractive separation, masking, etc. The general properties of the four basic types of ternary complex are reviewed and examples given. The four types comprise the systems (a) metal ion, electronegative ligand, organic base, (b) one metal ion, two different electronegative ligands, (c) ternary heteropoly acids, and (d) two different metal ions, one ligand. PMID:18960358

  1. Image Tool

    SciTech Connect

    Baker, S.A.; Gardner, S.D.; Rogers, M.L.; Sanders, F.; Tunnell, T.W.

    2001-01-01

    ImageTool is a software package developed at Bechtel Nevada, Los Alamos Operations: a set of analysis tools, in the form of image processing software, used to evaluate camera calibration data. Performance measures are used to identify the capabilities and limitations of a camera system, while establishing a means for comparing systems. The camera evaluations are designed to provide system performance, camera comparison, and system modeling information. ImageTool provides basic image restoration and analysis features along with a special set of camera evaluation tools used to standardize camera system characterizations. The process starts with the acquisition of a well-defined set of calibration images. Image processing algorithms provide a consistent means of evaluating the camera calibration data. Performance measures in the areas of sensitivity, noise, and resolution are used as a basis for comparing camera systems and evaluating experimental system performance. Camera systems begin with a charge-coupled device (CCD) camera and optical relay system and may incorporate image intensifiers, electrostatic image tubes, or electron-bombarded charge-coupled devices (EBCCDs). Electro-optical components provide fast shuttering and/or optical gain to camera systems. Camera types evaluated include gated intensified cameras and multi-frame cameras used in applications ranging from X-ray radiography to visible and infrared imaging. It is valuable to evaluate the performance of a camera system in order to determine whether a particular system meets experimental requirements. In this paper we highlight the processing features of ImageTool.
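
    One standard way to extract sensitivity and noise figures from calibration frames is a photon-transfer analysis: the temporal variance of the difference of two flat fields, compared against the mean signal, yields the conversion gain and read noise. The sketch below is a generic version of that procedure under a shot-noise-limited assumption, not ImageTool's own algorithm; the input frames are hypothetical calibration images.

    ```python
    import numpy as np

    def photon_transfer(flat1, flat2, dark):
        """Estimate conversion gain (e-/DN) and read noise from two flat
        fields taken at the same exposure, plus a dark frame.

        Differencing the two flats cancels fixed-pattern noise; the factor
        of 2 accounts for variance adding in the difference.
        """
        signal = 0.5 * (flat1.astype(float) + flat2) - dark
        mean_signal = signal.mean()
        temporal_var = np.var(flat1.astype(float) - flat2) / 2.0
        gain = mean_signal / temporal_var      # e-/DN, shot-noise-limited
        read_noise_dn = np.std(dark.astype(float))
        return gain, gain * read_noise_dn      # read noise in electrons
    ```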

  2. An analytic model for the Phobos surface

    NASA Technical Reports Server (NTRS)

    Duxbury, Thomas C.

    1991-01-01

    Analytic expressions are derived to model the surface topography and the normal to the surface of Phobos. The analytic expressions are comprised of a spherical harmonic expansion for the global figure of Phobos, augmented by additional terms for the large crater Stickney and other craters. Over 300 craters were measured in more than 100 Viking Orbiter images to produce the model. In general, the largest craters were measured since they have a significant effect on topography. The topographic model derived has a global spatial and topographic accuracy ranging from about 100 m in areas having the highest resolution and convergent, stereo coverage, up to 500 m in the poorest areas.
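
    A minimal sketch of evaluating such a shape model is shown below: radius as a truncated real spherical-harmonic series, to which crater terms would be added. The coefficients are placeholders (only the mean radius of about 11.1 km is a real Phobos value); the actual Viking-derived coefficients are not reproduced here.

    ```python
    import numpy as np
    from scipy.special import sph_harm

    def radius(lon, colat, coeffs):
        """Radius of the global figure at (longitude, colatitude), in km.

        coeffs: {(n, m): C_nm} coefficients of a truncated expansion;
        the values used below are illustrative placeholders. Crater
        corrections (e.g., for Stickney) would be added on top.
        """
        r = 0.0
        for (n, m), c in coeffs.items():
            r += c * np.real(sph_harm(m, n, lon, colat))
        return r

    coeffs = {(0, 0): 11.1 * np.sqrt(4 * np.pi),  # mean radius ~11.1 km
              (2, 0): -0.5, (2, 2): 0.3}          # illustrative figure terms
    print(radius(0.0, np.pi / 2, coeffs))         # radius at the equator
    ```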

  3. Analytic estimates of coupling in damping rings

    SciTech Connect

    Raubenheimer, T.O.; Ruth, R.D.

    1989-03-01

    In this paper we present analytic formulas to estimate the vertical emittance in weakly coupled electron/positron storage rings. We consider contributions from both the vertical dispersion and linear coupling of the betatron motions. In addition to simple expressions for random misalignments and rotations of the magnets, formulas are presented to calculate the vertical emittance blowup due to orbit distortions. The orbit distortions are assumed to be caused by random misalignments, but because the closed orbit is correlated from point to point, the effects must be treated differently. We consider only corrected orbits. Finally, the analytic expressions are compared with computer simulations of storage rings with random misalignments. 6 refs., 3 figs.

  4. Thermodynamics of Gas Turbine Cycles with Analytic Derivatives in OpenMDAO

    NASA Technical Reports Server (NTRS)

    Gray, Justin; Chin, Jeffrey; Hearn, Tristan; Hendricks, Eric; Lavelle, Thomas; Martins, Joaquim R. R. A.

    2016-01-01

    A new equilibrium thermodynamics analysis tool was built based on the CEA method using the OpenMDAO framework. The new tool provides forward and adjoint analytic derivatives for use with gradient-based optimization algorithms. The new tool was validated against the original CEA code to ensure an accurate analysis, and the analytic derivatives were validated against finite-difference approximations. Performance comparisons between analytic and finite-difference methods showed a significant speed advantage for the analytic methods. To further test the new analysis tool, a sample optimization was performed to find the optimal air-fuel equivalence ratio maximizing combustion temperature for a range of different pressures. Collectively, the results demonstrate the viability of the new tool to serve as the thermodynamic backbone for future work on a full propulsion modeling tool.
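
    The pattern the paper relies on, components that supply analytic partial derivatives for gradient-based optimization, looks like the following in OpenMDAO. This is a minimal generic component for illustration, not the CEA-based thermodynamics tool itself.

    ```python
    import openmdao.api as om

    class Paraboloid(om.ExplicitComponent):
        """y = (x - 3)^2, with an analytic derivative supplied to the framework."""

        def setup(self):
            self.add_input('x', val=0.0)
            self.add_output('y', val=0.0)

        def setup_partials(self):
            self.declare_partials('y', 'x')

        def compute(self, inputs, outputs):
            outputs['y'] = (inputs['x'] - 3.0) ** 2

        def compute_partials(self, inputs, partials):
            partials['y', 'x'] = 2.0 * (inputs['x'] - 3.0)  # exact, no FD noise

    prob = om.Problem()
    prob.model.add_subsystem('comp', Paraboloid(), promotes=['*'])
    prob.setup()
    prob['x'] = 1.0
    prob.run_model()
    data = prob.check_partials(compact_print=True)  # analytic vs finite difference
    ```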

  5. MERRA Analytic Services

    NASA Astrophysics Data System (ADS)

    Schnase, J. L.; Duffy, D. Q.; McInerney, M. A.; Tamkin, G. S.; Thompson, J. H.; Gill, R.; Grieg, C. M.

    2012-12-01

    MERRA Analytic Services (MERRA/AS) is a cyberinfrastructure resource for developing and evaluating a new generation of climate data analysis capabilities. MERRA/AS supports OBS4MIP activities by reducing the time spent preparing Modern Era Retrospective-Analysis for Research and Applications (MERRA) data used in data-model intercomparison. It also provides a testbed for experimental development of high-performance analytics. MERRA/AS is a cloud-based service built around the Virtual Climate Data Server (vCDS) technology that is currently used by the NASA Center for Climate Simulation (NCCS) to deliver Intergovernmental Panel on Climate Change (IPCC) data to the Earth System Grid Federation (ESGF). Crucial to its effectiveness, MERRA/AS's servers will use a workflow-generated realizable object capability to perform analyses over the MERRA data using the MapReduce approach to parallel storage-based computation. The results produced by these operations will be stored by the vCDS, which will also be able to host code sets for those who wish to explore the use of MapReduce for more advanced analytics. While the work described here focuses on the MERRA collection, these technologies can be used to publish other reanalysis, observational, and ancillary OBS4MIP data to ESGF and, importantly, offer an architectural approach to climate data services that can be generalized to applications and customers beyond the traditional climate research community. In this presentation, we describe our approach, experiences, lessons learned, and plans for the future. [Figure: (A) MERRA/AS software stack; (B) example MERRA/AS interfaces.]
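
    The storage-side computation pattern mentioned here can be sketched in a few lines: a map step that reduces each data chunk (for instance, one granule of a reanalysis variable) to a partial sum, and a reduce step that merges the partials into a global mean. The sketch is schematic and independent of the actual MERRA/AS codebase.

    ```python
    from functools import reduce
    import numpy as np

    def map_chunk(chunk):
        """Per-chunk partial result: (sum of field values, cell count)."""
        return chunk.sum(), chunk.size

    def reduce_partials(a, b):
        """Merge two partial results."""
        return a[0] + b[0], a[1] + b[1]

    # Each random array stands in for one chunk of a gridded variable.
    chunks = [np.random.rand(10, 20) for _ in range(12)]
    total, count = reduce(reduce_partials, map(map_chunk, chunks))
    print("global mean:", total / count)
    ```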

  6. Automation of analytical isotachophoresis

    NASA Technical Reports Server (NTRS)

    Thormann, Wolfgang

    1985-01-01

    The basic features of automation of analytical isotachophoresis (ITP) are reviewed. Experimental setups consisting of narrow bore tubes which are self-stabilized against thermal convection are considered. Sample detection in free solution is discussed, listing the detector systems presently used or expected to be of potential use in the near future. The combination of a universal detector measuring the evolution of ITP zone structures with detector systems specific to desired components is proposed as a concept of an automated chemical analyzer based on ITP. Possible miniaturization of such an instrument by means of microlithographic techniques is discussed.

  7. The GNEMRE Dendro Tool.

    SciTech Connect

    Merchant, Bion John

    2007-10-01

    The GNEMRE Dendro Tool provides a previously unrealized analysis capability in the field of nuclear explosion monitoring. Dendro Tool allows analysts to quickly and easily determine the similarity between seismic events using the waveform time-series for each of the events to compute cross-correlation values. Events can then be categorized into clusters of similar events. This analysis technique can be used to characterize historical archives of seismic events in order to determine many of the unique sources that are present. In addition, the source of any new events can be quickly identified simply by comparing the new event to the historical set.
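
    The similarity measure described, a normalized cross-correlation between event waveforms, can be sketched as below; this is the textbook computation and a generic clustering rule, not Dendro Tool's implementation.

    ```python
    import numpy as np

    def max_norm_xcorr(x, y):
        """Peak normalized cross-correlation between two waveforms.

        Values near 1 indicate events with nearly identical source and path.
        """
        x = (x - x.mean()) / (np.std(x) * len(x))
        y = (y - y.mean()) / np.std(y)
        return np.max(np.correlate(x, y, mode="full"))

    def assign(event, clusters, threshold=0.8):
        """Join an existing cluster if correlated above threshold, else
        start a new cluster with this event as its representative."""
        for rep, members in clusters:
            if max_norm_xcorr(event, rep) >= threshold:
                members.append(event)
                return
        clusters.append((event, [event]))
    ```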

  8. CMS tracker visualization tools

    NASA Astrophysics Data System (ADS)

    Mennea, M. S.; Osborne, I.; Regano, A.; Zito, G.

    2005-08-01

    This document reviews the design considerations, implementations, and performance of the CMS Tracker visualization tools. In view of the great complexity of this sub-detector (more than 50 million channels organized in 16,540 modules, each of which is a complete detector), the standard CMS visualization tools (IGUANA and IGUANACMS), which provide basic 3D capabilities and integration within the CMS framework, respectively, have been complemented with additional 2D graphics objects. Based on the experience acquired using this software to debug and understand both hardware and software during the construction phase, we propose possible future improvements to cope with online monitoring and event analysis during data taking.

  9. Risk analytics for hedge funds

    NASA Astrophysics Data System (ADS)

    Pareek, Ankur

    2005-05-01

    The rapid growth of the hedge fund industry presents a significant business opportunity for institutional investors, particularly in the form of portfolio diversification. To facilitate this, there is a need to develop a new set of risk analytics for investments consisting of hedge funds, with the ultimate aim of creating transparency in risk measurement without compromising the proprietary investment strategies of hedge funds. As is well documented in the literature, the use of dynamic, option-like strategies by most hedge funds makes their returns highly non-normal, with fat tails and high kurtosis, thus rendering Value at Risk (VaR) and other mean-variance analysis methods unsuitable for hedge fund risk quantification. This paper looks at some unique concerns for hedge fund risk management and concentrates on two approaches from the physical world for modelling the non-linearities and dynamic correlations in hedge fund portfolio returns: Self-Organized Criticality (SOC) and Random Matrix Theory (RMT). Random Matrix Theory analyzes the correlation matrix between different hedge fund styles and filters random noise from genuine correlations arising from interactions within the system. As seen in the results of the portfolio risk analysis, it leads to better portfolio risk forecastability and thus to optimum allocation of resources to different hedge fund styles. The results also prove the efficacy of self-organized criticality and implied portfolio correlation as tools for risk management and style selection for portfolios of hedge funds, being particularly effective during non-linear market crashes.
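
    The RMT step can be made concrete: eigenvalues of the empirical correlation matrix lying inside the Marchenko-Pastur band expected for pure noise are discarded, and the matrix is rebuilt from the remaining, information-bearing modes. The code below is a generic illustration of that filtering on random data, with no connection to the paper's dataset.

    ```python
    import numpy as np

    def rmt_filter(returns):
        """Filter a correlation matrix by removing Marchenko-Pastur noise modes.

        returns: (T, N) array of T observations for N return series.
        """
        T, N = returns.shape
        corr = np.corrcoef(returns, rowvar=False)
        lam, vec = np.linalg.eigh(corr)
        q = T / N
        lam_max = (1 + 1 / np.sqrt(q)) ** 2     # MP upper edge, unit variance
        keep = lam > lam_max                     # genuine correlation modes
        filtered = (vec[:, keep] * lam[keep]) @ vec[:, keep].T
        np.fill_diagonal(filtered, 1.0)          # restore unit self-correlation
        return filtered

    filtered = rmt_filter(np.random.randn(500, 50))
    ```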

  10. Downhole tool

    DOEpatents

    Hall, David R.; Muradov, Andrei; Pixton, David S.; Dahlgren, Scott Steven; Briscoe, Michael A.

    2007-03-20

    A double shouldered downhole tool connection comprises box and pin connections having mating threads intermediate mating primary and secondary shoulders. The connection further comprises a secondary shoulder component retained in the box connection intermediate a floating component and the primary shoulders. The secondary shoulder component and the pin connection cooperate to transfer a portion of makeup load to the box connection. The downhole tool may be selected from the group consisting of drill pipe, drill collars, production pipe, and reamers. The floating component may be selected from the group consisting of electronics modules, generators, gyroscopes, power sources, and stators. The secondary shoulder component may comprise an interface to the box connection selected from the group consisting of radial grooves, axial grooves, tapered grooves, radial protrusions, axial protrusions, tapered protrusions, shoulders, and threads.

  11. Visual analytics in medical education: impacting analytical reasoning and decision making for quality improvement.

    PubMed

    Vaitsis, Christos; Nilsson, Gunnar; Zary, Nabil

    2015-01-01

    The medical curriculum is the main tool representing the entire undergraduate medical education. Due to its complexity and multilayered structure, it is of limited use to teachers in medical education for quality improvement purposes. In this study we evaluated three visualizations of curriculum data from a pilot course, using teachers from an undergraduate medical program and applying visual analytics methods. We found that visual analytics can positively impact analytical reasoning and decision making in medical education by realizing variables that enhance human perception and cognition of complex curriculum data. The positive results derived from our evaluation of a medical curriculum, albeit at a small scale, signify the need to extend this method to an entire medical curriculum. As our approach sustains low levels of complexity, it opens a promising new direction in medical education informatics research. PMID:25991109

  12. Analytical and Computational Aspects of Collaborative Optimization

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia M.; Lewis, Robert Michael

    2000-01-01

    Bilevel problem formulations have received considerable attention as an approach to multidisciplinary optimization in engineering. We examine the analytical and computational properties of one such approach, collaborative optimization. The resulting system-level optimization problems suffer from inherent computational difficulties due to the bilevel nature of the method. Most notably, it is impossible to characterize and hence identify solutions of the system-level problems because the standard first-order conditions for solutions of constrained optimization problems do not hold. The analytical features of the system-level problem make it difficult to apply conventional nonlinear programming algorithms. Simple examples illustrate the analysis and the algorithmic consequences for optimization methods. We conclude with additional observations on the practical implications of the analytical and computational properties of collaborative optimization.
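
    For orientation, the system-level problem in collaborative optimization is typically stated as follows (a schematic form of the standard formulation; the notation is ours, not the paper's): the system level chooses targets z, and each discipline i returns the minimized discrepancy J_i between z and a feasible local design.

    ```latex
    \begin{aligned}
    \min_{z}\quad & F(z) \\
    \text{s.t.}\quad & J_i(z) \;=\; \min_{x_i \in \mathcal{X}_i}\,
        \lVert z - x_i \rVert_2^2 \;=\; 0, \qquad i = 1, \dots, N.
    \end{aligned}
    ```

    The constraints J_i(z) = 0 are the source of the degeneracy discussed in the abstract: each J_i is a squared distance that attains its minimum value of zero at a solution, so its gradient vanishes there and the standard constraint qualifications fail.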

  13. Learning Analytics: Readiness and Rewards

    ERIC Educational Resources Information Center

    Friesen, Norm

    2013-01-01

    This position paper introduces the relatively new field of learning analytics, first by considering the relevant meanings of both "learning" and "analytics," and then by looking at two main levels at which learning analytics can be or has been implemented in educational organizations. Although integrated turnkey systems or…

  14. The analytic renormalization group

    NASA Astrophysics Data System (ADS)

    Ferrari, Frank

    2016-08-01

    Finite temperature Euclidean two-point functions in quantum mechanics or quantum field theory are characterized by a discrete set of Fourier coefficients Gk, k ∈ Z, associated with the Matsubara frequencies νk = 2πk/β. We show that analyticity implies that the coefficients Gk must satisfy an infinite number of model-independent linear equations that we write down explicitly. In particular, we construct "Analytic Renormalization Group" linear maps Aμ which, for any choice of cut-off μ, allow one to express the low energy Fourier coefficients for |νk| < μ (with the possible exception of the zero mode G0), together with the real-time correlators and spectral functions, in terms of the high energy Fourier coefficients for |νk| ≥ μ. Using a simple numerical algorithm, we show that the exact universal linear constraints on Gk can be used to systematically improve any random approximate data set obtained, for example, from Monte-Carlo simulations. Our results are illustrated on several explicit examples.

  15. Infrared Spectroscopy as a Chemical Fingerprinting Tool

    NASA Technical Reports Server (NTRS)

    Huff, Timothy L.

    2003-01-01

    Infrared (IR) spectroscopy is a powerful analytical tool in the chemical fingerprinting of materials. Any sample material that will interact with infrared light produces a spectrum and, although normally associated with organic materials, inorganic compounds may also be infrared active. The technique is rapid, reproducible and usually non-invasive to the sample. That it is non-invasive allows for additional characterization of the original material using other analytical techniques, including thermal analysis and Raman spectroscopic techniques. With the appropriate accessories, the technique can be used to examine samples in liquid, solid or gas phase. Both aqueous and non-aqueous free-flowing solutions can be analyzed, as can viscous liquids such as heavy oils and greases. Solid samples of varying sizes and shapes may also be examined and, with the addition of microscopic IR (microspectroscopy) capabilities, minute materials such as single fibers and threads may be analyzed. With the addition of appropriate software, microspectroscopy can be used for automated discrete point or compositional surface area mapping, with the latter providing a means to record changes in the chemical composition of a material surface over a defined area. Due to the ability to characterize gaseous samples, IR spectroscopy can also be coupled with thermal processes such as thermogravimetric (TG) analyses to provide both thermal and chemical data in a single run. In this configuration, solids (or liquids) heated in a TG analyzer undergo decomposition, with the evolving gases directed into the IR spectrometer. Thus, information is provided on the thermal properties of a material and the order in which its chemical constituents are broken down during incremental heating. Specific examples of these varied applications will be cited, with data interpretation and method limitations further discussed.

  16. Bolstering Teaching through Online Tools

    ERIC Educational Resources Information Center

    Singh, Anil; Mangalaraj, George; Taneja, Aakash

    2010-01-01

    This paper offers a compilation of technologies that provides either free or low-cost solutions to the challenges of teaching online courses. It presents various teaching methods the outlined tools and technologies can support, with emphasis on fit between these tools and the tasks they are meant to serve. In addition, it highlights various…

  17. The forensic validity of visual analytics

    NASA Astrophysics Data System (ADS)

    Erbacher, Robert F.

    2008-01-01

    The wider use of visualization and visual analytics in wide-ranging fields has led to the need for visual analytics capabilities to be legally admissible, especially when applied to digital forensics. This brings the need to consider legal implications when performing visual analytics, an issue not traditionally examined in visualization and visual analytics techniques and research. While digital data is generally admissible under the Federal Rules of Evidence [10][21], a comprehensive validation of the digital evidence is considered prudent. A comprehensive validation requires validation of the digital data under rules for authentication, hearsay, best evidence rule, and privilege. Additional issues with digital data arise when exploring digital data related to admissibility and the validity of what information was examined, to what extent, and whether the analysis process was sufficiently covered by a search warrant. For instance, a search warrant generally covers very narrow requirements as to what law enforcement is allowed to examine and acquire during an investigation. When searching a hard drive for child pornography, how admissible is evidence of an unrelated crime, e.g., drug dealing? This is further complicated by the concept of "in plain view": what would be considered "in plain view" when analyzing a hard drive? The purpose of this paper is to discuss the issues of digital forensics as they apply to visual analytics and to identify how visual analytics techniques fit into the digital forensics analysis process, how visual analytics techniques can improve the legal admissibility of digital data, and what research is needed to further improve this process. The goal of this paper is to open up consideration of legal ramifications among the visualization community; the author is not a lawyer and the discussions are not meant to be inclusive of all differences in laws between states and

  18. Knowledge-analytics synergy in Clinical Decision Support.

    PubMed

    Slonim, Noam; Carmeli, Boaz; Goldsteen, Abigail; Keller, Oliver; Kent, Carmel; Rinott, Ruty

    2012-01-01

    Clinical Decision Support (CDS) systems hold tremendous potential for improving patient care. Most existing systems are knowledge-based tools that rely on relatively simple rules. More recent approaches rely on analytics techniques to automatically mine EHR data to reveal meaningful insights. Here, we propose the Knowledge-Analytics Synergy paradigm for CDS, in which we synergistically combine existing relevant knowledge with analytics applied to EHR data. We propose a framework for implementing such a paradigm and demonstrate its principles over real-world clinical and genomic data of hypertensive patients. PMID:22874282

  19. TACT: The Action Computation Tool

    NASA Astrophysics Data System (ADS)

    Sanders, Jason L.; Binney, James

    2015-12-01

    The Action Computation Tool (TACT) tests methods for estimating actions, angles and frequencies of orbits in both axisymmetric and triaxial potentials, including general spherical potentials, analytic potentials (isochrone and harmonic oscillator), the axisymmetric Stäckel fudge, average generating function from orbit (AvGF), and others. It is written in C++; code is provided to compile the routines into a Python library. TM (ascl:1512.014) and LAPACK are required to access some features.

  20. Applications of surface analytical techniques in Earth Sciences

    NASA Astrophysics Data System (ADS)

    Qian, Gujie; Li, Yubiao; Gerson, Andrea R.

    2015-03-01

    This review covers a wide range of surface analytical techniques: X-ray photoelectron spectroscopy (XPS), scanning photoelectron microscopy (SPEM), photoemission electron microscopy (PEEM), dynamic and static secondary ion mass spectroscopy (SIMS), electron backscatter diffraction (EBSD), atomic force microscopy (AFM). Others that are relatively less widely used but are also important to the Earth Sciences are also included: Auger electron spectroscopy (AES), low energy electron diffraction (LEED) and scanning tunnelling microscopy (STM). All these techniques probe only the very top sample surface layers (sub-nm to several tens of nm). In addition, we also present several other techniques i.e. Raman microspectroscopy, reflection infrared (IR) microspectroscopy and quantitative evaluation of minerals by scanning electron microscopy (QEMSCAN) that penetrate deeper into the sample, up to several μm, as all of them are fundamental analytical tools for the Earth Sciences. Grazing incidence synchrotron techniques, sensitive to surface measurements, are also briefly introduced at the end of this review. (Scanning) transmission electron microscopy (TEM/STEM) is a special case that can be applied to characterisation of mineralogical and geological sample surfaces. Since TEM/STEM is such an important technique for Earth Scientists, we have also included it to draw attention to the capability of TEM/STEM applied as a surface-equivalent tool. While this review presents most of the important techniques for the Earth Sciences, it is not an all-inclusive bibliography of those analytical techniques. Instead, for each technique that is discussed, we first give a very brief introduction about its principle and background, followed by a short section on approaches to sample preparation that are important for researchers to appreciate prior to the actual sample analysis. We then use examples from publications (and also some of our known unpublished results) within the Earth Sciences

  1. [Food additives and healthiness].

    PubMed

    Heinonen, Marina

    2014-01-01

    Additives are used for improving food structure or preventing its spoilage, for example. Many substances used as additives are also naturally present in food. The safety of additives is evaluated according to commonly agreed principles. If high concentrations of an additive cause adverse health effects for humans, a limit of acceptable daily intake (ADI) is set for it. An additive is a risk only when ADI is exceeded. The healthiness of food is measured on the basis of nutrient density and scientifically proven effects. PMID:24772784

  2. Polyimide processing additives

    NASA Technical Reports Server (NTRS)

    Pratt, J. R.; St. Clair, T. L.; Burks, H. D.; Stoakley, D. M.

    1987-01-01

    A method has been found for enhancing the melt flow of thermoplastic polyimides during processing. A high molecular weight 422 copoly(amic acid) or copolyimide was fused with approximately 0.05 to 5 pct by weight of a low molecular weight amic acid or imide additive, and this melt was studied by capillary rheometry. Excellent flow and improved composite properties on graphite resulted from the addition of a PMDA-aniline additive to LARC-TPI. Solution viscosity studies imply that amic acid additives temporarily lower molecular weight and, hence, enlarge the processing window. Thus, compositions containing the additive have a lower melt viscosity for a longer time than those unmodified.

  3. VERDE Analytic Modules

    SciTech Connect

    2008-01-15

    The Verde Analytic Modules permit the user to ingest openly available data feeds about phenomenology (storm tracks, wind, precipitation, earthquakes, wildfires, and similar natural and manmade power grid disruptions) and forecast power outages, restoration times, customers outaged, and key facilities that will lose power. Damage areas are predicted using historic damage criteria of the affected area. The modules use a cellular automata approach to estimate the distribution circuits assigned to geo-located substations. Population estimates served within the service areas are located within 1 km grid cells and converted to customer counts through demographic estimation of households and commercial firms within the population cells. Restoration times are estimated by agent-based simulation of restoration crews working according to utility-published prioritization calibrated by historic performance.

  5. Normality in Analytical Psychology

    PubMed Central

    Myers, Steve

    2013-01-01

    Although C.G. Jung’s interest in normality wavered throughout his career, it was one of the areas he identified in later life as worthy of further research. He began his career using a definition of normality which would have been the target of Foucault’s criticism, had Foucault chosen to review Jung’s work. However, Jung then evolved his thinking to a standpoint that was more aligned to Foucault’s own. Thereafter, the post Jungian concept of normality has remained relatively undeveloped by comparison with psychoanalysis and mainstream psychology. Jung’s disjecta membra on the subject suggest that, in contemporary analytical psychology, too much focus is placed on the process of individuation to the neglect of applications that consider collective processes. Also, there is potential for useful research and development into the nature of conflict between individuals and societies, and how normal people typically develop in relation to the spectrum between individuation and collectivity. PMID:25379262

  6. Time-domain Raman analytical forward solvers.

    PubMed

    Martelli, Fabrizio; Binzoni, Tiziano; Sekar, Sanathana Konugolu Venkata; Farina, Andrea; Cavalieri, Stefano; Pifferi, Antonio

    2016-09-01

    A set of time-domain analytical forward solvers for Raman signals detected from homogeneous diffusive media is presented. The time-domain solvers have been developed for two geometries: the parallelepiped and the finite cylinder. The potential presence of a background fluorescence emission, contaminating the Raman signal, has also been taken into account. All the solvers have been obtained as solutions of the time dependent diffusion equation. The validation of the solvers has been performed by means of comparisons with the results of "gold standard" Monte Carlo simulations. These forward solvers provide an accurate tool to explore the information content encoded in the time-resolved Raman measurements. PMID:27607645
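
    For a flavor of what such forward solvers evaluate, the sketch below implements the simplest time-domain diffusion-equation solution: the fluence in an infinite homogeneous medium after a point-source pulse, the building block from which bounded geometries such as the slab, parallelepiped, and cylinder are typically assembled by image-source methods. It is illustrative only and omits the Raman and fluorescence coupling treated in the paper.

    ```python
    import numpy as np

    def td_fluence_infinite(r, t, mua, musp, n=1.4):
        """Time-domain fluence at distance r (mm) and time t (ps) after an
        isotropic point pulse in an infinite homogeneous medium.

        mua:  absorption coefficient (1/mm)
        musp: reduced scattering coefficient (1/mm)
        """
        c = 0.299792458 / n            # speed of light in the medium, mm/ps
        D = 1.0 / (3.0 * musp)         # diffusion coefficient, mm
        t = np.asarray(t, dtype=float)
        return (c * (4.0 * np.pi * D * c * t) ** -1.5
                * np.exp(-r ** 2 / (4.0 * D * c * t) - mua * c * t))

    t = np.linspace(50, 4000, 200)     # ps
    phi = td_fluence_infinite(r=20.0, t=t, mua=0.01, musp=1.0)
    ```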

  7. PAUSE: Predictive Analytics Using SPARQL-Endpoints

    2014-07-11

    This invention relates to the medical industry and more specifically to methods of predicting risks. With the impetus towards personalized and evidence-based medicine, the need for a framework to analyze/interpret quantitative measurements (blood work, toxicology, etc.) with qualitative descriptions (specialist reports after reading images, bio-medical knowledgebase, etc.) to predict diagnostic risks is fast emerging. We describe a software solution that leverages hardware for scalable in-memory analytics and applies next-generation semantic query tools on medical data.
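
    A minimal example of querying a SPARQL endpoint from Python, in the spirit of the framework described, is sketched below. The endpoint URL, prefix, and vocabulary are placeholders; PAUSE's actual schema and interfaces are not specified in this abstract.

    ```python
    from SPARQLWrapper import SPARQLWrapper, JSON  # pip install sparqlwrapper

    # Placeholder endpoint and vocabulary, for illustration only.
    endpoint = SPARQLWrapper("http://example.org/medical/sparql")
    endpoint.setQuery("""
        PREFIX ex: <http://example.org/ns#>
        SELECT ?patient ?ldl
        WHERE {
            ?patient ex:hasLabResult ?r .
            ?r ex:test "LDL" ; ex:value ?ldl .
            FILTER (?ldl > 160)
        }
    """)
    endpoint.setReturnFormat(JSON)
    results = endpoint.query().convert()

    for row in results["results"]["bindings"]:
        print(row["patient"]["value"], row["ldl"]["value"])
    ```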

  8. Algal functional annotation tool

    SciTech Connect

    Lopez, D.; Casero, D.; Cokus, S. J.; Merchant, S. S.; Pellegrini, M.

    2012-07-01

    The Algal Functional Annotation Tool is a web-based comprehensive analysis suite integrating annotation data from several pathway, ontology, and protein family databases. The current version provides annotation for the model alga Chlamydomonas reinhardtii, and in the future will include additional genomes. The site allows users to interpret large gene lists by identifying associated functional terms, and their enrichment. Additionally, expression data for several experimental conditions were compiled and analyzed to provide an expression-based enrichment search. A tool to search for functionally-related genes based on gene expression across these conditions is also provided. Other features include dynamic visualization of genes on KEGG pathway maps and batch gene identifier conversion.

  9. Analytical Chemistry Laboratory progress report for FY 1991

    SciTech Connect

    Green, D.W.; Heinrich, R.R.; Graczyk, D.G.; Lindahl, P.C.; Boparai, A.S.

    1991-12-01

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year 1991 (October 1990 through September 1991). This is the eighth annual report for the ACL. The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. In addition, the ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques.

  10. Analytical Chemistry Laboratory: Progress report for FY 1988

    SciTech Connect

    Green, D.W.; Heinrich, R.R.; Graczyk, D.G.; Lindahl, P.C.; Erickson, M.D.

    1988-12-01

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for fiscal year 1988 (October 1987 through September 1988). The Analytical Chemistry Laboratory is a full-cost recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. In addition, the ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques.

  11. Tool Gear: Infrastructure for Parallel Tools

    SciTech Connect

    May, J; Gyllenhaal, J

    2003-04-17

    Tool Gear is a software infrastructure for developing performance analysis and other tools. Unlike existing integrated toolkits, which focus on providing a suite of capabilities, Tool Gear is designed to help tool developers create new tools quickly. It combines dynamic instrumentation capabilities with an efficient database and a sophisticated and extensible graphical user interface. This paper describes the design of Tool Gear and presents examples of tools that have been built with it.

  12. Inspection of the Math Model Tools for On-Orbit Assessment of Impact Damage Report

    NASA Technical Reports Server (NTRS)

    Harris, Charles E.; Raju, Ivatury S.; Piascik, Robert S.; KramerWhite, Julie A.; Labbe, Steve G.; Rotter, Hank A.

    2007-01-01

    In the spring of 2005, the NASA Engineering and Safety Center (NESC) was engaged by the Space Shuttle Program (SSP) to peer review the suite of analytical tools being developed to support the determination of impact and damage tolerance of the Orbiter Thermal Protection Systems (TPS). The NESC formed an independent review team with the core disciplines of materials, flight sciences, structures, mechanical analysis, and thermal analysis. The Math Model Tools reviewed included damage prediction and stress analysis, aeroheating analysis, and thermal analysis tools. Some tools are physics-based and others are empirically derived. Each tool was created for a specific use and timeframe, including certification and real-time pre-launch assessments. In addition, the tools are used together in an integrated strategy for assessing the ramifications of impact damage to tile and RCC. The NESC team conducted a peer review of the engineering data package for each Math Model Tool. This report contains the summary of the team observations and recommendations from these reviews.

  13. COMPARISON OF RESPONSE OF 9977 TEST PACKAGES TO ANALYTICAL RESULTS

    SciTech Connect

    Smith, A.; Wu, Tsu-Te

    2007-12-05

    Each of the hypothetical accident test cases for the 9977 prototypes was included in the battery of finite element structural analyses performed for the package. Comparison of the experimental and analytical results provides a means of confirming that the analytical model correctly represents the physical behavior of the package. The ability of the analytical model to correctly predict the performance of the foam overpack material for the crush test is of particular interest. The dissipation of energy in the crushing process determines the deceleration of the package upon impact and the duration of the impact. In addition, if the analytical model correctly models the foam behavior, the predicted deformation of the package will match that measured on the test articles. This study compares the deformations of the test packages with the analytical predictions. In addition, the impact acceleration and impact duration for the test articles are compared with those predicted by the analyses.

  14. Analytic integrable systems: Analytic normalization and embedding flows

    NASA Astrophysics Data System (ADS)

    Zhang, Xiang

    In this paper we mainly study the existence of analytic normalization and the normal form of finite dimensional complete analytic integrable dynamical systems. In more detail, we prove that any complete analytic integrable diffeomorphism F(x) = Bx + f(x) in (C^n, 0), with B having no eigenvalues of modulus 1 and f(x) = O(|x|^2), is locally analytically conjugate to its normal form. Meanwhile, we also prove that any complete analytic integrable differential system ẋ = Ax + f(x) in (C^n, 0), with A having nonzero eigenvalues and f(x) = O(|x|^2), is locally analytically conjugate to its normal form. Furthermore, we prove that any complete analytic integrable diffeomorphism defined on an analytic manifold can be embedded in a complete analytic integrable flow. We note that parts of our results improve those of Moser in J. Moser, The analytic invariants of an area-preserving mapping near a hyperbolic fixed point, Comm. Pure Appl. Math. 9 (1956) 673-692, and of Poincaré in H. Poincaré, Sur l'intégration des équations différentielles du premier ordre et du premier degré, II, Rend. Circ. Mat. Palermo 11 (1897) 193-239. These results also improve the ones in Xiang Zhang, Analytic normalization of analytic integrable systems and the embedding flows, J. Differential Equations 244 (2008) 1080-1092, in the sense that the linear part of the systems can be nonhyperbolic, and the one in N.T. Zung, Convergence versus integrability in Poincaré-Dulac normal form, Math. Res. Lett. 9 (2002) 217-228, in that our paper presents the concrete expression of the normal form in a restricted case.

  15. Non-commutative tools for topological insulators

    NASA Astrophysics Data System (ADS)

    Prodan, Emil

    2010-06-01

    This paper reviews several analytic tools for the field of topological insulators, developed with the aid of non-commutative calculus and geometry. The set of tools includes bulk topological invariants defined directly in the thermodynamic limit and in the presence of disorder, whose robustness is shown to have nontrivial physical consequences for the bulk states. The set of tools also includes a general relation between the current of an observable and its edge index, a relation that can be used to investigate the robustness of the edge states against disorder. The paper focuses on the motivations behind creating such tools and on how to use them.

  16. Additive usage levels.

    PubMed

    Langlais, R

    1996-01-01

    With the adoption of the European Parliament and Council Directives on sweeteners, colours and miscellaneous additives, the Commission is now embarking on the project of coordinating the activities of the European Union Member States in the collection of the data that are to make up the report on food additive intake requested by the European Parliament. This presentation looks at the inventory of available sources on additive use levels and concludes that, for the time being, national legislation is still the best source of information, considering that the directives have yet to be transposed into national legislation. Furthermore, this presentation covers the correlation of the food categories as found in the additives directives with those used by national consumption surveys, and finds that in a number of instances this correlation still leaves a lot to be desired. The intake of additives via food ingestion, and the intake of substances which are chemically identical to additives but which occur naturally in fruits and vegetables, is found in a number of cases to be higher than the intake of additives added during the manufacture of foodstuffs. While the difficulties in contributing to the compilation of food additive intake data are recognized, industry as a whole, i.e. the food manufacturing and food additive manufacturing industries, is confident that in a concerted effort, use data on food additives by industry can be made available. Lastly, the paper points out that with the transposition of the additives directives into national legislation, and the time by which the food industry will be able to make use of the new food legislative environment, several years will still go by; food additive use data from the food industry will thus have to be reviewed at the beginning of the next century. PMID:8792135

  17. An additional middle cuneiform?

    PubMed Central

    Brookes-Fazakerley, S.D.; Jackson, G.E.; Platt, S.R.

    2015-01-01

    Additional cuneiform bones of the foot have been described in reference to the medial bipartite cuneiform or as small accessory ossicles. An additional middle cuneiform has not been previously documented. We present the case of a patient with an additional ossicle that has the appearance and location of an additional middle cuneiform. Recognizing such an anatomical anomaly is essential for ruling out second metatarsal base or middle cuneiform fractures and for the preoperative planning of arthrodesis or open reduction and internal fixation procedures in this anatomical location. PMID:26224890

  18. Analytic streamline calculations on linear tetrahedra

    SciTech Connect

    Diachin, D.P.; Herzog, J.A.

    1997-06-01

    Analytic solutions for streamlines within tetrahedra are used to define operators that accurately and efficiently compute streamlines. The method presented here is based on linear interpolation, and therefore produces exact results for linear velocity fields. In addition, the method requires less computation than the forward Euler numerical method. Results are presented that compare accuracy measurements of the method with forward Euler and fourth order Runge-Kutta applied to both a linear and a nonlinear velocity field.
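
    Within a single tetrahedron, linear interpolation makes the velocity field affine, v(x) = Ax + b, so the streamline ODE has a closed-form solution via the matrix exponential. The sketch below evaluates that solution directly; it is one standard formulation of an analytic update, and the paper's operators may be organized differently.

    ```python
    import numpy as np
    from scipy.linalg import expm

    def analytic_streamline_step(A, b, x0, t):
        """Exact solution of dx/dt = A x + b with x(0) = x0, at time t.

        Uses the augmented-matrix trick so A need not be invertible:
        d/dt [x; 1] = [[A, b], [0, 0]] [x; 1].
        """
        M = np.zeros((4, 4))
        M[:3, :3] = A
        M[:3, 3] = b
        aug = expm(M * t) @ np.append(x0, 1.0)
        return aug[:3]

    # Affine velocity field of one tet: a rotation about z plus axial drift.
    A = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 0.0]])
    b = np.array([0.0, 0.0, 0.1])
    print(analytic_streamline_step(A, b, x0=np.array([1.0, 0.0, 0.0]), t=0.5))
    ```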

  19. Analytic sequential methods for detecting network intrusions

    NASA Astrophysics Data System (ADS)

    Chen, Xinjia; Walker, Ernest

    2014-05-01

    In this paper, we propose an analytic sequential method for detecting port-scan attackers, which routinely perform random "portscans" of IP addresses to find vulnerable servers to compromise. In addition to rigorously controlling the probability of falsely implicating benign remote hosts as malicious, our method performs significantly faster than other current solutions. We have developed explicit formulae for quick determination of the parameters of the new detection algorithm.
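
    Sequential detection of this kind is classically built on Wald's sequential probability ratio test: each connection-attempt outcome updates a log-likelihood ratio, and stopping thresholds are set directly from the target false-positive and false-negative rates. The sketch below is the textbook SPRT with illustrative probabilities, not the authors' specific analytic construction.

    ```python
    import math

    def sprt(outcomes, p_benign=0.8, p_scanner=0.2, alpha=0.001, beta=0.01):
        """Wald SPRT over connection outcomes (True = successful connection).

        A scanner probing random addresses succeeds rarely (p_scanner);
        a benign host mostly connects to hosts that exist (p_benign).
        alpha: max probability of flagging a benign host;
        beta:  max probability of missing a scanner.
        """
        upper = math.log((1 - beta) / alpha)   # crossing => declare "scanner"
        lower = math.log(beta / (1 - alpha))   # crossing => declare "benign"
        llr = 0.0
        for ok in outcomes:
            if ok:
                llr += math.log(p_scanner / p_benign)
            else:
                llr += math.log((1 - p_scanner) / (1 - p_benign))
            if llr >= upper:
                return "scanner"
            if llr <= lower:
                return "benign"
        return "undecided"

    print(sprt([False, False, True, False, False, False]))
    ```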

  20. Mental health screening tools in correctional institutions: a systematic review

    PubMed Central

    2013-01-01

    Background Past studies have identified poor rates of detection of mental illness among inmates. Consequently, mental health screening is a common feature of various correctional mental health strategies and best practice guidelines. However, there is little guidance to support the selection of an appropriate tool. This systematic review compared the sensitivity and specificity of mental health screening tools among adult jail or prison populations. Methods A systematic review of MEDLINE and PsycINFO up to 2011, with additional studies identified from a search of reference lists. Only studies involving adult jail or prison populations, with an independent measure of mental illness, were included. Studies in forensic settings to determine fitness to stand trial or criminal responsibility were excluded. Twenty-four studies met all inclusion and exclusion criteria for the review. All articles were coded by two independent authors. Study quality was coded by the lead author. Results Twenty-two screening tools were identified. Only six tools have replication studies: the Brief Jail Mental Health Screen (BJMHS), the Correctional Mental Health Screen for Men (CMHS-M), the Correctional Mental Health Screen for Women (CMHS-W), the England Mental Health Screen (EMHS), the Jail Screening Assessment Tool (JSAT), and the Referral Decision Scale (RDS). A descriptive summary is provided in lieu of meta-analytic techniques due to the lack of replication studies and methodological variations across studies. Conclusions The BJMHS, CMHS-M, CMHS-W, EMHS and JSAT appear to be the most promising tools. Future research should consider important contextual factors in the implementation of a screening tool that have received little attention. Randomized or quasi-randomized trials are recommended to evaluate the effectiveness of screening to improve the detection of mental illness compared to standard practices. PMID:24168162

  1. Green tools

    NASA Astrophysics Data System (ADS)

    With an eye toward forging tools that the nonscientist can use to make environmentally prudent policy, the National Science Foundation has provided the seed funding to establish a new National Center for Environmental Decision-Making Research. NSF has awarded $5 million over the next five years to the Joint Institute for Energy and the Environment at the University of Tennessee for creation of the center. The organizing principle of the effort, according to NSF, is to "make scientific environmental research more relevant and useful to decision makers." Interdisciplinary teams of sociologists, economists, geologists, ecologists, computer scientists, psychologists, urban planners, and others will be asked to interpret existing research and to conduct new studies of environmental problems and how they were resolved.

  2. ESIP Earth Sciences Data Analytics (ESDA) Cluster - Work in Progress

    NASA Technical Reports Server (NTRS)

    Kempler, Steven

    2015-01-01

    The purpose of this poster is to promote a common understanding of the usefulness of, and activities that pertain to, data analytics and, more broadly, the data scientist; to facilitate collaborations that better understand the cross-usage of heterogeneous datasets and provide accommodating data analytics expertise, now and as the needs evolve into the future; and to identify gaps that, once filled, will further collaborative activities. Objectives: provide a forum for academic discussions that gives ESIP members a better understanding of the various aspects of Earth science data analytics; bring in guest speakers to describe external efforts and further teach us about the broader use of data analytics; perform activities that compile use cases generated from specific community needs to cross-analyze heterogeneous data, compile sources of analytics tools (in particular, to satisfy the needs of the above data users), examine gaps between needs and sources, examine gaps between needs and community expertise, and document the specific data analytics expertise needed to perform Earth science data analytics; and seek graduate data analytics / data science student internship opportunities.

  3. Mechanical properties of additively manufactured octagonal honeycombs.

    PubMed

    Hedayati, R; Sadighi, M; Mohammadi-Aghdam, M; Zadpoor, A A

    2016-12-01

    Honeycomb structures have found numerous applications as structural and biomedical materials due to their favourable properties such as low weight, high stiffness, and porosity. Application of additive manufacturing and 3D printing techniques allows for manufacturing of honeycombs with arbitrary shape and wall thickness, opening the way for optimizing the mechanical and physical properties for specific applications. In this study, the mechanical properties of honeycomb structures with a new geometry, called octagonal honeycomb, were investigated using analytical, numerical, and experimental approaches. An additive manufacturing technique, namely fused deposition modelling, was used to fabricate the honeycomb from polylactic acid (PLA). The honeycombs structures were then mechanically tested under compression and the mechanical properties of the structures were determined. In addition, the Euler-Bernoulli and Timoshenko beam theories were used for deriving analytical relationships for elastic modulus, yield stress, Poisson's ratio, and buckling stress of this new design of honeycomb structures. Finite element models were also created to analyse the mechanical behaviour of the honeycombs computationally. The analytical solutions obtained using Timoshenko beam theory were close to computational results in terms of elastic modulus, Poisson's ratio and yield stress, especially for relative densities smaller than 25%. The analytical solutions based on the Timoshenko analytical solution and the computational results were in good agreement with experimental observations. Finally, the elastic properties of the proposed honeycomb structure were compared to those of other honeycomb structures such as square, triangular, hexagonal, mixed, diamond, and Kagome. The octagonal honeycomb showed yield stress and elastic modulus values very close to those of regular hexagonal honeycombs and lower than the other considered honeycombs. PMID:27612831
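
    For context, the classical beam-bending (Gibson-Ashby) estimates for the in-plane elastic properties of a conventional hexagonal honeycomb, with wall thickness t, inclined wall length l, vertical wall length h, and wall angle θ, are shown below. These describe the hexagonal baseline against which the octagonal design is compared, not the octagonal relations derived in the paper.

    ```latex
    \frac{E^{*}}{E_{s}} = \frac{\cos\theta}{\left(h/l + \sin\theta\right)\sin^{2}\theta}
    \left(\frac{t}{l}\right)^{3},
    \qquad
    \nu^{*} = \frac{\cos^{2}\theta}{\left(h/l + \sin\theta\right)\sin\theta}.
    ```

    For a regular hexagon (h = l, θ = 30°) these reduce to E*/E_s ≈ 2.3 (t/l)³ and ν* = 1, which is consistent with the cubic thickness dependence the bending-dominated analysis above relies on.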

  4. Carbamate deposit control additives

    SciTech Connect

    Honnen, L.R.; Lewis, R.A.

    1980-11-25

    Deposit control additives for internal combustion engines are provided which maintain cleanliness of intake systems without contributing to combustion chamber deposits. The additives are poly(oxyalkylene) carbamates comprising a hydrocarbyloxy-terminated poly(oxyalkylene) chain of 2-5 carbon oxyalkylene units bonded through an oxycarbonyl group to a nitrogen atom of ethylenediamine.

  5. Hanford transuranic analytical capability

    SciTech Connect

    McVey, C.B.

    1995-02-24

    With the current DOE focus on ER/WM programs, an increase in the quantity of waste samples that require detailed analysis is forecasted. One of the prime areas of growth is the demand for DOE environmental protocol analyses of TRU waste samples. Currently there is no laboratory capacity to support analysis of TRU waste samples in excess of 200 nCi/gm. This study recommends that an interim solution be undertaken to provide these services. By adding two glove boxes in room 11A of the 222-S facility, the interim waste analytical needs can be met for a period of four to five years, or until a front-end facility is erected at or near 222-S. The yearly average is projected to be approximately 600 samples; this figure has changed significantly due to budget changes, having been downgraded from 10,000 samples. Until these budget and sample projections become firmer, a long-term option is not recommended at this time. A revision to this document is recommended by March 1996 to review the long-term option and sample projections.

  6. IBM’s Health Analytics and Clinical Decision Support

    PubMed Central

    Sun, J.; Knoop, S.; Shabo, A.; Carmeli, B.; Sow, D.; Syed-Mahmood, T.; Rapp, W.

    2014-01-01

    Summary Objectives This survey explores the role of big data and health analytics developed by IBM in supporting the transformation of healthcare by augmenting evidence-based decision-making. Methods Some problems in healthcare and strategies for change are described. It is argued that change requires better decisions, which, in turn, require better use of the many kinds of healthcare information. Analytic resources that address each of the information challenges are described. Examples of the role of each of the resources are given. Results There are powerful analytic tools that utilize the various kinds of big data in healthcare to help clinicians make more personalized, evidenced-based decisions. Such resources can extract relevant information and provide insights that clinicians can use to make evidence-supported decisions. There are early suggestions that these resources have clinical value. As with all analytic tools, they are limited by the amount and quality of data. Conclusion Big data is an inevitable part of the future of healthcare. There is a compelling need to manage and use big data to make better decisions to support the transformation of healthcare to the personalized, evidence-supported model of the future. Cognitive computing resources are necessary to manage the challenges in employing big data in healthcare. Such tools have been and are being developed. The analytic resources, themselves, do not drive, but support healthcare transformation. PMID:25123736

  7. Design and Implementation of a Learning Analytics Toolkit for Teachers

    ERIC Educational Resources Information Center

    Dyckhoff, Anna Lea; Zielke, Dennis; Bultmann, Mareike; Chatti, Mohamed Amine; Schroeder, Ulrik

    2012-01-01

    Learning Analytics can provide powerful tools for teachers in order to support them in the iterative process of improving the effectiveness of their courses and to collaterally enhance their students' performance. In this paper, we present the theoretical background, design, implementation, and evaluation details of eLAT, a Learning Analytics…

  8. Challenges of Using Learning Analytics Techniques to Support Mobile Learning

    ERIC Educational Resources Information Center

    Arrigo, Marco; Fulantelli, Giovanni; Taibi, Davide

    2015-01-01

    Evaluation of Mobile Learning remains an open research issue, especially as regards the activities that take place outside the classroom. In this context, Learning Analytics can provide answers, and offer the appropriate tools to enhance Mobile Learning experiences. In this poster we introduce a task-interaction framework, using learning analytics…

  9. HPC Analytics Support. Requirements for Uncertainty Quantification Benchmarks

    SciTech Connect

    Paulson, Patrick R.; Purohit, Sumit; Rodriguez, Luke R.

    2015-05-01

    This report outlines techniques for extending benchmark generation products so that they support uncertainty quantification by benchmarked systems. We describe how uncertainty quantification requirements can be presented to candidate analytical tools supporting SPARQL. We describe benchmark data sets for evaluating uncertainty quantification, as well as an approach for using our benchmark generator to produce such data sets.

  10. Transforming labor-management practices through real-time analytics.

    PubMed

    Nippert, Kathye Habig; Graves, Brian

    2012-06-01

    Catholic Health Partners (CHP) decentralized productivity management, giving its regional executives greater control over their productivity tools and data. CHP retained centralized management of its benchmarking and analytics and created an enterprise database with standardized information. CHP's stakeholders shared accountability and accepted greater responsibility for labor-management decisions. PMID:22734327

  11. Swift Science Analysis Tools

    NASA Astrophysics Data System (ADS)

    Marshall, F. E.; Swift Team

    2003-05-01

    Swift is an autonomous, multiwavelength observatory selected by NASA to study gamma-ray bursts (GRBs) and their afterglows. Its Burst Alert Telescope (BAT) is a large coded-mask instrument that will image GRBs in the 15 to 150 keV band. The X-ray Telescope (XRT) focuses X-rays in the 0.2 to 10 keV band onto CCDs, and the co-aligned Ultra-Violet/Optical Telescope (UVOT) has filters and grisms for low-resolution spectroscopy. The Swift team is developing mission-specific tools for processing the telemetry into FITS files and for calibrating and selecting the data for further analysis with such mission-independent tools as XIMAGE and XSPEC. The FTOOLS-based suite of tools will be released to the community before launch, with additional updates after launch. Documentation for the tools and standard recipes for their use will be available on the Swift Science Center (SSC) Web site (http://swiftsc.gsfc.nasa.gov), and the SSC will provide user assistance with an e-mail help desk. After the verification phase of the mission, all data will be available to the community as soon as they are processed in the Swift Data Center (SDC). Once all the data for an observation are available, the data will be transferred to the HEASARC and to data centers in England and Italy. The data can then be searched and accessed using standard tools such as Browse. Before this transfer, the quick-look data will be available on an ftp site at the SDC. The SSC will also provide documentation and simulation tools in support of the Swift Guest Investigator program.

  12. Making advanced analytics work for you.

    PubMed

    Barton, Dominic; Court, David

    2012-10-01

    Senior leaders who write off the move toward big data as a lot of big talk are making, well, a big mistake. So argue McKinsey's Barton and Court, who worked with dozens of companies to figure out how to translate advanced analytics into nuts-and-bolts practices that affect daily operations on the front lines. The authors offer a useful guide for leaders and managers who want to take a deliberative approach to big data, but who also want to get started now. First, companies must identify the right data for their business, seek to acquire the information creatively from diverse sources, and secure the necessary IT support. Second, they need to build analytics models that are tightly focused on improving performance, making the models only as complex as business goals demand. Third, and most important, companies must transform their capabilities and culture so that the analytical results can be implemented from the C-suite to the front lines. That means developing simple tools that everyone in the organization can understand and teaching people why the data really matter. Embracing big data is as much about changing mind-sets as it is about crunching numbers. Executed with the right care and flexibility, this cultural shift could have payoffs that are, well, bigger than you expect. PMID:23074867

  13. Polyimide processing additives

    NASA Technical Reports Server (NTRS)

    Fletcher, James C. (Inventor); Pratt, J. Richard (Inventor); St. Clair, Terry L. (Inventor); Stoakley, Diane M. (Inventor); Burks, Harold D. (Inventor)

    1992-01-01

    A process for preparing polyimides having enhanced melt flow properties is described. The process consists of heating a mixture of a high molecular weight poly-(amic acid) or polyimide with a low molecular weight amic acid or imide additive in the range of 0.05 to 15 percent by weight of additive. The polyimide powders so obtained show improved processability, as evidenced by lower melt viscosity by capillary rheometry. Likewise, films prepared from mixtures of polymers with additives show improved processability with earlier onset of stretching by TMA.

  14. Polyimide processing additives

    NASA Technical Reports Server (NTRS)

    Pratt, J. Richard (Inventor); St. Clair, Terry L. (Inventor); Stoakley, Diane M. (Inventor); Burks, Harold D. (Inventor)

    1993-01-01

    A process for preparing polyimides having enhanced melt flow properties is described. The process consists of heating a mixture of a high molecular weight poly-(amic acid) or polyimide with a low molecular weight amic acid or imide additive in the range of 0.05 to 15 percent by weight of the additive. The polyimide powders so obtained show improved processability, as evidenced by lower melt viscosity by capillary rheometry. Likewise, films prepared from mixtures of polymers with additives show improved processability with earlier onset of stretching by TMA.

  15. Unexpected Analyte Oxidation during Desorption Electrospray Ionization - Mass Spectrometry

    SciTech Connect

    Pasilis, Sofie P; Kertesz, Vilmos; Van Berkel, Gary J

    2008-01-01

    During the analysis of surface-spotted analytes using desorption electrospray ionization mass spectrometry (DESI-MS), abundant ions are sometimes observed that appear to be the result of oxygen addition reactions. In this investigation, the effects of sample aging, the ambient lab environment, spray voltage, analyte surface concentration, and surface type on this oxidative modification of spotted analytes, exemplified by tamoxifen and reserpine, were studied. Simple exposure of the samples to air and to ambient lighting increased the extent of oxidation. Increased spray voltage also led to increased analyte oxidation, possibly as a result of oxidative species formed electrochemically at the emitter electrode or in the gas phase by discharge processes. These oxidative species are carried by the spray and impinge on and react with the sampled analyte during desorption/ionization. The relative abundance of oxidized species was more significant for analysis of deposited analyte having a relatively low surface concentration. Increasing the spray solvent flow rate and adding hydroquinone as a redox buffer to the spray solvent were found to decrease, but not entirely eliminate, analyte oxidation during analysis. The major parameters that minimize and maximize analyte oxidation were identified, and DESI-MS operational recommendations to avoid these unwanted reactions are suggested.

  16. Analytical model of internally coupled ears.

    PubMed

    Vossen, Christine; Christensen-Dalsgaard, Jakob; van Hemmen, J Leo

    2010-08-01

    Lizards and many birds possess a specialized hearing mechanism: internally coupled ears, where the tympanic membranes connect through a large mouth cavity so that the vibrations of the tympanic membranes influence each other. This coupling enhances the phase differences and creates amplitude differences in the tympanic membrane vibrations; both cues show strong directionality. The work presented herein sets out the derivation of a three-dimensional analytical model of internally coupled ears that allows for the calculation of a complete vibration profile of the membranes. The analytical model additionally provides the opportunity to incorporate the effect of the asymmetrically attached columella, which leads to the activation of higher membrane vibration modes. Incorporating this effect, the analytical model can explain measurements taken from the tympanic membrane of a living lizard, for example, data demonstrating an asymmetrical spatial pattern of membrane vibration. As the analytical calculations show, the internally coupled ears increase the directional response, manifested in large internal amplitude differences (iAD) and large internal time differences (iTD). Numerical simulations of the eigenfunctions in an exemplary, realistically reconstructed mouth cavity further estimate the effects of its complex geometry. PMID:20707461
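
    The coupling mechanism invoked above can be caricatured with two identical damped oscillators (the tympanic membranes) linked by a cavity coupling term. This schematic two-oscillator model is only an illustration of the principle, not the paper's three-dimensional formulation:

      m\ddot{x}_1 + \gamma\dot{x}_1 + k x_1 + \kappa\,(x_1 - x_2) = F e^{i(\omega t + \phi/2)}
      m\ddot{x}_2 + \gamma\dot{x}_2 + k x_2 + \kappa\,(x_2 - x_1) = F e^{i(\omega t - \phi/2)}

    The symmetric and antisymmetric combinations x_1 ± x_2 resonate at different frequencies (k/m and (k + 2κ)/m, respectively), so a small external phase difference φ set by the sound direction is converted into large internal amplitude and phase differences between the two membranes, the iAD and iTD cues discussed above.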

  17. Analytical aspects of the CEEM soil project.

    PubMed

    Muntau, H; Rehnert, A; Desaules, A; Wagner, G; Theocharopoulos, S; Quevauviller, P

    2001-01-01

    In the past, exercises aiming at an assessment of data uncertainty in environmental analysis were usually restricted to the analysis step, while sampling and pre-analytical sample treatment were largely ignored. Collaborative studies on the quantification of sampling errors require a suitable, well-characterized test site; a reference laboratory to analyse all of the samples taken by all participants in the study; and test methods that do not contribute large and variable uncertainties through long and complex analytical methodologies. Here we summarize the major analytical aspects of a European project on the identification and quantification of sampling influences on the determination of lead, cadmium, copper and zinc in soil. The participant group included the leading soil analysis laboratories in Europe; the test site at Dornach (CH) was well suited for the purpose and showed high metal gradients and differentiated land use. The analytical methods (wavelength-dispersive X-ray fluorescence spectrometry and solid-state Zeeman AAS) used in the study showed stable performance characteristics within the confidence interval of the certified reference materials used for measurement quality control over the entire project period. Additionally, double-blind tests on split samples showed agreement of data within very narrow limits, thus demonstrating the reliability of the reference database. PMID:11213186

  18. Analytical advances in pharmaceutical impurity profiling.

    PubMed

    Holm, René; Elder, David P

    2016-05-25

    Impurities will be present in all drug substances and drug products, i.e. nothing is 100% pure if one looks in enough depth. The current regulatory guidance on impurities accepts this, and for drug products with a dose of less than 2 g/day, identification of impurities is set at the 0.1% level and above (ICH Q3B(R2), 2006). For some impurities this is a simple undertaking, as generally available analytical techniques can address the prevailing analytical challenges; for others it may be much more demanding, requiring more sophisticated analytical approaches. The present review provides an insight into the current development of analytical techniques to investigate and quantify impurities in drug substances and drug products, with particular focus on progress in chromatography to ensure the separation and quantification of related impurities. Further, a section is devoted to the identification of classical impurities; in addition, inorganic (metal residue) and solid-state impurities are also discussed. Risk-control strategies for pharmaceutical impurities, aligned with several of the ICH guidelines, are also discussed. PMID:26690047

  19. Smog control fuel additives

    SciTech Connect

    Lundby, W.

    1993-06-29

    A method is described of controlling, reducing, or eliminating ozone and related smog resulting from photochemical reactions between ozone and automotive or industrial gases, comprising the addition of iodine or iodine compounds to hydrocarbon-based fuels prior to or during combustion, in an amount of about 1 part iodine per 240 to 10,000,000 parts fuel by weight. This may be accomplished by: (a) adding these inhibitors during or after the refining or manufacturing of liquid fuels; (b) producing these inhibitors for addition into fuel tanks, such as automotive or industrial tanks; or (c) adding these inhibitors into the combustion chambers of equipment utilizing solid fuels, for the purpose of reducing ozone.

  20. Food Additives and Hyperkinesis

    ERIC Educational Resources Information Center

    Wender, Ester H.

    1977-01-01

    The hypothesis that food additives are causally associated with hyperkinesis and learning disabilities in children is reviewed, and available data are summarized. Available from: American Medical Association 535 North Dearborn Street Chicago, Illinois 60610. (JG)

  1. Additional Types of Neuropathy

    MedlinePlus

    Charcot's Joint: Charcot's Joint, also called neuropathic arthropathy, ... can stop bone destruction and aid healing. Cranial Neuropathy: Cranial neuropathy affects the 12 pairs of nerves ...

  2. Thermoelectric conversion of heat fluxes: analytical and experimental approach

    NASA Astrophysics Data System (ADS)

    Amokrane, Mounir; Nogarede, Bertrand

    2012-08-01

    When considering electric energy harvesting from waste heat, two different solutions of direct conversion are possible: pyroelectric and thermoelectric conversion. This paper presents a study of thermoelectric conversion by two different approaches: analytical and experimental. Furthermore, a brief historical description of the discovery and early years of development of thermoelectricity is presented. The essential objective of this work is to develop a numerical tool that can estimate the output quantities of a thermoelectric converter without knowing all its features. For this, two analytical models were developed, based on the electrical and thermal phenomena occurring within the active element. The results obtained with these models were compared successfully with experiments carried out on an industrial thermoelectric element. Considering the centimetric size of the device (16 cm² area), the electrical power recovered by this conversion varies from 16 to 80 mW for a temperature difference between 2 and 18 °C, depending on the load value. In addition, both models reproduce the behavior of the active element with an accuracy of about 10%; accordingly, the output voltages reached are of the same magnitude for the models and the experimental values, varying from 0.1 to 0.8 V depending on the load connected and the type of convection. For both cases it is also shown that the maximum recovered energy is obtained for a particular electric load, determined by the physical characteristics of the thermoelectric element in question. Finally, a conversion efficiency calculation has shown that it is possible to reach 45% of the Carnot efficiency. This underlines the value of load matching to optimize the output power.
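
    The load-matching conclusion follows from the elementary Seebeck generator model, in which the power delivered to a load R_L peaks when R_L equals the internal resistance. Below is a minimal sketch of that textbook model; the Seebeck coefficient and resistances are assumed round numbers, not parameters of the device tested in the paper.

      # Textbook thermoelectric-generator model: open-circuit voltage
      # V_oc = alpha * dT; load power peaks at R_L = R_int (load matching).
      import numpy as np

      alpha = 0.05   # effective Seebeck coefficient, V/K (assumed)
      R_int = 2.0    # internal resistance, ohm (assumed)
      dT = 10.0      # temperature difference across the element, K

      R_L = np.linspace(0.1, 10.0, 200)        # candidate load resistances
      V_oc = alpha * dT                        # open-circuit voltage
      P = V_oc**2 * R_L / (R_int + R_L)**2     # power delivered to the load

      best = R_L[np.argmax(P)]
      print(f"max power {P.max() * 1e3:.1f} mW at R_L ~ {best:.2f} ohm (= R_int)")

    With these assumed values the peak is about 31 mW at a 0.5 V open-circuit voltage, the same order as the 16-80 mW and 0.1-0.8 V ranges reported above.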

  3. Infrared Spectroscopy as a Chemical Fingerprinting Tool

    NASA Technical Reports Server (NTRS)

    Huff, Tim; Munafo, Paul M. (Technical Monitor)

    2002-01-01

    Infrared (IR) spectroscopy is a powerful analytical tool in the chemical fingerprinting of materials. The technique is rapid, reproducible and usually non-invasive. With the appropriate accessories, the technique can be used to examine samples in either a solid, liquid or gas phase. Solid samples of varying sizes and shapes may be used, and with the addition of microscopic IR (microspectroscopy) capabilities, minute materials such as single fibers and threads may be examined. With the addition of appropriate software, microspectroscopy can be used for automated discrete point or compositional surface area mapping, with the latter providing a means to record changes in the chemical composition of a material surface over a defined area. Both aqueous and non-aqueous free-flowing solutions can be analyzed using appropriate IR techniques, as can viscous liquids such as heavy oils and greases. Due to the ability to characterize gaseous samples, IR spectroscopy can also be coupled with thermal processes such as thermogravimetric (TG) analyses to provide both thermal and chemical data in a single run. In this configuration, solids (or liquids) heated in a TG analyzer undergo decomposition, with the evolving gases directed into the IR spectrometer. Thus, information is provided on the thermal properties of a material and the order in which its chemical constituents are broken down during incremental heating. Specific examples of these varied applications will be cited, with data interpretation and method limitations further discussed.

  4. Tools for Authentication

    SciTech Connect

    White, G

    2008-07-09

    Many recent Non-proliferation and Arms Control software projects include a software authentication component. In this context, 'authentication' is defined as determining that a software package performs only its intended purpose and performs that purpose correctly and reliably over many years. In addition to visual inspection by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs both to aid the visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary, and have limited extensibility. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool must be based on a complete language compiler infrastructure, that is, one that can parse and digest the full language through its standard grammar. ROSE is precisely such a compiler infrastructure developed within DOE. ROSE is a robust source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C, C++, and FORTRAN. This year, it has been extended to support the automated analysis of binaries. We continue to extend ROSE to address a number of security-specific requirements and apply it to software authentication for Non-proliferation and Arms Control projects. We will give an update on the status of our work.
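
    ROSE itself is a C++ compiler infrastructure, but the core idea, mechanically walking a program's syntax tree and flagging constructs a human reviewer should inspect, can be sketched against Python's built-in ast module. This is a toy analogy with a made-up rule set, not the ROSE API:

      # Toy authentication-style scan: parse source into an AST and flag
      # constructs that may merit manual review (dynamic code execution).
      import ast

      SUSPICIOUS_CALLS = {"eval", "exec", "compile"}   # example rule set

      def flag_suspicious(source: str):
          findings = []
          for node in ast.walk(ast.parse(source)):
              if (isinstance(node, ast.Call)
                      and isinstance(node.func, ast.Name)
                      and node.func.id in SUSPICIOUS_CALLS):
                  findings.append((node.lineno, node.func.id))
          return findings

      sample = "x = 1\neval(input())\n"
      for line, name in flag_suspicious(sample):
          print(f"line {line}: call to {name}() warrants manual review")

    Extensibility in the sense described above then amounts to letting each project register its own rules over the same tree-walking machinery.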

  5. Tools for the study of dynamical spacetimes

    NASA Astrophysics Data System (ADS)

    Zhang, Fan

    This thesis covers a range of topics in numerical and analytical relativity, centered around introducing tools and methodologies for the study of dynamical spacetimes. The scope of the studies is limited to classical (as opposed to quantum) vacuum spacetimes described by Einstein's general theory of relativity. The numerical works presented here are carried out within the Spectral Einstein Code (SpEC) infrastructure, while analytical calculations extensively utilize Wolfram's Mathematica program. We begin by examining highly dynamical spacetimes such as binary black hole mergers, which can be investigated using numerical simulations. However, there are difficulties in interpreting the output of such simulations. One difficulty stems from the lack of a canonical coordinate system (henceforth referred to as gauge freedom) and tetrad, against which quantities such as Newman-Penrose Psi4 (usually interpreted as the gravitational wave part of curvature) should be measured. We tackle this problem in Chapter 2 by introducing a set of geometrically motivated coordinates that are independent of the simulation gauge choice, as well as a quasi-Kinnersley tetrad, also invariant under gauge changes in addition to being optimally suited to the task of gravitational wave extraction. Another difficulty arises from the need to condense the overwhelming amount of data generated by the numerical simulations. In order to extract physical information in a succinct and transparent manner, one may define a version of gravitational field lines and field strength using spatial projections of the Weyl curvature tensor. Introduction, investigation and utilization of these quantities will constitute the main content in Chapters 3 through 6. For the last two chapters, we turn to the analytical study of a simpler dynamical spacetime, namely a perturbed Kerr black hole. We will introduce in Chapter 7 a new analytical approximation to the quasi-normal mode (QNM) frequencies, and relate various

  6. The Case for Assessment Analytics

    ERIC Educational Resources Information Center

    Ellis, Cath

    2013-01-01

    Learning analytics is a relatively new field of inquiry and its precise meaning is both contested and fluid (Johnson, Smith, Willis, Levine & Haywood, 2011; LAK, n.d.). Ferguson (2012) suggests that the best working definition is that offered by the first Learning Analytics and Knowledge (LAK) conference: "the measurement, collection,…

  7. Analytic Model For Estimation Of Cold Bulk Metal Forming Simulations

    SciTech Connect

    Skunca, Marko; Keran, Zdenka; Math, Miljenko

    2007-05-17

    Numerical simulation of bulk metal forming plays an important role in predicting key parameters in cold forging. Comparison of numerical and experimental data is of great importance, but there is always a need for more universal analytical tools. Therefore, many papers include, besides the experiment and simulation of a particular bulk metal forming technology, an analytic model. In this paper an analytical model for the evaluation of commercially available simulation program packages is proposed. Based on the elementary theory of plasticity and dependent only on geometry, the model represents a good analytical reference for estimating given modeling preferences such as element types, solvers, remeshing influence, and many others. The obtained geometry-dependent stress fields, compared with numerical data, give a clear picture of the numerical possibilities and limitations of a particular modeling program package.

  8. Characterization of stainless steel assisted bare gold nanoparticles and their analytical potential.

    PubMed

    López-Lorente, A I; Simonet, B M; Valcárcel, M; Eppler, S; Schindl, R; Kranz, C; Mizaikoff, B

    2014-01-01

    A simple, environmentally friendly, one-pot method to synthesize highly stable bare gold nanoparticles (AuNPs) has been developed. AuNPs have been synthesized from tetrachloroauric acid solution using steel or stainless steel as a solid reducing agent, which can be reused. The proposed method yields bare gold nanoparticles at atmospheric pressure and room temperature, potentially enabling production of large quantities. The obtained AuNPs have been characterized by SEM, TEM and AFM, revealing an average diameter of around 20 nm, a polygonal yet nearly spherical shape, and a narrow size distribution. The mechanism of reaction has been investigated by UV-vis spectroscopy, ICP-OES and EDX analysis. The obtained dispersed gold nanoparticles proved to be stable when stored at 4 °C for over four months without the addition of a stabilizing agent. Their analytical potential as a SERS substrate has been demonstrated and their performance compared with that shown by citrate-coated gold nanoparticles. Thanks to their unique properties, their use as analytical tools provides analytical processes with enhanced selectivity and precision. PMID:24274303

  9. Big data and high-performance analytics in structural health monitoring for bridge management

    NASA Astrophysics Data System (ADS)

    Alampalli, Sharada; Alampalli, Sandeep; Ettouney, Mohammed

    2016-04-01

    Structural Health Monitoring (SHM) can be a vital tool for effective bridge management. Combining large data sets from multiple sources to create a data-driven decision-making framework is crucial for the success of SHM. This paper presents a big data analytics framework that combines multiple data sets, correlated through functional relatedness, to convert data into actionable information that empowers risk-based decision-making. The integrated data environment incorporates near-real-time streams of semi-structured data from remote sensors, historical visual inspection data, and observations from structural analysis models to monitor, assess, and manage risks associated with aging bridge inventories. Accelerated processing of the data sets is made possible by four technologies: cloud computing, relational database processing, NoSQL database support, and in-memory analytics. The framework is being validated on a railroad corridor that can be subjected to multiple hazards. The framework enables computation of reliability indices for critical bridge components and individual bridge spans. In addition, the framework includes a risk-based decision-making process that enumerates the costs and consequences of poor bridge performance at span and network levels when rail networks are exposed to natural hazard events such as floods and earthquakes. Big data and high-performance analytics enable insights that assist bridge owners in addressing problems faster.
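
    The reliability indices mentioned above are commonly computed with a first-order second-moment formulation: treating component resistance R and load effect S as independent normal variables gives beta = (mu_R - mu_S) / sqrt(sigma_R^2 + sigma_S^2). Below is a minimal sketch of that generic textbook formulation, not of the authors' framework; the statistics are invented for illustration.

      # First-order second-moment (FOSM) reliability index for a component:
      # beta = (mu_R - mu_S) / sqrt(sigma_R^2 + sigma_S^2), with failure
      # probability Pf = Phi(-beta) under the normality assumption.
      import math
      from statistics import NormalDist

      def reliability_index(mu_R, sigma_R, mu_S, sigma_S):
          beta = (mu_R - mu_S) / math.hypot(sigma_R, sigma_S)
          return beta, NormalDist().cdf(-beta)

      # Invented span statistics (resistance vs. flood-induced load effect):
      beta, pf = reliability_index(mu_R=120.0, sigma_R=12.0,
                                   mu_S=80.0, sigma_S=10.0)
      print(f"beta = {beta:.2f}, Pf ~ {pf:.2e}")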

  10. A framework for selecting analytical techniques in profiling authentic and counterfeit Viagra and Cialis.

    PubMed

    Anzanello, Michel J; Ortiz, Rafael S; Limberger, Renata; Mariotti, Kristiane

    2014-02-01

    Several analytical techniques aimed at profiling drugs are deemed costly and time consuming, and may not be promptly available for analysis when required. This paper proposes a method for identifying the analytical techniques providing the most relevant data for classifying drug samples into authentic and unauthentic categories. To that end, we integrate principal component analysis (PCA) with k-Nearest Neighbor (KNN) and Support Vector Machine (SVM) classification tools. PCA is first applied to data from five techniques, i.e., physical profile, X-ray fluorescence (XRF), direct-infusion electrospray ionization mass spectrometry (ESI-MS), active pharmaceutical ingredient profile (ultra-performance liquid chromatography, UPLC-MS), and infrared spectroscopic profile (ATR-FTIR). Subsets of PCA scores are then combined with a "leave one subset out at a time" approach, and the classification accuracy using KNN and SVM is evaluated after each subset is omitted. Subsets yielding the maximum accuracy indicate the techniques to be prioritized in profiling applications. When applied to data from Viagra and Cialis, the proposed method recommended using the data from the UPLC-MS, physical profile and ATR-FTIR techniques, which increased the categorization accuracy. In addition, the SVM classification tool proved more accurate than the KNN. PMID:24447444
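
    The subset-selection loop described above can be sketched with scikit-learn: compute PCA scores per technique, then omit one technique's score block at a time and measure classification accuracy on the remainder. This is a schematic reconstruction on synthetic data with stand-in block names, not the authors' code:

      # Leave-one-technique-out over PCA score blocks, scored by KNN and SVM.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.model_selection import cross_val_score
      from sklearn.neighbors import KNeighborsClassifier
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)
      n = 60
      y = rng.integers(0, 2, n)                  # authentic vs. unauthentic
      techniques = {                             # synthetic stand-in blocks
          name: rng.normal(size=(n, 30)) + y[:, None] * rng.normal(size=30)
          for name in ["physical", "XRF", "ESI-MS", "UPLC-MS", "ATR-FTIR"]
      }
      scores = {k: PCA(n_components=3).fit_transform(v)
                for k, v in techniques.items()}

      for left_out in scores:                    # omit one block at a time
          X = np.hstack([v for k, v in scores.items() if k != left_out])
          for clf in (KNeighborsClassifier(3), SVC()):
              acc = cross_val_score(clf, X, y, cv=5).mean()
              print(f"without {left_out:9s} {type(clf).__name__:22s} "
                    f"acc = {acc:.2f}")

    Blocks whose omission hurts accuracy the most are the ones worth prioritizing, which is how the paper arrives at UPLC-MS, the physical profile, and ATR-FTIR.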

  11. Visual analytics for multimodal social network analysis: a design study with social scientists.

    PubMed

    Ghani, Sohaib; Kwon, Bum Chul; Lee, Seungyoon; Yi, Ji Soo; Elmqvist, Niklas

    2013-12-01

    Social network analysis (SNA) is becoming increasingly concerned not only with actors and their relations, but also with distinguishing between different types of such entities. For example, social scientists may want to investigate asymmetric relations in organizations with strict chains of command, or incorporate non-actors such as conferences and projects when analyzing coauthorship patterns. Multimodal social networks are those where actors and relations belong to different types, or modes, and multimodal social network analysis (mSNA) is accordingly SNA for such networks. In this paper, we present a design study that we conducted with several social scientist collaborators on how to support mSNA using visual analytics tools. Based on an open-ended, formative design process, we devised a visual representation called parallel node-link bands (PNLBs) that splits modes into separate bands and renders connections between adjacent ones, similar to the list view in Jigsaw. We then used the tool in a qualitative evaluation involving five social scientists whose feedback informed a second design phase that incorporated additional network metrics. Finally, we conducted a second qualitative evaluation with our social scientist collaborators that provided further insights on the utility of the PNLBs representation and the potential of visual analytics for mSNA. PMID:24051769

  12. ASSESS (Analytic System and Software for Evaluating Safeguards and Security) update: Current status and future developments

    SciTech Connect

    Al-Ayat, R.A.; Cousins, T.D.; Hoover, E.R.

    1990-07-15

    The Analytic System and Software for Evaluating Safeguards and Security (ASSESS) has been released for use by DOE field offices and their contractors. In October, 1989, we offered a prototype workshop to selected representatives of the DOE community. Based on the prototype results, we held the first training workshop at the Central Training Academy in January, 1990. Four additional workshops are scheduled for FY 1990. ASSESS is a state-of-the-art analytical tool for management to conduct integrated evaluation of safeguards systems at facilities handling special nuclear material. Currently, ASSESS focuses on the threat of theft/diversion of special nuclear material by insiders, outsiders, and a special form of insider/outsider collusion. ASSESS also includes a neutralization module. Development of the tool is continuing. Plans are underway to expand the capabilities of ASSESS to evaluate against violent insiders, to validate the databases, to expand the neutralization module, and to assist in demonstrating compliance with DOE Material Control and Accountability (MC&A) Order 5633.3. These new capabilities include the ability to: compute a weighted average for performance capability against a spectrum of insider adversaries; conduct defense-in-depth analyses; and analyze against protracted theft scenarios. As they become available, these capabilities will be incorporated in our training program. ASSESS is being developed jointly by Lawrence Livermore and Sandia National Laboratories under the sponsorship of the Department of Energy (DOE) Office of Safeguards and Security.

  13. Additive Manufacturing Infrared Inspection

    NASA Technical Reports Server (NTRS)

    Gaddy, Darrell

    2014-01-01

    Additive manufacturing is a rapid prototyping technology that allows parts to be built in a series of thin layers from plastics, ceramics, and metallics. Metallic additive manufacturing is an emerging form of rapid prototyping that allows complex structures to be built using various metallic powders. Significant time and cost savings have also been observed using metallic additive manufacturing compared with traditional techniques. Development of metallic additive manufacturing technology has advanced significantly over the last decade, although many of the techniques used to inspect parts made by these processes have not advanced significantly or have limitations. Several external geometry inspection techniques exist, such as Coordinate Measurement Machines (CMM), laser scanners, structured light scanning systems, or even traditional calipers and gages. All of the aforementioned techniques are limited to external geometry and contours, or must use a contact probe to inspect limited internal dimensions. This presentation will document the development of a real-time dimensional inspection technique and digital quality record for the additive manufacturing process, using infrared camera imaging and processing techniques.

  14. Tool Wear in Friction Drilling

    SciTech Connect

    Miller, Scott F; Blau, Peter Julian; Shih, Albert J.

    2007-01-01

    This study investigated the wear of carbide tools used in friction drilling, a nontraditional hole-making process. In friction drilling, a rotating conical tool uses the heat generated by friction to soften and penetrate a thin workpiece and create a bushing without generating chips. The wear of a hard tungsten carbide tool used for friction drilling a low-carbon steel workpiece has been investigated. Tool wear characteristics were studied by measuring its weight change, detecting changes in its shape with a coordinate measuring machine, and making observations of wear damage using scanning electron microscopy. Energy dispersive spectroscopy was applied to analyze the change in chemical composition of the tool surface due to drilling. In addition, the thrust force and torque during drilling and the hole size were measured periodically to monitor the effects of tool wear. Results indicate that the carbide tool is durable, showing minimal tool wear after drilling 11,000 holes, but observations also indicate progressively severe abrasive grooving on the tool tip.

  15. Avionics System Architecture Tool

    NASA Technical Reports Server (NTRS)

    Chau, Savio; Hall, Ronald; Traylor, Marcus; Whitfield, Adrian

    2005-01-01

    Avionics System Architecture Tool (ASAT) is a computer program intended for use during the avionics-system-architecture design phase of the process of designing a spacecraft for a specific mission. ASAT enables simulation of the dynamics of the command-and-data-handling functions of the spacecraft avionics in the scenarios in which the spacecraft is expected to operate. ASAT is built upon I-Logix Statemate MAGNUM, providing a complement of dynamic system modeling tools, including a graphical user interface (GUI), model-checking capabilities, and a simulation engine. ASAT augments this with a library of predefined avionics components and additional software to support building and analyzing avionics hardware architectures using these components.

  16. Automated Standard Hazard Tool

    NASA Technical Reports Server (NTRS)

    Stebler, Shane

    2014-01-01

    The current system used to generate standard hazard reports is considered cumbersome and iterative. This study defines a structure for this system's process in a clear, algorithmic way so that standard hazard reports and basic hazard analysis may be completed using a centralized, web-based computer application. To accomplish this task, a test server is used to host a prototype of the tool during development. The prototype is configured to easily integrate into NASA's current server systems with minimal alteration. Additionally, the tool is easily updated and provides NASA with a system that may grow to accommodate future requirements and, possibly, different applications. Results of this project's success are outlined in positive, subjective reviews completed by payload providers and NASA Safety and Mission Assurance personnel. Ideally, this prototype will increase interest in the concept of standard hazard automation and lead to the full-scale production of a user-ready application.

  17. METABOLOMICS AS A DIAGNOSTIC TOOL FOR SMALL FISH TOXICOLOGY RESEARCH

    EPA Science Inventory

    Metabolomics involves the application of advanced analytical and statistical tools to profile changes in levels of endogenous metabolites in tissues and biofluids resulting from disease onset or stress. While certain metabolites are being specifically targeted in these studies, w...

  18. Phenylethynyl Containing Reactive Additives

    NASA Technical Reports Server (NTRS)

    Connell, John W. (Inventor); Smith, Joseph G., Jr. (Inventor); Hergenrother, Paul M. (Inventor)

    2002-01-01

    Phenylethynyl-containing reactive additives were prepared from an aromatic diamine containing phenylethynyl groups and various ratios of phthalic anhydride and 4-phenylethynylphthalic anhydride, either in glacial acetic acid to form the imide in one step or in N-methyl-2-pyrrolidinone to form the amide acid intermediate. The reactive additives were mixed in various amounts (10% to 90%) with oligomers containing either terminal or pendent phenylethynyl groups (or both) to reduce the melt viscosity and thereby enhance processability. Upon thermal cure, the additives react and become chemically incorporated into the matrix, effecting an increase in crosslink density relative to that of the host resin. This increase in crosslink density has advantageous consequences for the cured resin properties, such as a higher glass transition temperature and higher modulus compared to the host resin.

  19. Visual Analytics for Law Enforcement: Deploying a Service-Oriented Analytic Framework for Web-based Visualization

    SciTech Connect

    Dowson, Scott T.; Bruce, Joseph R.; Best, Daniel M.; Riensche, Roderick M.; Franklin, Lyndsey; Pike, William A.

    2009-04-14

    This paper presents key components of the Law Enforcement Information Framework (LEIF) that provides communications, situational awareness, and visual analytics tools in a service-oriented architecture supporting web-based desktop and handheld device users. LEIF simplifies interfaces and visualizations of well-established visual analytical techniques to improve usability. Advanced analytics capability is maintained by enhancing the underlying processing to support the new interface. LEIF development is driven by real-world user feedback gathered through deployments at three operational law enforcement organizations in the US. LEIF incorporates a robust information ingest pipeline supporting a wide variety of information formats. LEIF also insulates interface and analytical components from information sources, making it easier to adapt the framework for many different data repositories.

  20. Additives in plastics.

    PubMed Central

    Deanin, R D

    1975-01-01

    The polymers used in plastics are generally harmless. However, they are rarely used in pure form. In almost all commercial plastics, they are "compounded" with monomeric ingredients to improve their processing and end-use performance. In order of total volume used, these monomeric additives may be classified as follows: reinforcing fibers, fillers, and coupling agents; plasticizers; colorants; stabilizers (halogen stabilizers, antioxidants, ultraviolet absorbers, and biological preservatives); processing aids (lubricants, others, and flow controls); flame retardants; peroxides; and antistats. Some information is already available, and much more is needed, on the potential toxicity and safe handling of these additives during the processing and manufacture of plastics products. PMID:1175566

  1. CoNNECT: Data Analytics for Energy Efficient Communities

    SciTech Connect

    Omitaomu, Olufemi A; Bhaduri, Budhendra L; Kodysh, Jeffrey B

    2012-01-01

    Energy efficiency is the lowest-cost option being promoted for achieving a sustainable energy policy. Thus, there have been some innovations to reduce residential and commercial energy usage. There have also been calls for utility companies to give customers access to timely, useful, and actionable information about their energy use, in order to unleash additional innovations in homes and businesses. Hence, some web-based tools have been developed for the public to access and compare energy usage data. To advance these efforts, we propose a data analytics framework called Citizen Engagement for Energy Efficient Communities (CoNNECT). On the one hand, CoNNECT will help households to understand (i) the patterns in their energy consumption over time and how those patterns correlate with weather data, (ii) how their monthly consumption compares to other households living in houses of similar size and age within the same geographic areas, and (iii) what other customers are doing to reduce their energy consumption. We hope that the availability of such data and analysis to the public will facilitate energy efficiency efforts in residential buildings. These capabilities form the public portal of the CoNNECT framework. On the other hand, CoNNECT will help the utility companies to better understand their customers by making available to the utilities additional datasets that they naturally do not have access to, which could help them develop focused services for their customers. These additional capabilities are part of the utility portal of the CoNNECT framework. In this paper, we describe the CoNNECT framework, the sources of the data used in its development, the functionalities of both the public and utility portals, and the application of empirical mode decomposition for decomposing usage signals into mode functions, with the hope that such mode functions could help in clustering customers into unique groups and in developing guidelines for energy
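
    Empirical mode decomposition, mentioned at the end of the abstract, extracts "intrinsic mode functions" by repeatedly subtracting the mean of spline envelopes drawn through a signal's local extrema. Below is a bare-bones, one-IMF sifting sketch on a synthetic two-tone signal; a real deployment would use a tested EMD library with proper boundary handling and stopping criteria.

      # Minimal EMD sifting: subtract the mean of the upper and lower
      # cubic-spline envelopes until the residue approximates one IMF.
      import numpy as np
      from scipy.interpolate import CubicSpline
      from scipy.signal import argrelextrema

      def sift_once(x, t):
          hi = argrelextrema(x, np.greater)[0]   # local maxima indices
          lo = argrelextrema(x, np.less)[0]      # local minima indices
          upper = CubicSpline(t[hi], x[hi])(t)   # envelope through maxima
          lower = CubicSpline(t[lo], x[lo])(t)   # envelope through minima
          return x - (upper + lower) / 2.0       # remove the envelope mean

      t = np.linspace(0.0, 1.0, 2000)
      signal = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)

      imf = signal.copy()
      for _ in range(8):                         # crude fixed-count stop rule
          imf = sift_once(imf, t)
      # 'imf' now approximates the fast 40 Hz tone; 'signal - imf' the slow one.

    Applied to smart-meter load curves, the slow mode functions are candidates for a weather-driven baseline and the fast ones for behavioural detail, which is the clustering intuition the abstract alludes to.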

  2. Health informatics and analytics - building a program to integrate business analytics across clinical and administrative disciplines.

    PubMed

    Tremblay, Monica Chiarini; Deckard, Gloria J; Klein, Richard

    2016-07-01

    Health care organizations must develop integrated health information systems to respond to the numerous government mandates driving the movement toward reimbursement models emphasizing value-based and accountable care. Success in this transition requires integrated data analytics, supported by the combination of health informatics, interoperability, business process design, and advanced decision support tools. This case study presents the development of a master's level cross- and multidisciplinary informatics program offered through a business school. The program provides students from diverse backgrounds with the knowledge, leadership, and practical application skills of health informatics, information systems, and data analytics that bridge the interests of clinical and nonclinical professionals. This case presents the actions taken and challenges encountered in navigating intra-university politics, specifying curriculum, recruiting the requisite interdisciplinary faculty, innovating the educational format, managing students with diverse educational and professional backgrounds, and balancing multiple accreditation agencies. PMID:27274022

  3. THE FUTURE OF SUSTAINABLE MANAGEMENT APPROACHES AND REVITALIZATION TOOLS-ELECTRONIC (SMARTE): 2006-2010

    EPA Science Inventory

    SMARTe is being developed to give stakeholders information resources, analytical tools, communication strategies, and a decision analysis approach to be able to make better decisions regarding future uses of property. The development of the communication tools and decision analys...

  4. Analytical Chemistry Laboratory progress report for FY 1985

    SciTech Connect

    Green, D.W.; Heinrich, R.R.; Jensen, K.J.

    1985-12-01

    The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of technical support services to the scientific and engineering programs at ANL. In addition, ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques. The purpose of this report is to summarize the technical and administrative activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year 1985 (October 1984 through September 1985). This is the second annual report for the ACL. 4 figs., 1 tab.

  5. Design intent tool: User guide

    SciTech Connect

    Mills, Evan; Abell, Daniel; Bell, Geoffrey; Faludi, Jeremy; Greenberg, Steve; Hitchcock, Rob; Piette, Mary Ann; Sartor, Dalei; Stum, Karl

    2002-08-23

    This database tool provides a structured approach to recording design decisions that impact a facility's design intent in areas such as energy efficiency. Owners and designers alike can plan, monitor, and verify that a facility's design intent is being met during each stage of the design process. Additionally, the Tool gives commissioning agents, facility operators, and future owners and renovators an understanding of how the building and its subsystems are intended to operate, and thus a means to track and benchmark performance.
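
    As a minimal illustration of what a structured record of design decisions can look like, the sketch below builds a toy design-intent table with Python's built-in sqlite3 module. The schema and field names are hypothetical, not the Tool's actual data model.

      # Toy design-intent log: one row per decision, tagged with the
      # design stage at which it should be verified. Hypothetical schema.
      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.execute("""
          CREATE TABLE design_intent (
              id        INTEGER PRIMARY KEY,
              area      TEXT,   -- e.g. 'energy efficiency'
              decision  TEXT,   -- what was decided
              rationale TEXT,   -- why it was decided
              stage     TEXT    -- design stage at which to verify
          )""")
      conn.execute(
          "INSERT INTO design_intent (area, decision, rationale, stage) "
          "VALUES (?, ?, ?, ?)",
          ("energy efficiency", "variable-air-volume air handling",
           "loads are part-load dominated", "commissioning"),
      )
      for row in conn.execute(
              "SELECT area, decision, stage FROM design_intent"):
          print(row)   # commissioning agents can query intent by stage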

  6. Biobased lubricant additives

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Fully biobased lubricants are those formulated using all biobased ingredients, i.e. biobased base oils and biobased additives. Such formulations provide the maximum environmental, safety, and economic benefits expected from a biobased product. Currently, there are a number of biobased base oils that...

  7. Multifunctional fuel additives

    SciTech Connect

    Baillargeon, D.J.; Cardis, A.B.; Heck, D.B.

    1991-03-26

    This paper discusses a composition comprising a major amount of a liquid hydrocarbyl fuel and a minor, low-temperature-flow-improving amount of an additive: the reaction product of a suitable diol with the product of a benzophenone tetracarboxylic dianhydride and a long-chain hydrocarbyl aminoalcohol.

  8. System analysis: Developing tools for the future

    SciTech Connect

    De Jong, K.; Clever, J.; Draper, J.V.; Davies, B.; Lonks, A.

    1996-02-01

    This report introduces and evaluates system analysis tools that were developed, or are under development, for the Robotics Technology Development Program (RTDP). Additionally, it discusses system analysis work completed using these tools, aimed at a system analysis of the retrieval of waste from underground storage tanks on the Hanford Reservation near Richland, Washington. The tools developed and evaluated include a mixture of commercially available tools adapted to RTDP requirements and some tools developed in-house. The tools included in this report are: a Process Diagramming Tool, a Cost Modeling Tool, an Amortization Modeling Tool, a graphical simulation linked to the Cost Modeling Tool, a decision assistance tool, and a system thinking tool. Additionally, the importance of performance testing to the RTDP is discussed, along with the results of the testing performed. Further, the results of the Tank Waste Retrieval (TWR) System Diagram, the TWR Operations Cost Model, and the TWR Amortization Model are presented, and the implications of the results are discussed. Finally, the RTDP system analysis tools are assessed and some recommendations are made regarding continued development of the tools and process.

  9. Coherent pulsed excitation of degenerate multistate systems: Exact analytic solutions

    SciTech Connect

    Kyoseva, E. S.; Vitanov, N. V.

    2006-02-15

    We show that the solution of a multistate system composed of N degenerate lower (ground) states and one upper (excited) state can be reduced, by using the Morris-Shore transformation, to the solution of a two-state system involving only the excited state and a (bright) superposition of ground states. In addition, there are N-1 dark states composed of ground states. We use this decomposition to derive analytical solutions for degenerate extensions of the most popular exactly soluble models: the resonance solution, the Rabi, Landau-Zener, Rosen-Zener, Allen-Eberly, and Demkov-Kunike models. We suggest various applications of the multistate solutions, for example, as tools for creating multistate coherent superpositions by generalized resonant π pulses. We show that such generalized π pulses can occur even when the upper state is far off resonance, at specific detunings, which makes it possible to operate in the degenerate ground-state manifold without populating the (possibly lossy) upper state, even transiently.
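
    For reference, the Morris-Shore reduction invoked above takes a standard form when the N ground states |g_n> couple to the excited state |e> with Rabi frequencies Ω_n(t) sharing a common time dependence. In the usual notation (a sketch of the standard construction, not necessarily the paper's):

      \Omega(t) = \sqrt{\sum_{n=1}^{N} |\Omega_n(t)|^2}, \qquad
      |b\rangle = \frac{1}{\Omega(t)} \sum_{n=1}^{N} \Omega_n^{*}(t)\, |g_n\rangle

    The N-1 ground-state superpositions orthogonal to the bright state |b> are the dark states, and the pair {|b>, |e>} evolves as an ordinary two-state system with coupling Ω(t), which is why the exactly soluble two-state models listed above carry over to the degenerate case.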

  10. Process Mapping: Tools, Techniques, & Critical Success Factors.

    ERIC Educational Resources Information Center

    Kalman, Howard K.

    2002-01-01

    Explains process mapping as an analytical tool and a process intervention that performance technologists can use to improve human performance by reducing error variance. Highlights include benefits of process mapping; and critical success factors, including organizational readiness, time commitment by participants, and the availability of a…

  11. A Tool To Assess Journal Price Discrimination.

    ERIC Educational Resources Information Center

    Meyer, Richard W.

    2001-01-01

    The author designed an experiment to determine whether periodical price inflation might be dampened by electronic scholarship. This article discusses results of an econometric analysis of prices for 859 periodical titles for three consecutive years, and concludes with a description of an analytical tool that may be used to assess journal prices.…

  12. Airport vulnerability assessment: an analytical approach

    NASA Astrophysics Data System (ADS)

    Lazarick, Richard T.

    1998-12-01

    The Airport Vulnerability Assessment Project (AVAP) is the direct result of congressional funding of recommendation 3.13 of the White House Commission on Aviation Safety and Security. This project takes a new approach to the assessment of U.S. commercial airports. AVAP uses automation, analytical methods, and tools to evaluate vulnerability and risk, and to analyze cost/benefits in a more quantitative manner. This paper addresses both the process used to conduct this program and a generalized look at the results achieved for the initial airport assessments. The process description covers the acquisition approach, the project structure, and a review of the various methodologies and tools being used by the seven performing organizations (Abacus Technology, Battelle, CTI, Lockwood Greene, Naval Facilities Engineering Service Center, SAIC, and Science & Engineering Associates). The tools described include ASSESS, SAM, RiskWatch, CASRAP, and AVAT. Included in the process is the utilization of an advisory panel made up predominantly of experts from the National Laboratories (Sandia, Oak Ridge, Argonne, and Brookhaven). The results portion addresses the findings and products resulting from the initial airport assessments. High-level (unrestricted) summaries of the results are presented, along with initial trends in commonly recommended security improvements (countermeasures). Opportunities for the application of optics technology are identified.

  13. Coastal On-line Assessment and Synthesis Tool 2.0

    NASA Technical Reports Server (NTRS)

    Brown, Richard; Navard, Andrew; Nguyen, Beth

    2011-01-01

    COAST (Coastal On-line Assessment and Synthesis Tool) is a 3D, open-source Earth data browser developed by leveraging and enhancing previous NASA open-source tools. These tools use satellite imagery and elevation data in a way that allows any user to zoom from orbit view down into any place on Earth, and enables the user to experience Earth terrain in a visually rich 3D view. The benefits associated with taking advantage of an open-source geo-browser are that it is free, extensible, and offers a worldwide developer community that is available to provide additional development and improvement potential. What makes COAST unique is that it simplifies the process of locating and accessing data sources, and allows a user to combine them into a multi-layered and/or multi-temporal visual analytical look into possible data interrelationships and coeffectors for coastal environment phenomenology. COAST provides users with new data visual analytic capabilities. COAST has been upgraded to maximize use of open-source data access, viewing, and data manipulation software tools. The COAST 2.0 toolset has been developed to increase access to a larger realm of the most commonly implemented data formats used by the coastal science community. New and enhanced functionalities that upgrade COAST to COAST 2.0 include the development of the Temporal Visualization Tool (TVT) plug-in, the Recursive Online Remote Data-Data Mapper (RECORD-DM) utility, the Import Data Tool (IDT), and the Add Points Tool (APT). With these improvements, users can integrate their own data with other data sources, and visualize the resulting layers of different data types (such as spatial and spectral, for simultaneous visual analysis), and visualize temporal changes in areas of interest.

  14. Sharpening the health policy analytical rapier; Comment on "The politics and analytics of health policy"

    PubMed Central

    Powell, Martin

    2014-01-01

    This commentary on the Editorial ‘The politics and analytics of health policy’ by Professor Calum Paton focuses on two issues. First, it points to the unclear links between ideas, ideology, values, and discourse and policy, and warns that discourse is often a poor guide to enacted policy. Second, it suggests that realism, particularly ‘programme theory’, offers useful tools for health policy analysis. ‘Market reform’ cannot be reduced to a simple ‘four legs good, two legs bad’ verdict, and programme theory might suggest that certain mechanisms may be good for one outcome in a particular context, but bad for another. PMID:24847488

  15. Ball Bearing Analysis with the ORBIS Tool

    NASA Technical Reports Server (NTRS)

    Halpin, Jacob D.

    2016-01-01

    Ball bearing design is critical to the success of aerospace mechanisms. Key bearing performance parameters, such as load capability, stiffness, torque, and life, all depend on accurate determination of the internal load distribution. Hence, a good analytical bearing tool that provides both comprehensive capabilities and reliable results becomes a significant asset to the engineer. This paper introduces the ORBIS bearing tool. A discussion of key modeling assumptions and a technical overview is provided. Numerous validation studies and case studies using the ORBIS tool are presented. All results suggest that the ORBIS code correlates closely with predictions of bearing internal load distributions, stiffness, deflections, and stresses.
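
    As a point of reference for the internal load distribution discussed above, a classical closed-form estimate is Stribeck's formula for the maximum rolling-element load in a radially loaded ball bearing with nominally zero clearance. The sketch below implements that textbook estimate, not the ORBIS solver, and the input values are illustrative assumptions.

      # Stribeck estimate of the peak ball load in a radially loaded
      # ball bearing: Q_max ~ 4.37 * F_r / (Z * cos(alpha)), zero clearance.
      import math

      def stribeck_max_ball_load(F_r, n_balls, contact_angle_deg=0.0):
          """Peak rolling-element load, N (F_r = radial load, Z = n_balls)."""
          return 4.37 * F_r / (n_balls * math.cos(math.radians(contact_angle_deg)))

      # Assumed example: 500 N radial load, 12 balls, 15 degree contact angle.
      q_max = stribeck_max_ball_load(F_r=500.0, n_balls=12, contact_angle_deg=15.0)
      print(f"peak ball load ~ {q_max:.0f} N")

    Tools like ORBIS go beyond such estimates by solving the full deflection-compatible load distribution, from which stiffness, torque, and life follow.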

  16. Boron addition to alloys

    SciTech Connect

    Coad, B. C.

    1985-08-20

    A process for the addition of boron to an alloy, which involves forming a melt of the alloy and a reactive metal selected from the group consisting of aluminum, titanium, zirconium, and mixtures thereof; adding boric oxide to the melt; maintaining the resulting reactive mixture in the molten state and reacting the boric oxide with the reactive metal to convert at least a portion of the boric oxide to boron, which dissolves in the resulting melt, and to convert at least a portion of the reactive metal to the reactive metal oxide, which oxide remains with the resulting melt; and pouring the resulting melt into a gas stream to form a first atomized powder, which is subsequently remelted with further addition of boric oxide, re-atomized, and thus reprocessed to convert essentially all the reactive metal to metal oxide, producing a powdered alloy containing specified amounts of boron.

  17. Tackifier for addition polyimides

    NASA Technical Reports Server (NTRS)

    Butler, J. M.; St. Clair, T. L.

    1980-01-01

    A modification of the addition polyimide LaRC-160 was prepared to improve tack and drape and increase prepreg out-time. The essentially solventless, high-viscosity laminating resin is synthesized from low-cost liquid monomers. The modified version takes advantage of a reactive liquid plasticizer which is used in place of solvent and helps solve a major problem of maintaining good prepreg tack and drape, i.e., the ability of the prepreg to adhere to adjacent plies and conform to a desired shape during the lay-up process. This alternate solventless approach allows both longer life of the polymer prepreg and the processing of low-void laminates. The approach appears to be applicable to all addition polyimide systems.

  18. Vinyl capped addition polyimides

    NASA Technical Reports Server (NTRS)

    Vannucci, Raymond D. (Inventor); Malarik, Diane C. (Inventor); Delvigs, Peter (Inventor)

    1991-01-01

    Polyimide resins (PMR) are generally useful where high strength and temperature capabilities are required (at temperatures up to about 700 F). Polyimide resins are particularly useful in applications such as jet engine compressor components, for example, blades, vanes, air seals, air splitters, and engine casing parts. Aromatic vinyl capped addition polyimides are obtained by reacting a diamine, an ester of tetracarboxylic acid, and an aromatic vinyl compound. Low void materials with improved oxidative stability when exposed to 700 F air may be fabricated as fiber reinforced high molecular weight capped polyimide composites. The aromatic vinyl capped polyimides are provided with a more aromatic nature and are more thermally stable than highly aliphatic, norbornenyl-type end-capped polyimides employed in PMR resins. The substitution of aromatic vinyl end-caps for norbornenyl end-caps in addition polyimides results in polymers with improved oxidative stability.

  19. [Biologically active food additives].

    PubMed

    Velichko, M A; Shevchenko, V P

    1998-07-01

    More than half of the 40 projects for the development of medical science to the year 2000 are connected with bioactive food additives, which have been called "the food of the XXI century" and serve as non-pharmacological means against many diseases. Most of these additives--nutriceutics and parapharmaceutics--are intended to enrich the food rations of sick or healthy people. The ecologically safest and most effective are combined domestic adaptogens with immuno-modulating and antioxidant action that give an anabolic and stimulating effect: "leveton", "phytoton" and "adapton". The MKTs-229 tablets are a residue-discharge means. For atherosclerosis and general adiposis, "tsar tablets" and "aiconol (ikhtien)"--based on cod-liver oil--or "splat", made from seaweed (algae), are recommended. All these preparations have been clinically tested and have received hygiene certificates from the Institute of Dietology of the Russian Academy of Medical Science. PMID:9752776

  20. Electrophilic addition of astatine

    SciTech Connect

    Norseev, Yu.V.; Vasaros, L.; Nhan, D.D.; Huan, N.K.

    1988-03-01

    It has been shown for the first time that astatine is capable of undergoing addition reactions with unsaturated hydrocarbons. A new compound of astatine, viz., ethylene astatohydrin, has been obtained, and its retention numbers on squalane, Apiezon, and tricresyl phosphate have been determined. The influence of various factors on the formation of ethylene astatohydrin has been studied. It has been concluded, on the basis of the results obtained, that the univalent cation of astatine in an acidic medium is protonated hypoastatous acid.

  1. Hydrocarbon fuel additive

    SciTech Connect

    Ambrogio, S.

    1989-02-28

    This patent describes, in a method of fuel storage or combustion wherein the fuel supply contains small amounts of water, the step of adding to the fuel supply an additive comprising a blend of: a hydrophilic agent chosen from the group of ethylene glycol, n-butyl alcohol, and cellosolve, in the range of 22-37% by weight; ethoxylated nonylphenol in the range of 26-35% by weight; and nonylphenol polyethylene glycol ether in the range of 32-43% by weight.

  2. Hanford analytical sample projections FY 1996 - FY 2001. Revision 4

    SciTech Connect

    Joyce, S.M.

    1997-07-02

    This document summarizes the biannual Hanford sample projections for fiscal years 1997-2001. Sample projections are based on inputs submitted to Analytical Services covering Environmental Restoration, Tank Waste Remediation System, Solid Wastes, Liquid Effluents, Spent Nuclear Fuels, Transition Projects, Site Monitoring, Industrial Hygiene, Analytical Services, and miscellaneous Hanford support activities. In addition to this revision, details on laboratory-scale technology development, sample management, and data management activities were requested. This information will be used by the Hanford Analytical Services program and the Sample Management Working Group to ensure that laboratories and resources are available and effectively utilized to meet these documented needs.

  3. Functional Generalized Additive Models.

    PubMed

    McLean, Mathew W; Hooker, Giles; Staicu, Ana-Maria; Scheipl, Fabian; Ruppert, David

    2014-01-01

    We introduce the functional generalized additive model (FGAM), a novel regression model for association studies between a scalar response and a functional predictor. We model the link-transformed mean response as the integral with respect to t of F{X(t), t} where F(·,·) is an unknown regression function and X(t) is a functional covariate. Rather than having an additive model in a finite number of principal components as in Müller and Yao (2008), our model incorporates the functional predictor directly and thus our model can be viewed as the natural functional extension of generalized additive models. We estimate F(·,·) using tensor-product B-splines with roughness penalties. A pointwise quantile transformation of the functional predictor is also considered to ensure each tensor-product B-spline has observed data on its support. The methods are evaluated using simulated data and their predictive performance is compared with other competing scalar-on-function regression alternatives. We illustrate the usefulness of our approach through an application to brain tractography, where X(t) is a signal from diffusion tensor imaging at position, t, along a tract in the brain. In one example, the response is disease-status (case or control) and in a second example, it is the score on a cognitive test. R code for performing the simulations and fitting the FGAM can be found in supplemental materials available online. PMID:24729671
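
    In the notation of the abstract, with g the link function and the integral taken over the domain of the functional covariate (written here as [0, 1], an assumption made for concreteness), the FGAM model reads:

      g\{E(Y \mid X)\} = \int_0^1 F\{X(t), t\}\, dt

    Setting F(x, t) = x \beta(t) recovers the familiar functional linear model, which is the sense in which FGAM generalizes it.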

  4. Gaseous analytes of concern at Hanford Tank Farms. Topical report

    SciTech Connect

    1996-03-01

    Large amounts of toxic and radioactive waste materials are stored in underground tanks at DOE sites. When the vapors in the tank headspaces vent to the open atmosphere, a potentially dangerous situation can occur for personnel in the area. An open-path atmospheric pollution monitor is being developed for DOE to monitor the open air space above these tanks. In developing this monitor it is important to know what hazardous gases are most likely to be found in dangerous concentrations. These gases are called the Analytes of Concern. At the present time, measurements in eight tanks have detected thirty-one analytes in at least two tanks and fifteen analytes in only one tank. In addition to these gases, carbon tetrachloride is considered to be an Analyte of Concern because it permeates the ground around the tanks. These analytes are described and ranked according to a Hazard Index which combines their vapor pressure, density, and approximate danger level. The top sixteen ranked analytes which have been detected in at least two tanks comprise an "Analytes of Concern Test List" for determining the system performance of the atmospheric pollution monitor under development. A preliminary examination of the infrared spectra, barring atmospheric interferences, indicates that the pollution monitor will detect all forty-seven analytes.
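
    The abstract does not give the Hazard Index formula, so the ranking sketch below (Python) assumes a simple multiplicative index (volatility times vapor density, divided by an allowable exposure level) purely for illustration; the analyte names and property values are likewise illustrative:

      # Illustrative ranking of analytes by a hypothetical hazard index.
      analytes = [
          # (name, vapor pressure [mmHg], vapor density rel. to air, exposure limit [ppm])
          ("ammonia",              7500.0, 0.6,  25.0),
          ("carbon tetrachloride",   91.0, 5.3,   5.0),
          ("n-butanol",               7.0, 2.6,  20.0),
      ]

      def hazard_index(vapor_pressure, vapor_density, exposure_limit):
          """Assumed form: volatile, dense, low-limit vapors rank highest."""
          return vapor_pressure * vapor_density / exposure_limit

      for name, vp, vd, el in sorted(analytes, key=lambda a: -hazard_index(*a[1:])):
          print(f"{name:22s} HI = {hazard_index(vp, vd, el):10.1f}")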

  5. On Establishing Big Data Wave Breakwaters with Analytics (Invited)

    NASA Astrophysics Data System (ADS)

    Riedel, M.

    2013-12-01

    The Research Data Alliance Big Data Analytics (RDA-BDA) Interest Group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs of utilizing large quantities of data. RDA-BDA seeks to analyze different scientific domain applications and their potential use of various big data analytics techniques. A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. These combinations are complex since a wide variety of different data analysis algorithms exist (e.g. specific algorithms using GPUs for analyzing brain images) that need to work together with multiple analytical tools, reaching from simple (iterative) map-reduce methods (e.g. with Apache Hadoop or Twister) to sophisticated higher-level frameworks that leverage machine learning algorithms (e.g. Apache Mahout). These computational analysis techniques are often augmented with visual analytics techniques (e.g. computational steering on large-scale high performance computing platforms) to put human judgement into the analysis loop, or with new approaches to databases that are designed to support new forms of unstructured or semi-structured data, as opposed to traditional structured databases (e.g. relational databases). More recently, data analysis and the underpinning analytics frameworks also have to consider the energy footprints of underlying resources. To sum up, the aim of this talk is to provide pieces of information to understand big data analytics in the context of science and engineering, using the aforementioned classification as the lighthouse and as the frame of reference for a systematic approach. This talk will provide insights about big data analytics methods in the context of science within various communities and offers different views of how approaches of correlation and causality offer complementary methods

  6. Analytical investigation of a semi-empirical flow-induced vibration model

    NASA Astrophysics Data System (ADS)

    Allison, Anne-Marie Elizabeth

    1998-12-01

    The thesis is an investigation of the effects of two types of flow-induced vibration on structures of square cross-section under two-dimensional conditions: vortex-induced vibration and galloping. In particular, the semi-empirical, mathematical model proposed by Tamura and Shimada (123) is examined. The model incorporates the effects of the oscillating wake by coupling the equation for the cylinder motion with an equation for the angular displacement of the (Birkhoff type) wake-oscillator. The model equations are examined by analytical means in the quest for stability and bifurcation information. The effects of model parameters are of primary interest. The analytical methods used are much more efficient than numerical solutions. The method of multiple scales is used to obtain slow-flow equations for the amplitudes of the cylinder displacement and of the angular displacement of the wake-oscillator and for the phase shift between the two oscillators. In addition to using Grobner bases in the solution route, parametric solutions of the slow-flow equations are presented. The bifurcation analysis leads to some new and useful tools for dealing with polynomial systems with symbolic coefficients. Primarily, the promising new concept of an approximate bifurcation set is developed in Chapter 7. The results for the nonresonance region imply that the Birkhoff wake-oscillator is a useful idea, and this is thus followed up by a more elaborate study of the resonance region. The Tamura-Shimada model is capable of exhibiting a wide range of behavior for the transverse oscillations of square cylinders. The computer algebra language MAPLE is an essential tool for the results presented. The numerical package MATLAB is also useful in verifications of the analytical work as well as the stability and bifurcation studies.
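
    Schematically, the coupled structure described above pairs a linearly damped cylinder oscillator with a self-excited (van der Pol type) wake oscillator. The LaTeX sketch below shows only the coupling pattern; the forcing terms f_1, f_2 and all coefficients are placeholders, not the exact Tamura-Shimada system:

      \ddot{y} + 2\zeta\omega_0\,\dot{y} + \omega_0^2\, y = f_1(\alpha, \dot{\alpha}; U)
      \ddot{\alpha} - \epsilon\,\Omega \left(1 - \frac{4\alpha^2}{\alpha_0^2}\right) \dot{\alpha} + \Omega^2 \alpha = f_2(\dot{y}, \ddot{y}; U)

    Here y is the transverse cylinder displacement, \alpha the angular displacement of the Birkhoff wake-oscillator, \Omega the vortex-shedding frequency, and U the flow velocity.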

  7. A new methodology for predictive tool wear

    NASA Astrophysics Data System (ADS)

    Kim, Won-Sik

    turned with various cutting conditions and the results were compared with the proposed analytical wear models. The crater surfaces after machining have been carefully studied to shed light on the physics behind the crater wear. In addition, the abrasive wear mechanism plays a major role in the development of crater wear. Laser shock processing (LSP) has been applied to locally relieve the deleterious tensile residual stresses on the crater surface of a coated tool and thus to improve the hardness of the coating. This thesis shows that LSP has indeed improved the wear resistance of CVD-coated alumina tool inserts, which carry residual stresses due to the high processing temperature. LSP utilizes a very short laser pulse with high energy density, which induces high-pressure stress wave propagation. The residual stresses are relieved by incident shock waves on the coating surface. Residual stress levels of LSP CVD alumina-coated carbide inserts were evaluated by X-ray diffractometry. Based on these results, LSP parameters such as the number of laser pulses and the laser energy density can be controlled to reduce residual stress. Crater wear results show that wear resistance increases for LSP-treated tool inserts. Because the hardness data are used to predict the wear, the improvement in hardness and wear resistance shows that the mechanism of crater wear also involves abrasive wear.

  8. Trends in Analytical Scale Separations.

    ERIC Educational Resources Information Center

    Jorgenson, James W.

    1984-01-01

    Discusses recent developments in the instrumentation and practice of analytical scale operations. Emphasizes detection devices and procedures in gas chromatography, liquid chromatography, electrophoresis, supercritical fluid chromatography, and field-flow fractionation. (JN)

  9. Liposomes: Technologies and Analytical Applications

    NASA Astrophysics Data System (ADS)

    Jesorka, Aldo; Orwar, Owe

    2008-07-01

    Liposomes are structurally and functionally some of the most versatile supramolecular assemblies in existence. Since the beginning of active research on lipid vesicles in 1965, the field has progressed enormously and applications are well established in several areas, such as drug and gene delivery. In the analytical sciences, liposomes serve a dual purpose: Either they are analytes, typically in quality-assessment procedures of liposome preparations, or they are functional components in a variety of new analytical systems. Liposome immunoassays, for example, benefit greatly from the amplification provided by encapsulated markers, and nanotube-interconnected liposome networks have emerged as ultrasmall-scale analytical devices. This review provides information about new developments in some of the most actively researched liposome-related topics.

  10. Laboratory Workhorse: The Analytical Balance.

    ERIC Educational Resources Information Center

    Clark, Douglas W.

    1979-01-01

    This report explains the importance of various analytical balances in the water or wastewater laboratory. Stressed is the proper procedure for utilizing the equipment as well as the mechanics involved in its operation. (CS)

  11. Analytic Methods in Investigative Geometry.

    ERIC Educational Resources Information Center

    Dobbs, David E.

    2001-01-01

    Suggests an alternative proof by analytic methods, which is more accessible than rigorous proof based on Euclid's Elements, in which students need only apply standard methods of trigonometry to the data without introducing new points or lines. (KHR)

  12. Analytical Chemistry: A Literary Approach.

    ERIC Educational Resources Information Center

    Lucy, Charles A.

    2000-01-01

    Provides an anthology of references to descriptions of analytical chemistry techniques from history, popular fiction, and film which can be used to capture student interest and frame discussions of chemical techniques. (WRM)

  13. Cautions Concerning Electronic Analytical Balances.

    ERIC Educational Resources Information Center

    Johnson, Bruce B.; Wells, John D.

    1986-01-01

    Cautions chemists to be wary of ferromagnetic samples (especially magnetized samples), stray electromagnetic radiation, dusty environments, and changing weather conditions. These and other conditions may alter readings obtained from electronic analytical balances. (JN)

  14. RSAT: regulatory sequence analysis tools.

    PubMed

    Thomas-Chollier, Morgane; Sand, Olivier; Turatsinze, Jean-Valéry; Janky, Rekin's; Defrance, Matthieu; Vervisch, Eric; Brohée, Sylvain; van Helden, Jacques

    2008-07-01

    The regulatory sequence analysis tools (RSAT, http://rsat.ulb.ac.be/rsat/) is a software suite that integrates a wide collection of modular tools for the detection of cis-regulatory elements in genome sequences. The suite includes programs for sequence retrieval, pattern discovery, phylogenetic footprint detection, pattern matching, genome scanning and feature map drawing. Random controls can be performed with random gene selections or by generating random sequences according to a variety of background models (Bernoulli, Markov). Beyond the original word-based pattern-discovery tools (oligo-analysis and dyad-analysis), we recently added a battery of tools for matrix-based detection of cis-acting elements, with some original features (adaptive background models, Markov-chain estimation of P-values) that do not exist in other matrix-based scanning tools. The web server offers an intuitive interface, where each program can be accessed either separately or connected to the other tools. In addition, the tools are now available as web services, enabling their integration in programmatic workflows. Genomes are regularly updated from various genome repositories (NCBI and EnsEMBL) and 682 organisms are currently supported. Since 1998, the tools have been used by several hundred researchers from all over the world. Several predictions made with RSAT were validated experimentally and published. PMID:18495751
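
    As a rough illustration of the word-based pattern-discovery idea behind oligo-analysis (a simplification, not RSAT's implementation, which supports Markov background models and corrected P-values), one can count k-mers in a set of sequences and score over-representation against a uniform Bernoulli background:

      from itertools import product
      from scipy.stats import binom

      def kmer_counts(sequences, k):
          """Count occurrences of every k-mer across all sequences."""
          counts = {"".join(p): 0 for p in product("ACGT", repeat=k)}
          total = 0
          for seq in sequences:
              for i in range(len(seq) - k + 1):
                  word = seq[i:i + k]
                  if word in counts:
                      counts[word] += 1
                      total += 1
          return counts, total

      def overrepresented(sequences, k=6, alpha=0.05):
          """Score each k-mer against a uniform background, p0 = 0.25**k."""
          counts, total = kmer_counts(sequences, k)
          p0 = 0.25 ** k
          hits = [(w, n, binom.sf(n - 1, total, p0))   # P(X >= n)
                  for w, n in counts.items()
                  if n > 0 and binom.sf(n - 1, total, p0) < alpha]
          return sorted(hits, key=lambda h: h[2])

      promoters = ["ACGTGGGATTACGTGACGTG", "TTACGTGCCACGTGAAGGTC"]
      for word, n, p in overrepresented(promoters):
          print(word, n, f"{p:.2e}")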

  15. Evaluation of Visual Analytics Environments: The Road to the Visual Analytics Science and Technology Challenge Evaluation Methodology

    SciTech Connect

    Scholtz, Jean; Plaisant, Catherine; Whiting, Mark A.; Grinstein, Georges

    2014-09-28

    The evaluation of visual analytics environments was a topic in Illuminating the Path [Thomas 2005] as a critical aspect of moving research into practice. For a thorough understanding of the utility of the systems available, evaluation involves assessing not only the visualizations, interactions, or data processing algorithms themselves, but also the complex processes that a tool is meant to support (such as exploratory data analysis and reasoning, communication through visualization, or collaborative data analysis [Lam 2012; Carpendale 2007]). Researchers and practitioners in the field have long identified many of the challenges faced when planning, conducting, and executing an evaluation of a visualization tool or system [Plaisant 2004]. Evaluation is needed to verify that algorithms and software systems work correctly and that they represent improvements over the current infrastructure. Additionally, to effectively transfer new software into a working environment, it is necessary to ensure that the software has utility for the end-users and that it can be incorporated into the end-users' infrastructure and work practices. Evaluation test beds require datasets, tasks, metrics, and evaluation methodologies. As noted in [Thomas 2005], it is difficult and expensive for any one researcher to set up an evaluation test bed, so in many cases evaluation is set up for communities of researchers or for various research projects or programs. Examples of successful community evaluations can be found in [Chinchor 1993; Voorhees 2007; FRGC 2012]. As visual analytics environments are intended to facilitate the work of human analysts, one aspect of evaluation needs to focus on the utility of the software to the end-user. This requires representative users, representative tasks, and metrics that measure the utility to the end-user. This is even more difficult, as now one aspect of the test methodology is access to representative end-users to participate in the evaluation. In many

  16. Functionalized magnetic nanoparticle analyte sensor

    DOEpatents

    Yantasee, Wassana; Warner, Maryin G; Warner, Cynthia L; Addleman, Raymond S; Fryxell, Glen E; Timchalk, Charles; Toloczko, Mychailo B

    2014-03-25

    A method and system for simply and efficiently determining quantities of a preselected material in a particular solution by placing at least one superparamagnetic nanoparticle, having a specified functionalized organic material connected thereto, into a particular sample solution, whereupon preselected analytes attach to the functionalized organic groups; the superparamagnetic nanoparticles are then collected at a collection site and analyzed for the presence of a particular analyte.

  17. Analytical multikinks in smooth potentials

    NASA Astrophysics Data System (ADS)

    de Brito, G. P.; Correa, R. A. C.; de Souza Dutra, A.

    2014-03-01

    In this work we present an approach that can be systematically used to construct nonlinear systems possessing analytical multikink profile configurations. In contrast with previous approaches to the problem, we are able to do it by using field potentials that are considerably smoother than the ones of the doubly quadratic family of potentials. This is done without losing the capacity of writing exact analytical solutions. The resulting field configurations can be applied to the study of problems from condensed matter to braneworld scenarios.
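
    For orientation, the canonical single-kink example (not one of the paper's smoother multikink potentials) is the \phi^4 model, whose static kink interpolates between the two vacua \phi = \pm 1:

      V(\phi) = \tfrac{1}{2}\,(1 - \phi^2)^2, \qquad \phi_K(x) = \tanh(x - x_0)

    Multikink configurations string together several such transitions between adjacent vacua of a potential with more minima, which is what the smooth potentials constructed in the paper admit analytically.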

  18. Additive manufacturing of materials: Opportunities and challenges

    DOE PAGES Beta

    Babu, Sudarsanam Suresh; Love, Lonnie J.; Dehoff, Ryan R.; Peter, William H.; Watkins, Thomas R.; Pannala, Sreekanth

    2015-11-01

    Additive manufacturing (also known as 3D printing) is considered a disruptive technology for producing components with topologically optimized complex geometries as well as functionalities that are not achievable by traditional methods. The realization of the full potential of 3D printing is stifled by a lack of computational design tools, generic material feedstocks, techniques for monitoring thermomechanical processes under in situ conditions, and especially methods for minimizing anisotropic static and dynamic properties brought about by microstructural heterogeneity. In this paper, we discuss the role of interdisciplinary research involving robotics and automation, process control, multiscale characterization of microstructure and properties, and high-performance computational tools to address each of these challenges. In addition, emerging pathways to scale up additive manufacturing of structural materials to large sizes (>1 m) and higher productivities (5–20 kg/h) while maintaining mechanical performance and geometrical flexibility are also discussed.

  20. Evaluating Visual Analytics at the 2007 VAST Symposium Contest

    SciTech Connect

    Plaisant, Catherine; Grinstein, Georges; Scholtz, Jean; Whiting, Mark A.; O'Connell, Theresa; Laskowski, Sharon; Chien, Lynn; Tat, Annie; Wright, William; Gorg, Carsten; Lui, Zhicheng; Parekh, Neel; Singhal, Kanupriya; Stasko, John T.

    2008-03-01

    The second Visual Analytics Science and Technology (VAST) contest was held in conjunction with the 2007 IEEE VAST Symposium. A synthetic data set was created containing a known scenario with embedded threats, therefore providing ground truth. Participants used visual analytic tools to explore the heterogeneous data collection and find the evidence of illegal and possible terrorist activities in the data. We describe the contest and the evaluation methodology, then provide details on the results and the two winning entries. Lessons learned are reported from different points of view: contest committee, participants, judges, stakeholders, and other researchers.

  1. Biomedical Applications of the APPS-IV Analytical Plotter

    PubMed Central

    Passauer, James L.; Niedzwiadek, Harry A.; Molander, Craig W.

    1980-01-01

    Photogrammetry, the science of extracting information from photography and imagery, offers the biomedical field virtually unlimited aid in the quantitative, non-contact evaluation of body form, function, and detail. The APPS-IV Analytical Plotter, developed by Autometric, Inc., can serve as a mensuration tool in the extraction of detail from photography. The APPS-IV is a microprocessor-controlled, cost-effective photogrammetric instrument which places few demands on a modest host computer. The use of analytical techniques provides a significant advance over traditional analog methods in speed and accuracy of measurement.

  2. Algal functional annotation tool

    SciTech Connect

    2012-07-12

    Abstract BACKGROUND: Progress in genome sequencing is proceeding at an exponential pace, and several new algal genomes are becoming available every year. One of the challenges facing the community is the association of protein sequences encoded in the genomes with biological function. While most genome assembly projects generate annotations for predicted protein sequences, they are usually limited and integrate functional terms from a limited number of databases. Another challenge is the use of annotations to interpret large lists of 'interesting' genes generated by genome-scale datasets. Previously, these gene lists had to be analyzed across several independent biological databases, often on a gene-by-gene basis. In contrast, several annotation databases, such as DAVID, integrate data from multiple functional databases and reveal underlying biological themes of large gene lists. While several such databases have been constructed for animals, none is currently available for the study of algae. Due to renewed interest in algae as potential sources of biofuels and the emergence of multiple algal genome sequences, a significant need has arisen for such a database to process the growing compendiums of algal genomic data. DESCRIPTION: The Algal Functional Annotation Tool is a web-based comprehensive analysis suite integrating annotation data from several pathway, ontology, and protein family databases. The current version provides annotation for the model alga Chlamydomonas reinhardtii, and in the future will include additional genomes. The site allows users to interpret large gene lists by identifying associated functional terms, and their enrichment. Additionally, expression data for several experimental conditions were compiled and analyzed to provide an expression-based enrichment search. A tool to search for functionally-related genes based on gene expression across these conditions is also provided. Other features include dynamic visualization of genes on KEGG

  4. Analytic Challenges Arising from the STOP CRC Trial: Pragmatic Solutions for Pragmatic Problems

    PubMed Central

    Vollmer, William M.; Green, Beverly B.; Coronado, Gloria D.

    2015-01-01

    Context: Pragmatic trials lack the relatively tight quality control of traditional efficacy studies and hence may pose added analytic challenges owing to the practical realities faced in carrying them out. Case Description: STOP CRC is a cluster randomized trial testing the effectiveness of automated, electronic medical record (EMR)-driven strategies to raise colorectal cancer (CRC) screening rates in safety net clinics. Screen-eligible participants were accrued during year 1 and followed for 12 months (measurement window) to assess completion of a fecal screening test. Control clinics implemented the intervention in year 2. Implementation Challenges/Analytic Issues: Due to limitations on how we could build the intervention tools, the overlap of the year 1 measurement windows with year 2 intervention rollout posed a potential for contamination of the primary outcome for control participants. In addition, a variety of factors led to a lack of synchronization of the measurement windows with actual intervention delivery. In both cases, the net impact of these factors would be to diminish the estimated impact of the intervention. Proposed Solutions: We dealt with the overlap issue by delaying the start of intervention rollout to control clinics in year 2 by 6 months and by truncating the measurement windows for intervention and control participants at this point. In addition we formulated three sensitivity analyses to help address the issue of asynchronization. Conclusion: This case study might help other investigators facing similar challenges think about such issues and the pros and cons of various strategies for dealing with them. PMID:26793738

  5. Simplified, inverse, ejector design tool

    NASA Technical Reports Server (NTRS)

    Dechant, Lawrence J.

    1993-01-01

    A simple lumped parameter based inverse design tool has been developed which provides flow path geometry and entrainment estimates subject to operational, acoustic, and design constraints. These constraints are manifested through specification of primary mass flow rate or ejector thrust, fully-mixed exit velocity, and static pressure matching. Fundamentally, integral forms of the conservation equations coupled with the specified design constraints are combined to yield an easily invertible linear system in terms of the flow path cross-sectional areas. Entrainment is computed by back substitution. Initial comparison with experimental and analogous one-dimensional methods show good agreement. Thus, this simple inverse design code provides an analytically based, preliminary design tool with direct application to High Speed Civil Transport (HSCT) design studies.
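
    A minimal sketch of the lumped-parameter inverse-design idea, under simplifying assumptions that are mine rather than the paper's (incompressible flow, uniform one-dimensional streams, static pressures matched so the pressure terms drop out of the momentum balance): given the primary mass flow, the primary and secondary stream velocities, and the specified fully-mixed exit velocity, the momentum and continuity balances yield the entrained flow and the exit area by back substitution.

      def ejector_inverse(mdot_p, V_p, V_s, V_m, rho=1.2):
          """Toy 1-D ejector inverse design (illustrative assumptions only).

          Momentum:   mdot_p*V_p + mdot_s*V_s = (mdot_p + mdot_s)*V_m
          Continuity: mdot_p + mdot_s = rho * V_m * A_exit
          """
          if not V_s < V_m < V_p:
              raise ValueError("mixed velocity must lie between secondary and primary")
          mdot_s = mdot_p * (V_p - V_m) / (V_m - V_s)   # entrainment
          A_exit = (mdot_p + mdot_s) / (rho * V_m)      # flow-path exit area
          return mdot_s, A_exit

      # Example: a 2 kg/s primary jet at 300 m/s entraining 30 m/s secondary air,
      # with a specified fully-mixed exit velocity of 120 m/s.
      mdot_s, A_exit = ejector_inverse(mdot_p=2.0, V_p=300.0, V_s=30.0, V_m=120.0)
      print(f"entrained flow = {mdot_s:.2f} kg/s, exit area = {A_exit:.3f} m^2")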

  6. Siloxane containing addition polyimides

    NASA Technical Reports Server (NTRS)

    Maudgal, S.; St. Clair, T. L.

    1984-01-01

    Addition polyimide oligomers have been synthesized from bis(gamma-aminopropyl) tetramethyldisiloxane and 3, 3', 4, 4'-benzophenonetetracarboxylic dianhydride using a variety of latent crosslinking groups as endcappers. The prepolymers were isolated and characterized for solubility (in amide, chlorinated and ether solvents), melt flow and cure properties. The most promising systems, maleimide and acetylene terminated prepolymers, were selected for detailed study. Graphite cloth reinforced composites were prepared and properties compared with those of graphite/Kerimid 601, a commercially available bismaleimide. Mixtures of the maleimide terminated system with Kerimid 601, in varying proportions, were also studied.

  7. Oil additive process

    SciTech Connect

    Bishop, H.

    1988-10-18

    This patent describes a method of making an additive comprising: (a) adding 2 parts by volume of 3% sodium hypochlorite to 45 parts by volume of diesel oil fuel to form a sulphur-free fuel; (b) removing all water and foreign matter formed by the sodium hypochlorite; (c) blending 30 parts by volume of 24% lead naphthenate with 15 parts by volume of the sulphur-free fuel and 15 parts by volume of light-weight mineral oil to form a blended mixture; and (d) heating the blended mixture slowly and uniformly to 152 F.

  8. Additivity, density fluctuations, and nonequilibrium thermodynamics for active Brownian particles

    NASA Astrophysics Data System (ADS)

    Chakraborti, Subhadip; Mishra, Shradha; Pradhan, Punyabrata

    2016-05-01

    Using an additivity property, we study particle-number fluctuations in a system of interacting self-propelled particles, called active Brownian particles (ABPs), which consists of repulsive disks with random self-propulsion velocities. From a fluctuation-response relation, a direct consequence of additivity, we formulate a thermodynamic theory which captures the previously observed features of nonequilibrium phase transition in the ABPs from a homogeneous fluid phase to an inhomogeneous phase of coexisting gas and liquid. We substantiate the predictions of additivity by analytically calculating the subsystem particle-number distributions in the homogeneous fluid phase away from criticality where analytically obtained distributions are compatible with simulations in the ABPs.
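
    In the notation commonly used in additivity-based approaches (conventions assumed here, with temperature absorbed into the free-energy density f; the paper's own notation may differ), a subsystem of volume v has particle-number distribution and fluctuation-response relation

      P_v(N) \propto \exp\!\left[-v \left\{ f(N/v) - \mu N/v \right\}\right], \qquad
      \frac{d\bar{\rho}}{d\mu} = \frac{\langle N^2 \rangle - \langle N \rangle^2}{v}

    so measuring subsystem number fluctuations determines the density-chemical potential relation, which is the route to the nonequilibrium thermodynamics described above.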

  9. Recent advances in analytical satellite theory

    NASA Technical Reports Server (NTRS)

    Gaposchkin, E. M.

    1978-01-01

    Recent work on analytical satellite perturbation theory has involved the completion of a revision to 4th order for zonal harmonics, the addition of a treatment for ocean tides, an extension of the treatment for the noninertial reference system, and the completion of a theory for direct solar-radiation pressure and earth-albedo pressure. Combined with a theory for tesseral-harmonics, lunisolar, and body-tide perturbations, these formulations provide a comprehensive orbit-computation program. Detailed comparisons with numerical integration and observations are presented to assess the accuracy of each theoretical development.

  10. Data Intensive Architecture for Scalable Cyber Analytics

    SciTech Connect

    Olsen, Bryan K.; Johnson, John R.; Critchlow, Terence J.

    2011-11-15

    Cyber analysts are tasked with the identification and mitigation of network exploits and threats. These compromises are difficult to identify due to the characteristics of cyber communication, the volume of traffic, and the duration of possible attack. It is necessary to have analytical tools to help analysts identify anomalies that span seconds, days, and weeks. Unfortunately, providing analytical tools effective access to the volumes of underlying data requires novel architectures, which is often overlooked in operational deployments. Our work is focused on a summary record of communication, called a flow. Flow records are intended to summarize a communication session between a source and a destination, providing a level of aggregation from the base data. Despite this aggregation, many enterprise network perimeter sensors store millions of network flow records per day. The volume of data makes analytics difficult, requiring the development of new techniques to efficiently identify temporal patterns and potential threats, and other characteristics of the data compound the problem. Within the billions of records of communication that transact, there are millions of distinct IP addresses involved, and characterizing patterns of entity behavior is very difficult with the vast number of entities that exist in the data. Research has struggled to validate a model for typical network behavior with hopes it will enable the identification of atypical behavior. Complicating matters more, analysts are typically only able to visualize and interact with fractions of the data and have the potential to miss long-term trends and behaviors. Our analysis approach focuses on aggregate views and visualization techniques to enable flexible and efficient data exploration as well as the capability to view trends over long periods of time. Realizing that interactively exploring summary data allowed analysts to effectively identify
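
    As a small illustration of the aggregate-view idea (the record fields and values below are hypothetical; real flows would come from a collector such as a NetFlow sensor), flow records can be rolled up per source address and hour so longer-term behavior becomes visible:

      from collections import defaultdict
      from datetime import datetime

      # Hypothetical flow records: (timestamp, src_ip, dst_ip, bytes).
      flows = [
          ("2011-11-15T00:12:07", "10.0.0.5", "192.0.2.9",    4200),
          ("2011-11-15T00:47:51", "10.0.0.5", "192.0.2.9",    3900),
          ("2011-11-15T13:02:33", "10.0.0.7", "198.51.100.2",  880),
      ]

      # Roll flows up into (src_ip, hour) buckets: flow count and total bytes.
      buckets = defaultdict(lambda: [0, 0])
      for ts, src, dst, nbytes in flows:
          hour = datetime.fromisoformat(ts).replace(minute=0, second=0)
          key = (src, hour.isoformat())
          buckets[key][0] += 1
          buckets[key][1] += nbytes

      for (src, hour), (n, total) in sorted(buckets.items()):
          print(f"{hour}  {src:12s}  flows={n}  bytes={total}")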

  11. [Critical reading of analytical observational studies].

    PubMed

    García Villar, C; Marín León, I

    2015-11-01

    Analytical observational studies provide very important information about real-life clinical practice and the natural history of diseases and can suggest causality. Furthermore, they are very common in scientific journals. The aim of this article is to review the main concepts necessary for the critical reading of articles about radiological studies with observational designs. It reviews the characteristics that case-control and cohort studies must have to ensure high quality. It explains a method of critical reading that involves checking the attributes that should be evaluated in each type of article using a structured list of specific questions. It underlines the main characteristics that confer credibility and confidence on the article evaluated. Readers are provided with tools for the critical analysis of the observational studies published in scientific journals. PMID:26123855

  12. An analytic performance model of disk arrays and its application

    NASA Technical Reports Server (NTRS)

    Lee, Edward K.; Katz, Randy H.

    1991-01-01

    As disk arrays become widely used, tools for understanding and analyzing their performance become increasingly important. In particular, performance models can be invaluable in both configuring and designing disk arrays. Accurate analytic performance models are preferable to other types of models because they can be quickly evaluated, are applicable under a wide range of system and workload parameters, and can be manipulated by a range of mathematical techniques. Unfortunately, analytical performance models of disk arrays are difficult to formulate due to the presence of queuing and fork-join synchronization; a disk array request is broken up into independent disk requests which must all complete to satisfy the original request. We develop, validate, and apply an analytic performance model for disk arrays. We derive simple equations for approximating their utilization, response time, and throughput. We then validate the analytic model via simulation and investigate the accuracy of each approximation used in deriving the analytical model. Finally, we apply the analytical model to derive an equation for the optimal unit of data striping in disk arrays.
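
    The paper's equations account for striping and fork-join synchronization; as a baseline for comparison, the standard single-queue (M/M/1) building blocks for one disk with request arrival rate \lambda and mean service time S are:

      U = \lambda S, \qquad R = \frac{S}{1 - U}, \qquad X = \lambda \quad (\lambda < 1/S)

    The fork-join behavior of an array makes the true response time larger than this single-queue R, since a striped request completes only when its slowest sub-request finishes.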

  13. Personal Visualization and Personal Visual Analytics.

    PubMed

    Huang, Dandan; Tory, Melanie; Aseniero, Bon Adriel; Bartram, Lyn; Bateman, Scott; Carpendale, Sheelagh; Tang, Anthony; Woodbury, Robert

    2015-03-01

    Data surrounds each and every one of us in our daily lives, ranging from exercise logs, to archives of our interactions with others on social media, to online resources pertaining to our hobbies. There is enormous potential for us to use these data to understand ourselves better and make positive changes in our lives. Visualization (Vis) and visual analytics (VA) offer substantial opportunities to help individuals gain insights about themselves, their communities and their interests; however, designing tools to support data analysis in non-professional life brings a unique set of research and design challenges. We investigate the requirements and research directions required to take full advantage of Vis and VA in a personal context. We develop a taxonomy of design dimensions to provide a coherent vocabulary for discussing personal visualization and personal visual analytics. By identifying and exploring clusters in the design space, we discuss challenges and share perspectives on future research. This work brings together research that was previously scattered across disciplines. Our goal is to call research attention to this space and engage researchers to explore the enabling techniques and technology that will support people to better understand data relevant to their personal lives, interests, and needs. PMID:26357073

  14. Performance Boosting Additive

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Mainstream Engineering Corporation was awarded Phase I and Phase II contracts from Goddard Space Flight Center's Small Business Innovation Research (SBIR) program in early 1990. With support from the SBIR program, Mainstream Engineering Corporation has developed a unique low cost additive, QwikBoost (TM), that increases the performance of air conditioners, heat pumps, refrigerators, and freezers. Because of the energy and environmental benefits of QwikBoost, Mainstream received the Tibbetts Award at a White House Ceremony on October 16, 1997. QwikBoost was introduced at the 1998 International Air Conditioning, Heating, and Refrigeration Exposition. QwikBoost is packaged in a handy 3-ounce can (pressurized with R-134a) and will be available for automotive air conditioning systems in summer 1998.

  15. Sewage sludge additive

    NASA Technical Reports Server (NTRS)

    Kalvinskas, J. J.; Mueller, W. A.; Ingham, J. D. (Inventor)

    1980-01-01

    The additive is for a raw sewage treatment process of the type where settling tanks are used to allow the suspended matter in the raw sewage to settle and to permit adsorption of the dissolved contaminants in the water. The sludge, which settles to the bottom of the settling tank, is extracted, pyrolyzed, and activated to form activated carbon and ash, which are mixed with the sewage prior to its introduction into the settling tank. The sludge does not provide all of the activated carbon and ash required for adequate treatment of the raw sewage, so carbon must be added to the process; instead of expensive commercial carbon, coal is used to provide the carbon supplement.

  16. Perspectives on Additive Manufacturing

    NASA Astrophysics Data System (ADS)

    Bourell, David L.

    2016-07-01

    Additive manufacturing (AM) has skyrocketed in visibility commercially and in the public sector. This article describes the development of this field from early layered manufacturing approaches of photosculpture, topography, and material deposition. Certain precursors to modern AM processes are also briefly described. The growth of the field over the last 30 years is presented. Included is the standard delineation of AM technologies into seven broad categories. The economics of AM part generation is considered, and the impacts of the economics on application sectors are described. On the basis of current trends, the future outlook will include a convergence of AM fabricators, mass-produced AM fabricators, enabling of topology optimization designs, and specialization in the AM legal arena. Long-term developments with huge impact are organ printing and volume-based printing.

  17. New addition curing polyimides

    NASA Technical Reports Server (NTRS)

    Frimer, Aryeh A.; Cavano, Paul

    1991-01-01

    In an attempt to improve the thermal-oxidative stability (TOS) of PMR-type polymers, the use of 1,4-phenylenebis(phenylmaleic anhydride), PPMA, was evaluated. Two series of nadic end-capped addition curing polyimides were prepared by imidizing PPMA with either 4,4'-methylene dianiline or p-phenylenediamine. The first resulted in improved solubility and increased resin flow, while the latter yielded a compression molded neat resin sample with a Tg of 408 C, close to 70 C higher than PMR-15. The performance of these materials in long term weight loss studies was below that of PMR-15, independent of post-cure conditions. These results can be rationalized in terms of the thermal lability of the pendant phenyl groups and the incomplete imidization of the sterically congested PPMA. The preparation of model compounds as well as future research directions are discussed.

  18. Analytical gradients for excitation energies from frozen-density embedding.

    PubMed

    Kovyrshin, Arseny; Neugebauer, Johannes

    2016-08-21

    The formulation of analytical excitation-energy gradients from time-dependent density functional theory within the frozen-density embedding framework is presented. In addition to a comprehensive mathematical derivation, we discuss details of the numerical implementation in the Slater-function based Amsterdam Density Functional (ADF) program. Particular emphasis is put on the consistency in the use of approximations for the evaluation of second- and third-order non-additive kinetic-energy and exchange-correlation functional derivatives appearing in the final expression for the excitation-energy gradient. We test the implementation for different chemical systems in which molecular excited-state potential-energy curves are affected by another subsystem. It is demonstrated that the analytical implementation for the evaluation of excitation-energy gradients yields results in close agreement with data from numerical differentiation. In addition, we show that our analytical results are numerically more stable and thus preferable over the numerical ones. PMID:26996970
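
    The analytic-versus-numerical comparison described above can be mimicked generically: the sketch below (with a stand-in energy function, not an ADF excitation energy) checks a hand-derived gradient against central finite differences, the same kind of validation reported in the paper.

      import numpy as np

      def energy(x):
          """Stand-in for an excitation energy as a function of coordinates."""
          return np.sum(x**2) + 0.3 * np.sin(x[0] * x[1])

      def analytic_gradient(x):
          """Hand-derived gradient of the stand-in energy."""
          g = 2.0 * x
          g[0] += 0.3 * x[1] * np.cos(x[0] * x[1])
          g[1] += 0.3 * x[0] * np.cos(x[0] * x[1])
          return g

      def numerical_gradient(f, x, h=1e-5):
          """Central finite differences, the reference for the analytic result."""
          g = np.zeros_like(x)
          for i in range(len(x)):
              e = np.zeros_like(x)
              e[i] = h
              g[i] = (f(x + e) - f(x - e)) / (2 * h)
          return g

      x = np.array([0.7, -1.2, 0.4])
      print(np.max(np.abs(analytic_gradient(x) - numerical_gradient(energy, x))))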

  19. IT vendor selection model by using structural equation model & analytical hierarchy process

    NASA Astrophysics Data System (ADS)

    Maitra, Sarit; Dominic, P. D. D.

    2012-11-01

    Selecting and evaluating the right vendors is imperative for an organization's global marketplace competitiveness. Improper selection and evaluation of potential vendors can dwarf an organization's supply chain performance. Numerous studies have demonstrated that firms consider multiple criteria when selecting key vendors. This research intends to develop a new hybrid model for the vendor selection process with better decision making. The proposed model provides a suitable tool for assisting decision makers and managers to make the right decisions and select the most suitable vendor. This paper proposes a hybrid model based on the Structural Equation Model (SEM) and the Analytical Hierarchy Process (AHP) for long-term strategic vendor selection problems. The five-step framework of the model was designed after a thorough literature study. The proposed hybrid model will be applied to a real-life case study to assess its effectiveness. In addition, a what-if analysis technique will be used for model validation purposes.
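
    To make the AHP half of the hybrid concrete, here is a sketch of the standard eigenvector method (generic AHP, not the paper's actual criteria or judgments): priority weights are the normalized principal eigenvector of a pairwise-comparison matrix, and Saaty's consistency ratio flags incoherent judgments.

      import numpy as np

      def ahp_weights(pairwise):
          """Priority weights from the principal eigenvector of a
          pairwise-comparison matrix, plus Saaty's consistency ratio."""
          A = np.asarray(pairwise, dtype=float)
          n = A.shape[0]
          eigvals, eigvecs = np.linalg.eig(A)
          k = np.argmax(eigvals.real)
          w = np.abs(eigvecs[:, k].real)
          w /= w.sum()
          ci = (eigvals[k].real - n) / (n - 1)        # consistency index
          ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]         # Saaty's random index
          return w, ci / ri

      # Illustrative judgments over three criteria (e.g. cost, quality, support).
      A = [[1.0, 3.0, 5.0],
           [1/3, 1.0, 3.0],
           [1/5, 1/3, 1.0]]
      w, cr = ahp_weights(A)
      print("weights:", np.round(w, 3), " consistency ratio:", round(cr, 3))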

  20. A guide to genome-wide association analysis and post-analytic interrogation.

    PubMed

    Reed, Eric; Nunez, Sara; Kulp, David; Qian, Jing; Reilly, Muredach P; Foulkes, Andrea S

    2015-12-10

    This tutorial is a learning resource that outlines the basic process and provides specific software tools for implementing a complete genome-wide association analysis. Approaches to post-analytic visualization and interrogation of potentially novel findings are also presented. Applications are illustrated using the free and open-source R statistical computing and graphics software environment, Bioconductor software for bioinformatics and the UCSC Genome Browser. Complete genome-wide association data on 1401 individuals across 861,473 typed single nucleotide polymorphisms from the PennCATH study of coronary artery disease are used for illustration. All data and code, as well as additional instructional resources, are publicly available through the Open Resources in Statistical Genomics project: http://www.stat-gen.org. PMID:26343929
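
    The tutorial itself works in R and Bioconductor; as a language-agnostic illustration of the core per-SNP step, a basic allelic association test on case/control genotype counts can be run with a chi-square test (the counts below are made up):

      from scipy.stats import chi2_contingency

      # Hypothetical genotype counts for one SNP, ordered (AA, Aa, aa).
      cases    = [60, 90, 50]
      controls = [90, 85, 25]

      def allele_counts(genotypes):
          """Collapse genotype counts into (A, a) allele counts."""
          hom_ref, het, hom_alt = genotypes
          return [2 * hom_ref + het, 2 * hom_alt + het]

      table = [allele_counts(cases), allele_counts(controls)]
      chi2, pval, dof, _ = chi2_contingency(table)
      print(f"chi2 = {chi2:.2f}, p = {pval:.3e}")  # repeated for each typed SNP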