Science.gov

Sample records for additional analytical tools

  1. Analytic tools for information warfare

    SciTech Connect

    Vandewart, R.L.; Craft, R.L.

    1996-05-01

    Information warfare and system surety (tradeoffs between system functionality, security, safety, reliability, cost, usability) have many mechanisms in common. Sandia's experience has shown that an information system must be assessed from a system perspective in order to adequately identify and mitigate the risks present in the system. While some tools are available to help in this work, the process is largely manual. An integrated, extensible set of assessment tools would help the surety analyst. This paper describes one approach to surety assessment used at Sandia, identifies the difficulties in this process, and proposes a set of features desirable in an automated environment to support this process.

  2. Analytical Web Tool for CERES Products

    NASA Astrophysics Data System (ADS)

    Mitrescu, C.; Chu, C.; Doelling, D.

    2012-12-01

    The CERES project provides the community with climate-quality observed TOA fluxes, consistent cloud properties, and computed profile and surface fluxes. The 11-year-long data set has proven invaluable to the remote sensing and climate modeling communities for studies of annual global mean energy, meridional heat transport, consistent clouds and fluxes, and climate trends. Moreover, a broader audience interested in Earth's radiative properties, such as green energy, health, and environmental companies, has shown interest in CERES-derived products. A few years ago, the CERES team started developing a new web-based Ordering Tool tailored to this wide diversity of users. Recognizing the potential that web-2.0 technologies offer for both Quality Control (QC) and scientific data visualization and manipulation, the CERES team began introducing a series of specialized functions to address these needs. Within an attractive, easy-to-use, modern web-based interface, the Ordering Tool added the following analytical functions: i) 1-D histograms that display the distribution of a data field to identify outliers, useful for QC purposes; ii) an "anomaly" map that shows the regional differences between the current month and the climatological monthly mean; iii) a 2-D histogram that can either identify potential problems with the data (i.e., a QC function) or provide a global view of trends and/or correlations between various CERES flux, cloud, aerosol, and atmospheric properties. The large volume and diversity of data, together with on-the-fly execution, were the main challenges that had to be tackled. Depending on the application, execution is performed on either the browser side or the server side with the help of auxiliary files. Additional challenges came from the use of various open-source applications, the multitude of CERES products, and the seamless transition from previous development. For the future, we plan to expand the analytical capabilities of the…
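
    The anomaly and histogram functions described above reduce to simple array operations. The sketch below illustrates the underlying computations on gridded monthly means, assuming NumPy arrays as a stand-in; the Ordering Tool's actual browser/server implementation is not described in the abstract.

    ```python
    import numpy as np

    def anomaly_map(current_month, monthly_climatology):
        """Regional anomaly: current month minus the climatological monthly mean."""
        return current_month - monthly_climatology

    def qc_histogram(field, bins=100):
        """1-D histogram of a data field; counts far from the bulk of the
        distribution flag potential outliers for QC purposes."""
        counts, edges = np.histogram(field[np.isfinite(field)], bins=bins)
        return counts, edges

    # Toy example: July TOA flux anomaly against an 11-year July climatology.
    rng = np.random.default_rng(0)
    july_fluxes = rng.normal(240.0, 5.0, size=(11, 180, 360))  # W/m^2, 1-deg grid
    climatology = july_fluxes.mean(axis=0)
    anomaly = anomaly_map(july_fluxes[-1], climatology)
    counts, edges = qc_histogram(july_fluxes[-1])
    ```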

  3. Analytical Tools for Behavioral Influences Operations

    DTIC Science & Technology

    2003-12-01

    NASIC's Investment in Analytical Capabilities … Study Limitations … early guidance and organizational contacts to get us started in this effort. They provided a basic cultural/institutional/psychological framework for the … get started. This project is envisioned as a foundation for future work by NASIC analysts. They will use the tools identified in this study to…

  4. Analytical tool requirements for power system restoration

    SciTech Connect

    Adibi, M.M.; Borkoski, J.N.; Kafka, R.J.

    1994-08-01

    This paper is one of a series presented by the Power System Restoration Working Group (SRWG) on behalf of the System Operation Subcommittee, with the intent of focusing industry attention on power system restoration. In this paper a set of analytical tools is specified which together describe the static, transient, and dynamic behavior of a power system during restoration. These tools are identified and described for restoration planning, training, and operation. Their applications cover all stages of restoration, including the pre-disturbance condition, post-disturbance status, the post-restoration target system, and minimization of unserved loads. The paper draws on previous reports by the SRWG.

  5. Analytical tools and isolation of TOF events

    NASA Technical Reports Server (NTRS)

    Wolf, H.

    1974-01-01

    Analytical tools are presented in two reports. The first is a probability analysis of the orbital distribution of events in relation to the dust flux density observed in the Pioneer 8 and 9 distributions. A distinction is drawn between asymmetries caused by random fluctuations and systematic variations, by calculating the probability of any particular asymmetry. The second article discusses particle trajectories in a repulsive force field. The force on a particle due to solar radiation pressure is directed outward along the particle's radius vector from the sun and is inversely proportional to the square of its distance from the sun. Equations of motion which describe both solar radiation pressure and gravitational attraction are presented.
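
    In the standard formulation of this problem (spelled out here for the reader; the abstract itself gives no equations), both solar gravity and radiation pressure scale as the inverse square of heliocentric distance, so their constant ratio β yields a Keplerian equation of motion with reduced effective gravity:

    \[
    \ddot{\mathbf{r}} = -\frac{G M_\odot\,(1-\beta)}{r^{3}}\,\mathbf{r},
    \qquad
    \beta = \frac{F_{\mathrm{rad}}}{F_{\mathrm{grav}}},
    \]

    so that grains with β > 1 experience a net repulsive force and follow hyperbolic trajectories away from the sun.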

  6. Additive manufacturing of tools for lapping glass

    NASA Astrophysics Data System (ADS)

    Williams, Wesley B.

    2013-09-01

    Additive manufacturing technologies have the ability to directly produce parts with complex geometries without the need for secondary processes, tooling or fixtures. This ability was used to produce concave lapping tools with a VFlash 3D printer from 3D Systems. The lapping tools were first designed in Creo Parametric with a defined constant radius and radial groove pattern. The models were converted to stereolithography files which the VFlash used in building the parts, layer by layer, from a UV curable resin. The tools were rotated at 60 rpm and used with 120 grit and 220 grit silicon carbide lapping paste to lap 0.750" diameter fused silica workpieces. The samples developed a matte appearance on the lapped surface that started as a ring at the edge of the workpiece and expanded to the center. This indicated that as material was removed, the workpiece radius was beginning to match the tool radius. The workpieces were then cleaned and lapped on a second tool (with equivalent geometry) using a 3000 grit corundum aluminum oxide lapping paste, until a near specular surface was achieved. By using lapping tools that have been additively manufactured, fused silica workpieces can be lapped to approach a specified convex geometry. This approach may enable more rapid lapping of near net shape workpieces that minimize the material removal required by subsequent polishing. This research may also enable development of new lapping tool geometry and groove patterns for improved loose abrasive finishing.

  7. Cryogenic Propellant Feed System Analytical Tool Development

    NASA Technical Reports Server (NTRS)

    Lusby, Brian S.; Miranda, Bruno M.; Collins, Jacob A.

    2011-01-01

    The Propulsion Systems Branch at NASA's Lyndon B. Johnson Space Center (JSC) has developed a parametric analytical tool to address the need to rapidly predict heat leak into propellant distribution lines based on insulation type, installation technique, line supports, penetrations, and instrumentation. The Propellant Feed System Analytical Tool (PFSAT) will also determine the optimum orifice diameter for an optional thermodynamic vent system (TVS) to counteract heat leak into the feed line and ensure temperature constraints at the end of the feed line are met. PFSAT was developed primarily in Fortran 90 because of its numerical performance and its ability to directly access the real fluid property subroutines in the Reference Fluid Thermodynamic and Transport Properties (REFPROP) database developed by NIST. A Microsoft Excel front-end user interface was implemented to provide convenient portability of PFSAT among a wide variety of potential users and to take advantage of a user-friendly graphical user interface (GUI) developed in Visual Basic for Applications (VBA). The focus of PFSAT is on-orbit reaction control systems and orbital maneuvering systems, but it may be used to predict heat leak into ground-based transfer lines as well. PFSAT is expected to be used for rapid initial design of cryogenic propellant distribution lines and thermodynamic vent systems. Once validated, PFSAT will support concept trades for a variety of cryogenic fluid transfer systems on spacecraft, including planetary landers, transfer vehicles, and propellant depots, as well as surface-based transfer systems. The details of the development of PFSAT, its user interface, and the program structure will be presented.
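
    As a rough illustration of the kind of budget PFSAT automates, the heat leak into a line can be approximated as conduction through supports and penetrations plus radiant leakage through the insulation. This is a minimal sketch under simplified assumptions; the parameter names, values, and correlations are illustrative only and not taken from PFSAT.

    ```python
    # Hypothetical simplified heat-leak budget for a cryogenic feed line.
    # Illustrative only: PFSAT's actual models and REFPROP property calls
    # are far more detailed.

    def conductive_leak(k, area, length, t_hot, t_cold):
        """Fourier conduction through one support or penetration, in W."""
        return k * area * (t_hot - t_cold) / length

    def mli_radiant_leak(q_flux, surface_area):
        """Radiant leak through multilayer insulation, from an assumed
        effective heat flux in W/m^2, in W."""
        return q_flux * surface_area

    supports = 6 * conductive_leak(k=0.25, area=1e-4, length=0.05,
                                   t_hot=300.0, t_cold=90.0)    # composite standoffs
    insulation = mli_radiant_leak(q_flux=1.0, surface_area=0.8)  # ~1 W/m^2 MLI
    print(f"Estimated heat leak: {supports + insulation:.2f} W")
    ```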

  8. ANALYTICAL TOOL DEVELOPMENT FOR AFTERTREATMENT SUB-SYSTEMS INTEGRATION

    SciTech Connect

    Bolton, B; Fan, A; Goney, K; Pavlova-MacKinnon, Z; Sisken, K; Zhang, H

    2003-08-24

    The stringent emissions standards of 2007 and beyond require complex engine, aftertreatment, and vehicle systems with a high degree of sub-system interaction and flexible control solutions. This necessitates a system-based approach to technology development, in addition to individual sub-system optimization. Analytical tools can provide an effective means to evaluate and develop such complex technology interactions, as well as to understand phenomena that are either too expensive or impossible to study with conventional experimental means. The analytical effort can also guide experimental development and thus lead to efficient utilization of available experimental resources. A suite of analytical models has been developed to represent PM and NOx aftertreatment sub-systems. These models range from computationally inexpensive zero-dimensional models for real-time control applications to CFD-based, multi-dimensional models with detailed temporal and spatial resolution. Such models, in conjunction with well-established engine modeling tools such as engine cycle simulation, engine controls modeling, CFD models of non-combusting and combusting flow, and vehicle models, provide a comprehensive analytical toolbox for complete engine, aftertreatment, and vehicle sub-system development and system integration applications. However, the fidelity of aftertreatment models and their application going forward is limited by the lack of fundamental kinetic data.
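
    To illustrate the computationally inexpensive end of that model range, a zero-dimensional catalyst model reduces to a single well-mixed species balance with an Arrhenius rate. This is a generic sketch, not one of the report's sub-models; all parameter values are hypothetical.

    ```python
    import math

    def zero_d_catalyst_outlet(c_in, temp_k, tau, k0, ea, t_end=1.0, dt=1e-3):
        """Well-mixed (0-D) catalyst: dC/dt = (C_in - C)/tau - k(T)*C,
        with first-order Arrhenius rate k(T) = k0 * exp(-Ea / (R*T))."""
        R = 8.314  # J/(mol K)
        k = k0 * math.exp(-ea / (R * temp_k))
        c, t = c_in, 0.0
        while t < t_end:
            c += dt * ((c_in - c) / tau - k * c)  # explicit Euler step
            t += dt
        return c

    c_out = zero_d_catalyst_outlet(c_in=500e-6, temp_k=550.0, tau=0.05,
                                   k0=1e8, ea=6.0e4)
    print(f"NOx conversion: {1 - c_out / 500e-6:.1%}")
    ```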

  9. A Qualitative Evaluation of Evolution of a Learning Analytics Tool

    ERIC Educational Resources Information Center

    Ali, Liaqat; Hatala, Marek; Gasevic, Dragan; Jovanovic, Jelena

    2012-01-01

    LOCO-Analyst is a learning analytics tool we developed to provide educators with feedback on students' learning activities and performance. Evaluation of the first version of the tool led to the enhancement of the tool's data visualization, user interface, and supported feedback types. The second evaluation of the improved tool allowed us to see…

  10. ANALYTICAL TOOLS FOR GROUNDWATER POLLUTION ASSESSMENT

    EPA Science Inventory

    This paper deals with the development of analytical screening-exposure models (indices) and their potential application to regulate the use of hazardous chemicals and the design of groundwater buffer strips. The indices describe the leaching of solutes below the root zone (mass f...

  11. Analytical Tools for Space Suit Design

    NASA Technical Reports Server (NTRS)

    Aitchison, Lindsay

    2011-01-01

    As indicated by the implementation of multiple small project teams within the agency, NASA is adopting a lean approach to hardware development that emphasizes quick product realization and rapid response to shifting program and agency goals. Over the past two decades, space suit design has been evolutionary in approach, with emphasis on building prototypes and then testing with the largest practical range of subjects possible. The results of these efforts show continuous improvement, but they make scaled design and performance predictions almost impossible with limited budgets and little time. Thus, in an effort to start changing the way NASA approaches space suit design and analysis, the Advanced Space Suit group has initiated the development of an integrated design and analysis tool. It is a multi-year, if not decadal, development effort that, when fully implemented, is envisioned to generate analyses of any given space suit architecture or, conversely, predictions of ideal space suit architectures given specific mission parameters. The master tool will exchange information to and from a set of five sub-tool groups in order to generate the desired output. The basic functions of each sub-tool group, the initial relationships between the sub-tools, and a comparison to state-of-the-art software and tools are discussed.

  12. Analytical tools for groundwater pollution assessment

    SciTech Connect

    Hantush, M.M.; Islam, M.R.; Marino, M.A.

    1998-06-01

    This paper deals with the development of analytical screening-exposure models (indices) and their potential application to regulate the use of hazardous chemicals and the design of ground water buffer strips. The indices describe the leaching of solutes below the root zone (mass fraction), emissions to the water table, and mass fraction of the contaminant intercepted by a well or a surface water body.
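
    Screening indices of this kind are often expressed as an attenuation factor. For orientation, a representative form from the pesticide-leaching screening literature (not necessarily the exact index derived in this paper) is

    \[
    \mathrm{AF} = \exp(-k\,t_r),
    \qquad
    t_r = \frac{L\,\theta_{FC}\,R_F}{q},
    \qquad
    k = \frac{\ln 2}{t_{1/2}},
    \]

    where AF is the mass fraction of applied solute leached below the root zone, t_r the travel time through a root zone of depth L with field-capacity water content \theta_{FC} and retardation factor R_F, q the net recharge rate, and t_{1/2} the degradation half-life.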

  13. Guidance for the Design and Adoption of Analytic Tools.

    SciTech Connect

    Bandlow, Alisa

    2015-12-01

    The goal is to make software developers aware of common issues that can impede the adoption of analytic tools. This paper provides a summary of guidelines, lessons learned, and existing research to explain what is currently known about what analysts want, and how to better understand which tools they do and do not need.

  14. Electronic tongue: An analytical gustatory tool.

    PubMed

    Latha, Rewanthwar Swathi; Lakshmi, P K

    2012-01-01

    Taste is an important organoleptic property governing acceptance of products administered through the mouth. However, the majority of available drugs are bitter in taste. For patient acceptability and compliance, bitter-tasting drugs are masked by adding several flavoring agents. Thus, taste assessment is an important quality control parameter for evaluating taste-masked formulations. The primary method for taste measurement of drug substances and formulations is by human panelists. The use of sensory panelists in industry is difficult and problematic due to the potential toxicity of drugs and the subjectivity of taste panelists; recruiting panelists, maintaining motivation, and panel upkeep are significantly more difficult when working with unpleasant products. Furthermore, Food and Drug Administration (FDA)-unapproved molecules cannot be tested. Therefore, an analytical taste-sensing multichannel sensory system, called the electronic tongue (e-tongue or artificial tongue), which can assess taste, has been replacing sensory panelists. The e-tongue thus offers benefits such as reduced reliance on human panels. The present review focuses on the electrochemical concepts in instrumentation, performance qualification of the e-tongue, and applications in various fields.

  15. Deriving Earth Science Data Analytics Tools/Techniques Requirements

    NASA Astrophysics Data System (ADS)

    Kempler, S. J.

    2015-12-01

    Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, even when applicable to Earth science, have targeted the business industry; in fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of a growing number and volume of Earth science data sets has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end-goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  16. Analytical Tools for Cloudscope Ice Measurement

    NASA Technical Reports Server (NTRS)

    Arnott, W. Patrick

    1998-01-01

    The cloudscope is a ground or aircraft instrument for viewing ice crystals impacted on a sapphire window. It is essentially a simple optical microscope with an attached compact CCD video camera whose output is recorded on a Hi-8 mm video cassette recorder equipped with digital time and date recording capability. In aircraft operation the window is at a stagnation point of the flow, so adiabatic compression heats the window to sublimate the ice crystals so that later impacting crystals can be imaged as well. A film heater is used for ground-based operation to provide sublimation, and it can also be used to provide extra heat for aircraft operation. The compact video camera can be focused manually by the operator, and a beam-splitter and miniature-bulb combination provides illumination for night operation. Several shutter speeds are available to accommodate daytime illumination conditions in direct sunlight. The video images can be used directly to qualitatively assess the crystal content of cirrus clouds and contrails. Quantitative size spectra are obtained with the tools described in this report. Selected portions of the video images are digitized using a PCI-bus frame grabber to form a short movie segment or stack using NIH (National Institutes of Health) Image software with custom macros developed at DRI. The stack can be Fourier-transform filtered with custom, easy-to-design filters to reduce most objectionable video artifacts. Particle quantification of each slice of the stack is performed using digital image analysis. Data recorded for each particle include particle number and centroid, frame number in the stack, particle area, perimeter, equivalent-ellipse maximum and minimum radii, ellipse angle, and pixel number. Each valid particle in the stack is stamped with a unique number. This output can be used to obtain a semiquantitative appreciation of the crystal content. The particle information becomes the raw input for a subsequent program (FORTRAN) that…
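
    The particle-quantification step described above maps directly onto standard image-analysis routines. A minimal modern equivalent of the NIH Image macro workflow, assuming scikit-image (the original used NIH Image with custom DRI macros), might look like this:

    ```python
    import numpy as np
    from skimage import filters, measure

    def quantify_particles(frame, frame_number):
        """Label crystals in one digitized video slice and record, per particle:
        centroid, area, perimeter, equivalent-ellipse radii, and ellipse angle."""
        binary = frame > filters.threshold_otsu(frame)  # segment bright crystals
        labels = measure.label(binary)
        records = []
        for p in measure.regionprops(labels):
            records.append({
                "frame": frame_number,
                "particle": p.label,                    # unique particle stamp
                "centroid": p.centroid,
                "area_px": p.area,
                "perimeter_px": p.perimeter,
                "ellipse_max_radius": p.major_axis_length / 2,
                "ellipse_min_radius": p.minor_axis_length / 2,
                "ellipse_angle_rad": p.orientation,
            })
        return records

    frame = np.random.rand(480, 640)  # stand-in for one slice of the stack
    print(len(quantify_particles(frame, frame_number=0)))
    ```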

  17. Chemometrics tools used in analytical chemistry: an overview.

    PubMed

    Kumar, Naveen; Bansal, Ankit; Sarma, G S; Rawal, Ravindra K

    2014-06-01

    This article presents various important chemometric tools utilized for evaluating the data generated by various hyphenated analytical techniques, covering their applications from their advent to today. The work is divided into sections that include various multivariate regression methods and multivariate resolution methods; the final section deals with the applicability of chemometric tools in analytical chemistry. The main objective of this article is to review the chemometric methods used in analytical chemistry (qualitative/quantitative) to determine elution sequences, classify various data sets, assess peak purity, and estimate the number of chemical components. These reviewed methods can further be used for treating n-way data obtained by hyphenation of LC with multi-channel detectors. We provide a detailed view of the various important methods, together with their algorithms, to help researchers not very familiar with chemometrics understand and employ them.
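
    As a concrete instance of the multivariate regression methods the review covers, the sketch below fits a partial least squares (PLS) model to synthetic spectra with scikit-learn. It is illustrative only; the review itself does not prescribe an implementation.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    X = rng.normal(size=(60, 200))            # 60 spectra x 200 wavelengths
    coefs = np.zeros(200)
    coefs[40:45] = 1.0                        # a narrow absorbing band
    y = X @ coefs + rng.normal(scale=0.1, size=60)  # analyte concentration

    pls = PLSRegression(n_components=3)
    r2 = cross_val_score(pls, X, y, cv=5, scoring="r2").mean()
    print(f"cross-validated R^2: {r2:.3f}")
    ```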

  18. Using Visual Analytics Tool for Improving Data Comprehension

    ERIC Educational Resources Information Center

    Géryk, Jan

    2015-01-01

    The efficacy of animated data visualizations in comparison with static data visualizations is still inconclusive. Some research suggests that the failure to find benefits of animation may relate to the way animations are constructed and perceived. In this paper, we present a visual analytics (VA) tool which makes use of enhanced animated…

  19. Adequacy of surface analytical tools for studying the tribology of ceramics

    NASA Technical Reports Server (NTRS)

    Sliney, H. E.

    1986-01-01

    Surface analytical tools are very beneficial in tribological studies of ceramics. Traditional methods of optical microscopy, XRD, XRF, and SEM should be combined with newer surface-sensitive techniques, especially AES and XPS. ISS and SIMS can also be useful in providing additional composition details. Tunneling microscopy and electron energy loss spectroscopy are less known techniques that may also prove useful.

  20. 33 CFR 385.33 - Revisions to models and analytical tools.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... system-wide simulation models and analytical tools used in the evaluation and assessment of projects, and shall propose improvements in system-wide models and analytical tools required for the evaluation and... Incorporating New Information Into the Plan § 385.33 Revisions to models and analytical tools. (a) In...

  1. 33 CFR 385.33 - Revisions to models and analytical tools.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... system-wide simulation models and analytical tools used in the evaluation and assessment of projects, and shall propose improvements in system-wide models and analytical tools required for the evaluation and... Incorporating New Information Into the Plan § 385.33 Revisions to models and analytical tools. (a) In...

  2. 33 CFR 385.33 - Revisions to models and analytical tools.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... system-wide simulation models and analytical tools used in the evaluation and assessment of projects, and shall propose improvements in system-wide models and analytical tools required for the evaluation and... Incorporating New Information Into the Plan § 385.33 Revisions to models and analytical tools. (a) In...

  3. Trial analytics--a tool for clinical trial management.

    PubMed

    Bose, Anindya; Das, Suman

    2012-01-01

    Prolonged timelines and large expenses associated with clinical trials have prompted a new focus on improving the operational efficiency of clinical trials through Clinical Trial Management Systems (CTMS), in order to improve managerial control of trial conduct. However, current CTMS are not able to meet expectations due to various shortcomings, such as the inability to provide timely reporting and trend visualization within and beyond an organization. To overcome these shortcomings of CTMS, clinical researchers can apply a business intelligence (BI) framework to create Clinical Research Intelligence (CLRI) for optimization of data collection and analytics. This paper proposes the use of an innovative and collaborative visualization tool (CTA) as a CTMS add-on to help overcome these deficiencies of traditional CTMS, with suitable examples.

  4. Enabling analytical and Modeling Tools for Enhanced Disease Surveillance

    SciTech Connect

    Dawn K. Manley

    2003-04-01

    Early detection, identification, and warning are essential to minimize casualties from a biological attack. For covert attacks, sick people are likely to provide the first indication of an attack. An enhanced medical surveillance system that synthesizes distributed health indicator information and rapidly analyzes the information can dramatically increase the number of lives saved. Current surveillance methods to detect both biological attacks and natural outbreaks are hindered by factors such as distributed ownership of information, incompatible data storage and analysis programs, and patient privacy concerns. Moreover, because data are not widely shared, few data mining algorithms have been tested on and applied to diverse health indicator data. This project addressed both integration of multiple data sources and development and integration of analytical tools for rapid detection of disease outbreaks. As a first prototype, we developed an application to query and display distributed patient records. This application incorporated need-to-know access control and incorporated data from standard commercial databases. We developed and tested two different algorithms for outbreak recognition. The first is a pattern recognition technique that searches for space-time data clusters that may signal a disease outbreak. The second is a genetic algorithm to design and train neural networks (GANN) that we applied toward disease forecasting. We tested these algorithms against influenza, respiratory illness, and dengue fever data. Through this LDRD, in combination with other internal funding, we delivered a distributed simulation capability to synthesize disparate information and models for earlier recognition and improved decision-making in the event of a biological attack. The architecture incorporates user feedback and control so that a user's decision inputs can impact the scenario outcome, as well as integrated security and role-based access control for communicating between…
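
    The first algorithm class mentioned, searching for space-time data clusters, can be illustrated with a simple Poisson exceedance test on windowed counts. This is a toy sketch; the project's actual pattern-recognition and GANN implementations are not described in the abstract.

    ```python
    import numpy as np
    from scipy.stats import poisson

    def flag_outbreak_windows(daily_counts, baseline_rate, window=7, alpha=1e-3):
        """Flag sliding time windows whose case counts are improbably high
        under a Poisson baseline (small p-value => candidate outbreak)."""
        counts = np.asarray(daily_counts)
        flags = []
        for start in range(len(counts) - window + 1):
            observed = counts[start:start + window].sum()
            expected = baseline_rate * window
            p_value = poisson.sf(observed - 1, expected)  # P(X >= observed)
            if p_value < alpha:
                flags.append((start, int(observed), p_value))
        return flags

    rng = np.random.default_rng(2)
    series = rng.poisson(5, size=60)   # background: ~5 cases/day
    series[40:47] += 12                # injected week-long outbreak
    print(flag_outbreak_windows(series, baseline_rate=5.0))
    ```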

  5. Additive Manufacturing of Tooling for Refrigeration Cabinet Foaming Processes

    SciTech Connect

    Post, Brian K; Nuttall, David; Cukier, Michael; Hile, Michael

    2016-07-29

    The primary objective of this project was to leverage the Big Area Additive Manufacturing (BAAM) process and materials into a long-term, quick-change tooling concept to drastically reduce product lead and development timelines and costs. Current refrigeration foam molds are complicated to manufacture, involving casting several aluminum parts in an approximate shape, machining components of the molds, and post-fitting and shimming of the parts in an articulated fixture. The total process timeline can take over 6 months. The foaming process is slower than required for production; therefore, multiple fixtures (10 to 27) are required per refrigerator model. Molds are particular to a specific product configuration, making mixed-model assembly challenging for sequencing, mold changes, or auto-changeover features. The initial goal was to leverage the ORNL materials and additive process to build a tool in 4 to 6 weeks or less. A secondary goal was to create common fixture cores and provide lightweight fixture sections that could be revised in a very short time to increase equipment flexibility, reduce lead times, lower the barriers to first production trials, and reduce tooling costs.

  6. Investigating Analytic Tools for e-Book Design in Early Literacy Learning

    ERIC Educational Resources Information Center

    Roskos, Kathleen; Brueck, Jeremy; Widman, Sarah

    2009-01-01

    Toward the goal of better e-book design to support early literacy learning, this study investigates analytic tools for examining design qualities of e-books for young children. Three research-based analytic tools related to e-book design were applied to a mixed genre collection of 50 e-books from popular online sites. Tool performance varied…

  7. Enhancing simulation of efficiency with analytical tools. [combining computer simulation and analytical techniques for cost reduction

    NASA Technical Reports Server (NTRS)

    Seltzer, S. M.

    1974-01-01

    Some means of combining both computer simulation and analytical techniques are indicated in order to mutually enhance their efficiency as design tools and to motivate those involved in engineering design to consider using such combinations. While the idea is not new, heavy reliance on computers often seems to overshadow the potential utility of analytical tools. Although the example used is drawn from the area of dynamics and control, the principles espoused are applicable to other fields. In the example, the parameter plane stability analysis technique is described briefly and extended beyond that reported in the literature to increase its utility (through a simple set of recursive formulas) and its applicability (through the portrayal of the effect of varying the sampling period of the computer). The numerical values that were rapidly selected by analysis were found to be correct for the hybrid computer simulation for which they were needed. This obviated the need for cut-and-try methods to choose the numerical values, thereby saving both time and computer utilization.
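
    For readers unfamiliar with the parameter plane technique, its core step in the standard formulation (the paper's recursive extensions are not reproduced in the abstract) is to write the characteristic polynomial as linear in two design parameters α and β, substitute s = jω, and split into real and imaginary conditions:

    \[
    \delta(s) = \alpha\,A(s) + \beta\,B(s) + C(s) = 0, \qquad s = j\omega,
    \]
    \[
    \begin{aligned}
    \alpha\,\operatorname{Re}A(j\omega) + \beta\,\operatorname{Re}B(j\omega) &= -\operatorname{Re}C(j\omega),\\
    \alpha\,\operatorname{Im}A(j\omega) + \beta\,\operatorname{Im}B(j\omega) &= -\operatorname{Im}C(j\omega).
    \end{aligned}
    \]

    Solving this 2x2 linear system for each ω traces the stability boundary curve (α(ω), β(ω)) in the parameter plane, from which stable parameter regions can be read off directly.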

  8. Analytical tools for the analysis of fire debris. A review: 2008-2015.

    PubMed

    Martín-Alberca, Carlos; Ortega-Ojeda, Fernando Ernesto; García-Ruiz, Carmen

    2016-07-20

    The analysis of fire debris evidence can offer crucial information to a forensic investigation when, for instance, there is suspicion of the intentional use of ignitable liquids to initiate a fire. Although evidence analysis in the laboratory is mainly conducted by a handful of well-established methodologies, during the last eight years several authors have proposed noteworthy improvements to these methodologies, suggesting interesting new approaches. This review critically outlines the most up-to-date and suitable tools for the analysis and interpretation of fire debris evidence. The survey of analytical tools covers work published in the 2008-2015 period. It includes sources of consensus-classified reference samples, current standard procedures, new proposals for sample extraction and analysis, and the most novel statistical tools. In addition, this review provides relevant knowledge on the distortion effects on ignitable liquid chemical fingerprints, which have to be considered during interpretation of results.

  9. MATRIICES - Mass Analytical Tool for Reactions in Interstellar ICES

    NASA Astrophysics Data System (ADS)

    Isokoski, K.; Bossa, J. B.; Linnartz, H.

    2011-05-01

    The formation of complex organic molecules (COMs) observed in the inter- and circumstellar medium (ISCM) is driven by a complex chemical network yet to be fully characterized. Interstellar dust grains and the surrounding ice mantles, subject to atom bombardment, UV irradiation, and thermal processing, are believed to provide catalytic sites for such chemistry. However, the solid-state chemical processes and the level of complexity reachable under astronomical conditions remain poorly understood. The conventional laboratory techniques used to characterize the solid-state reaction pathways, RAIRS (Reflection Absorption IR Spectroscopy) and TPD (Temperature-Programmed Desorption), are suitable for the analysis of reactions in ices made of relatively small molecules. For more complex ices comprising a series of different components, as relevant to the interstellar medium, spectral overlap prohibits unambiguous identification of reaction schemes, and these techniques start to fail. Therefore, we have constructed a new and innovative experimental setup for the study of complex interstellar ices, featuring a highly sensitive and unambiguous detection method. MATRIICES (Mass Analytical Tool for Reactions in Interstellar ICES) combines the laser ablation technique with a molecular beam experiment and time-of-flight mass spectrometry (LA-TOF-MS) to sample and analyze ice analogues in situ, at native temperatures, under clean ultra-high-vacuum conditions. The method allows direct sampling and analysis of the ice constituents in real time, by using a pulsed UV ablation laser (355-nm Nd:YAG) to vaporize the products in a MALDI-TOF-like detection scheme. The ablated material is caught in a synchronously pulsed molecular beam of inert carrier gas (He) from a supersonic valve and analysed in a reflectron time-of-flight mass spectrometer. The detection limit of the method is expected to exceed that of the regular surface techniques substantially. The ultimate goal is to fully…

  10. Factors Influencing Beliefs for Adoption of a Learning Analytics Tool: An Empirical Study

    ERIC Educational Resources Information Center

    Ali, Liaqat; Asadi, Mohsen; Gasevic, Dragan; Jovanovic, Jelena; Hatala, Marek

    2013-01-01

    Present research and development offer various learning analytics tools providing insights into different aspects of learning processes. Adoption of a specific tool for practice is based on how its learning analytics are perceived by educators to support their pedagogical and organizational goals. In this paper, we propose and empirically validate…

  11. Assessment Tools' Indicators for Sustainability in Universities: An Analytical Overview

    ERIC Educational Resources Information Center

    Alghamdi, Naif; den Heijer, Alexandra; de Jonge, Hans

    2017-01-01

    Purpose: The purpose of this paper is to analyse 12 assessment tools of sustainability in universities and to develop the structure and the contents of these tools to be more intelligible. The configuration of the tools reviewed highlights indicators that clearly communicate only the essential information. This paper explores how the theoretical…

  12. Analytical quality by design: a tool for regulatory flexibility and robust analytics.

    PubMed

    Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDA) with regulatory flexibility for quality by design (QbD) based analytical approaches. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results due to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists about the implementation of AQbD in the pharmaceutical quality system and also correlates it with product quality by design and process analytical technology (PAT).

  13. Analytical Quality by Design: A Tool for Regulatory Flexibility and Robust Analytics

    PubMed Central

    Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDA) with regulatory flexibility for quality by design (QbD) based analytical approaches. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results due to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists about the implementation of AQbD in the pharmaceutical quality system and also correlates it with product quality by design and process analytical technology (PAT). PMID:25722723

  14. A results-based process for evaluation of diverse visual analytics tools

    NASA Astrophysics Data System (ADS)

    Rubin, Gary; Berger, David H.

    2013-05-01

    With the pervasiveness of still and full-motion imagery in commercial and military applications, the need to ingest and analyze these media has grown rapidly in recent years. Additionally, video hosting and live camera websites provide a near real-time view of our changing world with unprecedented spatial coverage. To take advantage of these controlled and crowd-sourced opportunities, sophisticated visual analytics (VA) tools are required to accurately and efficiently convert raw imagery into usable information. Whether investing in VA products or evaluating algorithms for potential development, it is important for stakeholders to understand the capabilities and limitations of visual analytics tools. Visual analytics algorithms are being applied to problems related to Intelligence, Surveillance, and Reconnaissance (ISR), facility security, and public safety monitoring, to name a few. The diversity of requirements means that a one-size-fits-all approach to performance assessment will not work. We present a process for evaluating the efficacy of algorithms in real-world conditions, thereby allowing users and developers of video analytics software to understand software capabilities and identify potential shortcomings. The results-based approach described in this paper uses an analysis of end-user requirements and Concept of Operations (CONOPS) to define Measures of Effectiveness (MOEs), test data requirements, and evaluation strategies. We define metrics that individually do not fully characterize a system, but when used together, are a powerful way to reveal both strengths and weaknesses. We provide examples of data products, such as heatmaps, performance maps, detection timelines, and rank-based probability-of-detection curves.

  15. Analytical tools for the analysis of β-carotene and its degradation products

    PubMed Central

    Stutz, H.; Bresgen, N.; Eckl, P. M.

    2015-01-01

    β-Carotene, the precursor of vitamin A, possesses pronounced radical scavenging properties. This has centered the attention on β-carotene dietary supplementation in healthcare as well as in the therapy of degenerative disorders and several cancer types. However, two intervention trials with β-carotene have revealed adverse effects on two proband groups, that is, cigarette smokers and asbestos-exposed workers. Besides other causative reasons, the detrimental effects observed have been related to the oxidation products of β-carotene. Their generation originates in the polyene structure of β-carotene that is beneficial for radical scavenging, but is also prone to oxidation. Depending on the dominant degradation mechanism, bond cleavage might occur either randomly or at defined positions of the conjugated electron system, resulting in a diversity of cleavage products (CPs). Due to their instability and hydrophobicity, the handling of standards and real samples containing β-carotene and related CPs requires preventive measures during specimen preparation, analyte extraction, and final analysis, to avoid artificial degradation and to preserve the initial analyte portfolio. This review critically discusses different preparation strategies of standards and treatment solutions, and also addresses their protection from oxidation. Additionally, in vitro oxidation strategies for the generation of oxidative model compounds are surveyed. Extraction methods are discussed for volatile and non-volatile CPs individually. Gas chromatography (GC), (ultra)high performance liquid chromatography (U)HPLC, and capillary electrochromatography (CEC) are reviewed as analytical tools for final analyte analysis. For identity confirmation of analytes, mass spectrometry (MS) is indispensable, and the appropriate ionization principles are comprehensively discussed. The final sections cover analysis of real samples and aspects of quality assurance, namely matrix effects and method validation.

  16. Analytical tools for the analysis of β-carotene and its degradation products.

    PubMed

    Stutz, H; Bresgen, N; Eckl, P M

    2015-05-01

    β-Carotene, the precursor of vitamin A, possesses pronounced radical scavenging properties. This has centered the attention on β-carotene dietary supplementation in healthcare as well as in the therapy of degenerative disorders and several cancer types. However, two intervention trials with β-carotene have revealed adverse effects on two proband groups, that is, cigarette smokers and asbestos-exposed workers. Besides other causative reasons, the detrimental effects observed have been related to the oxidation products of β-carotene. Their generation originates in the polyene structure of β-carotene that is beneficial for radical scavenging, but is also prone to oxidation. Depending on the dominant degradation mechanism, bond cleavage might occur either randomly or at defined positions of the conjugated electron system, resulting in a diversity of cleavage products (CPs). Due to their instability and hydrophobicity, the handling of standards and real samples containing β-carotene and related CPs requires preventive measures during specimen preparation, analyte extraction, and final analysis, to avoid artificial degradation and to preserve the initial analyte portfolio. This review critically discusses different preparation strategies of standards and treatment solutions, and also addresses their protection from oxidation. Additionally, in vitro oxidation strategies for the generation of oxidative model compounds are surveyed. Extraction methods are discussed for volatile and non-volatile CPs individually. Gas chromatography (GC), (ultra)high performance liquid chromatography (U)HPLC, and capillary electrochromatography (CEC) are reviewed as analytical tools for final analyte analysis. For identity confirmation of analytes, mass spectrometry (MS) is indispensable, and the appropriate ionization principles are comprehensively discussed. The final sections cover analysis of real samples and aspects of quality assurance, namely matrix effects and method validation.

  17. Single cell analytic tools for drug discovery and development

    PubMed Central

    Heath, James R.; Ribas, Antoni; Mischel, Paul S.

    2016-01-01

    The genetic, functional, or compositional heterogeneity of healthy and diseased tissues presents major challenges in drug discovery and development [1-3]. In cancers, heterogeneity may be essential for tumor stability [4], but its precise role in tumor biology is poorly resolved. This challenges the design of accurate disease models for use in drug development, and can confound the interpretation of biomarker levels, and of patient responses to specific therapies. The complex nature of heterogeneous tissues has motivated the development of tools for single cell genomic, transcriptomic, and multiplex proteomic analysis. We review these tools, assess their advantages and limitations, and explore their potential applications in drug discovery and development. PMID:26669673

  18. Experimental and analytical tools for evaluation of Stirling engine rod seal behavior

    NASA Technical Reports Server (NTRS)

    Krauter, A. I.; Cheng, H. S.

    1979-01-01

    The first year of a two year experimental and analytical program is reported. The program is directed at the elastohydrodynamic behavior of sliding elastomeric rod seals for the Stirling engine. During the year, experimental and analytical tools were developed for evaluating seal leakage, seal friction, and the fluid film thickness at the seal/cylinder interface.

  19. Distributed Engine Control Empirical/Analytical Verification Tools

    NASA Technical Reports Server (NTRS)

    DeCastro, Jonathan; Hettler, Eric; Yedavalli, Rama; Mitra, Sayan

    2013-01-01

    NASA's vision for an intelligent engine will be realized with the development of a truly distributed control system featuring highly reliable, modular, and dependable components capable of both surviving the harsh engine operating environment and decentralized functionality. A set of control system verification tools was developed and applied to a C-MAPSS40K engine model, and metrics were established to assess the stability and performance of these control systems on the same platform. A software tool was developed that allows designers to assemble easily a distributed control system in software and immediately assess the overall impacts of the system on the target (simulated) platform, allowing control system designers to converge rapidly on acceptable architectures with consideration to all required hardware elements. The software developed in this program will be installed on a distributed hardware-in-the-loop (DHIL) simulation tool to assist NASA and the Distributed Engine Control Working Group (DECWG) in integrating DCS (distributed engine control systems) components onto existing and next-generation engines. The distributed engine control simulator blockset for MATLAB/Simulink and hardware simulator provides the capability to simulate virtual subcomponents, as well as swap actual subcomponents for hardware-in-the-loop (HIL) analysis. Subcomponents can be the communication network, smart sensor or actuator nodes, or a centralized control system. The distributed engine control blockset for MATLAB/Simulink is a software development tool. The software includes an engine simulation, a communication network simulation, control algorithms, and analysis algorithms set up in a modular environment for rapid simulation of different network architectures; the hardware consists of an embedded device running parts of the C-MAPSS engine simulator and controlled through Simulink. The distributed engine control simulation, evaluation, and analysis technology provides unique…

  20. Galileo's Discorsi as a Tool for the Analytical Art.

    PubMed

    Raphael, Renee Jennifer

    2015-01-01

    A heretofore overlooked response to Galileo's 1638 Discorsi is described by examining two extant copies of the text (one which has received little attention in the historiography, the other apparently unknown) which are heavily annotated. It is first demonstrated that these copies contain annotations made by Seth Ward and Sir Christopher Wren. This article then examines one feature of Ward's and Wren's responses to the Discorsi, namely their decision to re-write several of Galileo's geometrical demonstrations into the language of symbolic algebra. It is argued that this type of active reading of period mathematical texts may have been part of the regular scholarly and pedagogical practices of early modern British mathematicians like Ward and Wren. A set of Appendices contains a transcription and translation of the analytical solutions found in these annotated copies.

  1. Polymerase chain reaction technology as analytical tool in agricultural biotechnology.

    PubMed

    Lipp, Markus; Shillito, Raymond; Giroux, Randal; Spiegelhalter, Frank; Charlton, Stacy; Pinero, David; Song, Ping

    2005-01-01

    The agricultural biotechnology industry applies polymerase chain reaction (PCR) technology at numerous points in product development. Commodity and food companies as well as third-party diagnostic testing companies also rely on PCR technology for a number of purposes. The primary use of the technology is to verify the presence or absence of genetically modified (GM) material in a product or to quantify the amount of GM material present in a product. This article describes the fundamental elements of PCR analysis and its application to the testing of grains. The document highlights the many areas to which attention must be paid in order to produce reliable test results. These include sample preparation, method validation, choice of appropriate reference materials, and biological and instrumental sources of error. The article also discusses issues related to the analysis of different matrixes and the effect they may have on the accuracy of the PCR analytical results.
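
    Quantification by real-time PCR typically rests on a standard curve relating the threshold cycle to the logarithm of starting copy number. The sketch below shows that calibration step with made-up numbers; it is illustrative and not a protocol from the article.

    ```python
    import numpy as np

    # Calibration standards: known copy numbers and measured Ct values.
    log_copies = np.log10([1e2, 1e3, 1e4, 1e5, 1e6])
    ct_values = np.array([33.1, 29.8, 26.4, 23.1, 19.7])

    # Fit Ct = slope*log10(N0) + intercept; a slope near -3.32 corresponds
    # to ~100% amplification efficiency.
    slope, intercept = np.polyfit(log_copies, ct_values, 1)
    efficiency = 10.0 ** (-1.0 / slope) - 1.0

    def copies_from_ct(ct):
        """Invert the standard curve to estimate the starting copy number."""
        return 10.0 ** ((ct - intercept) / slope)

    print(f"efficiency = {efficiency:.1%}, "
          f"unknown at Ct 25 = {copies_from_ct(25.0):.3g} copies")
    ```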

  2. Practical applications of surface analytic tools in tribology

    NASA Technical Reports Server (NTRS)

    Ferrante, J.

    1980-01-01

    A brief description of many of the widely used tools is presented. Of these, the tools with the highest applicability for elemental and/or compound analysis in problems of interest in tribology, while being truly surface sensitive (that is, sampling less than 10 atomic layers), are singled out. This latter group is critiqued in detail in terms of strengths and weaknesses. Emphasis is placed on post facto analysis of experiments performed under real conditions (e.g., in air with lubricants). It is further indicated that such equipment could be used for screening and quality control.

  3. Process models: analytical tools for managing industrial energy systems

    SciTech Connect

    Howe, S O; Pilati, D A; Balzer, C; Sparrow, F T

    1980-01-01

    How the process models developed at BNL are used to analyze industrial energy systems is described and illustrated. Following a brief overview of the industry modeling program, the general methodology of process modeling is discussed. The discussion highlights the important concepts, contents, inputs, and outputs of a typical process model. A model of the US pulp and paper industry is then discussed as a specific application of process modeling methodology. Applications addressed with the case study results include projections of energy demand, conservation technology assessment, energy-related tax policies, and sensitivity analysis. A subsequent discussion of these results supports the conclusion that industry process models are versatile and powerful tools for managing industrial energy systems.

  4. Analytical and Semi-Analytical Tools for the Design of Oscillatory Pumping Tests.

    PubMed

    Cardiff, Michael; Barrash, Warren

    2015-01-01

    Oscillatory pumping tests, in which flow is varied in a periodic fashion, provide a method for understanding aquifer heterogeneity that is complementary to strategies such as slug testing and constant-rate pumping tests. During oscillatory testing, pressure data collected at non-pumping wells can be processed to extract metrics, such as signal amplitude and phase lag, from a time series. These metrics are robust against common sensor problems (including drift and noise) and have been shown to provide information about aquifer heterogeneity. Field implementations of oscillatory pumping tests for characterization, however, are not common and thus there are few guidelines for their design and implementation. Here, we use available analytical solutions from the literature to develop design guidelines for oscillatory pumping tests, while considering practical field constraints. We present two key analytical results for design and analysis of oscillatory pumping tests. First, we provide methods for choosing testing frequencies and flow rates which maximize the signal amplitude that can be expected at a distance from an oscillating pumping well, given design constraints such as maximum/minimum oscillator frequency and maximum volume cycled. Preliminary data from field testing helps to validate the methodology. Second, we develop a semi-analytical method for computing the sensitivity of oscillatory signals to spatially distributed aquifer flow parameters. This method can be quickly applied to understand the "sensed" extent of an aquifer at a given testing frequency. Both results can be applied given only bulk aquifer parameter estimates, and can help to optimize design of oscillatory pumping test campaigns.
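
    The amplitude and phase metrics described above can be extracted with an ordinary least-squares fit at the known pumping frequency; including a linear trend column makes the estimate insensitive to sensor drift. A minimal sketch (not the authors' code):

    ```python
    import numpy as np

    def amplitude_phase(t, head, freq_hz):
        """Fit h(t) = A cos(wt) + B sin(wt) + c0 + c1*t and return the
        signal amplitude and phase lag at the pumping frequency."""
        w = 2.0 * np.pi * freq_hz
        G = np.column_stack([np.cos(w * t), np.sin(w * t), np.ones_like(t), t])
        coeffs, *_ = np.linalg.lstsq(G, head, rcond=None)
        A, B = coeffs[0], coeffs[1]
        return np.hypot(A, B), np.arctan2(B, A)

    t = np.linspace(0.0, 600.0, 6000)                       # 10-minute record
    head = 0.02 * np.cos(2*np.pi*0.01*t - 0.6) + 1e-5 * t   # signal plus drift
    amp, lag = amplitude_phase(t, head, freq_hz=0.01)
    print(f"amplitude = {100*amp:.2f} cm, phase lag = {lag:.2f} rad")
    ```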

  5. Laser: a Tool for Optimization and Enhancement of Analytical Methods

    SciTech Connect

    Preisler, Jan

    1997-01-01

    In this work, we use lasers to enhance the possibilities of laser desorption methods and to optimize a coating procedure for capillary electrophoresis (CE). We use several different instrumental arrangements to characterize matrix-assisted laser desorption (MALD) at atmospheric pressure and in vacuum. In imaging mode, a 488-nm argon-ion laser beam is deflected by two acousto-optic deflectors to scan plumes desorbed at atmospheric pressure via absorption. All absorbing species, including neutral molecules, are monitored. Interesting features, e.g., differences between the initial plume and subsequent plumes desorbed from the same spot, or the formation of two plumes from one laser shot, are observed. Total plume absorbance can be correlated with the acoustic signal generated by the desorption event. A model equation for the plume velocity as a function of time is proposed. Alternatively, the use of a static laser beam for observation enables reliable determination of plume velocities even when they are very high. Static scattering detection reveals a negative influence of particle spallation on the MS signal. Ion formation during MALD was monitored using 193-nm light to photodissociate a portion of the insulin ion plume. These results define the optimal conditions for desorbing analytes from matrices, as opposed to achieving a compromise between efficient desorption and efficient ionization as is practiced in mass spectrometry. In the CE experiment, we examined changes in a poly(ethylene oxide) (PEO) coating by continuously monitoring the electroosmotic flow (EOF) in a fused-silica capillary during electrophoresis. An imaging CCD camera was used to follow the motion of a fluorescent neutral marker zone along the length of the capillary, excited by a 488-nm Ar-ion laser. The PEO coating was shown to reduce the velocity of EOF by more than an order of magnitude compared to a bare capillary at pH 7.0. The coating protocol was important, especially at an intermediate pH of 7.7. The increase of p…

  6. Analytical relationships for prediction of the mechanical properties of additively manufactured porous biomaterials

    PubMed Central

    Hedayati, Reza

    2016-01-01

    Recent developments in additive manufacturing techniques have motivated an increasing number of researchers to study regular porous biomaterials that are based on repeating unit cells. The physical and mechanical properties of such porous biomaterials have therefore received increasing attention during recent years. One of the areas that has revived is the analytical study of the mechanical behavior of regular porous biomaterials, with the aim of deriving analytical relationships that can predict the relative density and mechanical properties of porous biomaterials, given the design and dimensions of their repeating unit cells. In this article, we review the analytical relationships that have been presented in the literature for predicting the relative density, elastic modulus, Poisson's ratio, yield stress, and buckling limit of regular porous structures based on various types of unit cells. The reviewed analytical relationships are used to compare the mechanical properties of porous biomaterials based on different types of unit cells. The major areas where the analytical relationships have improved during recent years are discussed and suggestions are made for future research directions. PMID:27502358
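
    The best-known examples of such relationships are the Gibson-Ashby scaling laws for cellular solids, which relate relative stiffness and strength to relative density; they are quoted here for orientation, while the review derives unit-cell-specific versions:

    \[
    \frac{E}{E_s} \approx C_1 \left(\frac{\rho}{\rho_s}\right)^{2},
    \qquad
    \frac{\sigma_y}{\sigma_{y,s}} \approx C_2 \left(\frac{\rho}{\rho_s}\right)^{3/2},
    \]

    for bending-dominated lattices, where the subscript s denotes the bulk solid material and C_1, C_2 are geometry-dependent constants of order unity; stretch-dominated unit cells instead scale roughly linearly with relative density.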

  7. Chemometric classification techniques as a tool for solving problems in analytical chemistry.

    PubMed

    Bevilacqua, Marta; Nescatelli, Riccardo; Bucci, Remo; Magrì, Andrea D; Magrì, Antonio L; Marini, Federico

    2014-01-01

    Supervised pattern recognition (classification) techniques, i.e., the family of chemometric methods whose aim is the prediction of a qualitative response on a set of samples, represent a very important assortment of tools for solving problems in several areas of applied analytical chemistry. This paper describes the theory behind the chemometric classification techniques most frequently used in analytical chemistry together with some examples of their application to real-world problems.
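
    A representative member of this family is linear discriminant analysis (LDA). The sketch below applies it to synthetic two-class data with scikit-learn; it is illustrative only, since the paper surveys many such techniques.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(3)
    class_a = rng.normal(0.0, 1.0, size=(40, 10))  # e.g. spectra of product A
    class_b = rng.normal(0.8, 1.0, size=(40, 10))  # e.g. spectra of product B
    X = np.vstack([class_a, class_b])
    y = np.array([0] * 40 + [1] * 40)

    lda = LinearDiscriminantAnalysis()
    accuracy = cross_val_score(lda, X, y, cv=5).mean()
    print(f"cross-validated accuracy: {accuracy:.2f}")
    ```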

  8. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 41 Public Contracts and Property Management 3 2012-01-01 2012-01-01 false What analytical and... Prevention Equivalent Level of Safety Analysis § 102-80.120 What analytical and empirical tools should be used to support the life safety equivalency evaluation? Analytical and empirical tools, including...

  9. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 41 Public Contracts and Property Management 3 2013-07-01 2013-07-01 false What analytical and... Prevention Equivalent Level of Safety Analysis § 102-80.120 What analytical and empirical tools should be used to support the life safety equivalency evaluation? Analytical and empirical tools, including...

  10. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 41 Public Contracts and Property Management 3 2014-01-01 2014-01-01 false What analytical and... Prevention Equivalent Level of Safety Analysis § 102-80.120 What analytical and empirical tools should be used to support the life safety equivalency evaluation? Analytical and empirical tools, including...

  11. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 41 Public Contracts and Property Management 3 2011-01-01 2011-01-01 false What analytical and... Prevention Equivalent Level of Safety Analysis § 102-80.120 What analytical and empirical tools should be used to support the life safety equivalency evaluation? Analytical and empirical tools, including...

  12. Ultramicroelectrode Array Based Sensors: A Promising Analytical Tool for Environmental Monitoring

    PubMed Central

    Orozco, Jahir; Fernández-Sánchez, César; Jiménez-Jorquera, Cecilia

    2010-01-01

    The particular analytical performance of ultramicroelectrode arrays (UMEAs) has attracted great interest from the research community and has led to the development of a variety of electroanalytical applications. UMEA-based approaches have been demonstrated to be powerful, simple, rapid and cost-effective analytical tools for environmental analysis compared with available conventional electrodes and standardised analytical techniques. An overview of the fabrication processes of UMEAs, their characterization, and the applications carried out by the Spanish scientific community is presented. A brief explanation of the theoretical aspects underlying their electrochemical behavior is also given. Finally, the applications of this transducer platform in the environmental field are discussed. PMID:22315551
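
    The favourable behaviour of UMEAs rests on radial diffusion to each microelectrode. For a single inlaid disc ultramicroelectrode, the steady-state diffusion-limited current is a standard textbook result (quoted for orientation, not specific to this record):

```latex
% Steady-state current at a disc UME of radius a; for an array of N
% diffusionally independent discs the currents are approximately additive.
I_{ss} = 4 n F D c^{*} a, \qquad I_{array} \approx 4 n F D c^{*} a N
```

    Here n is the electron number, F the Faraday constant, D the diffusion coefficient and c* the bulk concentration; the additivity holds only while neighbouring diffusion layers do not overlap.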

  13. Environmental equity research: review with focus on outdoor air pollution research methods and analytic tools.

    PubMed

    Miao, Qun; Chen, Dongmei; Buzzelli, Michael; Aronson, Kristan J

    2015-01-01

    The objective of this study was to review environmental equity research on outdoor air pollution and, specifically, the methods and analytic tools used in research published in English, with the aim of recommending the best methods and tools. English language publications from 2000 to 2012 were identified in Google Scholar, Ovid MEDLINE, and PubMed. Research methodologies and results were reviewed and potential deficiencies and knowledge gaps identified. The publications show that exposure to outdoor air pollution differs by social factors, but findings are inconsistent in Canada. In terms of study designs, most were small and ecological and therefore prone to the ecological fallacy. Newer tools such as geographic information systems, modeling, and biomarkers offer improved precision in exposure measurement. Higher-quality research using large, individual-based samples and more precise analytic tools is needed to provide better evidence for policy-making to reduce environmental inequities.

  14. Social Capital: An Analytical Tool for Exploring Lifelong Learning and Community Development. CRLRA Discussion Paper.

    ERIC Educational Resources Information Center

    Kilpatrick, Sue; Field, John; Falk, Ian

    The possibility of using the concept of social capital as an analytical tool for exploring lifelong learning and community development was examined. The following were among the topics considered: (1) differences between definitions of the concept of social capital that are based on collective benefit and those that define social capital as a…

  15. Understanding Emotions as Situated, Embodied, and Fissured: Thinking with Theory to Create an Analytical Tool

    ERIC Educational Resources Information Center

    Kuby, Candace R.

    2014-01-01

    An emerging theoretical perspective is that emotions are a verb or something we do in relation to others. Studies that demonstrate ways to analyze emotions from a performative stance are scarce. In this article, a new analytical tool is introduced; a critical performative analysis of emotion (CPAE) that draws upon three theoretical perspectives:…

  16. (Re)braiding to Tell: Using "Trenzas" as a Metaphorical-Analytical Tool in Qualitative Research

    ERIC Educational Resources Information Center

    Quiñones, Sandra

    2016-01-01

    Metaphors can be used in qualitative research to illuminate the meanings of participant experiences and examine phenomena from insightful and creative perspectives. The purpose of this paper is to illustrate how I utilized "trenzas" (braids) as a metaphorical and analytical tool for understanding the experiences and perspectives of…

  17. Electrochemical immunoassay using magnetic beads for the determination of zearalenone in baby food: an anticipated analytical tool for food safety.

    PubMed

    Hervás, Miriam; López, Miguel Angel; Escarpa, Alberto

    2009-10-27

    In this work, an electrochemical immunoassay involving magnetic beads to determine zearalenone in selected food samples has been developed. The immunoassay scheme is based on a direct competitive format in which antibody-coated magnetic beads are employed as the immobilisation support and horseradish peroxidase (HRP) is used as the enzymatic label. Amperometric detection is achieved through the addition of hydrogen peroxide as substrate and hydroquinone as mediator. The analytical performance of the electrochemical immunoassay has been evaluated by analysis of a maize certified reference material (CRM) and selected baby food samples. A detection limit (LOD) of 0.011 microg L(-1) and an EC(50) of 0.079 microg L(-1) were obtained, demonstrating sensitive detection of the zearalenone mycotoxin. In addition, excellent accuracy was obtained, with recovery yields ranging between 95 and 108%. These analytical features show the proposed electrochemical immunoassay to be a very powerful and timely screening tool for food safety.
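
    In a direct competitive format of this kind, the signal decreases with analyte concentration and EC(50) is usually extracted from a four-parameter logistic (4PL) fit. The following sketch shows such a fit with scipy; the concentrations and responses are hypothetical, not the paper's data:

```python
# Hypothetical 4PL fit of a competitive immunoassay calibration curve.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(c, top, bottom, ec50, slope):
    """Normalized signal versus analyte concentration."""
    return bottom + (top - bottom) / (1.0 + (c / ec50) ** slope)

conc = np.array([0.001, 0.01, 0.03, 0.1, 0.3, 1.0])      # ug/L (hypothetical)
signal = np.array([0.98, 0.90, 0.70, 0.45, 0.20, 0.08])  # B/B0 (hypothetical)

params, _ = curve_fit(four_pl, conc, signal, p0=[1.0, 0.0, 0.08, 1.0])
print("EC50 ~ %.3f ug/L" % params[2])
```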

  18. Mössbauer spectroscopy: an excellent additional tool for the study of magnetic soils and sediments

    NASA Astrophysics Data System (ADS)

    Vandenberghe, R. E.; Hus, J. J.; de Grave, E.

    2009-04-01

    Since the discovery of resonant gamma absorption, known as the Mössbauer effect, half a century ago, the spectroscopic method derived from it (MS) has proven to be a very suitable tool for the characterization of soil and rock minerals. From the conventional absorption spectra of iron-containing compounds, so-called hyperfine parameters are derived that are more or less characteristic of each kind of mineral. MS therefore has considerable analytical power for the characterization of iron-bearing minerals. This is especially true for magnetic minerals, for which the spectrum contains an additional hyperfine parameter. Moreover, MS also allows information to be retrieved about the magnetic structure and behavior. Because the relative spectral areas are to some extent proportional to the number of iron atoms in each environment, MS yields quantitative information not only about the various minerals present but also about the iron in the different crystallographic sites. The power of MS as an excellent additional tool for the study of magnetic soils and sediments was well demonstrated in the joint research with Jozef Hus (CPG-IRM, Dourbes). In our joint work, the emphasis was mainly on the study of Chinese loess and soils. Using MS on magnetically separated samples, the various magnetic species in a loess and its associated soil were for the first time discerned in a direct way. Furthermore, magnetically enriched samples of four different loess/paleosol couplets from a loess sequence in Huangling were systematically investigated by MS. From the qualitative and quantitative information obtained, the neoformation of magnetite/maghemite in the soils, responsible for the increased observed remanence and susceptibility, could be demonstrated.

  19. Development of rocket electrophoresis technique as an analytical tool in preformulation study of tetanus vaccine formulation.

    PubMed

    Ahire, V J; Sawant, K K

    2006-08-01

    The Rocket Electrophoresis (RE) technique relies on the difference in charges between the antigen and antibodies at the selected pH. The present study involves optimization of RE run conditions for Tetanus Toxoid (TT). An agarose gel (1% w/v, 20 ml, pH 8.6), anti-TT IgG at 1 IU/ml, a temperature of 4-8 degrees C, and a run duration of 18 h were found to be optimal. The height of the rocket-shaped precipitate was proportional to TT concentration. The RE method was found to be linear in the concentration range of 2.5 to 30 Lf/mL. The method was validated and found to be accurate, precise, and reproducible when analyzed statistically using Student's t-test. RE was used as an analytical method for analyzing TT content in plain and marketed formulations as well as for the preformulation study of the vaccine formulation, where formulation additives were tested for compatibility with TT. The optimized RE method has several advantages: it uses safe materials, is inexpensive, and is easy to perform. RE results are less prone to operator bias compared with the flocculation test, can be documented by taking photographs, and can be scanned with a densitometer; RE can also be easily standardized for the required antigen concentration by changing the antitoxin concentration. It can be used as a very effective tool for qualitative and quantitative analysis and in preformulation studies of antigens.

  20. Development of Multi-slice Analytical Tool to Support BIM-based Design Process

    NASA Astrophysics Data System (ADS)

    Atmodiwirjo, P.; Johanes, M.; Yatmo, Y. A.

    2017-03-01

    This paper describes the ongoing development of a computational tool to analyse architectural and interior space based on a multi-slice representation approach that is integrated with Building Information Modelling (BIM). Architectural and interior space is experienced as a dynamic entity whose spatial properties may vary from one part of the space to another; the representation of space through standard architectural drawings is therefore sometimes insufficient. Representing space as a series of slices, each with its own properties, becomes important, so that the different characteristics in each part of the space can inform the design process. The analytical tool is being developed as a stand-alone application that utilises data exported from a generic BIM modelling tool. The tool would be useful for assisting design development processes that apply BIM, particularly for the design of architecture and interior spaces that are experienced as continuous spaces. The tool allows identification of how spatial properties change dynamically throughout the space and allows prediction of potential design problems. Integrating the multi-slice analytical tool into a BIM-based design process could thereby assist architects to generate better designs and to avoid unnecessary costs that are often caused by failure to identify problems during design development stages.

  1. New Tools to Prepare ACE Cross-section Files for MCNP Analytic Test Problems

    SciTech Connect

    Brown, Forrest B.

    2016-06-17

    Monte Carlo calculations using one-group cross sections, multigroup cross sections, or simple continuous energy cross sections are often used to: (1) verify production codes against known analytical solutions, (2) verify new methods and algorithms that do not involve detailed collision physics, (3) compare Monte Carlo calculation methods with deterministic methods, and (4) teach fundamentals to students. In this work we describe two new tools, simple_ace.pl and simple_ace_mg.pl, for preparing the ACE cross-section files to be used by MCNP® for these analytic test problems.

  2. The impact of layer thickness on the performance of additively manufactured lapping tools

    NASA Astrophysics Data System (ADS)

    Williams, Wesley B.

    2015-10-01

    Lower-cost additive manufacturing (AM) machines that have emerged in recent years are capable of producing tools, jigs, and fixtures that are useful in optical fabrication. In particular, AM tooling has been shown to be useful in lapping glass workpieces. Various AM machines are distinguished by the processes, materials, build times, and build resolution they provide. This research investigates the impact of varied build resolution (specifically layer resolution) on the lapping performance of tools built using the stereolithography (SLA) process in 50 μm and 100 μm layer thicknesses with a methacrylate photopolymer resin on a high-resolution desktop printer. As with previous work, the lapping tools were shown to remove workpiece material during the lapping process, but the tools themselves also experienced significant wear, on the order of 2-3 times the mass loss of the glass workpieces. The tool wear rates for the 100 μm and 50 μm layer tools were comparable, but the 50 μm layer tool was 74% more effective at removing material from the glass workpiece, which is attributed to some abrasive particles being trapped in the coarser surface of the 100 μm layer tooling and not being available to interact with the glass workpiece. Considering the tool wear, these additively manufactured tools are most appropriate for prototype tooling, where the low cost (<$45) and quick turnaround make them attractive compared with a machined tool.

  3. Development of computer-based analytical tool for assessing physical protection system

    SciTech Connect

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-22

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts directed at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a solution for approximating the likely threat scenarios. There are several currently available tools that can be used immediately, such as EASI and SAPE; however, for our research purpose it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool that uses a network methodological approach for modelling the adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool can analyze the most critical path and quantify the probability of effectiveness of the system as a performance measure.

  4. Development of computer-based analytical tool for assessing physical protection system

    NASA Astrophysics Data System (ADS)

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts directed at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a solution for approximating the likely threat scenarios. There are several currently available tools that can be used immediately, such as EASI and SAPE; however, for our research purpose it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool that uses a network methodological approach for modelling the adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool can analyze the most critical path and quantify the probability of effectiveness of the system as a performance measure.
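
    A minimal sketch of the network idea described in these two records: represent the facility as a directed graph whose edges carry detection probabilities, and find the most critical adversary path as the one with the lowest overall chance of detection. The layout and probabilities below are hypothetical, and networkx stands in for the custom tool:

```python
# Most critical adversary path = path maximizing prod(1 - P_detect),
# found by minimizing sum(-log(1 - P_detect)) with a shortest-path search.
import math
import networkx as nx

G = nx.DiGraph()
edges = [  # (from, to, probability this protection layer detects the adversary)
    ("offsite", "fence", 0.30), ("fence", "building", 0.60),
    ("offsite", "gate", 0.50), ("gate", "building", 0.40),
    ("building", "vault", 0.80),
]
for u, v, p in edges:
    G.add_edge(u, v, weight=-math.log(1.0 - p))   # low weight = weak layer

path = nx.shortest_path(G, "offsite", "vault", weight="weight")
w = nx.shortest_path_length(G, "offsite", "vault", weight="weight")
print(path, "P(at least one detection) ~", round(1.0 - math.exp(-w), 3))
```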

  5. Analytical Tools for the Application of Operational Culture: A Case Study in the Trans-Sahel

    DTIC Science & Technology

    2011-03-28

    their support to MEF staffs with Psychological Operations planning teams in the near future. 2.2.4 Observations on the Situation Context...(1998). "Organizational Consulting: A Gestalt Approach," Cambridge: GIC Press, 23 UNCLASSIFIED Analytical Tools for the Application of Operational...cultural learning and past experiences. What we perceive is often based on our needs, our expectations, our projections, our psychological defenses, and

  6. Rapid process development of chromatographic process using direct analysis in real time mass spectrometry as a process analytical technology tool.

    PubMed

    Yan, Binjun; Chen, Teng; Xu, Zhilin; Qu, Haibin

    2014-06-01

    The concept of quality by design (QbD) is widely applied in the process development of pharmaceuticals. However, the additional cost and time have caused some resistance to QbD implementation. To show a possible solution, this work proposes a rapid process development method that uses direct analysis in real time mass spectrometry (DART-MS) as a process analytical technology (PAT) tool, taking the chromatographic process for Ginkgo biloba L. as an example. The breakthrough curves were rapidly determined by DART-MS at-line. A high correlation coefficient of 0.9520 was found between the concentrations of ginkgolide A determined by DART-MS and by HPLC. Based on the PAT tool, the impacts of process parameters on the adsorption capacity were discovered rapidly; the adsorption capacity decreased as the flow rate increased. This work has shown the feasibility and advantages of integrating PAT into QbD implementation for rapid process development.
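
    Since DART-MS here serves as a fast surrogate for the HPLC reference assay, the PAT step reduces to a calibration/correlation check of the kind sketched below; the numbers are hypothetical, not the paper's data (which reported a correlation coefficient of 0.9520 for ginkgolide A):

```python
# Hypothetical at-line PAT check: correlate fast DART-MS readings with HPLC.
import numpy as np

dart = np.array([0.12, 0.35, 0.60, 0.81, 1.05])   # DART-MS response (a.u.)
hplc = np.array([0.10, 0.33, 0.58, 0.85, 1.02])   # HPLC concentration (mg/mL)

r = np.corrcoef(dart, hplc)[0, 1]
slope, intercept = np.polyfit(dart, hplc, 1)      # at-line calibration line
print(f"r = {r:.4f}; conc ~ {slope:.3f} * signal {intercept:+.3f}")
```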

  7. A Business Analytics Software Tool for Monitoring and Predicting Radiology Throughput Performance.

    PubMed

    Jones, Stephen; Cournane, Seán; Sheehy, Niall; Hederman, Lucy

    2016-12-01

    Business analytics (BA) is increasingly being utilised by radiology departments to analyse and present data. It encompasses statistical analysis, forecasting and predictive modelling and is used as an umbrella term for decision support and business intelligence systems. The primary aim of this study was to determine whether utilising BA technologies could contribute towards improved decision support and resource management within radiology departments. A set of information technology requirements were identified with key stakeholders, and a prototype BA software tool was designed, developed and implemented. A qualitative evaluation of the tool was carried out through a series of semi-structured interviews with key stakeholders. Feedback was collated, and emergent themes were identified. The results indicated that BA software applications can provide visibility of radiology performance data across all time horizons. The study demonstrated that the tool could potentially assist with improving operational efficiencies and management of radiology resources.

  8. Analytical optimal controls for the state constrained addition and removal of cryoprotective agents

    PubMed Central

    Chicone, Carmen C.; Critser, John K.

    2014-01-01

    Cryobiology is a field with enormous scientific, financial and even cultural impact. Successful cryopreservation of cells and tissues depends on the equilibration of these materials with high concentrations of permeating cryoprotective agents (CPAs) such as glycerol or 1,2-propylene glycol. Because cells and tissues are exposed to highly anisosmotic conditions, the resulting gradients cause large volume fluctuations that have been shown to damage cells and tissues. On the other hand, there is evidence that toxicity at these high chemical concentrations is time dependent, and it is therefore ideal to minimize exposure time as well. Because solute and solvent flux is governed by a system of ordinary differential equations, CPA addition and removal from cells is an ideal context for the application of optimal control theory. Recently, we presented a mathematical synthesis of the optimal controls for the ODE system commonly used in cryobiology in the absence of state constraints and showed that controls defined by this synthesis were optimal. Here we define the appropriate model, analytically extend the previous theory to one encompassing state constraints, and as an example apply this to the critical and clinically important cell type of human oocytes, where current methodologies are either difficult to implement or have very limited success rates. We show that an enormous increase in equilibration efficiency can be achieved under the new protocols when compared to classic protocols, potentially allowing a greatly increased survival rate for human oocytes, and pointing to a direction for the cryopreservation of many other cell types. PMID:22527943
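
    The ODE system referred to is, in many cryobiology papers, a two-parameter solute/solvent flux model. As a hedged illustration (this is the generic Jacobs-type formalism, not necessarily the exact nondimensionalization used by the authors):

```latex
% V_w: intracellular water volume; s: moles of intracellular permeating CPA;
% M^e, M^i: total extra-/intracellular osmolality; M_s^e, M_s^i: CPA osmolality;
% L_p: hydraulic conductivity; P_s: solute permeability; A: membrane area.
\frac{dV_w}{dt} = -L_p A R T \left( M^{e} - M^{i} \right), \qquad
\frac{ds}{dt} = P_s A \left( M_s^{e} - M_s^{i} \right)
```

    The optimal control problem then seeks time courses of the extracellular concentrations that equilibrate the cell as quickly as possible while the state (cell volume) stays inside prescribed osmotic tolerance limits.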

  9. Analytical Design Package (ADP2): A computer aided engineering tool for aircraft transparency design

    NASA Technical Reports Server (NTRS)

    Wuerer, J. E.; Gran, M.; Held, T. W.

    1994-01-01

    The Analytical Design Package (ADP2) is being developed as a part of the Air Force Frameless Transparency Program (FTP). ADP2 is an integrated design tool consisting of existing analysis codes and Computer Aided Engineering (CAE) software. The objective of the ADP2 is to develop and confirm an integrated design methodology for frameless transparencies, related aircraft interfaces, and their corresponding tooling. The application of this methodology will generate high confidence for achieving a qualified part prior to mold fabrication. ADP2 is a customized integration of analysis codes, CAE software, and material databases. The primary CAE integration tool for the ADP2 is P3/PATRAN, a commercial-off-the-shelf (COTS) software tool. The open architecture of P3/PATRAN allows customized installations with different applications modules for specific site requirements. Integration of material databases allows the engineer to select a material, and those material properties are automatically called into the relevant analysis code. The ADP2 materials database will be composed of four independent schemas: CAE Design, Processing, Testing, and Logistics Support. The design of ADP2 places major emphasis on the seamless integration of CAE and analysis modules with a single intuitive graphical interface. This tool is being designed to serve and be used by an entire project team, i.e., analysts, designers, materials experts, and managers. The final version of the software will be delivered to the Air Force in Jan. 1994. The Analytical Design Package (ADP2) will then be ready for transfer to industry. The package will be capable of a wide range of design and manufacturing applications.

  10. ATTIRE (analytical tools for thermal infrared engineering): A sensor simulation and modeling package

    NASA Technical Reports Server (NTRS)

    Jaggi, S.

    1993-01-01

    The Advanced Sensor Development Laboratory (ASDL) at the Stennis Space Center develops, maintains and calibrates remote sensing instruments for the National Aeronautics & Space Administration (NASA). To perform system design trade-offs and analyses, and to establish system parameters, ASDL has developed a software package for analytical simulation of sensor systems. This package, called 'Analytical Tools for Thermal InfraRed Engineering' (ATTIRE), simulates the various components of a sensor system. The software allows each subsystem of the sensor to be analyzed independently for its performance. These performance parameters are then integrated to obtain system-level information such as Signal-to-Noise Ratio (SNR), Noise Equivalent Radiance (NER), Noise Equivalent Temperature Difference (NETD) etc. This paper describes the uses of the package and the physics that were used to derive the performance parameters.
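
    The figures of merit named above are linked by standard radiometric definitions; as a generic illustration (textbook relations, not necessarily ATTIRE's exact formulation), NETD converts the noise-equivalent radiance into an equivalent scene temperature difference through the thermal derivative of the scene radiance:

```latex
% Generic figure-of-merit relations for a thermal IR sensor.
\mathrm{SNR} = \frac{\Delta L}{\mathrm{NER}}, \qquad
\mathrm{NETD} = \frac{\mathrm{NER}}{\left.\partial L/\partial T\right|_{T_{scene}}}
```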

  11. Plant Aquaporins: Genome-Wide Identification, Transcriptomics, Proteomics, and Advanced Analytical Tools

    PubMed Central

    Deshmukh, Rupesh K.; Sonah, Humira; Bélanger, Richard R.

    2016-01-01

    Aquaporins (AQPs) are channel-forming integral membrane proteins that facilitate the movement of water and many other small molecules. Compared to animals, plants contain a much higher number of AQPs in their genome. Homology-based identification of AQPs in sequenced species is feasible because of the high level of conservation of protein sequences across plant species. Genome-wide characterization of AQPs has highlighted several important aspects such as distribution, genetic organization, evolution and conserved features governing solute specificity. From a functional point of view, the understanding of AQP transport systems has expanded rapidly with the help of transcriptomics and proteomics data. The efficient analysis of enormous amounts of data generated through omic-scale studies has been facilitated through computational advancements. Prediction of protein tertiary structures, pore architecture, cavities, phosphorylation sites, heterodimerization, and co-expression networks has become more sophisticated and accurate with increasing computational tools and pipelines. However, the effectiveness of computational approaches is based on the understanding of physiological and biochemical properties, transport kinetics, solute specificity, molecular interactions, sequence variations, phylogeny and evolution of aquaporins. For this purpose, tools like Xenopus oocyte assays, yeast expression systems, artificial proteoliposomes, and lipid membranes have been efficiently exploited to study the many facets that influence solute transport by AQPs. In the present review, we discuss genome-wide identification of AQPs in plants in relation to recent advancements in analytical tools, and their availability and technological challenges as they apply to AQPs. An exhaustive review of omics resources available for AQP research is also provided in order to optimize their efficient utilization. Finally, a detailed catalog of computational tools and analytical pipelines is

  12. Capillary electrophoresis as an analytical tool for monitoring nicotine in ATF regulated tobacco products.

    PubMed

    Ralapati, S

    1997-07-18

    Tobacco products are classified at different excise tax rates according to the Code of Federal Regulations. These include cigars, cigarettes, pipe tobacco, roll-your-own tobacco, chewing tobacco and snuff. Nicotine is the primary determinant of what constitutes a tobacco product from a regulatory standpoint. Determination of nicotine, therefore, is of primary importance and interest to ATF. Since nicotine is also the most abundant alkaloid found in tobacco, comprising about 98% of the total alkaloid content, a rapid method for the determination of nicotine in ATF regulated products is desirable. Capillary electrophoresis (CE), as an analytical technique, is rapidly gaining importance, capturing the interest of analysts in several areas. The unique and powerful capabilities of CE, including high resolution and short analysis times, make it a powerful analytical tool in the regulatory area as well. Preliminary studies using a 25 mM sodium phosphate buffer, pH 2.5, at 260 nm have yielded promising results for the analysis of nicotine in tobacco products. The application of an analytical method for the determination of nicotine by CE to ATF regulated tobacco products will be presented.

  13. Generalized net analyte signal standard addition as a novel method for simultaneous determination: application in spectrophotometric determination of some pesticides.

    PubMed

    Asadpour-Zeynali, Karim; Saeb, Elhameh; Vallipour, Javad; Bamorowat, Mehdi

    2014-01-01

    Simultaneous spectrophotometric determination of three neonicotinoid insecticides (acetamiprid, imidacloprid, and thiamethoxam) by a novel method named the generalized net analyte signal standard addition method (GNASSAM) was investigated in some binary and ternary synthetic mixtures. For this purpose, standard addition was performed using a single standard solution consisting of a mixture of standards of all analytes. Savings in time and in the amount of materials used are among the advantages of this method. All determinations showed the appropriate applicability of this method, with less than 5% error. This method may be applied to linearly dependent data in the presence of known interferents. The GNASSAM combines the advantages of both the generalized standard addition method and the net analyte signal; therefore, it may be a proper alternative to some other multivariate methods.
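
    GNASSAM generalizes the classical scalar standard addition relation to multicomponent data. In the single-analyte case (a textbook result, included here only for orientation), one addition of a standard contributing a known concentration suffices:

```latex
% Classical single-point standard addition: S_0 and S_1 are the signals
% before and after adding a standard contributing concentration c_add.
c_x = c_{add}\,\frac{S_0}{S_1 - S_0}
```

    The generalized method performs the same extrapolation on net analyte signal vectors, so one mixed standard serves all analytes simultaneously.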

  14. Material Development for Tooling Applications Using Big Area Additive Manufacturing (BAAM)

    SciTech Connect

    Duty, Chad E.; Drye, Tom; Franc, Alan

    2015-03-01

    Techmer Engineered Solutions (TES) is working with Oak Ridge National Laboratory (ORNL) to develop materials and evaluate their use for ORNL s recently developed Big Area Additive Manufacturing (BAAM) system for tooling applications. The first phase of the project established the performance of some commercially available polymer compositions deposited with the BAAM system. Carbon fiber reinforced ABS demonstrated a tensile strength of nearly 10 ksi, which is sufficient for a number of low temperature tooling applications.

  15. Can the analyte-triggered asymmetric autocatalytic Soai reaction serve as a universal analytical tool for measuring enantiopurity and assigning absolute configuration?

    PubMed

    Welch, Christopher J; Zawatzky, Kerstin; Makarov, Alexey A; Fujiwara, Satoshi; Matsumoto, Arimasa; Soai, Kenso

    2016-12-20

    An investigation is reported on the use of the autocatalytic enantioselective Soai reaction, known to be influenced by the presence of a wide variety of chiral materials, as a generic tool for measuring the enantiopurity and absolute configuration of any substance. Good generality for the reaction across a small group of test analytes was observed, consistent with literature reports suggesting a diversity of compound types that can influence the stereochemical outcome of this reaction. Some trends in the absolute sense of stereochemical enrichment were noted, suggesting the possible utility of the approach for assigning absolute configuration to unknown compounds, by analogy to closely related species with known outcomes. Considerable variation was observed in the triggering strength of different enantiopure materials, an undesirable characteristic when dealing with mixtures containing minor impurities with strong triggering strength in the presence of major components with weak triggering strength. A strong tendency of the reaction toward an 'all or none' type of behavior makes the reaction most sensitive for detecting enantioenrichment close to zero. Consequently, the ability to discern modest from excellent enantioselectivity was relatively poor. While these properties limit the ability to obtain precise enantiopurity measurements in a simple single addition experiment, prospects may exist for more complex experimental setups that may potentially offer improved performance.

  16. Single-scan 2D NMR: An Emerging Tool in Analytical Spectroscopy

    PubMed Central

    Giraudeau, Patrick; Frydman, Lucio

    2016-01-01

    Two-dimensional Nuclear Magnetic Resonance (2D NMR) spectroscopy is widely used in chemical and biochemical analyses. Multidimensional NMR is also witnessing increased use in quantitative and metabolic screening applications. Conventional 2D NMR experiments, however, are affected by inherently long acquisition durations, arising from their need to sample the frequencies involved along their indirect domains in an incremented, scan-by-scan manner. A decade ago a so-called "ultrafast" (UF) approach was proposed, capable of delivering arbitrary 2D NMR spectra involving any kind of homo- or hetero-nuclear correlations in a single scan. During the intervening years the performance of this sub-second 2D NMR methodology has been greatly improved, and UF 2D NMR is rapidly becoming a powerful analytical tool with an expanding scope of applications. The present review summarizes the principles and the main developments that have contributed to the success of this approach, and focuses on applications recently demonstrated in various areas of analytical chemistry, from the real-time monitoring of chemical and biochemical processes to extensions in hyphenated techniques and in quantitative applications. PMID:25014342

  17. Electron Tomography: A Three-Dimensional Analytic Tool for Hard and Soft Materials Research

    PubMed Central

    Alaidi, Osama; Rames, Matthew J.

    2016-01-01

    Three-dimensional (3D) structural analysis is essential to understand the relationship between the structure and function of an object. Many analytical techniques, such as X-ray diffraction, neutron spectroscopy, and electron microscopy imaging, are used to provide structural information. Transmission electron microscopy (TEM), one of the most popular analytic tools, has been widely used for structural analysis in both physical and biological sciences for many decades, in which 3D objects are projected into two-dimensional (2D) images. In many cases, 2D-projection images are insufficient to understand the relationship between the 3D structure and the function of nanoscale objects. Electron tomography (ET) is a technique that retrieves 3D structural information from a tilt series of 2D projections, and is gradually becoming a mature technology with sub-nanometer resolution. Distinct methods to overcome sample-based limitations have been separately developed in both physical and biological science, although they share some basic concepts of ET. This review discusses the common basis for 3D characterization, and specifies difficulties and solutions regarding both hard and soft materials research. It is hoped that novel solutions based on current state-of-the-art techniques for advanced applications in hybrid matter systems can be motivated. PMID:26087941

  18. Electron Tomography: A Three-Dimensional Analytic Tool for Hard and Soft Materials Research.

    PubMed

    Ercius, Peter; Alaidi, Osama; Rames, Matthew J; Ren, Gang

    2015-10-14

    Three-dimensional (3D) structural analysis is essential to understand the relationship between the structure and function of an object. Many analytical techniques, such as X-ray diffraction, neutron spectroscopy, and electron microscopy imaging, are used to provide structural information. Transmission electron microscopy (TEM), one of the most popular analytic tools, has been widely used for structural analysis in both physical and biological sciences for many decades, in which 3D objects are projected into two-dimensional (2D) images. In many cases, 2D-projection images are insufficient to understand the relationship between the 3D structure and the function of nanoscale objects. Electron tomography (ET) is a technique that retrieves 3D structural information from a tilt series of 2D projections, and is gradually becoming a mature technology with sub-nanometer resolution. Distinct methods to overcome sample-based limitations have been separately developed in both physical and biological science, although they share some basic concepts of ET. This review discusses the common basis for 3D characterization, and specifies difficulties and solutions regarding both hard and soft materials research. It is hoped that novel solutions based on current state-of-the-art techniques for advanced applications in hybrid matter systems can be motivated.

  19. The use of meta-analytical tools in risk assessment for food safety.

    PubMed

    Gonzales-Barron, Ursula; Butler, Francis

    2011-06-01

    This communication deals with the use of meta-analysis as a valuable tool for the synthesis of food safety research and in quantitative risk assessment modelling. A common methodology for the conduct of meta-analysis (i.e., systematic review and data extraction, parameterisation of effect size, estimation of overall effect size, assessment of heterogeneity, and presentation of results) is explained by reviewing two meta-analyses derived from separate sets of primary studies of Salmonella in pork. Integrating different primary studies, the first meta-analysis elucidated for the first time a relationship between the proportion of Salmonella-carrier slaughter pigs entering the slaughter lines and the resulting proportion of contaminated carcasses at the point of evisceration, a finding that the individual studies on their own could not reveal. On the other hand, the second application showed that meta-analysis can be used to estimate the overall effect of a critical process stage (chilling) on the incidence of the pathogen under study. The derivation of a relationship between variables and of a probabilistic distribution are illustrations of the valuable quantitative information synthesised by meta-analytical tools, which can be incorporated in risk assessment modelling. Strengths and weaknesses of meta-analysis within the context of food safety are also discussed.
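
    As a minimal sketch of the pooling step in such a methodology, the following Python fragment computes a DerSimonian-Laird random-effects summary from hypothetical study-level effect sizes and variances (not data from the Salmonella meta-analyses):

```python
# Hypothetical random-effects meta-analysis (DerSimonian-Laird).
import numpy as np

y = np.array([0.42, 0.10, 0.55, 0.31])   # study effect sizes (hypothetical)
v = np.array([0.04, 0.09, 0.02, 0.06])   # within-study variances (hypothetical)

w = 1.0 / v                               # fixed-effect weights
y_fixed = (w * y).sum() / w.sum()
Q = (w * (y - y_fixed) ** 2).sum()        # Cochran's Q (heterogeneity)
k = len(y)
tau2 = max(0.0, (Q - (k - 1)) / (w.sum() - (w**2).sum() / w.sum()))

w_star = 1.0 / (v + tau2)                 # random-effects weights
y_re = (w_star * y).sum() / w_star.sum()  # pooled effect size
se = w_star.sum() ** -0.5
print(f"pooled effect {y_re:.3f} +/- {1.96*se:.3f} (tau^2 = {tau2:.3f})")
```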

  20. STRMDEPL08 - An extended version of STRMDEPL with additional analytical solutions to calculate streamflow depletion by nearby pumping wells

    USGS Publications Warehouse

    Reeves, Howard W.

    2008-01-01

    STRMDEPL, a one-dimensional model using two analytical solutions to calculate streamflow depletion by a nearby pumping well, was extended to account for two additional analytical solutions. The extended program is named STRMDEPL08. The original program incorporated solutions for a stream that fully penetrates the aquifer with and without streambed resistance to ground-water flow. The modified program includes solutions for a partially penetrating stream with streambed resistance and for a stream in an aquitard subjected to pumping from an underlying leaky aquifer. The code also was modified to allow the user to input pumping variations at other than 1-day intervals. The modified code is shown to correctly evaluate the analytical solutions and to provide correct results for half-day time intervals.
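
    The simplest of these solutions, for a fully penetrating stream without streambed resistance, is the classical Glover-Balmer result, quoted here for orientation (the report documents the full set of four):

```latex
% Fraction of the pumping rate Q_w depleted from the stream at time t, for a
% well a distance d from the stream in an aquifer with storativity S and
% transmissivity T.
\frac{q_s(t)}{Q_w} = \operatorname{erfc}\!\left( \sqrt{\frac{d^{2} S}{4 T t}} \right)
```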

  1. Lowering the Barrier to Reproducible Research by Publishing Provenance from Common Analytical Tools

    NASA Astrophysics Data System (ADS)

    Jones, M. B.; Slaughter, P.; Walker, L.; Jones, C. S.; Missier, P.; Ludäscher, B.; Cao, Y.; McPhillips, T.; Schildhauer, M.

    2015-12-01

    Scientific provenance describes the authenticity, origin, and processing history of research products and promotes scientific transparency by detailing the steps in computational workflows that produce derived products. These products include papers, findings, input data, software products to perform computations, and derived data and visualizations. The geosciences community values this type of information, and, at least theoretically, strives to base conclusions on computationally replicable findings. In practice, capturing detailed provenance is laborious and thus has been a low priority; beyond a lab notebook describing methods and results, few researchers capture and preserve detailed records of scientific provenance. We have built tools for capturing and publishing provenance that integrate into analytical environments that are in widespread use by geoscientists (R and Matlab). These tools lower the barrier to provenance generation by automating capture of critical information as researchers prepare data for analysis, develop, test, and execute models, and create visualizations. The 'recordr' library in R and the 'matlab-dataone' library in Matlab provide shared functions to capture provenance with minimal changes to normal working procedures. Researchers can capture both scripted and interactive sessions, tag and manage these executions as they iterate over analyses, and then prune and publish provenance metadata and derived products to the DataONE federation of archival repositories. Provenance traces conform to the ProvONE model extension of W3C PROV, enabling interoperability across tools and languages. The capture system supports fine-grained versioning of science products and provenance traces. By assigning global identifiers such as DOIs, researchers can cite the computational processes used to reach findings. And, finally, DataONE has built a web portal to search, browse, and clearly display provenance relationships between input data, the software

  2. Additives

    NASA Technical Reports Server (NTRS)

    Smalheer, C. V.

    1973-01-01

    The chemistry of lubricant additives is discussed to show what the additives are chemically and what functions they perform in the lubrication of various kinds of equipment. Current theories regarding the mode of action of lubricant additives are presented. The additive groups discussed include the following: (1) detergents and dispersants, (2) corrosion inhibitors, (3) antioxidants, (4) viscosity index improvers, (5) pour point depressants, and (6) antifouling agents.

  3. Applying decision-making tools to national e-waste recycling policy: an example of Analytic Hierarchy Process.

    PubMed

    Lin, Chun-Hsu; Wen, Lihchyi; Tsai, Yue-Mi

    2010-05-01

    As policy making is in essence a process of discussion, decision-making tools have in many cases been proposed to resolve the differences of opinion among the different parties. In our project that sought to promote a country's performance in recycling, we used the Analytic Hierarchy Process (AHP) to evaluate the possibilities and determine the priority of the addition of new mandatory recycled waste, also referred to as Due Recycled Wastes, from candidate waste appliances. The evaluation process started with the collection of data based on telephone interviews and field investigations to understand the behavior of consumers as well as their overall opinions regarding the disposal of certain waste appliances. With the data serving as background information, the research team then implemented the Analytic Hierarchy Process using the information, which formed an incomplete hierarchy structure, in order to determine the priority for recycling. Since the number of objects to be evaluated exceeded the number that the AHP researchers had suggested, we reclassified the objects into four groups and added one more level of pair-wise comparisons, which substantially reduced the inconsistency in the judgments of the AHP participants. The project was found to serve as a flexible and achievable application of AHP to the environmental policy-making process. In addition, based on the outcomes of the project as a whole, the research team drew conclusions regarding the government's need to take back 15 of the items evaluated, and suggested instruments that could be used or recycling regulations that could be changed in the future. Further analysis of the top three items recommended by the results of the evaluation for recycling, namely Compact Disks, Cellular Phones and Computer Keyboards, was then conducted to clarify their concrete feasibility. After the trial period for recycling ordered by the Taiwan Environmental Protection Administration, only Computer
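
    A minimal sketch of the AHP computation underlying such an evaluation: priorities come from the principal eigenvector of the pairwise comparison matrix, and the consistency ratio (CR) flags incoherent judgments. The matrix below is hypothetical, not the project's data:

```python
# Hypothetical AHP priority vector and consistency check for 3 criteria.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],     # pairwise comparison matrix (Saaty scale)
              [1/3., 1.0, 3.0],
              [1/5., 1/3., 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
i = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, i].real)
w /= w.sum()                                  # normalized priority vector

n = A.shape[0]
CI = (eigvals.real[i] - n) / (n - 1)          # consistency index
RI = {3: 0.58, 4: 0.90, 5: 1.12}[n]           # Saaty's random index
print("priorities:", w.round(3), " CR =", round(CI / RI, 3))
```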

  4. Common plants as alternative analytical tools to monitor heavy metals in soil

    PubMed Central

    2012-01-01

    Background Herbaceous plants are common vegetal species generally exposed, for a limited period of time, to bioavailable environmental pollutants. Heavy metals contamination is the most common form of environmental pollution. Herbaceous plants have never been used as natural bioindicators of environmental pollution, in particular to monitor the amount of heavy metals in soil. In this study, we aimed to assess the usefulness of three herbaceous plants (Plantago major L., Taraxacum officinale L. and Urtica dioica L.) and one leguminous plant (Trifolium pratense L.) as alternative indicators to evaluate soil pollution by heavy metals. Results We employed Inductively Coupled Plasma Atomic Emission Spectroscopy (ICP-AES) to assess the concentration of selected heavy metals (Cu, Zn, Mn, Pb, Cr and Pd) in soil and plants, and we employed statistical analyses to describe the linear correlation between the accumulation of some heavy metals and the selected vegetal species. We found that the leaves of Taraxacum officinale L. and Trifolium pratense L. can accumulate Cu in a linearly dependent manner, with Urtica dioica L. representing the vegetal species accumulating the highest fraction of Pb. Conclusions In this study we demonstrated that common plants can be used as an alternative analytical tool for monitoring selected heavy metals in soil. PMID:22594441

  5. Topological data analysis: A promising big data exploration tool in biology, analytical chemistry and physical chemistry.

    PubMed

    Offroy, Marc; Duponchel, Ludovic

    2016-03-03

    An important feature of experimental science is that data of various kinds are being produced at an unprecedented rate. This is mainly due to the development of new instrumental concepts and experimental methodologies. It is also clear that the nature of acquired data is significantly different. Indeed, in every area of science, data take the form of ever bigger tables, where all but a few of the columns (i.e. variables) turn out to be irrelevant to the questions of interest, and where we do not necessarily know which coordinates are the interesting ones. Big data in labs of biology, analytical chemistry or physical chemistry is a future that might be closer than any of us supposes. It is in this sense that new tools have to be developed in order to explore and valorize such data sets. Topological data analysis (TDA) is one of these. It was developed recently by topologists who discovered that topological concepts could be useful for data analysis. The main objective of this paper is to answer the question of why topology is well suited for the analysis of big data sets in many areas, and even more efficient than conventional data analysis methods. Raman analysis of single bacteria provides a good opportunity to demonstrate the potential of TDA for the exploration of various spectroscopic data sets under different experimental conditions (with high noise levels, with/without spectral preprocessing, with wavelength shift, with different spectral resolutions, and with missing data).
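
    A self-contained sketch of the most basic TDA computation, 0-dimensional persistent homology, is given below: connected components of a point cloud are "born" at threshold zero and "die" when they merge as a distance threshold grows, which is equivalent to tracking single-linkage merges. The point cloud is synthetic:

```python
# 0-dimensional persistence of a point cloud via Kruskal-style union-find.
import numpy as np

def h0_persistence(points):
    """Return the merge (death) distances of H0 features."""
    n = len(points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    edges = sorted((d[i, j], i, j) for i in range(n) for j in range(i + 1, n))
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path compression
            x = parent[x]
        return x

    deaths = []
    for dist, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:                 # two components merge: one H0 class dies
            parent[ri] = rj
            deaths.append(dist)
    return deaths                    # n - 1 deaths in total

pts = np.vstack([np.random.randn(20, 2), np.random.randn(20, 2) + 8.0])
print(sorted(h0_persistence(pts))[-3:])  # one large death reveals 2 clusters
```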

  6. Introducing diffusing wave spectroscopy as a process analytical tool for pharmaceutical emulsion manufacturing.

    PubMed

    Reufer, Mathias; Machado, Alexandra H E; Niederquell, Andreas; Bohnenblust, Katharina; Müller, Beat; Völker, Andreas Charles; Kuentz, Martin

    2014-12-01

    Emulsions are widely used for pharmaceutical, food, and cosmetic applications. To guarantee that their critical quality attributes meet specifications, it is desirable to monitor the emulsion manufacturing process. However, finding a suitable process analyzer has so far remained challenging. This article introduces diffusing wave spectroscopy (DWS) as an at-line technique to follow the manufacturing process of a model oil-in-water pharmaceutical emulsion containing xanthan gum. The DWS results were complemented with mechanical rheology, microscopy analysis, and stability tests. DWS is an advanced light scattering technique that assesses microrheology and, in general, provides information on the dynamics and statics of dispersions. The microrheology results showed good agreement with those obtained by bulk rheology. Although no notable changes in the rheological behavior of the model emulsions were observed during homogenization, the intensity correlation function provided qualitative information on the evolution of the emulsion dynamics. These data, together with static measurements of the transport mean free path (l*), correlated very well with the changes in droplet size distribution occurring during emulsion homogenization. This study shows that DWS is a promising process analytical technology tool for the development and manufacturing of pharmaceutical emulsions.
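
    Two standard relations underlie DWS microrheology of this kind, quoted here for orientation (beta is an instrumental coherence factor and a the probe radius; neither symbol's value is taken from the paper): the Siegert relation links the measured intensity correlation to the field correlation, and the generalized Stokes-Einstein relation converts the probes' mean-square displacement into a complex shear modulus:

```latex
% Siegert relation and generalized Stokes-Einstein relation (Laplace domain).
g_2(\tau) - 1 = \beta \,\lvert g_1(\tau) \rvert^{2}, \qquad
\tilde{G}(s) = \frac{k_B T}{\pi a \, s \, \langle \Delta \tilde{r}^{2}(s) \rangle}
```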

  7. Narrative health research: exploring big and small stories as analytical tools.

    PubMed

    Sools, Anneke

    2013-01-01

    In qualitative health research many researchers use a narrative approach to study lay health concepts and experiences. In this article, I explore the theoretical linkages between the concepts narrative and health, which are used in a variety of ways. The article builds on previous work that conceptualizes health as a multidimensional, positive, dynamic and morally dilemmatic yet meaningful practice. I compare big and small stories as analytical tools to explore what narrative has to offer to address, nuance and complicate five challenges in narrative health research: (1) the interplay between health and other life issues; (2) the taken-for-granted yet rare character of the experience of good health; (3) coherence or incoherence as norms for good health; (4) temporal issues; (5) health as moral practice. In this article, I do not present research findings per se; rather, I use two interview excerpts for methodological and theoretical reflections. These interview excerpts are derived from a health promotion study in the Netherlands, which was partly based on peer-to-peer interviews. I conclude with a proposal to advance narrative health research by sensitizing researchers to different usages of both narrative and health, and the interrelationship(s) between the two.

  8. Net analyte signal standard addition method (NASSAM) as a novel spectrofluorimetric and spectrophotometric technique for simultaneous determination, application to assay of melatonin and pyridoxine

    NASA Astrophysics Data System (ADS)

    Asadpour-Zeynali, Karim; Bastami, Mohammad

    2010-02-01

    In this work a new modification of the standard addition method, called the "net analyte signal standard addition method" (NASSAM), is presented for simultaneous spectrofluorimetric and spectrophotometric analysis. The proposed method combines the advantages of the standard addition method with those of the net analyte signal concept. The method can be applied for the determination of an analyte in the presence of known interferents. In contrast to the H-point standard addition method, the accuracy of the predictions does not depend on the shape of the analyte and interferent spectra. The method was successfully applied to the simultaneous spectrofluorimetric and spectrophotometric determination of pyridoxine (PY) and melatonin (MT) in synthetic mixtures and in a pharmaceutical formulation.

  9. Net analyte signal standard addition method for simultaneous determination of sulphadiazine and trimethoprim in bovine milk and veterinary medicines.

    PubMed

    Hajian, Reza; Mousavi, Esmat; Shams, Nafiseh

    2013-06-01

    The net analyte signal standard addition method has been used for the simultaneous spectrophotometric determination of sulphadiazine and trimethoprim in bovine milk and veterinary medicines. The method combines the advantages of the standard addition method with the net analyte signal concept, which enables the extraction of information concerning a certain analyte from spectra of multi-component mixtures. The method has several advantages: it uses the full spectrum, it does not require separate calibration and prediction steps, and only a few measurements are required for the determination. Cloud point extraction, based on the phenomenon of solubilisation, was used to extract sulphadiazine and trimethoprim from bovine milk; it relies on the induction of micellar organised media using Triton X-100 as the extraction solvent. At the optimum conditions, the norm of the NAS vectors increased linearly with concentration in the range of 1.0-150.0 μmol L(-1) for both sulphadiazine and trimethoprim. The limits of detection (LOD) for sulphadiazine and trimethoprim were 0.86 and 0.92 μmol L(-1), respectively.
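
    Both of these records build on the net analyte signal: the part of a measured spectrum orthogonal to the space spanned by all other contributions. A hedged statement of the construction, in standard chemometrics notation rather than the papers' exact symbols:

```latex
% r: measured spectrum; S_{-k}: matrix whose columns span the spectra of all
% species except analyte k; ^+ denotes the Moore-Penrose pseudoinverse.
\mathbf{nas}_k = \left( \mathbf{I} - \mathbf{S}_{-k}\mathbf{S}_{-k}^{+} \right)\mathbf{r}
```

    The norm of this vector responds to the concentration of analyte k alone, so plotting it against the added standard reproduces the familiar standard addition extrapolation.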

  10. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire Prevention Equivalent Level of Safety Analysis § 102-80.120 What analytical and empirical tools should be... used as part of an analysis, an assessment of the predictive capabilities of the fire models must...

  11. An Internet compendium of analytical methods and spectroscopic information for monomers and additives used in food packaging plastics.

    PubMed

    Gilbert, J; Simoneau, C; Cote, D; Boenke, A

    2000-10-01

    An internet website (http://cpf.jrc.it/smt/) has been produced as a means of disseminating methods of analysis and supporting spectroscopic information on monomers and additives used for food contact materials (principally packaging). The site, which is aimed primarily at assisting food control laboratories in the European Union, contains analytical information on monomers, starting substances and additives used in the manufacture of plastics materials. A searchable index is provided giving PM and CAS numbers for each of 255 substances. For each substance a data sheet gives regulatory information, chemical structures, physico-chemical information and background information on the use of the substance in particular plastics and on the food packaging applications. For monomers and starting substances (155 compounds) and for additives (100 compounds), the infra-red and mass spectra are provided; additionally, proton NMR spectra are available for about 50% of the entries. Where analytical methods have been developed for determining these substances as residual amounts in plastics or as trace amounts in food simulants, these methods are also on the website. All information is provided in portable document file (PDF) format, which means that high-quality copies can be readily printed using freely available Adobe Acrobat Reader software. The website will in future be maintained and updated by the European Commission's Joint Research Centre (JRC) as new substances are authorized for use by the European Commission (DG-ENTR, formerly DGIII). Where analytical laboratories (food control or other) require reference substances, these can be obtained free of charge from a reference collection housed at the JRC and maintained in conjunction with this website compendium.

  12. Big Data Geo-Analytical Tool Development for Spatial Analysis Uncertainty Visualization and Quantification Needs

    NASA Astrophysics Data System (ADS)

    Rose, K.; Bauer, J. R.; Baker, D. V.

    2015-12-01

    As big data computing capabilities are increasingly paired with spatial analytical tools and approaches, there is a need to ensure that the uncertainty associated with the datasets used in these analyses is adequately incorporated and portrayed in results. Often the products of spatial analyses, big data and otherwise, are developed using discontinuous, sparse, and often point-driven data to represent continuous phenomena. Results from these analyses are generally presented without clear explanations of the uncertainty associated with the interpolated values. The Variable Grid Method (VGM) offers a flexible approach designed for application to a variety of analyses where there is a need to study, evaluate, and analyze spatial trends and patterns while maintaining a connection to, and communicating the uncertainty in, the underlying spatial datasets. The VGM outputs a simultaneous visualization of the spatial data analyses and a quantification of the underlying uncertainties, which can be calculated from sample density, sample variance, interpolation error, or uncertainty derived from multiple simulations. In this presentation we will show how we are utilizing Hadoop to store and perform spatial analysis through the development of custom Spark and MapReduce applications that incorporate ESRI Hadoop libraries. The team will present custom 'Big Data' geospatial applications that run on the Hadoop cluster and integrate ESRI ArcMap with the team's probabilistic VGM approach. The VGM-Hadoop tool has been specially built as a multi-step MapReduce application running on the Hadoop cluster for the purpose of data reduction. This reduction is accomplished by generating multi-resolution, non-overlapping, attributed topology that is then further processed using ESRI's geostatistical analyst to convey a probabilistic model of a chosen study region. Finally, we will share our approach for implementation of data reduction and topology generation

  13. Molecularly imprinted polymers: an analytical tool for the determination of benzimidazole compounds in water samples.

    PubMed

    Cacho, Carmen; Turiel, Esther; Pérez-Conde, Concepción

    2009-05-15

    Molecularly imprinted polymers (MIPs) for benzimidazole compounds have been synthesized by precipitation polymerization using thiabendazole (TBZ) as template, methacrylic acid as functional monomer, ethyleneglycol dimethacrylate (EDMA) and divinylbenzene (DVB) as cross-linkers and a mixture of acetonitrile and toluene as porogen. The experiments carried out by molecularly imprinted solid phase extraction (MISPE) in cartridges demonstrated the imprint effect in both imprinted polymers. MIP-DVB enabled a much higher breakthrough volume than MIP-EDMA, and thus was selected for further experiments. The ability of this MIP for the selective recognition of other benzimidazole compounds (albendazole, benomyl, carbendazim, fenbendazole, flubendazole and fuberidazole) was evaluated. The obtained results revealed the high selectivity of the imprinted polymer towards all the selected benzimidazole compounds. An off-line analytical methodology based on a MISPE procedure has been developed for the determination of benzimidazole compounds in tap, river and well water samples at concentration levels below the legislated maximum concentration levels (MCLs) with quantitative recoveries. Additionally, an on-line preconcentration procedure based on the use of a molecularly imprinted polymer as selective stationary phase in HPLC is proposed as a fast screening method for the evaluation of the presence of benzimidazole compounds in water samples.

  14. The Broad Application of Data Science and Analytics: Essential Tools for the Liberal Arts Graduate

    ERIC Educational Resources Information Center

    Cárdenas-Navia, Isabel; Fitzgerald, Brian K.

    2015-01-01

    New technologies and data science are transforming a wide range of organizations into analytics-intensive enterprises. Despite the resulting demand for graduates with experience in the application of analytics, though, undergraduate education has been slow to change. The academic and policy communities have engaged in a decade-long conversation…

  15. Repurposing mainstream CNC machine tools for laser-based additive manufacturing

    NASA Astrophysics Data System (ADS)

    Jones, Jason B.

    2016-04-01

    The advent of laser technology has been a key enabler for industrial 3D printing, known as Additive Manufacturing (AM). Despite their commercial success and unique technical capabilities, laser-based AM systems are not yet able to produce parts with the same accuracy and surface finish as CNC machining. To enable the geometry and material freedoms afforded by AM, yet achieve the precision and productivity of CNC machining, hybrid combinations of these two processes have started to gain traction. To achieve the benefits of combined processing, laser technology has been integrated into mainstream CNC machines - effectively repurposing them as hybrid manufacturing platforms. This paper reviews how this engineering challenge has prompted beam delivery innovations to allow automated changeover between laser processing and machining, using standard CNC tool changers. Handling laser-processing heads with the tool changer also enables automated changeover between different types of laser-processing heads, further expanding the breadth of laser processing flexibility in a hybrid CNC. This paper highlights the development, challenges and future impact of hybrid CNCs on laser processing.

  16. The Rack-Gear Tool Generation Modelling. Non-Analytical Method Developed in CATIA, Using the Relative Generating Trajectories Method

    NASA Astrophysics Data System (ADS)

    Teodor, V. G.; Baroiu, N.; Susac, F.; Oancea, N.

    2016-11-01

    The modelling of a family of surfaces associated with a pair of rolling centrodes, when the profile of the rack-gear's teeth is known by direct measurement as a coordinate matrix, has as its goal the determination of the generating quality for an imposed kinematics of the relative motion of the tool with respect to the blank. In this way, it is possible to determine the generating geometrical error, as a component of the total error. The generation modelling allows highlighting the potential errors of the generating tool, in order to correct its profile prior to using the tool in the machining process. A method developed in CATIA is proposed, based on a new approach, namely the method of "relative generating trajectories". The analytical foundations are presented, together with applications to known models of rack-gear type tools used on Maag teething machines.

  17. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data

    PubMed Central

    Ben-Ari Fuchs, Shani; Lieder, Iris; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-01-01

    Abstract Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial missing step in the current era is, however, our limited understanding of the biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are the gene set enrichment tools. This study reports on GeneAnalytics™ (geneanalytics.genecards.org), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs LifeMap Science's GeneCards suite, including GeneCards®, the human gene database; MalaCards, the human disease database; and PathCards, the biological pathways database. Expression-based analysis in GeneAnalytics relies on LifeMap Discovery®, the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling an advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as its visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analysis and interpretation tool for the translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, and nutrigenomics.
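    GeneAnalytics' scoring algorithms are proprietary, but the classic building block of gene set enrichment tools of this kind is a hypergeometric overlap test. A minimal sketch (Python/SciPy; the gene names are invented):

    ```python
    from scipy.stats import hypergeom

    def enrichment_p(query_genes, gene_set, background_size):
        """P-value that the overlap between a query gene list and an annotated
        gene set is at least as large as observed, under random sampling."""
        query = set(query_genes)
        annotated = set(gene_set)
        k = len(query & annotated)      # observed overlap
        # Survival function at k-1 gives P(X >= k) for the hypergeometric law.
        return hypergeom.sf(k - 1, background_size, len(annotated), len(query))

    # Toy example: 5 of 40 query genes fall in a 100-gene pathway, out of 20,000.
    query = [f"G{i}" for i in range(40)]
    pathway = [f"G{i}" for i in range(3, 8)] + [f"X{i}" for i in range(95)]
    print(enrichment_p(query, pathway, background_size=20000))
    ```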

  18. Algorithms for optimized maximum entropy and diagnostic tools for analytic continuation

    NASA Astrophysics Data System (ADS)

    Bergeron, Dominic; Tremblay, A.-M. S.

    2016-08-01

    Analytic continuation of numerical data obtained in imaginary time or frequency has become an essential part of many branches of quantum computational physics. It is, however, an ill-conditioned procedure and thus a hard numerical problem. The maximum-entropy approach, based on Bayesian inference, is the most widely used method to tackle that problem. Although the approach is well established and among the most reliable and efficient ones, useful developments of the method and of its implementation are still possible. In addition, while a few free software implementations are available, a well-documented, optimized, general purpose, and user-friendly software dedicated to that specific task is still lacking. Here we analyze all aspects of the implementation that are critical for accuracy and speed and present a highly optimized approach to maximum entropy. Original algorithmic and conceptual contributions include (1) numerical approximations that yield a computational complexity that is almost independent of temperature and spectrum shape (including sharp Drude peaks in a broad background, for example) while ensuring quantitative accuracy of the result whenever the precision of the data is sufficient, (2) a robust method of choosing the entropy weight α that follows from a simple consistency condition of the approach and the observation that information- and noise-fitting regimes can be identified clearly from the behavior of χ² with respect to α, and (3) several diagnostics to assess the reliability of the result. Benchmarks with test spectral functions of different complexity and an example with an actual physical simulation are presented. Our implementation, which covers most typical cases for fermions, bosons, and response functions, is available as open source, user-friendly software.
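    For orientation, the standard Bayesian maximum-entropy formulation that such implementations optimize can be written as follows (generic notation, not necessarily identical to the paper's conventions): the spectrum A(ω) minimizes χ² penalized by the entropy S relative to a default model m(ω), with α the entropy weight discussed above.

    ```latex
    % Generic maximum-entropy objective for analytic continuation (illustrative
    % notation): data G(tau_n), kernel K, default model m, spectrum A >= 0.
    \begin{align}
      \chi^2[A] &= \sum_n \frac{\left( G(\tau_n) - \int d\omega\, K(\tau_n,\omega)\, A(\omega) \right)^2}{\sigma_n^2}, \\
      S[A]      &= \int d\omega \left[ A(\omega) - m(\omega) - A(\omega)\,\ln\frac{A(\omega)}{m(\omega)} \right], \\
      A_\alpha  &= \operatorname*{arg\,min}_{A \ge 0} \left( \tfrac{1}{2}\,\chi^2[A] - \alpha\, S[A] \right).
    \end{align}
    ```

    The α-selection rule described in the abstract then amounts to locating the crossover between the information-fitting and noise-fitting regimes in the behavior of χ²(α).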

  19. EPIPOI: A user-friendly analytical tool for the extraction and visualization of temporal parameters from epidemiological time series

    PubMed Central

    2012-01-01

    Background: There is an increasing need for processing and understanding relevant information generated by the systematic collection of public health data over time. However, the analysis of those time series usually requires advanced modeling techniques, which are not necessarily mastered by staff, technicians and researchers working in public health and epidemiology. Here a user-friendly tool, EPIPOI, is presented that facilitates the exploration and extraction of parameters describing trends, seasonality and anomalies that characterize epidemiological processes. It also enables the inspection of those parameters across geographic regions. Although the visual exploration and extraction of relevant parameters from time series data is crucial in epidemiological research, until now it had been largely restricted to specialists. Methods: EPIPOI is freely available software developed in Matlab (The Mathworks Inc) that runs both on PC and Mac computers. Its friendly interface guides users intuitively through useful comparative analyses including the comparison of spatial patterns in temporal parameters. Results: EPIPOI is able to handle complex analyses in an accessible way. A prototype has already been used to assist researchers in a variety of contexts, from didactic use in public health workshops to serving as the main analytical tool in published research. Conclusions: EPIPOI can assist public health officials and students to explore time series data using a broad range of sophisticated analytical and visualization tools. It also provides an analytical environment where even advanced users can benefit by enabling a higher degree of control over model assumptions, such as those associated with detecting disease outbreaks and pandemics. PMID:23153033
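    EPIPOI itself is a Matlab GUI; the kind of temporal-parameter extraction it performs can be illustrated with a short harmonic-regression sketch (Python/NumPy is used here for self-containment, and the function and parameter names are hypothetical):

    ```python
    import numpy as np

    def extract_seasonality(series, period=52, n_harmonics=2):
        """Fit a linear trend plus Fourier harmonics to a weekly series and
        return the trend slope, seasonal amplitude, and peak timing -- the kind
        of temporal parameters EPIPOI-style tools extract."""
        t = np.arange(series.size)
        cols = [np.ones(t.size), t.astype(float)]
        for h in range(1, n_harmonics + 1):
            cols.append(np.sin(2 * np.pi * h * t / period))
            cols.append(np.cos(2 * np.pi * h * t / period))
        X = np.column_stack(cols)
        beta, *_ = np.linalg.lstsq(X, series, rcond=None)
        seasonal = X[:, 2:] @ beta[2:]          # harmonic part only
        return {"trend_slope": float(beta[1]),
                "seasonal_amplitude": float(seasonal.max() - seasonal.min()) / 2,
                "peak_week": int(np.argmax(seasonal[:period]))}

    # Toy weekly series with annual seasonality peaking near week 0.
    rng = np.random.default_rng(1)
    t = np.arange(5 * 52)
    y = 100 + 0.05 * t + 30 * np.cos(2 * np.pi * t / 52) + rng.normal(0, 5, t.size)
    print(extract_seasonality(y))
    ```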

  20. HRMAS NMR spectroscopy combined with chemometrics as an alternative analytical tool to control cigarette authenticity.

    PubMed

    Shintu, Laetitia; Caldarelli, Stefano; Campredon, Mylène

    2013-11-01

    In this paper, we present for the first time the use of high-resolution magic angle spinning nuclear magnetic resonance (HRMAS NMR) spectroscopy combined with chemometrics as an alternative tool for the characterization of tobacco products from different commercial international brands as well as for the identification of counterfeits. Although cigarette filling is a very complex chemical mixture, we were able to discriminate between dark, bright, and additive-free cigarette blends belonging to six different filter-cigarette brands, commercially available, using an approach for which no extraction procedure is required. Second, we focused our study on a specific worldwide-distributed brand for which established counterfeits were available. We discriminated those from their genuine counterparts with 100% accuracy using unsupervised multivariate statistical analysis. The counterfeits that we analyzed showed a higher amount of nicotine and solanesol and a lower content of sugars, all endogenous tobacco leaf metabolites. This preliminary study demonstrates the great potential of HRMAS NMR spectroscopy to help in controlling cigarette authenticity.
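    The unsupervised multivariate analysis referred to above is typically a principal component decomposition of the binned spectra, with clustering of the scores separating genuine from counterfeit samples. A minimal sketch of that step (Python/NumPy, with synthetic data standing in for HRMAS NMR spectra):

    ```python
    import numpy as np

    def pca_scores(spectra, n_components=2):
        """Unsupervised projection of spectra (rows = samples, columns =
        chemical-shift bins) onto principal components via SVD."""
        X = spectra - spectra.mean(axis=0)          # mean-center each variable
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        return U[:, :n_components] * s[:n_components]

    # Toy data: two groups differing in a few "metabolite" bins.
    rng = np.random.default_rng(2)
    genuine = rng.normal(1.0, 0.05, (10, 200))
    fake = rng.normal(1.0, 0.05, (10, 200))
    fake[:, 20:25] += 0.5       # higher nicotine-like signal
    fake[:, 100:110] -= 0.4     # lower sugar-like signal
    scores = pca_scores(np.vstack([genuine, fake]))
    print(scores[:10, 0].mean(), scores[10:, 0].mean())  # groups separate on PC1
    ```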

  1. Rapid Quantification of Melamine in Different Brands/Types of Milk Powders Using Standard Addition Net Analyte Signal and Near-Infrared Spectroscopy

    PubMed Central

    2016-01-01

    Multivariate calibration (MVC) and near-infrared (NIR) spectroscopy have demonstrated potential for rapid analysis of melamine in various dairy products. However, the practical application of ordinary MVC can be largely restricted because the prediction of a new sample from an uncalibrated batch would be subject to a significant bias due to matrix effect. In this study, the feasibility of using NIR spectroscopy and the standard addition (SA) net analyte signal (NAS) method (SANAS) for rapid quantification of melamine in different brands/types of milk powders was investigated. In SANAS, the NAS vector of melamine in an unknown sample as well as in a series of samples added with melamine standards was calculated and then the Euclidean norms of the series standards were used to build a straightforward univariate regression model. The analysis results of 10 different brands/types of milk powders with melamine levels of 0-0.12% (w/w) indicate that SANAS obtained accurate results with the root mean squared error of prediction (RMSEP) values ranging from 0.0012 to 0.0029. An additional advantage of NAS is to visualize and control the possible unwanted variations during standard addition. The proposed method will provide a practically useful tool for rapid and nondestructive quantification of melamine in different brands/types of milk powders. PMID:27525154
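    Once the NAS norms are computed, SANAS reduces to a classic standard-addition regression; the extrapolation step looks like this (a sketch assuming a linear response, with simulated numbers rather than the paper's data):

    ```python
    import numpy as np

    def standard_addition_estimate(added, response):
        """Classic standard-addition calibration: regress the instrument response
        (here it would be the Euclidean norm of the melamine NAS vector) against
        the added analyte level and extrapolate to zero response."""
        slope, intercept = np.polyfit(added, response, 1)
        return intercept / slope    # estimated level in the unspiked sample

    # Toy example: true level 0.04% (w/w), spikes of 0, 0.02, 0.04, 0.06%.
    added = np.array([0.0, 0.02, 0.04, 0.06])
    true_level = 0.04
    response = 1000.0 * (true_level + added) \
        + np.random.default_rng(3).normal(0, 0.5, 4)
    print(standard_addition_estimate(added, response))   # ~0.04
    ```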

  2. The Global War on Terrorism: Analytical Support, Tools and Metrics of Assessment. MORS Workshop

    DTIC Science & Technology

    2005-08-11

    Metrics of Assessment (Working Group 3) The accompanying Excel workbook contains two worksheets. The first is a Tools versus Questions worksheet and the...emphasis on transnational actors. have similar missions with respect to cri - describing the success or failure to support Academics, US government...sheet tools, GIS; Microsoft Project show great promise • Encourage MORS Sponsors to contact the various agencies to find out what tools and

  3. The efficacy of violence prediction: a meta-analytic comparison of nine risk assessment tools.

    PubMed

    Yang, Min; Wong, Stephen C P; Coid, Jeremy

    2010-09-01

    Actuarial risk assessment tools are used extensively to predict future violence, but previous studies comparing their predictive accuracies have produced inconsistent findings as a result of various methodological issues. We conducted meta-analyses of the effect sizes of 9 commonly used risk assessment tools and their subscales to compare their predictive efficacies for violence. The effect sizes were extracted from 28 original reports published between 1999 and 2008, which assessed the predictive accuracy of more than one tool. We used a within-subject design to improve statistical power and multilevel regression models to disentangle random effects of variation between studies and tools and to adjust for study features. All 9 tools and their subscales predicted violence at about the same moderate level of predictive efficacy with the exception of Psychopathy Checklist-Revised (PCL-R) Factor 1, which predicted violence only at chance level among men. Approximately 25% of the total variance was due to differences between tools, whereas approximately 85% of heterogeneity between studies was explained by methodological features (age, length of follow-up, different types of violent outcome, sex, and sex-related interactions). Sex-differentiated efficacy was found for a small number of the tools. If the intention is only to predict future violence, then the 9 tools are essentially interchangeable; the selection of which tool to use in practice should depend on what other functions the tool can perform rather than on its efficacy in predicting violence. The moderate level of predictive accuracy of these tools suggests that they should not be used solely for some criminal justice decision making that requires a very high level of accuracy such as preventive detention.
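    The paper's models are multilevel regressions; as a simpler stand-in that shows the basic mechanics of pooling effect sizes while allowing between-study heterogeneity, here is a DerSimonian-Laird random-effects sketch (Python/NumPy; the numbers are invented):

    ```python
    import numpy as np

    def dersimonian_laird(effects, variances):
        """Random-effects pooling of per-study effect sizes: estimate the
        between-study variance tau^2 from Cochran's Q, then re-weight."""
        effects = np.asarray(effects, float)
        v = np.asarray(variances, float)
        w = 1.0 / v
        fixed = np.sum(w * effects) / np.sum(w)
        q = np.sum(w * (effects - fixed) ** 2)      # Cochran's Q
        df = effects.size - 1
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (q - df) / c)               # between-study variance
        w_star = 1.0 / (v + tau2)
        pooled = np.sum(w_star * effects) / np.sum(w_star)
        se = np.sqrt(1.0 / np.sum(w_star))
        return pooled, se, tau2

    # Toy effect sizes (e.g. Cohen's d) from five studies of one tool.
    print(dersimonian_laird([0.55, 0.62, 0.48, 0.70, 0.58],
                            [0.01, 0.02, 0.015, 0.03, 0.01]))
    ```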

  4. Complex source beam: A tool to describe highly focused vector beams analytically

    SciTech Connect

    Orlov, S.; Peschel, U.

    2010-12-15

    The scalar-complex-source model is used to develop an accurate description of highly focused radially, azimuthally, linearly, and circularly polarized monochromatic vector beams. We investigate the power and full beam widths at half maximum of rigorous Maxwell equation solutions. The analytical expressions are employed to compare the vector complex source beams with the real beams produced by various high-numerical-aperture (NA) focusing systems. We find a parameter set for which the spatial extents of the analytical beams are the same as those of experimentally realized ones. We ensure the same shape of the considered beams by investigating the overlap of the complex source beams with high-NA beams. We demonstrate that the analytical expressions are good approximations for realistic highly focused beams.
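    For orientation, the scalar complex-source construction on which such vector models are built is standard (illustrative form; the paper's vector beams are derived from solutions of this type): displacing a point source to an imaginary position turns a spherical wave into a beam.

    ```latex
    % Standard scalar complex-source construction (illustrative): a point source
    % displaced to the imaginary position z = ib radiates a beam whose Rayleigh
    % range is b; near the axis it reduces to a Gaussian beam of waist
    % w_0 = sqrt(2b/k).
    \begin{equation}
      U(\mathbf{r}) = \frac{e^{ikR}}{4\pi R}, \qquad
      R = \sqrt{x^2 + y^2 + (z - i b)^2}.
    \end{equation}
    ```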

  5. The Blackbird Whistling or Just After? Vygotsky's Tool and Sign as an Analytic for Writing

    ERIC Educational Resources Information Center

    Imbrenda, Jon-Philip

    2016-01-01

    Based on Vygotsky's theory of the interplay of the tool and sign functions of language, this study presents a textual analysis of a corpus of student-authored texts to illuminate aspects of development evidenced through the dialectical tension of tool and sign. Data were drawn from a series of reflective memos I authored during a seminar for new…

  6. The Efficacy of Violence Prediction: A Meta-Analytic Comparison of Nine Risk Assessment Tools

    ERIC Educational Resources Information Center

    Yang, Min; Wong, Stephen C. P.; Coid, Jeremy

    2010-01-01

    Actuarial risk assessment tools are used extensively to predict future violence, but previous studies comparing their predictive accuracies have produced inconsistent findings as a result of various methodological issues. We conducted meta-analyses of the effect sizes of 9 commonly used risk assessment tools and their subscales to compare their…

  7. Benefits and limitations of using decision analytic tools to assess uncertainty and prioritize Landscape Conservation Cooperative information needs

    USGS Publications Warehouse

    Post van der Burg, Max; Cullinane Thomas, Catherine; Holcombe, Tracy R.; Nelson, Richard D.

    2016-01-01

    The Landscape Conservation Cooperatives (LCCs) are a network of partnerships throughout North America that are tasked with integrating science and management to support more effective delivery of conservation at a landscape scale. In order to achieve this integration, some LCCs have adopted the approach of providing their partners with better scientific information in an effort to facilitate more effective and coordinated conservation decisions. Taking this approach has led many LCCs to begin funding research to provide the information for improved decision making. To ensure that funding goes to research projects with the highest likelihood of leading to more integrated broad scale conservation, some LCCs have also developed approaches for prioritizing which information needs will be of most benefit to their partnerships. We describe two case studies in which decision analytic tools were used to quantitatively assess the relative importance of information for decisions made by partners in the Plains and Prairie Potholes LCC. The results of the case studies point toward a few valuable lessons about using these tools with LCCs. Decision analytic tools tend to help shift focus away from research oriented discussions and toward discussions about how information is used in making better decisions. However, many technical experts do not have enough knowledge about decision making contexts to fully inform the latter type of discussion. When assessed in the right decision context, however, decision analyses can point out where uncertainties actually affect optimal decisions and where they do not. This helps technical experts understand that not all research is valuable in improving decision making. But perhaps most importantly, our results suggest that decision analytic tools may be more useful for LCCs as a way of developing integrated objectives for coordinating partner decisions across the landscape, rather than simply for ranking research priorities.
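    A common decision-analytic quantity used to rank information needs in this way is the expected value of perfect information (EVPI); a minimal sketch for a finite decision table (Python/NumPy; the payoffs are hypothetical, not the case-study values):

    ```python
    import numpy as np

    def evpi(utilities, probs):
        """Expected value of perfect information for a finite decision table:
        utilities[a, s] = payoff of action a under state s, probs[s] = P(state s)."""
        utilities = np.asarray(utilities, float)
        probs = np.asarray(probs, float)
        best_expected = np.max(utilities @ probs)                 # decide now
        expected_best = np.dot(np.max(utilities, axis=0), probs)  # learn state first
        return expected_best - best_expected

    # Toy: two conservation actions under two hydrological states.
    print(evpi([[10, 2],
                [4, 8]], [0.6, 0.4]))   # 9.2 - 6.8 = 2.4
    ```

    An uncertainty with zero EVPI does not change the optimal action, which is exactly the "not all research is valuable" point made above.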

  8. Tracking and Visualizing Student Effort: Evolution of a Practical Analytics Tool for Staff and Student Engagement

    ERIC Educational Resources Information Center

    Nagy, Robin

    2016-01-01

    There is an urgent need for our educational system to shift assessment regimes from a narrow, high-stakes focus on grades, to more holistic definitions that value the qualities that lifelong learners will need. The challenge for learning analytics in this context is to deliver actionable assessments of these hard-to-quantify qualities, valued by…

  9. Enterprise Systems Value-Based R&D Portfolio Analytics: Methods, Processes, and Tools

    DTIC Science & Technology

    2014-01-14

    in database and data warehousing, data mining and machine learning, risk analysis and optimization, as well as applied analytics. Practitioners...analyzing historical time series data to provide insights regarding future decisions. • Data mining – which involves mining transactional data bases...

  10. Adsorptive micro-extraction techniques--novel analytical tools for trace levels of polar solutes in aqueous media.

    PubMed

    Neng, N R; Silva, A R M; Nogueira, J M F

    2010-11-19

    A novel enrichment technique, adsorptive μ-extraction (AμE), is proposed for trace analysis of polar solutes in aqueous media. The preparation, stability tests and development of the analytical devices using two geometrical configurations, i.e. bar adsorptive μ-extraction (BAμE) and multi-spheres adsorptive μ-extraction (MSAμE), are fully discussed. Of the several sorbent materials tested, activated carbons and polystyrene-divinylbenzene phases demonstrated the best stability and robustness and proved the most suitable for analytical purposes. Both BAμE and MSAμE devices showed remarkable performance for the determination of trace levels of polar solutes and metabolites (e.g. pesticides, disinfection by-products, drugs of abuse and pharmaceuticals) in water matrices and biological fluids. Compared with stir bar sorptive extraction based on a polydimethylsiloxane phase, AμE techniques attain great effectiveness, overcoming the limitations of the latter enrichment approach for the more polar solutes. Furthermore, convenient sensitivity and selectivity are reached with AμE techniques, since the great advantage of this new analytical technology is the possibility of choosing the most suitable sorbent for each particular type of application. The proposed enrichment techniques are cost-effective, easy to prepare and work up, and robust, making them a remarkable analytical tool for trace analysis of priority solutes in areas of recognized importance such as the environment, forensics and other related life sciences.

  11. Earth Science Data Analytics: Bridging Tools and Techniques with the Co-Analysis of Large, Heterogeneous Datasets

    NASA Technical Reports Server (NTRS)

    Kempler, Steve; Mathews, Tiffany

    2016-01-01

    The continuum of ever-evolving data management systems affords great opportunities for the enhancement of knowledge and the facilitation of science research. To take advantage of these opportunities, it is essential to understand and develop methods that enable data relationships to be examined and the information to be manipulated. This presentation describes the efforts of the Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster to understand, define, and facilitate the implementation of ESDA to advance science research. Because little has been published on Earth science data analytics, the cluster has defined ESDA along with 10 goals to set the framework for a common understanding of the tools and techniques that are available, and those still needed, to support ESDA.

  12. Redundancy in neutron activation analysis: A valuable tool in assuring analytical quality

    SciTech Connect

    Greenberg, R.R.

    1996-12-31

    Neutron activation analysis (NAA) has become widely used and is extremely valuable for the certification of standard reference materials (SRMs) at the National Institute of Standards and Technology (NIST), for a number of reasons. First, NAA has essentially no significant sources of error in common with the other analytical techniques used at NIST to measure inorganic concentrations. This is important because most certified elemental concentrations are derived from data determined by two (and occasionally more) independent analytical techniques. Two or more techniques are used for SRM certification because, although each technique has previously been evaluated and shown to be accurate, unexpected problems can arise, especially when analyzing new matrices. Another reason for the use of NAA for SRM certification is the potential of this technique for accuracy. SRM measurements with estimated accuracies of 1 to 2% (at essentially 95% confidence intervals) are routinely made at NIST using NAA.

  13. Interactive Poster: A Proposal for Sharing User Requirements for Visual Analytic Tools

    SciTech Connect

    Scholtz, Jean

    2009-10-11

    Although many in the community have advocated user-centered evaluations for visual analytic environments, a significant barrier exists. The users targeted by the visual analytics community (law enforcement personnel, professional information analysts, financial analysts, health care analysts, etc.) are often inaccessible to researchers. These analysts are extremely busy and their work environments and data are often classified or at least confidential. Furthermore, their tasks often last weeks or even months. It is simply not feasible to do such long-term observations to understand their jobs. How then can we hope to gather enough information about the diverse user populations to understand their needs? Some researchers have been successful in working with different end-users, including the author. A reasonable approach, therefore, would be to find a way to share user information. This paper outlines a proposal for developing a handbook of user profiles for use by researchers, developers, and evaluators.

  14. DNA-only cascade: a universal tool for signal amplification, enhancing the detection of target analytes.

    PubMed

    Bone, Simon M; Hasick, Nicole J; Lima, Nicole E; Erskine, Simon M; Mokany, Elisa; Todd, Alison V

    2014-09-16

    Diagnostic tests performed in the field or at the site of patient care would benefit from using a combination of inexpensive, stable chemical reagents and simple instrumentation. Here, we have developed a universal "DNA-only Cascade" (DoC) to quantitatively detect target analytes with increased speed. The DoC utilizes quasi-circular structures consisting of temporarily inactivated deoxyribozymes (DNAzymes). The catalytic activity of the DNAzymes is restored in a universal manner in response to a broad range of environmental and biological targets. The present study demonstrates DNAzyme activation in the presence of metal ions (Pb²⁺), small molecules (deoxyadenosine triphosphate) and nucleic acids homologous to genes from meningitis-causing bacteria. Furthermore, DoC efficiently discriminates nucleic acid targets differing by a single nucleotide. When detection of analytes is orchestrated by functional nucleic acids, the inclusion of DoC reagents substantially decreases the time to detection and allows analyte quantification. The detection of nucleic acids using DoC was further characterized for its capability to be multiplexed and to retain its functionality following long-term exposure to ambient temperatures and in a background of complex medium (human serum).

  15. The Facial Aesthetic index: An additional tool for assessing treatment need

    PubMed Central

    Sundareswaran, Shobha; Ramakrishnan, Ranjith

    2016-01-01

    Objectives: Facial aesthetics, a major consideration in orthodontic diagnosis and treatment planning, may not be judged correctly and completely by simply analyzing dental occlusion or osseous structures. Despite this importance, there is no index to guarantee availability of treatment or to prioritize patients based on their soft tissue treatment needs. Individuals having well-aligned teeth but unaesthetic convex profiles do not get included for treatment under current malocclusion indices. The aim of this investigation is to develop an aesthetic index based on facial profiles which could be used as an additional tool alongside malocclusion indices. Materials and Methods: A chart showing typical facial profile changes due to underlying malocclusions was generated by soft tissue manipulations of standardized profile photographs of a well-balanced male and female face. A panel of 62 orthodontists judged the profile photographs of 100 patients with different soft tissue patterns to assess profile variations and treatment need. The index was later tested in a cross-section of the school population. Statistical analysis was done using the "irr" package of the R environment, version 2.15.1. Results: The index exhibited very good reliability in determining profile variations (Fleiss kappa 0.866, P < 0.001), excellent reproducibility (kappa 0.9078), and high sensitivity and specificity (95.7%). Testing in the population yielded excellent agreement among orthodontists (kappa 0.9286). Conclusions: A new Facial Aesthetic index, based on the patient's soft tissue profile requirements, is proposed, which can complement existing indices to ensure treatment for those in need. PMID:27127752
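    The agreement statistic reported above was computed with the R "irr" package; for reference, Fleiss' kappa for many raters can be computed directly as follows (Python/NumPy sketch; the photograph counts are invented):

    ```python
    import numpy as np

    def fleiss_kappa(ratings):
        """Fleiss' kappa for agreement among many raters; ratings[i, j] = number
        of raters assigning subject i to category j (raters per subject constant)."""
        ratings = np.asarray(ratings, float)
        n = ratings.sum(axis=1)[0]                  # raters per subject
        p_j = ratings.sum(axis=0) / ratings.sum()   # category proportions
        p_i = (np.sum(ratings ** 2, axis=1) - n) / (n * (n - 1))
        p_bar, p_e = p_i.mean(), np.sum(p_j ** 2)
        return (p_bar - p_e) / (1 - p_e)

    # Toy: 4 profile photographs rated by 6 judges into 3 profile classes.
    counts = np.array([[6, 0, 0],
                       [5, 1, 0],
                       [0, 6, 0],
                       [1, 1, 4]])
    print(fleiss_kappa(counts))
    ```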

  16. Infrared Spectroscopy as a Versatile Analytical Tool for the Quantitative Determination of Antioxidants in Agricultural Products, Foods and Plants

    PubMed Central

    Cozzolino, Daniel

    2015-01-01

    Spectroscopic methods provide very useful qualitative and quantitative information about the biochemistry and chemistry of antioxidants. Near infrared (NIR) and mid infrared (MIR) spectroscopy are considered powerful, fast, accurate and non-destructive analytical tools that can serve as a replacement for traditional chemical analysis. In recent years, several reports can be found in the literature demonstrating the usefulness of these methods in the analysis of antioxidants in different organic matrices. This article reviews recent applications of infrared (NIR and MIR) spectroscopy to the analysis of antioxidant compounds in a wide range of samples such as agricultural products, foods and plants. PMID:26783838

  17. The use of Permeation Liquid Membrane (PLM) as an analytical tool for trace metal speciation studies in natural waters

    NASA Astrophysics Data System (ADS)

    Parthasarathy, N.; Pelletier, M.; Buffle, J.

    2003-05-01

    Permeation liquid membrane (PLM), based on liquid-liquid extraction principles, is an emerging analytical tool for making in situ trace metal speciation measurements. A PLM comprising didecyl-1,10-diaza-crown-ether and lauric acid in phenylhexane/toluene has been developed for measuring free metal ion concentrations (e.g. Cu, Pb, Cd and Zn) under natural water conditions. The capability of PLM for speciation studies has been demonstrated using synthetic and natural ligands. Applications of specially designed hollow-fibre PLMs to the in situ preconcentration of trace metals in diverse waters are reported.

  18. The use of analytical surface tools in the fundamental study of wear. [atomic nature of wear

    NASA Technical Reports Server (NTRS)

    Buckley, D. H.

    1977-01-01

    Various techniques and surface tools available for the study of the atomic nature of the wear of materials are reviewed. These include chemical etching, x-ray diffraction, electron diffraction, scanning electron microscopy, low-energy electron diffraction, Auger emission spectroscopy analysis, electron spectroscopy for chemical analysis, field ion microscopy, and the atom probe. Properties of the surface and wear-surface regions which affect wear, such as surface energy, crystal structure, crystallographic orientation, mode of dislocation behavior, and cohesive binding, are discussed. A number of mechanisms involved in the generation of wear particles are identified with the aid of the aforementioned tools.

  19. Segmented post-column analyte addition; a concept for continuous response control of liquid chromatography/mass spectrometry peaks affected by signal suppression/enhancement.

    PubMed

    Kaufmann, Anton; Butcher, Patrick

    2005-01-01

    A novel technique, "segmented post-column analyte addition", is proposed to visualize and compensate for signal suppression/enhancement effects in electrospray ionization tandem mass spectrometry (ESI-MS/MS). Instead of delivering a constant flow of analyte solution between the liquid chromatography (LC) column exit and the ESI interface into the eluent resulting from LC separation of analyte-free matrix, in order to determine retention time windows in which suppression/enhancement is unimportant (King et al., J. Am. Soc. Mass Spectrom. 2000; 11: 942), segmented packets of analyte-containing solvent and analyte-free solvent were infused into the LC eluent resulting from separation of an analyte-containing sample. The superimposed periodic spikes obtained are much narrower than the analyte peak eluting from the column. The height of the spikes is affected by signal suppression phenomena to the same extent as the analyte signal, and hence variations in spike height can be used to correct the peak area of analyte peaks affected by signal suppression/enhancement.
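    The correction idea can be summarized numerically: the spike heights trace the local suppression/enhancement factor along the chromatogram, and dividing the analyte signal by that factor before integration recovers the unsuppressed peak area. A rough sketch (Python/NumPy; synthetic chromatogram, not the authors' data or exact procedure):

    ```python
    import numpy as np

    def suppression_corrected_area(times, signal, spike_times, spike_heights, ref_height):
        """Correct an LC/MS analyte peak for matrix suppression/enhancement using
        periodically infused analyte spikes: each spike height, relative to its
        clean-solvent reference, estimates the local response factor."""
        factor = np.interp(times, spike_times, np.asarray(spike_heights) / ref_height)
        corrected = np.asarray(signal) / factor
        dt = times[1] - times[0]        # uniform sampling assumed
        return corrected.sum() * dt

    # Toy run: a Gaussian analyte peak suppressed to ~60% response mid-chromatogram.
    t = np.linspace(0.0, 10.0, 501)
    true_peak = np.exp(-((t - 5.0) ** 2) / 0.5)
    suppression = 1.0 - 0.4 * np.exp(-((t - 5.0) ** 2) / 2.0)
    spike_t = np.arange(0.5, 10.0, 0.5)
    spike_h = 100.0 * (1.0 - 0.4 * np.exp(-((spike_t - 5.0) ** 2) / 2.0))
    print(true_peak.sum() * (t[1] - t[0]),   # true area, for comparison
          suppression_corrected_area(t, true_peak * suppression, spike_t, spike_h, 100.0))
    ```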

  20. INTRODUCTION TO ATTILA (ANALYTICAL TOOLS INTERFACE FOR LANDSCAPE ASSESSMENTS): AN ARCVIEW EXTENSION

    EPA Science Inventory

    Geographic Information Systems (GIS) have become a powerful tool in the field of landscape ecology. A common application of GIS is the generation of landscape indicators, which are quantitative measurements of the status or potential health of an area (e.g. ecological region, wat...

  1. ICL-Based OF-CEAS: A Sensitive Tool for Analytical Chemistry.

    PubMed

    Manfred, Katherine M; Hunter, Katharine M; Ciaffoni, Luca; Ritchie, Grant A D

    2017-01-03

    Optical-feedback cavity-enhanced absorption spectroscopy (OF-CEAS) using mid-infrared interband cascade lasers (ICLs) is a sensitive technique for trace gas sensing. The setup of a V-shaped optical cavity operating with a 3.29 μm cw ICL is detailed, and a quantitative characterization of the injection efficiency, locking stability, mode matching, and detection sensitivity is presented. The experimental data are supported by a model to show how optical feedback affects the laser frequency as it is scanned across several longitudinal modes of the optical cavity. The model predicts that feedback enhancement effects under strongly absorbing conditions can cause underestimations in the measured absorption, and these predictions are verified experimentally. The technique is then used in application to the detection of nitrous oxide as an exemplar of the utility of this technique for analytical gas phase spectroscopy. The analytical performance of the spectrometer, expressed as the noise-equivalent absorption coefficient, was estimated as 4.9 × 10⁻⁹ cm⁻¹ Hz⁻¹/², which compares well with recently reported values.
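    The figure of merit quoted above follows the conventional bandwidth-normalized definition (illustrative; not a derivation from the paper):

    ```latex
    % Conventional bandwidth-normalized sensitivity figure (illustrative):
    \begin{equation}
      \mathrm{NEA} = \alpha_{\min}\,\sqrt{t_{\mathrm{acq}}}
      \quad \left[ \mathrm{cm}^{-1}\,\mathrm{Hz}^{-1/2} \right],
    \end{equation}
    % where alpha_min is the minimum detectable absorption coefficient achieved
    % with acquisition time t_acq.
    ```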

  2. A Multi-facetted Visual Analytics Tool for Exploratory Analysis of Human Brain and Function Datasets

    PubMed Central

    Angulo, Diego A.; Schneider, Cyril; Oliver, James H.; Charpak, Nathalie; Hernandez, Jose T.

    2016-01-01

    Brain research typically requires large amounts of data from different sources, and often of different nature. The use of different software tools adapted to the nature of each data source can make research work cumbersome and time consuming. It follows that data is not often used to its fullest potential thus limiting exploratory analysis. This paper presents an ancillary software tool called BRAVIZ that integrates interactive visualization with real-time statistical analyses, facilitating access to multi-facetted neuroscience data and automating many cumbersome and error-prone tasks required to explore such data. Rather than relying on abstract numerical indicators, BRAVIZ emphasizes brain images as the main object of the analysis process of individuals or groups. BRAVIZ facilitates exploration of trends or relationships to gain an integrated view of the phenomena studied, thus motivating discovery of new hypotheses. A case study is presented that incorporates brain structure and function outcomes together with different types of clinical data. PMID:27601990

  3. Electrochemical treatment of olive mill wastewater: treatment extent and effluent phenolic compounds monitoring using some uncommon analytical tools.

    PubMed

    Belaid, Chokri; Khadraoui, Moncef; Mseddii, Salma; Kallel, Monem; Elleuch, Boubaker; Fauvarque, Jean Frangois

    2013-01-01

    Problems related to industrial effluents can be divided into two parts: (1) their toxicity, associated with their chemical content, which should be removed before discharging the wastewater into the receiving media; and (2) the difficulty of pollution characterisation and monitoring caused by the complexity of these matrices. This investigation deals with both aspects: an electrochemical treatment method for olive mill wastewater (OMW) using platinized expanded titanium electrodes in a modified Grignard reactor for toxicity removal, together with the exploration of some specific analytical tools to monitor the elimination of phenolic compounds from the effluent. The results showed that electrochemical oxidation is able to remove or mitigate the OMW pollution. Indeed, 87% of OMW color was removed and all aromatic compounds disappeared from the solution by anodic oxidation. Moreover, the chemical oxygen demand (COD) and the total organic carbon (TOC) were reduced by 55%. UV-visible spectrophotometry, gas chromatography/mass spectrometry, cyclic voltammetry and 13C nuclear magnetic resonance (NMR) showed that the treatment efficiently eliminates phenolic compounds from OMW. It was concluded that electrochemical oxidation in a modified Grignard reactor is a promising process for the destruction of all phenolic compounds present in OMW. Among the monitoring analytical tools applied, cyclic voltammetry and 13C NMR are introduced for the first time to follow the progress of OMW treatment, and they gave a close insight into polyphenol disappearance.

  4. Drugs on the internet, part IV: Google's Ngram viewer analytic tool applied to drug literature.

    PubMed

    Montagne, Michael; Morgan, Melissa

    2013-04-01

    Google Inc.'s digitized book library can be searched based on key words and phrases over a five-century time frame. Application of the Ngram Viewer to drug literature was assessed for its utility as a research tool. The results appear promising as a method for noting changes in the popularity of specific drugs over time, historical epidemiology of drug use and misuse, and adoption and regulation of drug technologies.

  5. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    SciTech Connect

    Robinson Khosah

    2007-07-31

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project was conducted in two phases. Phase One included the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two involved the development of a platform for on-line data analysis. Phase Two included the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now technically completed.

  6. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2003-03-13

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase 1, which is currently in progress and will take twelve months to complete, will include the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. In Phase 2, which will be completed in the second year of the project, a platform for on-line data analysis will be developed. Phase 2 will include the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its sixth month of Phase 1.

  7. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM 2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2006-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-second month of development activities.

  8. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    SciTech Connect

    Robinson P. Khosah; Frank T. Alex

    2007-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-eighth month of development activities.

  9. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2005-02-01

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its thirtieth month of development activities.

  10. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2005-08-01

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its thirty-sixth month of development activities.

  11. ELISA and GC-MS as Teaching Tools in the Undergraduate Environmental Analytical Chemistry Laboratory

    NASA Astrophysics Data System (ADS)

    Wilson, Ruth I.; Mathers, Dan T.; Mabury, Scott A.; Jorgensen, Greg M.

    2000-12-01

    An undergraduate experiment for the analysis of potential water pollutants is described. Students are exposed to two complementary techniques, ELISA and GC-MS, for the analysis of a water sample containing atrazine, desethylatrazine, and simazine. Atrazine was chosen as the target analyte because of its wide usage in North America and its utility for students to predict environmental degradation products. The water sample is concentrated using solid-phase extraction for GC-MS, or diluted and analyzed using a competitive ELISA test kit for atrazine. The nature of the water sample is such that students generally find that ELISA gives an artificially high value for the concentration of atrazine. Students gain an appreciation for problems associated with measuring pollutants in the aqueous environment: sensitivity, accuracy, precision, and ease of analysis. This undergraduate laboratory provides an opportunity for students to learn several new analysis and sample preparation techniques and to critically evaluate these methods in terms of when they are most useful.
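    Competitive ELISA kits of this kind are normally evaluated with a four-parameter logistic (4PL) calibration curve, inverted to read unknowns; a sketch with invented atrazine standards (Python/SciPy):

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def four_pl(conc, a, b, c, d):
        """Four-parameter logistic: a = max response (zero analyte), d = min
        response, c = IC50, b = slope; competitive ELISA signal falls with analyte."""
        return d + (a - d) / (1 + (conc / c) ** b)

    # Hypothetical atrazine standards (ug/L) and absorbances.
    std_conc = np.array([0.05, 0.1, 0.5, 1.0, 2.5, 5.0])
    absorb = np.array([1.05, 0.95, 0.62, 0.45, 0.28, 0.18])
    popt, _ = curve_fit(four_pl, std_conc, absorb,
                        p0=[1.2, 1.0, 0.5, 0.1], bounds=(0, np.inf))
    a, b, c, d = popt

    def conc_from_abs(y):
        """Invert the fitted 4PL curve to read a concentration from absorbance."""
        return c * ((a - d) / (y - d) - 1) ** (1 / b)

    print(conc_from_abs(0.5))   # sample absorbance -> atrazine estimate (ug/L)
    ```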

  12. Innovative analytical tools to characterize prebiotic carbohydrates of functional food interest.

    PubMed

    Corradini, Claudio; Lantano, Claudia; Cavazza, Antonella

    2013-05-01

    Functional foods are one of the most interesting areas of research and innovation in the food industry. A functional food or functional ingredient is considered to be any food or food component that provides health benefits beyond basic nutrition. Recently, consumers have shown interest in natural bioactive compounds as functional ingredients in the diet owing to their various beneficial effects for health. Water-soluble fibers and nondigestible oligosaccharides and polysaccharides can be defined as functional food ingredients. Fructooligosaccharides (FOS) and inulin are resistant to direct metabolism by the host and reach the caecocolon, where they are used by selected groups of beneficial bacteria. Furthermore, they are able to improve physical and structural properties of food, such as hydration, oil-holding capacity, viscosity, texture, sensory characteristics, and shelf-life. This article reviews major innovative analytical developments to screen and identify FOS, inulins, and the most employed nonstarch carbohydrates added or naturally present in functional food formulations. High-performance anion-exchange chromatography with pulsed electrochemical detection (HPAEC-PED) is one of the most employed analytical techniques for the characterization of those molecules. Mass spectrometry is also of great help, in particular matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS), which is able to provide extensive information regarding the molecular weight and length profiles of oligosaccharides and polysaccharides. Moreover, MALDI-TOF-MS in combination with HPAEC-PED has been shown to be of great value for the complementary information it can provide. Some other techniques, such as NMR spectroscopy, are also discussed, with relevant examples of recent applications. A number of articles have appeared in the literature in recent years regarding the analysis of inulin, FOS, and other carbohydrates of interest in the field and

  13. Improving the analytical performance of hydride generation non-dispersive atomic fluorescence spectrometry. Combined effect of additives and optical filters

    NASA Astrophysics Data System (ADS)

    D'Ulivo, Alessandro; Bramanti, Emilia; Lampugnani, Leonardo; Zamboni, Roberto

    2001-10-01

    The effects of tetrahydroborate and acid concentration and of the presence of L-cysteine and thiourea were investigated in the determination of As, Bi and Sn using continuous flow hydride generation atomic fluorescence spectrometry (HG AFS). The aim was to find conditions allowing the control of those effects that exert a negative influence on the analytical performance of the HG AFS apparatus. The effects taken into account were: (i) the radiation scattering generated by carryover of solution from the gas-liquid separator to the atomizer; (ii) the introduction of molecular species generated by tetrahydroborate decomposition into the atomizer; and (iii) interference effects arising from other elements in the sample matrix and from different acids. Effects (i) and (ii) could be controlled using mild reaction conditions in the HG stage. The effect of HG conditions on carryover was studied by radiation scattering experiments without hydride atomization. Compromise HG conditions were found by studying the effects of tetrahydroborate (0.1-20 g l⁻¹) and acid (0.01-7 mol l⁻¹) concentration, and of the addition of L-cysteine (10 g l⁻¹) and thiourea (0.1 mol l⁻¹), on the HG AFS signals. The effect of optical filters was investigated with the aim of improving the signal-to-noise ratio. Optical filters with peak wavelengths of 190 and 220 nm improved the detection limits by factors of approximately 4 and 2 for As and Te, respectively. Under optimized conditions the detection limits were 6, 5, 3, 2, 2 and 9 ng l⁻¹ for As, Sb, Bi, Sn, Se and Te, respectively. Good tolerance to various acid compositions and sample matrices was obtained by using L-cysteine or thiourea as masking agents. Determination of arsenic in sediment and copper certified reference materials, and of bismuth in steel, sediment, soil and ore certified reference materials, is reported.

  14. Nano-Scale Secondary Ion Mass Spectrometry - A new analytical tool in biogeochemistry and soil ecology

    SciTech Connect

    Herrmann, A M; Ritz, K; Nunan, N; Clode, P L; Pett-Ridge, J; Kilburn, M R; Murphy, D V; O'Donnell, A G; Stockdale, E A

    2006-10-18

    Soils are structurally heterogeneous across a wide range of spatio-temporal scales. Consequently, external environmental conditions do not have a uniform effect throughout the soil, resulting in a large diversity of micro-habitats. It has been suggested that soil function can be studied without explicit consideration of such fine detail, but recent research has indicated that the micro-scale distribution of organisms may be of importance for a mechanistic understanding of many soil functions. Due to a lack of techniques with adequate sensitivity for data collection at appropriate scales, the question 'How important are various soil processes acting at different scales for ecological function?' is challenging to answer. The nano-scale secondary ion mass spectrometer (NanoSIMS) represents the latest generation of ion microprobes, which link high-resolution microscopy with isotopic analysis. The main advantage of NanoSIMS over other secondary ion mass spectrometers is the ability to operate at high mass resolution whilst maintaining both excellent signal transmission and spatial resolution (~50 nm). NanoSIMS has been used previously in studies focusing on presolar materials from meteorites, and in material science, biology, geology and mineralogy. Recently, the potential of NanoSIMS as a new tool in the study of biophysical interfaces in soils has been demonstrated. This paper describes the principles of NanoSIMS and discusses the potential of this tool to contribute to the field of biogeochemistry and soil ecology. Practical considerations (sample size and preparation, simultaneous collection of isotopes, mass resolution, isobaric interference and quantification of the isotopes of interest) are discussed. Adequate sample preparation, which avoids biases in the interpretation of NanoSIMS data due to artifacts, and the identification of regions of interest are the main concerns when using NanoSIMS as a new tool in biogeochemistry and soil ecology. Finally, we review the areas of

  15. Web-based analytical tools for the exploration of spatial data

    NASA Astrophysics Data System (ADS)

    Anselin, Luc; Kim, Yong Wook; Syabri, Ibnu

    This paper deals with the extension of internet-based geographic information systems with functionality for exploratory spatial data analysis (ESDA). The specific focus is on methods to identify and visualize outliers in maps of rates or proportions. Three sets of methods are included: extreme value maps, smoothed rate maps and the Moran scatterplot. The implementation is carried out by means of a collection of Java classes that extend the Geotools open source mapping software toolkit. The web-based spatial analysis tools are illustrated with applications to the study of homicide rates and cancer rates in U.S. counties.
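
    Of the three method sets, the Moran scatterplot is the most algorithmic: each standardized rate is plotted against its spatial lag (the weighted average over its neighbours), the slope of the fitted line is Moran's I, and points in the off-diagonal quadrants flag potential spatial outliers. A minimal numpy sketch, assuming a row-standardized spatial weights matrix W (variable names are illustrative, not the paper's Java classes):

        import numpy as np

        def moran_scatterplot(rates, W):
            """Standardized values z, their spatial lags Wz, and Moran's I."""
            z = (rates - rates.mean()) / rates.std()
            lag = W @ z                      # average over each region's neighbours
            morans_I = (z @ lag) / (z @ z)   # slope of lag ~ z for row-standardized W
            return z, lag, morans_I

        # Toy example: four regions on a line, neighbours share an edge.
        W = np.array([[0.0, 1.0, 0.0, 0.0],
                      [0.5, 0.0, 0.5, 0.0],
                      [0.0, 0.5, 0.0, 0.5],
                      [0.0, 0.0, 1.0, 0.0]])
        z, lag, I = moran_scatterplot(np.array([2.0, 3.0, 8.0, 9.0]), W)
        print(I)   # high-high/low-low points cluster; off-diagonal points are outliers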

  16. Public Interest Energy Research (PIER) Program Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    SciTech Connect

    Xu, Tengfang; Flapper, Joris; Ke, Jing; Kramer, Klaas; Sathaye, Jayant

    2012-02-01

    The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry, covering four dairy products - cheese, fluid milk, butter, and milk powder. The BEST-Dairy tool developed in this project provides three options for the user to benchmark each of the dairy products included in the tool, with the options differentiated by the level of detail at which a process or plant is described, i.e., (1) plant level, (2) process-group level, and (3) process-step level. For each detail level, the tool accounts for differences in production and other variables affecting energy use in dairy processes. The BEST-Dairy tool can be applied to a wide range of dairy facilities to provide energy and water savings estimates, based upon comparisons with the best available reference cases, which were established by reviewing information from international and national samples. We performed and completed alpha- and beta-testing (field testing) of the BEST-Dairy tool, through which feedback from voluntary users in the U.S. dairy industry was gathered to validate and improve the tool's functionality. BEST-Dairy v1.2 was formally published in May 2011, and has been made available for free download from the internet (i.e., http://best-dairy.lbl.gov). A user's manual has been developed and published as the companion documentation for use with the BEST-Dairy tool. In addition, we carried out technology transfer activities by engaging the dairy industry in the process of tool development and testing, including field testing, technical presentations, and technical assistance throughout the project. To date, users from more than ten countries in addition to those in the U.S. have downloaded BEST-Dairy from the LBNL website. It is expected that the use of the BEST-Dairy tool will advance understanding of energy and water
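
    At its core, a benchmarking tool of this kind compares a plant's measured energy (or water) intensity against a best-practice reference intensity at the chosen detail level. A minimal sketch of the plant-level comparison (all numbers are hypothetical, not BEST-Dairy's reference data):

        # Plant-level energy-intensity benchmarking (illustrative numbers only).
        production_t = 12_000            # annual fluid-milk production, tonnes
        energy_use_GJ = 9_600            # annual site energy use, GJ
        benchmark_GJ_per_t = 0.65        # hypothetical best-practice reference

        intensity = energy_use_GJ / production_t                       # GJ per tonne
        savings_GJ = max(0.0, intensity - benchmark_GJ_per_t) * production_t
        print(f"intensity {intensity:.2f} GJ/t, potential savings {savings_GJ:.0f} GJ/yr")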

  17. A Cognitive Tool for Teaching the Addition/Subtraction of Common Fractions: A Model of Affordances

    ERIC Educational Resources Information Center

    Kong, Siu Cheung; Kwok, Lam For

    2005-01-01

    The aim of this research is to devise a cognitive tool for meeting the diverse needs of learners for comprehending new procedural knowledge. A model of affordances on teaching fraction equivalence for developing procedural knowledge for adding/subtracting fractions with unlike denominators was derived from the results of a case study of an initial…

  18. Study of an ultrasound-based process analytical tool for homogenization of nanoparticulate pharmaceutical vehicles.

    PubMed

    Cavegn, Martin; Douglas, Ryan; Akkermans, Guy; Kuentz, Martin

    2011-08-01

    There are currently no adequate process analyzers for nanoparticulate viscosity enhancers. This article aims to evaluate ultrasonic resonator technology as a monitoring tool for the homogenization of nanoparticulate gels. Aqueous dispersions of colloidal microcrystalline cellulose (MCC) and a mixture of clay particles with xanthan gum were compared with colloidal silicon dioxide in oil. The processing was conducted using a laboratory-scale homogenizing vessel. The study first investigated the homogenization kinetics of the different systems and then focused on process factors in the case of colloidal MCC. Moreover, rheological properties were analyzed offline to assess the structure of the resulting gels. Results showed the suitability of ultrasound velocimetry for monitoring the homogenization process. The obtained data were fitted using a novel heuristic model. It was possible to identify characteristic homogenization times for each formulation. The subsequent study of the process factors demonstrated that ultrasonic process analysis was as sensitive as offline rheological measurements in detecting subtle manufacturing changes. It can be concluded that the ultrasonic method was able to successfully assess the homogenization of nanoparticulate viscosity enhancers. This novel technique can become a vital tool for the development and production of pharmaceutical suspensions in the future.

  19. Theory-Led Design of Instruments and Representations in Learning Analytics: Developing a Novel Tool for Orchestration of Online Collaborative Learning

    ERIC Educational Resources Information Center

    Kelly, Nick; Thompson, Kate; Yeoman, Pippa

    2015-01-01

    This paper describes theory-led design as a way of developing novel tools for learning analytics (LA). It focuses upon the domain of automated discourse analysis (ADA) of group learning activities to help an instructor to orchestrate online groups in real-time. The paper outlines the literature on the development of LA tools within the domain of…

  20. Recognizing ancient papyri by a combination of spectroscopic, diffractional and chromatographic analytical tools.

    PubMed

    Łojewska, J; Rabin, I; Pawcenis, D; Bagniuk, J; Aksamit-Koperska, M A; Sitarz, M; Missori, M; Krutzsch, M

    2017-04-06

    Ancient papyri are a written heritage of the culture that flourished more than 3000 years ago in Egypt. One of the most significant collections in the world is housed in the Egyptian Museum and Papyrus Collection in Berlin, from which the samples for our investigation come. The papyrologists, curators and conservators of such collections search intensely for the analytical detail that would allow ancient papyri to be distinguished from modern fabrications, in order to detect possible forgeries, assess papyrus deterioration state, and improve the design of storage conditions and conservation methods. This has become the aim of our investigation. The samples were studied by a number of methods, including spectroscopic (FTIR, fluorescence - FS, Raman), diffractional (XRD) and chromatographic (size exclusion chromatography - SEC) techniques, selected in order to determine degradation parameters: overall oxidation of the lignocellulosic material, and the degree of polymerization and crystallinity of cellulose. The results were correlated with those obtained from carefully selected model samples, including modern papyri and paper of different composition aged at elevated temperature in humid air. The methods were ranked in the order SEC > FS > FTIR > XRD, based on their effectiveness in discriminating the state of papyri degradation. However, the most trustworthy evaluation of the age of papyri samples should rely on several methods.

  1. Magnetic optical sensor particles: a flexible analytical tool for microfluidic devices.

    PubMed

    Ungerböck, Birgit; Fellinger, Siegfried; Sulzer, Philipp; Abel, Tobias; Mayr, Torsten

    2014-05-21

    In this study we evaluate magnetic optical sensor particles (MOSePs) with incorporated sensing functionalities regarding their applicability in microfluidic devices. MOSePs can be separated from the surrounding solution to form in situ sensor spots within microfluidic channels, while read-out is accomplished outside the chip. These magnetic sensor spots exhibit the benefits of sensor layers (high brightness and convenient usage) combined with the advantages of dispersed sensor particles (ease of integration). The accumulation characteristics of MOSePs with different diameters were investigated, as well as the in situ sensor spot stability at varying flow rates. Magnetic sensor spots were stable at flow rates typical of microfluidic applications. Furthermore, MOSePs were optimized for fiber optic and imaging read-out systems, and different referencing schemes were critically discussed using oxygen sensors as an example. While the fiber optic sensing system delivered precise and accurate results for measurements in microfluidic channels, limitations due to analyte consumption were found for microscopic oxygen imaging. A compensation strategy is provided, which utilizes simple pre-conditioning by exposure to light. Finally, new application possibilities enabled by the use of MOSePs were addressed: they can be used for microscopic oxygen imaging in any chip with optically transparent covers, can serve as flexible sensor spots to monitor enzymatic activity, or can be applied to form fixed sensor spots inside microfluidic structures that would be inaccessible to the integration of sensor layers.

  2. Analytical tools employed to determine pharmaceutical compounds in wastewaters after application of advanced oxidation processes.

    PubMed

    Afonso-Olivares, Cristina; Montesdeoca-Esponda, Sarah; Sosa-Ferrera, Zoraida; Santana-Rodríguez, José Juan

    2016-12-01

    Today, the presence of contaminants in the environment is a topic of interest for society in general and for the scientific community in particular. A very large number of different chemical substances reach the environment after passing through wastewater treatment plants without being eliminated, owing to the inefficiency of conventional removal processes and the lack of government regulations. The list of compounds entering treatment plants is gradually becoming longer and more varied because most of these compounds come from pharmaceuticals, hormones or personal care products, which are increasingly used by modern society. To address this growing variety of emerging pollutants, new and more efficient removal technologies are needed. Different advanced oxidation processes (AOPs), especially photochemical AOPs, have been proposed as supplements to traditional treatments for the elimination of pollutants, showing significant advantages over the use of conventional methods alone. This work reviews the analytical methodologies employed for the analysis of pharmaceutical compounds in wastewater in studies in which advanced oxidation processes are applied. Due to the low concentrations of these substances in wastewater, mass spectrometry detectors are usually chosen to meet the low detection limits and identification power required. Specifically, time-of-flight detectors are required to analyse the by-products.

  3. Recognizing ancient papyri by a combination of spectroscopic, diffractional and chromatographic analytical tools

    PubMed Central

    Łojewska, J.; Rabin, I.; Pawcenis, D.; Bagniuk, J.; Aksamit-Koperska, M. A.; Sitarz, M.; Missori, M.; Krutzsch, M.

    2017-01-01

    Ancient papyri are a written heritage of the culture that flourished more than 3000 years ago in Egypt. One of the most significant collections in the world is housed in the Egyptian Museum and Papyrus Collection in Berlin, from which the samples for our investigation come. The papyrologists, curators and conservators of such collections search intensely for the analytical detail that would allow ancient papyri to be distinguished from modern fabrications, in order to detect possible forgeries, assess papyrus deterioration state, and improve the design of storage conditions and conservation methods. This has become the aim of our investigation. The samples were studied by a number of methods, including spectroscopic (FTIR, fluorescence - FS, Raman), diffractional (XRD) and chromatographic (size exclusion chromatography - SEC) techniques, selected in order to determine degradation parameters: overall oxidation of the lignocellulosic material, and the degree of polymerization and crystallinity of cellulose. The results were correlated with those obtained from carefully selected model samples, including modern papyri and paper of different composition aged at elevated temperature in humid air. The methods were ranked in the order SEC > FS > FTIR > XRD, based on their effectiveness in discriminating the state of papyri degradation. However, the most trustworthy evaluation of the age of papyri samples should rely on several methods. PMID:28382971

  4. Puzzle test: A tool for non-analytical clinical reasoning assessment

    PubMed Central

    Monajemi, Alireza; Yaghmaei, Minoo

    2016-01-01

    Most contemporary clinical reasoning tests typically assess non-automatic thinking. A test is therefore needed to measure automatic reasoning, or pattern recognition, which has been largely neglected in clinical reasoning tests. The Puzzle Test (PT) is dedicated to assessing automatic clinical reasoning in routine situations. The test was first introduced in 2009 by Monajemi et al. in the Olympiad for Medical Sciences Students. PT is an item format that has gained acceptance in medical education, but no detailed guidelines exist for this test's format, construction and scoring. In this article, a format is described and the steps to prepare and administer valid and reliable PTs are presented. PT examines a specific clinical reasoning task: pattern recognition. PT does not replace other clinical reasoning assessment tools. However, it complements them in strategies for assessing comprehensive clinical reasoning. PMID:28210603

  5. Puzzle test: A tool for non-analytical clinical reasoning assessment.

    PubMed

    Monajemi, Alireza; Yaghmaei, Minoo

    2016-01-01

    Most contemporary clinical reasoning tests typically assess non-automatic thinking. A test is therefore needed to measure automatic reasoning, or pattern recognition, which has been largely neglected in clinical reasoning tests. The Puzzle Test (PT) is dedicated to assessing automatic clinical reasoning in routine situations. The test was first introduced in 2009 by Monajemi et al. in the Olympiad for Medical Sciences Students. PT is an item format that has gained acceptance in medical education, but no detailed guidelines exist for this test's format, construction and scoring. In this article, a format is described and the steps to prepare and administer valid and reliable PTs are presented. PT examines a specific clinical reasoning task: pattern recognition. PT does not replace other clinical reasoning assessment tools. However, it complements them in strategies for assessing comprehensive clinical reasoning.

  6. Evaluation of Heat Flux Measurement as a New Process Analytical Technology Monitoring Tool in Freeze Drying.

    PubMed

    Vollrath, Ilona; Pauli, Victoria; Friess, Wolfgang; Freitag, Angelika; Hawe, Andrea; Winter, Gerhard

    2017-01-04

    This study investigates the suitability of heat flux measurement as a new technique for monitoring product temperature and critical end points during freeze drying. The heat flux sensor is tightly mounted on the shelf and measures non-invasively (no contact with the product) the heat transferred from shelf to vial. Heat flux data were compared to comparative pressure measurement, thermocouple readings, and Karl Fischer titration as current state-of-the-art monitoring techniques. The whole freeze drying process, including freezing (both by ramp freezing and controlled nucleation) and primary and secondary drying, was considered. We found that direct measurement of the transferred heat enables deeper insight into the thermodynamics of the freezing process. Furthermore, a vial heat transfer coefficient can be calculated from heat flux data, which ultimately provides a non-invasive method to monitor product temperature throughout primary drying. The end point of primary drying determined by heat flux measurements was in accordance with the one defined by thermocouples. During secondary drying, heat flux measurements could not indicate the progress of drying in the way that residual moisture monitoring does. In conclusion, heat flux measurements are a promising new non-invasive tool for lyophilization process monitoring and development using energy transfer as a control parameter.
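
    The vial heat transfer coefficient mentioned above follows directly from the measured flux: it relates the area-specific heat flow to the shelf-to-product temperature difference, and once known it lets the product temperature be inferred from the flux alone. A minimal sketch of that relation (symbols and numbers are illustrative, not the paper's exact formulation):

        # Vial heat transfer coefficient Kv from heat-flux data (illustrative).
        heat_flux_W_m2 = 120.0                  # measured shelf-to-vial heat flux
        T_shelf_C, T_product_C = -10.0, -32.0   # shelf and product temperatures

        Kv = heat_flux_W_m2 / (T_shelf_C - T_product_C)   # W m^-2 K^-1
        # Inverting the relation tracks product temperature non-invasively:
        # T_product = T_shelf - heat_flux / Kv
        print(f"Kv = {Kv:.1f} W m^-2 K^-1")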

  7. Demonstration of FBRM as process analytical technology tool for dewatering processes via CST correlation.

    PubMed

    Cobbledick, Jeffrey; Nguyen, Alexander; Latulippe, David R

    2014-07-01

    The current challenges associated with the design and operation of net-energy positive wastewater treatment plants demand sophisticated approaches for the monitoring of polymer-induced flocculation. In anaerobic digestion (AD) processes, the dewaterability of the sludge is typically assessed from off-line lab-bench tests - the capillary suction time (CST) test is one of the most common. Focused beam reflectance measurement (FBRM) is a promising technique for real-time monitoring of critical performance attributes in large scale processes and is ideally suited for dewatering applications. The flocculation performance of twenty-four cationic polymers, that spanned a range of polymer size and charge properties, was measured using both the FBRM and CST tests. Analysis of the data revealed a decreasing monotonic trend; the samples that had the highest percent removal of particles less than 50 microns in size as determined by FBRM had the lowest CST values. A subset of the best performing polymers was used to evaluate the effects of dosage amount and digestate sources on dewatering performance. The results from this work show that FBRM is a powerful tool that can be used for optimization and on-line monitoring of dewatering processes.
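
    The decreasing monotonic trend reported between the FBRM measure and the CST values can be quantified with a rank correlation, which does not assume linearity. A brief sketch with made-up numbers (Spearman's rho is our choice of illustration, not necessarily the statistic used in the paper):

        import numpy as np
        from scipy.stats import spearmanr

        # Percent removal of counts < 50 um (FBRM) vs. CST (s); illustrative data.
        removal_pct = np.array([85.0, 78.0, 70.0, 62.0, 55.0, 40.0])
        cst_s       = np.array([12.0, 15.0, 19.0, 25.0, 31.0, 48.0])

        rho, p = spearmanr(removal_pct, cst_s)
        print(f"Spearman rho = {rho:.2f} (p = {p:.4f})")  # rho near -1: monotonic decrease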

  8. Terahertz pulsed imaging, a novel process analytical tool to investigate the coating characteristics of push-pull osmotic systems.

    PubMed

    Malaterre, Vincent; Pedersen, Maireadh; Ogorka, Joerg; Gurny, Robert; Loggia, Nicoletta; Taday, Philip F

    2010-01-01

    The aim of this study was to investigate the coating characteristics of push-pull osmotic systems (PPOS) using three-dimensional terahertz pulsed imaging (3D-TPI) and to detect physical alterations potentially impacting the drug release. The terahertz time-domain reflection signal was used to obtain information on both the spatial distribution of the coating thickness and the internal physical mapping of the coating. The results showed that (i) the thickness distribution of PPOS coatings can be non-destructively analysed using 3D-TPI and (ii) internal physical alterations impacting the drug release kinetics were detectable using the terahertz time-domain signal. Based on the results, the potential benefits of implementing 3D-TPI as a quality-control analytical tool are discussed.

  9. Development of a fast analytical tool to identify oil spillages employing infrared spectral indexes and pattern recognition techniques.

    PubMed

    Fresco-Rivera, P; Fernández-Varela, R; Gómez-Carracedo, M P; Ramírez-Villalobos, F; Prada, D; Muniategui, S; Andrade, J M

    2007-11-30

    A fast analytical tool based on attenuated total reflectance mid-IR spectrometry is presented to evaluate the origin of spilled hydrocarbons and to monitor their fate in the environment. Ten spectral band ratios are employed in univariate and multivariate studies (principal components analysis, cluster analysis, density functions - potential curves - and Kohonen self-organizing maps). Two indexes monitor typical photooxidation processes, five are related to aromatic characteristics and three study aliphatic and branched chains. The case study considered here comprises 45 samples taken on beaches (from 2002 to 2005) after the Prestige tanker accident off the Galician coast and 104 samples corresponding to weathering studies conducted on the Prestige's fuel, four typical crude oils and a fuel oil. The univariate studies yield insightful views of the gross chemical evolution, whereas the multivariate studies allow simple and straightforward elucidation of whether unknown samples match the Prestige's fuel. In addition, a good differentiation of the weathering patterns of light and heavy products is obtained.
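
    The workflow described - a handful of diagnostic band ratios per spectrum, then multivariate projection to see whether unknowns fall in the Prestige-fuel cluster - can be sketched briefly. Band positions, data and the use of scikit-learn are illustrative assumptions, not the authors' exact choices:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        def band_ratios(spectrum, wavenumbers, band_pairs):
            """Ratios of absorbances at pairs of diagnostic bands (cm-1)."""
            def absorbance_at(wn):
                return spectrum[np.argmin(np.abs(wavenumbers - wn))]
            return [absorbance_at(a) / absorbance_at(b) for a, b in band_pairs]

        # E.g. a carbonyl/aliphatic ratio tracks photooxidation (pairs illustrative).
        pairs = [(1700, 1460), (1600, 1460), (810, 1460)]

        # X: samples-by-ratios matrix built from many spectra (random stand-in here).
        rng = np.random.default_rng(0)
        X = rng.normal(size=(45, len(pairs)))

        scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
        # Unknowns whose scores fall inside the Prestige-fuel cluster are matches.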

  10. Analytical curve or standard addition method: how to elect and design--a strategy applied to copper determination in sugarcane spirits using AAS.

    PubMed

    Honorato, Fernanda Araujo; Honorato, Ricardo Saldanha; Pimentel, Maria Fernanda; Araujo, Mario Cesar Ugulino

    2002-11-01

    In most instrumental analyses, the analyte concentration is usually obtained by the Analytical Curve Method (ACM) or the Standard Addition Method (SAM). It is therefore important for the analyst to select the most appropriate method, to seek the best conditions of analysis, and to provide parameters of analytical performance. A strategy to do so is proposed in this paper, together with MATLAB software to implement it. The proposed strategy was applied to copper determination by atomic absorption spectrometry in Brazilian sugarcane spirits termed 'Cachaça', and SAM was chosen as the most appropriate method. To select the best experimental design for SAM, the influence of several factors - the number of standard additions, the number of concentration levels, the location of the levels and the average concentration of the standard additions - was demonstrated. The design with six standard additions, four concentration levels located near the lower and upper limits, and an average concentration of the standard additions close to zero yielded a SAM with an adequate compromise between precision, cost and time of analysis. The uniform distribution of concentration levels usually used in routine analysis is not a good design with regard to precision; on the other hand, it is adequate when the linear range is unknown. In general, the proposed strategy can be applied to different instrumental techniques and samples with the aim of improving analytical performance.
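
    For reference, the SAM computation itself is a short linear regression: the signal is regressed on the added concentration, and the sample concentration is the intercept-to-slope ratio (the magnitude of the x-intercept). A minimal sketch with illustrative readings (the 0.0 level is the un-spiked sample):

        import numpy as np

        # Added Cu standard (mg/L) and absorbance readings (illustrative data).
        added  = np.array([0.0, 0.0, 0.1, 0.1, 0.3, 0.3])
        signal = np.array([0.105, 0.101, 0.182, 0.178, 0.338, 0.332])

        slope, intercept = np.polyfit(added, signal, 1)
        c_sample = intercept / slope        # analyte concentration in the sample
        print(f"Cu in sample: {c_sample:.3f} mg/L")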

  11. Towards a green analytical laboratory: microextraction techniques as a useful tool for the monitoring of polluted soils

    NASA Astrophysics Data System (ADS)

    Lopez-Garcia, Ignacio; Viñas, Pilar; Campillo, Natalia; Hernandez Cordoba, Manuel; Perez Sirvent, Carmen

    2016-04-01

    Microextraction techniques are a valuable tool in the analytical laboratory since they allow sensitive measurements of pollutants to be carried out by means of easily available instrumentation. There is a large number of such procedures involving miniaturized liquid-liquid or liquid-solid extractions, with the common denominator of using very small amounts of organic solvents (only a few microliters), or none at all. Since minimal amounts of reagents are involved, and the generation of residues is consequently minimized, the approach falls within the concept of Green Analytical Chemistry. This general methodology is useful for both inorganic and organic pollutants. Thus, low amounts of metallic ions can be measured without the need for ICP-MS, since this instrument can be replaced by a simple AAS spectrometer, which is commonly present in any laboratory and has low acquisition and maintenance costs. When dealing with organic pollutants, the microextracts obtained can be introduced into liquid or gas chromatographs equipped with common detectors, and there is no need for the most sophisticated and expensive mass spectrometers. This communication reports an overview of the advantages of such a methodology and gives examples of the determination of some particular contaminants in soil and water samples. The authors are grateful to the Comunidad Autónoma de la Región de Murcia, Spain (Fundación Séneca, 19888/GERM/15) for financial support.

  12. Photography by Cameras Integrated in Smartphones as a Tool for Analytical Chemistry Represented by a Butyrylcholinesterase Activity Assay.

    PubMed

    Pohanka, Miroslav

    2015-06-11

    Smartphones are popular devices frequently equipped with sensitive sensors and great computational ability. Despite the widespread availability of smartphones, practical uses in analytical chemistry are limited, though some papers have proposed promising applications. In the present paper, a smartphone is used as a tool for the determination of cholinesterasemia, i.e., the determination of the biochemical marker butyrylcholinesterase (BChE). The work demonstrates the suitability of a smartphone-integrated camera for analytical purposes. Paper strips soaked with indoxylacetate were used for the determination of BChE activity, while the standard Ellman's assay was used as a reference measurement. In the smartphone-based assay, BChE converted indoxylacetate to indigo blue and the coloration was photographed using the phone's integrated camera. An RGB color model was analyzed and color values for the individual color channels were determined. The assay was verified using plasma samples and samples containing pure BChE, and validated against Ellman's assay. The smartphone assay proved to be reliable and applicable for routine diagnoses where BChE serves as a marker (liver function tests, some poisonings, etc.). The assay is therefore expected to be of practical use because of the relevance of its results.
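
    The image-analysis step of such an assay reduces to averaging the colour channels over a region of interest on the photographed strip and mapping one channel onto activity through a calibration curve. A minimal Pillow/numpy sketch; the file name, ROI coordinates and calibration coefficients are hypothetical:

        import numpy as np
        from PIL import Image

        img = np.asarray(Image.open("strip_photo.jpg").convert("RGB"), dtype=float)
        roi = img[100:200, 150:250]          # hypothetical region over the test spot

        r, g, b = roi.reshape(-1, 3).mean(axis=0)
        # Indigo formation darkens the spot (red channel drops); a kit-specific
        # calibration (hypothetical coefficients here) maps that onto BChE activity.
        activity = max(0.0, (180.0 - r) * 0.05)
        print(f"R={r:.0f} G={g:.0f} B={b:.0f} -> BChE activity ~ {activity:.2f} (arb.)")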

  13. Photography by Cameras Integrated in Smartphones as a Tool for Analytical Chemistry Represented by a Butyrylcholinesterase Activity Assay

    PubMed Central

    Pohanka, Miroslav

    2015-01-01

    Smartphones are popular devices frequently equipped with sensitive sensors and great computational ability. Despite the widespread availability of smartphones, practical uses in analytical chemistry are limited, though some papers have proposed promising applications. In the present paper, a smartphone is used as a tool for the determination of cholinesterasemia, i.e., the determination of the biochemical marker butyrylcholinesterase (BChE). The work demonstrates the suitability of a smartphone-integrated camera for analytical purposes. Paper strips soaked with indoxylacetate were used for the determination of BChE activity, while the standard Ellman's assay was used as a reference measurement. In the smartphone-based assay, BChE converted indoxylacetate to indigo blue and the coloration was photographed using the phone's integrated camera. An RGB color model was analyzed and color values for the individual color channels were determined. The assay was verified using plasma samples and samples containing pure BChE, and validated against Ellman's assay. The smartphone assay proved to be reliable and applicable for routine diagnoses where BChE serves as a marker (liver function tests, some poisonings, etc.). The assay is therefore expected to be of practical use because of the relevance of its results. PMID:26110404

  14. V3 net charge: additional tool in HIV-1 tropism prediction.

    PubMed

    Montagna, Claudia; De Crignis, Elisa; Bon, Isabella; Re, Maria Carla; Mezzaroma, Ivano; Turriziani, Ombretta; Graziosi, Cecilia; Antonelli, Guido

    2014-12-01

    Genotype-based algorithms are valuable tools for identifying patients eligible for CCR5-inhibitor administration in clinical practice. Among the available methods, geno2pheno[coreceptor] (G2P) is the most widely used online tool for tropism prediction. This study was conceived to assess whether combining the G2P prediction with the V3 peptide net charge (NC) value could improve the accuracy of tropism prediction. A total of 172 V3 bulk sequences from 143 patients were analyzed by G2P and NC values. A phenotypic assay was performed by cloning the complete env gene, and tropism determination was assessed on U87_CCR5(+)/CXCR4(+) cells. Sequences were stratified according to the agreement between NC values and G2P results. Of the sequences predicted as X4 by G2P, 61% showed NC values higher than 5; similarly, 76% of sequences predicted as R5 by G2P had NC values below 4. Sequences with NC values between 4 and 5 were associated with different G2P predictions: 65% of samples were predicted as R5-tropic and 35% as X4-tropic. Sequences identified as X4 by NC value had at least one positive residue at positions known to be involved in tropism prediction and a positive residue at position 32. These data supported the hypothesis that NC values between 4 and 5 could be associated with the presence of dual/mixed-tropic (DM) variants. The phenotypic assay performed on a subset of sequences confirmed the tropism prediction for concordant sequences and showed that NC values between 4 and 5 are associated with DM tropism. These results suggest that the combination of G2P and NC could increase the accuracy of tropism prediction. A more reliable identification of X4 variants would be useful for better selecting candidates for Maraviroc (MVC) administration, and also as a predictive marker of coreceptor switching, which is strongly associated with the phase of infection.
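
    The NC value itself is elementary to compute: positively charged residues (Arg, Lys) minus negatively charged ones (Asp, Glu) across the V3 amino-acid sequence. A minimal sketch combining it with the cut-offs reported above (the example sequence is illustrative):

        def v3_net_charge(seq: str) -> int:
            """Net charge of a V3 amino-acid sequence: (R + K) - (D + E)."""
            seq = seq.upper()
            return sum(seq.count(a) for a in "RK") - sum(seq.count(a) for a in "DE")

        def tropism_hint(nc: int) -> str:
            # Cut-offs follow the study: NC > 5 suggests X4, NC < 4 suggests R5,
            # and 4-5 is associated with dual/mixed (DM) variants.
            if nc > 5:
                return "X4-like"
            return "R5-like" if nc < 4 else "ambiguous (possible DM)"

        v3 = "CTRPNNNTRKSIRIGPGQAFYATGDIIGDIRQAHC"   # illustrative sequence
        nc = v3_net_charge(v3)
        print(nc, tropism_hint(nc))                  # 3 R5-like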

  15. Establishment of a reference collection of additives and an analytical handbook of reference data to support enforcement of EU regulations on food contact plastics.

    PubMed

    van Lierop, B; Castle, L; Feigenbaum, A; Ehlert, K; Boenke, A

    1998-10-01

    A collection has been made of additives that are required as analytical standards for enforcement of European Union legislation on food contact plastics. The 100 additives have been characterized by mass spectrometry, infra-red spectroscopy and proton nuclear magnetic resonance spectroscopy to provide reference spectra. Gas chromatographic retention times have been recorded to facilitate identification by retention index. This information has been further supplemented by physico-chemical data. Finally, chromatographic methods have been used to indicate the presence of any impurities in the commercial chemicals. Samples of the reference substances are available on request and the collection of spectra and other information will be made available in printed format and on-line through the Internet. This paper gives an overview of the work done to establish the reference collection and the spectral atlas, which together will assist enforcement laboratories in the characterization of plastics and the selection of analytical methods for additives that may migrate.

  16. Coordinate swapping in standard addition graphs for analytical chemistry: a simplified path for uncertainty calculation in linear and nonlinear plots.

    PubMed

    Meija, Juris; Pagliano, Enea; Mester, Zoltán

    2014-09-02

    Uncertainty of the result from the method of standard addition is often underestimated due to neglect of the covariance between the intercept and the slope. In order to simplify the data analysis from standard addition experiments, we propose x-y coordinate swapping in conventional linear regression. Unlike the ratio of the intercept and slope, which is the result of the traditional method of standard addition, the result of the inverse standard addition is obtained directly from the intercept of the swapped calibration line. Consequently, the uncertainty evaluation becomes markedly simpler. The method is also applicable to nonlinear curves, such as the quadratic model, without incurring any additional complexity.
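
    The gain can be stated compactly. With the conventional regression of signal y on added concentration x, the result is a ratio whose uncertainty requires the intercept-slope covariance; after swapping coordinates, the result is itself an intercept, whose standard uncertainty is read directly from the regression output. In common notation (a sketch, not the paper's exact symbols):

        % conventional standard addition: regress y on x
        y = a + b x, \qquad \hat{x}_0 = \frac{a}{b}, \qquad
        u^2(\hat{x}_0) = \frac{u^2(a)}{b^2} + \frac{a^2\,u^2(b)}{b^4}
                         - \frac{2a\,\mathrm{cov}(a,b)}{b^3}

        % swapped (inverse) standard addition: regress x on y
        x = c + d y, \qquad \hat{x}_0 = -c, \qquad u(\hat{x}_0) = u(c)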

  17. Evaluation of analytical tools and multivariate methods for quantification of co-former crystals in ibuprofen-nicotinamide co-crystals.

    PubMed

    Soares, Frederico Luis Felipe; Carneiro, Renato Lajarim

    2014-02-01

    Co-crystals are multicomponent substances designed by the addition of two or more different molecules in the same crystallographic pattern, which differs from the crystallographic motifs of the individual co-formers. The addition of highly soluble molecules, such as nicotinamide, to the crystallographic pattern of ibuprofen enhances its solubility more than 7.5-fold, improving the properties of this widely used drug. Several analytical solid-state techniques are used to characterize the ibuprofen-nicotinamide co-crystal, the most widely used being mid-infrared spectroscopy (ATR-FTIR), differential scanning calorimetry (DSC), X-ray powder diffraction (XRPD) and Raman spectroscopy. These analytical solid-state techniques were evaluated for quantifying a mixture of the ibuprofen-nicotinamide co-crystal and its co-formers, in order to develop a calibration model for assessing co-crystal purity after synthesis. Raman spectroscopy, in combination with multivariate calibration tools, gave better results than all the other techniques, presenting lower calibration and prediction errors. The partial least squares regression model gave a mean error lower than 5% for all components present in the mixture. DSC and mid-infrared spectroscopy proved insufficient for quantification of the ternary mixture. XRPD presented good results for quantification of the co-formers, ibuprofen and nicotinamide, but only fair results for the co-crystal. This is the first report of the quantification of the ibuprofen-nicotinamide co-crystal in the presence of its co-formers. Such quantification is of great importance for determining the yield of co-crystallization reactions and the purity of the product obtained.

  18. The role of methanol addition to water samples in reducing analyte adsorption and matrix effects in liquid chromatography-tandem mass spectrometry.

    PubMed

    Li, Wei; Liu, Yucan; Duan, Jinming; Saint, Christopher P; Mulcahy, Dennis

    2015-04-10

    Liquid chromatography-tandem mass spectrometry (LC-MS/MS) analysis coupled with simple water filtering before injection has proven to be an economical and time-saving method for analyzing trace-level organic pollutants in aqueous environments. However, the linearity, precision and detection limits of such methods for late-eluting analytes were found to be much poorer than for early-eluting ones, due to adsorption of the analytes in the operating system (e.g., the sample vial, flow path and sample loop), creating problems in quantitative analysis. Addition of methanol (MeOH) to water samples as a modifier was shown to be effective in alleviating or even eliminating this negative effect on signal intensity for the late-eluting analytes, while at the same time reducing certain matrix effects for real water samples. Based on the maximum detection signal intensity obtained on desorption of the analytes with MeOH addition, the ratio of the detection signal intensity without addition of MeOH to the maximum intensity can be used to evaluate the effectiveness of methanol addition. Accordingly, values of <50%, 50-80% and 80-120% indicate strong, medium and no effects, respectively. Based on this concept, an external matrix-matched calibration method with the addition of MeOH has been successfully established for analyzing fifteen pesticides with diverse physico-chemical properties in surface water and groundwater with good linearity (r²: 0.9929-0.9996), precision (intra-day relative standard deviation (RSD): 1.4-10.7%; inter-day RSD: 1.5-9.4%), accuracy (76.9-126.7%) and low limits of detection (0.003-0.028 μg/L).
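
    The quoted evaluation rule translates directly into code: compute the ratio of the signal without MeOH to the maximum signal obtained with MeOH-assisted desorption, then classify the adsorption effect by the stated thresholds. A minimal sketch (the function name is illustrative):

        def adsorption_effect(signal_no_meoh: float, signal_max_meoh: float) -> str:
            """Classify analyte adsorption using the ratio thresholds from the study."""
            ratio = 100.0 * signal_no_meoh / signal_max_meoh
            if ratio < 50:
                return "strong effect"
            if ratio <= 80:
                return "medium effect"
            if ratio <= 120:
                return "no effect"
            return "check data (apparent enhancement)"

        print(adsorption_effect(3.2e5, 9.5e5))   # ratio ~ 34% -> strong effect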

  19. Alerting strategies in computerized physician order entry: a novel use of a dashboard-style analytics tool in a children's hospital.

    PubMed

    Reynolds, George; Boyer, Dean; Mackey, Kevin; Povondra, Lynne; Cummings, Allana

    2008-11-06

    Utilizing a commercially available business analytics tool offering dashboard-style graphical indicators and a data warehouse strategy, we have developed an interactive, web-based platform that allows near-real-time analysis of CPOE adoption by hospital area and practitioner specialty. Clinical Decision Support (CDS) metrics include the percentage of alerts that result in a change in clinician decision-making. This tool facilitates adjustments in alert limits in order to reduce alert fatigue.

  20. VISAD: an interactive and visual analytical tool for the detection of behavioral anomalies in maritime traffic data

    NASA Astrophysics Data System (ADS)

    Riveiro, Maria; Falkman, Göran; Ziemke, Tom; Warston, Håkan

    2009-05-01

    Monitoring the surveillance of large sea areas normally involves the analysis of huge quantities of heterogeneous data from multiple sources (radars, cameras, automatic identification systems, reports, etc.). The rapid identification of anomalous behavior or any threat activity in the data is an important objective for enabling homeland security. While it is worth acknowledging that many existing mining applications support the identification of anomalous behavior, autonomous anomaly detection systems are rarely used in the real world. There are two main reasons: (1) the detection of anomalous behavior is normally not a well-defined and structured problem, and therefore automatic data mining approaches do not work well; and (2) these systems have difficulty representing and employing the prior knowledge that users bring to their tasks. In order to overcome these limitations, we believe that human involvement in the entire discovery process is crucial. Using a visual analytics process model as a framework, we present VISAD: an interactive, visual knowledge discovery tool for supporting the detection and identification of anomalous behavior in maritime traffic data. VISAD supports the insertion of human expert knowledge in (1) the preparation of the system, (2) the establishment of the normal picture and (3) the actual detection of rare events. For each of these three modules, VISAD implements different layers of data mining, visualization and interaction techniques. Thus, the detection procedure becomes transparent to the user, which increases the user's confidence and trust in the system and, overall, in the whole discovery process.

  1. Commentary on "Theory-Led Design of Instruments and Representations in Learning Analytics: Developing a Novel Tool for Orchestration of Online Collaborative Learning"

    ERIC Educational Resources Information Center

    Teplovs, Chris

    2015-01-01

    This commentary reflects on the contributions to learning analytics and theory by a paper that describes how multiple theoretical frameworks were woven together to inform the creation of a new, automated discourse analysis tool. The commentary highlights the contributions of the original paper, provides some alternative approaches, and touches on…

  2. Recent Additions in the Modeling Capabilities of an Open-Source Wave Energy Converter Design Tool: Preprint

    SciTech Connect

    Tom, N.; Lawson, M.; Yu, Y. H.

    2015-04-20

    WEC-Sim is a midfidelity numerical tool for modeling wave energy conversion devices. The code uses the MATLAB SimMechanics package to solve multibody dynamics and models wave interactions using hydrodynamic coefficients derived from frequency-domain boundary-element methods. This paper presents the new modeling features introduced in the latest release of WEC-Sim. The first feature discussed is the conversion of the fluid memory kernel to a state-space form. This enhancement offers a substantial computational benefit once hydrodynamic body-to-body coefficients are introduced, since the number of interactions increases exponentially with each additional body. Additional features include the ability to calculate the wave-excitation forces based on the instantaneous incident wave angle, allowing the device to weathervane, and the ability to import a user-defined wave elevation time series. A review of the hydrodynamic theory for each feature is provided and the successful implementation is verified using test cases.
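
    The fluid-memory term being converted is the radiation convolution in the Cummins-type equation of motion; replacing the kernel with a fitted linear state-space system makes the cost per time step constant instead of growing with the stored velocity history. In standard notation (a sketch; these are not WEC-Sim's variable names):

        F_{\mathrm{rad}}(t) = -A_{\infty}\,\ddot{x}(t)
                              - \int_0^{t} K(t-\tau)\,\dot{x}(\tau)\,d\tau

        \dot{z}(t) = A_r z(t) + B_r \dot{x}(t), \qquad
        \int_0^{t} K(t-\tau)\,\dot{x}(\tau)\,d\tau \;\approx\; C_r z(t)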

  3. Data-infilling in daily mean river flow records: first results using a visual analytics tool (gapIT)

    NASA Astrophysics Data System (ADS)

    Giustarini, Laura; Parisot, Olivier; Ghoniem, Mohammad; Trebs, Ivonne; Médoc, Nicolas; Faber, Olivier; Hostache, Renaud; Matgen, Patrick; Otjacques, Benoît

    2015-04-01

    Missing data in river flow records represent a loss of information and a serious drawback in water management. An incomplete time series prevents the computation of hydrological statistics and indicators. Also, records with data gaps are not suitable as input or validation data for hydrological or hydrodynamic modelling. In this work we present a visual analytics tool (gapIT), which supports experts in finding the most adequate data-infilling technique for daily mean river flow records. The tool performs an automated calculation of river flow estimates using different data-infilling techniques. Donor station(s) are automatically selected based on Dynamic Time Warping, geographical proximity and upstream/downstream relationships. For each gap the tool computes several flow estimates through various data-infilling techniques, including interpolation, multiple regression, regression trees and neural networks. The visual application gives the user the possibility to select donor station(s) other than those automatically selected. The gapIT software was applied to 24 daily time series of river discharge recorded in Luxembourg over the period 01/01/2007 - 31/12/2013. The method was validated by randomly creating artificial gaps of different lengths and positions along the entire records. Using the RMSE and the Nash-Sutcliffe (NS) coefficient as performance measures, the method was evaluated by comparison with the actual measured discharge values. The application of the gapIT software to artificial gaps led to satisfactory results in terms of performance indicators (NS > 0.8 for more than half of the artificial gaps). A case-by-case analysis revealed that the small number of reconstructed record gaps characterized by high RMSE values (NS < 0.8) was caused by the temporary unavailability of the most appropriate donor station. On the other hand, some of the gaps characterized by a high accuracy of the reconstructed record were filled by using the data from
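
    Donor ranking by Dynamic Time Warping can be sketched in a few lines: DTW measures the similarity of two discharge series while tolerating small timing shifts, such as flood peaks arriving a day apart at neighbouring gauges. A minimal O(n·m) implementation (illustrative, not the gapIT code):

        import numpy as np

        def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
            """Classic dynamic-programming DTW between two 1-D series."""
            n, m = len(a), len(b)
            D = np.full((n + 1, m + 1), np.inf)
            D[0, 0] = 0.0
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    cost = abs(a[i - 1] - b[j - 1])
                    D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
            return float(D[n, m])

        # Rank candidate donor stations by DTW distance to the target record.
        target = np.array([1.0, 1.2, 3.5, 2.0, 1.1])
        donors = {"A": np.array([1.1, 1.3, 3.4, 2.1, 1.0]),
                  "B": np.array([0.2, 0.3, 0.4, 0.3, 0.2])}
        print(sorted(donors, key=lambda k: dtw_distance(target, donors[k])))  # ['A', 'B']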

  4. Polar Bears or People?: How Framing Can Provide a Useful Analytic Tool to Understand & Improve Climate Change Communication in Classrooms

    NASA Astrophysics Data System (ADS)

    Busch, K. C.

    2014-12-01

    Not only will young adults bear the brunt of climate change's effects, they are also the ones who will be required to take action - to mitigate and to adapt. The Next Generation Science Standards include climate change, ensuring the topic will be covered in U.S. science classrooms in the near future. Additionally, school is a primary source of information about climate change for young adults. The larger question, though, is how can the teaching of climate change be done in such a way as to ascribe agency - a willingness to act - to students? Framing - as both a theory and an analytic method - has been used to understand how language in the media can affect the audience's intention to act. Frames function as a two-way filter, affecting both the message sent and the message received. This study adapted both the theory and the analytic methods of framing, applying them to teachers in the classroom to answer the research question: How do teachers frame climate change in the classroom? To answer this question, twenty-five lessons from seven teachers were analyzed using semiotic discourse analysis methods. It was found that the teachers' frames overlapped to form two distinct discourses: a Science Discourse and a Social Discourse. The Science Discourse, which was dominant, can be summarized as: Climate change is a current scientific problem that will have profound global effects on the Earth's physical systems. The Social Discourse, used much less often, can be summarized as: Climate change is a future social issue because it will have negative impacts at the local level on people. While it may not be surprising that the Science Discourse was most often heard in these science classrooms, it is possibly problematic if it were the only discourse used. The research literature on framing indicates that the frames found in the Science Discourse - global scale, scientific statistics and facts, and impact on the Earth's systems - are not likely to inspire action-taking. This

  5. Tracing the sources of refractory dissolved organic matter in a large artificial lake using multiple analytical tools.

    PubMed

    Nguyen, Hang Vo-Minh; Hur, Jin

    2011-10-01

    Structural and chemical characteristics of refractory dissolved organic matter (RDOM) from seven different sources (algae, leaf litter, reed, compost, field soil, paddy water, treated sewage) were examined using multiple analytical tools and compared with those of RDOM in a large artificial lake (Lake Paldang, Korea). Treated sewage, paddy water and field soil were distinguished from the other sources investigated by their relatively low specific UV absorbance (SUVA) values and the more pronounced fulvic-like versus humic-like fluorescence of the RDOM samples. Microbially derived RDOM from algae and treated sewage showed relatively low apparent molecular weight and a higher fraction of hydrophilic bases relative to the total hydrophilic fraction. Among the biopolymer types, the presence of polyhydroxy aromatics together with a high abundance of proteins was observed only for vascular-plant-based RDOM (i.e., leaf litter and reed). Molecular weight values exhibited positive correlations with SUVA and hydrophobic content among the different RDOM samples, suggesting that hydrophobic and condensed aromatic structures may be the main components of high molecular weight RDOM. Principal component analysis revealed that approximately 77% of the variance in the RDOM characteristics might be explained by the source difference (i.e., terrestrially versus microbially derived) and by a tendency toward further microbial transformation. The combined results demonstrated that the properties of the lake RDOM were largely affected by the upstream sources of field soil, paddy water and treated sewage, which are characterized by low molecular weight UV-absorbing, non-aromatic structures with relatively high resistance to further degradation.

  6. Development of reliable analytical tools for evaluating the influence of reductive winemaking on the quality of Lugana wines.

    PubMed

    Mattivi, Fulvio; Fedrizzi, Bruno; Zenato, Alberto; Tiefenthaler, Paolo; Tempesta, Silvano; Perenzoni, Daniele; Cantarella, Paolo; Simeoni, Federico; Vrhovsek, Urska

    2012-06-30

    This paper presents the development of important analytical tools - sensitive and rapid methods for analysing reduced and oxidised glutathione (GSH and GSSG), hydroxycinnamic acids (HCA), bound thiols (GSH-3MH and Cys-3MH) and free thiols (3MH and 3MHA) - and their first application to evaluate the effect of reductive winemaking on the composition of Lugana juices and wines. Lugana is a traditional white wine from the Lake Garda region (Italy), produced using a local grape variety, Trebbiano di Lugana. An innovative winemaking procedure based on preliminary cooling of grape berries followed by crushing in an inert environment was implemented and explored on a winery scale. The effects of these procedures on hydroxycinnamic acids, GSH, GSSG, free and bound thiols and flavanol content were investigated. The juices and wines produced using the different protocols were examined, and wines aged in tanks for 1, 2 and 3 months were analysed. Particularly interesting was the high level of GSH found in Lugana grapes, which can act as a natural antioxidant and be preserved in must and young wines, thus reducing the need for exogenous antioxidants. Moreover, it was clear that polyphenol concentrations (hydroxycinnamic acids and catechins) were strongly influenced by the winemaking and pressing conditions, which required fine-tuning of pressing. Above-threshold levels of 3-mercaptohexan-1-ol (3MH) and 3-mercaptohexyl acetate (3MHA) were found in the wines and varied according to the winemaking procedure applied. Interestingly, their evolution during the first three months also depended on the procedure adopted. Organic synthesis of the cysteine and glutathione conjugates was carried out, and juices and wines were subjected to LC-MS/MS analysis. These two molecules appeared to be strongly affected by the winemaking procedure, but did not show any significant change during the first 3 months of post-bottling ageing. This supports the theory

  7. Additional disturbances as a beneficial tool for restoration of post-mining sites: a multi-taxa approach.

    PubMed

    Řehounková, Klára; Čížek, Lukáš; Řehounek, Jiří; Šebelíková, Lenka; Tropek, Robert; Lencová, Kamila; Bogusch, Petr; Marhoul, Pavel; Máca, Jan

    2016-07-01

    Open interior sands represent a highly threatened habitat in Europe. In recent times, their associated organisms have often found secondary refuges outside their natural habitats, mainly in sand pits. We investigated the effects of different restoration approaches, i.e. spontaneous succession without additional disturbances, spontaneous succession with additional disturbances caused by recreational activities, and forestry reclamation, on the diversity and conservation values of spiders, beetles, flies, bees and wasps, orthopterans and vascular plants in a large sand pit in the Czech Republic, Central Europe. Out of 406 species recorded in total, 112 were classified as open sand specialists and 71 as threatened. The sites restored through spontaneous succession with additional disturbances hosted the largest proportion of open sand specialists and threatened species. The forestry reclamations, in contrast, hosted few such species. The sites with spontaneous succession without disturbances represent a transition between these two approaches. While restoration through spontaneous succession favours biodiversity in contrast to forestry reclamation, additional disturbances are necessary to maintain early successional habitats essential for threatened species and open sand specialists. Therefore, recreational activities seem to be an economically efficient restoration tool that will also benefit biodiversity in sand pits.

  8. Life cycle assessment as an analytical tool in strategic environmental assessment. Lessons learned from a case study on municipal energy planning in Sweden

    SciTech Connect

    Bjoerklund, Anna

    2012-01-15

    Life cycle assessment (LCA) is explored as an analytical tool in strategic environmental assessment (SEA), illustrated by a case where a previously developed SEA process was applied to municipal energy planning in Sweden. The process integrated decision-making tools for scenario planning, public participation and environmental assessment. This article describes the use of LCA for environmental assessment in this context, with a focus on methodology and practical experiences. While LCA provides a systematic framework for the environmental assessment and a wider systems perspective than what is required in SEA, LCA cannot address all aspects of environmental impact required, and therefore needs to be complemented by other tools. The integration of LCA with tools for public participation and scenario planning posed certain methodological challenges, but provided an innovative approach to designing the scope of the environmental assessment and to defining and assessing alternatives. Research highlights: • LCA was explored as an analytical tool in an SEA process for municipal energy planning. • The process also integrated LCA with scenario planning and public participation. • Benefits of using LCA were a systematic framework and a wider systems perspective. • The integration of tools required some methodological challenges to be solved. • This proved an innovative approach to defining alternatives and the scope of the assessment.

  9. An analytical approach to the problem of inverse optimization with additive objective functions: an application to human prehension.

    PubMed

    Terekhov, Alexander V; Pesin, Yakov B; Niu, Xun; Latash, Mark L; Zatsiorsky, Vladimir M

    2010-09-01

    We consider the problem of what is being optimized in human actions with respect to various aspects of human movements and different motor tasks. From the mathematical point of view this problem consists of finding an unknown objective function given the values at which it reaches its minimum. This problem is called the inverse optimization problem. Until now the main approach to this problem has been the cut-and-try method, which consists of introducing an objective function and checking how well it reflects the experimental data. Using this approach, different objective functions have been proposed for the same motor action. In the current paper we focus on inverse optimization problems with additive objective functions and linear constraints. Such problems are typical in human movement science. The problem of muscle (or finger) force sharing is an example. For such problems we obtain sufficient conditions for uniqueness and propose a method for determining the objective functions. To illustrate our method we analyze the problem of force sharing among the fingers in a grasping task. We estimate the objective function from the experimental data and show that it can predict the force-sharing pattern for a vast range of external forces and torques applied to the grasped object. The resulting objective function is quadratic with essentially non-zero linear terms.
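
    The following sketch illustrates the idea behind such inverse optimization for the quadratic-with-linear-terms case: under an additive objective sum(k_i f_i^2 + w_i f_i) with a single linear constraint sum(f_i) = F, the optimal forces are affine in F, so the coefficients can be recovered (up to a common scale) by regressing observed forces on total force. All coefficient values below are hypothetical, and this is a toy illustration rather than the authors' estimation procedure.

    ```python
    # Forward and inverse sides of an additive quadratic objective with a
    # single linear constraint (sum of finger forces equals total force F).
    import numpy as np

    k = np.array([2.0, 1.0, 1.5, 3.0])   # assumed quadratic coefficients
    w = np.array([0.5, -0.2, 0.1, 0.3])  # assumed linear coefficients

    def share(F):
        """Minimize sum(k_i f_i^2 + w_i f_i) s.t. sum(f_i) = F.
        Lagrange conditions give f_i = (lam - w_i) / (2 k_i)."""
        lam = (F + np.sum(w / (2 * k))) / np.sum(1 / (2 * k))
        return (lam - w) / (2 * k)

    # Generate "observed" force-sharing data over a range of total forces.
    F_grid = np.linspace(5, 50, 40)
    forces = np.array([share(F) for F in F_grid])

    # Inverse step: for this model class the optimal f_i are affine in F,
    # so fitting f_i = a_i * F + b_i recovers k (and w) up to a scale.
    A = np.vstack([F_grid, np.ones_like(F_grid)]).T
    coef, _, _, _ = np.linalg.lstsq(A, forces, rcond=None)
    a, b = coef                     # slopes and intercepts per finger
    k_hat = (1 / a) / np.sum(1 / a) * np.sum(k)  # fix the scale to compare
    print("true k:     ", k)
    print("recovered k:", np.round(k_hat, 3))    # intercepts b relate to w
    ```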

  10. An analytical approach to the problem of inverse optimization with additive objective functions: an application to human prehension

    PubMed Central

    Pesin, Yakov B.; Niu, Xun; Latash, Mark L.

    2010-01-01

    We consider the problem of what is being optimized in human actions with respect to various aspects of human movements and different motor tasks. From the mathematical point of view this problem consists of finding an unknown objective function given the values at which it reaches its minimum. This problem is called the inverse optimization problem. Until now the main approach to this problem has been the cut-and-try method, which consists of introducing an objective function and checking how well it reflects the experimental data. Using this approach, different objective functions have been proposed for the same motor action. In the current paper we focus on inverse optimization problems with additive objective functions and linear constraints. Such problems are typical in human movement science. The problem of muscle (or finger) force sharing is an example. For such problems we obtain sufficient conditions for uniqueness and propose a method for determining the objective functions. To illustrate our method we analyze the problem of force sharing among the fingers in a grasping task. We estimate the objective function from the experimental data and show that it can predict the force-sharing pattern for a vast range of external forces and torques applied to the grasped object. The resulting objective function is quadratic with essentially non-zero linear terms. PMID:19902213

  11. On-line near infrared spectroscopy as a Process Analytical Technology (PAT) tool to control an industrial seeded API crystallization.

    PubMed

    Schaefer, C; Lecomte, C; Clicq, D; Merschaert, A; Norrant, E; Fotiadu, F

    2013-09-01

    The final step of an active pharmaceutical ingredient (API) manufacturing synthesis process consists of a crystallization during which the API and residual solvent contents have to be quantified precisely in order to reach a predefined seeding point. A feasibility study was conducted to demonstrate the suitability of on-line NIR spectroscopy to control this step in line with the new version of the European Medicines Agency (EMA) guideline [1]. A quantitative method was developed at laboratory scale using statistical design of experiments (DOE) and multivariate data analysis such as principal component analysis (PCA) and partial least squares (PLS) regression. NIR models were built to quantify the API in the range of 9-12% (w/w) and to quantify the residual methanol in the range of 0-3% (w/w). To improve the predictive ability of the models, the development procedure encompassed outlier elimination, optimum model rank definition, and spectral range and spectral pre-treatment selection. Conventional criteria such as the number of PLS factors, R(2), and root mean square errors of calibration, cross-validation and prediction (RMSEC, RMSECV, RMSEP) enabled the selection of three model candidates. These models were tested in the industrial pilot plant during three technical campaigns. Results of the most suitable models were evaluated against the chromatographic reference methods. A maximum relative bias of 2.88% was obtained for the API target content. Absolute biases of 0.01 and 0.02% (w/w), respectively, were achieved at methanol content levels of 0.10 and 0.13% (w/w). The repeatability was assessed as sufficient for the on-line monitoring of the two analytes. The present feasibility study confirmed the possibility of using on-line NIR spectroscopy as a PAT tool to monitor in real-time both the API and the residual methanol contents, in order to control the seeding of an API crystallization at industrial scale. Furthermore, the successful scale-up of the method proved its capability to be
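
    A minimal sketch of the kind of PLS calibration workflow described above, using scikit-learn; the spectra and reference values are simulated placeholders, and a real method development would add the outlier elimination, spectral pre-treatment and range selection steps the abstract mentions.

    ```python
    # Rank selection for a PLS model of NIR spectra by cross-validated RMSE.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 400))    # placeholder NIR spectra (samples x wavelengths)
    y = rng.uniform(9, 12, size=60)   # placeholder API content, % w/w

    for n_factors in range(1, 8):
        pls = PLSRegression(n_components=n_factors)
        y_cv = cross_val_predict(pls, X, y, cv=10).ravel()
        rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))
        print(f"{n_factors} factors: RMSECV = {rmsecv:.3f} % w/w")
    # The rank with the lowest RMSECV (before it plateaus) would be chosen,
    # then RMSEP checked on an independent prediction set.
    ```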

  12. Variance decomposition: a tool enabling strategic improvement of the precision of analytical recovery and concentration estimates associated with microorganism enumeration methods.

    PubMed

    Schmidt, P J; Emelko, M B; Thompson, M E

    2014-05-15

    Concentrations of particular types of microorganisms are commonly measured in various waters, yet the accuracy and precision of reported microorganism concentration values are often questioned due to the imperfect analytical recovery of quantitative microbiological methods and the considerable variation among fully replicated measurements. The random error in analytical recovery estimates and unbiased concentration estimates may be attributable to several sources, and knowing the relative contribution from each source can facilitate strategic design of experiments to yield more precise data or provide an acceptable level of information with fewer data. Herein, variance decomposition using the law of total variance is applied to previously published probabilistic models to explore the relative contributions of various sources of random error and to develop tools to aid experimental design. This work focuses upon enumeration-based methods with imperfect analytical recovery (such as enumeration of Cryptosporidium oocysts), but the results also yield insights about plating methods and microbial methods in general. Using two hypothetical analytical recovery profiles, the variance decomposition method is used to explore 1) the design of an experiment to quantify variation in analytical recovery (including the size and precision of seeding suspensions and the number of samples), and 2) the design of an experiment to estimate a single microorganism concentration (including sample volume, effects of improving analytical recovery, and replication). In one illustrative example, a strategically designed analytical recovery experiment with 6 seeded samples would provide as much information as an alternative experiment with 15 seeded samples. Several examples of diminishing returns are illustrated to show that efforts to reduce error in analytical recovery and concentration estimates can have negligible effect if they are directed at trivial error sources.
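
    As a toy numerical companion to this approach, the law of total variance can be illustrated for counts with imperfect analytical recovery: Var(N) = E[Var(N|R)] + Var(E[N|R]). The distributions and parameter values below are invented for illustration, not taken from the published models.

    ```python
    # Decompose the variance of microbial counts into a Poisson counting
    # term and a recovery-variation term: counts ~ Poisson(R * C * V),
    # with recovery R itself varying between samples.
    import numpy as np

    rng = np.random.default_rng(1)
    C, V = 5.0, 10.0                     # true concentration (per L), volume (L)
    n = 200_000
    recovery = rng.beta(8, 12, size=n)   # assumed recovery profile, mean 0.4
    counts = rng.poisson(recovery * C * V)

    # Var(N) = E[Var(N|R)] + Var(E[N|R]) = E[R]*C*V + Var(R)*(C*V)**2
    within = np.mean(recovery) * C * V          # Poisson counting error term
    between = np.var(recovery) * (C * V) ** 2   # recovery-variation term
    print("total variance (simulated):", counts.var())
    print("decomposed:", within + between, "=", within, "+", between)
    # Comparing the two terms shows which error source dominates and hence
    # where extra experimental effort would (or would not) pay off.
    ```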

  13. Cisapride a green analytical reagent for rapid and sensitive determination of bromate in drinking water, bread and flour additives by oxidative coupling spectrophotometric methods

    NASA Astrophysics Data System (ADS)

    Al Okab, Riyad Ahmed

    2013-02-01

    Green analytical methods using Cisapride (CPE) as a green analytical reagent were investigated in this work. Rapid, simple, and sensitive spectrophotometric methods for the determination of bromate in water samples, bread and flour additives were developed. The proposed methods are based on the oxidative coupling between phenoxazine and Cisapride in the presence of bromate to form a red colored product with λmax at 520 nm. Phenoxazine and Cisapride and their reaction products were found to be environmentally friendly under the optimum experimental conditions. The method obeys Beer's law in the concentration range 0.11-4.00 μg ml⁻¹ with a molar absorptivity of 1.41 × 10⁴ L mol⁻¹ cm⁻¹. All variables have been optimized and the presented reaction sequences were applied to the analysis of bromate in water, bread and flour additive samples. The performance of these methods was evaluated in terms of Student's t-test and the variance ratio F-test to find out the significance of the proposed methods over the reference method. The combination of pharmaceutical drug reagents at low concentrations creates some unique green chemical analyses.

  14. Cisapride a green analytical reagent for rapid and sensitive determination of bromate in drinking water, bread and flour additives by oxidative coupling spectrophotometric methods.

    PubMed

    Al Okab, Riyad Ahmed

    2013-02-15

    Green analytical methods using Cisapride (CPE) as a green analytical reagent were investigated in this work. Rapid, simple, and sensitive spectrophotometric methods for the determination of bromate in water samples, bread and flour additives were developed. The proposed methods are based on the oxidative coupling between phenoxazine and Cisapride in the presence of bromate to form a red colored product with λmax at 520 nm. Phenoxazine and Cisapride and their reaction products were found to be environmentally friendly under the optimum experimental conditions. The method obeys Beer's law in the concentration range 0.11-4.00 μg ml(-1) with a molar absorptivity of 1.41 × 10(4) L mol(-1) cm(-1). All variables have been optimized and the presented reaction sequences were applied to the analysis of bromate in water, bread and flour additive samples. The performance of these methods was evaluated in terms of Student's t-test and the variance ratio F-test to find out the significance of the proposed methods over the reference method. The combination of pharmaceutical drug reagents at low concentrations creates some unique green chemical analyses.
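
    A minimal sketch of the Beer's-law calibration step that underlies such spectrophotometric methods: fit absorbance at 520 nm against standard bromate concentrations, check linearity, and read unknowns back through the fitted line. The numbers are invented placeholders, not the published data.

    ```python
    # Linear Beer's-law calibration: A = slope * C + intercept.
    import numpy as np

    conc = np.array([0.25, 0.5, 1.0, 2.0, 3.0, 4.0])   # standards, ug/mL
    absorb = np.array([0.028, 0.055, 0.111, 0.220, 0.329, 0.441])

    slope, intercept = np.polyfit(conc, absorb, 1)
    r = np.corrcoef(conc, absorb)[0, 1]
    print(f"A = {slope:.4f}*C + {intercept:.4f}, r = {r:.5f}")

    # Unknowns are then read back through the fitted line:
    A_sample = 0.150
    print("estimated concentration:", (A_sample - intercept) / slope, "ug/mL")
    ```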

  15. Examination of the capability of the laser-induced breakdown spectroscopy (LIBS) technique as the emerging laser-based analytical tool for analyzing trace elements in coal

    NASA Astrophysics Data System (ADS)

    Idris, N.; Ramli, M.; Mahidin; Hedwig, R.; Lie, Z. S.; Kurniawan, K. H.

    2014-09-01

    Owing to its advantages over conventional analytical tools, laser-induced breakdown spectroscopy (LIBS) is becoming an emerging analytical technique with considerable promise. The technique is based on the use of optical emission from a laser-induced plasma for spectrochemical analysis of the constituents and content of the sampled object. Its capability is examined here in the analysis of trace elements in coal. Coal is a difficult sample to analyze due to its complex chemical composition and physical properties. Coal inherently contains trace elements, including heavy metals; thus its mining, beneficiation and utilization pose hazards to the environment and to human beings. The LIBS apparatus used was composed of a laser system (Nd-YAG: Quanta Ray; LAB SERIES; 1,064 nm; 500 mJ; 8 ns) and an optical detector (McPherson model 2061; 1,000 mm focal length; f/8.6 Czerny-Turner) equipped with an Andor I*Star intensified CCD of 1024×256 pixels. The emitted laser was focused onto the coal sample with a focusing lens of +250 mm. The plasma emission was collected by a fiber optic and sent to the spectrograph. The coal samples were taken from the Province of Aceh. Several trace elements, including heavy metals (As, Mn, Pb), could clearly be observed, demonstrating the potential of the LIBS technique for analyzing trace elements in coal.
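
    The spectral step of such an analysis can be sketched as follows: locate emission peaks in the recorded spectrum and match them against known line positions (here the Mn I 403.08 nm and Pb I 405.78 nm lines). The spectrum below is synthetic and the line list is an illustrative subset, not the authors' processing code.

    ```python
    # Peak finding and line matching on a synthetic emission spectrum.
    import numpy as np
    from scipy.signal import find_peaks

    wl = np.linspace(400, 410, 2000)                  # wavelength axis, nm
    spectrum = np.random.default_rng(2).normal(5, 0.3, wl.size)
    for center in (403.08, 405.78):                   # Mn I and Pb I lines
        spectrum += 40 * np.exp(-((wl - center) / 0.02) ** 2)

    peaks, _ = find_peaks(spectrum, prominence=10, distance=20)
    lines = {"Mn I": 403.08, "Pb I": 405.78}          # illustrative subset
    for name, pos in lines.items():
        hit = peaks[np.argmin(np.abs(wl[peaks] - pos))]
        print(f"{name}: nearest detected peak at {wl[hit]:.3f} nm")
    ```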

  16. A Serious Videogame as an Additional Therapy Tool for Training Emotional Regulation and Impulsivity Control in Severe Gambling Disorder

    PubMed Central

    Tárrega, Salomé; Castro-Carreras, Laia; Fernández-Aranda, Fernando; Granero, Roser; Giner-Bartolomé, Cristina; Aymamí, Neus; Gómez-Peña, Mónica; Santamaría, Juan J.; Forcano, Laura; Steward, Trevor; Menchón, José M.; Jiménez-Murcia, Susana

    2015-01-01

    Background: Gambling disorder (GD) is characterized by a significant lack of self-control and is associated with impulsivity-related personality traits. It is also linked to deficits in emotional regulation and frequently co-occurs with anxiety and depression symptoms. There is also evidence that emotional dysregulation may play a mediatory role between GD and psychopathological symptomatology. Few studies have reported the outcomes of psychological interventions that specifically address these underlying processes. Objectives: To assess the utility of the Playmancer platform, a serious video game, as an additional therapy tool in a CBT intervention for GD, and to estimate pre-post changes in measures of impulsivity, anger expression and psychopathological symptomatology. Method: The sample comprised a single group of 16 male treatment-seeking individuals with severe GD diagnosis. Therapy intervention consisted of 16 group weekly CBT sessions and, concurrently, 10 additional weekly sessions of a serious video game. Pre-post treatment scores on South Oaks Gambling Screen (SOGS), Barratt Impulsiveness Scale (BIS-11), I7 Impulsiveness Questionnaire (I7), State-Trait Anger Expression Inventory 2 (STAXI-2), Symptom Checklist-Revised (SCL-90-R), State-Trait Anxiety Inventory (STAI-S-T), and Novelty Seeking from the Temperament and Character Inventory-Revised (TCI-R) were compared. Results: After the intervention, significant changes were observed in several measures of impulsivity, anger expression and other psychopathological symptoms. Dropout and relapse rates during treatment were similar to those described in the literature for CBT. Conclusion: Complementing CBT interventions for GD with a specific therapy approach like a serious video game might be helpful in addressing certain underlying factors which are usually difficult to change, including impulsivity and anger expression. PMID:26617550

  17. Additive technology of soluble mold tooling for embedded devices in composite structures: A study on manufactured tolerances

    NASA Astrophysics Data System (ADS)

    Roy, Madhuparna

    Composite textiles have found widespread use and advantages in various industries and applications. The constant demand for high-quality products and services requires companies to minimize their manufacturing costs and delivery times in order to compete in general and niche marketplaces. Advanced manufacturing methods aim to provide economical means of mold production. Creation of molding and tooling options for advanced composites accounts for a large portion of the fabrication time, making it a costly process and a restraining factor. This research discusses a preliminary investigation into the use of soluble polymer compounds and additive manufacturing to fabricate soluble molds. These molds suffer from dimensional errors due to several factors, which have also been characterized. The basic soluble mold of a composite is 3D printed to meet the desired dimensions and geometry of holistic structures or spliced components. The time taken to dissolve the mold depends on the rate of agitation of the solvent. This process is steered towards enabling the implantation of optoelectronic devices within the composite to provide sensing capability for structural health monitoring. The shape deviation of the 3D printed mold is also studied and compared to its original dimensions to optimize the dimensional quality and produce dimensionally accurate parts. Mechanical tests were performed on compact tension (CT) resin samples prepared from these 3D printed molds and revealed crack propagation towards an embedded intact optical fiber.

  18. Demonstration of the Recent Additions in Modeling Capabilities for the WEC-Sim Wave Energy Converter Design Tool: Preprint

    SciTech Connect

    Tom, N.; Lawson, M.; Yu, Y. H.

    2015-03-01

    WEC-Sim is a mid-fidelity numerical tool for modeling wave energy conversion (WEC) devices. The code uses the MATLAB SimMechanics package to solve the multi-body dynamics and models the wave interactions using hydrodynamic coefficients derived from frequency domain boundary element methods. In this paper, the new modeling features introduced in the latest release of WEC-Sim will be presented. The first feature discussed is the conversion of the fluid memory kernel to a state-space approximation that provides significant gains in computational speed. The benefit of the state-space calculation becomes even greater after the hydrodynamic body-to-body coefficients are introduced, as the number of interactions increases rapidly with the number of floating bodies. The final feature discussed is the capability to add Morison elements to provide additional hydrodynamic damping and inertia. This is generally used as a tuning feature, because performance is highly dependent on the chosen coefficients. In this paper, a review of the hydrodynamic theory for each of the features is provided and successful implementation is verified using test cases.
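
    The speed advantage of the state-space conversion can be seen in a schematic Python example (WEC-Sim itself is MATLAB-based): a radiation memory-kernel convolution is replaced by a state-space system x' = Ax + Bu, y = Cx, whose per-step cost is constant regardless of history length. The kernel and its exact realization below are a single illustrative decaying-oscillator mode, not WEC-Sim hydrodynamic data.

    ```python
    # Memory-kernel convolution vs. an equivalent state-space realization.
    import numpy as np

    dt, T = 0.01, 20.0
    t = np.arange(0, T, dt)
    u = np.sin(0.8 * t)                      # body velocity signal
    K = np.exp(-0.5 * t) * np.cos(2.0 * t)   # memory kernel K(t)

    # Direct convolution: cost grows with history length at every step.
    y_conv = dt * np.convolve(K, u)[: t.size]

    # State-space realization of the same kernel, K(t) = C exp(At) B for
    # A = [[-0.5, -2], [2, -0.5]], B = [1, 0]^T, C = [1, 0]:
    # constant cost per time step regardless of history.
    A = np.array([[-0.5, -2.0], [2.0, -0.5]])
    B = np.array([1.0, 0.0])
    Cm = np.array([1.0, 0.0])
    x = np.zeros(2)
    y_ss = np.zeros_like(t)
    for i in range(t.size):
        y_ss[i] = Cm @ x
        x = x + dt * (A @ x + B * u[i])      # forward-Euler state update
    print("max deviation between the two evaluations:", np.abs(y_conv - y_ss).max())
    ```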

  19. Building Adoption of Visual Analytics Software

    SciTech Connect

    Chinchor, Nancy; Cook, Kristin A.; Scholtz, Jean

    2012-01-05

    Adoption of technology is always difficult. Issues such as having the infrastructure necessary to support the technology, training for users, integrating the technology into current processes and tools, and having the time, managerial support, and necessary funds need to be addressed. In addition to these issues, the adoption of visual analytics tools presents specific challenges that need to be addressed. This paper discusses technology adoption challenges and approaches for visual analytics technologies.

  20. A Review of Energy Dispersive X-Ray Fluorescence (EDXRF) as an Analytical Tool in Numismatic Studies.

    PubMed

    Navas, María José; Asuero, Agustín García; Jiménez, Ana María

    2016-01-01

    Energy dispersive X-ray fluorescence spectrometry (EDXRF) as an analytical technique in studies of ancient coins is summarized and reviewed. Specific EDXRF applications in historical studies, in studies of the corrosion of coins, and in studies of the optimal working conditions of some laser-based treatment for the cleaning of coins are described.

  1. Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    SciTech Connect

    Xu, Tengfang; Flapper, Joris; Ke, Jing; Kramer, Klaas; Sathaye, Jayant

    2012-02-01

    The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry, covering four dairy processes: cheese, fluid milk, butter, and milk powder.

  2. Modelling turbulent boundary layer flow over fractal-like multiscale terrain using large-eddy simulations and analytical tools.

    PubMed

    Yang, X I A; Meneveau, C

    2017-04-13

    In recent years, there has been growing interest in large-eddy simulation (LES) modelling of atmospheric boundary layers interacting with arrays of wind turbines on complex terrain. However, such terrain typically contains geometric features and roughness elements reaching down to small scales that typically cannot be resolved numerically. Thus subgrid-scale models for the unresolved features of the bottom roughness are needed for LES. Such knowledge is also required to model the effects of the ground surface 'underneath' a wind farm. Here we adapt a dynamic approach to determine subgrid-scale roughness parametrizations and apply it for the case of rough surfaces composed of cuboidal elements with broad size distributions, containing many scales. We first investigate the flow response to ground roughness of a few scales. LES with the dynamic roughness model, which accounts for the drag of unresolved roughness, is shown to provide resolution-independent results for the mean velocity distribution. Moreover, we develop an analytical roughness model that accounts for the sheltering effects of large-scale roughness elements on small-scale ones. Taking into account the sheltering effect, constraints from fundamental conservation laws, and assumptions of geometric self-similarity, the analytical roughness model is shown to provide predictions that agree well with roughness parameters determined from LES. This article is part of the themed issue 'Wind energy in complex terrains'.
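
    A minimal sketch of how the roughness parameters that such studies report are commonly extracted: fit the logarithmic law U(z) = (u*/kappa) ln(z/z0) to a mean velocity profile. The profile values below are synthetic placeholders.

    ```python
    # Extract friction velocity u* and roughness length z0 from a log-law
    # velocity profile: U is linear in ln(z), so a straight-line fit suffices.
    import numpy as np

    kappa = 0.4
    z = np.array([5, 10, 20, 40, 80, 160.0])   # heights, m
    U = (0.45 / kappa) * np.log(z / 0.8)       # synthetic mean-velocity profile

    # U = (u*/kappa) ln z - (u*/kappa) ln z0
    slope, intercept = np.polyfit(np.log(z), U, 1)
    u_star = slope * kappa
    z0 = np.exp(-intercept / slope)
    print(f"u* = {u_star:.3f} m/s, z0 = {z0:.3f} m")
    ```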

  3. Analytical Microscopy

    SciTech Connect

    Not Available

    2006-06-01

    In the Analytical Microscopy group, within the National Center for Photovoltaics' Measurements and Characterization Division, we combine two complementary areas of analytical microscopy--electron microscopy and proximal-probe techniques--and use a variety of state-of-the-art imaging and analytical tools. We also design and build custom instrumentation and develop novel techniques that provide unique capabilities for studying materials and devices. In our work, we collaborate with you to solve materials- and device-related R&D problems. This sheet summarizes the uses and features of four major tools: transmission electron microscopy, scanning electron microscopy, the dual-beam focused-ion-beam workstation, and scanning probe microscopy.

  4. Exploration and classification of chromatographic fingerprints as additional tool for identification and quality control of several Artemisia species.

    PubMed

    Alaerts, Goedele; Pieters, Sigrid; Logie, Hans; Van Erps, Jürgen; Merino-Arévalo, Maria; Dejaegher, Bieke; Smeyers-Verbeke, Johanna; Vander Heyden, Yvan

    2014-07-01

    The World Health Organization accepts chromatographic fingerprints as a tool for identification and quality control of herbal medicines. This is the first study in which the distinction, identification and quality control of four different Artemisia species, i.e. Artemisia vulgaris, A. absinthium, A. annua and A. capillaris samples, are performed based on the evaluation of entire chromatographic fingerprint profiles developed under identical experimental conditions. High-Performance Liquid Chromatography (HPLC) with Diode Array Detection (DAD) was used to develop the fingerprints. Application of factorial designs led to methanol/water (80:20 (v/v)) as the best extraction solvent for the pulverised plant material and to a shaking bath for 30 min as the extraction method. Further, so-called screening, optimisation and fine-tuning phases were performed during fingerprint development. Most information about the different Artemisia species, i.e. the highest number of separated peaks in the fingerprint, was acquired on four coupled Chromolith columns (100 mm × 4.6 mm I.D.). Trifluoroacetic acid 0.05% (v/v) was used as mobile-phase additive in a stepwise linear methanol/water gradient, i.e. 5, 34, 41, 72 and 95% (v/v) methanol at 0, 9, 30, 44 and 51 min, where the last mobile phase composition was kept isocratic till 60 min. One detection wavelength was selected to perform data analysis. The lowest similarity between the fingerprints of the four species was present at 214 nm. The HPLC/DAD method was applied on 199 herbal samples of the four Artemisia species, resulting in 357 fingerprints. The within- and between-day variation of the entire method, as well as the quality control fingerprints obtained during routine analysis, were found acceptable. The distinction of these Artemisia species was evaluated based on the entire chromatographic profiles, developed by a shared method, and visualised in score plots by means of Principal Component Analysis (PCA) as exploratory data analysis.
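
    A minimal sketch (an assumed workflow, not the authors' code) of the exploratory step described above: compute PCA scores of aligned fingerprint profiles and inspect how the samples group by species.

    ```python
    # PCA score computation for a fingerprint matrix (chromatograms x time points).
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(3)
    X = rng.normal(size=(357, 3000))         # placeholder fingerprint matrix
    species = rng.integers(0, 4, size=357)   # labels for the four species

    scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
    for s in range(4):
        grp = scores[species == s]
        print(f"species {s}: mean PC1 = {grp[:, 0].mean():.2f}, "
              f"mean PC2 = {grp[:, 1].mean():.2f}")
    # With real fingerprints, the score plot would be inspected for
    # clustering of samples by species.
    ```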

  5. Extensions of the Johnson-Neyman Technique to Linear Models with Curvilinear Effects: Derivations and Analytical Tools

    ERIC Educational Resources Information Center

    Miller, Jason W.; Stromeyer, William R.; Schwieterman, Matthew A.

    2013-01-01

    The past decade has witnessed renewed interest in the use of the Johnson-Neyman (J-N) technique for calculating the regions of significance for the simple slope of a focal predictor on an outcome variable across the range of a second, continuous independent variable. Although tools have been developed to apply this technique to probe 2- and 3-way…

  6. Novel Applications of Lanthanoides as Analytical or Diagnostic Tools in the Life Sciences by ICP-MS-based Techniques

    NASA Astrophysics Data System (ADS)

    Müller, Larissa; Traub, Heike; Jakubowski, Norbert

    2016-11-01

    Inductively coupled plasma mass spectrometry (ICP-MS) is a well-established analytical method for multi-elemental analysis, in particular for elements at trace and ultra-trace levels. It has found acceptance in various application areas during the last decade. ICP-MS is also increasingly applied for detection in the life sciences. For these applications, ICP-MS excels through its high sensitivity, which is independent of the molecular structure of the analyte, its wide linear dynamic range, and its excellent multi-element capabilities. Furthermore, methods based on ICP-MS offer simple quantification concepts, for which (liquid) standards are usually applied, low matrix effects compared to other conventional bioanalytical techniques, and relative limits of detection (LODs) in the low pg g-1 range and absolute LODs down to the attomol range. In this chapter, we focus on new applications where the multi-element capability of ICP-MS is used for the detection of lanthanoides or rare earth elements, which are applied as elemental stains or tags of biomolecules, in particular of antibodies.

  7. Peak-bridges due to in-column analyte transformations as a new tool for establishing molecular connectivities by comprehensive two-dimensional gas chromatography-mass spectrometry.

    PubMed

    Filippi, Jean-Jacques; Cocolo, Nicolas; Meierhenrich, Uwe J

    2015-02-27

    Comprehensive two-dimensional gas chromatography-mass spectrometry (GC×GC-MS) has been shown to permit unprecedented chromatographic resolution of volatile analytes encompassing various families of organic compounds. However, peak identification based on retention time, two-dimensional mapping, and mass spectrometric fragmentation alone is not yet a straightforward task. The possibility of establishing molecular links between constituents is of crucial importance for understanding the overall chemistry of any sample, especially in natural extracts where biogenetically related isomeric structures are often abundant. We here present a new way of using GC×GC that allows searching for those molecular connectivities. Analytical investigation of essential oil constituents by means of GC×GC-MS made it possible to observe in real time the thermally induced transformations of various sesquiterpenic derivatives. These transformations generated a series of well-defined two-dimensional peak bridges within the 2D-chromatograms connecting parent and daughter molecules, thus making it possible to build a clear scheme of structural relationships between the different constituents. GC×GC-MS appears here as a tool for investigating chromatographic phenomena and analyte transformations that could not be understood with conventional GC-MS alone.

  8. Extensions of the Johnson-Neyman Technique to Linear Models With Curvilinear Effects: Derivations and Analytical Tools.

    PubMed

    Miller, Jason W; Stromeyer, William R; Schwieterman, Matthew A

    2013-03-01

    The past decade has witnessed renewed interest in the use of the Johnson-Neyman (J-N) technique for calculating the regions of significance for the simple slope of a focal predictor on an outcome variable across the range of a second, continuous independent variable. Although tools have been developed to apply this technique to probe 2- and 3-way interactions in several types of linear models, this method has not been extended to include quadratic terms or more complicated models involving quadratic terms and interactions. Curvilinear relations of this type are incorporated in several theories in the social sciences. This article extends the J-N method to such linear models along with presenting freely available online tools that implement this technique as well as the traditional pick-a-point approach. Algebraic and graphical representations of the proposed J-N extension are provided. An example is presented to illustrate the use of these tools and the interpretation of findings. Issues of reliability as well as "spurious moderator" effects are discussed along with recommendations for future research.
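
    For the basic moderated regression y = b0 + b1*x + b2*z + b3*x*z, the J-N boundaries are the moderator values z at which the simple slope w(z) = b1 + b3*z is exactly significant, i.e. where w(z)^2 = t_crit^2 * Var(w(z)); this condition is a quadratic in z. The sketch below solves it with invented coefficient and covariance values (the article's extension to quadratic terms follows the same logic with a more involved variance expression).

    ```python
    # Johnson-Neyman boundaries for the simple slope w(z) = b1 + b3*z.
    import numpy as np
    from scipy import stats

    b1, b3 = 0.30, 0.25                    # focal-slope and interaction estimates
    v11, v13, v33 = 0.02, -0.004, 0.006    # relevant covariance-matrix entries
    t2 = stats.t.ppf(0.975, df=146) ** 2   # squared critical t (df invented)

    # |w(z)| = t * SE(w(z)) rearranges to a quadratic a*z^2 + b*z + c = 0:
    a = b3 ** 2 - t2 * v33
    b = 2 * (b1 * b3 - t2 * v13)
    c = b1 ** 2 - t2 * v11
    roots = np.roots([a, b, c])
    print("J-N boundaries:", np.sort(roots.real))
    # Between (or outside) these z values, depending on the sign of a,
    # the simple slope of x is statistically significant.
    ```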

  9. Comparative study between univariate spectrophotometry and multivariate calibration as analytical tools for simultaneous quantitation of Moexipril and Hydrochlorothiazide.

    PubMed

    Tawakkol, Shereen M; Farouk, M; Elaziz, Omar Abd; Hemdan, A; Shehata, Mostafa A

    2014-12-10

    Three simple, accurate, reproducible, and selective methods have been developed and subsequently validated for the simultaneous determination of Moexipril (MOX) and Hydrochlorothiazide (HCTZ) in pharmaceutical dosage form. The first method is the new extended ratio subtraction method (EXRSM) coupled to the ratio subtraction method (RSM) for determination of both drugs in commercial dosage form. The second and third methods are multivariate calibration methods, namely Principal Component Regression (PCR) and Partial Least Squares (PLS). A detailed validation of the methods was performed following the ICH guidelines, and the standard curves were found to be linear in the range of 10-60 and 2-30 for MOX and HCTZ in the EXRSM method, respectively, with well-accepted mean correlation coefficients for each analyte. The intra-day and inter-day precision and accuracy results were well within the acceptable limits.

  10. Medical technology assessment: the use of the analytic hierarchy process as a tool for multidisciplinary evaluation of medical devices.

    PubMed

    Hummel, J M; van Rossum, W; Verkerke, G J; Rakhorst, G

    2000-11-01

    Most types of medical technology assessment are performed only after the technology has been developed. Consequently, they have only minor effects on changes in clinical practice. Our study introduces a new method of constructive medical technology assessment that can change the development and diffusion of a medical device to improve its later clinical effectiveness. The method, based on Saaty's Analytic Hierarchy Process, quantitatively supports discussions between various parties involved in technological development and diffusion. We applied this method in comparing a new blood pump with two competitors based on technical, medical and social requirements. These discussions changed the evaluators' perspectives, reduced disagreements, and ended in a reliable evaluation of the pump's performance. On the basis of these results, adaptations were derived which improved the design and diffusion of the blood pump. This application shows the potential of our method to steer technological development and diffusion of artificial organs.
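
    The AHP step used in such evaluations can be sketched briefly: derive criterion weights from a pairwise-comparison matrix via its principal eigenvector and check Saaty's consistency ratio (CR < 0.1 is the usual acceptance threshold). The comparison values below are invented, not those of the blood-pump study.

    ```python
    # AHP weights and consistency check from a pairwise-comparison matrix.
    import numpy as np

    # Pairwise comparisons of three criteria (e.g. technical, medical, social).
    M = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    eigvals, eigvecs = np.linalg.eig(M)
    i = np.argmax(eigvals.real)
    weights = np.abs(eigvecs[:, i].real)
    weights /= weights.sum()             # normalized principal eigenvector

    n = M.shape[0]
    CI = (eigvals[i].real - n) / (n - 1) # consistency index
    RI = 0.58                            # Saaty's random index for n = 3
    print("weights:", np.round(weights, 3), "CR =", round(CI / RI, 3))
    ```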

  11. Comparative study between univariate spectrophotometry and multivariate calibration as analytical tools for simultaneous quantitation of Moexipril and Hydrochlorothiazide

    NASA Astrophysics Data System (ADS)

    Tawakkol, Shereen M.; Farouk, M.; Elaziz, Omar Abd; Hemdan, A.; Shehata, Mostafa A.

    2014-12-01

    Three simple, accurate, reproducible, and selective methods have been developed and subsequently validated for the simultaneous determination of Moexipril (MOX) and Hydrochlorothiazide (HCTZ) in pharmaceutical dosage form. The first method is the new extended ratio subtraction method (EXRSM) coupled to the ratio subtraction method (RSM) for determination of both drugs in commercial dosage form. The second and third methods are multivariate calibration methods, namely Principal Component Regression (PCR) and Partial Least Squares (PLS). A detailed validation of the methods was performed following the ICH guidelines, and the standard curves were found to be linear in the range of 10-60 and 2-30 for MOX and HCTZ in the EXRSM method, respectively, with well-accepted mean correlation coefficients for each analyte. The intra-day and inter-day precision and accuracy results were well within the acceptable limits.

  12. Analytical methods for SiO2 and other inorganic oxides in titanium dioxide or certain silicates for food additive specifications.

    PubMed

    Mutsuga, Motoh; Sato, Kyoko; Hirahara, Yoshichika; Kawamura, Yoko

    2011-04-01

    An analytical method has been developed for the detection of SiO(2) and other oxides in titanium dioxide and certain silicates used in food additives using inductively coupled plasma (ICP) atomic emission spectrometry without hydrofluoric acid. SiO(2) and other oxides in titanium dioxide or certain silicates were resolved by alkali fusion with KOH and boric acid and then dissolved in dilute hydrochloric acid as a test solution for ICP. The recovery of SiO(2) and Al(2)O(3) added at 0.1 and 1.0%, respectively, in TiO(2) was 88-104%; coefficient of variation was <4%. The limit of determination of SiO(2) and Al(2)O(3) was about 0.08%, and the accuracy of the ICP method was better than that of the Joint FAO/WHO Expert Committee on Food Additives (JECFA) test method. The recovery of SiO(2) and other oxides in silicates was 95-107% with a coefficient of variation of <4%. Using energy dispersive X-ray fluorescence spectrometry (EDX) with fundamental parameter determination, the content of SiO(2) and other oxide in titanium dioxide and silicate showed good agreement with the ICP results. ICP with alkali fusion proved suitable as a test method for SiO(2), Al(2)O(3) and other oxides in titanium dioxide and certain silicates, and EDX proves useful for screening such impurities in titanium dioxide and componential analysis of certain silicates.

  13. Temperature-controlled micro-TLC: a versatile green chemistry and fast analytical tool for separation and preliminary screening of steroids fraction from biological and environmental samples.

    PubMed

    Zarzycki, Paweł K; Slączka, Magdalena M; Zarzycka, Magdalena B; Bartoszuk, Małgorzata A; Włodarczyk, Elżbieta; Baran, Michał J

    2011-11-01

    This paper is a continuation of our previous research focusing on the development of micro-TLC methodology under temperature-controlled conditions. The main goal of the present paper is to demonstrate the separation and detection capability of the micro-TLC technique using simple analytical protocols without multi-step sample pre-purification. One of the advantages of planar chromatography over its column counterpart is that each TLC run can be performed on previously unused stationary phase. Therefore, it is possible to fractionate or separate complex samples characterized by heavy biological matrix loading. In the present studies, components of interest, mainly steroids, were isolated from biological samples like fish bile using single pre-treatment steps involving direct organic liquid extraction and/or deproteinization by a freeze-drying method. Low-molecular-mass compounds with polarity ranging from estetrol to progesterone derived from environmental samples (lake water, untreated and treated sewage waters) were concentrated using optimized solid-phase extraction (SPE). Specific band patterns for samples derived from the surface waters of Middle Pomerania in the northern part of Poland can easily be observed on the obtained micro-TLC chromatograms. This approach can be useful as a simple and inexpensive complementary method for fast control and screening of treated sewage water discharged by municipal wastewater treatment plants. Moreover, our experimental results show the potential of micro-TLC as an efficient tool for retention measurements of a wide range of steroids under reversed-phase (RP) chromatographic conditions. These data can be used for further optimization of SPE or HPLC systems working under RP conditions. Furthermore, we also demonstrated that the micro-TLC-based analytical approach can be applied as an effective method for internal standard (IS) substance search. Generally, the described methodology can be applied for fast fractionation or screening of the

  14. In situ protein secondary structure determination in ice: Raman spectroscopy-based process analytical tool for frozen storage of biopharmaceuticals.

    PubMed

    Roessl, Ulrich; Leitgeb, Stefan; Pieters, Sigrid; De Beer, Thomas; Nidetzky, Bernd

    2014-08-01

    A Raman spectroscopy-based method for in situ monitoring of secondary structural composition of proteins during frozen and thawed storage was developed. A set of reference proteins with different α-helix and β-sheet compositions was used for calibration and validation in a chemometric approach. Reference secondary structures were quantified with circular dichroism spectroscopy in the liquid state. Partial least squares regression models were established that enable estimation of secondary structure content from Raman spectra. Quantitative secondary structure determination in ice was accomplished for the first time and correlation with existing (qualitative) protein structural data from the frozen state was achieved. The method can be used in the presence of common stabilizing agents and is applicable in an industrial freezer setup. Raman spectroscopy represents a powerful, noninvasive, and flexibly applicable tool for protein stability monitoring during frozen storage.

  15. Dynamic 3D visual analytic tools: a method for maintaining situational awareness during high tempo warfare or mass casualty operations

    NASA Astrophysics Data System (ADS)

    Lizotte, Todd E.

    2010-04-01

    Maintaining Situational Awareness (SA) is crucial to the success of high tempo operations, such as war fighting and mass casualty events (bioterrorism, natural disasters). Modern computer and software applications attempt to provide command and control managers with situational awareness via the collection, integration, interrogation and display of vast amounts of analytic data in real-time from a multitude of data sources and formats [1]. At what point do the data volume and displays begin to erode the hierarchical distributive intelligence, command and control structure of the operation taking place? In many cases, people tasked with making decisions have insufficient experience in SA of high tempo operations and become easily overwhelmed as vast amounts of data are displayed in real-time as an operation unfolds. In these situations, where data is plentiful and the relevance of the data changes rapidly, there is a chance for individuals to fixate on the data sources with which they are most familiar. If individuals fall into this pitfall, they will exclude other data that might be just as important to the success of the operation. To counter these issues, it is important that computer and software applications provide a means of prompting users to take notice of adverse conditions or trends that are critical to the operation. This paper will discuss a new method of displaying data, called a Crisis View™, that monitors critical variables that are dynamically changing and allows preset thresholds to be created to prompt the user when decisions need to be made and when adverse or positive trends are detected. The new method will be explained in basic terms, with examples of its attributes and how it can be implemented.

  16. Spatial and Temporal Oxygen Dynamics in Macrofaunal Burrows in Sediments: A Review of Analytical Tools and Observational Evidence

    PubMed Central

    Satoh, Hisashi; Okabe, Satoshi

    2013-01-01

    The availability of benthic O2 plays a crucial role in benthic microbial communities and regulates many important biogeochemical processes. Burrowing activities of macrobenthos in the sediment significantly affect O2 distribution and its spatial and temporal dynamics in burrows, followed by alterations of sediment microbiology. Consequently, numerous research groups have investigated O2 dynamics in macrofaunal burrows. The introduction of powerful tools, such as microsensors and planar optodes, to sediment analysis has greatly enhanced our ability to measure O2 dynamics in burrows at high spatial and temporal resolution with minimal disturbance of the physical structure of the sediment. In this review, we summarize recent studies of O2-concentration measurements in burrows with O2 microsensors and O2 planar optodes. This manuscript mainly focuses on the fundamentals of O2 microsensors and O2 planar optodes, and their application in the direct measurement of the spatial and temporal dynamics of O2 concentrations in burrows, which have not previously been reviewed, and will be a useful supplement to recent literature reviews on O2 dynamics in macrofaunal burrows. PMID:23594972

  17. Near infra red spectroscopy as a multivariate process analytical tool for predicting pharmaceutical co-crystal concentration.

    PubMed

    Wood, Clive; Alwati, Abdolati; Halsey, Sheelagh; Gough, Tim; Brown, Elaine; Kelly, Adrian; Paradkar, Anant

    2016-09-10

    The use of near infra red spectroscopy to predict the concentration of two pharmaceutical co-crystals, 1:1 ibuprofen-nicotinamide (IBU-NIC) and 1:1 carbamazepine-nicotinamide (CBZ-NIC), has been evaluated. A partial least squares (PLS) regression model was developed for both co-crystal pairs using sets of standard samples to create calibration and validation data sets with which to build and validate the models. Parameters such as the root mean square error of calibration (RMSEC), root mean square error of prediction (RMSEP) and correlation coefficient were used to assess the accuracy and linearity of the models. Accurate PLS regression models were created for both co-crystal pairs which can be used to predict the co-crystal concentration in a powder mixture of the co-crystal and the active pharmaceutical ingredient (API). The IBU-NIC model had smaller errors than the CBZ-NIC model, possibly due to the complex CBZ-NIC spectra, which could reflect the different arrangement of hydrogen bonding associated with the co-crystal compared to the IBU-NIC co-crystal. These results suggest that NIR spectroscopy can be used as a PAT tool during a variety of pharmaceutical co-crystal manufacturing methods and the presented data will facilitate future offline and in-line NIR studies involving pharmaceutical co-crystals.

  18. From Data to Knowledge – Promising Analytical Tools and Techniques for Capture and Reuse of Corporate Knowledge and to Aid in the State Evaluation Process

    SciTech Connect

    Danielson, Gary R.; Augustenborg, Elsa C.; Beck, Andrew E.

    2010-10-29

    The IAEA is challenged with limited availability of human resources for inspection and data analysis while proliferation threats increase. PNNL has a variety of IT solutions and techniques (at varying levels of maturity and development) that take raw data closer to useful knowledge, thereby assisting with and standardizing the analytical processes. This paper highlights some PNNL tools and techniques which are applicable to the international safeguards community, including: • Intelligent in-situ triage of data prior to reliable transmission to an analysis center resulting in the transmission of smaller and more relevant data sets • Capture of expert knowledge in re-usable search strings tailored to specific mission outcomes • Image based searching fused with text based searching • Use of gaming to discover unexpected proliferation scenarios • Process modeling (e.g. Physical Model) as the basis for an information integration portal, which links to data storage locations along with analyst annotations, categorizations, geographic data, search strings and visualization outputs.

  19. An emerging micro-scale immuno-analytical diagnostic tool to see the unseen. Holding promise for precision medicine and P4 medicine.

    PubMed

    Guzman, Norberto A; Guzman, Daniel E

    2016-05-15

    Over the years, analytical chemistry and immunology have contributed significantly to the field of clinical diagnosis by introducing quantitative techniques that can detect crucial and distinct chemical, biochemical and cellular biomarkers present in biosamples. Currently, quantitative two-dimensional hybrid immuno-analytical separation technologies are emerging as powerful tools for the sequential isolation, separation and detection of protein panels, including those with subtle structural changes such as variants, isoforms, peptide fragments, and post-translational modifications. One such technique to perform this challenging task is immunoaffinity capillary electrophoresis (IACE), which combines the use of antibodies and/or other affinity ligands as highly selective capture agents with the superior resolving power of capillary electrophoresis. Since affinity ligands can be polyreactive, i.e., binding and capturing more than one molecule, they may generate false positive results when tested under mono-dimensional procedures; one such application is enzyme-linked immunosorbent assay (ELISA). IACE, on the other hand, is a two-dimensional technique that captures (isolation and enrichment), releases, separates and detects (quantification, identification and characterization) a single analyte or a panel of analytes from a sample, when coupled to one or more detectors simultaneously, without producing false positive or false negative data. This disruptive technique, capable of on-line preconcentration, results in enhanced sensitivity even in the analysis of complex matrices, and may change the traditional system of testing biomarkers to obtain more accurate diagnosis of diseases, ideally before symptoms of a specific disease manifest. In this manuscript, we will present examples of the determination of biomarkers by IACE and the design of a miniaturized multi-dimensional IACE apparatus capable of improved sensitivity, specificity and throughput, with the potential of being used

  20. Neurophysiological analytics for all! Free open-source software tools for documenting, analyzing, visualizing, and sharing using electronic notebooks.

    PubMed

    Rosenberg, David M; Horn, Charles C

    2016-08-01

    Neurophysiology requires an extensive workflow of information analysis routines, which often includes incompatible proprietary software, introducing limitations based on financial costs, transfer of data between platforms, and the ability to share. An ecosystem of free open-source software exists to fill these gaps, including thousands of analysis and plotting packages written in Python and R, which can be implemented in a sharable and reproducible format, such as the Jupyter electronic notebook. This tool chain can largely replace current routines by importing data, producing analyses, and generating publication-quality graphics. An electronic notebook like Jupyter allows these analyses, along with documentation of procedures, to display locally or remotely in an internet browser, and can be saved as an HTML, PDF, or other file format for sharing with team members and the scientific community. The present report illustrates these methods using data from electrophysiological recordings of the musk shrew vagus, a model system to investigate gut-brain communication, for example, in cancer chemotherapy-induced emesis. We show methods for spike sorting (including statistical validation), spike train analysis, and analysis of compound action potentials in notebooks. Raw data and code are available from notebooks in data supplements or from an executable online version, which replicates all analyses without installing software, an implementation of reproducible research. This demonstrates the promise of combining disparate analyses into one platform, along with the ease of sharing this work. In an age of diverse, high-throughput computational workflows, this methodology can increase efficiency, transparency, and the collaborative potential of neurophysiological research.
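
    A small, self-contained example of the kind of analysis cell such a notebook would contain: robust threshold-based spike detection on a simulated extracellular trace. The signal, threshold rule and parameters are illustrative, not the report's actual pipeline.

    ```python
    # Threshold-based spike detection on a simulated extracellular trace.
    import numpy as np
    from scipy.signal import find_peaks

    fs = 20_000                                  # sampling rate, Hz
    rng = np.random.default_rng(4)
    trace = rng.normal(0, 1, fs * 2)             # 2 s of noise
    spike_times = rng.choice(trace.size - 40, 60, replace=False)
    for s in spike_times:                        # inject stereotyped spikes
        trace[s:s + 20] += 8 * np.exp(-np.arange(20) / 5.0)

    # A common robust threshold: k * median(|x|) / 0.6745 (noise sigma estimate).
    sigma = np.median(np.abs(trace)) / 0.6745
    peaks, _ = find_peaks(trace, height=5 * sigma, distance=int(0.001 * fs))
    print(f"detected {peaks.size} spikes; firing rate ~ {peaks.size / 2:.1f} Hz")
    ```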

  1. Indirect additive manufacturing as an elegant tool for the production of self-supporting low density gelatin scaffolds.

    PubMed

    Van Hoorick, Jasper; Declercq, Heidi; De Muynck, Amelie; Houben, Annemie; Van Hoorebeke, Luc; Cornelissen, Ria; Van Erps, Jürgen; Thienpont, Hugo; Dubruel, Peter; Van Vlierberghe, Sandra

    2015-10-01

    The present work describes for the first time the production of self-supporting low gelatin density (<10 w/v%) porous scaffolds using methacrylamide-modified gelatin as an extracellular matrix mimicking component. As porous scaffolds starting from low gelatin concentrations cannot be realized with conventional additive manufacturing techniques in the absence of additives, we applied an indirect fused deposition modelling approach. To realize this, we printed a sacrificial polyester scaffold which supported the hydrogel material during UV crosslinking, thereby preventing collapse of the hydrogel structure. After complete curing, the polyester scaffold was selectively dissolved, leaving behind a porous, interconnective low density gelatin scaffold. Scaffold structural analysis indicated the success of the selected indirect additive manufacturing approach. Physico-chemical testing revealed that scaffold properties (mechanical, degradation, swelling) depend on the applied gelatin concentration and methacrylamide content. Preliminary biocompatibility studies revealed the cell-interactive and biocompatible properties of the materials developed.

  2. Health Heritage© a web-based tool for the collection and assessment of family health history: initial user experience and analytic validity.

    PubMed

    Cohn, W F; Ropka, M E; Pelletier, S L; Barrett, J R; Kinzie, M B; Harrison, M B; Liu, Z; Miesfeldt, S; Tucker, A L; Worrall, B B; Gibson, J; Mullins, I M; Elward, K S; Franko, J; Guterbock, T M; Knaus, W A

    2010-01-01

    A detailed family health history is currently the most potentially useful tool for diagnosis and risk assessment in clinical genetics. We developed and evaluated the usability and analytic validity of a patient-driven web-based family health history collection and analysis tool. Health Heritage(©) guides users through the collection of their family health history by relative, generates a pedigree, completes risk assessment, stratification, and recommendations for 89 conditions. We compared the performance of Health Heritage to that of Usual Care using a nonrandomized cohort trial of 109 volunteers. We contrasted the completeness and sensitivity of family health history collection and risk assessments derived from Health Heritage and Usual Care to those obtained by genetic counselors and genetic assessment teams. Nearly half (42%) of the Health Heritage participants reported discovery of health risks; 63% found the information easy to understand and 56% indicated it would change their health behavior. Health Heritage consistently outperformed Usual Care in the completeness and accuracy of family health history collection, identifying 60% of the elevated risk conditions specified by the genetic team versus 24% identified by Usual Care. Health Heritage also had greater sensitivity than Usual Care when comparing the identification of risks. These results suggest a strong role for automated family health history collection and risk assessment and underscore the potential of these data to serve as the foundation for comprehensive, cost-effective personalized genomic medicine.

  3. Direct analysis in real time mass spectrometry, a process analytical technology tool for real-time process monitoring in botanical drug manufacturing.

    PubMed

    Wang, Lu; Zeng, Shanshan; Chen, Teng; Qu, Haibin

    2014-03-01

    A promising process analytical technology (PAT) tool has been introduced for batch processes monitoring. Direct analysis in real time mass spectrometry (DART-MS), a means of rapid fingerprint analysis, was applied to a percolation process with multi-constituent substances for an anti-cancer botanical preparation. Fifteen batches were carried out, including ten normal operations and five abnormal batches with artificial variations. The obtained multivariate data were analyzed by a multi-way partial least squares (MPLS) model. Control trajectories were derived from eight normal batches, and the qualification was tested by R(2) and Q(2). Accuracy and diagnosis capability of the batch model were then validated by the remaining batches. Assisted with high performance liquid chromatography (HPLC) determination, process faults were explained by corresponding variable contributions. Furthermore, a batch level model was developed to compare and assess the model performance. The present study has demonstrated that DART-MS is very promising in process monitoring in botanical manufacturing. Compared with general PAT tools, DART-MS offers a particular account on effective compositions and can be potentially used to improve batch quality and process consistency of samples in complex matrices.

  4. CorRECTreatment: A Web-based Decision Support Tool for Rectal Cancer Treatment that Uses the Analytic Hierarchy Process and Decision Tree

    PubMed Central

    Karakülah, G.; Dicle, O.; Sökmen, S.; Çelikoğlu, C.C.

    2015-01-01

    Summary Background The selection of appropriate rectal cancer treatment is a complex multi-criteria decision making process, in which clinical decision support systems might be used to assist and enrich physicians' decision making. Objective The objective of the study was to develop a web-based clinical decision support tool for physicians in the selection of potentially beneficial treatment options for patients with rectal cancer. Methods The updated decision model contained 8 and 10 criteria in the first and second steps, respectively. The decision support model, developed in our previous study by combining the Analytic Hierarchy Process (AHP) method, which determines the priority of criteria, with a decision tree formed using these priorities, was updated and applied retrospectively to data from 388 patients. Later, a web-based decision support tool named corRECTreatment was developed. The compatibility of the treatment recommendations between expert opinion and the decision support tool was examined for consistency. Two surgeons were asked to recommend a treatment and an overall survival value for the treatment in 20 different cases that we selected from among the most common and rare treatment options in the patient data set and turned into scenarios. Results In the AHP analyses of the criteria, it was found that the matrices generated for both decision steps were consistent (consistency ratio<0.1). Depending on the decisions of the experts, the consistency value for the most frequent cases was found to be 80% for the first decision step and 100% for the second decision step. Similarly, for rare cases consistency was 50% for the first decision step and 80% for the second decision step. Conclusions The decision model and corRECTreatment, developed by applying these on real patient data, are expected to provide potential users with decision support in rectal cancer treatment processes and facilitate them in making projections about treatment options

  5. Evaluation of automated Wes system as an analytical and characterization tool to support monoclonal antibody drug product development.

    PubMed

    Wang, Jinyu; Valdez, Anulfo; Chen, Yingchen

    2016-12-21

    Monitoring and evaluation of critical quality attributes (cQA) in monoclonal antibodies (mAb) are a regulatory requirement in the pharmaceutical industry. High molecular weight (HMW) species are of critical importance due to the potential risk associated with immunogenicity. HMW species are typically monitored by size exclusion chromatography (SEC). Although low molecular weight (LMW) species are also detected by SEC, the low-resolution separation of LMW species limits its capability to monitor mAb fragmentation. Recently, we have developed new methods for LMW characterization and evaluation based on the Wes instrument from ProteinSimple. The capillary western blot is based upon size-based separation in a capillary system, followed by detection with specific immunoprobing. The capability of this method for characterization of mAb fragments was demonstrated. The characterization was achieved by probing two antibodies targeted to specific regions (Fc region or Fab region) of an IgG1 protein. The specificity of these two antibodies was evaluated against F(ab')2 and Fc/2 fragments generated from IdeS-treated IgG1 protein. The results showed the selected antibodies provide high specificity to F(ab')2 and Fc/2 fragments. Fractions collected from SEC were used to evaluate this method. The detected fragments from SEC fractions were identified based on their estimated molecular weight and antibody detection. The result proved the capability of the capillary western blot as a characterization method for IgG1 fragments. In addition, with specific detection of IgG1 and IgG4, the power of the capillary western blot to specifically characterize and evaluate individual IgG fragmentations in an IgG1 and IgG4 mixture was also demonstrated. When heat-stressed samples were used, results showed the method's capability as stability indicating in IgG1 and IgG4 mixture samples. The stressed mixture samples were also evaluated by the total protein assay in which protein samples

  6. The life closure scale: additional psychometric testing of a tool to measure psychological adaptation in death and dying.

    PubMed

    Dobratz, Marjorie C

    2004-02-01

    The purpose of this study was to conduct additional psychometric testing on an instrument designed to measure psychological adaptation in end-of-life populations across a wide spectrum of terminal illnesses. A sample of 20 participants completed initial testing of the Life Closure Scale (LCS); however, its usefulness was limited by the small sample size. A larger sample of 113 home hospice individuals who met established criteria and who gave informed consent completed the 27-item LCS for additional psychometric testing. Cronbach's alphas and correlation coefficients were computed, and factor analysis was conducted to establish internal consistency reliability, theoretical clarity, and criterion-related validity. The number of scale items was reduced to 20, with a total alpha of .87. Cronbach's alphas for the two subscales were .80 (self-reconciled) and .82 (self-restructuring). Item-total correlations for the subscales ranged from a low of .37 to a high of .68, with confirmatory factor analysis yielding two loadings. These findings lend credence to the usefulness of the LCS in measuring psychological adaptation in dying persons.
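    For readers unfamiliar with the reliability statistic quoted above, the sketch below computes Cronbach's alpha from a respondents-by-items score matrix; the data are synthetic stand-ins, not the LCS responses.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Synthetic data: 113 respondents, 20 items driven by one latent trait.
rng = np.random.default_rng(0)
latent = rng.normal(size=(113, 1))
scores = latent + rng.normal(scale=0.8, size=(113, 20))
print(f"alpha = {cronbach_alpha(scores):.2f}")
```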

  7. Effect of Ti Addition on Carbide Modification and the Microscopic Simulation of Impact Toughness in High-Carbon Cr-V Tool Steels

    NASA Astrophysics Data System (ADS)

    Cho, Ki Sub; Kim, Sang Il; Park, Sung Soo; Choi, Won Suk; Moon, Hee Kwon; Kwon, Hoon

    2016-01-01

    In D7 tool steel, which contains high levels of primary carbides, the influence of carbide modification by Ti addition was quantitatively analyzed. Considering the Griffith-Irwin energy criterion for crack growth, the impact energy was evaluated by substituting a microscopic factor of the normalized number density of carbides cracked during hardness indentation tests for the crack length. The impact energy was enhanced with Ti addition because Ti reduced and refined the primary M7C3 carbide phase of elongated morphology, reducing the probability of crack generation.

  8. Generalized Additive Models Used to Predict Species Abundance in the Gulf of Mexico: An Ecosystem Modeling Tool

    PubMed Central

    Drexler, Michael; Ainsworth, Cameron H.

    2013-01-01

    Spatially explicit ecosystem models of all types require an initial allocation of biomass, often in areas where fisheries-independent abundance estimates do not exist. A generalized additive modelling (GAM) approach is used to describe the abundance of 40 species groups (i.e. functional groups) across the Gulf of Mexico (GoM) using a large fisheries-independent data set (SEAMAP) and climate-scale oceanographic conditions. Predictor variables included in the model are chlorophyll a, sediment type, dissolved oxygen, temperature, and depth. Despite the presence of a large number of zeros in the data, a single GAM using a negative binomial distribution was suitable for making predictions of abundance for multiple functional groups. We present an example case study using pink shrimp (Farfantepenaeus duorarum) and compare the results to known distributions. The model successfully predicts the known areas of high abundance in the GoM, including areas for which no data were input into the model fitting. Overall, the model reliably captures areas of high and low abundance for the large majority of functional groups observed in SEAMAP. This method allows for the objective setting of spatial distributions for numerous functional groups across a modeling domain, even where abundance data may not exist. PMID:23691223
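    A minimal sketch of the modelling approach described (a negative binomial model with smooth terms) is given below. The predictor names and synthetic data are placeholders, and approximating the GAM smooths with B-spline bases in a statsmodels GLM is an assumption of this sketch, not the authors' implementation.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic stand-in for SEAMAP-style catch data (counts with many zeros).
rng = np.random.default_rng(1)
df = pd.DataFrame({
    "count": rng.negative_binomial(n=1, p=0.5, size=500),
    "depth": rng.uniform(5, 200, 500),
    "temp": rng.uniform(15, 30, 500),
    "chla": rng.lognormal(0, 1, 500),
})

# Smooth terms approximated with B-spline bases (patsy's bs()); the negative
# binomial family (fixed dispersion here) mirrors the distribution named above.
model = smf.glm("count ~ bs(depth, df=4) + bs(temp, df=4) + bs(chla, df=4)",
                data=df, family=sm.families.NegativeBinomial()).fit()
print(model.summary().tables[0])
```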

  9. Analytical Tools for Affordability Analysis

    DTIC Science & Technology

    2015-04-30

    …programs at key milestones. Prior to this, there was no formal requirement to reconcile program acquisition baselines with resource forecasts beyond the five years of…

  10. Analytical tools in accelerator physics

    SciTech Connect

    Litvinenko, V.N.

    2010-09-01

    This paper is a subset of my lectures presented in the Accelerator Physics course (USPAS, Santa Rosa, California, January 14-25, 2008). It is based on notes I wrote during the period from 1976 to 1979 in Novosibirsk; only a few copies (in Russian) were distributed to my colleagues at the Novosibirsk Institute of Nuclear Physics. The goal of these notes is a complete description starting from an arbitrary reference orbit, with explicit expressions for the 4-potential and the accelerator Hamiltonian, and finishing with the parameterization in action and angle variables. To a large degree the notes follow the logic developed in Theory of Cyclic Particle Accelerators by A.A. Kolomensky and A.N. Lebedev [Kolomensky], but go beyond the book in a number of directions. One unusual feature of these notes is the use of matrix functions and the Sylvester formula for calculating the matrices of arbitrary elements. Teaching the USPAS course motivated me to translate a significant part of my notes into English. I also included some introductory materials following Classical Theory of Fields by L.D. Landau and E.M. Lifshitz [Landau]. A large number of short notes covering various techniques are placed in the Appendices.

  11. Analytical Tools for Affordability Analysis

    DTIC Science & Technology

    2015-05-01

    …production costs over time, in the context of the cost and schedules of the other programs in the relevant acquisition portfolio…

  12. Determination of Unknown Concentrations of Sodium Acetate Using the Method of Standard Addition and Proton NMR: An Experiment for the Undergraduate Analytical Chemistry Laboratory

    ERIC Educational Resources Information Center

    Rajabzadeh, Massy

    2012-01-01

    In this experiment, students learn how to find the unknown concentration of sodium acetate using both the graphical treatment of standard addition and the standard addition equation. In the graphical treatment of standard addition, the peak area of the methyl peak in each of the sodium acetate standard solutions is found by integration using…
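    As a worked illustration of the graphical treatment described, the sketch below regresses peak area against added standard concentration and reads the unknown from the x-intercept; the numbers are invented, not experimental data.

```python
import numpy as np

# Added standard concentration (mM) and integrated methyl peak areas (a.u.);
# values are illustrative, not from the experiment.
added = np.array([0.0, 5.0, 10.0, 15.0, 20.0])
area = np.array([12.1, 18.0, 24.2, 29.9, 36.1])

slope, intercept = np.polyfit(added, area, 1)
# Extrapolating the calibration line to zero signal: the magnitude of the
# x-intercept gives the unknown concentration (before dilution correction).
c_unknown = intercept / slope
print(f"unknown concentration ≈ {c_unknown:.2f} mM")
```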

  13. Evaluation of manometric temperature measurement (MTM), a process analytical technology tool in freeze drying, part III: heat and mass transfer measurement.

    PubMed

    Tang, Xiaolin Charlie; Nail, Steven L; Pikal, Michael J

    2006-01-01

    This article evaluates the procedures for determining the vial heat transfer coefficient and the extent of primary drying through manometric temperature measurement (MTM). The vial heat transfer coefficients (Kv) were calculated from the MTM-determined temperature and resistance and compared with Kv values determined by a gravimetric method. The differences between the MTM vial heat transfer coefficients and the gravimetric values are large at low shelf temperature but smaller when higher shelf temperatures are used. The differences also become smaller at higher chamber pressure and when higher-resistance materials are being freeze-dried. In all cases, using thermal shields greatly improved the accuracy of the MTM Kv measurement. With use of thermal shields, the thickness of the frozen layer calculated from MTM is in good agreement with values obtained gravimetrically. The heat transfer coefficient "error" is largely a direct result of the error in the dry layer resistance (i.e., the MTM-determined resistance is too low). This problem can be minimized if thermal shields are used for freeze-drying. With suitable use of thermal shields, accurate Kv values are obtained by MTM, thus allowing accurate calculations of heat and mass flow rates. The extent of primary drying can be monitored by real-time calculation of the amount of remaining ice using MTM data, thus providing a process analytical tool that greatly improves freeze-drying process design and control.
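    For context, the vial heat transfer coefficient Kv discussed here is conventionally defined through the steady-state heat flow relation used in freeze-drying (a standard textbook form, not quoted from this article):

\[
\frac{dQ}{dt} \;=\; K_v \, A_v \, \bigl(T_s - T_p\bigr)
\]

    where dQ/dt is the heat flow from shelf to vial, A_v the outer cross-sectional area of the vial, T_s the shelf temperature, and T_p the product temperature at the vial bottom.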

  14. Effect of Percent Relative Humidity, Moisture Content, and Compression Force on Light-Induced Fluorescence (LIF) Response as a Process Analytical Tool.

    PubMed

    Shah, Ishan G; Stagner, William C

    2016-08-01

    The effect of percent relative humidity (16-84% RH), moisture content (4.2-6.5% w/w MC), and compression force (4.9-44.1 kN CF) on the light-induced fluorescence (LIF) response of 10% w/w active pharmaceutical ingredient (API) compacts is reported. The fluorescent response was evaluated using two separate central composite designs of experiments. The effect of % RH and CF on the LIF signal was highly significant, with an adjusted R(2) = 0.9436 and p < 0.0001. Percent relative humidity (p = 0.0022), CF (p < 0.0001), and % RH(2) (p = 0.0237) were statistically significant factors affecting the LIF response. The effects of MC and CF on LIF response were also statistically significant, with a p value <0.0001 and an adjusted R(2) value of 0.9874. The LIF response was highly impacted by MC (p < 0.0001), CF (p < 0.0001), and MC(2) (p = 0.0022). At 10% w/w API, increased % RH, MC, and CF led to a nonlinear decrease in LIF response. The derived quadratic model equations explained more than 94% of the data. Awareness of these effects on LIF response is critical when implementing LIF as a process analytical tool.
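    The quadratic response-surface models reported above (main effects plus squared terms) can be reproduced in form with an ordinary least squares fit; the data below are invented central-composite-style points, not the published measurements.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Invented design points: %RH, compression force (kN), and LIF signal.
rng = np.random.default_rng(2)
rh = rng.uniform(16, 84, 30)
cf = rng.uniform(4.9, 44.1, 30)
lif = 100 - 0.3 * rh - 0.8 * cf - 0.002 * rh**2 + rng.normal(0, 2, 30)
df = pd.DataFrame({"rh": rh, "cf": cf, "lif": lif})

# Quadratic model mirroring the significant terms named in the abstract.
fit = smf.ols("lif ~ rh + cf + I(rh**2)", data=df).fit()
print(fit.rsquared_adj)   # adjusted R2 of the fitted surface
print(fit.pvalues)        # per-term significance
```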

  15. Real-time particle size analysis using focused beam reflectance measurement as a process analytical technology tool for a continuous granulation-drying-milling process.

    PubMed

    Kumar, Vijay; Taylor, Michael K; Mehrotra, Amit; Stagner, William C

    2013-06-01

    Focused beam reflectance measurement (FBRM) was used as a process analytical technology tool to perform inline real-time particle size analysis of a proprietary granulation manufactured using a continuous twin-screw granulation-drying-milling process. A significant relationship between D20, D50, and D80 length-weighted chord length and sieve particle size was observed, with a p value of <0.0001 and R(2) of 0.886. A central composite response surface statistical design was used to evaluate the effect of granulator screw speed and Comil® impeller speed on the length-weighted chord length distribution (CLD) and particle size distribution (PSD) determined by FBRM and nested sieve analysis, respectively. The effects of granulator speed and mill speed on bulk density, tapped density, Compressibility Index, and Flowability Index were also investigated. An inline FBRM probe, placed below the Comil®, generated chord lengths and CLD data at designated times. The collection of the milled samples for sieve analysis and PSD evaluation was coordinated with the timing of the FBRM determinations. Both FBRM and sieve analysis resulted in similar bimodal distributions for all ten manufactured batches studied. Within the experimental space studied, the granulator screw speed (650-850 rpm) and Comil® impeller speed (1,000-2,000 rpm) did not have a significant effect on CLD, PSD, bulk density, tapped density, Compressibility Index, or Flowability Index (p value > 0.05).

  16. LC/ESI-MS n and 1H HR-MAS NMR analytical methods as useful taxonomical tools within the genus Cystoseira C. Agardh (Fucales; Phaeophyceae).

    PubMed

    Jégou, Camille; Culioli, Gérald; Kervarec, Nelly; Simon, Gaëlle; Stiger-Pouvreau, Valérie

    2010-12-15

    Species of the genus Cystoseira are particularly hard to discriminate, due to the complexity of their morphology, which can be influenced by their phenological state and ecological parameters. Our study emphasized the relevance of two kinds of analytical tools, (1) LC/ESI-MS(n) and (2) (1)H HR-MAS NMR, also called in vivo NMR, for identifying Cystoseira specimens at the species level and discussing their taxonomy. For these analyses, samples were collected at several locations in Brittany (France), where Cystoseira baccata, C. foeniculacea, C. humilis, C. nodicaulis and C. tamariscifolia were previously reported. To validate our chemical procedure, the sequence of ITS2 was obtained for each species to investigate their phylogenetic relationships at the molecular level. Our study highlighted the consistency of the two physico-chemical methods, compared with the "classical" molecular approach, for studying taxonomy within the genus Cystoseira. In particular, LC/ESI-MS(n) and phylogenetic analyses converged on the discrimination of two taxonomical groups among the five species. The occurrence of some specific signals in the (1)H HR-MAS NMR spectra and/or some characteristic chemical compounds in the LC/ESI-MS(n) analysis could be regarded as discriminating factors. LC/ESI-MS(n) and (1)H HR-MAS NMR turned out to be two relevant and innovative techniques for taxonomic discrimination within this complex genus.

  17. Evaluation methodology for comparing memory and communication of analytic processes in visual analytics

    SciTech Connect

    Ragan, Eric D; Goodall, John R

    2014-01-01

    Provenance tools can help capture and represent the history of analytic processes. In addition to supporting analytic performance, provenance tools can be used to support memory of the process and communication of the steps to others. Objective evaluation methods are needed to evaluate how well provenance tools support analysts' memory and communication of analytic processes. In this paper, we present several methods for the evaluation of process memory, and we discuss the advantages and limitations of each. We discuss methods for determining a baseline process for comparison, and we describe various methods that can be used to elicit process recall, step ordering, and time estimations. Additionally, we discuss methods for conducting quantitative and qualitative analyses of process memory. By organizing possible memory evaluation methods and providing a meta-analysis of the potential benefits and drawbacks of different approaches, this paper can inform study design and encourage objective evaluation of process memory and communication.

  18. Stable carbon isotopic ratio measurement of polycyclic aromatic hydrocarbons as a tool for source identification and apportionment--a review of analytical methodologies.

    PubMed

    Buczyńska, A J; Geypens, B; Van Grieken, R; De Wael, K

    2013-02-15

    The measurement of the ratio of stable isotopes of carbon ((13)C/(12)C expressed as a δ(13)C) in the individual components of a sample may be used as a means to identify the origin of these components. This article reviews the approaches and reports on the successes and failures of source identification and apportionment of Polycyclic Aromatic Hydrocarbons (PAHs) with the use of compound-specific isotope analysis (CSIA). One of the conditions for a precise and accurate analysis of isotope ratios with the use of GC-C-IRMS is the need for well separated peaks, with no co-elutions, and reduced unresolved complex mixture (UCM). Additionally, special care needs to be taken for an investigation of possible isotope fractionation effects introduced during the analytical treatment of samples. With the above-mentioned problems in mind, this review discusses in detail and compares current laboratory methodologies, mainly in the extraction and subsequent clean-up techniques used for environmental samples (air particulate matter, soil and sediments). Sampling strategies, the use of isotopic internal standards and the ranges for precision and accuracy are also reported and discussed.
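    For reference, the delta notation used throughout such studies is conventionally defined against a standard (VPDB for carbon):

\[
\delta^{13}\mathrm{C} \;=\; \left( \frac{\left(^{13}\mathrm{C}/^{12}\mathrm{C}\right)_{\mathrm{sample}}}{\left(^{13}\mathrm{C}/^{12}\mathrm{C}\right)_{\mathrm{standard}}} - 1 \right) \times 1000\ \text{‰}
\]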

  19. Dendritic silver nanostructures obtained via one-step electrosynthesis: effect of nonanesulfonic acid and polyvinylpyrrolidone as additives on the analytical performance for hydrogen peroxide sensing

    NASA Astrophysics Data System (ADS)

    Guadagnini, Lorella; Ballarin, Barbara; Tonelli, Domenica

    2013-10-01

    The electrochemical deposition of silver nanodendrites (AgNDs) on pure graphite sheet (PGS) electrodes, both in the absence of surfactants/templates and in the presence of 1-nonanesulfonic acid (NS) or polyvinylpyrrolidone (PVP) additives, is reported. The synthesis carried out without additives and with NS produced a larger amount of large AgNDs (1-5 μm in size), with little influence from NS, while deposition with PVP favoured the formation of smaller spherical particles (average diameter below 150 nm). The performance of the electrodes toward the electroreduction of H2O2 was investigated by chronoamperometry at -0.4 V and at more cathodic applied potentials (-0.6 and -0.8 V). The electrodes fabricated without additives and in the presence of NS displayed similar performance, while those fabricated with PVP exhibited significantly lower sensitivity. This suggests that AgNDs present enhanced electrocatalytic activity with respect to the spherical aggregates, since the amount of Ag deposited on PGS was practically the same. The best amperometric responses among those recorded at -0.4 V in PBS (pH 6.7) exhibited a linear range extending from 0.1 to 3.5 mM, a detection limit of about 20 μM and a sensitivity close to 200 mA M-1 cm-2. The proposed electrodes display sensitivities markedly better than those reported in the literature for similar Ag-based sensors.
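    The detection limit and sensitivity quoted above are standard calibration quantities; a minimal sketch of how they are typically derived (slope of the calibration line and a 3-sigma criterion) follows, with invented calibration data and an assumed blank noise level.

```python
import numpy as np

# Invented calibration: H2O2 concentration (mM) vs current density (mA cm^-2).
conc = np.array([0.1, 0.5, 1.0, 2.0, 3.5])
current = np.array([0.02, 0.10, 0.20, 0.41, 0.70])

slope, intercept = np.polyfit(conc, current, 1)  # sensitivity, mA mM^-1 cm^-2
sigma_blank = 0.0013                             # assumed blank noise, mA cm^-2
lod = 3 * sigma_blank / slope                    # 3-sigma detection limit, mM
print(f"sensitivity ≈ {slope * 1000:.0f} mA M^-1 cm^-2, LOD ≈ {lod * 1000:.0f} µM")
```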

  20. Regulatory use of computational toxicology tools and databases at the United States Food and Drug Administration's Office of Food Additive Safety.

    PubMed

    Arvidson, Kirk B; Chanderbhan, Ronald; Muldoon-Jacobs, Kristi; Mayer, Julie; Ogungbesan, Adejoke

    2010-07-01

    Over 10 years ago, the Office of Food Additive Safety (OFAS) in the FDA's Center for Food Safety and Applied Nutrition implemented the formal use of structure-activity relationship analysis and quantitative structure-activity relationship (QSAR) analysis in the premarket review of food-contact substances. More recently, OFAS has implemented the use of multiple QSAR software packages and has begun investigating the use of metabolism data and metabolism predictive models in our QSAR evaluations of food-contact substances. In this article, we provide an overview of the programs used in OFAS as well as a perspective on how to apply multiple QSAR tools in the review process of a new food-contact substance.

  1. Testing microtaphofacies as an analytic tool for integrated facies and sedimentological analysis using Lower Miocene mixed carbonate/siliciclastic sediments from the North Alpine Foreland Basin

    NASA Astrophysics Data System (ADS)

    Nebelsick, James; Bieg, Ulrich

    2010-05-01

    Taphonomic studies have mostly concentrated on the investigation and quantification of isolated macroscopic faunal and floral elements. Carbonate rocks, in contrast to isolated macroscopic objects, have rarely been specifically addressed in terms of taphonomic features, although many aspects of microfacies analysis are directly related to the preservation of constituent biogenic components. There is thus a high potential for analyzing and quantifying taphonomic features in carbonate rocks (microtaphofacies), not least as an additional tool for facies analysis. Analyzing the role of taphonomy in carbonate environments can be used to determine how different skeletal architectures through time, and evolving synecological relationships (bioerosion and encrustation), have influenced carbonate environments and their preservation in the rock record. This pilot study analyses the microtaphofacies of a Lower Miocene, shallow-water, mixed carbonate-siliciclastic environment from the North Alpine Foreland Basin (Molasse Sea) of southern Germany. The sediments range from biogenic bryomol carbonates to pure siliciclastics. This allows environmental interpretation to be made not only with respect to biogenic composition (dominated by bivalves, gastropods, bryozoans and barnacles), but also with respect to siliciclastic grain characteristics and sedimentary features. Facies interpretation is relatively straightforward, with a somewhat varied nearshore facies distribution dominated by carbonate, which grades into higher-energy, siliciclastic offshore sediments. Taphonomic features are assessed along this gradient with respect to total component composition as well as by following the trajectories of individual component types. The results are interpreted with respect to biogenic production, fragmentation, abrasion and transport.

  2. A novel ion-pairing chromatographic method for the simultaneous determination of both nicarbazin components in feed additives: chemometric tools for improving the optimization and validation.

    PubMed

    De Zan, María M; Teglia, Carla M; Robles, Juan C; Goicoechea, Héctor C

    2011-07-15

    The development, optimization and validation of an ion-pairing high performance liquid chromatography method for the simultaneous determination of both nicarbazin (NIC) components, 4,4'-dinitrocarbanilide (DNC) and 2-hydroxy-4,6-dimethylpyrimidine (HDP), in bulk materials and feed additives are described. An experimental design was used for the optimization of the chromatographic system. Four variables, including mobile phase composition and oven temperature, were analyzed through a central composite design exploring their contribution to analyte separation. Five responses (peak resolutions, HDP capacity factor, HDP tailing and analysis time) were modelled using response surface methodology and optimized simultaneously by implementing the desirability function. The optimum conditions resulted in a mobile phase consisting of 10.0 mmol L(-1) 1-heptanesulfonate, 20.0 mmol L(-1) sodium acetate, pH=3.30 buffer and acetonitrile in a gradient system at a flow rate of 1.00 mL min(-1). The column was an Inertsil ODS-3 (4.6 mm×150 mm, 5 μm particle size) at 40.0°C. Detection was performed at 300 nm by a diode array detector. The validation results indicated high selectivity and good precision, with RSD less than 1.0% for both components in both intra- and inter-assay precision studies. Linearity was proved for a range of 32.0-50.0 μg mL(-1) of NIC in sample solution. The recovery, studied at three different fortification levels, varied from 98.0% to 101.4% for HDP and from 99.1% to 100.2% for DNC. The applicability of the method was demonstrated by determining the DNC and HDP content in raw materials and commercial formulations used for coccidiosis prevention. Assay results on real samples showed that considerable differences in the DNC:HDP molecular ratio exist among them.

  3. Quality Indicators for Learning Analytics

    ERIC Educational Resources Information Center

    Scheffel, Maren; Drachsler, Hendrik; Stoyanov, Slavi; Specht, Marcus

    2014-01-01

    This article proposes a framework of quality indicators for learning analytics that aims to standardise the evaluation of learning analytics tools and to provide a means to capture evidence for the impact of learning analytics on educational practices in a standardised manner. The criteria of the framework and its quality indicators are based on…

  4. Analytics for Cyber Network Defense

    SciTech Connect

    Plantenga, Todd.; Kolda, Tamara Gibson

    2011-06-01

    This report provides a brief survey of analytics tools considered relevant to cyber network defense (CND). Ideas and tools come from fields such as statistics, data mining, and knowledge discovery. Some analytics are considered standard mathematical or statistical techniques, while others reflect current research directions. In all cases the report attempts to explain the relevance to CND with brief examples.

  5. The Analytic Process Model for System Design and Measurement: A computer-Aided Tool for Analyzing Training Systems and other Human-Machine Systems

    DTIC Science & Technology

    1985-02-01

    Keywords: performance measurement; effectiveness measurement; system populations; Bradley Infantry Fighting Vehicle; BIFV; Analytic Process Model; APM. The analytic process model (APM) was developed from earlier models and applied in sample fashion to an existing system (the Bradley Infantry Fighting Vehicle)…

  6. Using Maps in Web Analytics to Evaluate the Impact of Web-Based Extension Programs

    ERIC Educational Resources Information Center

    Veregin, Howard

    2015-01-01

    Maps can be a valuable addition to the Web analytics toolbox for Extension programs that use the Web to disseminate information. Extension professionals use Web analytics tools to evaluate program impacts. Maps add a unique perspective through visualization and analysis of geographic patterns and their relationships to other variables. Maps can…

  7. Multiplexed Capillary Electrophoresis as Analytical Tool for Fast Optimization of Multi-Enzyme Cascade Reactions - Synthesis of Nucleotide Sugars: Dedicated to Prof. Dr. Vladimir Křen on the occasion of his 60(th) birthday.

    PubMed

    Wahl, Claudia; Hirtz, Dennis; Elling, Lothar

    2016-10-01

    Nucleotide sugars are considered bottleneck and expensive substrates for enzymatic glycan synthesis using Leloir glycosyltransferases. Synthesis from cheap substrates such as monosaccharides is accomplished by multi-enzyme cascade reactions. Optimization of product yields in such enzyme modules depends on the interplay of multiple parameters of the individual enzymes and demands a considerable time effort when conventional analytical methods like capillary electrophoresis (CE) or HPLC are applied. We here demonstrate for the first time multiplexed CE (MP-CE) as a fast analytical tool for the optimization of nucleotide sugar synthesis with multi-enzyme cascade reactions. We introduce a universal separation method for nucleotides and nucleotide sugars, enabling us to analyze the composition of six different enzyme modules in a high-throughput format. Optimization of parameters (temperature, pH, inhibitors, kinetics, cofactors and enzyme amount) employing MP-CE analysis is demonstrated for enzyme modules for the synthesis of UDP-α-D-glucuronic acid (UDP-GlcA) and UDP-α-D-galactose (UDP-Gal). In this way we achieve high space-time yields: 1.8 g/(L·h) for UDP-GlcA and 17 g/(L·h) for UDP-Gal. The presented MP-CE methodology has the potential to be used as a general analytical tool for fast optimization of multi-enzyme cascade reactions.

  8. Isoflurane versus sevoflurane with interscalene block for shoulder arthroscopic procedures: Value of process capability indices as an additional tool for data analysis

    PubMed Central

    Tantry, Thrivikrama Padur; Karanth, Harish; Shenoy, Sunil P; Ayya, Shreekantha V; Shetty, Pramal K; Adappa, Karunakara K

    2016-01-01

    Background and Aims: Hypotensive anaesthesia reduces intra-articular bleeding and promotes visualisation during arthroscopy. The haemodynamic effects of the inhalational agents isoflurane and sevoflurane have been studied extensively, and both were found to reduce mean arterial pressures (MBP) to an equivalent magnitude. We investigated the relative ability of isoflurane vis-a-vis sevoflurane to maintain the target systolic blood pressure (SBP) in patients undergoing shoulder arthroscopic procedures. Methods: In a prospective randomised study, 59 patients in two groups (30 and 29 patients) received concomitant general anaesthesia (1.2–1.5 MAC of isoflurane or sevoflurane) and an interscalene brachial plexus block. Nitrous oxide was used in both groups. Intraoperatively, serial recordings of SBP, diastolic blood pressure (DBP), MBP and heart rate were taken at 3-min intervals. The manipulations needed to achieve the target SBP (T = 90 mmHg) for optimal arthroscopic visualisation and to treat unacceptable hypotensive episodes were noted. Conventional statistical tests and process capability index (PCI) evaluation were both deployed for data analysis. Results: Lower mean SBP and DBP were recorded for isoflurane patients as compared to sevoflurane (P < 0.05, for mean, maximum and minimum recordings). Higher mean heart rates were recorded for isoflurane (P < 0.05). PCIs indicated that isoflurane was superior to sevoflurane in the ease of achieving the target SBP of 90 mmHg as well as in maintaining blood pressures in the range of 80–100 mmHg. Conclusion: Isoflurane provides better intraoperative haemodynamic status vis-a-vis sevoflurane in patients undergoing shoulder arthroscopic surgery with preliminary interscalene blockade. The PCI can be a useful additional medical data analysis tool. PMID:28003697
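    As a minimal illustration of the process capability indices used in this study, the sketch below computes Cp and Cpk for hypothetical serial SBP readings against the 80-100 mmHg range mentioned in the abstract; the readings are invented, not study data.

```python
import numpy as np

# Hypothetical serial SBP readings for one group (mmHg); not study data.
sbp = np.array([88, 92, 90, 85, 95, 91, 89, 93, 87, 94], dtype=float)
lsl, usl = 80.0, 100.0  # specification limits from the study's target range

mu, sigma = sbp.mean(), sbp.std(ddof=1)
cp = (usl - lsl) / (6 * sigma)               # potential capability
cpk = min(usl - mu, mu - lsl) / (3 * sigma)  # capability accounting for centring
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
```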

  9. Visual Analytics Technology Transition Progress

    SciTech Connect

    Scholtz, Jean; Cook, Kristin A.; Whiting, Mark A.; Lemon, Douglas K.; Greenblatt, Howard

    2009-09-23

    The authors provide a description of the transition process for visual analytic tools and contrast it with the transition process for more traditional software tools. Taking these differences into account, the paper describes a user-oriented approach to technology transition, including a discussion of key factors that should be considered and adapted to each situation. The progress made in transitioning visual analytic tools in the past five years is described, and the challenges that remain are enumerated.

  10. Phase II Fort Ord Landfill Demonstration Task 8 - Refinement of In-line Instrumental Analytical Tools to Evaluate their Operational Utility and Regulatory Acceptance

    SciTech Connect

    Daley, P F

    2006-04-03

    The overall objective of this project is the continued development, installation, and testing of continuous water sampling and analysis technologies for application to on-site monitoring of groundwater treatment systems and remediation sites. In a previous project, an on-line analytical system (OLAS) for multistream water sampling was installed at the Fort Ord Operable Unit 2 Groundwater Treatment System, with the objective of developing a simplified analytical method for detection of Compounds of Concern at that plant, and continuous sampling of up to twelve locations in the treatment system, from raw influent waters to treated effluent. Earlier implementations of the water sampling and processing system (Analytical Sampling and Analysis Platform, ASAP; A+RT, Milpitas, CA) depended on off-line integrators that produced paper plots of chromatograms and sent summary tables to a host computer for archiving. We developed a basic LabVIEW (National Instruments, Inc., Austin, TX) based gas chromatography control and data acquisition system that was the foundation for further development and integration with the ASAP system. Advantages of this integration include electronic archiving of all raw chromatographic data and a flexible programming environment to support development of improved ASAP operation and automated reporting. The initial goals of integrating the preexisting LabVIEW chromatography control system with the ASAP, and demonstrating a simplified, site-specific analytical method, were successfully achieved. However, although the principal objective of this system was assembly of an analytical system that would give plant operators an up-to-the-minute view of the plant's performance, several obstacles remained. Data reduction with the base LabVIEW system was limited to peak detection and simple tabular output, patterned after commercial chromatography integrators, with compound retention times and peak areas. Preparation of calibration curves, method detection

  11. Version 1.00 programmer's tools used in constructing the INEL RML/analytical radiochemistry sample tracking database and its user interface

    SciTech Connect

    Femec, D.A.

    1995-09-01

    This report describes two code-generating tools used to speed design and implementation of relational databases and user interfaces: CREATE-SCHEMA and BUILD-SCREEN. CREATE-SCHEMA produces the SQL commands that actually create and define the database. BUILD-SCREEN takes templates for data entry screens and generates the screen management system routine calls to display the desired screen. Both tools also generate the related FORTRAN declaration statements and precompiled SQL calls. Included with this report is the source code for a number of FORTRAN routines and functions used by the user interface. This code is broadly applicable to a number of different databases.
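    To illustrate the code-generation idea in a modern idiom (the original tools emitted SQL and FORTRAN), the toy sketch below generates SQL DDL from a declarative table description; the table and column names are hypothetical, and the sketch is an analogue of the approach, not the original CREATE-SCHEMA tool.

```python
# Toy analogue of a schema generator: emit SQL DDL from a table description.
def create_table_sql(table: str, columns: dict[str, str]) -> str:
    cols = ",\n    ".join(f"{name} {sqltype}" for name, sqltype in columns.items())
    return f"CREATE TABLE {table} (\n    {cols}\n);"

# Hypothetical sample-tracking table; names are illustrative only.
print(create_table_sql("sample_tracking",
                       {"sample_id": "INTEGER PRIMARY KEY",
                        "received": "DATE",
                        "analysis": "VARCHAR(40)"}))
```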

  12. Using Modern Solid-State Analytical Tools for Investigations of an Advanced Carbon Capture Material: Experiments for the Inorganic Chemistry Laboratory

    ERIC Educational Resources Information Center

    Wriedt, Mario; Sculley, Julian P.; Aulakh, Darpandeep; Zhou, Hong-Cai

    2016-01-01

    A simple and straightforward synthesis of an ultrastable porous metal-organic framework (MOF) based on copper(II) and a mixed N donor ligand system is described as a laboratory experiment for chemistry undergraduate students. These experiments and the resulting analysis are designed to teach students basic research tools and procedures while…

  13. Evaluating analytic and risk assessment tools to estimate sediment and nutrients losses from agricultural lands in the southern region of the USA

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Non-point source pollution from agricultural fields is a critical problem associated with water quality impairment in the USA and a low-oxygen environment in the Gulf of Mexico. The use, development and enhancement of qualitative and quantitative models or tools for assessing agricultural runoff qua...

  14. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics

    PubMed Central

    2016-01-01

    Background We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. Objective To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. Methods The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Results Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence

  15. Hydrophilic Interaction Chromatography Hyphenated with Mass Spectrometry: A Powerful Analytical Tool for the Comparison of Originator and Biosimilar Therapeutic Monoclonal Antibodies at the Middle-up Level of Analysis.

    PubMed

    D'Atri, Valentina; Fekete, Szabolcs; Beck, Alain; Lauber, Matthew; Guillarme, Davy

    2017-02-07

    The development and approval processes of biosimilar mAbs depend on their comparability to originators. Therefore, analytical comparisons are required to assess structural features and post-translational modifications (PTM) and thereby minimize the risk of clinically meaningful differences between biosimilar and originator drug products. The glycosylation pattern of mAbs is considered an important critical quality attribute (CQA), and several analytical approaches have been proposed for characterizing and monitoring a glycosylation profile, albeit mainly at the glycan and glycopeptide levels of analysis. In this study, we demonstrate the utility of hydrophilic interaction chromatography (HILIC) hyphenated with mass spectrometry (MS) for the qualitative profiling of glycosylation patterns at the protein level, by comparing originator and biosimilar mAbs (Remicade/Remsima/Inflectra, Herceptin/Trastuzumab B, and Erbitux/Cetuximab B) using a middle-up approach. We demonstrate the ability of HILIC to resolve hydrophilic variants of protein biopharmaceuticals at the middle-up level of analysis, its complementarity to reversed-phase liquid chromatography, and its hyphenation to MS. HILIC features combined with MS make a powerful analytical tool for the comparison of originator and biosimilar mAbs that could eventually be applied in routine analyses for quality control.

  16. Analytic materials

    NASA Astrophysics Data System (ADS)

    Milton, Graeme W.

    2016-11-01

    The theory of inhomogeneous analytic materials is developed. These are materials where the coefficients entering the equations involve analytic functions. Three types of analytic materials are identified. The first two types involve an integer p. If p takes its maximum value, then we have a complete analytic material. Otherwise, it is incomplete analytic material of rank p. For two-dimensional materials, further progress can be made in the identification of analytic materials by using the well-known fact that a 90° rotation applied to a divergence-free field in a simply connected domain yields a curl-free field, and this can then be expressed as the gradient of a potential. Other exact results for the fields in inhomogeneous media are reviewed. Also reviewed is the subject of metamaterials, as these materials provide a way of realizing desirable coefficients in the equations.

  18. The use of aqueous normal phase chromatography as an analytical tool for food analysis: determination of histamine as a model system.

    PubMed

    Dang, Andy; Pesek, Joseph J; Matyska, Maria T

    2013-12-15

    A simple, fast, robust protocol that does not require derivatisation for the determination of histamine, a polar primary bioamine, in red wine and food products is presented. Histamine can be retained and quantified under aqueous normal phase (ANP) conditions, using a Diamond Hydride (DH) column for high performance liquid chromatography/ultraviolet-visible (HPLC-UV) and mass spectrometry (MS) detection. An ANP gradient was developed, allowing for the direct analyses of the wines and food products. The peak shape for this basic compound was also evaluated under optimal analytical conditions. From UV and MS detection, a linear correlation for quantitation is obtained. The basic strategy presented for the analysis of histamine is applicable to a broad range of polar compounds in a variety of foods and beverages.

  19. High-throughput characterization of sediment organic matter by pyrolysis-gas chromatography/mass spectrometry and multivariate curve resolution: A promising analytical tool in (paleo)limnology.

    PubMed

    Tolu, Julie; Gerber, Lorenz; Boily, Jean-François; Bindler, Richard

    2015-06-23

    Molecular-level chemical information about organic matter (OM) in sediments helps to establish the sources of OM and the prevalent degradation/diagenetic processes, both essential for understanding the cycling of carbon (C) and of the elements associated with OM (toxic trace metals and nutrients) in lake ecosystems. Ideally, analytical methods for characterizing OM should allow high sample throughput, consume small amounts of sample and yield relevant chemical information, which are essential for multidisciplinary, high-temporal resolution and/or large spatial scale investigations. We have developed a high-throughput analytical method based on pyrolysis-gas chromatography/mass spectrometry and automated data processing to characterize sedimentary OM in sediments. Our method consumes 200 μg of freeze-dried and ground sediment sample. Pyrolysis was performed at 450°C, which was found to avoid degradation of specific biomarkers (e.g., lignin compounds, fresh carbohydrates/cellulose) compared to 650°C, which is in the range of temperatures commonly applied for environmental samples. The optimization was conducted using the top ten sediment samples of an annually resolved sediment record (containing 16-18% and 1.3-1.9% of total carbon and nitrogen, respectively). Several hundred pyrolytic compound peaks were detected of which over 200 were identified, which represent different classes of organic compounds (i.e., n-alkanes, n-alkenes, 2-ketones, carboxylic acids, carbohydrates, proteins, other N compounds, (methoxy)phenols, (poly)aromatics, chlorophyll and steroids/hopanoids). Technical reproducibility measured as relative standard deviation of the identified peaks in triplicate analyses was 5.5±4.3%, with 90% of the RSD values within 10% and 98% within 15%. Finally, a multivariate calibration model was calculated between the pyrolytic degradation compounds and the sediment depth (i.e., sediment age), which is a function of degradation processes and changes in OM

  20. The color of complexes and UV-vis spectroscopy as an analytical tool of Alfred Werner's group at the University of Zurich.

    PubMed

    Fox, Thomas; Berke, Heinz

    2014-01-01

    Two PhD theses (Alexander Gordienko, 1912; Johannes Angerstein, 1914) and a dissertation in partial fulfillment of a PhD thesis (H. S. French, Zurich, 1914) are reviewed that deal with hitherto unpublished UV-vis spectroscopy work on coordination compounds in the group of Alfred Werner. The method of measuring UV-vis spectra in Alfred Werner's time is described in detail. Examples of spectra of complexes are given, which were partly interpreted in terms of structure (cis ↔ trans configuration, counting the number of bands for structural relationships, and shifts of general spectral features upon consecutive replacement of ligands). A more complete interpretation of spectra was hampered in Alfred Werner's time by the lack of a theory of light absorption, a correct theory of electron excitation, and a ligand field theory for coordination compounds. The experimentally difficult data acquisition and the difficult spectral interpretation might be the reasons why this method did not experience a breakthrough in Alfred Werner's group and never played a more prominent role there as an analytical method. Nevertheless, the application of UV-vis spectroscopy to coordination compounds was unique and novel, and attests to Alfred Werner's great aptitude and keenness to go beyond conventional practice.

  1. Electrospray ionization mass spectrometry: a key analytical tool for the characterization of regioselectively derivatized maltooligosaccharides obtained starting from natural beta-cyclodextrin.

    PubMed

    Lesur, David; Gassama, Abdoulaye; Moreau, Vincent; Djedaïni-Pilard, Florence; Brique, Arnaud; Pilard, Serge

    2006-01-01

    The development of natural cyclodextrins (CDs) for various industrial applications (agroalimentary, cosmetic or pharmaceutical) constitutes a continuous challenge. For the integration of these agricultural plant products in the creation of super-absorbent biodegradable and hypoallergenic materials (water-retaining agents, cosmetic hydrating and texturing, pharmaceutical and horticultural products) to replace synthetic polymers, we have developed chemical methods to access regioselectively C-6-derivatized maltooligosaccharides starting from CDs. These compounds are highly suitable for further chemical modifications and are expected to give access to a new class of polymeric materials with potential applications such as water-retaining agents in the disposable nappies industry. For the structural analysis of carbohydrates, electrospray ionization mass spectrometry (ESI-MS) offers precise results, analytical versatility and very high sensitivity. We report herein the rapid and convenient follow-up of chemical reactions, the purity evaluation of intermediates and final products, and the structural characterization of derivatized maltooligosaccharides, obtained by acidic cleavage (acetolysis) of halogenated and esterified CDs, using ESI-MS in combination with the high-resolution (HRMS) and tandem mass spectrometry (MS/MS) capabilities of a quadrupole orthogonal time-of-flight (Q-TOF) mass spectrometer.

  2. Comparative study between univariate spectrophotometry and multivariate calibration as analytical tools for quantitation of Benazepril alone and in combination with Amlodipine.

    PubMed

    Farouk, M; Elaziz, Omar Abd; Tawakkol, Shereen M; Hemdan, A; Shehata, Mostafa A

    2014-04-05

    Four simple, accurate, reproducible, and selective methods have been developed and subsequently validated for the determination of Benazepril (BENZ) alone and in combination with Amlodipine (AML) in pharmaceutical dosage form. The first method is pH-induced difference spectrophotometry, where BENZ can be measured in the presence of AML because it shows maximum absorption at 237 nm and 241 nm in 0.1 N HCl and 0.1 N NaOH, respectively, while AML shows no wavelength shift in either solvent. The second method is the new Extended Ratio Subtraction Method (EXRSM) coupled to the Ratio Subtraction Method (RSM) for determination of both drugs in commercial dosage form. The third and fourth methods are multivariate calibration methods: Principal Component Regression (PCR) and Partial Least Squares (PLS). A detailed validation of the methods was performed following the ICH guidelines; the standard curves were linear in the range of 2-30 μg/mL for BENZ in the difference and extended ratio subtraction spectrophotometric methods, and 5-30 μg/mL for AML in the EXRSM method, with well-accepted mean correlation coefficients for each analyte. The intra-day and inter-day precision and accuracy results were well within the acceptable limits.
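    A minimal sketch of the multivariate calibration step (PLS shown; PCR is analogous) is given below, using synthetic mixture spectra in place of the published data; the component count and concentration ranges are illustrative assumptions.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Synthetic two-component mixture spectra: rows = calibration samples,
# columns = absorbances at 100 wavelengths; y = [BENZ, AML] in µg/mL.
rng = np.random.default_rng(3)
pure = rng.random((2, 100))                  # stand-in pure-component spectra
y = rng.uniform([2, 5], [30, 30], (25, 2))   # concentrations in stated ranges
X = y @ pure + rng.normal(0, 0.01, (25, 100))

pls = PLSRegression(n_components=4).fit(X, y)
print(pls.predict(X[:3]))                    # predicted concentrations
```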

  3. New analytical tools and epidemiological data for the identification of HbA2 borderline subjects in the screening for beta-thalassemia.

    PubMed

    Mosca, Andrea; Paleari, Renata; Galanello, Renzo; Sollaino, Carla; Perseu, Lucia; Demartis, Franca Rosa; Passarello, Cristina; Giambona, Antonino; Maggio, Aurelio

    2008-08-01

    The increase of HbA(2) is the most important feature in the identification of beta-thalassemia carriers. However, some carriers are difficult to identify, because the level of HbA(2) is not in the typical range. Few data are available concerning the prevalence of such unusual phenotypes, and knowing their expected prevalence could be helpful in detecting systematic drifts in the analytical systems for HbA(2) quantification. In this study we report a retrospective investigation in two centres with high prevalence of beta-thalassemia. The prevalence of borderline subjects was found to be 2.2 and 3.0%, respectively. The genotypes of a subgroup of these subjects were then analyzed and in about 25% of cases a mutation in the globin genes was identified. We conclude that the occurrence of HbA(2) borderline phenotypes is not a rare event. In order to obtain more accurate HbA(2) measurements the development of an international reference measurement system for HbA(2), based on quantitative peptide mapping, has been recently started. We believe that the innovative approach of our method could also be used as a model to develop accurate quantitative methods for other red cell proteins relevant to the biodynamic properties and the surface electrochemistry of erythrocytes.

  4. Electrokinetic remediation of soils contaminated by potentially toxic metals: Dedicated analytical tools for assessing the contamination baseline in a complex scenario.

    PubMed

    Ferrucci, Aurelio; Vocciante, Marco; Bagatin, Roberto; Ferro, Sergio

    2017-02-20

    In order to assess the capabilities of a remediation technology, and to judge its efficacy, it is necessary to evaluate the initial average contamination level of the soil, an operation that can be difficult because of the inhomogeneity of the contamination itself. The goal is even more challenging when different contaminants are present, differing greatly both in nature and in concentration. Referring to an industrial site contaminated mainly by As, Cd, Cu, Pb, Sb, Tl and Zn, we present a new approach to the processing of sampling data needed to establish the pre-intervention baseline: an estimate of the average contamination is obtained through a suitable integration of the volume underlying the distribution curve of each contaminating species. This information, otherwise not accessible by sampling of discrete points, is useful in evaluating the effectiveness of the remediation technology under investigation, and can be considered for other reclamation approaches as well. Since "chemometrically acceptable" results are typically achieved by increasing the number of samples (with related analytical investments), the proposed approach can help keep these ancillary costs low while providing more reliable results.
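    A minimal sketch of the baseline estimate described above: interpolate the discrete sampling points onto a grid, integrate the volume under the concentration surface, then divide by the site area. Coordinates, concentrations, and the site geometry are invented.

```python
import numpy as np
from scipy.interpolate import griddata
from scipy.integrate import trapezoid

# Invented sampling points: (x, y) in metres, Pb concentration in mg/kg.
rng = np.random.default_rng(4)
pts = rng.uniform(0, 100, (40, 2))
conc = rng.lognormal(mean=4, sigma=0.7, size=40)

# Interpolate onto a regular grid over a 100 m x 100 m site.
xg, yg = np.meshgrid(np.linspace(0, 100, 101), np.linspace(0, 100, 101))
surface = griddata(pts, conc, (xg, yg), method="linear")
surface = np.nan_to_num(surface)  # points outside the convex hull -> 0

# Volume under the concentration surface, then site-average concentration.
volume = trapezoid(trapezoid(surface, xg[0], axis=1), yg[:, 0])
mean_conc = volume / (100.0 * 100.0)
print(f"average contamination ≈ {mean_conc:.0f} mg/kg")
```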

  5. An analytical framework and tool ('InteRa') for integrating the informal recycling sector in waste and resource management systems in developing countries.

    PubMed

    Velis, Costas A; Wilson, David C; Rocca, Ondina; Smith, Stephen R; Mavropoulos, Antonis; Cheeseman, Chris R

    2012-09-01

    In low- and middle-income developing countries, the informal (collection and) recycling sector (here abbreviated IRS) is an important, but often unrecognised, part of a city's solid waste and resources management system. Recent evidence shows recycling rates of 20-30% achieved by IRS systems, reducing collection and disposal costs. They play a vital role in the value chain by reprocessing waste into secondary raw materials, providing a livelihood to around 0.5% of urban populations. However, persisting factual and perceived problems are associated with IRS (waste-picking): occupational and public health and safety (H&S), child labour, uncontrolled pollution, untaxed activities, crime and political collusion. Increasingly, incorporating IRS as a legitimate stakeholder and functional part of solid waste management (SWM) is attempted, further building recycling rates in an affordable way while also addressing the negatives. Based on a literature review and a practitioner's workshop, here we develop a systematic framework--or typology--for classifying and analysing possible interventions to promote the integration of IRS in a city's SWM system. Three primary interfaces are identified: between the IRS and the SWM system, the materials and value chain, and society as a whole; these are underlain by a fourth, which is focused on organisation and empowerment. To maximise the potential for success, IRS integration/inclusion/formalisation initiatives should consider all four categories in a balanced way and pay increased attention to their interdependencies, which are central to success, including specific actions such as the IRS having access to source-separated waste. A novel rapid evaluation and visualisation tool is presented--the integration radar (diagram), or InteRa--aimed at illustrating the degree to which a planned or existing intervention considers each of the four categories. The tool is further demonstrated by application to 10 cases around the world, including a step
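    A minimal sketch of an InteRa-style radar plot over the four categories named above; the degree-of-integration scores are hypothetical.

```python
import numpy as np
import matplotlib.pyplot as plt

categories = ["SWM system", "Materials & value chain",
              "Society", "Organisation & empowerment"]
scores = [0.7, 0.5, 0.3, 0.6]  # hypothetical scores in [0, 1]

# Close the polygon by repeating the first point.
angles = np.linspace(0, 2 * np.pi, len(categories), endpoint=False).tolist()
angles += angles[:1]
values = scores + scores[:1]

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
ax.plot(angles, values, linewidth=1.5)
ax.fill(angles, values, alpha=0.25)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(categories, fontsize=8)
ax.set_ylim(0, 1)
ax.set_title("Integration radar (illustrative)")
plt.show()
```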

  6. Understanding Education Involving Geovisual Analytics

    ERIC Educational Resources Information Center

    Stenliden, Linnea

    2013-01-01

    Handling the vast amounts of data and information available in contemporary society is a challenge. Geovisual Analytics provides technology designed to increase the effectiveness of information interpretation and analytical task solving. To date, little attention has been paid to the role such tools can play in education and to the extent to which…

  7. Ion beam analysis and PD-MS as new analytical tools for quality control of pharmaceuticals: comparative study from fluphenazine in solid dosage forms.

    PubMed

    Nsouli, Bilal; Bejjani, Alice; Negra, Serge Della; Gardon, Alain; Thomas, Jean-Paul

    2010-09-01

    In order to evaluate the potential of accelerator-based analytical techniques (particle induced X-ray emission (PIXE), particle induced gamma-ray emission (PIGE), and particle desorption mass spectrometry (PD-MS)) for the analysis of commercial pharmaceutical products in their solid dosage form, the drug fluphenazine was taken as a representative example. It is demonstrated that PIXE and PIGE are by far the best choice for quantification of the active ingredient (AI) (certification with 7% precision) from the reactions induced on its specific heteroatoms, fluorine and sulfur, using pellets made from original tablets. Since heteroatoms cannot be present in all types of drugs, the PD-MS technique, which easily distinguishes between AI(s) and excipients, was evaluated for the same material. It is shown that quantification of the AI is obtained via detection of its protonated molecule. However, calibration curves have to be constructed from the secondary ion yield variations, since matrix effects of various natures are characteristic of such mixtures of heterogeneous materials (including deposits from soluble components). From the analysis of solid tablets (either transformed into pellets or even as received), it is strongly suggested that the physical state of the grains in the mixture is a crucial parameter in the ion emission and, accordingly, for the calibration curves. Under our specific (but not optimized) conditions the resulting precision is <17%, with an almost linear range extending from 0.04 to 7.87 mg of AI in a tablet made under the manufacturer's conditions (the commercial drug product is labeled at 5 mg).

  8. Analytical tools for identification of non-intentionally added substances (NIAS) coming from polyurethane adhesives in multilayer packaging materials and their migration into food simulants.

    PubMed

    Félix, Juliana S; Isella, Francesca; Bosetti, Osvaldo; Nerín, Cristina

    2012-07-01

    Adhesives used in food packaging to glue different materials can provide several substances as potential migrants, and the identification of potential migrants and migration tests are required to assess safety in the use of adhesives. Solid-phase microextraction in headspace mode and gas chromatography coupled to mass spectrometry (HS-SPME-GC-MS) and ChemSpider and SciFinder databases were used as powerful tools to identify the potential migrants in the polyurethane (PU) adhesives and also in the individual plastic films (polyethylene terephthalate, polyamide, polypropylene, polyethylene, and polyethylene/ethyl vinyl alcohol). Migration tests were carried out by using Tenax(®) and isooctane as food simulants, and the migrants were analyzed by gas chromatography coupled to mass spectrometry. More than 63 volatile and semivolatile compounds considered as potential migrants were detected either in the adhesives or in the films. Migration tests showed two non-intentionally added substances (NIAS) coming from PU adhesives that migrated through the laminates into Tenax(®) and into isooctane. Identification of these NIAS was achieved through their mass spectra, and 1,6-dioxacyclododecane-7,12-dione and 1,4,7-trioxacyclotridecane-8,13-dione were confirmed. Caprolactam migrated into isooctane, and its origin was the external plastic film in the multilayer, demonstrating real diffusion through the multilayer structure. Comparison of the migration values between the simulants and conditions will be shown and discussed.

  9. Analytical testing

    NASA Technical Reports Server (NTRS)

    Flannelly, W. G.; Fabunmi, J. A.; Nagy, E. J.

    1981-01-01

    Analytical methods for combining flight acceleration and strain data with shake-test mobility data to predict the effects of structural changes on flight vibrations and strains are presented. This integration of structural dynamic analysis with flight performance is referred to as analytical testing. The objective of this methodology is to analytically estimate the results of flight testing contemplated structural changes with a minimum of flight and modification trials. The categories of changes to the aircraft include mass, stiffness, absorbers, isolators, and active suppressors. Examples of applying the analytical testing methodology using flight-test and shake-test data measured on an AH-1G helicopter are included. The techniques and procedures for vibration testing and modal analysis are also described.

  10. Number series of atoms, interatomic bonds and interface bonds defining zinc-blende nanocrystals as function of size, shape and surface orientation: Analytic tools to interpret solid state spectroscopy data

    NASA Astrophysics Data System (ADS)

    König, Dirk

    2016-08-01

    Semiconductor nanocrystals (NCs) experience stress and charge transfer from embedding materials, ligands, and impurity atoms. In return, the environment of the NCs experiences an NC stress response which may lead to matrix deformation and propagated strain. Up to now, there has been no universal gauge to evaluate the stress impact on NCs and their response as a function of NC size d_NC. I deduce geometrical number series as analytical tools to obtain the number of NC atoms N_NC(d_NC[i]), of bonds between NC atoms N_bnd(d_NC[i]), and of interface bonds N_IF(d_NC[i]) for seven high-symmetry zinc-blende (zb) NCs with low-index faceting: {001} cubes, {111} octahedra, {110} dodecahedra, {001}-{111} pyramids, {111} tetrahedra, {111}-{001} quatrodecahedra and {001}-{111} quadrodecahedra. The fundamental insights into NC structures revealed here allow for major advancements in data interpretation and understanding of zb- and diamond-lattice-based nanomaterials. The analytical number series can serve as a standard procedure for stress evaluation in solid-state spectroscopy owing to their deterministic nature, ease of use, and general applicability over a wide range of spectroscopy methods as well as NC sizes, shapes, and materials.
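
    The closed-form series are not reproduced in the abstract, but their deterministic character can be illustrated with a brute-force count of the kind such series are checked against. The sketch below is a toy, not the paper's formulas: it enumerates the atoms and nearest-neighbour bonds of a {001}-faceted zinc-blende cube spanning n conventional unit cells per edge (lattice constant a = 1).

```python
import numpy as np
from itertools import product
from scipy.spatial import cKDTree

def zb_cube_counts(n_cells):
    """Count atoms and internal bonds of a {001}-faceted zinc-blende cube."""
    fcc = np.array([[0, 0, 0], [0, .5, .5], [.5, 0, .5], [.5, .5, 0]])
    atoms = []
    for ijk in product(range(n_cells + 1), repeat=3):
        cell = np.array(ijk, float)
        for site in fcc:
            for basis in (np.zeros(3), np.full(3, 0.25)):  # two sublattices
                p = cell + site + basis
                if np.all(p <= n_cells + 1e-9):
                    atoms.append(p)
    atoms = np.unique(np.round(np.array(atoms), 6), axis=0)  # drop duplicates
    # The nearest-neighbour (bond) distance in zinc-blende is sqrt(3)/4 * a.
    bonds = cKDTree(atoms).query_pairs(r=np.sqrt(3) / 4 + 1e-6)
    return len(atoms), len(bonds)

for n in (1, 2, 3):
    print(f"{n} cells/edge: {zb_cube_counts(n)}")  # (atoms, bonds), e.g. (18, 16)
```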

  11. The cryogenic on-orbit liquid analytical tool (COOLANT) - A computer program for evaluating the thermodynamic performance of orbital cryogen storage facilities

    NASA Technical Reports Server (NTRS)

    Taylor, W. J.; Honkonen, S. C.; Williams, G. E.; Liggett, M. W.; Tucker, S. P.

    1991-01-01

    The United States plans to establish a permanent manned presence at the Space Station Freedom in low earth orbit (LEO) and then carry out exploration of the solar system from this base. These plans may require orbital cryogenic propellant storage depots. The COOLANT program has been developed to analyze the thermodynamic performance of these depots to support design tradeoff studies. It was developed as part of the Long Term Cryogenic Storage Facility Systems Study for NASA/MSFC. This paper discusses the program structure and capabilities of the COOLANT program. In addition, the results of an analysis of a 200,000 lbm hydrogen/oxygen storage depot tankset using COOLANT are presented.

  12. Critical assessment of the Quartz Crystal Microbalance with Dissipation as an analytical tool for biosensor development and fundamental studies: Metallophthalocyanine-glucose oxidase biocomposite sensors.

    PubMed

    Fogel, R; Mashazi, P; Nyokong, T; Limson, J

    2007-08-30

    One of the challenges in electrochemical biosensor design is gaining a fundamental knowledge of the processes underlying immobilisation of molecules onto the electrode surface. This is of particular importance in biocomposite sensors, where concerns have arisen as to the nature of the interaction between the immobilised biological and synthetic molecules. We examined the use of the Quartz Crystal Microbalance with Dissipation (QCM-D) as a tool for fundamental analyses of a model sensor constructed by immobilising cobalt(II) phthalocyanine (TCACoPc) and glucose oxidase (GOx) onto a gold-quartz electrode for the enhanced detection of glucose. The model sensor was constructed in the aqueous phase by covalently linking the TCACoPc to the gold surface and the GOx to the TCACoPc, with each step followed by QCM-D. The aqueous metallophthalocyanine (MPc) formed a multilayer over the surface of the electrode, which could be removed to leave a monolayer with a mass loading that compared favourably to the theoretical value expected. Analysis of frequency and dissipation plots indicated covalent attachment of glucose oxidase onto the metallophthalocyanine layer. The amount of GOx bound in the model system compared favourably to calculations derived from the maximal amperometric functioning of the electrochemical sensor (examined in previously published literature: Mashazi, P.N., Ozoemena, K.I., Nyokong, T., 2006. Electrochim. Acta 52, 177-186), but not to theoretical values derived from the dimensions of GOx as established by crystallography. The strength of binding of the GOx film to the TCACoPc layer was tested using 2% SDS as a denaturant/surfactant; the GOx film was not significantly affected by this exposure. This paper thus shows that QCM-D can be used to model essential processes and interactions that dictate the functional parameters of a biosensor.
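
    Mass loadings of the kind compared above are conventionally estimated from QCM frequency shifts with the Sauerbrey relation, which holds only for thin, rigid, well-coupled films; the dissipation channel of QCM-D is what tests that assumption. A minimal sketch, assuming a standard 5 MHz AT-cut crystal (the -30 Hz example shift is invented):

```python
def sauerbrey_mass(delta_f_hz, overtone_n=3, c_ng_cm2_hz=17.7):
    """Areal mass (ng/cm^2) from a QCM frequency shift via the Sauerbrey
    relation delta_m = -C * delta_f / n, valid for thin rigid films;
    C = 17.7 ng cm^-2 Hz^-1 for a 5 MHz AT-cut crystal."""
    return -c_ng_cm2_hz * delta_f_hz / overtone_n

# Example: a -30 Hz shift at the 3rd overtone -> ~177 ng/cm^2 adsorbed.
print(f"{sauerbrey_mass(-30.0):.0f} ng/cm^2")
```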

  13. ENVIRONMENTAL ANALYTICAL CHEMISTRY OF ...

    EPA Pesticide Factsheets

    Within the scope of a number of emerging contaminant issues in environmental analysis, one area that has received a great deal of public interest is the assessment of the role of pharmaceuticals and personal care products (PPCPs) as stressors and agents of change in ecosystems, as well as their role in unplanned human exposure. The relationship between personal actions and the occurrence of PPCPs in the environment is clear-cut and comprehensible to the public. In this overview, we attempt to examine the separations aspect of the analytical approach to the vast array of potential analytes among this class of compounds. We also highlight the relationship between these compounds and endocrine disrupting compounds (EDCs), and between PPCPs and EDCs and the more traditional environmental analytes such as the persistent organic pollutants (POPs). Although the spectrum of chemical behavior extends from hydrophobic to hydrophilic, the current focus has shifted to moderately and highly polar analytes. Thus, emphasis on HPLC and LC/MS has grown, and MS/MS has become a detection technique of choice, with either electrospray ionization or atmospheric pressure chemical ionization. This contrasts markedly with the benchmark approach of capillary GC, GC/MS, and electron ionization in traditional environmental analysis. The expansion of the analyte list has fostered new vigor in the development of environmental analytical chemistry, modernized the range of tools appli

  14. Comparative study between derivative spectrophotometry and multivariate calibration as analytical tools applied for the simultaneous quantitation of Amlodipine, Valsartan and Hydrochlorothiazide

    NASA Astrophysics Data System (ADS)

    Darwish, Hany W.; Hassan, Said A.; Salem, Maissa Y.; El-Zeany, Badr A.

    2013-09-01

    Four simple, accurate, and specific methods were developed and validated for the simultaneous estimation of Amlodipine (AML), Valsartan (VAL) and Hydrochlorothiazide (HCT) in commercial tablets. The derivative spectrophotometric methods include the Derivative Ratio Zero Crossing (DRZC) and Double Divisor Ratio Spectra-Derivative Spectrophotometry (DDRS-DS) methods, while the multivariate calibrations used are Principal Component Regression (PCR) and Partial Least Squares (PLS). The proposed methods were applied successfully to the determination of the drugs in laboratory-prepared mixtures and in commercial pharmaceutical preparations. The validity of the proposed methods was assessed using the standard addition technique. The linearity of the proposed methods was investigated over the ranges of 2-32, 4-44, and 2-20 μg/mL for AML, VAL, and HCT, respectively.

  15. Comparative study between derivative spectrophotometry and multivariate calibration as analytical tools applied for the simultaneous quantitation of Amlodipine, Valsartan and Hydrochlorothiazide.

    PubMed

    Darwish, Hany W; Hassan, Said A; Salem, Maissa Y; El-Zeany, Badr A

    2013-09-01

    Four simple, accurate, and specific methods were developed and validated for the simultaneous estimation of Amlodipine (AML), Valsartan (VAL) and Hydrochlorothiazide (HCT) in commercial tablets. The derivative spectrophotometric methods include the Derivative Ratio Zero Crossing (DRZC) and Double Divisor Ratio Spectra-Derivative Spectrophotometry (DDRS-DS) methods, while the multivariate calibrations used are Principal Component Regression (PCR) and Partial Least Squares (PLS). The proposed methods were applied successfully to the determination of the drugs in laboratory-prepared mixtures and in commercial pharmaceutical preparations. The validity of the proposed methods was assessed using the standard addition technique. The linearity of the proposed methods was investigated over the ranges of 2-32, 4-44, and 2-20 μg/mL for AML, VAL, and HCT, respectively.
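
    To illustrate the multivariate-calibration branch of these methods, the sketch below builds synthetic three-component UV spectra (invented Gaussian bands, not real AML/VAL/HCT data) over the reported concentration ranges and fits a PLS model of the kind the abstract describes:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
wavelengths = np.linspace(200, 400, 201)

def band(center, width):
    return np.exp(-0.5 * ((wavelengths - center) / width) ** 2)

# Illustrative pure-component spectra for three analytes (arbitrary units).
pure = 0.02 * np.vstack([band(237, 12), band(250, 20), band(270, 10)])

# Synthetic calibration set: random concentrations over the reported ranges,
# Beer-Lambert mixing plus a little detector noise.
conc = rng.uniform([2, 4, 2], [32, 44, 20], size=(30, 3))
spectra = conc @ pure + rng.normal(0, 0.002, (30, wavelengths.size))

pls = PLSRegression(n_components=3).fit(spectra, conc)

# Predict an "unknown" mixture, e.g. 5 / 16 / 12.5 ug/mL of the three drugs.
unknown = np.array([[5.0, 16.0, 12.5]]) @ pure
print(pls.predict(unknown).round(2))
```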

  16. Quantification of individual phenolic compounds' contribution to antioxidant capacity in apple: a novel analytical tool based on liquid chromatography with diode array, electrochemical, and charged aerosol detection.

    PubMed

    Plaza, Merichel; Kariuki, James; Turner, Charlotta

    2014-01-15

    Phenolics, particularly those from apples, hold great interest because of their antioxidant properties. In the present study, the total antioxidant capacity of different apple extracts obtained by pressurized hot water extraction (PHWE) was determined by cyclic voltammetry (CV) and compared with conventional antioxidant assays. To measure the antioxidant capacity of individual antioxidants present in the apple extracts, a novel method was developed based on high-performance liquid chromatography (HPLC) with photodiode array (DAD), electrochemical (ECD), and charged aerosol (CAD) detection. HPLC-DAD-ECD-CAD enabled rapid, qualitative, and quantitative determination of antioxidants in the apple extracts. The main advantage of CAD was that it enabled quantification of a large number of phenolics using only a few standards. The results showed that phenolic acids and flavonols were mainly responsible for the total antioxidant capacity of the apple extracts. In addition, protocatechuic acid, chlorogenic acid, hyperoside, an unidentified phenolic acid, and a quercetin derivative presented the highest antioxidant capacities.
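
    The practical advantage of CAD noted above is that its response is roughly analyte-independent, so a response factor measured on one standard can approximate the calibration for related compounds. A minimal sketch of that single-standard quantification, with invented peak areas:

```python
# One standard defines the response factor (peak area per microgram).
standard_amount_ug = 2.0
standard_peak_area = 8.4e4
response_factor = standard_peak_area / standard_amount_ug

# Apply it to other phenolic peaks in the same chromatogram (invented areas).
unknown_peaks = {"chlorogenic acid": 6.1e4, "hyperoside": 3.9e4}
for analyte, area in unknown_peaks.items():
    print(f"{analyte}: ~{area / response_factor:.2f} ug on column")
```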

  17. FPI: FM Success through Analytics

    ERIC Educational Resources Information Center

    Hickling, Duane

    2013-01-01

    The APPA Facilities Performance Indicators (FPI) is perhaps one of the most powerful analytical tools that institutional facilities professionals have at their disposal. It is a diagnostic facilities performance management tool that addresses the essential questions that facilities executives must answer to effectively perform their roles. It…

  18. Green Chemistry Metrics with Special Reference to Green Analytical Chemistry.

    PubMed

    Tobiszewski, Marek; Marć, Mariusz; Gałuszka, Agnieszka; Namieśnik, Jacek

    2015-06-12

    The concept of green chemistry is widely recognized in chemical laboratories. To properly measure the environmental impact of chemical processes, dedicated assessment tools are required. This paper summarizes the current state of knowledge in the development of green chemistry and green analytical chemistry metrics. The diverse methods used for evaluating the greenness of organic synthesis, such as eco-footprint, E-Factor, EATOS, and Eco-Scale, are described. Both well-established and recently developed green analytical chemistry metrics, including NEMI labeling and the analytical Eco-Scale, are presented. Additionally, this paper focuses on the possible use of multivariate statistics in evaluating the environmental impact of analytical procedures. All the above metrics are compared and discussed in terms of their advantages and disadvantages. Current needs and future perspectives in green chemistry metrics are also discussed.
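
    One of the metrics named above, the analytical Eco-Scale, scores a procedure as 100 minus accumulated penalty points for reagents, energy, occupational hazard, and waste. The sketch below shows the arithmetic; the penalty values and rating thresholds are stated as assumptions rather than a reproduction of the published tables.

```python
def eco_scale(reagent_penalties, energy_penalty, hazard_penalty, waste_penalty):
    """Analytical Eco-Scale style score: 100 minus the summed penalty points.
    Thresholds here (>75 excellent, >50 acceptable) follow the commonly
    cited scheme but should be checked against the original publication."""
    score = 100 - (sum(reagent_penalties) + energy_penalty
                   + hazard_penalty + waste_penalty)
    if score > 75:
        rating = "excellent green analysis"
    elif score > 50:
        rating = "acceptable green analysis"
    else:
        rating = "inadequate green analysis"
    return score, rating

# Example: two hazardous reagents (4 + 6 pts), instrument energy (2 pts),
# vapour release (3 pts), and untreated waste (8 pts).
print(eco_scale([4, 6], 2, 3, 8))  # -> (77, 'excellent green analysis')
```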

  19. Process analytical applications of Raman spectroscopy.

    PubMed

    Rantanen, Jukka

    2007-02-01

    There is an increasing demand for new approaches to understand the chemical and physical phenomena that occur during pharmaceutical unit operations. Obtaining real-time information from processes opens new perspectives for safer and more efficient manufacture of pharmaceuticals. Raman spectroscopy provides molecular-level insight into processing and is therefore a promising process analytical tool. In this review, different applications of Raman spectroscopy in the process analysis of pharmaceutical solid dosage forms are summarized. In addition, pitfalls associated with interfacing to the process environment and challenges in data management are discussed.

  20. Nanomaterials as Analytical Tools for Genosensors

    PubMed Central

    Abu-Salah, Khalid M.; Alrokyan, Salman A.; Khan, Muhammad Naziruddin; Ansari, Anees Ahmad

    2010-01-01

    Nanomaterials are being increasingly used for the development of electrochemical DNA biosensors, due to the unique electrocatalytic properties found in nanoscale materials. They offer excellent prospects for interfacing biological recognition events with electronic signal transduction and for designing a new generation of bioelectronic devices exhibiting novel functions. In particular, nanomaterials such as noble metal nanoparticles (Au, Pt), carbon nanotubes (CNTs), magnetic nanoparticles, quantum dots and metal oxide nanoparticles have been actively investigated for their applications in DNA biosensors, which have become a new interdisciplinary frontier between biological detection and material science. In this article, we address some of the main advances in this field over the past few years, discussing the issues and challenges with the aim of stimulating a broader interest in developing nanomaterial-based biosensors and improving their applications in disease diagnosis and food safety examination. PMID:22315580

  1. Informetrics: Exploring Databases as Analytical Tools.

    ERIC Educational Resources Information Center

    Wormell, Irene

    1998-01-01

    Advanced online search facilities and information retrieval techniques have increased the potential of bibliometric research. Discusses three case studies carried out by the Centre for Informetric Studies at the Royal School of Library Science (Denmark) on the internationality of international journals, informetric analyses on the World Wide Web,…

  2. An Eight-Eyed Version of Hawkins and Shohet's Clinical Supervision Model: The Addition of the Cognitive Analytic Therapy Concept of the "Observing Eye/I" as the "Observing Us"

    ERIC Educational Resources Information Center

    Darongkamas, Jurai; John, Christopher; Walker, Mark James

    2014-01-01

    This paper proposes incorporating the concept of the "observing eye/I", from cognitive analytic therapy (CAT), to Hawkins and Shohet's seven modes of supervision, comprising their transtheoretical model of supervision. Each mode is described alongside explicit examples relating to CAT. This modification using a key idea from CAT (in…

  3. Correlated Raman micro-spectroscopy and scanning electron microscopy analyses of flame retardants in environmental samples: a micro-analytical tool for probing chemical composition, origin and spatial distribution.

    PubMed

    Ghosal, Sutapa; Wagner, Jeff

    2013-07-07

    We present the correlated application of two micro-analytical techniques: scanning electron microscopy/energy dispersive X-ray spectroscopy (SEM/EDS) and Raman micro-spectroscopy (RMS) for the non-invasive characterization and molecular identification of flame retardants (FRs) in environmental dusts and consumer products. The SEM/EDS-RMS technique offers correlated morphological, molecular, spatial-distribution, and semi-quantitative elemental concentration information at the individual particle level, with micrometer spatial resolution and minimal sample preparation. The presented methodology uses SEM/EDS analyses for rapid detection of particles containing FR-specific elements as potential indicators of FR presence in a sample, followed by correlated RMS analyses of the same particles to characterize the FR sub-regions and surrounding matrices. The spatially resolved characterization enabled by this approach provides insights into the distributional heterogeneity as well as potential transfer and exposure mechanisms for FRs in the environment that are typically not available through traditional FR analysis. We have used this methodology to reveal a heterogeneous distribution of highly concentrated deca-BDE particles in environmental dust, sometimes in association with identifiable consumer materials. The observed coexistence of deca-BDE with consumer material in dust strongly indicates its release into the environment via weathering/abrasion of consumer products. Ingestion of such enriched FR particles in dust represents a potential for instantaneous exposure to high FR concentrations. Correlated SEM/RMS analysis therefore offers a novel investigative tool for addressing an area of important environmental concern.

  4. Predictive analytics can support the ACO model.

    PubMed

    Bradley, Paul

    2012-04-01

    Predictive analytics can be used to rapidly spot hard-to-identify opportunities to better manage care - a key tool in accountable care. When considering analytics models, healthcare providers should: make value-based care a priority and act on information from analytics models; create a road map that includes achievable steps, rather than major endeavors; and set long-term expectations, recognizing that an analytics program takes time to show its effectiveness, unlike revenue cycle initiatives that may show a quick return.

  5. GRIPPING TOOL

    DOEpatents

    Sandrock, R.J.

    1961-12-12

    A self-actuated gripping tool is described for transferring fuel elements and the like into reactors and other inaccessible locations. The tool will grasp or release the load only when properly positioned for this purpose. In addition, the load cannot be released except when unsupported by the tool, so that jarring or contact will not bring about accidental release of the load. The gripping members or jaws of the device are cam-actuated by an axially slidable shaft which has two lockable positions. A spring urges the shaft into one position and a solenoid is provided to overcome the spring and move it into the other position. The weight of the tool operates a sleeve to lock the shaft in its existing position. Only when the cable supporting the tool is slack is the device capable of being actuated either to grasp or release its load. (AEC)

  6. An overview of city analytics

    PubMed Central

    Higham, Desmond J.; Batty, Michael; Bettencourt, Luís M. A.; Greetham, Danica Vukadinović; Grindrod, Peter

    2017-01-01

    We introduce the 14 articles in the Royal Society Open Science themed issue on City Analytics. To provide a high-level, strategic overview, we summarize the topics addressed and the analytical tools deployed. We then give a more detailed account of the individual contributions. Our overall aims are (i) to highlight exciting advances in this emerging, interdisciplinary field, (ii) to encourage further activity and (iii) to emphasize the variety of new, public-domain datasets that are available to researchers. PMID:28386454

  7. Laboratory, Field, and Analytical Procedures for Using ...

    EPA Pesticide Factsheets

    Regardless of the remedial technology invoked to address contaminated sediments in the environment, there is a critical need to have tools for assessing the effectiveness of the remedy. In the past, these tools have included chemical and biomonitoring of the water column and sediments, toxicity testing and bioaccumulation studies performed on site sediments, and application of partitioning, transport and fate modeling. All of these tools served as lines of evidence for making informed environmental management decisions at contaminated sediment sites. In the last ten years, a new tool for assessing remedial effectiveness has gained a great deal of attention. Passive sampling offers a tool capable of measuring the freely dissolved concentration (Cfree) of legacy contaminants in water and sediments. In addition to assessing the effectiveness of the remedy, passive sampling can be applied for a variety of other contaminated sediments site purposes involved with performing the preliminary assessment and site inspection, conducting the remedial investigation and feasibility study, preparing the remedial design, and assessing the potential for contaminant bioaccumulation. While there is a distinct need for using passive sampling at contaminated sediments sites and several previous documents and research articles have discussed various aspects of passive sampling, there has not been definitive guidance on the laboratory, field and analytical procedures for using pas

  8. Competing on analytics.

    PubMed

    Davenport, Thomas H

    2006-01-01

    We all know the power of the killer app. It's not just a support tool; it's a strategic weapon. Companies questing for killer apps generally focus all their firepower on the one area that promises to create the greatest competitive advantage. But a new breed of organization has upped the stakes: Amazon, Harrah's, Capital One, and the Boston Red Sox have all dominated their fields by deploying industrial-strength analytics across a wide variety of activities. At a time when firms in many industries offer similar products and use comparable technologies, business processes are among the few remaining points of differentiation--and analytics competitors wring every last drop of value from those processes. Employees hired for their expertise with numbers or trained to recognize their importance are armed with the best evidence and the best quantitative tools. As a result, they make the best decisions. In companies that compete on analytics, senior executives make it clear--from the top down--that analytics is central to strategy. Such organizations launch multiple initiatives involving complex data and statistical analysis, and quantitative activity is managed at the enterprise (not departmental) level. In this article, Professor Thomas H. Davenport lays out the characteristics and practices of these statistical masters and describes some of the very substantial changes other companies must undergo in order to compete on quantitative turf. As one would expect, the transformation requires a significant investment in technology, the accumulation of massive stores of data, and the formulation of company-wide strategies for managing the data. But, at least as important, it also requires executives' vocal, unswerving commitment and willingness to change the way employees think, work, and are treated.

  9. Control/structure interaction conceptual design tool

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.

    1990-01-01

    The JPL Control/Structure Interaction Program is developing new analytical methods for designing micro-precision spacecraft with controlled structures. One of these, the Conceptual Design Tool, will illustrate innovative new approaches to the integration of multi-disciplinary analysis and design methods. The tool will be used to demonstrate homogeneity of presentation, uniform data representation across analytical methods, and integrated systems modeling. The tool differs from current 'integrated systems' that support design teams most notably in its support for the new CSI multi-disciplinary engineer. The design tool will utilize a three-dimensional solid model of the spacecraft under design as the central data-organization metaphor. Various analytical methods, such as finite element structural analysis, control system analysis, and mechanical configuration layout, will store and retrieve data from a hierarchical, object-oriented data structure that supports assemblies of components with associated data and algorithms. In addition to managing numerical model data, the tool will assist the designer in organizing, stating, and tracking system requirements.
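
    A hierarchical, object-oriented structure of the kind described - assemblies of components, each carrying data and algorithms - is commonly realized as a composite pattern. A minimal sketch with hypothetical names, not the JPL tool's actual API:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Component:
    """Leaf node: a part with associated engineering data."""
    name: str
    data: Dict[str, float] = field(default_factory=dict)  # e.g. mass, stiffness

@dataclass
class Assembly(Component):
    """Composite node: holds components (or sub-assemblies)."""
    children: List[Component] = field(default_factory=list)

    def total(self, key: str) -> float:
        """Roll a numeric property up through the hierarchy."""
        own = self.data.get(key, 0.0)
        return own + sum(
            c.total(key) if isinstance(c, Assembly) else c.data.get(key, 0.0)
            for c in self.children)

bus = Assembly("spacecraft_bus", children=[
    Component("reaction_wheel", {"mass_kg": 8.5}),
    Component("optical_bench", {"mass_kg": 22.0}),
])
print(bus.total("mass_kg"))  # 30.5
```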

  10. Tool Changer For Robot

    NASA Technical Reports Server (NTRS)

    Voellmer, George M.

    1992-01-01

    Mechanism enables robot to change tools on end of arm. Actuated by motion of robot: requires no additional electrical or pneumatic energy to make or break connection between tool and wrist at end of arm. Includes three basic subassemblies: wrist interface plate attached to robot arm at wrist, tool interface plate attached to tool, and holster. Separate tool interface plate and holster provided for each tool robot uses.

  11. Food additives

    PubMed Central

    Spencer, Michael

    1974-01-01

    Food additives are discussed from the food technology point of view. The reasons for their use are summarized: (1) to protect food from chemical and microbiological attack; (2) to even out seasonal supplies; (3) to improve their eating quality; (4) to improve their nutritional value. The various types of food additives are considered, e.g. colours, flavours, emulsifiers, bread and flour additives, preservatives, and nutritional additives. The paper concludes with consideration of those circumstances in which the use of additives is (a) justified and (b) unjustified. PMID:4467857

  12. Fabricating Cotton Analytical Devices.

    PubMed

    Lin, Shang-Chi; Hsu, Min-Yen; Kuan, Chen-Meng; Tseng, Fan-Gang; Cheng, Chao-Min

    2016-08-30

    A robust analytical device should be user-friendly, rapid, and affordable. Such devices should also be able to operate with scarce samples and provide information for follow-up treatment. Here, we demonstrate the development of a cotton-based urinalysis (i.e., nitrite, total protein, and urobilinogen assays) analytical device that employs a lateral-flow format, is inexpensive, easily fabricated, and rapid, and can be used to conduct multiple tests without concern for cross-contamination. Cotton is composed of cellulose fibers with natural absorptive properties that can be leveraged for flow-based analysis. The simple but elegant fabrication process of our cotton-based analytical device is described in this study. The arrangement of the cotton structure and test pad takes advantage of the hydrophobicity and absorptive strength of each material. Because of these physical characteristics, colorimetric results persistently adhere to the test pad. This device enables physicians to receive clinical information in a timely manner and shows great potential as a tool for early intervention.

  13. Nanomaterial Drug Products: Manufacturing and Analytical Perspectives.

    PubMed

    Sayes, Christie M; Aquino, Grace V; Hickey, Anthony J

    2017-01-01

    The increasing use of nanotechnology, including nanoparticles, in the preparation of drug products requires both manufacturing and analytical considerations in order to establish the quality metrics suitable for performance and risk assessment. A range of different nanoparticle systems exists including (but not limited to) nano-drugs, nano-additives, and nano-carriers. These systems generally require more complex production and characterization strategies than conventional pharmaceutical dosage forms. The advantage of using nanoparticle systems in pharmaceutical science is that the effective and desired function of the material can be designed through modern manufacturing processes. This paper offers a systematic nomenclature which allows for greater understanding of the drug product under evaluation based on available data from other nanoparticle reports. Analytical considerations of nano-drugs, nano-additives, and nano-carriers and the way in which they are measured are directly connected to quality control. Ultimately, the objective is to consider the entire nano-drug, nano-additive, and nano-carrier product life cycle with respect to its manufacture, use, and eventual fate. The tools and approaches to address the needs of these products exist; it should be the task of the pharmaceutical scientists and those in related disciplines to increase their understanding of nanomedicine and its novel products.

  14. Collaborative Analytical Toolbox version 1.0

    SciTech Connect

    2008-08-21

    The purpose of the Collaborative Analytical Toolbox (CAT) is to provide a comprehensive, collaborative problem-solving environment that enables users to apply and improve their analytical and problem-solving capabilities more effectively. CAT is a software framework for integrating other tools and data sources. It includes a set of core services for collaboration and information exploration and analysis, and a framework that facilitates quickly integrating new ideas, techniques, and tools with existing data sources.
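
    The framework idea - quickly plugging new analysis tools into a shared environment - can be sketched as a simple registry. The API below is hypothetical, not CAT's actual interface:

```python
from typing import Callable, Dict

TOOL_REGISTRY: Dict[str, Callable] = {}

def register(name: str):
    """Decorator that plugs a new analysis tool into the framework."""
    def wrap(fn: Callable) -> Callable:
        TOOL_REGISTRY[name] = fn
        return fn
    return wrap

@register("word_count")
def word_count(doc: str) -> int:
    return len(doc.split())

@register("keyword_flag")
def keyword_flag(doc: str, keyword: str = "analysis") -> bool:
    return keyword in doc.lower()

# Every registered tool can now be discovered and run over a data source.
doc = "An integrated, extensible set of assessment tools for analysis."
for name, tool in TOOL_REGISTRY.items():
    print(name, "->", tool(doc))
```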

  15. Microgenetic Learning Analytics Methods: Workshop Report

    ERIC Educational Resources Information Center

    Aghababyan, Ani; Martin, Taylor; Janisiewicz, Philip; Close, Kevin

    2016-01-01

    Learning analytics is an emerging discipline and, as such, benefits from new tools and methodological approaches. This work reviews and summarizes our workshop on microgenetic data analysis techniques using R, held at the second annual Learning Analytics Summer Institute in Cambridge, Massachusetts, on 30 June 2014. Specifically, this paper…

  16. Demonstrating Success: Web Analytics and Continuous Improvement

    ERIC Educational Resources Information Center

    Loftus, Wayne

    2012-01-01

    As free and low-cost Web analytics tools become more sophisticated, libraries' approach to user analysis can become more nuanced and precise. Tracking appropriate metrics with a well-formulated analytics program can inform design decisions, demonstrate the degree to which those decisions have succeeded, and thereby inform the next iteration in the…

  17. Food additives

    MedlinePlus

    ... or natural. Natural food additives include: Herbs or spices to add flavor to foods Vinegar for pickling ... Certain colors improve the appearance of foods. Many spices, as well as natural and man-made flavors, ...

  18. Parallel Aircraft Trajectory Optimization with Analytic Derivatives

    NASA Technical Reports Server (NTRS)

    Falck, Robert D.; Gray, Justin S.; Naylor, Bret

    2016-01-01

    Trajectory optimization is an integral component of aerospace vehicle design, but emerging aircraft technologies have introduced new demands on trajectory analysis that current tools are not well suited to address. Designing aircraft with technologies such as hybrid-electric propulsion and morphing wings requires consideration of the operational behavior as well as the physical design characteristics of the aircraft. The addition of operational variables can dramatically increase the number of design variables, which motivates the use of gradient-based optimization with analytic derivatives to solve the larger optimization problems. In this work we develop an aircraft trajectory analysis tool using a Legendre-Gauss-Lobatto collocation scheme, providing analytic derivatives via the OpenMDAO multidisciplinary optimization framework. This collocation method uses an implicit time integration scheme that provides a high degree of sparsity and thus several potential options for parallelization. The performance of the new implementation was investigated via a series of single- and multi-trajectory optimizations using a combination of parallel computing and constraint aggregation. The computational performance results show that in order to take full advantage of the sparsity in the problem it is vital to parallelize both the nonlinear analysis evaluations and the derivative computations themselves. The constraint aggregation results showed a significant numerical challenge due to the difficulty of achieving tight convergence tolerances. Overall, the results demonstrate the value of applying analytic derivatives to trajectory optimization problems and lay the foundation for future application of this collocation-based method to the design of aircraft where operational scheduling of technologies is key to achieving good performance.
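
    To illustrate the collocation idea, the sketch below solves a toy ODE on three-point Gauss-Lobatto nodes per segment, enforcing the derivative defect of the local quadratic at the segment midpoint and endpoint. This is a deliberately simplified variant for illustration, not the paper's Legendre-Gauss-Lobatto implementation or its OpenMDAO analytic-derivative machinery:

```python
import numpy as np
from scipy.optimize import fsolve

# Derivative matrix of the quadratic Lagrange interpolant through the
# 3-point Gauss-Lobatto nodes tau = [-1, 0, 1].
D = np.array([[-1.5,  2.0, -0.5],
              [-0.5,  0.0,  0.5],
              [ 0.5, -2.0,  1.5]])

f = lambda x: -x                    # toy dynamics: dx/dt = -x, x(0) = 1
t0, tf, nseg = 0.0, 2.0, 10
h = (tf - t0) / nseg                # segment width

def defects(x):
    """Collocation residuals: the interpolant's derivative must equal
    (h/2)*f at the midpoint and endpoint of every segment; endpoint
    states are shared between segments, and x[0] carries the IC."""
    res = [x[0] - 1.0]
    for k in range(nseg):
        xs = x[2 * k : 2 * k + 3]
        res.extend(D[1:] @ xs - 0.5 * h * f(xs[1:]))
    return res

x = fsolve(defects, np.ones(2 * nseg + 1))
t = np.linspace(t0, tf, 2 * nseg + 1)
print(np.max(np.abs(x - np.exp(-t))))  # discretization error, ~1e-3 here
```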

  19. Pre-Exposure Prophylaxis (PrEP) as an Additional Tool for HIV Prevention Among Men Who Have Sex With Men in Belgium: The Be-PrEP-ared Study Protocol

    PubMed Central

    Nöstlinger, Christiana; Wouters, Kristien; Fransen, Katrien; Crucitti, Tania; Kenyon, Chris; Buyze, Jozefien; Schurmans, Céline; Laga, Marie; Vuylsteke, Bea

    2017-01-01

    Background Pre-exposure prophylaxis (PrEP) is a promising and effective tool to prevent HIV. With the approval of Truvada as daily PrEP by the European Commission in August 2016, individual European Member states prepare themselves for PrEP implementation following the examples of France and Norway. However, context-specific data to guide optimal implementation is currently lacking. Objective With this demonstration project we evaluate whether daily and event-driven PrEP, provided within a comprehensive prevention package, is a feasible and acceptable additional prevention tool for men who have sex with men (MSM) at high risk of acquiring HIV in Belgium. The study’s primary objective is to document the uptake, acceptability, and adherence to both daily and event-driven PrEP, while several secondary objectives have been formulated including impact of PrEP use on sexual behavior. Methods The Be-PrEP-ared study is a phase 3, single-site, open-label prospective cohort study with a large social science component embedded in the trial. A total of 200 participants choose between daily or event-driven PrEP use and may switch, discontinue, or restart their regimen at the 3-monthly visits for a duration of 18 months. Data are collected on several platforms: an electronic case report form, a Web-based tool where participants register their sexual behavior and pill use, a more detailed electronic self-administered questionnaire completed during study visits on a tablet computer, and in-depth interviews among a selected sample of participants. To answer the primary objective, the recruitment rate, (un)safe sex behavior during the last 6 months, percentage of reported intention to use PrEP in the future, retention rates in different regimens, and attitudes towards PrEP use will be analyzed. Adherence will be monitored using self-reported adherence, pill count, tenofovir drug levels in blood samples, and the perceived skills to adhere. Results All participants are currently

  20. TOOLS AND METHODS FOR POLLUTION PREVENTION

    EPA Science Inventory

    The design tools for pollution prevention can be described to fall into three categories: 1. Analytical Tools; 2. Process Tools; 3. Economic Tools. Ideally, these three types of tools would be needed to make sound environmental decisions. Frequently, however, decisions will b...

  1. Understanding Business Analytics

    DTIC Science & Technology

    2015-01-05

    Business Analytics, Decision Analytics, Business Intelligence, Advanced Analytics, Data Science... to a certain degree, to label is to limit - if only... Business Analytics. [Figure 1: Google trending of daily searches for various analytic disciplines.] “The limits of my

  2. The MCNP6 Analytic Criticality Benchmark Suite

    SciTech Connect

    Brown, Forrest B.

    2016-06-16

    Analytical benchmarks provide an invaluable tool for verifying computer codes used to simulate neutron transport. Several collections of analytical benchmark problems [1-4] are used routinely in the verification of production Monte Carlo codes such as MCNP® [5,6]. Verification of a computer code is a necessary prerequisite to the more complex validation process. The verification process confirms that a code performs its intended functions correctly. The validation process involves determining the absolute accuracy of code results vs. nature. In typical validations, results are computed for a set of benchmark experiments using a particular methodology (code, cross-section data with uncertainties, and modeling) and compared to the measured results from the set of benchmark experiments. The validation process determines bias, bias uncertainty, and possibly additional margins. Verification is generally performed by the code developers, while validation is generally performed by code users for a particular application space. The VERIFICATION_KEFF suite of criticality problems [1,2] was originally a set of 75 criticality problems found in the literature for which exact analytical solutions are available. Even though the spatial and energy detail is necessarily limited in analytical benchmarks, typically to a few regions or energy groups, the exact solutions obtained can be used to verify that the basic algorithms, mathematics, and methods used in complex production codes perform correctly. The present work has focused on revisiting this benchmark suite. A thorough review of the problems resulted in discarding some of them as not suitable for MCNP benchmarking. For the remaining problems, many of them were reformulated to permit execution in either multigroup mode or in the normal continuous-energy mode for MCNP. Execution of the benchmarks in continuous-energy mode provides a significant advance to MCNP verification methods.
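
    The verification idea is simple to state: wherever a closed-form solution exists, the code's answer can be compared against it exactly. A minimal sketch for a one-group, infinite-medium problem, where k_inf = nu*Sigma_f / Sigma_a (the cross sections and code result below are invented, not taken from the benchmark suite):

```python
# One-group, infinite-medium analytic benchmark: k_inf = nu*Sigma_f / Sigma_a.
nu_sigma_f = 0.0816   # nu times macroscopic fission cross section (1/cm)
sigma_a    = 0.0720   # macroscopic absorption cross section (1/cm)

k_analytic = nu_sigma_f / sigma_a
k_code     = 1.13328  # e.g., a Monte Carlo code's reported k_eff

bias_pcm = (k_code - k_analytic) * 1e5   # 1 pcm = 1e-5 in k
print(f"analytic k_inf = {k_analytic:.5f}, bias = {bias_pcm:+.1f} pcm")
```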

  3. Potlining Additives

    SciTech Connect

    Rudolf Keller

    2004-08-10

    In this project, a concept to improve the performance of aluminum production cells by introducing potlining additives was examined and tested. Boron oxide was added to cathode blocks, and titanium was dissolved in the metal pool; this resulted in the formation of titanium diboride and caused the molten aluminum to wet the carbonaceous cathode surface. Such wetting reportedly leads to operational improvements and extended cell life. In addition, boron oxide suppresses cyanide formation. This final report presents and discusses the results of this project. Substantial economic benefits are projected for practical implementation of the technology, especially for modern cells with graphitized blocks. For example, with an energy savings of about 5% and an increase in pot life from 1500 to 2500 days, a cost savings of $0.023 per pound of aluminum produced is projected for a 200 kA pot.

  4. Phosphazene additives

    DOEpatents

    Harrup, Mason K; Rollins, Harry W

    2013-11-26

    An additive comprising a phosphazene compound that has at least two reactive functional groups and at least one capping functional group bonded to phosphorus atoms of the phosphazene compound. One of the at least two reactive functional groups is configured to react with cellulose and the other of the at least two reactive functional groups is configured to react with a resin, such as an amine resin or a polycarboxylic acid resin. The at least one capping functional group is selected from the group consisting of a short chain ether group, an alkoxy group, or an aryloxy group. Also disclosed are an additive-resin admixture, a method of treating a wood product, and a wood product.

  5. Process Recovery after CaO Addition Due to Granule Formation in a CSTR Co-Digester-A Tool to Influence the Composition of the Microbial Community and Stabilize the Process?

    PubMed

    Liebrich, Marietta; Kleyböcker, Anne; Kasina, Monika; Miethling-Graff, Rona; Kassahun, Andrea; Würdemann, Hilke

    2016-03-17

    The composition, structure and function of granules formed during process recovery with calcium oxide in a laboratory-scale fermenter fed with sewage sludge and rapeseed oil were studied. In the course of over-acidification and successful process recovery, only minor changes were observed in the bacterial community of the digestate, while granules appeared during recovery. Fluorescence microscopic analysis of the granules showed a close spatial relationship between calcium and oil and/or long chain fatty acids. This finding further substantiated the hypothesis that calcium precipitated with carbon of organic origin and reduced the negative effects of overloading with oil. Furthermore, the enrichment of phosphate minerals in the granules was shown, and molecular biological analyses detected polyphosphate-accumulating organisms as well as methanogenic archaea in the core. Organisms related to Methanoculleus receptaculi were detected in the inner zones of a granule, whereas they were present in the digestate only after process recovery. This finding indicated more favorable microhabitats inside the granules that supported process recovery. Thus, the granule formation triggered by calcium oxide addition served as a tool to influence the composition of the microbial community and to stabilize the process after overloading with oil.

  6. Process Recovery after CaO Addition Due to Granule Formation in a CSTR Co-Digester—A Tool to Influence the Composition of the Microbial Community and Stabilize the Process?

    PubMed Central

    Liebrich, Marietta; Kleyböcker, Anne; Kasina, Monika; Miethling-Graff, Rona; Kassahun, Andrea; Würdemann, Hilke

    2016-01-01

    The composition, structure and function of granules formed during process recovery with calcium oxide in a laboratory-scale fermenter fed with sewage sludge and rapeseed oil were studied. In the course of over-acidification and successful process recovery, only minor changes were observed in the bacterial community of the digestate, while granules appeared during recovery. Fluorescence microscopic analysis of the granules showed a close spatial relationship between calcium and oil and/or long chain fatty acids. This finding further substantiated the hypothesis that calcium precipitated with carbon of organic origin and reduced the negative effects of overloading with oil. Furthermore, the enrichment of phosphate minerals in the granules was shown, and molecular biological analyses detected polyphosphate-accumulating organisms as well as methanogenic archaea in the core. Organisms related to Methanoculleus receptaculi were detected in the inner zones of a granule, whereas they were present in the digestate only after process recovery. This finding indicated more favorable microhabitats inside the granules that supported process recovery. Thus, the granule formation triggered by calcium oxide addition served as a tool to influence the composition of the microbial community and to stabilize the process after overloading with oil. PMID:27681911

  7. Analytical Chemistry of Nitric Oxide

    PubMed Central

    Hetrick, Evan M.

    2013-01-01

    Nitric oxide (NO) is the focus of intense research, owing primarily to its wide-ranging biological and physiological actions. A requirement for understanding its origin, activity, and regulation is the need for accurate and precise measurement techniques. Unfortunately, analytical assays for monitoring NO are challenged by NO’s unique chemical and physical properties, including its reactivity, rapid diffusion, and short half-life. Moreover, NO concentrations may span pM to µM in physiological milieu, requiring techniques with wide dynamic response ranges. Despite such challenges, many analytical techniques have emerged for the detection of NO. Herein, we review the most common spectroscopic and electrochemical methods, with special focus on the fundamentals behind each technique and approaches that have been coupled with modern analytical measurement tools or exploited to create novel NO sensors. PMID:20636069

  8. Big Data Analytics in Healthcare.

    PubMed

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  9. Big Data Analytics in Healthcare

    PubMed Central

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S. M. Reza; Navidi, Fatemeh; Beard, Daniel A.; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined. PMID:26229957

  10. Analytical chemistry of nitric oxide.

    PubMed

    Hetrick, Evan M; Schoenfisch, Mark H

    2009-01-01

    Nitric oxide (NO) is the focus of intense research primarily because of its wide-ranging biological and physiological actions. To understand its origin, activity, and regulation, accurate and precise measurement techniques are needed. Unfortunately, analytical assays for monitoring NO are challenged by NO's unique chemical and physical properties, including its reactivity, rapid diffusion, and short half-life. Moreover, NO concentrations may span the picomolar-to-micromolar range in physiological milieus, requiring techniques with wide dynamic response ranges. Despite such challenges, many analytical techniques have emerged for the detection of NO. Herein, we review the most common spectroscopic and electrochemical methods, with a focus on the underlying mechanism of each technique and on approaches that have been coupled with modern analytical measurement tools to create novel NO sensors.

  11. An Analysis of Earth Science Data Analytics Use Cases

    NASA Technical Reports Server (NTRS)

    Shie, Chung-Lin; Kempler, Steve

    2014-01-01

    The increase in the number, volume, and sources of globally available Earth science data measurements and datasets has afforded Earth scientists and applications researchers unprecedented opportunities to study our Earth in ever more sophisticated ways. In fact, the NASA Earth Observing System Data Information System (EOSDIS) archives have doubled from 2007 to 2014, to 9.1 PB (Ramapriyan, 2009; and https://earthdata.nasa.gov/about/system-performance). In addition, other US agency and international programs, field experiments, ground stations, and citizen scientists provide a plethora of additional sources for studying Earth. Co-analyzing huge amounts of heterogeneous data to glean out unobvious information is a daunting task. Earth science data analytics (ESDA) is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations, and other useful information. It can include Data Preparation, Data Reduction, and Data Analysis. Through work associated with the Earth Science Information Partners (ESIP) Federation, a collection of Earth science data analytics use cases has been collected and analyzed for the purpose of extracting the types of Earth science data analytics employed, and requirements for data analytics tools and techniques yet to be implemented, based on use case needs. The ESIP-generated use case template, ESDA use cases, use case types, and preliminary use case analysis (a work in progress) will be presented.

  12. An Analysis of Earth Science Data Analytics Use Cases

    NASA Astrophysics Data System (ADS)

    Shie, C. L.; Kempler, S. J.

    2015-12-01

    The increase in the number, volume, and sources of globally available Earth science data measurements and datasets has afforded Earth scientists and applications researchers unprecedented opportunities to study our Earth in ever more sophisticated ways. In fact, the NASA Earth Observing System Data Information System (EOSDIS) archives have doubled from 2007 to 2014, to 9.1 PB (Ramapriyan, 2009; and https://earthdata.nasa.gov/about/system-performance). In addition, other US agency and international programs, field experiments, ground stations, and citizen scientists provide a plethora of additional sources for studying Earth. Co-analyzing huge amounts of heterogeneous data to glean out unobvious information is a daunting task. Earth science data analytics (ESDA) is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations, and other useful information. It can include Data Preparation, Data Reduction, and Data Analysis. Through work associated with the Earth Science Information Partners (ESIP) Federation, a collection of Earth science data analytics use cases has been collected and analyzed for the purpose of extracting the types of Earth science data analytics employed, and requirements for data analytics tools and techniques yet to be implemented, based on use case needs. The ESIP-generated use case template, ESDA use cases, use case types, and preliminary use case analysis (a work in progress) will be presented.

  13. The challenge of big data in public health: an opportunity for visual analytics.

    PubMed

    Ola, Oluwakemi; Sedig, Kamran

    2014-01-01

    Public health (PH) data can generally be characterized as big data. The efficient and effective use of this data determines the extent to which PH stakeholders can sufficiently address societal health concerns as they engage in a variety of work activities. As stakeholders interact with data, they engage in various cognitive activities such as analytical reasoning, decision-making, interpreting, and problem solving. Performing these activities with big data is a challenge for the unaided mind as stakeholders encounter obstacles relating to the data's volume, variety, velocity, and veracity. Such being the case, computer-based information tools are needed to support PH stakeholders. Unfortunately, while existing computational tools are beneficial in addressing certain work activities, they fall short in supporting cognitive activities that involve working with large, heterogeneous, and complex bodies of data. This paper presents visual analytics (VA) tools, a nascent category of computational tools that integrate data analytics with interactive visualizations, to facilitate the performance of cognitive activities involving big data. Historically, PH has lagged behind other sectors in embracing new computational technology. In this paper, we discuss the role that VA tools can play in addressing the challenges presented by big data. In doing so, we demonstrate the potential benefit of incorporating VA tools into PH practice, in addition to highlighting the need for further systematic and focused research.

  14. Analytics for Education

    ERIC Educational Resources Information Center

    MacNeill, Sheila; Campbell, Lorna M.; Hawksey, Martin

    2014-01-01

    This article presents an overview of the development and use of analytics in the context of education. Using Buckingham Shum's three levels of analytics, the authors present a critical analysis of current developments in the domain of learning analytics, and contrast the potential value of analytics research and development with real world…

  15. Let's Talk... Analytics

    ERIC Educational Resources Information Center

    Oblinger, Diana G.

    2012-01-01

    Talk about analytics seems to be everywhere. Everyone is talking about analytics. Yet even with all the talk, many in higher education have questions about--and objections to--using analytics in colleges and universities. In this article, the author explores the use of analytics in, and all around, higher education. (Contains 1 note.)

  16. The transfer of analytical procedures.

    PubMed

    Ermer, J; Limberger, M; Lis, K; Wätzig, H

    2013-11-01

    Analytical method transfers are certainly among the most discussed topics in the GMP-regulated sector. However, they are surprisingly little regulated in detail. General information is provided by USP, WHO, and ISPE in particular. Most recently, the EU emphasized the importance of analytical transfer by including it in the draft of the revised GMP Guideline. In this article, an overview and comparison of these guidelines is provided. The key to success for method transfers is excellent communication between the sending and receiving units. To facilitate this communication, procedures, flow charts, and checklists for responsibilities, success factors, transfer categories, the transfer plan and report, and strategies in case of failed transfers, together with tables of acceptance limits, are provided here, along with a comprehensive glossary. Potential pitfalls are described so that they can be avoided. To assure an efficient and sustainable transfer of analytical procedures, a practically relevant and scientifically sound evaluation with corresponding acceptance criteria is crucial. Various strategies and statistical tools such as significance tests, absolute acceptance criteria, and equivalence tests are thoroughly described and compared, with examples. Significance tests should be avoided: the success criterion is not statistical significance, but analytical relevance. Depending on a risk assessment of the analytical procedure in question, statistical equivalence tests are recommended, because they include both a practically relevant acceptance limit and direct control of the statistical risks. For lower-risk procedures, however, a simple comparison of the transfer performance parameters against absolute limits is also regarded as sufficient.
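
    The recommended equivalence-test approach can be made concrete with the two one-sided tests (TOST) procedure: equivalence within acceptance limits of +/-theta is declared when both one-sided p-values fall below alpha. A minimal sketch with simulated assay data; the +/-2% acceptance limit is an assumption for illustration:

```python
import numpy as np
from scipy import stats

def tost_equivalence(a, b, theta):
    """Two one-sided t-tests (pooled variance). Equivalence is declared
    when both one-sided p-values are below alpha, i.e. when the
    (1 - 2*alpha) CI for the mean difference lies within [-theta, theta]."""
    na, nb = len(a), len(b)
    diff = np.mean(a) - np.mean(b)
    sp2 = ((na - 1) * np.var(a, ddof=1)
           + (nb - 1) * np.var(b, ddof=1)) / (na + nb - 2)
    se = np.sqrt(sp2 * (1 / na + 1 / nb))
    df = na + nb - 2
    p_lower = stats.t.sf((diff + theta) / se, df)   # H0: diff <= -theta
    p_upper = stats.t.cdf((diff - theta) / se, df)  # H0: diff >= +theta
    return diff, max(p_lower, p_upper)

rng = np.random.default_rng(1)
sending   = rng.normal(99.8, 0.6, 8)    # assay results, % of label claim
receiving = rng.normal(100.1, 0.6, 8)
d, p = tost_equivalence(sending, receiving, theta=2.0)
print(f"mean difference {d:+.2f}%, TOST p = {p:.4f}")  # p < 0.05 -> equivalent
```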

  17. hydropower biological evaluation tools

    SciTech Connect

    2016-10-06

    This software is a set of analytical tools to evaluate the physical and biological performance of existing, refurbished, or newly installed conventional hydro-turbines nationwide where fish passage is a regulatory concern. The current version is based on information collected by the Sensor Fish; future versions will include other technologies. The tool set includes data acquisition, data processing, and biological response tools with applications to various turbine designs and other passage alternatives. The associated database is centralized and can be accessed remotely. We have demonstrated its use for various applications, including both turbines and spillways.

  18. Nutritional lipidomics: molecular metabolism, analytics, and diagnostics.

    PubMed

    Smilowitz, Jennifer T; Zivkovic, Angela M; Wan, Yu-Jui Yvonne; Watkins, Steve M; Nording, Malin L; Hammock, Bruce D; German, J Bruce

    2013-08-01

    The field of lipidomics is providing nutritional science a more comprehensive view of lipid intermediates. Lipidomics research takes advantage of the increased accuracy and sensitivity of mass detection in mass spectrometry (MS), together with new bioinformatics toolsets, to characterize the structures and abundances of complex lipids. Yet translating lipidomics to practice via nutritional interventions is still in its infancy. No single instrumentation platform is able to solve the varying analytical challenges of the different molecular lipid species. Biochemical pathways of lipid metabolism remain incomplete, and the tools to map lipid compositional data to pathways are still being assembled. Biology itself is dauntingly complex, and simply separating biological structures remains a key challenge to lipidomics. Nonetheless, the strategy of combining tandem analytical methods to perform sensitive, high-throughput, quantitative, and comprehensive analysis of lipid metabolites across very large numbers of molecules is poised to drive the field forward rapidly. Among the next steps for nutrition in understanding the changes in structures, compositions, and functions of lipid biomolecules in response to diet is to describe their distribution within discrete functional compartments: lipoproteins. Additionally, lipidomics must tackle the task of assigning the functions of lipids as signaling molecules, nutrient sensors, and intermediates of metabolic pathways.

  19. Nutritional Lipidomics: Molecular Metabolism, Analytics, and Diagnostics

    PubMed Central

    Smilowitz, Jennifer T.; Zivkovic, Angela M.; Wan, Yu-Jui Yvonne; Watkins, Steve M.; Nording, Malin L.; Hammock, Bruce D.; German, J. Bruce

    2013-01-01

    The field of lipidomics is providing nutritional science a more comprehensive view of lipid intermediates. Lipidomics research takes advantage of the increase in accuracy and sensitivity of mass detection of mass spectrometry with new bioinformatics toolsets to characterize the structures and abundances of complex lipids. Yet, translating lipidomics to practice via nutritional interventions is still in its infancy. No single instrumentation platform is able to solve the varying analytical challenges of the different molecular lipid species. Biochemical pathways of lipid metabolism remain incomplete and the tools to map lipid compositional data to pathways are still being assembled. Biology itself is dauntingly complex and simply separating biological structures remains a key challenge to lipidomics. Nonetheless, the strategy of combining tandem analytical methods to perform the sensitive, high-throughput, quantitative and comprehensive analysis of lipid metabolites of very large numbers of molecules is poised to drive the field forward rapidly. Among the next steps for nutrition to understand the changes in structures, compositions and function of lipid biomolecules in response to diet is to describe their distribution within discrete functional compartments-lipoproteins. Additionally, lipidomics must tackle the task of assigning the functions of lipids as signaling molecules, nutrient sensors, and intermediates of metabolic pathways. PMID:23818328

  20. MATLAB/Simulink analytic radar modeling environment

    NASA Astrophysics Data System (ADS)

    Esken, Bruce L.; Clayton, Brian L.

    2001-09-01

    Analytic radar models are simulations based on abstract representations of the radar, the RF environment through which radar signals propagate, and the reflections produced by targets, clutter, and multipath. These models have traditionally been developed in FORTRAN and have evolved over the last 20 years into efficient and well-accepted codes. However, current models are limited in two primary areas. First, by the nature of algorithm-based analytical models, they can be difficult for non-programmers to understand and equally difficult to modify or extend. Second, there is strong interest in re-using these models to support higher-level weapon system and mission level simulations. To address these issues, a model development approach has been demonstrated which utilizes the MATLAB/Simulink graphical development environment. Because the MATLAB/Simulink environment graphically represents model algorithms (thus providing visibility into the model), algorithms can be easily analyzed and modified by engineers and analysts with limited software skills. In addition, software tools have been created that provide for the automatic code generation of C++ objects. These objects are created with well-defined interfaces enabling them to be used by modeling architectures external to the MATLAB/Simulink environment. The approach utilized is generic and can be extended to other engineering fields.

  1. A review of the analytical simulation of aircraft crash dynamics

    NASA Technical Reports Server (NTRS)

    Fasanella, Edwin L.; Carden, Huey D.; Boitnott, Richard L.; Hayduk, Robert J.

    1990-01-01

    A large number of full-scale tests of general aviation aircraft, helicopters, and one unique air-to-ground controlled impact of a transport aircraft were performed. Additionally, research was conducted on seat dynamic performance, load-limiting seats, load-limiting subfloor designs, and emergency locator transmitters (ELTs). Computer programs were developed to provide designers with methods for predicting accelerations, velocities, and displacements of collapsing structure and for estimating the human response to crash loads. The results of full-scale aircraft and component tests were used to verify and guide the development of analytical simulation tools and to demonstrate impact load attenuating concepts. Analytical simulation of metal and composite aircraft crash dynamics is addressed. Finite element models are examined to determine their degree of corroboration by experimental data and to reveal deficiencies requiring further development.

  2. Practical solution for control of the pre-analytical phase in decentralized clinical laboratories for meeting the requirements of the medical laboratory accreditation standard DIN EN ISO 15189.

    PubMed

    Vacata, Vladimir; Jahns-Streubel, Gerlinde; Baldus, Mirjana; Wood, William Graham

    2007-01-01

    This report was written in response to the article by Wood published recently in this journal. It describes a practical solution to the problems of controlling the pre-analytical phase in the clinical diagnostic laboratory. As an indicator of quality in the pre-analytical phase of sample processing, a target analyte was chosen which is sensitive to delay in centrifugation and/or analysis. The results of analyses of the samples sent by satellite medical practitioners were compared with those from an on-site hospital laboratory with a controllable, optimized pre-analytical phase. The aim of the comparison was: (a) to identify those medical practices whose mean/median sample values significantly deviate from those of the control situation in the hospital laboratory due to possible problems in the pre-analytical phase; (b) to aid these laboratories in the process of rectifying these problems. A Microsoft Excel-based Pre-Analytical Survey tool (PAS tool) has been developed which addresses the above-mentioned problems. It has been tested on serum potassium, which is known to be sensitive to delay and/or irregularities in sample treatment. The PAS tool has been shown to be one possibility for improving the quality of the analyses by identifying the sources of problems within the pre-analytical phase, thus allowing them to be rectified. Additionally, the PAS tool has an educational value and can also be adopted for use in other decentralized laboratories.
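
    The PAS tool itself is Excel-based; purely to illustrate the underlying comparison, here is a minimal Python sketch that flags practices whose median potassium deviates from the on-site hospital control. The function name, the z-score rule, and all data are hypothetical, not the published tool.

      import numpy as np

      def flag_preanalytical_outliers(practice_results, control, z_limit=2.0):
          """Flag satellite practices whose median serum K+ deviates from the
          on-site hospital laboratory (controlled pre-analytical phase).
          practice_results: dict mapping practice id -> list of K+ (mmol/L);
          control: list of hospital laboratory results."""
          ref_median = np.median(control)
          ref_sd = np.std(control, ddof=1)
          flags = {}
          for practice, values in practice_results.items():
              z = (np.median(values) - ref_median) / ref_sd
              flags[practice] = abs(z) > z_limit   # True -> investigate pre-analytics
          return flags

      # hypothetical data: delayed centrifugation falsely elevates potassium
      hospital = [4.1, 4.3, 4.0, 4.2, 4.4, 4.1, 4.2]
      practices = {"practice_A": [4.2, 4.1, 4.3], "practice_B": [5.0, 5.2, 4.9]}
      print(flag_preanalytical_outliers(practices, hospital))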

  3. Process analytical technology in the pharmaceutical industry: a toolkit for continuous improvement.

    PubMed

    Scott, Bradley; Wilcock, Anne

    2006-01-01

    Process analytical technology (PAT) refers to a series of tools used to ensure that quality is built into products while at the same time improving the understanding of processes, increasing efficiency, and decreasing costs. It has not been widely adopted by the pharmaceutical industry. As the setting for this paper, the current pharmaceutical manufacturing paradigm and PAT guidance to date are discussed prior to the review of PAT principles and tools, benefits, and challenges. The PAT toolkit contains process analyzers, multivariate analysis tools, process control tools, and continuous improvement/knowledge management/information technology systems. The integration and implementation of these tools are complex and have resulted in uncertainty with respect to both regulation and validation. The paucity of staff knowledgeable in this area may complicate adoption. Studies to quantitate the benefits resulting from the adoption of PAT within the pharmaceutical industry would be a valuable addition to the qualitative studies that are currently available.

  4. [Raman spectroscopy (RS): A new tool for the analytical quality control of injectables in health settings. Comparison of the RS technique versus HPLC and UV/Vis-FTIR, applied to anthracyclines as anticancer drugs].

    PubMed

    Bourget, P; Amin, A; Moriceau, A; Cassard, B; Vidal, F; Clement, R

    2012-12-01

    This study compares the performance of three analytical methods for the Analytical Quality Control (AQC) of therapeutic solutions prepared in the care environment, referred to here as Therapeutic Objects (TOs). We explored the pharmacological model of two widely used anthracyclines, i.e., adriamycin and epirubicin. We compared the performance of HPLC versus two vibrational spectroscopic techniques: a tandem UV/Vis-FTIR on one hand and Raman Spectroscopy (RS) on the other. The three methods give good results for the key criteria of repeatability, reproducibility, and accuracy. Spearman and Kendall correlation tests confirm the noninferiority of the vibrational techniques as an alternative to the reference method (HPLC). The selection of bands for characterization and quantification by RS is the result of a gradual adjustment process, shaped by matrix effects. From the perspective of AQC associated with the release of TOs, RS displays various advantages: (a) it can decide quickly (~2 min), simultaneously, and without intrusion or sample withdrawal on both the nature of the packaging and of the solvent, regardless of the compound of interest, which is the founding asset of the method; (b) it can explore any kind of TO qualitatively and quantitatively; (c) operator safety is guaranteed during production and in the laboratory; (d) the elimination of analytical releases and waste helps protect the environment; (e) no consumables are required; (f) maintenance costs are negligible; (g) technician training requires only a small budget. These results already show that RS technology is potentially a strong contributor to the safety of the medication cycle and to the fight against the iatrogenic effects of drugs.

  5. Shot Planning and Analysis Tools

    SciTech Connect

    Casey, A; Beeler, R; Conder, A; Fallejo, R; Flegel, M; Hutton, M; Jancaitis, K; Lakamsani, V; Potter, D; Reisdorf, S; Tappero, J; Whitman, P; Carr, W; Liao, Z

    2011-07-25

    Shot planning and analysis tools (SPLAT) integrate components necessary to help achieve a high overall operational efficiency of the National Ignition Facility (NIF) by combining near and long-term shot planning, final optics demand and supply loops, target diagnostics planning, and target fabrication requirements. Currently, the SPLAT project comprises two primary tool suites for shot planning and optics demand. The shot planning component provides a web-based interface to selecting and building a sequence of proposed shots for the NIF. These shot sequences, or 'lanes' as they are referred to by shot planners, provide for planning both near-term shots in the Facility and long-term 'campaigns' in the months and years to come. The shot planning capabilities integrate with the Configuration Management Tool (CMT) for experiment details and the NIF calendar for availability. Future enhancements will additionally integrate with target diagnostics planning and target fabrication requirements tools. The optics demand component is built upon predictive modeling of maintenance requirements on the final optics as a result of the proposed shots assembled during shot planning. The predictive models integrate energetics from a Laser Performance Operations Model (LPOM), the status of the deployed optics as provided by the online Final Optics Inspection system, and physics-based mathematical 'rules' that predict optic flaw growth and new flaw initiations. These models are then run on an analytical cluster comprised of forty-eight Linux-based compute nodes. Results from the predictive models are used to produce decision-support reports in the areas of optics inspection planning, optics maintenance exchanges, and optics beam blocker placement advisories. Over time, the SPLAT project will evolve to provide a variety of decision-support and operation optimization tools.
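
    As a loose illustration of how such physics-based 'rules' can drive decision support, the sketch below grows a hypothetical optic flaw exponentially with per-shot fluence and reports when a beam blocker would be advised. The growth law, coefficient, and threshold are invented for illustration; they are not the LPOM or SPLAT models.

      import math

      def predict_flaw_growth(initial_um, shot_fluences, growth_coeff=0.02, block_at_um=300.0):
          """Hypothetical flaw-growth rule: each shot grows an optic flaw
          exponentially with its fluence (J/cm^2). Returns the size trajectory
          and the first shot index at which a blocker would be advised."""
          size = initial_um
          trajectory, block_shot = [], None
          for i, fluence in enumerate(shot_fluences):
              size *= math.exp(growth_coeff * fluence)
              trajectory.append(size)
              if block_shot is None and size >= block_at_um:
                  block_shot = i
          return trajectory, block_shot

      # a planned lane of 30 shots with slowly increasing fluence (illustrative)
      sizes, blocker = predict_flaw_growth(50.0, [8.0, 8.5, 9.0, 9.5, 10.0] * 6)
      print(f"final flaw size: {sizes[-1]:.0f} um, block before shot index: {blocker}")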

  6. Multimedia Analysis plus Visual Analytics = Multimedia Analytics

    SciTech Connect

    Chinchor, Nancy; Thomas, James J.; Wong, Pak C.; Christel, Michael; Ribarsky, Martin W.

    2010-10-01

    Multimedia analysis has focused on images, video, and to some extent audio and has made progress in single channels excluding text. Visual analytics has focused on the user interaction with data during the analytic process plus the fundamental mathematics and has continued to treat text as did its precursor, information visualization. The general problem we address in this tutorial is the combining of multimedia analysis and visual analytics to deal with multimedia information gathered from different sources, with different goals or objectives, and containing all media types and combinations in common usage.

  7. Analytical Challenges in Biotechnology.

    ERIC Educational Resources Information Center

    Glajch, Joseph L.

    1986-01-01

    Highlights five major analytical areas (electrophoresis, immunoassay, chromatographic separations, protein and DNA sequencing, and molecular structures determination) and discusses how analytical chemistry could further improve these techniques and thereby have a major impact on biotechnology. (JN)

  8. Deriving Earth Science Data Analytics Requirements

    NASA Technical Reports Server (NTRS)

    Kempler, Steven J.

    2015-01-01

    Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to be able to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tool/technique requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  9. Analyticity without Differentiability

    ERIC Educational Resources Information Center

    Kirillova, Evgenia; Spindler, Karlheinz

    2008-01-01

    In this article we derive all salient properties of analytic functions, including the analytic version of the inverse function theorem, using only the most elementary convergence properties of series. Not even the notion of differentiability is required to do so. Instead, analytical arguments are replaced by combinatorial arguments exhibiting…

  10. SNL software manual for the ACS Data Analytics Project.

    SciTech Connect

    Stearley, Jon R.; McLendon, William Clarence, III; Rodrigues, Arun F.; Williams, Aaron S.; Hooper, Russell Warren; Robinson, David Gerald; Stickland, Michael G.

    2011-10-01

    In the ACS Data Analytics Project (also known as 'YumYum'), a supercomputer is modeled as a graph of components and dependencies, jobs and faults are simulated, and component fault rates are estimated using the graph structure and job pass/fail outcomes. This report documents the successful completion of all SNL deliverables and tasks, describes the software written by SNL for the project, and presents the data it generates. Readers should understand what the software tools are, how they fit together, and how to use them to reproduce the presented data and additional experiments as desired. The SNL YumYum tools provide the novel simulation and inference capabilities desired by ACS. SNL also developed and implemented a new algorithm, which provides faster estimates, at finer component granularity, on arbitrary directed acyclic graphs.

  11. A graph algebra for scalable visual analytics.

    PubMed

    Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V

    2012-01-01

    Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide these tools' creation. Increased data demands on computing require redesigning VA tools to consider performance and reliability in the context of analysis of exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data.
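
    A minimal sketch of the two atomic operators named above, selection and aggregation, using the networkx library; the function names, the attribute key, and the edge-count weighting are illustrative assumptions, not the paper's algebra.

      import networkx as nx

      def select(g, predicate):
          """Selection operator: induced subgraph of nodes satisfying a predicate."""
          return g.subgraph([n for n, attrs in g.nodes(data=True) if predicate(attrs)])

      def aggregate(g, key):
          """Aggregation operator: collapse nodes sharing an attribute value into
          one super-node; edge weights count the collapsed edges."""
          h = nx.Graph()
          for u, v in g.edges():
              gu, gv = g.nodes[u][key], g.nodes[v][key]
              if gu == gv:
                  continue  # edge internal to a group disappears on aggregation
              w = h.get_edge_data(gu, gv, {"weight": 0})["weight"]
              h.add_edge(gu, gv, weight=w + 1)
          return h

      g = nx.Graph()
      g.add_node("a", dept="sales"); g.add_node("b", dept="sales")
      g.add_node("c", dept="eng"); g.add_edges_from([("a", "b"), ("b", "c")])
      print(aggregate(select(g, lambda attrs: True), "dept").edges(data=True))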

  12. Finite element analyses of tool stresses in metal cutting processes

    SciTech Connect

    Kistler, B.L.

    1997-01-01

    In this report, we analytically predict and examine stresses in tool tips used in high speed orthogonal machining operations. Specifically, one analysis was compared to an existing experimental measurement of stresses in a sapphire tool tip cutting 1020 steel at slow speeds. In addition, two analyses were done of a carbide tool tip in a machining process at higher cutting speeds, in order to compare to experimental results produced as part of this study. The metal being cut was simulated using a Sandia developed damage plasticity material model, which allowed the cutting to occur analytically without prespecifying the line of cutting/failure. The latter analyses incorporated temperature effects on the tool tip. Calculated tool forces and peak stresses matched experimental data to within 20%. Stress contours generally agreed between analysis and experiment. This work could be extended to investigate/predict failures in the tool tip, which would be of great interest to machining shops in understanding how to optimize cost/retooling time.

  13. Earth Science Data Analytics: Preparing for Extracting Knowledge from Information

    NASA Technical Reports Server (NTRS)

    Kempler, Steven; Barbieri, Lindsay

    2016-01-01

    advance science point of view: On the continuum of ever evolving data management systems, we need to understand and develop ways that allow for the variety of data relationships to be examined, and information to be manipulated, such that knowledge can be enhanced, to facilitate science. Recognizing the importance and potential impacts of the unlimited ways to co-analyze heterogeneous datasets, now and especially in the future, one of the objectives of the ESDA cluster is to facilitate the preparation of individuals to understand and apply needed skills to Earth science data analytics. Pinpointing and communicating the needed skills and expertise is new, and not easy. Information technology is just beginning to provide the tools for advancing the analysis of heterogeneous datasets in a big way, thus providing the opportunity to discover unobvious scientific relationships previously invisible to the science eye. And it is not easy. It takes individuals, or teams of individuals, with just the right combination of skills to understand the data and develop the methods to glean knowledge out of data and information. In addition, whereas definitions of data science and big data are (more or less) available (summarized in Reference 5), Earth science data analytics is virtually ignored in the literature, barring a few excellent sources.

  14. Creative Analytics of Mission Ops Event Messages

    NASA Technical Reports Server (NTRS)

    Smith, Dan

    2017-01-01

    -telemetry mission operations information into a common log format and then providing display and analytics tools to provide in-depth assessment of the log contents. The work includes: common interface formats for acquiring time-tagged text messages; conversion of common files for schedules, orbital events, and stored commands to the common log format; innovative displays to depict thousands of messages on a single display; structured English text queries against the log message data store, extensible to a more mature natural language query capability; and the goal of speech-to-text and text-to-speech additions to create a personal mission operations assistant to aid on-console operations. A wide variety of planned uses identified by the mission operations teams will be discussed.

  15. Jetting tool

    SciTech Connect

    Szarka, D.D.; Schwegman, S.L.

    1991-07-09

    This patent describes an apparatus for hydraulically jetting a well tool disposed in a well, the well tool having a sliding member. It comprises positioner means for operably engaging the sliding member of the well tool; and a jetting means, connected at a rotatable connection to the positioner means so that the jetting means is rotatable relative to the positioner means and the well tool, for hydraulically jetting the well tool as the jetting means is rotated relative thereto.

  16. Risk Assessment Tools

    DTIC Science & Technology

    1994-10-01

    A number of computer-based risk assessment tools were enhanced or created to provide increased access to risk assessment instruments and...produced an extensible authoring tool, SYNTAS, for test instruments that will simplify the data gathering phase of subsequent work. SYNTAS gives DNA...Ultimately it became a computer-assisted software engineering (CASE) tool capable of producing a wide variety of assessment instruments. In addition, its

  17. Graphical Contingency Analysis Tool

    SciTech Connect

    2010-03-02

    GCA is a visual analytic tool for power grid contingency analysis that provides decision support for power grid operations. GCA allows power grid operators to quickly gain situational awareness of the power grid by converting large amounts of operational data to the graphic domain with a color-contoured map; identify system trends and foresee and discern emergencies by performing trending analysis; identify the relationships between system configurations and affected assets by conducting clustering analysis; and identify the best action by interactively evaluating candidate actions.

  18. Java Tool Retirement

    Atmospheric Science Data Center

    2014-05-15

    Date(s):  Wednesday, May 14, 2014 Time:  08:00 am EDT Event Impact:  The ASDC Java Order Tool was officially retired on Wednesday, May 14, 2014.  The HTML Order Tool and additional options are available...

  19. Analytical Chemistry in Russia.

    PubMed

    Zolotov, Yuri

    2016-09-06

    Research in Russian analytical chemistry (AC) is carried out on a significant scale, and the analytical service solves practical tasks of geological survey, environmental protection, medicine, industry, agriculture, etc. The education system trains highly skilled professionals in AC. The development and especially the manufacturing of analytical instruments should be improved; in spite of this, there are several good domestic instruments, and others satisfy some requirements. Russian AC has rather good historical roots.

  20. Constraint-Referenced Analytics of Algebra Learning

    ERIC Educational Resources Information Center

    Sutherland, Scot M.; White, Tobin F.

    2016-01-01

    The development of the constraint-referenced analytics tool for monitoring algebra learning activities presented here came from the desire, firstly, to take a more quantitative look at student responses in collaborative algebra activities, and secondly, to situate those activities in a more traditional introductory algebra setting focusing on…

  1. Standards for business analytics and departmental workflow.

    PubMed

    Erickson, Bradley J; Meenan, Christopher; Langer, Steve

    2013-02-01

    Efficient workflow is essential for a successful business. However, there is relatively little literature on analytical tools and standards for defining workflow and measuring workflow efficiency. Here, we describe an effort to define a workflow lexicon for medical imaging departments, including the rationale, the process, and the resulting lexicon.

  2. Science Update: Analytical Chemistry.

    ERIC Educational Resources Information Center

    Worthy, Ward

    1980-01-01

    Briefly discusses new instrumentation in the field of analytical chemistry. Advances in liquid chromatography, photoacoustic spectroscopy, the use of lasers, and mass spectrometry are also discussed. (CS)

  3. Toward Theoretical Techniques for Measuring the Use of Human Effort in Visual Analytic Systems.

    PubMed

    Crouser, R Jordan; Franklin, Lyndsey; Endert, Alex; Cook, Kris

    2017-01-01

    Visual analytic systems have long relied on user studies and standard datasets to demonstrate advances to the state of the art, as well as to illustrate the efficiency of solutions to domain-specific challenges. This approach has enabled some important comparisons between systems, but unfortunately the narrow scope required to facilitate these comparisons has prevented many of these lessons from being generalized to new areas. At the same time, advanced visual analytic systems have made increasing use of human-machine collaboration to solve problems not tractable by machine computation alone. To continue to make progress in modeling user tasks in these hybrid visual analytic systems, we must strive to gain insight into what makes certain tasks more complex than others. This will require the development of mechanisms for describing the balance to be struck between machine and human strengths with respect to analytical tasks and workload. In this paper, we argue for the necessity of theoretical tools for reasoning about such balance in visual analytic systems and demonstrate the utility of the Human Oracle Model for this purpose in the context of sensemaking in visual analytics. Additionally, we make use of the Human Oracle Model to guide the development of a new system through a case study in the domain of cybersecurity.

  4. Methodology for the validation of analytical methods involved in uniformity of dosage units tests.

    PubMed

    Rozet, E; Ziemons, E; Marini, R D; Boulanger, B; Hubert, Ph

    2013-01-14

    Validation of analytical methods is required prior to their routine use. In addition, the current implementation of the Quality by Design (QbD) framework in the pharmaceutical industries aims at improving the quality of the end products starting from the early design stage. However, neither regulatory guidelines nor the published methodologies for assessing method validation propose decision methodologies that effectively take into account the final purpose of the developed analytical method. In this work, a solution is proposed for the specific case of validating analytical methods involved in the assessment of the content uniformity or uniformity of dosage units of a batch of pharmaceutical drug products, as proposed in the European or US pharmacopoeias. This methodology uses statistical tolerance intervals as decision tools. Moreover, it adequately defines the Analytical Target Profile of analytical methods in order to obtain methods that allow correct decisions to be made about content uniformity or uniformity of dosage units with high probability. The applicability of the proposed methodology is further illustrated using an HPLC-UV assay as well as a near-infrared spectrophotometric method.
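
    To make the decision tool concrete, the following sketch computes a two-sided normal tolerance interval using Howe's approximation, applied to hypothetical content-uniformity data. The coverage/confidence settings and the data are assumptions for illustration, not the paper's validated methodology.

      import numpy as np
      from scipy import stats

      def tolerance_interval(x, coverage=0.9, confidence=0.95):
          """Two-sided normal tolerance interval (Howe's approximation): an
          interval expected to contain `coverage` of the population with the
          stated confidence."""
          n = len(x)
          mean, sd = np.mean(x), np.std(x, ddof=1)
          z = stats.norm.ppf((1 + coverage) / 2)
          chi2 = stats.chi2.ppf(1 - confidence, n - 1)   # lower-tail chi-square quantile
          k = z * np.sqrt((n - 1) * (1 + 1 / n) / chi2)
          return mean - k * sd, mean + k * sd

      # hypothetical assay results (% of label claim) for 10 dosage units
      units = np.array([98.7, 101.2, 99.5, 100.8, 99.9, 100.3, 98.9, 101.0, 100.1, 99.6])
      low, high = tolerance_interval(units)
      print(f"90%/95% tolerance interval: [{low:.1f}, {high:.1f}] % label claim")

    The decision rule is then a simple comparison of this interval against the pharmacopoeial acceptance limits (e.g., 85-115% of label claim).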

  5. Analyte-Responsive Hydrogels: Intelligent Materials for Biosensing and Drug Delivery.

    PubMed

    Culver, Heidi R; Clegg, John R; Peppas, Nicholas A

    2017-02-21

    Nature has mastered the art of molecular recognition. For example, using synergistic non-covalent interactions, proteins can distinguish between molecules and bind a partner with incredible affinity and specificity. Scientists have developed, and continue to develop, techniques to investigate and better understand molecular recognition. As a consequence, analyte-responsive hydrogels that mimic these recognitive processes have emerged as a class of intelligent materials. These materials are unique not only in the type of analyte to which they respond but also in how molecular recognition is achieved and how the hydrogel responds to the analyte. Traditional intelligent hydrogels can respond to environmental cues such as pH, temperature, and ionic strength. The functional monomers used to make these hydrogels can be varied to achieve responsive behavior. For analyte-responsive hydrogels, molecular recognition can also be achieved by incorporating biomolecules with inherent molecular recognition properties (e.g., nucleic acids, peptides, enzymes, etc.) into the polymer network. Furthermore, in addition to typical swelling/syneresis responses, these materials exhibit unique responsive behaviors, such as gel assembly or disassembly, upon interaction with the target analyte. With the diverse tools available for molecular recognition and the ability to generate unique responsive behaviors, analyte-responsive hydrogels have found great utility in a wide range of applications. In this Account, we discuss strategies for making four different classes of analyte-responsive hydrogels, specifically, non-imprinted, molecularly imprinted, biomolecule-containing, and enzymatically responsive hydrogels. Then we explore how these materials have been incorporated into sensors and drug delivery systems, highlighting examples that demonstrate the versatility of these materials. For example, in addition to the molecular recognition properties of analyte-responsive hydrogels, the

  6. OOTW Force Design Tools

    SciTech Connect

    Bell, R.E.; Hartley, D.S.III; Packard, S.L.

    1999-05-01

    This report documents refined requirements for tools to aid the process of force design in Operations Other Than War (OOTWs). It recommends actions for the creation of one tool and work on other tools relating to mission planning. It also identifies the governmental agencies and commands with interests in each tool, from whom should come the user advisory groups overseeing the respective tool development activities. The understanding of OOTWs and their analytical support requirements has matured to the point where action can be taken in three areas: force design, collaborative analysis, and impact analysis. While the nature of the action and the length of time before complete results can be expected depends on the area, in each case the action should begin immediately. Force design for OOTWs is not a technically difficult process. Like force design for combat operations, it is a process of matching the capabilities of forces against the specified and implied tasks of the operation, considering the constraints of logistics, transport and force availabilities. However, there is a critical difference that restricts the usefulness of combat force design tools for OOTWs: the combat tools are built to infer non-combat capability requirements from combat capability requirements and cannot reverse the direction of the inference, as is required for OOTWs. Recently, OOTWs have played a larger role in force assessment, system effectiveness and tradeoff analysis, and concept and doctrine development and analysis. In the first Quadrennial Defense Review (QDR), each of the Services created its own OOTW force design tool. Unfortunately, the tools address different parts of the problem and do not coordinate the use of competing capabilities. These tools satisfied the immediate requirements of the QDR, but do not provide a long-term cost-effective solution.

  7. Tool Carrier

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Tool organizer accommodates a selection of hand tools on a waist or thigh belt or alternately on wall, work bench, or car trunk mountings. Tool caddy is widely used by industrial maintenance personnel, TV technicians, mechanics, artists, draftsmen, hobbyists and homeowners. Its innovative feature is rows of flexible vinyl "fingers" like the bristles of a hairbrush which mesh together to hold the tool securely in place yet allow easy insertion or withdrawal. Product is no longer commercially available.

  8. Molecular tools for chemical biotechnology

    PubMed Central

    Galanie, Stephanie; Siddiqui, Michael S.; Smolke, Christina D.

    2013-01-01

    Biotechnological production of high value chemical products increasingly involves engineering in vivo multi-enzyme pathways and host metabolism. Recent approaches to these engineering objectives have made use of molecular tools to advance de novo pathway identification, tunable enzyme expression, and rapid pathway construction. Molecular tools also enable optimization of single enzymes and entire genomes through diversity generation and screening, whole cell analytics, and synthetic metabolic control networks. In this review, we focus on advanced molecular tools and their applications to engineered pathways in host organisms, highlighting the degree to which each tool is generalizable. PMID:23528237

  9. Analytical aspects of hydrogen exchange mass spectrometry

    PubMed Central

    Engen, John R.; Wales, Thomas E.

    2016-01-01

    The analytical aspects of measuring hydrogen exchange by mass spectrometry are reviewed. The nature of analytical selectivity in hydrogen exchange is described followed by review of the analytical tools required to accomplish fragmentation, separation, and the mass spectrometry measurements under restrictive exchange quench conditions. In contrast to analytical quantitation that relies on measurements of peak intensity or area, quantitation in hydrogen exchange mass spectrometry depends on measuring a mass change with respect to an undeuterated or deuterated control, resulting in a value between zero and the maximum amount of deuterium that could be incorporated. Reliable quantitation is a function of experimental fidelity and to achieve high measurement reproducibility, a large number of experimental variables must be controlled during sample preparation and analysis. The method also reports on important qualitative aspects of the sample, including conformational heterogeneity and population dynamics. PMID:26048552
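
    The quantitation principle described above, a mass change bounded by undeuterated and maximally deuterated controls, reduces to a one-line computation. The sketch below is a generic illustration with hypothetical centroid masses, not software from the review.

      def deuterium_uptake(m_exp, m_undeut, m_maxdeut):
          """Relative deuterium uptake for a peptide in HX-MS: the measured
          mass shift relative to undeuterated and maximally deuterated
          controls. Returns 0 (no exchange) to 1 (complete exchange)."""
          return (m_exp - m_undeut) / (m_maxdeut - m_undeut)

      # hypothetical centroid masses (Da) for one peptide
      print(deuterium_uptake(m_exp=1250.9, m_undeut=1248.6, m_maxdeut=1253.2))  # -> 0.5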

  10. Percussion tool

    DOEpatents

    Reed, Teddy R.

    2006-11-28

    A percussion tool is described which includes a housing mounting a tool bit; a reciprocally moveable hammer borne by the housing and which is operable to repeatedly strike the tool bit; and a reciprocally moveable piston enclosed within the hammer which imparts reciprocal movement to the hammer.

  11. Signals: Applying Academic Analytics

    ERIC Educational Resources Information Center

    Arnold, Kimberly E.

    2010-01-01

    Academic analytics helps address the public's desire for institutional accountability with regard to student success, given the widespread concern over the cost of higher education and the difficult economic and budgetary conditions prevailing worldwide. Purdue University's Signals project applies the principles of analytics widely used in…

  12. Extreme Scale Visual Analytics

    SciTech Connect

    Wong, Pak C.; Shen, Han-Wei; Pascucci, Valerio

    2012-05-08

    Extreme-scale visual analytics (VA) is about applying VA to extreme-scale data. The articles in this special issue examine advances related to extreme-scale VA problems, their analytical and computational challenges, and their real-world applications.

  13. Learning Analytics Considered Harmful

    ERIC Educational Resources Information Center

    Dringus, Laurie P.

    2012-01-01

    This essay is written to present a prospective stance on how learning analytics, as a core evaluative approach, must help instructors uncover the important trends and evidence of quality learner data in the online course. A critique is presented of strategic and tactical issues of learning analytics. The approach to the critique is taken through…

  14. Analytical mass spectrometry

    SciTech Connect

    Not Available

    1990-01-01

    This 43rd Annual Summer Symposium on Analytical Chemistry was held July 24--27, 1990 at Oak Ridge, TN and contained sessions on the following topics: Fundamentals of Analytical Mass Spectrometry (MS), MS in the National Laboratories, Lasers and Fourier Transform Methods, Future of MS, New Ionization and LC/MS Methods, and an extra session. (WET)

  15. Analytical mass spectrometry. Abstracts

    SciTech Connect

    Not Available

    1990-12-31

    This 43rd Annual Summer Symposium on Analytical Chemistry was held July 24--27, 1990 at Oak Ridge, TN and contained sessions on the following topics: Fundamentals of Analytical Mass Spectrometry (MS), MS in the National Laboratories, Lasers and Fourier Transform Methods, Future of MS, New Ionization and LC/MS Methods, and an extra session. (WET)

  16. Validating Analytical Methods

    ERIC Educational Resources Information Center

    Ember, Lois R.

    1977-01-01

    The procedures utilized by the Association of Official Analytical Chemists (AOAC) to develop, evaluate, and validate analytical methods for the analysis of chemical pollutants are detailed. Methods validated by AOAC are used by the EPA and FDA in their enforcement programs and are granted preferential treatment by the courts. (BT)

  17. New and emerging analytical techniques for marine biotechnology.

    PubMed

    Burgess, J Grant

    2012-02-01

    Marine biotechnology is the industrial, medical or environmental application of biological resources from the sea. Since the marine environment is the most biologically and chemically diverse habitat on the planet, marine biotechnology has, in recent years, delivered a growing number of major therapeutic products, industrial and environmental applications, and analytical tools. These range from the use of a snail toxin to develop a pain-control drug, to metabolites from a sea squirt to develop an anti-cancer therapeutic, to marine enzymes to remove bacterial biofilms. In addition, well known and broadly used analytical techniques are derived from marine molecules or enzymes, including green fluorescent protein gene tagging methods and heat-resistant polymerases used in the polymerase chain reaction. Advances in bacterial identification, metabolic profiling and physical handling of cells are being revolutionised by techniques such as mass spectrometric analysis of bacterial proteins. Advances in instrumentation, and a combination of these physical advances with progress in proteomics and bioinformatics, are accelerating our ability to harness biology for commercial gain. Single-cell Raman spectroscopy and microfluidics are two emerging techniques which are also discussed elsewhere in this issue. In this review, we provide a brief survey and update of the most powerful and rapidly growing analytical techniques as used in marine biotechnology, together with some promising examples of less well known earlier-stage methods which may make a bigger impact in the future.

  18. Analytical connotations of point-of-care testing.

    PubMed

    Aguilera-Herrador, Eva; Cruz-Vera, Marta; Valcárcel, Miguel

    2010-09-01

    The main objective of this Tutorial Review is to bring the modern principles and practices of Analytical Chemistry to Point-of-Care Testing (POCT) systems, in order to contribute to improving both the development of new devices and the reliable application of existing ones. In this article, after contextualization of the topic, POCT systems (POCTs) are fully defined using several approaches. The requirements for a POCT system to be a robust and reliable tool available to patients and medical workers are described, as well as their desirable complementary characteristics. In addition, the technical components of POCTs, materialized in the implementation of the steps of the analytical process (sample introduction, sample processing, visual or instrumental detection, and data processing), are outlined. Besides, the analytical properties assigned to POCTs and to their quantitative and qualitative results are highlighted. Special emphasis is given to Quality Assurance and Quality Control procedures, which are essential to achieving reliable results. Finally, decision making based on the results obtained with POCTs is discussed, as are their benefits and drawbacks.

  19. Quo vadis, analytical chemistry?

    PubMed

    Valcárcel, Miguel

    2016-01-01

    This paper presents an open, personal, fresh approach to the future of Analytical Chemistry in the context of the deep changes Science and Technology are anticipated to experience. Its main aim is to challenge young analytical chemists because the future of our scientific discipline is in their hands. A description of not completely accurate overall conceptions of our discipline, both past and present, to be avoided is followed by a flexible, integral definition of Analytical Chemistry and its cornerstones (viz., aims and objectives, quality trade-offs, the third basic analytical reference, the information hierarchy, social responsibility, independent research, transfer of knowledge and technology, interfaces to other scientific-technical disciplines, and well-oriented education). Obsolete paradigms, and the more accurate general and specific paradigms that can be expected to provide the framework for our discipline in the coming years, are described. Finally, the three possible responses of analytical chemists to the proposed changes in our discipline are discussed.

  20. Analytical Spectroscopy Using Modular Systems

    NASA Astrophysics Data System (ADS)

    Patterson, Brian M.; Danielson, Neil D.; Lorigan, Gary A.; Sommer, André J.

    2003-12-01

    This article describes the development of three analytical spectroscopy experiments that compare the determination of the salicylic acid (SA) content in aspirin tablets. The experiments are based on UV-vis, fluorescence, and Raman spectroscopies and utilize modular spectroscopic components. Students assemble their own instruments, optimize them with respect to signal-to-noise ratio, generate calibration curves, determine the SA content in retail aspirin tablets, and assign features in the respective spectra to functional groups within the active material. Using this approach in a discovery-based setting, the students gain invaluable insight into method-specific parameters, such as instrumental components, sample preparation, and analytical capability. In addition, the students learn the fundamentals of fiber optics and signal processing using the low-cost CCD-based spectroscopic components.
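
    A minimal sketch of the calibration-curve step the students perform, here for the UV-vis case: a linear Beer-Lambert fit followed by inversion to estimate the salicylic acid concentration of an unknown. All numbers are hypothetical.

      import numpy as np

      # hypothetical calibration: UV-vis absorbance of salicylic acid standards
      conc_mg_l = np.array([5.0, 10.0, 20.0, 40.0, 80.0])
      absorbance = np.array([0.052, 0.101, 0.205, 0.398, 0.810])

      slope, intercept = np.polyfit(conc_mg_l, absorbance, 1)  # Beer-Lambert linear fit
      unknown_abs = 0.300                                      # tablet extract reading
      conc = (unknown_abs - intercept) / slope                 # invert the calibration
      print(f"SA in tablet extract: {conc:.1f} mg/L")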

  1. Deployment of Analytics into the Healthcare Safety Net: Lessons Learned

    PubMed Central

    Hartzband, David; Jacobs, Feygele

    2016-01-01

    Background: As payment reforms shift healthcare reimbursement toward value-based payment programs, providers need the capability to work with data of greater complexity, scope and scale. This will in many instances necessitate a change in understanding of the value of data, and the types of data needed for analysis to support operations and clinical practice. It will also require the deployment of different infrastructure and analytic tools. Community health centers, which serve more than 25 million people and together form the nation’s largest single source of primary care for medically underserved communities and populations, are expanding and will need to optimize their capacity to leverage data as new payer and organizational models emerge. Methods: To better understand existing capacity and help organizations plan for the strategic and expanded uses of data, a project was initiated that deployed contemporary, Hadoop-based, analytic technology into several multi-site community health centers (CHCs) and a primary care association (PCA) with an affiliated data warehouse supporting health centers across the state. An initial data quality exercise was carried out after deployment, in which a number of analytic queries were executed using both the existing electronic health record (EHR) applications and, in parallel, the analytic stack. Each organization carried out the EHR analysis using the definitions typically applied for routine reporting. The analysis deploying the analytic stack was carried out using the common definitions established for the Uniform Data System (UDS) by the Health Resources and Services Administration. In addition, interviews with health center leadership and staff were completed to understand the context for the findings. Results: The analysis uncovered many challenges and inconsistencies with respect to the definition of core terms (patient, encounter, etc.), data formatting, and missing, incorrect and unavailable data. At a population

  2. Visual Analytics 101

    SciTech Connect

    Scholtz, Jean; Burtner, Edwin R.; Cook, Kristin A.

    2016-06-13

    This course will introduce the field of Visual Analytics to HCI researchers and practitioners highlighting the contributions they can make to this field. Topics will include a definition of visual analytics along with examples of current systems, types of tasks and end users, issues in defining user requirements, design of visualizations and interactions, guidelines and heuristics, the current state of user-centered evaluations, and metrics for evaluation. We encourage designers, HCI researchers, and HCI practitioners to attend to learn how their skills can contribute to advancing the state of the art of visual analytics.

  3. Analytical laboratory quality audits

    SciTech Connect

    Kelley, William D.

    2001-06-11

    Analytical Laboratory Quality Audits are designed to improve laboratory performance. The success of the audit, as for many activities, is based on adequate preparation, precise performance, well documented and insightful reporting, and productive follow-up. Adequate preparation starts with definition of the purpose, scope, and authority for the audit and the primary standards against which the laboratory quality program will be tested. The scope and technical processes involved lead to determining the needed audit team resources. Contact is made with the auditee and a formal audit plan is developed, approved and sent to the auditee laboratory management. Review of the auditee's quality manual, key procedures and historical information during preparation leads to better checklist development and more efficient and effective use of the limited time for data gathering during the audit itself. The audit begins with the opening meeting that sets the stage for the interactions between the audit team and the laboratory staff. Arrangements are worked out for the necessary interviews and examination of processes and records. The information developed during the audit is recorded on the checklists. Laboratory management is kept informed of issues during the audit so there are no surprises at the closing meeting. The audit report documents whether the management control systems are effective. In addition to findings of nonconformance, positive reinforcement of exemplary practices provides balance and fairness. Audit closure begins with receipt and evaluation of proposed corrective actions from the nonconformances identified in the audit report. After corrective actions are accepted, their implementation is verified. Upon closure of the corrective actions, the audit is officially closed.

  4. The Computer-Aided Analytic Process Model. Operations Handbook for the APM (Analytic Process Model) Demonstration Package. Appendix

    DTIC Science & Technology

    1986-01-01

    The Analytic Process Model for System Design and Measurement: A Computer-Aided Tool for Analyzing Training Systems and Other Human-Machine Systems. A...separate companion volume--The Computer-Aided Analytic Process Model: Operations Handbook for the APM Demonstration Package is also available under

  5. From Theory Use to Theory Building in Learning Analytics: A Commentary on "Learning Analytics to Support Teachers during Synchronous CSCL"

    ERIC Educational Resources Information Center

    Chen, Bodong

    2015-01-01

    In this commentary on Van Leeuwen (2015, this issue), I explore the relation between theory and practice in learning analytics. Specifically, I caution against adhering to one specific theoretical doctrine while ignoring others, suggest deeper applications of cognitive load theory to understanding teaching with analytics tools, and comment on…

  6. Liquid chromatography coupled to different atmospheric pressure ionization sources-quadrupole-time-of-flight mass spectrometry and post-column addition of metal salt solutions as a powerful tool for the metabolic profiling of Fusarium oxysporum.

    PubMed

    Cirigliano, Adriana M; Rodriguez, M Alejandra; Gagliano, M Laura; Bertinetti, Brenda V; Godeas, Alicia M; Cabrera, Gabriela M

    2016-03-25

    Fusarium oxysporum L11 is a non-pathogenic soil-borne fungal strain that yielded an extract that showed antifungal activity against phytopathogens. In this study, reversed-phase high-performance liquid chromatography (RP-HPLC) coupled to different atmospheric pressure ionization sources-quadrupole-time-of-flight mass spectrometry (API-QTOF-MS) was applied for the comprehensive profiling of the metabolites from the extract. The employed sources were electrospray (ESI), atmospheric pressure chemical ionization (APCI) and atmospheric pressure photoionization (APPI). Post-column addition of metal solutions of Ca, Cu and Zn(II) was also tested using ESI. A total of 137 compounds were identified or tentatively identified by matching their accurate mass signals, suggested molecular formulae and MS/MS analysis with previously reported data. Some compounds were isolated and identified by NMR. The extract was rich in cyclic peptides like cyclosporins, diketopiperazines and sansalvamides, most of which were new, and are reported here for the first time. The use of post-column addition of metals resulted in a useful strategy for the discrimination of compound classes since specific adducts were observed for the different compound families. This technique also allowed the screening for compounds with metal binding properties. Thus, the applied methodology is a useful choice for the metabolic profiling of extracts and also for the selection of metabolites with potential biological activities related to interactions with metal ions.
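
    The adduct-based discrimination described above relies on simple m/z arithmetic: the sketch below computes expected m/z values for a neutral metabolite under several of the cations used in the post-column additions. The cation masses are approximate and the choice of cyclosporin A as the example metabolite is illustrative.

      def adduct_mz(neutral_mass, cation_mass, charge):
          """m/z of a metal adduct ion: neutral molecule plus a cation of the
          given mass and charge (electron loss already folded into cation_mass)."""
          return (neutral_mass + cation_mass) / charge

      # approximate monoisotopic cation masses (Da); values are illustrative
      cations = {"[M+H]+": (1.00728, 1), "[M+Na]+": (22.98922, 1),
                 "[M+Ca]2+": (39.96149, 2), "[M+Zn]2+": (63.92805, 2)}

      cyclosporin_a = 1201.8414  # approximate neutral monoisotopic mass (Da)
      for label, (m, z) in cations.items():
          print(f"{label}: m/z {adduct_mz(cyclosporin_a, m, z):.4f}")

    Matching observed peaks against such predicted adduct series is what allows compound classes, and metal-binding metabolites in particular, to be discriminated.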

  7. Analytical techniques: A compilation

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A compilation, containing articles on a number of analytical techniques for quality control engineers and laboratory workers, is presented. Data cover techniques for testing electronic, mechanical, and optical systems, nondestructive testing techniques, and gas analysis techniques.

  8. Ootw Tool Requirements in Relation to JWARS

    SciTech Connect

    Hartley III, D.S.; Packard, S.L.

    1998-01-01

    This document reports the results of the CMke of the Secretary of Defense/Program Analysis & Evaluation (OSD/PA&E) sponsored project to identify how Operations Other Than War (OOTW) tool requirements relate to the Joint Warfare Simulation (JWARS) and, more generally, to joint analytical modeling and simulation (M&S) requirements. It includes recommendations about which OOTW tools (and functionality within tools) should be included in JWARS, which should be managed as joint analytical modeling and simulation (M&S) tools, and which should be left for independent development.

  9. FT-Raman and chemometric tools for rapid determination of quality parameters in milk powder: Classification of samples for the presence of lactose and fraud detection by addition of maltodextrin.

    PubMed

    Rodrigues Júnior, Paulo Henrique; de Sá Oliveira, Kamila; de Almeida, Carlos Eduardo Rocha; De Oliveira, Luiz Fernando Cappa; Stephani, Rodrigo; Pinto, Michele da Silva; de Carvalho, Antônio Fernandes; Perrone, Ítalo Tuler

    2016-04-01

    FT-Raman spectroscopy has been explored as a quick screening method to evaluate the presence of lactose and to identify milk powder samples adulterated with maltodextrin (2.5-50% w/w). Raman measurements can easily differentiate samples of milk powder, without the need for sample preparation, while traditional quality control methods, including high performance liquid chromatography, are cumbersome and slow. FT-Raman spectra were obtained from samples of whole-lactose and low-lactose milk powder, both without and with addition of maltodextrin. Differences were observed between the spectra involved in identifying samples with low lactose content, as well as adulterated samples. Exploratory data analysis combining Raman spectroscopy and multivariate analysis was also developed to classify samples with PCA and PLS-DA. The PLS-DA models obtained allowed all samples to be correctly classified. These results demonstrate the utility of FT-Raman spectroscopy in combination with chemometrics to assess the quality of milk powder.
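
    A minimal scikit-learn sketch of the PCA/PLS-DA workflow described, run on synthetic "spectra". The constant offset standing in for maltodextrin bands and the 0.5 decision threshold are assumptions for illustration, not the paper's models or data.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cross_decomposition import PLSRegression

      # synthetic preprocessed Raman spectra: rows = samples, cols = wavenumbers
      rng = np.random.default_rng(0)
      pure = rng.normal(0.0, 1.0, (20, 500))
      adulterated = pure + 0.5  # toy effect standing in for maltodextrin bands
      X = np.vstack([pure, adulterated])
      y = np.array([0] * 20 + [1] * 20)  # 0 = pure milk powder, 1 = adulterated

      scores = PCA(n_components=2).fit_transform(X)        # exploratory view
      print("PC1 means (pure, adulterated):", scores[:20, 0].mean(), scores[20:, 0].mean())

      plsda = PLSRegression(n_components=2).fit(X, y)      # PLS-DA classifier
      pred = (plsda.predict(X).ravel() > 0.5).astype(int)  # threshold at 0.5
      print(f"training accuracy: {(pred == y).mean():.2f}")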

  10. MassyTools: A High-Throughput Targeted Data Processing Tool for Relative Quantitation and Quality Control Developed for Glycomic and Glycoproteomic MALDI-MS.

    PubMed

    Jansen, Bas C; Reiding, Karli R; Bondt, Albert; Hipgrave Ederveen, Agnes L; Palmblad, Magnus; Falck, David; Wuhrer, Manfred

    2015-12-04

    The study of N-linked glycosylation has long been complicated by a lack of bioinformatics tools. In particular, there is still a lack of fast and robust data processing tools for targeted (relative) quantitation. We have developed modular, high-throughput data processing software, MassyTools, that is capable of calibrating spectra, extracting data, and performing quality control calculations based on a user-defined list of glycan or glycopeptide compositions. Typical examples of output include relative areas after background subtraction, isotopic pattern-based quality scores, spectral quality scores, and signal-to-noise ratios. We demonstrated MassyTools' performance on MALDI-TOF-MS glycan and glycopeptide data from different samples. MassyTools yielded better calibration than the commercial software flexAnalysis, generally showing 2-fold lower ppm errors after internal calibration. Relative quantitation using MassyTools and flexAnalysis gave similar results, yielding a relative standard deviation (RSD) of the main glycan of ~6%. However, MassyTools yielded 2- to 5-fold lower RSD values for low-abundant analytes than flexAnalysis. Additionally, feature curation based on the computed quality criteria improved the data quality. In conclusion, we show that MassyTools is a robust automated data processing tool for high-throughput, high-performance glycosylation analysis. The package is released under the Apache 2.0 license and is freely available on GitHub (https://github.com/Tarskin/MassyTools).
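
    Purely as an illustration of the kind of targeted processing described (MassyTools itself is a released package with its own API), here is a generic sketch of background-subtracted relative quantitation and signal-to-noise computation. The function name and all numbers are hypothetical.

      import numpy as np

      def process_spectrum(peak_areas, local_backgrounds, noise_sd):
          """Targeted relative quantitation in the spirit of MassyTools:
          subtract a local background estimate from each analyte area, then
          report relative areas and a per-analyte signal-to-noise ratio."""
          areas = np.asarray(peak_areas)
          bg = np.asarray(local_backgrounds)
          corrected = np.maximum(areas - bg, 0)     # background subtraction
          relative = corrected / corrected.sum()    # normalize over all analytes
          snr = (areas - bg) / noise_sd             # signal-to-noise per analyte
          return relative, snr

      # hypothetical glycan peak areas from one MALDI-TOF spectrum
      rel, snr = process_spectrum([12000.0, 4500.0, 900.0], [300.0, 280.0, 310.0], noise_sd=95.0)
      print("relative areas:", np.round(rel, 3), "S/N:", np.round(snr, 1))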

  11. VCAT: Visual Crosswalk Analysis Tool

    SciTech Connect

    Cleland, Timothy J.; Forslund, David W.; Cleland, Catherine A.

    2012-08-31

    VCAT is a knowledge modeling and analysis tool. It was synthesized from ideas in functional analysis, business process modeling, and complex network science. VCAT discovers synergies by analyzing natural language descriptions. Specifically, it creates visual analytic perspectives that capture intended organization structures, then overlays the serendipitous relationships that point to potential synergies within an organization or across multiple organizations.

  12. MRO Sequence Checking Tool

    NASA Technical Reports Server (NTRS)

    Fisher, Forest; Gladden, Roy; Khanampornpan, Teerapat

    2008-01-01

    The MRO Sequence Checking Tool program, mro_check, automates significant portions of the MRO (Mars Reconnaissance Orbiter) sequence checking procedure. Though MRO has checks similar to the ODY's (Mars Odyssey) Mega Check tool, the checks needed for MRO are unique to the MRO spacecraft. The MRO sequence checking tool automates the majority of the sequence validation procedure and check lists that are used to validate the sequences generated by the MRO MPST (mission planning and sequencing team). The tool performs more than 50 different checks on the sequence. The automation varies from summarizing data about the sequence needed for visual verification of the sequence, to performing automated checks on the sequence and providing a report for each step. To allow for the addition of new checks as needed, this tool is built in a modular fashion.

  13. Omics Tools

    SciTech Connect

    Schaumberg, Andrew

    2012-12-21

    The Omics Tools package provides several small tools for work in genomics. This single portable package, the “omics.jar” file, is a toolbox that works in any Java-based environment, including PCs, Macs, and supercomputers. The number of tools is expected to grow. One tool (called cmsearch.hadoop or cmsearch.local) calls the external cmsearch program to predict non-coding RNA in a genome. The cmsearch program is part of the third-party Infernal package; Omics Tools does not contain Infernal, which may be installed separately. The cmsearch.hadoop subtool requires Apache Hadoop and runs on a supercomputer; cmsearch.local does not and runs on a server. Omics Tools does not contain Hadoop, which may be installed separately. The other tools (cmgbk, cmgff, fastats, pal, randgrp, randgrpr, randsub) do not interface with third-party tools. Omics Tools is written in the Java and Scala programming languages. Invoking the “help” command lists the currently available tools:

      schaumbe@gpint06:~/proj/omics$ java -jar omics.jar help
      Known commands are:
        cmgbk           : compare cmsearch and GenBank Infernal hits
        cmgff           : compare hits among two GFF (version 3) files
        cmsearch.hadoop : find Infernal hits in a genome, on your supercomputer
        cmsearch.local  : find Infernal hits in a genome, on your workstation
        fastats         : FASTA stats, e.g. # bases, GC content
        pal             : stem-loop motif detection by palindromic sequence search (code stub)
        randgrp         : random subsample without replacement, of groups
        randgrpr        : random subsample with replacement, of groups (fast)
        randsub         : random subsample without replacement, of file lines

      For more help regarding a particular command, use: java -jar omics.jar command help
      Usage: java -jar omics.jar command args

  14. Tool use by aquatic animals

    PubMed Central

    Mann, Janet; Patterson, Eric M.

    2013-01-01

    Tool-use research has focused primarily on land-based animals, with less consideration given to aquatic animals and the environmental challenges and conditions they face. Here, we review aquatic tool use and examine the contributing ecological, physiological, cognitive and social factors. Tool use among aquatic animals is rare but taxonomically diverse, occurring in fish, cephalopods, mammals, crabs, urchins and possibly gastropods. While additional research is required, the scarcity of tool use is likely attributable to the characteristics of aquatic habitats, which are generally not conducive to tool use. Nonetheless, studying tool use by aquatic animals provides insights into the conditions that promote and inhibit tool-use behaviour across biomes. Like land-based tool users, aquatic animals tend to find tools on the substrate and use tools during foraging. However, unlike on land, tool users in water often use other animals (and their products) and water itself as a tool. Among sea otters and dolphins, the two aquatic tool users studied in greatest detail, some individuals specialize in tool use, which is vertically, socially transmitted, possibly because of their long dependency periods. In all, the contrasts between aquatic and land-based tool users enlighten our understanding of the adaptive value of tool-use behaviour. PMID:24101631

  15. Developing Guidelines for Assessing Visual Analytics Environments

    SciTech Connect

    Scholtz, Jean

    2011-07-01

    In this paper, we develop guidelines for evaluating visual analytic environments based on a synthesis of reviews of the entries to the 2009 Visual Analytics Science and Technology (VAST) Symposium Challenge and on a user study with professional intelligence analysts. By analyzing the 2009 VAST Challenge reviews we gained a better understanding of what is important to our reviewers, both visualization researchers and professional analysts. We also report on a small user study with professional analysts to determine the important factors they use in evaluating visual analysis systems. We then looked at guidelines developed by researchers in various domains and synthesized these into an initial set for use by others in the community. In a second part of the user study, we looked at guidelines for a new aspect of visual analytic systems: the generation of reports. Future visual analytic systems have been challenged to help analysts generate their reports. In our study we worked with analysts to understand the criteria they used to evaluate the quality of analytic reports. We propose that this knowledge will be useful as researchers look at systems to automate some of the report generation. Based on these efforts, we produced initial guidelines for evaluating visual analytic environments and for evaluating analytic reports. It is important to understand that these guidelines are initial drafts and are limited in scope because of the type of tasks for which the visual analytic systems used in these studies were designed. More research and refinement is needed by the visual analytics community to provide additional evaluation guidelines for different types of visual analytic environments.

  16. Big Data Visual Analytics for Exploratory Earth System Simulation Analysis

    SciTech Connect

    Steed, Chad A.; Ricciuto, Daniel M.; Shipman, Galen M.; Smith, Brian E.; Thornton, Peter E.; Wang, Dali; Shi, Xiaoying; Williams, Dean N.

    2013-12-01

    Rapid increases in high performance computing are feeding the development of larger and more complex data sets in climate research, which sets the stage for so-called big data analysis challenges. However, conventional climate analysis techniques are inadequate in dealing with the complexities of today's data. In this paper, we describe and demonstrate a visual analytics system, called the Exploratory Data analysis ENvironment (EDEN), with specific application to the analysis of complex earth system simulation data sets. EDEN represents the type of interactive visual analysis tools that are necessary to transform data into insight, thereby improving critical comprehension of earth system processes. In addition to providing an overview of EDEN, we describe real-world studies using both point ensembles and global Community Land Model Version 4 (CLM4) simulations.

  17. A Visual Analytics Approach for Correlation, Classification, and Regression Analysis

    SciTech Connect

    Steed, Chad A; SwanII, J. Edward; Fitzpatrick, Patrick J.; Jankun-Kelly, T.J.

    2012-02-01

    New approaches that combine the strengths of humans and machines are necessary to equip analysts with the proper tools for exploring today's increasingly complex, multivariate data sets. In this paper, a novel visual data mining framework, called the Multidimensional Data eXplorer (MDX), is described that addresses the challenges of today's data by combining automated statistical analytics with a highly interactive parallel coordinates based canvas. In addition to several intuitive interaction capabilities, this framework offers a rich set of graphical statistical indicators, interactive regression analysis, visual correlation mining, automated axis arrangements and filtering, and data classification techniques. The current work provides a detailed description of the system as well as a discussion of key design aspects and critical feedback from domain experts.
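
    As a flavor of the automated statistics such a framework computes, the sketch below builds a correlation matrix over a multivariate table and surfaces the strongest variable pairs, the kind of result a parallel-coordinates view would then let the analyst inspect. Synthetic data; not the MDX implementation.

      # Rank variable pairs by absolute Pearson correlation.
      import numpy as np
      import pandas as pd

      rng = np.random.default_rng(2)
      df = pd.DataFrame(rng.normal(size=(500, 4)), columns=list("abcd"))
      df["e"] = 0.9 * df["a"] + 0.1 * rng.normal(size=500)  # planted correlation

      corr = df.corr()
      upper = np.triu(np.ones(corr.shape, dtype=bool), k=1)  # unique pairs only
      pairs = corr.where(upper).stack()
      print(pairs.abs().sort_values(ascending=False).head(3))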

  18. A Visual Analytics Approach for Correlation, Classification, and Regression Analysis

    SciTech Connect

    Steed, Chad A; SwanII, J. Edward; Fitzpatrick, Patrick J.; Jankun-Kelly, T.J.

    2013-01-01

    New approaches that combine the strengths of humans and machines are necessary to equip analysts with the proper tools for exploring today's increasingly complex, multivariate data sets. In this paper, a visual data mining framework, called the Multidimensional Data eXplorer (MDX), is described that addresses the challenges of today's data by combining automated statistical analytics with a highly interactive parallel coordinates based canvas. In addition to several intuitive interaction capabilities, this framework offers a rich set of graphical statistical indicators, interactive regression analysis, visual correlation mining, automated axis arrangements and filtering, and data classification techniques. This chapter provides a detailed description of the system as well as a discussion of key design aspects and critical feedback from domain experts.

  19. Jupiter Environment Tool

    NASA Technical Reports Server (NTRS)

    Sturm, Erick J.; Monahue, Kenneth M.; Biehl, James P.; Kokorowski, Michael; Ngalande, Cedrick; Boedeker, Jordan

    2012-01-01

    The Jupiter Environment Tool (JET) is a custom UI plug-in for STK that provides an interface to Jupiter environment models for visualization and analysis. Users can visualize the different magnetic field models of Jupiter through various rendering methods, which are fully integrated within STK's 3D Window. This allows users to take snapshots and make animations of their scenarios with magnetic field visualizations. Analytical data can be accessed in the form of custom vectors. Given these custom vectors, users have access to magnetic field data in custom reports, graphs, access constraints, coverage analysis, and anywhere else vectors are used within STK.

  20. Aquatic concentrations of chemical analytes compared to ...

    EPA Pesticide Factsheets

    We describe screening-level estimates of the potential aquatic toxicity posed by 227 chemical analytes that were measured in 25 ambient water samples collected as part of a joint USGS/USEPA drinking water plant study. Measured concentrations were compared to biological effect concentration (EC) estimates, including USEPA aquatic life criteria, effective plasma concentrations of pharmaceuticals, published toxicity data summarized in the USEPA ECOTOX database, and chemical structure-based predictions. Potential dietary exposures were estimated using a generic 3-tiered food web accumulation scenario. For many analytes, few or no measured effect data were found, and for some analytes, reporting limits exceeded EC estimates, limiting the scope of conclusions. Results suggest occasional occurrence above ECs for copper, aluminum, strontium, lead, uranium, and nitrate. Sparse effect data for manganese, antimony, and vanadium suggest that these analytes may occur above ECs, but additional effect data would be desirable to corroborate EC estimates. These conclusions were not affected by bioaccumulation estimates. No organic analyte concentrations were found to exceed EC estimates, but ten analytes had concentrations in excess of 1/10th of their respective EC: triclocarban, norverapamil, progesterone, atrazine, metolachlor, triclosan, para-nonylphenol, ibuprofen, venlafaxine, and amitriptyline, suggesting that more detailed characterization of these analytes is warranted. Purpose: to provide sc
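
    The screening comparison itself is a simple ratio test, as in this sketch: divide each measured concentration by its effect concentration (EC) and flag exceedances of the EC or of 1/10th of it. The numbers below are placeholders, not the study's data.

      # Flag analytes whose measured concentration approaches or exceeds the EC.
      measured_ug_l = {"copper": 12.0, "atrazine": 0.3, "ibuprofen": 0.05}
      ec_ug_l = {"copper": 9.0, "atrazine": 2.0, "ibuprofen": 1.0}

      for analyte, conc in measured_ug_l.items():
          ratio = conc / ec_ug_l[analyte]
          if ratio >= 1.0:
              flag = "exceeds EC"
          elif ratio >= 0.1:
              flag = "above 1/10 EC"
          else:
              flag = "below screening level"
          print(f"{analyte}: ratio = {ratio:.2f} ({flag})")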

  1. Big data analytics workflow management for eScience

    NASA Astrophysics Data System (ADS)

    Fiore, Sandro; D'Anca, Alessandro; Palazzo, Cosimo; Elia, Donatello; Mariello, Andrea; Nassisi, Paola; Aloisio, Giovanni

    2015-04-01

    In many domains such as climate and astrophysics, scientific data is often n-dimensional and requires tools that support specialized data types and primitives if it is to be properly stored, accessed, analysed and visualized. Currently, scientific data analytics relies on domain-specific software and libraries providing a huge set of operators and functionalities. However, most of this software fails at large scale since it: (i) is desktop based, relies on local computing capabilities, and needs the data locally; (ii) cannot benefit from available multicore/parallel machines since it is based on sequential code; (iii) does not provide declarative languages to express scientific data analysis tasks; and (iv) does not provide newer or more scalable storage models to better support data multidimensionality. Additionally, most of it: (v) is domain-specific, which also means it supports a limited set of data formats, and (vi) does not provide workflow support to enable the construction, execution and monitoring of more complex "experiments". The Ophidia project aims to address most of the challenges highlighted above by providing a big data analytics framework for eScience. Ophidia provides several parallel operators to manipulate large datasets. Some relevant examples include: (i) data sub-setting (slicing and dicing), (ii) data aggregation, (iii) array-based primitives (the same operator applies to all the implemented UDF extensions), (iv) data cube duplication, (v) data cube pivoting, (vi) NetCDF import and export. Metadata operators are available too. Additionally, the Ophidia framework provides array-based primitives to perform data sub-setting, data aggregation (i.e. max, min, avg), array concatenation, algebraic expressions and predicate evaluation on large arrays of scientific data. Bit-oriented plugins have also been implemented to manage binary data cubes. Defining processing chains and workflows with tens or hundreds of data analytics operators is the
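
    The array primitives listed above (slicing and dicing, aggregation, concatenation) have familiar n-dimensional semantics, shown here with numpy on a toy cube; this illustrates the operations only and does not use Ophidia's parallel operators or API.

      # Sub-setting, aggregation, and concatenation on a time x lat x lon cube.
      import numpy as np

      cube = np.arange(24.0).reshape(4, 2, 3)       # 4 time steps, 2x3 grid
      sliced = cube[0]                              # slicing: fix one dimension
      diced = cube[1:3, :, 0:2]                     # dicing: sub-ranges
      time_max = cube.max(axis=0)                   # aggregation over time
      combined = np.concatenate([cube, cube], axis=0)
      print(sliced.shape, diced.shape, time_max.shape, combined.shape)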

  2. Advances in analytical chemistry

    NASA Technical Reports Server (NTRS)

    Arendale, W. F.; Congo, Richard T.; Nielsen, Bruce J.

    1991-01-01

    Implementation of computer programs based on multivariate statistical algorithms makes possible obtaining reliable information from long data vectors that contain large amounts of extraneous information, for example, noise and/or analytes that we do not wish to control. Three examples are described. Each of these applications requires the use of techniques characteristic of modern analytical chemistry. The first example, using a quantitative or analytical model, describes the determination of the acid dissociation constant for 2,2'-pyridyl thiophene using archived data. The second example describes an investigation to determine the active biocidal species of iodine in aqueous solutions. The third example is taken from a research program directed toward advanced fiber-optic chemical sensors. The second and third examples require heuristic or empirical models.

  3. Competing on talent analytics.

    PubMed

    Davenport, Thomas H; Harris, Jeanne; Shapiro, Jeremy

    2010-10-01

    Do investments in your employees actually affect workforce performance? Who are your top performers? How can you empower and motivate other employees to excel? Leading-edge companies such as Google, Best Buy, Procter & Gamble, and Sysco use sophisticated data-collection technology and analysis to answer these questions, leveraging a range of analytics to improve the way they attract and retain talent, connect their employee data to business performance, differentiate themselves from competitors, and more. The authors present the six key ways in which companies track, analyze, and use data about their people-ranging from a simple baseline of metrics to monitor the organization's overall health to custom modeling for predicting future head count depending on various "what if" scenarios. They go on to show that companies competing on talent analytics manage data and technology at an enterprise level, support what analytical leaders do, choose realistic targets for analysis, and hire analysts with strong interpersonal skills as well as broad expertise.

  4. Mission Stream Analysis - Delta Analytic Model. Revision

    DTIC Science & Technology

    2014-09-01

    demonstrating mission effectiveness. The second tool is the Δ (Delta) Analytic Model, which provides an approach for identifying disparate ... mission and system capability requirements, translating them into a system’s technical performance and operator workload requirements, and helping minimize the “delta” between domains across the system’s ...

  5. ANALYTICAL TOOLS INTERFACE FOR LANDSCAPE ASSESSMENTS (ATTILA) USER MANUAL

    EPA Science Inventory

    ATtILA is an ArcView extension that allows users to easily calculate many common landscape metrics. GIS expertise is not required, but some experience with ArcView is recommended. Four metric groups are currently included in ATtILA: landscape characteristics, riparian characteris...

  6. Immunoassay as an analytical tool in agricultural biotechnology.

    PubMed

    Grothaus, G David; Bandla, Murali; Currier, Thomas; Giroux, Randal; Jenkins, G Ronald; Lipp, Markus; Shan, Guomin; Stave, James W; Pantella, Virginia

    2006-01-01

    Immunoassays for biotechnology engineered proteins are used by AgBiotech companies at numerous points in product development and by feed and food suppliers for compliance and contractual purposes. Although AgBiotech companies use the technology during product development and seed production, other stakeholders from the food and feed supply chains, such as commodity, food, and feed companies, as well as third-party diagnostic testing companies, also rely on immunoassays for a number of purposes. The primary use of immunoassays is to verify the presence or absence of genetically modified (GM) material in a product or to quantify the amount of GM material present in a product. This article describes the fundamental elements of GM analysis using immunoassays and especially its application to the testing of grains. The 2 most commonly used formats are lateral flow devices (LFD) and plate-based enzyme-linked immunosorbent assays (ELISA). The main applications of both formats are discussed in general, and the benefits and drawbacks are discussed in detail. The document highlights the many areas to which attention must be paid in order to produce reliable test results. These include sample preparation, method validation, choice of appropriate reference materials, and biological and instrumental sources of error. The article also discusses issues related to the analysis of different matrixes and the effects they may have on the accuracy of the immunoassays.

  7. GreenSCOR: Developing a Green Supply Chain Analytical Tool

    DTIC Science & Technology

    2003-03-01

    An emerging area in supply chain practice is green supply chain management, which integrates environmental management with traditional supply chain management. ... GreenSCOR is the solution to closing this gap. GreenSCOR is a modification of version 5.0 of the Supply Chain Operations Reference (SCOR) model developed by the Supply-Chain Council (SCC). LMI used SCOR as a foundation because it has been proven over several years of continual development.

  8. Analytical Tools for Investigating and Modeling Agent-Based Systems

    DTIC Science & Technology

    2005-06-01

    [Snippet residue from the report's citation-analysis tables: lists of high-energy physicists (Edward Witten, Juan M. Maldacena, Steven S. Gubser, Igor R. Klebanov, Leonard Susskind, Joseph Polchinski, Cumrun Vafa, Nathan Seiberg, Andrew Strominger, Michael R. Douglas) with paper and citation counts, plus a note that Witten is a MacArthur Foundation fellow, a Fields medalist, and a Dirac fellow, and that Maldacena, also a MacArthur Foundation fellow, is younger.]

  9. Horses for courses: analytical tools to explore planetary boundaries

    NASA Astrophysics Data System (ADS)

    van Vuuren, D. P.; Lucas, P. L.; Häyhä, T.; Cornell, S. E.; Stafford-Smith, M.

    2015-09-01

    There is a need for further integrated research on developing a set of sustainable development objectives based on the proposed framework of planetary boundaries indicators. The relevant research questions are divided in this paper into four key categories, related to (1) the underlying processes and the selection of key indicators, (2) understanding the impacts of different exposure levels and the influence of connections between different types of impacts, (3) a better understanding of different response strategies, and (4) the available options to implement changes. Clearly, different categories of scientific disciplines and associated models exist that can contribute to the necessary analysis, though the distinctions between them are fuzzy. In the paper, we indicate both how different models relate to the four categories of questions and how further insights can be obtained by connecting the different disciplines (without necessarily fully integrating them). Research on integration can support planetary boundary quantification in a credible way, linking human drivers and social and biophysical impacts.

  10. Horses for courses: analytical tools to explore planetary boundaries

    NASA Astrophysics Data System (ADS)

    van Vuuren, Detlef P.; Lucas, Paul L.; Häyhä, Tiina; Cornell, Sarah E.; Stafford-Smith, Mark

    2016-03-01

    There is a need for more integrated research on sustainable development and global environmental change. In this paper, we focus on the planetary boundaries framework to provide a systematic categorization of key research questions in relation to avoiding severe global environmental degradation. The four categories of key questions are those that relate to (1) the underlying processes and selection of key indicators for planetary boundaries, (2) understanding the impacts of environmental pressure and connections between different types of impacts, (3) better understanding of different response strategies to avoid further degradation, and (4) the available instruments to implement such strategies. Clearly, different categories of scientific disciplines and associated model types exist that can accommodate answering these questions. We identify the strengths and weaknesses of different research areas in relation to the question categories, focusing specifically on different types of models. We argue that more interdisciplinary research is needed to increase our understanding by better linking human drivers and social and biophysical impacts. This requires better collaboration between relevant disciplines (associated with the model types), either by exchanging information or by fully linking or integrating them. As fully integrated models can become too complex, the appropriate type of model (the racehorse) should be applied to answer the target research question (the race course).

  11. Giardia/giardiasis - a perspective on diagnostic and analytical tools.

    PubMed

    Koehler, Anson V; Jex, Aaron R; Haydon, Shane R; Stevens, Melita A; Gasser, Robin B

    2014-01-01

    Giardiasis is a gastrointestinal disease of humans and other animals caused by species of parasitic protists of the genus Giardia. This disease is transmitted mainly via the faecal-oral route (e.g., in water or food) and is of socioeconomic importance worldwide. The accurate detection and genetic characterisation of the different species and population variants (usually referred to as assemblages and/or sub-assemblages) of Giardia are central to understanding their transmission patterns and host spectra. The present article provides a background on Giardia and giardiasis, and reviews some key techniques employed for the identification and genetic characterisation of Giardia in biological samples, the diagnosis of infection and the analysis of genetic variation within and among species of Giardia. Advances in molecular techniques provide a solid basis for investigating the systematics, population genetics, ecology and epidemiology of Giardia species and genotypes as well as the prevention and control of giardiasis.

  12. Monitoring the analytic surface.

    PubMed

    Spence, D P; Mayes, L C; Dahl, H

    1994-01-01

    How do we listen during an analytic hour? Systematic analysis of the speech patterns of one patient (Mrs. C.) strongly suggests that the clustering of shared pronouns (e.g., you/me) represents an important aspect of the analytic surface, preconsciously sensed by the analyst and used by him to determine when to intervene. Sensitivity to these patterns increases over the course of treatment, and in a final block of 10 hours shows a striking degree of contingent responsivity: specific utterances by the patient are consistently echoed by the analyst's interventions.

  13. Frontiers in analytical chemistry

    SciTech Connect

    Amato, I.

    1988-12-15

    Doing more with less was the modus operandi of R. Buckminster Fuller, the late science genius and inventor of such things as the geodesic dome. In late September, chemists described their own version of this maxim--learning more chemistry from less material and in less time--in a symposium titled Frontiers in Analytical Chemistry at the 196th National Meeting of the American Chemical Society in Los Angeles. Symposium organizer Allen J. Bard of the University of Texas at Austin assembled six speakers, himself among them, to survey widely different areas of analytical chemistry.

  14. Exhaled breath condensate – from an analytical point of view

    PubMed Central

    Dodig, Slavica; Čepelak, Ivana

    2013-01-01

    Over the past three decades, the goal of many researchers has been the analysis of exhaled breath condensate (EBC) as a noninvasively obtained sample. Total quality in the laboratory diagnostic process of EBC analysis was investigated across the pre-analytical (formation, collection, storage of EBC), analytical (sensitivity of applied methods, standardization) and post-analytical (interpretation of results) phases. EBC analysis is still used as a research tool. Limitations in the pre-analytical, analytical, and post-analytical phases of EBC analysis are numerous: concentrations of EBC constituents are low, single-analyte methods lack sensitivity, multi-analyte methods have not been fully explored, and reference values are not established. When all pre-analytical, analytical and post-analytical requirements are met, EBC biomarkers as well as biomarker patterns can be selected, and EBC analysis can hopefully be used in clinical practice, in both diagnosis and the longitudinal follow-up of patients, resulting in better disease outcomes. PMID:24266297

  15. Freeform surface of progressive addition lens represented by Zernike polynomials

    NASA Astrophysics Data System (ADS)

    Li, Yiyu; Xia, Risheng; Chen, Jiaojie; Feng, Haihua; Yuan, Yimin; Zhu, Dexi; Li, Chaohong

    2016-10-01

    We used the explicit expression of Zernike polynomials in Cartesian coordinates to fit and describe the freeform surface of a progressive addition lens (PAL). The derivatives of the Zernike polynomials can easily be calculated from the explicit expression and used to compute the principal curvatures of the freeform surface based on differential geometry. The surface spherical power and surface astigmatism of the freeform surface were successfully derived from the principal curvatures. Comparison with the traditional analytical method shows that Zernike polynomials up to order 20 are sufficient to represent the freeform surface with nanometer accuracy, provided the original surface is densely sampled. Therefore, the data files containing the massive number of sampling points of the freeform surface needed to generate the diamond-tool-tip trajectory required by the diamond machine for PAL manufacture can be reduced to a few Zernike coefficients.
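
    The curvature pipeline described above can be sketched numerically: given a sag z(x, y), form first and second derivatives, compute mean (H) and Gaussian (K) curvature, and convert the principal curvatures to surface power. The sketch below uses numerical gradients on a spherical test surface in place of the analytic Zernike derivatives, and assumes a refractive index of 1.523; it is illustrative, not the authors' code.

      # Surface power and astigmatism from principal curvatures of a sag map.
      import numpy as np

      n_index = 1.523                      # assumed lens refractive index
      R = 0.085                            # assumed radius of curvature, m
      x = y = np.linspace(-0.01, 0.01, 201)
      X, Y = np.meshgrid(x, y)
      Z = R - np.sqrt(R**2 - X**2 - Y**2)  # spherical sag as a test surface

      dy, dx = y[1] - y[0], x[1] - x[0]
      Zy, Zx = np.gradient(Z, dy, dx)      # first derivatives (rows are y)
      Zyy, Zyx = np.gradient(Zy, dy, dx)   # second derivatives
      Zxy, Zxx = np.gradient(Zx, dy, dx)

      w = 1.0 + Zx**2 + Zy**2
      H = ((1 + Zy**2) * Zxx - 2 * Zx * Zy * Zxy + (1 + Zx**2) * Zyy) / (2 * w**1.5)
      K = (Zxx * Zyy - Zxy**2) / w**2
      disc = np.sqrt(np.maximum(H**2 - K, 0.0))
      k1, k2 = H + disc, H - disc          # principal curvatures

      c = Z.shape[0] // 2                  # evaluate at the vertex
      power = (n_index - 1.0) * (k1[c, c] + k2[c, c]) / 2.0
      astig = (n_index - 1.0) * abs(k1[c, c] - k2[c, c])
      print(f"surface power = {power:.2f} D, astigmatism = {astig:.3f} D")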

  16. Summary of NDE of Additive Manufacturing Efforts in NASA

    NASA Technical Reports Server (NTRS)

    Waller, Jess; Saulsberry, Regor; Parker, Bradford; Hodges, Kenneth; Burke, Eric; Taminger, Karen

    2014-01-01

    (1) General Rationale for Additive Manufacturing (AM): (a) Operate under a 'design-to-constraint' paradigm, making parts too complicated to fabricate otherwise; (b) Reduce weight by 20 percent with monolithic parts; (c) Reduce waste (green manufacturing); (d) Eliminate reliance on Original Equipment Manufacturers for critical spares; and (e) Extend the life of in-service parts by innovative repair methods. (2) NASA OSMA NDE of AM State-of-the-Discipline Report. (3) Overview of NASA AM Efforts at Various Centers: (a) Analytical Tools, (b) Ground-Based Fabrication, (c) Space-Based Fabrication, and (d) Center Activity Summaries. (4) Overview of NASA NDE data to date on AM parts. (5) Gap Analysis/Recommendations for NDE of AM.

  17. Analytical Services Management System

    SciTech Connect

    Church, Shane; Nigbor, Mike; Hillman, Daniel

    2005-03-30

    Analytical Services Management System (ASMS) provides sample management services. Sample management includes sample planning for analytical requests, sample tracking for shipping and receiving by the laboratory, receipt of the analytical data deliverable, processing of the deliverable, and payment of the laboratory conducting the analyses. ASMS is a web-based application that provides the ability to manage these activities at multiple locations for different customers. ASMS provides for the assignment of single or multiple samples for standard chemical and radiochemical analyses. ASMS is a flexible system that allows users to request analyses by line item code. Line item codes are selected based on the Basic Ordering Agreement (BOA) format for contracting with participating laboratories. ASMS also allows contracting with non-BOA laboratories using a similar line-item-code contracting format for their services. ASMS allows sample and analysis tracking from sample planning and collection in the field through sample shipment, laboratory sample receipt, laboratory analysis and submittal of the requested analyses, electronic data transfer, and payment of the laboratories for the completed analyses. The software, when in operation, contains business-sensitive material that is used as a principal portion of the Kaiser Analytical Management Services business model. The software version provided is the most recent version; however, this copy of the application does not contain business-sensitive data from the associated Oracle tables, such as contract information or price per line item code.

  18. Analytics: Changing the Conversation

    ERIC Educational Resources Information Center

    Oblinger, Diana G.

    2013-01-01

    In this third and concluding discussion on analytics, the author notes that we live in an information culture. We are accustomed to having information instantly available and accessible, along with feedback and recommendations. We want to know what people think and like (or dislike). We want to know how we compare with "others like me."…

  19. Analytical Chemistry Laboratory

    NASA Technical Reports Server (NTRS)

    Anderson, Mark

    2013-01-01

    The Analytical Chemistry and Material Development Group maintains a capability in chemical analysis, materials R&D, failure analysis, and contamination control. The uniquely qualified staff and facility support the needs of flight projects, science instrument development and various technical tasks, as well as Cal Tech.

  20. Social Learning Analytics

    ERIC Educational Resources Information Center

    Buckingham Shum, Simon; Ferguson, Rebecca

    2012-01-01

    We propose that the design and implementation of effective "Social Learning Analytics (SLA)" present significant challenges and opportunities for both research and enterprise, in three important respects. The first is that the learning landscape is extraordinarily turbulent at present, in no small part due to technological drivers.…

  1. Predictive Data Tools Find Uses in Schools

    ERIC Educational Resources Information Center

    Sparks, Sarah D.

    2011-01-01

    The use of analytic tools to predict student performance is exploding in higher education, and experts say the tools show even more promise for K-12 schools, in everything from teacher placement to dropout prevention. Use of such statistical techniques is hindered in precollegiate schools, however, by a lack of researchers trained to help…

  2. Tools for Educational Data Mining: A Review

    ERIC Educational Resources Information Center

    Slater, Stefan; Joksimovic, Srecko; Kovanovic, Vitomir; Baker, Ryan S.; Gasevic, Dragan

    2017-01-01

    In recent years, a wide array of tools have emerged for the purposes of conducting educational data mining (EDM) and/or learning analytics (LA) research. In this article, we hope to highlight some of the most widely used, most accessible, and most powerful tools available for the researcher interested in conducting EDM/LA research. We will…

  3. Tool to Prioritize Energy Efficiency Investments

    SciTech Connect

    Farese, Philip; Gelman, Rachel; Hendron, Robert

    2012-08-01

    To provide analytic support to the U.S. Department of Energy's Office of the Building Technology Program (BTP), NREL developed a Microsoft Excel-based tool to provide an open and objective comparison of the hundreds of investment opportunities available to BTP. This tool uses established methodologies to evaluate the energy savings and the cost of those savings.
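
    One established way to make such comparisons, sketched below, is to annualize each measure's cost with a capital recovery factor and divide by its annual energy savings (the cost of conserved energy), then rank. The figures are placeholders, and this is not the NREL tool's implementation.

      # Rank efficiency measures by cost of conserved energy ($/kWh saved).
      def cost_of_conserved_energy(cost, annual_kwh_saved, life_years, rate=0.05):
          crf = rate / (1.0 - (1.0 + rate) ** -life_years)  # capital recovery factor
          return cost * crf / annual_kwh_saved

      measures = {
          "LED retrofit": (12000.0, 60000.0, 10),       # cost, kWh/yr, years
          "wall insulation": (30000.0, 45000.0, 25),
      }
      for name in sorted(measures, key=lambda m: cost_of_conserved_energy(*measures[m])):
          print(name, round(cost_of_conserved_energy(*measures[name]), 3), "$/kWh")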

  4. The functional connectivity of the human caudate: an application of meta-analytic connectivity modeling with behavioral filtering.

    PubMed

    Robinson, Jennifer L; Laird, Angela R; Glahn, David C; Blangero, John; Sanghera, Manjit K; Pessoa, Luiz; Fox, P Mickle; Uecker, Angela; Friehs, Gerhard; Young, Keith A; Griffin, Jennifer L; Lovallo, William R; Fox, Peter T

    2012-03-01

    Meta-analysis based techniques are emerging as powerful, robust tools for developing models of connectivity in functional neuroimaging. Here, we apply meta-analytic connectivity modeling to the human caudate to 1) develop a model of functional connectivity, 2) determine if meta-analytic methods are sufficiently sensitive to detect behavioral domain specificity within region-specific functional connectivity networks, and 3) compare meta-analytic driven segmentation to structural connectivity parcellation using diffusion tensor imaging. Results demonstrate strong coherence between meta-analytic and data-driven methods. Specifically, we found that behavioral filtering resulted in cognition and emotion related structures and networks primarily localized to the head of the caudate nucleus, while perceptual and action specific regions localized to the body of the caudate, consistent with early models of nonhuman primate histological studies and postmortem studies in humans. Diffusion tensor imaging (DTI) revealed support for meta-analytic connectivity modeling's (MACM) utility in identifying both direct and indirect connectivity. Our results provide further validation of meta-analytic connectivity modeling, while also highlighting an additional potential, namely the extraction of behavioral domain specific functional connectivity.

  5. PV Hourly Simulation Tool

    SciTech Connect

    Dean, Jesse; Metzger, Ian

    2010-12-31

    This software requires inputs of simple, general building characteristics and usage information to calculate the energy and cost benefits of solar PV. The tool conducts a complex hourly simulation of solar PV based primarily on the area available on the rooftop. It uses a simplified efficiency calculation method and real panel characteristics. It includes a detailed rate structure to account for time-of-use rates, on-peak and off-peak pricing, and multiple rate seasons. The tool includes the option for advanced system design inputs if they are known. It calculates energy savings, demand reduction, cost savings, incentives, and building life-cycle costs, including simple payback, discounted payback, net present value, and savings-to-investment ratio. In addition, the tool displays the environmental benefits of a project.
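
    The life-cycle cost metrics named above reduce to a few standard formulas, sketched here with placeholder cash flows; this is not the tool's implementation.

      # Simple payback, net present value, and savings-to-investment ratio.
      def metrics(install_cost, annual_savings, life_years, discount=0.06):
          simple_payback = install_cost / annual_savings
          pv_savings = sum(annual_savings / (1.0 + discount) ** t
                           for t in range(1, life_years + 1))
          npv = pv_savings - install_cost                # net present value
          sir = pv_savings / install_cost                # savings-to-investment
          return simple_payback, npv, sir

      payback, npv, sir = metrics(50000.0, 6500.0, 25)
      print(f"payback = {payback:.1f} yr, NPV = ${npv:,.0f}, SIR = {sir:.2f}")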

  6. Imaging and Analytics: The changing face of Medical Imaging

    NASA Astrophysics Data System (ADS)

    Foo, Thomas

    There have been significant technological advances in imaging capability over the past 40 years. Medical imaging capabilities have developed rapidly, along with technology development in computational processing speed and miniaturization. With the move to all-digital imaging, the number of images acquired in a routine clinical examination has increased dramatically, from under 50 images in the early days of CT and MRI to more than 500-1000 images today. The staggering number of images that are routinely acquired poses significant challenges for clinicians to interpret the data and to correctly identify the clinical problem. Although the time provided to render a clinical finding has not substantially changed, the amount of data available for interpretation has grown exponentially. In addition, the image quality (spatial resolution) and information content (physiologically dependent image contrast) have also increased significantly with advances in medical imaging technology. On its current trajectory, medical imaging in the traditional sense is unsustainable. To assist in filtering and extracting the most relevant data elements from medical imaging, image analytics will have a much larger role. Automated image segmentation, generation of parametric image maps, and clinical decision support tools will be needed and developed apace to allow the clinician to manage, extract and utilize only the information that will help improve diagnostic accuracy and sensitivity. As medical imaging devices continue to improve in spatial resolution and in functional and anatomical information content, image/data analytics will become more ubiquitous and integral to medical imaging capability.

  7. TLD efficiency calculations for heavy ions: an analytical approach

    SciTech Connect

    Boscolo, Daria; Scifoni, Emanuele; Carlino, Antonio; La Tessa, Chiara; Berger, Thomas; Durante, Marco; Rosso, Valeria; Krämer, Michael

    2015-12-18

    The use of thermoluminescent dosimeters (TLDs) in heavy charged particle dosimetry is limited by their non-linear dose response curve and by their response dependence on the radiation quality. Thus, in order to use TLDs with particle beams, a model that can reproduce the behavior of these detectors under different conditions is needed. Here a new, simple and completely analytical algorithm for the calculation of the relative TL efficiency depending on the ion charge Z and energy E is presented. In addition, the detector response is evaluated starting from the single-ion case, where the computed effectiveness values have been compared with experimental data as well as with predictions from a different method. The main advantage of this approach is that, being fully analytical, it is computationally fast and can be efficiently integrated into treatment planning verification tools. In conclusion, the calculated efficiency values were then implemented in the treatment planning code TRiP98, and dose calculations on a macroscopic target irradiated with an extended carbon ion field were performed and verified against experimental data.

  8. TLD efficiency calculations for heavy ions: an analytical approach

    DOE PAGES

    Boscolo, Daria; Scifoni, Emanuele; Carlino, Antonio; ...

    2015-12-18

    The use of thermoluminescent dosimeters (TLDs) in heavy charged particle dosimetry is limited by their non-linear dose response curve and by their response dependence on the radiation quality. Thus, in order to use TLDs with particle beams, a model that can reproduce the behavior of these detectors under different conditions is needed. Here a new, simple and completely analytical algorithm for the calculation of the relative TL efficiency depending on the ion charge Z and energy E is presented. In addition, the detector response is evaluated starting from the single-ion case, where the computed effectiveness values have been compared with experimental data as well as with predictions from a different method. The main advantage of this approach is that, being fully analytical, it is computationally fast and can be efficiently integrated into treatment planning verification tools. In conclusion, the calculated efficiency values were then implemented in the treatment planning code TRiP98, and dose calculations on a macroscopic target irradiated with an extended carbon ion field were performed and verified against experimental data.

  9. Robot Tools

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Mecanotron, now division of Robotics and Automation Corporation, developed a quick-change welding method called the Automatic Robotics Tool-change System (ARTS) under Marshall Space Flight Center and Rockwell International contracts. The ARTS system has six tool positions ranging from coarse sanding disks and abrasive wheels to cloth polishing wheels with motors of various horsepower. The system is used by fabricators of plastic body parts for the auto industry, by Texas Instruments for making radar domes, and for advanced composites at Aerospatiale in France.

  10. Management Tools

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Manugistics, Inc. (formerly AVYX, Inc.) has introduced a new programming language for IBM and IBM-compatible computers called TREES-pls. It is a resource management tool originating from the space shuttle that can be used in such applications as scheduling, resource allocation, project control, information management, and artificial intelligence. Manugistics, Inc. was looking for a flexible tool that could be applied to many problems with minimal adaptation. Among the non-government markets are aerospace, other manufacturing, transportation, health care, food and beverage, and professional services.

  11. Analytic Measures for Evaluating Managerial Writing.

    ERIC Educational Resources Information Center

    Rogers, Priscilla S.

    1994-01-01

    Describes the addition of a writing performance assessment to the Graduate Management Admission Test (GMAT), and how development of the Analysis of Argument measure and the Persuasive Adaptiveness measure helps explain the holistic writing score given during grading of the GMAT. Correlates holistic and analytic scores, revealing a positive…

  12. Requirements for Predictive Analytics

    SciTech Connect

    Troy Hiltbrand

    2012-03-01

    It is important to have a clear understanding of how traditional Business Intelligence (BI) and analytics differ and how they fit together in optimizing organizational decision making. With traditional BI, activities are focused primarily on providing context to enhance a known set of information through aggregation, data cleansing and delivery mechanisms. As these organizations mature their BI ecosystems, they achieve a clearer picture of the key performance indicators signaling the relative health of their operations. Organizations that embark on activities surrounding predictive analytics and data mining go beyond simply presenting the data in a manner that allows decision makers to have complete context around the information. These organizations generate models based on known information and then apply other organizational data against these models to reveal unknown information.
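
    The contrast drawn above, reporting on known data versus scoring new data against a fitted model, can be made concrete in a few lines; the scikit-learn sketch below uses synthetic data and is purely illustrative.

      # Fit a model on labeled history, then score previously unseen records.
      from sklearn.datasets import make_classification
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import train_test_split

      X, y = make_classification(n_samples=500, n_features=8, random_state=0)
      X_known, X_new, y_known, _ = train_test_split(X, y, test_size=0.2,
                                                    random_state=0)

      model = LogisticRegression(max_iter=1000).fit(X_known, y_known)
      scores = model.predict_proba(X_new)[:, 1]   # predictions on new data
      print("first five predicted probabilities:", scores[:5].round(2))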

  13. Multifunctional nanoparticles: analytical prospects.

    PubMed

    de Dios, Alejandro Simón; Díaz-García, Marta Elena

    2010-05-07

    Multifunctional nanoparticles are among the most exciting nanomaterials with promising applications in analytical chemistry. These applications include (bio)sensing, (bio)assays, catalysis and separations. Although most of these applications are based on the magnetic, optical and electrochemical properties of multifunctional nanoparticles, other aspects such as the synergistic effect of the functional groups and the amplification effect associated with the nanoscale dimension have also been observed. Considering not only the nature of the raw material but also the shape, there is a huge variety of nanoparticles. In this review only magnetic, quantum dots, gold nanoparticles, carbon and inorganic nanotubes as well as silica, titania and gadolinium oxide nanoparticles are addressed. This review presents a narrative summary on the use of multifunctional nanoparticles for analytical applications, along with a discussion on some critical challenges existing in the field and possible solutions that have been or are being developed to overcome these challenges.

  14. Avatars in Analytical Gaming

    SciTech Connect

    Cowell, Andrew J.; Cowell, Amanda K.

    2009-08-29

    This paper discusses the design and use of anthropomorphic computer characters as non-player characters (NPCs) within analytical games. These new environments allow avatars to play a central role in supporting training and education goals instead of playing a supporting-cast role. This new ‘science’ of gaming, driven by high-powered but inexpensive computers, dedicated graphics processors and realistic game engines, enables game developers to create learning and training opportunities on par with expensive real-world training scenarios. However, care and attention must be placed on how avatars are represented and thus perceived. A taxonomy of non-verbal behavior is presented and its application to analytical gaming discussed.

  15. Nuclear analytical chemistry

    SciTech Connect

    Brune, D.; Forkman, B.; Persson, B.

    1984-01-01

    This book covers the general theories and techniques of nuclear chemical analysis, directed at applications in analytical chemistry, nuclear medicine, radiophysics, agriculture, environmental sciences, geological exploration, industrial process control, etc. The main principles of nuclear physics and nuclear detection on which the analysis is based are briefly outlined. An attempt is made to emphasise the fundamentals of activation analysis, detection and activation methods, as well as their applications. The book provides guidance in analytical chemistry, agriculture, environmental and biomedical sciences, etc. The contents include: the nuclear periodic system; nuclear decay; nuclear reactions; nuclear radiation sources; interaction of radiation with matter; principles of radiation detectors; nuclear electronics; statistical methods and spectral analysis; methods of radiation detection; neutron activation analysis; charged particle activation analysis; photon activation analysis; sample preparation and chemical separation; nuclear chemical analysis in biological and medical research; the use of nuclear chemical analysis in the field of criminology; nuclear chemical analysis in environmental sciences, geology and mineral exploration; and radiation protection.

  16. Ultrasound in analytical chemistry.

    PubMed

    Priego Capote, F; Luque de Castro, M D

    2007-01-01

    Ultrasound is a type of energy which can help analytical chemists in almost all their laboratory tasks, from cleaning to detection. A generic view of the different steps which can be assisted by ultrasound is given here. These steps include preliminary operations usually not considered in most analytical methods (e.g. cleaning, degassing, and atomization), sample preparation being the main area of application. In sample preparation ultrasound is used to assist solid-sample treatment (e.g. digestion, leaching, slurry formation) and liquid-sample preparation (e.g. liquid-liquid extraction, emulsification, homogenization) or to promote heterogeneous sample treatment (e.g. filtration, aggregation, dissolution of solids, crystallization, precipitation, defoaming, degassing). Detection techniques based on use of ultrasonic radiation, the principles on which they are based, responses, and the quantities measured are also discussed.

  17. Mixed Initiative Visual Analytics Using Task-Driven Recommendations

    SciTech Connect

    Cook, Kristin A.; Cramer, Nicholas O.; Israel, David; Wolverton, Michael J.; Bruce, Joseph R.; Burtner, Edwin R.; Endert, Alexander

    2015-12-07

    Visual data analysis is composed of a collection of cognitive actions and tasks to decompose, internalize, and recombine data to produce knowledge and insight. Visual analytic tools provide interactive visual interfaces to data to support tasks involved in discovery and sensemaking, including forming hypotheses, asking questions, and evaluating and organizing evidence. Myriad analytic models can be incorporated into visual analytic systems, at the cost of increasing complexity in the analytic discourse between user and system. Techniques exist to increase the usability of interacting with such analytic models, such as inferring data models from user interactions to steer the underlying models of the system via semantic interaction, shielding users from having to do so explicitly. Such approaches are often also referred to as mixed-initiative systems. Researchers studying the sensemaking process have called for development of tools that facilitate analytic sensemaking through a combination of human and automated activities. However, design guidelines do not exist for mixed-initiative visual analytic systems to support iterative sensemaking. In this paper, we present a candidate set of design guidelines and introduce the Active Data Environment (ADE) prototype, a spatial workspace supporting the analytic process via task recommendations invoked by inferences on user interactions within the workspace. ADE recommends data and relationships based on a task model, enabling users to co-reason with the system about their data in a single, spatial workspace. This paper provides an illustrative use case, a technical description of ADE, and a discussion of the strengths and limitations of the approach.
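
    One simple way to ground such task recommendations, sketched below, is to infer interest from the documents an analyst has interacted with and recommend the most similar untouched item; this is an illustrative stand-in, not ADE's task model or implementation.

      # Recommend the document most similar to those the analyst has touched.
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.metrics.pairwise import cosine_similarity

      docs = ["shipment delayed at port", "invoice issued for shipment",
              "port authority inspection report", "quarterly earnings summary"]
      touched = [0]                             # analyst opened document 0

      tfidf = TfidfVectorizer().fit_transform(docs)
      scores = cosine_similarity(tfidf[touched], tfidf).ravel()
      scores[touched] = -1.0                    # exclude already-touched items
      print("recommend document:", int(scores.argmax()))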

  18. Analytic Modeling of Insurgencies

    DTIC Science & Technology

    2014-08-01

    influenced by interests and utilities. 4.1 Carrots and Sticks: An analytic model that captures the aforementioned utilitarian aspect is presented in ... “carrots” x. A dynamic utility-based model is developed in [26], in which the state variables are the fractions of contrarians (supporters of the ...). [Snippet residue from the report’s bibliography: “Unanticipated Political Revolution,” Public Choice, vol. 61, pp. 41-74, 1989; [26] M. P. Atkinson, M. Kress and R. Szechtman, “Carrots, Sticks and Fog” ...]

  19. Industrial Analytics Corporation

    SciTech Connect

    Industrial Analytics Corporation

    2004-01-30

    The lost foam casting process is sensitive to the properties of the EPS patterns used for the casting operation. In this project Industrial Analytics Corporation (IAC) has developed a new low voltage x-ray instrument for x-ray radiography of very low mass EPS patterns. IAC has also developed a transmitted visible light method for characterizing the properties of EPS patterns. The systems developed are also applicable to other low density materials including graphite foams.

  20. Integrated Array/Metadata Analytics

    NASA Astrophysics Data System (ADS)

    Misev, Dimitar; Baumann, Peter

    2015-04-01

    Data comes in various forms and types, and integration usually presents a problem that is often simply ignored and solved with ad-hoc solutions. Multidimensional arrays are a ubiquitous data type that we find at the core of virtually all science and engineering domains, as sensor, model, image, and statistics data. Naturally, arrays are richly described by and intertwined with additional metadata (alphanumeric relational data, XML, JSON, etc.). Database systems, however, a fundamental building block of what we call "Big Data", lack adequate support for modelling and expressing these array data/metadata relationships. Array analytics is hence quite primitive or non-existent in modern relational DBMSs. Recognizing this, we extended SQL with a new SQL/MDA part, seamlessly integrating multidimensional array analytics into the standard database query language. We demonstrate the benefits of SQL/MDA with real-world examples executed in ASQLDB, an open-source mediator system based on HSQLDB and rasdaman that already implements SQL/MDA.

  1. Geometric reasoning about assembly tools

    SciTech Connect

    Wilson, R.H.

    1997-01-01

    Planning for assembly requires reasoning about various tools used by humans, robots, or other automation to manipulate, attach, and test parts and subassemblies. This paper presents a general framework to represent and reason about geometric accessibility issues for a wide variety of such assembly tools. Central to the framework is a use volume encoding a minimum space that must be free in an assembly state to apply a given tool, and placement constraints on where that volume must be placed relative to the parts on which the tool acts. Determining whether a tool can be applied in a given assembly state is then reduced to an instance of the FINDPLACE problem. In addition, the author presents more efficient methods to integrate the framework into assembly planning. For tools that are applied either before or after their target parts are mated, one method pre-processes a single tool application for all possible states of assembly of a product in polynomial time, reducing all later state-tool queries to evaluations of a simple expression. For tools applied after their target parts are mated, a complementary method guarantees polynomial-time assembly planning. The author presents a wide variety of tools that can be described adequately using the approach, and surveys tool catalogs to determine coverage of standard tools. Finally, the author describes an implementation of the approach in an assembly planning system and experiments with a library of over one hundred manual and robotic tools and several complex assemblies.

  2. Transcutaneous Measurement of Blood Analyte Concentration Using Raman Spectroscopy

    NASA Astrophysics Data System (ADS)

    Barman, Ishan; Singh, Gajendra P.; Dasari, Ramachandra R.; Feld, Michael S.

    2008-11-01

    Diabetes mellitus is a chronic disorder, affecting nearly 200 million people worldwide. Acute complications, such as hypoglycemia, cardiovascular disease and retinal damage, may occur if the disease is not adequately controlled. As diabetes has no known cure, tight control of glucose levels is critical for the prevention of such complications. Given the necessity for regular monitoring of blood glucose, development of non-invasive glucose detection devices is essential to improve the quality of life of diabetic patients. The commercially available glucose sensors measure interstitial fluid glucose by electrochemical detection. However, these sensors have severe limitations, primarily related to their invasive nature and lack of stability. This necessitates the development of a truly non-invasive glucose detection technique. NIR Raman spectroscopy, which combines the substantial penetration depth of NIR light with the excellent chemical specificity of Raman spectroscopy, provides an excellent tool to meet the challenges involved. Additionally, it enables simultaneous determination of multiple blood analytes. Our laboratory has pioneered the use of Raman spectroscopy for the detection of blood analytes in biological media. The preliminary success of our non-invasive glucose measurements both in vitro (such as in serum and blood) and in vivo has provided the foundation for the development of feasible clinical systems. However, successful application of this technology still faces a few hurdles, highlighted by the problems of tissue luminescence and selection of appropriate reference concentration. In this article we explore possible avenues to overcome these challenges so that the prospective prediction accuracy of blood analytes can be brought to clinically acceptable levels.

  3. The role of analytical chemistry in Niger Delta petroleum exploration: a review.

    PubMed

    Akinlua, Akinsehinwa

    2012-06-12

    Petroleum and the organic matter from which it is derived are composed of organic compounds together with some trace elements. These compounds give insight into the origin, thermal maturity and paleoenvironmental history of petroleum, which are essential elements in petroleum exploration. The main tools for acquiring these geochemical data are analytical techniques. Due to progress in the development of new analytical techniques, many long-standing petroleum exploration problems have been resolved. Analytical chemistry has played a significant role in the development of the petroleum resources of the Niger Delta. Various analytical techniques that have aided the success of petroleum exploration in the Niger Delta are discussed. The analytical techniques that have helped to understand the petroleum system of the basin are also described. Recent and emerging analytical methodologies, including green analytical methods as applicable to petroleum exploration, particularly in the Niger Delta petroleum province, are discussed in this paper. Analytical chemistry is an invaluable tool in finding Niger Delta oils.

  4. Ternary complexes in analytical chemistry.

    PubMed

    Babko, A K

    1968-08-01

    Reactions between a complex AB and a third component C do not always proceed by a displacement mechanism governed by the energy difference of the chemical bonds A-B and A-C. The third component often becomes part of the complex, forming a mixed co-ordination sphere or ternary complex. The properties of this ternary complex ABC are not additive functions of the properties of AB and AC. Such reactions are important in many methods in analytical chemistry, particularly in photometric analysis, extractive separation, masking, etc. The general properties of the four basic types of ternary complex are reviewed and examples given. The four types comprise the systems (a) metal ion, electronegative ligand, organic base, (b) one metal ion, two different electronegative ligands, (c) ternary heteropoly acids, and (d) two different metal ions, one ligand.

  5. Solar Data and Tools: Resources for Researchers, Industry, and Developers

    SciTech Connect

    2016-04-01

    In partnership with the U.S. Department of Energy SunShot Initiative, the National Renewable Energy Laboratory (NREL) has created a suite of analytical tools and data that can inform decisions about implementing solar and that are increasingly forming the basis of private-sector tools and services to solar consumers. The following solar energy data sets and analytical tools are available free to the public.

  6. Functional Interfaces Constructed by Controlled/Living Radical Polymerization for Analytical Chemistry.

    PubMed

    Wang, Huai-Song; Song, Min; Hang, Tai-Jun

    2016-02-10

    The high-value applications of functional polymers in analytical science generally require well-defined interfaces, including precisely synthesized molecular architectures and compositions. Controlled/living radical polymerization (CRP) has been developed as a versatile and powerful tool for the preparation of polymers with narrow molecular weight distributions and predetermined molecular weights. Among CRP systems, atom transfer radical polymerization (ATRP) and reversible addition-fragmentation chain transfer (RAFT) polymerization are widely used to develop new materials for analytical science, such as surface-modified core-shell particles, monoliths, molecularly imprinted polymer (MIP) micro- or nanospheres, fluorescent nanoparticles, and multifunctional materials. In this review, we summarize the emerging functional interfaces constructed by RAFT and ATRP for applications in analytical science. Various polymers with precisely controlled architectures, including homopolymers, block copolymers, molecularly imprinted copolymers, and grafted copolymers, have been synthesized by CRP methods for molecular separation, retention, or sensing. We expect that CRP methods will become the most popular techniques for preparing functional polymers with broad applications in analytical chemistry.

  7. Analytic Approximation to Randomly Oriented Spheroid Extinction

    DTIC Science & Technology

    1993-12-01

    … 10^4 times faster than by the T-matrix code. Since the T-matrix code scales as at least the cube of the optical size whereas the analytic approximation is … coefficient estimate, and with the Rayleigh formula. Since it is difficult to estimate the accuracy near the limit of stability of the T-matrix code, some … additional error due to the T-matrix code could be present. [Figure: maximum relative error, analytic approximation vs. T-matrix, r = 1/5.]

  8. Thermodynamics of Gas Turbine Cycles with Analytic Derivatives in OpenMDAO

    NASA Technical Reports Server (NTRS)

    Gray, Justin; Chin, Jeffrey; Hearn, Tristan; Hendricks, Eric; Lavelle, Thomas; Martins, Joaquim R. R. A.

    2016-01-01

    A new equilibrium thermodynamics analysis tool was built based on the CEA method using the OpenMDAO framework. The new tool provides forward and adjoint analytic derivatives for use with gradient-based optimization algorithms. The new tool was validated against the original CEA code to ensure an accurate analysis, and the analytic derivatives were validated against finite-difference approximations. Performance comparisons between analytic and finite-difference methods showed a significant speed advantage for the analytic methods. To further test the new analysis tool, a sample optimization was performed to find the optimal air-fuel equivalence ratio, φ, maximizing combustion temperature for a range of different pressures. Collectively, the results demonstrate the viability of the new tool to serve as the thermodynamic backbone for future work on a full propulsion modeling tool.
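
    The analytic-derivative workflow the abstract describes can be illustrated with the public OpenMDAO API: a component that supplies compute_partials gives the optimizer exact gradients rather than finite-difference estimates. The combustion-temperature model below is a deliberately crude stand-in of our own, not the CEA-based thermodynamics of the actual tool; only the OpenMDAO calls are real.

      import openmdao.api as om

      class CombustionTemp(om.ExplicitComponent):
          """Toy model: T4 peaks near phi = 1 (hypothetical stand-in for CEA)."""
          def setup(self):
              self.add_input('phi', val=0.5)
              self.add_output('T4', val=0.0)
              self.declare_partials('T4', 'phi')  # analytic, not finite-difference

          def compute(self, inputs, outputs):
              phi = inputs['phi']
              outputs['T4'] = 2200.0 * phi * (2.0 - phi)

          def compute_partials(self, inputs, partials):
              phi = inputs['phi']
              partials['T4', 'phi'] = 2200.0 * (2.0 - 2.0 * phi)

      prob = om.Problem()
      prob.model.add_subsystem('comb', CombustionTemp(), promotes=['*'])
      prob.driver = om.ScipyOptimizeDriver(optimizer='SLSQP')
      prob.model.add_design_var('phi', lower=0.1, upper=2.0)
      prob.model.add_objective('T4', scaler=-1.0)  # negate to maximize
      prob.setup()
      prob.run_driver()  # converges to phi = 1 using the analytic gradient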

  9. Forecasting Hotspots-A Predictive Analytics Approach.

    PubMed

    Maciejewski, R; Hafen, R; Rudolph, S; Larew, S G; Mitchell, M A; Cleveland, W S; Ebert, D S

    2011-04-01

    Current visual analytics systems provide users with the means to explore trends in their data. Linked views and interactive displays provide insight into correlations among people, events, and places in space and time. Analysts search for events of interest through statistical tools linked to visual displays, drill down into the data, and form hypotheses based upon the available information. However, current systems stop short of predicting events. In spatiotemporal data, analysts are searching for regions of space and time with unusually high incidences of events (hotspots). In the cases where hotspots are found, analysts would like to predict how these regions may grow in order to plan resource allocation and preventative measures. Furthermore, analysts would also like to predict where future hotspots may occur. To facilitate such forecasting, we have created a predictive visual analytics toolkit that provides analysts with linked spatiotemporal and statistical analytic views. Our system models spatiotemporal events through the combination of kernel density estimation for event distribution and seasonal trend decomposition by loess smoothing for temporal predictions. We provide analysts with estimates of error in our modeling, along with spatial and temporal alerts to indicate the occurrence of statistically significant hotspots. Spatial data are distributed based on a modeling of previous event locations, thereby maintaining a temporal coherence with past events. Such tools allow analysts to perform real-time hypothesis testing, plan intervention strategies, and allocate resources to correspond to perceived threats.
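
    The modeling combination described above, kernel density estimation over event locations plus seasonal-trend decomposition by loess on the count series, can be sketched with standard scientific-Python tools. The data, grid, and alert threshold below are illustrative assumptions, not the toolkit's actual implementation.

      import numpy as np
      from scipy.stats import gaussian_kde
      from statsmodels.tsa.seasonal import STL

      rng = np.random.default_rng(0)

      # Hypothetical data: 500 (x, y) incident locations and a daily count series.
      locations = rng.normal(size=(500, 2))
      counts = 50 + 10 * np.sin(2 * np.pi * np.arange(364) / 7.0) + rng.poisson(5, 364)

      # Spatial side: KDE of the event distribution; high-density grid cells
      # are candidate hotspots, consistent with past event locations.
      kde = gaussian_kde(locations.T)
      grid = np.mgrid[-3:3:60j, -3:3:60j].reshape(2, -1)
      density = kde(grid)

      # Temporal side: STL with weekly seasonality; a naive forecast repeats
      # the last seasonal cycle around the final trend level.
      res = STL(counts, period=7).fit()
      forecast = res.trend[-1] + res.seasonal[-7:]
      alerts = forecast > counts.mean() + 2 * counts.std()  # crude alert rule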

  10. Downhole tool

    DOEpatents

    Hall, David R.; Muradov, Andrei; Pixton, David S.; Dahlgren, Scott Steven; Briscoe, Michael A.

    2007-03-20

    A double shouldered downhole tool connection comprises box and pin connections having mating threads intermediate mating primary and secondary shoulders. The connection further comprises a secondary shoulder component retained in the box connection intermediate a floating component and the primary shoulders. The secondary shoulder component and the pin connection cooperate to transfer a portion of makeup load to the box connection. The downhole tool may be selected from the group consisting of drill pipe, drill collars, production pipe, and reamers. The floating component may be selected from the group consisting of electronics modules, generators, gyroscopes, power sources, and stators. The secondary shoulder component may comprise an interface to the box connection selected from the group consisting of radial grooves, axial grooves, tapered grooves, radial protrusions, axial protrusions, tapered protrusions, shoulders, and threads.

  11. The GNEMRE Dendro Tool.

    SciTech Connect

    Merchant, Bion John

    2007-10-01

    The GNEMRE Dendro Tool provides a previously unrealized analysis capability in the field of nuclear explosion monitoring. Dendro Tool allows analysts to quickly and easily determine the similarity between seismic events using the waveform time-series for each of the events to compute cross-correlation values. Events can then be categorized into clusters of similar events. This analysis technique can be used to characterize historical archives of seismic events in order to determine many of the unique sources that are present. In addition, the source of any new events can be quickly identified simply by comparing the new event to the historical set.
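
    A rough sketch of the underlying computation, under the assumption of equal-length, pre-aligned traces: the peak of the normalized cross-correlation serves as the similarity measure, and hierarchical clustering of the resulting dissimilarities groups similar events (the dendrogram step the tool is named for). Data and thresholds here are hypothetical.

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster
      from scipy.spatial.distance import squareform

      def peak_xcorr(a, b):
          """Peak of the normalized cross-correlation between two traces."""
          a = (a - a.mean()) / (a.std() * len(a))
          b = (b - b.mean()) / b.std()
          return np.correlate(a, b, mode='full').max()

      rng = np.random.default_rng(1)
      waveforms = rng.standard_normal((20, 1024))  # stand-in event archive

      n = len(waveforms)
      sim = np.eye(n)
      for i in range(n):
          for j in range(i + 1, n):
              sim[i, j] = sim[j, i] = peak_xcorr(waveforms[i], waveforms[j])

      # Convert similarity to distance and cluster similar events together.
      dist = squareform(1.0 - sim, checks=False)
      labels = fcluster(linkage(dist, method='average'), t=0.4, criterion='distance')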

  12. CMS tracker visualization tools

    NASA Astrophysics Data System (ADS)

    Mennea, M. S.; Osborne, I.; Regano, A.; Zito, G.

    2005-08-01

    This document reviews the design considerations, implementations and performance of the CMS Tracker visualization tools. In view of the great complexity of this sub-detector (more than 50 million channels organized in 16540 modules, each of which is a complete detector), the standard CMS visualization tools (IGUANA and IGUANACMS), which provide basic 3D capabilities and integration within the CMS framework, respectively, have been complemented with additional 2D graphics objects. Based on the experience acquired using this software to debug and understand both hardware and software during the construction phase, we propose possible future improvements to cope with online monitoring and event analysis during data taking.

  13. Tools to assess tissue quality.

    PubMed

    Neumeister, Veronique M

    2014-03-01

    Biospecimen science has recognized the importance of tissue quality for accurate molecular and biomarker analysis and efforts are made to standardize tissue procurement, processing and storage conditions of tissue samples. At the same time the field has emphasized the lack of standardization of processes between different laboratories, the variability inherent in the analytical phase and the lack of control over the pre-analytical phase of tissue processing. The problem extends back into tissue samples in biorepositories, which are often decades old and where documentation about tissue processing might not be available. This review highlights pre-analytical variations in tissue handling, processing, fixation and storage and emphasizes the effects of these variables on nucleic acids and proteins in harvested tissue. Finally current tools for quality control regarding molecular or biomarker analysis are summarized and discussed.

  14. MaterialVis: material visualization tool using direct volume and surface rendering techniques.

    PubMed

    Okuyan, Erhan; Güdükbay, Uğur; Bulutay, Ceyhun; Heinig, Karl-Heinz

    2014-05-01

    Visualization of materials is an indispensable part of their structural analysis. We developed a visualization tool for amorphous as well as crystalline structures, called MaterialVis. Unlike existing tools, MaterialVis represents material structures as a volume and a surface manifold, in addition to plain atomic coordinates. Both amorphous and crystalline structures exhibit topological features as well as various defects. MaterialVis provides a wide range of functionality to visualize such topological structures and crystal defects interactively. Direct volume rendering techniques are used to visualize the volumetric features of materials, such as crystal defects, which are responsible for the distinct fingerprints of a specific sample. In addition, the tool provides surface visualization to extract hidden topological features within the material. Together with a rich set of parameters and options to control the visualization, MaterialVis allows users to visualize various aspects of materials very efficiently, as generated by modern analytical techniques such as atom probe tomography.

  15. MERRA Analytic Services

    NASA Astrophysics Data System (ADS)

    Schnase, J. L.; Duffy, D. Q.; McInerney, M. A.; Tamkin, G. S.; Thompson, J. H.; Gill, R.; Grieg, C. M.

    2012-12-01

    MERRA Analytic Services (MERRA/AS) is a cyberinfrastructure resource for developing and evaluating a new generation of climate data analysis capabilities. MERRA/AS supports OBS4MIP activities by reducing the time spent in the preparation of Modern Era Retrospective-Analysis for Research and Applications (MERRA) data used in data-model intercomparison. It also provides a testbed for experimental development of high-performance analytics. MERRA/AS is a cloud-based service built around the Virtual Climate Data Server (vCDS) technology that is currently used by the NASA Center for Climate Simulation (NCCS) to deliver Intergovernmental Panel on Climate Change (IPCC) data to the Earth System Grid Federation (ESGF). Crucial to its effectiveness, MERRA/AS's servers will use a workflow-generated realizable object capability to perform analyses over the MERRA data using the MapReduce approach to parallel storage-based computation. The results produced by these operations will be stored by the vCDS, which will also be able to host code sets for those who wish to explore the use of MapReduce for more advanced analytics. While the work described here will focus on the MERRA collection, these technologies can be used to publish other reanalysis, observational, and ancillary OBS4MIP data to ESGF and, importantly, offer an architectural approach to climate data services that can be generalized to applications and customers beyond the traditional climate research community. In this presentation, we describe our approach, experiences, lessons learned, and plans for the future. [Figure: (A) MERRA/AS software stack. (B) Example MERRA/AS interfaces.]
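
    The storage-side MapReduce pattern mentioned above amounts to decomposing an analysis into per-chunk partial results combined by an associative reducer; a toy single-process version (our illustration, not MERRA/AS code) for a global mean might look like this:

      from functools import reduce
      import numpy as np

      # Hypothetical per-file chunks of a reanalysis variable (one per month).
      rng = np.random.default_rng(0)
      chunks = [rng.random((144, 288)) for _ in range(12)]

      def mapper(chunk):
          # Emit a partial result per storage chunk: (sum, count).
          return chunk.sum(), chunk.size

      def reducer(a, b):
          # Associative combination, so the work parallelizes across storage.
          return a[0] + b[0], a[1] + b[1]

      total, count = reduce(reducer, map(mapper, chunks))
      global_mean = total / count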

  16. Risk analytics for hedge funds

    NASA Astrophysics Data System (ADS)

    Pareek, Ankur

    2005-05-01

    The rapid growth of the hedge fund industry presents a significant business opportunity for institutional investors, particularly in the form of portfolio diversification. To facilitate this, there is a need to develop a new set of risk analytics for investments consisting of hedge funds, with the ultimate aim of creating transparency in risk measurement without compromising the proprietary investment strategies of hedge funds. As is well documented in the literature, the use of dynamic, option-like strategies by most hedge funds makes their returns highly non-normal, with fat tails and high kurtosis, thus rendering Value at Risk (VaR) and other mean-variance analysis methods unsuitable for hedge fund risk quantification. This paper looks at some unique concerns for hedge fund risk management and concentrates on two approaches from the physical world for modeling the non-linearities and dynamic correlations in hedge fund portfolio returns: self-organized criticality (SOC) and random matrix theory (RMT). Random matrix theory analyzes the correlation matrix between different hedge fund styles and filters random noise from genuine correlations arising from interactions within the system. As seen in the results of the portfolio risk analysis, it leads to better portfolio risk forecastability and thus to optimum allocation of resources to different hedge fund styles. The results also demonstrate the efficacy of self-organized criticality and implied portfolio correlation as tools for risk management and style selection for portfolios of hedge funds, these being particularly effective during non-linear market crashes.

  17. Infrared Spectroscopy as a Chemical Fingerprinting Tool

    NASA Technical Reports Server (NTRS)

    Huff, Timothy L.

    2003-01-01

    Infrared (IR) spectroscopy is a powerful analytical tool in the chemical fingerprinting of materials. Any sample material that will interact with infrared light produces a spectrum and, although normally associated with organic materials, inorganic compounds may also be infrared active. The technique is rapid, reproducible and usually non-invasive to the sample. That it is non-invasive allows for additional characterization of the original material using other analytical techniques, including thermal analysis and Raman spectroscopic techniques. With the appropriate accessories, the technique can be used to examine samples in liquid, solid or gas phase. Both aqueous and non-aqueous free-flowing solutions can be analyzed, as can viscous liquids such as heavy oils and greases. Solid samples of varying sizes and shapes may also be examined and, with the addition of microscopic IR (microspectroscopy) capabilities, minute materials such as single fibers and threads may be analyzed. With the addition of appropriate software, microspectroscopy can be used for automated discrete point or compositional surface area mapping, with the latter providing a means to record changes in the chemical composition of a material surface over a defined area. Due to the ability to characterize gaseous samples, IR spectroscopy can also be coupled with thermal processes such as thermogravimetric (TG) analyses to provide both thermal and chemical data in a single run. In this configuration, solids (or liquids) heated in a TG analyzer undergo decomposition, with the evolving gases directed into the IR spectrometer. Thus, information is provided on the thermal properties of a material and the order in which its chemical constituents are broken down during incremental heating. Specific examples of these varied applications will be cited, with data interpretation and method limitations further discussed.

  18. Bolstering Teaching through Online Tools

    ERIC Educational Resources Information Center

    Singh, Anil; Mangalaraj, George; Taneja, Aakash

    2010-01-01

    This paper offers a compilation of technologies that provide either free or low-cost solutions to the challenges of teaching online courses. It presents various teaching methods the outlined tools and technologies can support, with emphasis on fit between these tools and the tasks they are meant to serve. In addition, it highlights various…

  19. Analytical and Computational Aspects of Collaborative Optimization

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia M.; Lewis, Robert Michael

    2000-01-01

    Bilevel problem formulations have received considerable attention as an approach to multidisciplinary optimization in engineering. We examine the analytical and computational properties of one such approach, collaborative optimization. The resulting system-level optimization problems suffer from inherent computational difficulties due to the bilevel nature of the method. Most notably, it is impossible to characterize and hence identify solutions of the system-level problems because the standard first-order conditions for solutions of constrained optimization problems do not hold. The analytical features of the system-level problem make it difficult to apply conventional nonlinear programming algorithms. Simple examples illustrate the analysis and the algorithmic consequences for optimization methods. We conclude with additional observations on the practical implications of the analytical and computational properties of collaborative optimization.

  20. Learning Analytics: Readiness and Rewards

    ERIC Educational Resources Information Center

    Friesen, Norm

    2013-01-01

    This position paper introduces the relatively new field of learning analytics, first by considering the relevant meanings of both "learning" and "analytics," and then by looking at two main levels at which learning analytics can be or has been implemented in educational organizations. Although integrated turnkey systems or…

  1. The analytic renormalization group

    NASA Astrophysics Data System (ADS)

    Ferrari, Frank

    2016-08-01

    Finite temperature Euclidean two-point functions in quantum mechanics or quantum field theory are characterized by a discrete set of Fourier coefficients Gk, k ∈ Z, associated with the Matsubara frequencies νk = 2πk/β. We show that analyticity implies that the coefficients Gk must satisfy an infinite number of model-independent linear equations that we write down explicitly. In particular, we construct "Analytic Renormalization Group" linear maps Aμ which, for any choice of cut-off μ, allow one to express the low energy Fourier coefficients for |νk| < μ (with the possible exception of the zero mode G0), together with the real-time correlators and spectral functions, in terms of the high energy Fourier coefficients for |νk| ≥ μ. Using a simple numerical algorithm, we show that the exact universal linear constraints on Gk can be used to systematically improve any random approximate data set obtained, for example, from Monte Carlo simulations. Our results are illustrated on several explicit examples.
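
    The analytic structure behind such constraints is conveniently summarized by a spectral representation; schematically, and up to sign and normalization conventions that may differ from the paper,

      G_k = \int_{-\infty}^{\infty} \frac{d\omega}{2\pi}\,
            \frac{\rho(\omega)}{i\nu_k - \omega},
      \qquad \nu_k = \frac{2\pi k}{\beta}.

    Because a single spectral function ρ(ω) generates every Fourier coefficient, the Gk cannot vary independently, and linear relations among them follow.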

  2. The forensic validity of visual analytics

    NASA Astrophysics Data System (ADS)

    Erbacher, Robert F.

    2008-01-01

    The wider use of visualization and visual analytics in wide-ranging fields has led to the need for visual analytics capabilities to be legally admissible, especially when applied to digital forensics. This brings the need to consider legal implications when performing visual analytics, an issue not traditionally examined in visualization and visual analytics techniques and research. While digital data is generally admissible under the Federal Rules of Evidence [10][21], a comprehensive validation of the digital evidence is considered prudent. A comprehensive validation requires validation of the digital data under rules for authentication, hearsay, the best evidence rule, and privilege. Additional issues with digital data arise when exploring admissibility and the validity of what information was examined, to what extent, and whether the analysis process was sufficiently covered by a search warrant. For instance, a search warrant generally covers very narrow requirements as to what law enforcement is allowed to examine and acquire during an investigation. When searching a hard drive for child pornography, how admissible is evidence of an unrelated crime, e.g. drug dealing? This is further complicated by the concept of "in plain view": when performing an analysis of a hard drive, what would be considered "in plain view"? The purpose of this paper is to discuss the issues of digital forensics as they apply to visual analytics and to identify how visual analytics techniques fit into the digital forensics analysis process, how visual analytics techniques can improve the legal admissibility of digital data, and what research is needed to further improve this process. The goal of this paper is to open up consideration of legal ramifications among the visualization community; the author is not a lawyer and the discussions are not meant to be inclusive of all differences in laws between states and

  3. Big Data Analytics in Chemical Engineering.

    PubMed

    Chiang, Leo; Lu, Bo; Castillo, Ivan

    2017-02-27

    Big data analytics is the journey to turn data into insights for more informed business and operational decisions. As the chemical engineering community is collecting more data (volume) from different sources (variety), this journey becomes more challenging in terms of using the right data and the right tools (analytics) to make the right decisions in real time (velocity). This article highlights recent big data advancements in five industries, including chemicals, energy, semiconductors, pharmaceuticals, and food, and then discusses technical, platform, and culture challenges. To reach the next milestone in multiplying successes to the enterprise level, government, academia, and industry need to collaboratively focus on workforce development and innovation.

  4. Applications of surface analytical techniques in Earth Sciences

    NASA Astrophysics Data System (ADS)

    Qian, Gujie; Li, Yubiao; Gerson, Andrea R.

    2015-03-01

    This review covers a wide range of surface analytical techniques: X-ray photoelectron spectroscopy (XPS), scanning photoelectron microscopy (SPEM), photoemission electron microscopy (PEEM), dynamic and static secondary ion mass spectroscopy (SIMS), electron backscatter diffraction (EBSD), atomic force microscopy (AFM). Others that are relatively less widely used but are also important to the Earth Sciences are also included: Auger electron spectroscopy (AES), low energy electron diffraction (LEED) and scanning tunnelling microscopy (STM). All these techniques probe only the very top sample surface layers (sub-nm to several tens of nm). In addition, we also present several other techniques i.e. Raman microspectroscopy, reflection infrared (IR) microspectroscopy and quantitative evaluation of minerals by scanning electron microscopy (QEMSCAN) that penetrate deeper into the sample, up to several μm, as all of them are fundamental analytical tools for the Earth Sciences. Grazing incidence synchrotron techniques, sensitive to surface measurements, are also briefly introduced at the end of this review. (Scanning) transmission electron microscopy (TEM/STEM) is a special case that can be applied to characterisation of mineralogical and geological sample surfaces. Since TEM/STEM is such an important technique for Earth Scientists, we have also included it to draw attention to the capability of TEM/STEM applied as a surface-equivalent tool. While this review presents most of the important techniques for the Earth Sciences, it is not an all-inclusive bibliography of those analytical techniques. Instead, for each technique that is discussed, we first give a very brief introduction about its principle and background, followed by a short section on approaches to sample preparation that are important for researchers to appreciate prior to the actual sample analysis. We then use examples from publications (and also some of our known unpublished results) within the Earth Sciences

  5. Tool Gear: Infrastructure for Parallel Tools

    SciTech Connect

    May, J; Gyllenhaal, J

    2003-04-17

    Tool Gear is a software infrastructure for developing performance analysis and other tools. Unlike existing integrated toolkits, which focus on providing a suite of capabilities, Tool Gear is designed to help tool developers create new tools quickly. It combines dynamic instrumentation capabilities with an efficient database and a sophisticated and extensible graphical user interface. This paper describes the design of Tool Gear and presents examples of tools that have been built with it.

  6. PAUSE: Predictive Analytics Using SPARQL-Endpoints

    SciTech Connect

    Sukumar, Sreenivas R; Ainsworth, Keela; Bond, Nathaniel

    2014-07-11

    This invention relates to the medical industry and more specifically to methods of predicting risks. With the impetus towards personalized and evidence-based medicine, the need for a framework to analyze/interpret quantitative measurements (blood work, toxicology, etc.) with qualitative descriptions (specialist reports after reading images, bio-medical knowledgebase, etc.) to predict diagnostic risks is fast emerging. We describe a software solution that leverages hardware for scalable in-memory analytics and applies next-generation semantic query tools on medical data.
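
    As a sketch of the semantic-query ingredient, a SPARQL endpoint can be queried from Python with the SPARQLWrapper package; the endpoint URL, vocabulary, and risk query below are entirely hypothetical and stand in for whatever medical knowledge base such a system targets.

      from SPARQLWrapper import SPARQLWrapper, JSON

      endpoint = SPARQLWrapper("http://example.org/medical/sparql")  # hypothetical
      endpoint.setQuery("""
          PREFIX ex: <http://example.org/schema#>
          SELECT ?patient (COUNT(?risk) AS ?riskFactors)
          WHERE {
              ?patient ex:hasFinding ?finding .
              ?finding ex:indicatesRisk ?risk .
          }
          GROUP BY ?patient
          ORDER BY DESC(?riskFactors)
      """)
      endpoint.setReturnFormat(JSON)
      results = endpoint.query().convert()
      for row in results["results"]["bindings"]:
          print(row["patient"]["value"], row["riskFactors"]["value"])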

  7. Analytic pion form factor

    NASA Astrophysics Data System (ADS)

    Lomon, Earle L.; Pacetti, Simone

    2016-09-01

    The pion electromagnetic form factor and two-pion production in electron-positron collisions are simultaneously fitted by a vector dominance model evolving to perturbative QCD at large momentum transfer. This model was previously successful in simultaneously fitting the nucleon electromagnetic form factors (spacelike region) and the electromagnetic production of nucleon-antinucleon pairs (timelike region). For this pion case dispersion relations are used to produce the analytic connection of the spacelike and timelike regions. The fit to all the data is good, especially for the newer sets of timelike data. The description of high-q2 data, in the timelike region, requires one more meson with ρ quantum numbers than listed in the 2014 Particle Data Group review.
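
    The dispersion-relation step admits a compact schematic form: ignoring possible subtractions, the form factor in the spacelike region is reconstructed from its timelike imaginary part via

      F_\pi(q^2) = \frac{1}{\pi} \int_{4 m_\pi^2}^{\infty}
                   \frac{\mathrm{Im}\,F_\pi(s)}{s - q^2 - i\epsilon}\, ds,

    so a single analytic function, constrained by the timelike data, fixes the spacelike behavior as well.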

  8. VERDE Analytic Modules

    SciTech Connect

    2008-01-15

    The VERDE Analytic Modules permit the user to ingest openly available data feeds about phenomenology (storm tracks, wind, precipitation, earthquakes, wildfires, and similar natural and manmade power grid disruptions) and forecast power outages, restoration times, customers affected, and key facilities that will lose power. Damage areas are predicted using historic damage criteria for the affected area. The modules use a cellular automata approach to estimate the distribution circuits assigned to geo-located substations. Population estimates served within the service areas are located within 1 km grid cells and converted to customer counts through demographic estimation of households and commercial firms within the population cells. Restoration times are estimated by agent-based simulation of restoration crews working according to utility-published prioritization, calibrated by historic performance.

  9. Normality in Analytical Psychology

    PubMed Central

    Myers, Steve

    2013-01-01

    Although C.G. Jung’s interest in normality wavered throughout his career, it was one of the areas he identified in later life as worthy of further research. He began his career using a definition of normality which would have been the target of Foucault’s criticism, had Foucault chosen to review Jung’s work. However, Jung then evolved his thinking to a standpoint that was more aligned to Foucault’s own. Thereafter, the post Jungian concept of normality has remained relatively undeveloped by comparison with psychoanalysis and mainstream psychology. Jung’s disjecta membra on the subject suggest that, in contemporary analytical psychology, too much focus is placed on the process of individuation to the neglect of applications that consider collective processes. Also, there is potential for useful research and development into the nature of conflict between individuals and societies, and how normal people typically develop in relation to the spectrum between individuation and collectivity. PMID:25379262

  10. Inspection of the Math Model Tools for On-Orbit Assessment of Impact Damage Report

    NASA Technical Reports Server (NTRS)

    Harris, Charles E.; Raju, Ivatury S.; Piascik, Robert S.; KramerWhite, Julie A.; Labbe, Steve G.; Rotter, Hank A.

    2007-01-01

    In the spring of 2005, the NASA Engineering and Safety Center (NESC) was engaged by the Space Shuttle Program (SSP) to peer review the suite of analytical tools being developed to support the determination of impact and damage tolerance of the Orbiter Thermal Protection Systems (TPS). The NESC formed an independent review team with the core disciplines of materials, flight sciences, structures, mechanical analysis and thermal analysis. The Math Model Tools reviewed included damage prediction and stress analysis, aeroheating analysis, and thermal analysis tools. Some tools are physics-based and others are empirically derived. Each tool was created for a specific use and timeframe, including certification and real-time pre-launch assessments. In addition, the tools are used together in an integrated strategy for assessing the ramifications of impact damage to tile and RCC. The NESC team conducted a peer review of the engineering data package for each Math Model Tool. This report contains the summary of the team observations and recommendations from these reviews.

  11. Analytical Chemistry Laboratory progress report for FY 1989

    SciTech Connect

    Green, D.W.; Heinrich, R.R.; Graczyk, D.G.; Lindahl, P.C.; Erickson, M.D.

    1989-12-01

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year 1989 (October 1988 through September 1989). The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. In addition, the ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques.

  12. ICDA: a platform for Intelligent Care Delivery Analytics.

    PubMed

    Gotz, David; Stavropoulos, Harry; Sun, Jimeng; Wang, Fei

    2012-01-01

    The identification of high-risk patients is a critical component in improving patient outcomes and managing costs. This paper describes the Intelligent Care Delivery Analytics platform (ICDA), a system which enables risk assessment analytics that process large collections of dynamic electronic medical data to identify at-risk patients. ICDA works by ingesting large volumes of data into a common data model, then orchestrating a collection of analytics that identify at-risk patients. It also provides an interactive environment through which users can access and review the analytics results. In addition, ICDA provides APIs via which analytics results can be retrieved to surface in external applications. A detailed review of ICDA's architecture is provided. Descriptions of four use cases are included to illustrate ICDA's application within two different data environments. These use cases showcase the system's flexibility and exemplify the types of analytics it enables.

  13. Analytical Chemistry Laboratory progress report for FY 1991

    SciTech Connect

    Green, D.W.; Heinrich, R.R.; Graczyk, D.G.; Lindahl, P.C.; Boparai, A.S.

    1991-12-01

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year 1991 (October 1990 through September 1991). This is the eighth annual report for the ACL. The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. In addition, the ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques.

  14. [Analytical epidemiology of urolithiasis].

    PubMed

    Kodama, H; Ohno, Y

    1989-06-01

    In this paper, urolithiasis is reviewed from the standpoint of analytical epidemiology, which examines statistical associations between a given disease and hypothesized factors with the aim of inferring causality. Factors incriminated epidemiologically in stone formation include age, sex, occupation, social class (level of affluence), season of the year and climate, dietary and fluid intake, and genetic predisposition. Since some of these factors are interlinked, they are broadly classified into five categories and reviewed epidemiologically here. Genetic predisposition is essentially endorsed by the more frequent episodes of stone formation in the family members of stone formers, as compared to non-stone formers. Nevertheless, some environmental factors (likely to be dietary habits) shared by family members are believed to be relatively more important than genetic predisposition. A hot, sunny climate may influence stone formation by inducing dehydration, with increased perspiration and increased solute concentration with decreased urine volume, coupled with inadequate liquid intake, and possibly through greater exposure to ultraviolet radiation, which eventually results in increased vitamin D production, conceivably correlated with seasonal variation in the excretion of calcium and oxalate into the urine. Urinary tract infections are importantly involved in the formation of magnesium ammonium phosphate stones in particular. The association with regional water hardness is still controversial. Excessive intake of coffee, tea and alcoholic beverages seemingly increases the risk of renal calculi, though this has not been consistently confirmed. Many dietary elements have been suggested by numerous clinical and experimental investigations, but only a few are substantiated by analytical epidemiological investigations. An increased ingestion of animal protein and sugar and a decreased ingestion of dietary fiber and green-yellow vegetables are linked with the higher

  15. Can orangutans (Pongo abelii) infer tool functionality?

    PubMed

    Mulcahy, Nicholas J; Schubiger, Michèle N

    2014-05-01

    It is debatable whether apes can reason about the unobservable properties of tools. We tested orangutans for this ability with a range of tool tasks that they could solve by using observational cues to infer tool functionality. In experiment 1, subjects successfully chose an unbroken tool over a broken one when each tool's middle section was hidden. This prevented seeing which tool was functional but it could be inferred by noting the tools' visible ends that were either disjointed (broken tool) or aligned (unbroken tool). We investigated whether success in experiment 1 was best explained by inferential reasoning or by having a preference per se for a hidden tool with an aligned configuration. We conducted a similar task to experiment 1 and included a functional bent tool that could be arranged to have the same disjointed configuration as the broken tool. The results suggested that subjects had a preference per se for the aligned tool by choosing it regardless of whether it was paired with the broken tool or the functional bent tool. However, further experiments with the bent tool task suggested this preference was a result of additional demands of having to attend to and remember the properties of the tools from the beginning of the task. In our last experiment, we removed these task demands and found evidence that subjects could infer the functionality of a broken tool and an unbroken tool that both looked identical at the time of choice.

  16. Additive Similarity Trees

    ERIC Educational Resources Information Center

    Sattath, Shmuel; Tversky, Amos

    1977-01-01

    Tree representations of similarity data are investigated. Hierarchical clustering is critically examined, and a more general procedure, called the additive tree, is presented. The additive tree representation is then compared to multidimensional scaling. (Author/JKS)

  17. Indispensable tool

    SciTech Connect

    Robinson, Arthur

    2001-08-10

    Synchrotron radiation has become an indispensable research tool for a growing number of scientists in a seemingly ever-expanding number of disciplines. We can thank the European Synchrotron Radiation Facility (ESRF) in Grenoble for taking an innovative step toward achieving the educational goal of explaining the nature and benefits of synchrotron radiation to audiences ranging from the general public (including students) to government officials to scientists who may be unfamiliar with x-ray techniques and synchrotron radiation. ESRF is the driving force behind a new CD-ROM, playable on both PCs and Macs, titled Synchrotron light to explore matter. Published by Springer-Verlag, the CD contains both English and French versions of a comprehensive overview of the subject.

  18. Hydraulic tool

    SciTech Connect

    Gregory, J.T.

    1988-04-05

    A hydraulic force-delivering tool is described, including a cylinder, a piston slidable in the cylinder, and a hydraulic pump to deliver fluid under pressure to the cylinder, the hydraulic pump comprising: a pump body; means forming a cylindrical chamber in the pump body; at least one inlet port opening into one end of the chamber from outside the body; means forming an outlet port at the other end of the chamber; a check valve in the outlet port enabling outward flow only; a pump rod plunger reciprocable through a given stroke in the chamber; inner and outer concentric cylindrical surfaces in the chamber and on the plunger, respectively; an annular shoulder on the chamber inner cylindrical surface facing toward the other end of the chamber; an annular seal member slidable along the pump rod and conditioned to seal against the shoulder; and spring means biasing the seal member toward the shoulder.

  19. Optical Tools

    NASA Astrophysics Data System (ADS)

    Roncali, E.; Tavitian, B.; Texier, I. E.; Peltié, P.; Perraut, F.; Boutet, J.; Cognet, L.; Lounis, B.; Marguet, D.; Thoumine, O.; Tramier, M.

    Fluorescence is a physical phenomenon described for the first time in 1852 by the British scientist George G. Stokes, famous for his work in mathematics and hydrodynamics. He observed the light emitted by a mineral after excitation (absorption of light by the mineral) by UV light. He then formulated what has become known as Stokes' law, which says that the wavelength of fluorescence emission is longer than the excitation wavelength used to generate it. Some phenomena departing from this rule were later discovered, but they do not in fact invalidate it. The possibility of visible excitation was subsequently developed, with the discovery of many fluorescing aromatic molecules, called fluorophores. The identification of these compounds and improved control over the physical phenomenon meant that by 1930 research tools had been developed in biology, e.g., labeling certain tissues and bacteria so as to observe them by fluorescence. The optical microscope as it had existed since the nineteenth century thus gave rise to the fluorescence microscope: a reflection system to supply the light required to excite the fluorophores was added to the standard microscope, together with a suitable filtering system. Fluorescence microscopy soon became an important tool for biological analysis both in vitro and ex vivo, and other applications of light emission were also devised (light-emission phenomena of which fluorescence is a special case, described further in Sect. 7.2). It became possible to study phenomena that could not be observed by standard optical microscopy. Among other things, the location of molecules inside cells, the monitoring of intracellular processes, and the detection of single molecules all became feasible by means of fluorescence microscopy.

  20. Analytic integrable systems: Analytic normalization and embedding flows

    NASA Astrophysics Data System (ADS)

    Zhang, Xiang

    In this paper we mainly study the existence of analytic normalizations and the normal forms of finite-dimensional completely analytically integrable dynamical systems. In more detail, we prove that any completely analytically integrable diffeomorphism F(x) = Bx + f(x) in (C^n, 0), with B having no eigenvalue of modulus 1 and f(x) = O(|x|^2), is locally analytically conjugate to its normal form. Likewise, we prove that any completely analytically integrable differential system ẋ = Ax + f(x) in (C^n, 0), with A having nonzero eigenvalues and f(x) = O(|x|^2), is locally analytically conjugate to its normal form. Furthermore, we prove that any completely analytically integrable diffeomorphism defined on an analytic manifold can be embedded in a completely analytically integrable flow. We note that parts of our results improve those of Moser in J. Moser, The analytic invariants of an area-preserving mapping near a hyperbolic fixed point, Comm. Pure Appl. Math. 9 (1956) 673-692, and of Poincaré in H. Poincaré, Sur l'intégration des équations différentielles du premier ordre et du premier degré, II, Rend. Circ. Mat. Palermo 11 (1897) 193-239. They also improve the results in Xiang Zhang, Analytic normalization of analytic integrable systems and the embedding flows, J. Differential Equations 244 (2008) 1080-1092, in the sense that the linear part of the systems can be nonhyperbolic, and the one in N.T. Zung, Convergence versus integrability in Poincaré-Dulac normal form, Math. Res. Lett. 9 (2002) 217-228, in that our paper presents the concrete expression of the normal form in a restricted case.

  1. Limitless Tools

    ERIC Educational Resources Information Center

    Erickson, Paul W.

    2010-01-01

    With the rushing in of new technologies, facilities must be more flexible and adaptable to a variety of learning approaches. As personalized learning plans emerge with technology, new designs make learning possible anywhere at any time. In addition, the change from print to Web-based materials is creating an environment that focuses on…

  2. The Frontiers of Additive Manufacturing

    SciTech Connect

    Grote, Christopher John

    2016-03-03

    Additive manufacturing, more commonly known as 3-D printing, has become a ubiquitous tool in science for its precise control over mechanical design. For additive manufacturing to work, a 3-D structure is split into thin 2-D slices, and then different physical properties, such as photo-polymerization or melting, are used to grow the sequential layers. This level of control allows not only for devices to be made from a variety of materials, e.g. plastics, metals, and quantum dots, but also for finely controlled structures leading to other novel properties. While 3-D printing is widely used by hobbyists for making models, it also has industrial applications in structural engineering, biological tissue scaffolding, customized electric circuitry, fuel cells, security, and more.

  3. ESIP Earth Sciences Data Analytics (ESDA) Cluster - Work in Progress

    NASA Technical Reports Server (NTRS)

    Kempler, Steven

    2015-01-01

    The purpose of this poster is to promote a common understanding of the usefulness of, and activities that pertain to, data analytics and, more broadly, the data scientist; to facilitate collaborations to better understand the cross usage of heterogeneous datasets and to provide accommodating data analytics expertise, now and as the needs evolve into the future; and to identify gaps that, once filled, will further collaborative activities. Objectives: provide a forum for academic discussions that give ESIP members a better understanding of the various aspects of Earth science data analytics, and bring in guest speakers to describe external efforts and further teach us about the broader use of data analytics. Planned activities: compile use cases generated from specific community needs to cross-analyze heterogeneous data; compile sources of analytics tools, in particular to satisfy the needs of the above data users; examine gaps between needs and sources; examine gaps between needs and community expertise; document the specific data analytics expertise needed to perform Earth science data analytics; and seek graduate data analytics/data science student internship opportunities.

  4. Evaluation of analytical performance based on partial order methodology.

    PubMed

    Carlsen, Lars; Bruggemann, Rainer; Kenessova, Olga; Erzhigitov, Erkin

    2015-01-01

    Classical measurements of performance are typically based on linear scales. However, in analytical chemistry a simple scale may not be sufficient to analyze analytical performance appropriately. Here partial order methodology can be helpful. Within the context described here, partial order analysis can be seen as an ordinal analysis of data matrices, especially to simplify the relative comparisons of objects according to their data profiles (the ordered set of values an object has). Hence, partial order methodology offers a unique possibility to evaluate analytical performance. In the present work, data as provided by laboratories through interlaboratory comparisons or proficiency tests are used as an illustrative example. However, the presented scheme is likewise applicable to the comparison of analytical methods, or simply as a tool for the optimization of an analytical method. The methodology can be applied without presumptions or pretreatment of the analytical data in order to evaluate analytical performance taking all indicators into account simultaneously, thus elucidating a "distance" from the true value. In the present illustrative example it is assumed that the laboratories analyze a given sample several times and subsequently report the mean value, the standard deviation and the skewness, which are used simultaneously for the evaluation of analytical performance. The analyses lead to information concerning (1) a partial ordering of the laboratories, subsequently (2) a "distance" to the reference laboratory, and (3) a classification based on the concept of "peculiar points".
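
    The dominance test at the core of such an ordinal analysis is simple to state in code. In the sketch below, the laboratory profiles are hypothetical and all indicators are oriented so that smaller is better (e.g., |bias|, standard deviation, |skewness|); incomparable pairs are precisely the trade-offs a single linear scale would be forced to resolve arbitrarily.

      import numpy as np

      labs = {  # hypothetical proficiency-test indicator profiles
          "Lab A": np.array([0.10, 0.05, 0.2]),
          "Lab B": np.array([0.12, 0.04, 0.1]),
          "Lab C": np.array([0.30, 0.20, 0.5]),
      }

      def dominates(p, q):
          """True if p is at least as good as q everywhere and better somewhere."""
          return bool(np.all(p <= q) and np.any(p < q))

      for a in labs:
          for b in labs:
              if a != b and dominates(labs[a], labs[b]):
                  print(f"{a} outperforms {b} on all indicators")
      # Lab C is dominated by both others; Lab A and Lab B remain incomparable.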

  5. Polyimide processing additives

    NASA Technical Reports Server (NTRS)

    Pratt, J. R.; St. Clair, T. L.; Burks, H. D.; Stoakley, D. M.

    1987-01-01

    A method has been found for enhancing the melt flow of thermoplastic polyimides during processing. A high molecular weight 422 copoly(amic acid) or copolyimide was fused with approximately 0.05 to 5 pct by weight of a low molecular weight amic acid or imide additive, and this melt was studied by capillary rheometry. Excellent flow and improved composite properties on graphite resulted from the addition of a PMDA-aniline additive to LARC-TPI. Solution viscosity studies imply that amic acid additives temporarily lower molecular weight and, hence, enlarge the processing window. Thus, compositions containing the additive have a lower melt viscosity for a longer time than those unmodified.

  6. [Food additives and healthiness].

    PubMed

    Heinonen, Marina

    2014-01-01

    Additives are used for improving food structure or preventing its spoilage, for example. Many substances used as additives are also naturally present in food. The safety of additives is evaluated according to commonly agreed principles. If high concentrations of an additive cause adverse health effects for humans, a limit of acceptable daily intake (ADI) is set for it. An additive is a risk only when ADI is exceeded. The healthiness of food is measured on the basis of nutrient density and scientifically proven effects.

  7. Analytic streamline calculations on linear tetrahedra

    SciTech Connect

    Diachin, D.P.; Herzog, J.A.

    1997-06-01

    Analytic solutions for streamlines within tetrahedra are used to define operators that accurately and efficiently compute streamlines. The method presented here is based on linear interpolation, and therefore produces exact results for linear velocity fields. In addition, the method requires less computation than the forward Euler numerical method. Results are presented that compare accuracy measurements of the method with forward Euler and fourth order Runge-Kutta applied to both a linear and a nonlinear velocity field.
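
    Within a single tetrahedron, linear interpolation of nodal velocities gives an affine field v(x) = Ax + b, whose streamlines have a closed form. One generic way to evaluate it (our illustration, not necessarily the operators defined in the paper) is through the exponential of the augmented affine system, which also handles singular A:

      import numpy as np
      from scipy.linalg import expm

      # Example affine field v(x) = A @ x + b (values are arbitrary).
      A = np.array([[0.0, -1.0, 0.0],
                    [1.0,  0.0, 0.0],
                    [0.0,  0.0, 0.1]])
      b = np.array([0.0, 0.0, 1.0])

      def streamline_point(x0, t):
          """Exact solution of dx/dt = A x + b: embed the affine system in a
          4x4 linear one and take the matrix exponential."""
          M = np.zeros((4, 4))
          M[:3, :3] = A
          M[:3, 3] = b
          return (expm(M * t) @ np.append(x0, 1.0))[:3]

      x = streamline_point(np.array([1.0, 0.0, 0.0]), t=0.5)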

  8. Quality by design compliant analytical method validation.

    PubMed

    Rozet, E; Ziemons, E; Marini, R D; Boulanger, B; Hubert, Ph

    2012-01-03

    The concept of quality by design (QbD) has recently been adopted for the development of pharmaceutical processes to ensure a predefined product quality. Focus on applying the QbD concept to analytical methods has increased as it is fully integrated within pharmaceutical processes and especially in the process control strategy. In addition, there is the need to switch from the traditional checklist implementation of method validation requirements to a method validation approach that should provide a high level of assurance of method reliability in order to adequately measure the critical quality attributes (CQAs) of the drug product. The intended purpose of analytical methods is directly related to the final decision that will be made with the results generated by these methods under study. The final aim for quantitative impurity assays is to correctly declare a substance or a product as compliant with respect to the corresponding product specifications. For content assays, the aim is similar: making the correct decision about product compliance with respect to their specification limits. It is for these reasons that the fitness of these methods should be defined, as they are key elements of the analytical target profile (ATP). Therefore, validation criteria, corresponding acceptance limits, and method validation decision approaches should be settled in accordance with the final use of these analytical procedures. This work proposes a general methodology to achieve this in order to align method validation within the QbD framework and philosophy. β-Expectation tolerance intervals are implemented to decide about the validity of analytical methods. The proposed methodology is also applied to the validation of analytical procedures dedicated to the quantification of impurities or active product ingredients (API) in drug substances or drug products, and its applicability is illustrated with two case studies.
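
    For normally distributed results, a two-sided β-expectation tolerance interval coincides with a prediction interval for a single future measurement, which makes the validity decision easy to sketch; the recovery data and acceptance limits below are hypothetical.

      import numpy as np
      from scipy import stats

      def beta_expectation_interval(x, beta=0.95):
          """Two-sided beta-expectation tolerance interval for normal data
          (equivalently, a prediction interval for one future result)."""
          x = np.asarray(x, dtype=float)
          n, mean, sd = len(x), x.mean(), x.std(ddof=1)
          k = stats.t.ppf((1 + beta) / 2, df=n - 1) * np.sqrt(1 + 1 / n)
          return mean - k * sd, mean + k * sd

      lo, hi = beta_expectation_interval([98.7, 101.2, 99.5, 100.4, 99.9])
      # Declare the method valid at this level if [lo, hi] lies inside the
      # acceptance limits, e.g. (95.0, 105.0) percent of nominal content.
      valid = lo >= 95.0 and hi <= 105.0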

  9. IBM’s Health Analytics and Clinical Decision Support

    PubMed Central

    Sun, J.; Knoop, S.; Shabo, A.; Carmeli, B.; Sow, D.; Syed-Mahmood, T.; Rapp, W.

    2014-01-01

    Summary. Objectives: This survey explores the role of big data and health analytics developed by IBM in supporting the transformation of healthcare by augmenting evidence-based decision-making. Methods: Some problems in healthcare and strategies for change are described. It is argued that change requires better decisions, which, in turn, require better use of the many kinds of healthcare information. Analytic resources that address each of the information challenges are described. Examples of the role of each of the resources are given. Results: There are powerful analytic tools that utilize the various kinds of big data in healthcare to help clinicians make more personalized, evidenced-based decisions. Such resources can extract relevant information and provide insights that clinicians can use to make evidence-supported decisions. There are early suggestions that these resources have clinical value. As with all analytic tools, they are limited by the amount and quality of data. Conclusion: Big data is an inevitable part of the future of healthcare. There is a compelling need to manage and use big data to make better decisions to support the transformation of healthcare to the personalized, evidence-supported model of the future. Cognitive computing resources are necessary to manage the challenges in employing big data in healthcare. Such tools have been and are being developed. The analytic resources, themselves, do not drive, but support healthcare transformation. PMID:25123736

  10. Design and Implementation of a Learning Analytics Toolkit for Teachers

    ERIC Educational Resources Information Center

    Dyckhoff, Anna Lea; Zielke, Dennis; Bultmann, Mareike; Chatti, Mohamed Amine; Schroeder, Ulrik

    2012-01-01

    Learning Analytics can provide powerful tools for teachers in order to support them in the iterative process of improving the effectiveness of their courses and to collaterally enhance their students' performance. In this paper, we present the theoretical background, design, implementation, and evaluation details of eLAT, a Learning Analytics…

  11. Challenges of Using Learning Analytics Techniques to Support Mobile Learning

    ERIC Educational Resources Information Center

    Arrigo, Marco; Fulantelli, Giovanni; Taibi, Davide

    2015-01-01

    Evaluation of Mobile Learning remains an open research issue, especially as regards the activities that take place outside the classroom. In this context, Learning Analytics can provide answers, and offer the appropriate tools to enhance Mobile Learning experiences. In this poster we introduce a task-interaction framework, using learning analytics…

  12. Making advanced analytics work for you.

    PubMed

    Barton, Dominic; Court, David

    2012-10-01

    Senior leaders who write off the move toward big data as a lot of big talk are making, well, a big mistake. So argue McKinsey's Barton and Court, who worked with dozens of companies to figure out how to translate advanced analytics into nuts-and-bolts practices that affect daily operations on the front lines. The authors offer a useful guide for leaders and managers who want to take a deliberative approach to big data, but who also want to get started now. First, companies must identify the right data for their business, seek to acquire the information creatively from diverse sources, and secure the necessary IT support. Second, they need to build analytics models that are tightly focused on improving performance, making the models only as complex as business goals demand. Third, and most important, companies must transform their capabilities and culture so that the analytical results can be implemented from the C-suite to the front lines. That means developing simple tools that everyone in the organization can understand and teaching people why the data really matter. Embracing big data is as much about changing mind-sets as it is about crunching numbers. Executed with the right care and flexibility, this cultural shift could have payoffs that are, well, bigger than you expect.

  13. Helios: Understanding Solar Evolution Through Text Analytics

    SciTech Connect

    Randazzese, Lucien

    2016-12-02

    This proof-of-concept project focused on developing, testing, and validating a range of bibliometric, text-analytic, and machine-learning-based methods to explore the evolution of three photovoltaic (PV) technologies: Cadmium Telluride (CdTe), Dye-Sensitized solar cells (DSSC), and Multi-junction solar cells. The analytical approach was inspired by the team's previous work on measuring and predicting the scientific prominence of terms and entities within specific research domains. The goal was to create tools that could assist domain-knowledgeable analysts in investigating the history and path of technological developments in general, with a focus on analyzing step-function changes in performance, or “breakthroughs,” in particular. The text-analytics platform developed during this project was dubbed Helios. The project relied on computational methods for analyzing large corpora of technical documents. For this project we ingested technical documents from the following sources into Helios: Thomson Scientific Web of Science (papers), the U.S. Patent & Trademark Office (patents), the U.S. Department of Energy (technical documents), the U.S. National Science Foundation (project funding summaries), and a hand-curated set of full-text documents from Thomson Scientific and other sources.
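
    As a toy illustration of the kind of step-change detection described above (a hypothetical Python sketch, not the Helios platform; the record schema and the doubling threshold are assumptions), one can track the yearly frequency of a technical term across a corpus and flag abrupt jumps:

        from collections import Counter

        def yearly_term_frequency(records, term):
            """records: iterable of (year, abstract_text) pairs (hypothetical schema)."""
            hits, totals = Counter(), Counter()
            for year, text in records:
                totals[year] += 1
                if term.lower() in text.lower():
                    hits[year] += 1
            return {y: hits[y] / totals[y] for y in sorted(totals)}

        def flag_breakthrough_years(freq_by_year, jump=2.0):
            """Flag years where term frequency at least doubles year over year."""
            years = sorted(freq_by_year)
            return [y1 for y0, y1 in zip(years, years[1:])
                    if freq_by_year[y0] > 0 and freq_by_year[y1] / freq_by_year[y0] >= jump]

    Real platforms weight such signals with citation and entity data, but the core step-change detection has this shape.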

  14. Tools for the study of dynamical spacetimes

    NASA Astrophysics Data System (ADS)

    Zhang, Fan

    This thesis covers a range of topics in numerical and analytical relativity, centered around introducing tools and methodologies for the study of dynamical spacetimes. The scope of the studies is limited to classical (as opposed to quantum) vacuum spacetimes described by Einstein's general theory of relativity. The numerical works presented here are carried out within the Spectral Einstein Code (SpEC) infrastructure, while analytical calculations extensively utilize Wolfram's Mathematica program. We begin by examining highly dynamical spacetimes such as binary black hole mergers, which can be investigated using numerical simulations. However, there are difficulties in interpreting the output of such simulations. One difficulty stems from the lack of a canonical coordinate system (henceforth referred to as gauge freedom) and tetrad, against which quantities such as Newman-Penrose Psi4 (usually interpreted as the gravitational wave part of curvature) should be measured. We tackle this problem in Chapter 2 by introducing a set of geometrically motivated coordinates that are independent of the simulation gauge choice, as well as a quasi-Kinnersley tetrad, also invariant under gauge changes in addition to being optimally suited to the task of gravitational wave extraction. Another difficulty arises from the need to condense the overwhelming amount of data generated by the numerical simulations. In order to extract physical information in a succinct and transparent manner, one may define a version of gravitational field lines and field strength using spatial projections of the Weyl curvature tensor. Introduction, investigation and utilization of these quantities will constitute the main content in Chapters 3 through 6. For the last two chapters, we turn to the analytical study of a simpler dynamical spacetime, namely a perturbed Kerr black hole. We will introduce in Chapter 7 a new analytical approximation to the quasi-normal mode (QNM) frequencies, and relate various

  15. Infrared Spectroscopy as a Chemical Fingerprinting Tool

    NASA Technical Reports Server (NTRS)

    Huff, Tim; Munafo, Paul M. (Technical Monitor)

    2002-01-01

    Infrared (IR) spectroscopy is a powerful analytical tool in the chemical fingerprinting of materials. The technique is rapid, reproducible and usually non-invasive. With the appropriate accessories, the technique can be used to examine samples in either a solid, liquid or gas phase. Solid samples of varying sizes and shapes may be used, and with the addition of microscopic IR (microspectroscopy) capabilities, minute materials such as single fibers and threads may be examined. With the addition of appropriate software, microspectroscopy can be used for automated discrete point or compositional surface area mapping, with the latter providing a means to record changes in the chemical composition of a material surface over a defined area. Both aqueous and non-aqueous free-flowing solutions can be analyzed using appropriate IR techniques, as can viscous liquids such as heavy oils and greases. Due to the ability to characterize gaseous samples, IR spectroscopy can also be coupled with thermal processes such as thermogravimetric (TG) analyses to provide both thermal and chemical data in a single run. In this configuration, solids (or liquids) heated in a TG analyzer undergo decomposition, with the evolving gases directed into the IR spectrometer. Thus, information is provided on the thermal properties of a material and the order in which its chemical constituents are broken down during incremental heating. Specific examples of these varied applications will be cited, with data interpretation and method limitations further discussed.

  16. Unexpected Analyte Oxidation during Desorption Electrospray Ionization - Mass Spectrometry

    SciTech Connect

    Pasilis, Sofie P; Kertesz, Vilmos; Van Berkel, Gary J

    2008-01-01

    During the analysis of surface-spotted analytes using desorption electrospray ionization mass spectrometry (DESI-MS), abundant ions are sometimes observed that appear to be the result of oxygen addition reactions. In this investigation, the effects of sample aging, the ambient lab environment, spray voltage, analyte surface concentration, and surface type on this oxidative modification of spotted analytes, exemplified by tamoxifen and reserpine, were studied. Simple exposure of the samples to air and to ambient lighting increased the extent of oxidation. Increased spray voltage also led to increased analyte oxidation, possibly as a result of oxidative species formed electrochemically at the emitter electrode or in the gas phase by discharge processes. These oxidative species are carried by the spray and impinge on and react with the sampled analyte during desorption/ionization. The relative abundance of oxidized species was more significant for deposited analyte having a relatively low surface concentration. Increasing the spray solvent flow rate and adding hydroquinone as a redox buffer to the spray solvent were found to decrease, but not entirely eliminate, analyte oxidation during analysis. The major parameters that minimize or maximize analyte oxidation were identified, and DESI-MS operational recommendations to avoid these unwanted reactions are suggested.

  17. Improved analytic nutation model

    NASA Technical Reports Server (NTRS)

    Yoder, C. F.; Ivins, E. R.

    1988-01-01

    Models describing the earth's nutations are discussed. It is found that the simple model of Sasao et al. (1981) differs from Wahr's (1981) theory by less than 0.3 marcsec term by term if a modern earth structure model is used to evaluate the nutation structure constants. In addition, the effect of oceans is estimated.

  18. Tools for Authentication

    SciTech Connect

    White, G

    2008-07-09

    Many recent Non-proliferation and Arms Control software projects include a software authentication component. In this context, 'authentication' is defined as determining that a software package performs only its intended purpose and performs that purpose correctly and reliably over many years. In addition to visual inspection by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs both to aid the visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary, and have limited extensibility. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool must be based on a complete language compiler infrastructure, that is, one that can parse and digest the full language through its standard grammar. ROSE is precisely such a compiler infrastructure developed within DOE. ROSE is a robust source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C, C++, and FORTRAN. This year, it has been extended to support the automated analysis of binaries. We continue to extend ROSE to address a number of security-specific requirements and apply it to software authentication for Non-proliferation and Arms Control projects. We will give an update on the status of our work.
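
    As a toy analogue of the rule-based flagging described above (ROSE itself is a C/C++/Fortran source-to-source infrastructure; this hypothetical sketch applies the same idea to Python source using the standard ast module), a custom rule can walk the parse tree and report constructs a reviewer should inspect:

        import ast

        SUSPICIOUS_CALLS = {"eval", "exec", "compile", "__import__"}

        def flag_suspicious_constructs(source: str):
            """Return (line, description) pairs for code constructs worth manual review."""
            findings = []
            for node in ast.walk(ast.parse(source)):
                if (isinstance(node, ast.Call)
                        and isinstance(node.func, ast.Name)
                        and node.func.id in SUSPICIOUS_CALLS):
                    findings.append((node.lineno, "call to " + node.func.id + "()"))
            return findings

        print(flag_suspicious_constructs("x = eval(input())"))  # [(1, 'call to eval()')]

    Projects would layer common and project-specific rules of this kind, exactly as the abstract suggests.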

  19. Identification of "At Risk" Students Using Learning Analytics: The Ethical Dilemmas of Intervention Strategies in a Higher Education Institution

    ERIC Educational Resources Information Center

    Lawson, Celeste; Beer, Colin; Rossi, Dolene; Moore, Teresa; Fleming, Julie

    2016-01-01

    Learning analytics is an emerging field in which sophisticated analytic tools are used to inform and improve learning and teaching. Researchers within a regional university in Australia identified an association between interaction and student success in online courses and subsequently developed a learning analytics system aimed at informing…

  20. Polylactides in additive biomanufacturing.

    PubMed

    Poh, Patrina S P; Chhaya, Mohit P; Wunner, Felix M; De-Juan-Pardo, Elena M; Schilling, Arndt F; Schantz, Jan-Thorsten; van Griensven, Martijn; Hutmacher, Dietmar W

    2016-12-15

    New advanced manufacturing technologies under the alias of additive biomanufacturing allow the design and fabrication of a range of products, from pre-operative models, cutting guides, and medical devices to scaffolds. The process of printing in three dimensions of cells, extracellular matrix (ECM), and biomaterials (bioinks, powders, etc.) to generate in vitro and/or in vivo tissue analogue structures has been termed bioprinting. To further advance additive biomanufacturing, there are many lessons to be learned from the wider additive manufacturing (AM) industry, which has progressed tremendously since its introduction into the manufacturing sector. First, this review gives an overview of additive manufacturing and of both industry and academia efforts in addressing specific challenges in AM technologies to drive toward an AM-enabled industrial revolution. Thereafter, considerations of poly(lactides) as a biomaterial in additive biomanufacturing are discussed. Challenges in the wider additive biomanufacturing field are discussed in terms of (a) biomaterials; (b) computer-aided design, engineering, and manufacturing; (c) AM and additive biomanufacturing printer hardware; and (d) system integration. Finally, the outlook for additive biomanufacturing is discussed.

  1. Additive Manufactured Product Integrity

    NASA Technical Reports Server (NTRS)

    Waller, Jess; Wells, Doug; James, Steve; Nichols, Charles

    2017-01-01

    NASA is providing key leadership in an international effort linking NASA and non-NASA resources to speed adoption of additive manufacturing (AM) to meet NASA's mission goals. Participants include industry, NASA's space partners, other government agencies, standards organizations and academia. Nondestructive Evaluation (NDE) is identified as a universal need for all aspects of additive manufacturing.

  2. Analytic Model For Estimation Of Cold Bulk Metal Forming Simulations

    SciTech Connect

    Skunca, Marko; Keran, Zdenka; Math, Miljenko

    2007-05-17

    Numerical simulation of bulk metal forming plays an important role in predicting key parameters in cold forging. Comparison of numerical and experimental data is of great importance, but there is always a need for more universal analytical tools. Therefore, many papers, besides the experiment and simulation of a particular bulk metal forming technology, include an analytic model. In this paper, an analytical model for evaluating commercially available simulation program packages is proposed. Based on the elementary theory of plasticity and dependent only on geometry, the model represents a good analytical reference for estimating given modeling preferences, such as element types, solvers, remeshing influence, and many others. The obtained geometry-dependent stress fields, compared with numerical data, give a clear picture of the numerical possibilities and limitations of a particular modeling program package.

  3. Analytical pharmacology: the impact of numbers on pharmacology.

    PubMed

    Kenakin, Terry; Christopoulos, Arthur

    2011-04-01

    Analytical pharmacology strives to compare pharmacological data to detailed quantitative models. The most famous tool in this regard is the Black/Leff operational model, which can be used to quantify agonism in a test system and predict it in any other system. Here we give examples of how and where analytical pharmacology has been used to classify drugs and predict mechanism of action in pharmacology. We argue for the importance of analytical pharmacology in drug classification and in prediction of drug mechanisms of action. Although some of the specifics of Black's models have been updated to account for new developments, the principles of analytical pharmacology should shape drug discovery for many years to come.
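
    The Black/Leff operational model mentioned above has a standard closed form; a minimal sketch follows (parameter values in the example are illustrative only):

        def operational_response(A, Em, tau, KA, n=1.0):
            """Black/Leff operational model: E = Em*(tau*A)^n / ((KA + A)^n + (tau*A)^n).

            A: agonist concentration; Em: maximal system response; tau: efficacy;
            KA: agonist-receptor dissociation constant; n: transducer slope.
            """
            return Em * (tau * A) ** n / ((KA + A) ** n + (tau * A) ** n)

        # A low-efficacy (partial) agonist plateaus below the system maximum Em:
        for A in (1e-9, 1e-8, 1e-7, 1e-6):
            print(f"[A]={A:.0e} M -> E={operational_response(A, Em=100, tau=2, KA=1e-7):.1f}")

    Fitting tau and KA in a test system is what allows agonism to be predicted in other systems, as the abstract argues.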

  4. Supporting tool suite for production proteomics

    PubMed Central

    Ma, Ze-Qiang; Tabb, David L.; Burden, Joseph; Chambers, Matthew C.; Cox, Matthew B.; Cantrell, Michael J.; Ham, Amy-Joan L.; Litton, Michael D.; Oreto, Michael R.; Schultz, William C.; Sobecki, Scott M.; Tsui, Tina Y.; Wernke, Gregory R.; Liebler, Daniel C.

    2011-01-01

    Summary: The large amount of data produced by proteomics experiments requires effective bioinformatics tools for the integration of data management and data analysis. Here we introduce a suite of tools developed at Vanderbilt University to support production proteomics. We present the Backup Utility Service tool for automated instrument file backup and the ScanSifter tool for data conversion. We also describe a queuing system to coordinate identification pipelines and the File Collector tool for batch copying analytical results. These tools are individually useful but collectively reinforce each other. They are particularly valuable for proteomics core facilities or research institutions that need to manage multiple mass spectrometers. With minor changes, they could support other types of biomolecular resource facilities. Availability and Implementation: Source code and executable versions are available under Apache 2.0 License at http://www.vicc.org/jimayersinstitute/data/ Contact: daniel.liebler@vanderbilt.edu PMID:21965817
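
    The flavor of the file-handling utilities described here can be sketched in a few lines (a hypothetical sketch, not the Vanderbilt source; the paths, file suffix, and copy policy are assumptions):

        import shutil
        from pathlib import Path

        def backup_new_raw_files(instrument_dir, archive_dir, suffix=".raw"):
            """Copy instrument files not yet present in the archive (one backup pass)."""
            archive = Path(archive_dir)
            archive.mkdir(parents=True, exist_ok=True)
            copied = []
            for src in Path(instrument_dir).glob("*" + suffix):
                dst = archive / src.name
                if not dst.exists():
                    shutil.copy2(src, dst)  # copy2 preserves file timestamps
                    copied.append(src.name)
            return copied

    A production service would add scheduling, checksum verification, and logging on top of such a pass.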

  5. Nucleic acid tool enzymes-aided signal amplification strategy for biochemical analysis: status and challenges.

    PubMed

    Qing, Taiping; He, Dinggeng; He, Xiaoxiao; Wang, Kemin; Xu, Fengzhou; Wen, Li; Shangguan, Jingfang; Mao, Zhengui; Lei, Yanli

    2016-04-01

    Owing to their highly efficient catalytic effects and substrate specificity, the nucleic acid tool enzymes are applied as 'nano-tools' for manipulating different nucleic acid substrates both in the test-tube and in living organisms. In addition to the function as molecular scissors and molecular glue in genetic engineering, the application of nucleic acid tool enzymes in biochemical analysis has also been extensively developed in the past few decades. Used as amplifying labels for biorecognition events, the nucleic acid tool enzymes are mainly applied in nucleic acids amplification sensing, as well as the amplification sensing of biorelated variations of nucleic acids. With the introduction of aptamers, which can bind different target molecules, the nucleic acid tool enzymes-aided signal amplification strategies can also be used to sense non-nucleic targets (e.g., ions, small molecules, proteins, and cells). This review describes and discusses the amplification strategies of nucleic acid tool enzymes-aided biosensors for biochemical analysis applications. Various analytes, including nucleic acids, ions, small molecules, proteins, and cells, are reviewed briefly. This work also addresses the future trends and outlooks for signal amplification in nucleic acid tool enzymes-aided biosensors.

  6. The Case for Assessment Analytics

    ERIC Educational Resources Information Center

    Ellis, Cath

    2013-01-01

    Learning analytics is a relatively new field of inquiry and its precise meaning is both contested and fluid (Johnson, Smith, Willis, Levine & Haywood, 2011; LAK, n.d.). Ferguson (2012) suggests that the best working definition is that offered by the first Learning Analytics and Knowledge (LAK) conference: "the measurement, collection,…

  7. Analytical methods in sphingolipidomics: Quantitative and profiling approaches in food analysis.

    PubMed

    Canela, Núria; Herrero, Pol; Mariné, Sílvia; Nadal, Pedro; Ras, Maria Rosa; Rodríguez, Miguel Ángel; Arola, Lluís

    2016-01-08

    In recent years, sphingolipidomics has emerged as an interesting omic science that encompasses the study of the full sphingolipidome characterization, content, structure and activity in cells, tissues or organisms. Like other omics, it has the potential to impact biomarker discovery, drug development and systems biology knowledge. Concretely, dietary food sphingolipids have gained considerable importance due to their extensively reported bioactivity. Because of the complexity of this lipid family and their diversity among foods, powerful analytical methodologies are needed for their study. The analytical tools developed in the past have been improved with the enormous advances made in recent years in mass spectrometry (MS) and chromatography, which allow the convenient and sensitive identification and quantitation of sphingolipid classes and form the basis of current sphingolipidomics methodologies. In addition, novel hyphenated nuclear magnetic resonance (NMR) strategies, new ionization strategies, and MS imaging are outlined as promising technologies to shape the future of sphingolipid analyses. This review traces the analytical methods of sphingolipidomics in food analysis concerning sample extraction, chromatographic separation, the identification and quantification of sphingolipids by MS and their structural elucidation by NMR.

  8. Analytical solution of the simplified spherical harmonics equations in spherical turbid media

    NASA Astrophysics Data System (ADS)

    Edjlali, Ehsan; Bérubé-Lauzière, Yves

    2016-10-01

    We present for the first time an analytical solution for the simplified spherical harmonics equations (so-called SPN equations) in the case of a steady-state isotropic point source inside a spherical homogeneous absorbing and scattering medium. The SPN equations provide a reliable approximation to the radiative transfer equation for describing light transport inside turbid media. The SPN equations consist of a set of coupled partial differential equations, and an eigendecomposition is used to obtain a set of decoupled equations, each resembling the heat equation in the Laplace domain. The equations are solved for the realistic partial-reflection boundary conditions accounting for the difference in refractive indices between the turbid medium and its environment (air), as occurs in practical cases of interest in biomedical optics. Specifically, we provide the complete solution methodology for the SP3 equations, which is readily applicable to higher orders as well, and also give results for the SP5 equations. This computationally inexpensive solution is investigated for different optical properties of the turbid medium. For validation, the solution is compared to the analytical solution of the diffusion equation and to gold-standard Monte Carlo simulation results. The SP3 and SP5 analytical solutions prove to be in good agreement with the Monte Carlo results. This work provides an additional tool for validating numerical solutions of the SPN equations for curved geometries.
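
    For orientation, the structure of the decoupled problem can be written compactly (an illustrative form only; the paper's full treatment with partial-reflection boundary conditions in the finite sphere is more involved). After the eigendecomposition, each component satisfies a diffusion-type equation

        \left(\nabla^{2}-\mu_i^{2}\right)\varphi_i(r) = -\frac{S_i}{D_i}\,\delta(\mathbf{r}),

    whose isotropic point-source solution in an infinite homogeneous medium is

        \varphi_i(r) = \frac{S_i}{4\pi D_i}\,\frac{e^{-\mu_i r}}{r},

    with the finite spherical geometry entering through the boundary conditions.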

  9. Big data and high-performance analytics in structural health monitoring for bridge management

    NASA Astrophysics Data System (ADS)

    Alampalli, Sharada; Alampalli, Sandeep; Ettouney, Mohammed

    2016-04-01

    Structural Health Monitoring (SHM) can be a vital tool for effective bridge management. Combining large data sets from multiple sources to create a data-driven decision-making framework is crucial for the success of SHM. This paper presents a big data analytics framework that combines multiple data sets, correlated with functional relatedness, to convert data into actionable information that empowers risk-based decision-making. The integrated data environment incorporates near real-time streams of semi-structured data from remote sensors, historical visual inspection data, and observations from structural analysis models to monitor, assess, and manage risks associated with aging bridge inventories. Accelerated processing of the datasets is made possible by four technologies: cloud computing, relational database processing, NoSQL database support, and in-memory analytics. The framework is being validated on a railroad corridor that can be subjected to multiple hazards. The framework enables computation of reliability indices for critical bridge components and individual bridge spans. In addition, the framework includes a risk-based decision-making process that enumerates the costs and consequences of poor bridge performance at the span and network levels when rail networks are exposed to natural hazard events such as floods and earthquakes. Big data and high-performance analytics enable insights that assist bridge owners in addressing problems faster.
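
    The abstract does not define its reliability index; for reference, the classical first-order index for an independent, normally distributed resistance R and load effect S is

        \beta = \frac{\mu_R - \mu_S}{\sqrt{\sigma_R^{2} + \sigma_S^{2}}},

    where a larger beta means a smaller probability that load exceeds resistance.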

  10. THE FUTURE OF SUSTAINABLE MANAGEMENT APPROACHES AND REVITALIZATION TOOLS-ELECTRONIC (SMARTE): 2006-2010

    EPA Science Inventory

    SMARTe is being developed to give stakeholders information resources, analytical tools, communication strategies, and a decision analysis approach to be able to make better decisions regarding future uses of property. The development of the communication tools and decision analys...

  11. Avionics System Architecture Tool

    NASA Technical Reports Server (NTRS)

    Chau, Savio; Hall, Ronald; Traylor, Marcus; Whitfield, Adrian

    2005-01-01

    Avionics System Architecture Tool (ASAT) is a computer program intended for use during the avionics-system-architecture- design phase of the process of designing a spacecraft for a specific mission. ASAT enables simulation of the dynamics of the command-and-data-handling functions of the spacecraft avionics in the scenarios in which the spacecraft is expected to operate. ASAT is built upon I-Logix Statemate MAGNUM, providing a complement of dynamic system modeling tools, including a graphical user interface (GUI), model-checking capabilities, and a simulation engine. ASAT augments this with a library of predefined avionics components and additional software to support building and analyzing avionics hardware architectures using these components.

  12. Automated Standard Hazard Tool

    NASA Technical Reports Server (NTRS)

    Stebler, Shane

    2014-01-01

    The current system used to generate standard hazard reports is considered cumbersome and iterative. This study defines a structure for this system's process in a clear, algorithmic way so that standard hazard reports and basic hazard analysis may be completed using a centralized, web-based computer application. To accomplish this task, a test server is used to host a prototype of the tool during development. The prototype is configured to easily integrate into NASA's current server systems with minimal alteration. Additionally, the tool is easily updated and provides NASA with a system that may grow to accommodate future requirements and possibly, different applications. The results of this project are reflected in positive, subjective reviews completed by payload providers and NASA Safety and Mission Assurance personnel. Ideally, this prototype will increase interest in the concept of standard hazard automation and lead to the full-scale production of a user-ready application.

  13. Visual analytics for multimodal social network analysis: a design study with social scientists.

    PubMed

    Ghani, Sohaib; Kwon, Bum Chul; Lee, Seungyoon; Yi, Ji Soo; Elmqvist, Niklas

    2013-12-01

    Social network analysis (SNA) is becoming increasingly concerned not only with actors and their relations, but also with distinguishing between different types of such entities. For example, social scientists may want to investigate asymmetric relations in organizations with strict chains of command, or incorporate non-actors such as conferences and projects when analyzing coauthorship patterns. Multimodal social networks are those where actors and relations belong to different types, or modes, and multimodal social network analysis (mSNA) is accordingly SNA for such networks. In this paper, we present a design study that we conducted with several social scientist collaborators on how to support mSNA using visual analytics tools. Based on an open-ended, formative design process, we devised a visual representation called parallel node-link bands (PNLBs) that splits modes into separate bands and renders connections between adjacent ones, similar to the list view in Jigsaw. We then used the tool in a qualitative evaluation involving five social scientists whose feedback informed a second design phase that incorporated additional network metrics. Finally, we conducted a second qualitative evaluation with our social scientist collaborators that provided further insights on the utility of the PNLBs representation and the potential of visual analytics for mSNA.

  14. ASSESS (Analytic System and Software for Evaluating Safeguards and Security) update: Current status and future developments

    SciTech Connect

    Al-Ayat, R.A. ); Cousins, T.D. ); Hoover, E.R. )

    1990-07-15

    The Analytic System and Software for Evaluating Safeguards and Security (ASSESS) has been released for use by DOE field offices and their contractors. In October 1989, we offered a prototype workshop to selected representatives of the DOE community. Based on the prototype results, we held the first training workshop at the Central Training Academy in January 1990. Four additional workshops are scheduled for FY 1990. ASSESS is a state-of-the-art analytical tool for management to conduct integrated evaluation of safeguards systems at facilities handling special nuclear material. Currently, ASSESS focuses on the threat of theft/diversion of special nuclear material by insiders, outsiders, and a special form of insider/outsider collusion. ASSESS also includes a neutralization module. Development of the tool is continuing. Plans are underway to expand the capabilities of ASSESS to evaluate against violent insiders, to validate the databases, to expand the neutralization module, and to assist in demonstrating compliance with DOE Material Control and Accountability (MC&A) Order 5633.3. These new capabilities include the ability to compute a weighted average for performance capability against a spectrum of insider adversaries, conduct defense-in-depth analyses, and analyze against protracted theft scenarios. As they become available, these capabilities will be incorporated into our training program. ASSESS is being developed jointly by Lawrence Livermore and Sandia National Laboratories under the sponsorship of the Department of Energy (DOE) Office of Safeguards and Security.

  15. Group Analytic Psychotherapy in Brazil.

    PubMed

    Penna, Carla; Castanho, Pablo

    2015-10-01

    Group analytic practice in Brazil began quite early. Highly influenced by the Argentinean Pichon-Rivière, it enjoyed a major development from the 1950s to the early 1980s. Beginning in the 1970s, different factors undermined its development and eventually led to its steep decline. From the mid-1980s on, the number of people looking for either group analytic psychotherapy or group analytic training decreased considerably. Group analytic psychotherapy societies struggled to survive and most of them had to close their doors in the 1990s and the following decade. Psychiatric reform and the new public health system have stimulated a new demand for groups in Brazil. Developments in the public and not-for-profit sectors, combined with theoretical and practical research in universities, present promising new perspectives for group analytic psychotherapy in Brazil nowadays.

  16. Polyimide processing additives

    NASA Technical Reports Server (NTRS)

    Fletcher, James C. (Inventor); Pratt, J. Richard (Inventor); St. Clair, Terry L. (Inventor); Stoakley, Diane M. (Inventor); Burks, Harold D. (Inventor)

    1992-01-01

    A process for preparing polyimides having enhanced melt flow properties is described. The process consists of heating a mixture of a high molecular weight poly-(amic acid) or polyimide with a low molecular weight amic acid or imide additive in the range of 0.05 to 15 percent by weight of additive. The polyimide powders so obtained show improved processability, as evidenced by lower melt viscosity by capillary rheometry. Likewise, films prepared from mixtures of polymers with additives show improved processability with earlier onset of stretching by TMA.

  17. Polyimide processing additives

    NASA Technical Reports Server (NTRS)

    Pratt, J. Richard (Inventor); St. Clair, Terry L. (Inventor); Stoakley, Diane M. (Inventor); Burks, Harold D. (Inventor)

    1993-01-01

    A process for preparing polyimides having enhanced melt flow properties is described. The process consists of heating a mixture of a high molecular weight poly-(amic acid) or polyimide with a low molecular weight amic acid or imide additive in the range of 0.05 to 15 percent by weight of the additive. The polyimide powders so obtained show improved processability, as evidenced by lower melt viscosity by capillary rheometry. Likewise, films prepared from mixtures of polymers with additives show improved processability with earlier onset of stretching by TMA.

  18. Chemiluminescence microarrays in analytical chemistry: a critical review.

    PubMed

    Seidel, Michael; Niessner, Reinhard

    2014-09-01

    Multi-analyte immunoassays on microarrays and on multiplex DNA microarrays have been described for quantitative analysis of small organic molecules (e.g., antibiotics, drugs of abuse, small molecule toxins), proteins (e.g., antibodies or protein toxins), and microorganisms, viruses, and eukaryotic cells. In analytical chemistry, multi-analyte detection by use of analytical microarrays has become an innovative research topic because of the possibility of generating several sets of quantitative data for different analyte classes in a short time. Chemiluminescence (CL) microarrays are powerful tools for rapid multiplex analysis of complex matrices. A wide range of applications for CL microarrays is described in the literature dealing with analytical microarrays. The motivation for this review is to summarize the current state of CL-based analytical microarrays. Combining analysis of different compound classes on CL microarrays reduces analysis time, cost of reagents, and use of laboratory space. Applications are discussed, with examples from food safety, water safety, environmental monitoring, diagnostics, forensics, toxicology, and biosecurity. The potential and limitations of research on multiplex analysis by use of CL microarrays are discussed in this review.

  19. A Joint Analytic Method for Estimating Aquitard Hydraulic Parameters.

    PubMed

    Zhuang, Chao; Zhou, Zhifang; Illman, Walter A

    2017-01-10

    The vertical hydraulic conductivity (Kv), elastic (Sske), and inelastic (Sskv) skeletal specific storage of aquitards are three of the most critical parameters in land subsidence investigations. Two new analytic methods are proposed to estimate the three parameters. The first analytic method is based on a new concept of delay time ratio for estimating Kv and Sske of an aquitard subject to long-term stable, cyclic hydraulic head changes at boundaries. The second analytic method estimates the Sskv of the aquitard subject to linearly declining hydraulic heads at boundaries. Both methods are based on analytical solutions for flow within the aquitard, and they are jointly employed to obtain the three parameter estimates. This joint analytic method is applied to estimate the Kv, Sske, and Sskv of a 34.54-m thick aquitard for which the deformation progress has been recorded by an extensometer located in Shanghai, China. The estimated results are then calibrated by PEST (Doherty 2005), a parameter estimation code coupled with a one-dimensional aquitard-drainage model. The Kv and Sske estimated by the joint analytic method are quite close to those estimated via inverse modeling and performed much better in simulating elastic deformation than the estimates obtained from the stress-strain diagram method of Ye and Xue (2005). The newly proposed joint analytic method is an effective tool that provides reasonable initial values for calibrating land subsidence models.
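
    A classical building block behind such amplitude-and-delay methods (shown for orientation; the paper's delay-time-ratio derivation differs in detail) is the damped propagation of a sinusoidal boundary head into a one-dimensional aquitard with hydraulic diffusivity c = Kv/Sske:

        h(z,t) = A\,e^{-z\sqrt{\omega/(2c)}}\,\sin\!\left(\omega t - z\sqrt{\omega/(2c)}\right),

    so the amplitude attenuation and the time lag measured at depth z jointly constrain Kv and Sske.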

  20. Visual Analytics for Law Enforcement: Deploying a Service-Oriented Analytic Framework for Web-based Visualization

    SciTech Connect

    Dowson, Scott T.; Bruce, Joseph R.; Best, Daniel M.; Riensche, Roderick M.; Franklin, Lyndsey; Pike, William A.

    2009-04-14

    This paper presents key components of the Law Enforcement Information Framework (LEIF) that provides communications, situational awareness, and visual analytics tools in a service-oriented architecture supporting web-based desktop and handheld device users. LEIF simplifies interfaces and visualizations of well-established visual analytical techniques to improve usability. Advanced analytics capability is maintained by enhancing the underlying processing to support the new interface. LEIF development is driven by real-world user feedback gathered through deployments at three operational law enforcement organizations in the US. LEIF incorporates a robust information ingest pipeline supporting a wide variety of information formats. LEIF also insulates interface and analytical components from information sources making it easier to adapt the framework for many different data repositories.

  1. Analytical services contract reform alternatives project

    SciTech Connect

    Hunt, J.W.; Fox, M.R.; Kristofzski, J.G.; Minette, M.J.

    1995-03-23

    Westinghouse Hanford Company (WHC) was directed by the U.S. Department of Energy, Richland Operations Office (DOE-RL) to examine the feasibility of outsourcing all or part of its laboratory and analytical functions as part of a contract reform effort. The analytical services provided by WHC were found to be significantly greater than those of a typical environmental laboratory, which provides sample analysis based on a simple 'sample in, report out' model. In addition to high-volume production analysis, the work scope includes special analytical services, technical consulting, sample handling and disposition, and special material preparations. Numerous broad-ranging potential contract reform alternatives were identified and categorized into seven main alternatives with associated sub-alternatives. The issues associated with the alternatives varied significantly. Fifteen issues were identified and described, including human resources, contract, and procurement areas. Readers of this report will perhaps identify additional alternatives and/or issues. In addressing the issues, it was determined that those pertaining to labor relations and procurement require major policy resolutions by WHC/DOE senior management before meaningful assumptions can be established for cost/benefit analyses of the seven alternatives. Further review was therefore stopped without economic analyses or a recommendation for any specific alternative. Accordingly, this report is intended to fulfill the requirements of RL Milestone AS-95-016.

  2. Food Additives and Hyperkinesis

    ERIC Educational Resources Information Center

    Wender, Ester H.

    1977-01-01

    The hypothesis that food additives are causally associated with hyperkinesis and learning disabilities in children is reviewed, and available data are summarized. Available from: American Medical Association 535 North Dearborn Street Chicago, Illinois 60610. (JG)

  3. Smog control fuel additives

    SciTech Connect

    Lundby, W.

    1993-06-29

    A method is described for controlling, reducing, or eliminating ozone and related smog resulting from photochemical reactions between ozone and automotive or industrial gases, comprising the addition of iodine or compounds of iodine to hydrocarbon-based fuels prior to or during combustion, in an amount of about 1 part iodine per 240 to 10,000,000 parts fuel by weight, to be accomplished by: (a) the addition of these inhibitors during or after the refining or manufacturing process of liquid fuels; (b) the production of these inhibitors for addition into fuel tanks, such as automotive or industrial tanks; or (c) the addition of these inhibitors into combustion chambers of equipment utilizing solid fuels, for the purpose of reducing ozone.

  4. Process Mapping: Tools, Techniques, & Critical Success Factors.

    ERIC Educational Resources Information Center

    Kalman, Howard K.

    2002-01-01

    Explains process mapping as an analytical tool and a process intervention that performance technologists can use to improve human performance by reducing error variance. Highlights include benefits of process mapping; and critical success factors, including organizational readiness, time commitment by participants, and the availability of a…

  5. A Tool To Assess Journal Price Discrimination.

    ERIC Educational Resources Information Center

    Meyer, Richard W.

    2001-01-01

    The author designed an experiment to determine whether periodical price inflation might be dampened by electronic scholarship. This article discusses results of an econometric analysis of prices for 859 periodical titles for three consecutive years, and concludes with a description of an analytical tool that may be used to assess journal prices.…

  6. Databases and tools in glycobiology.

    PubMed

    Artemenko, Natalia V; McDonald, Andrew G; Davey, Gavin P; Rudd, Pauline M

    2012-01-01

    Glycans are crucial to the functioning of multicellular organisms. They may also play a role as mediators between host and parasite or symbiont. As many proteins (>50%) are posttranslationally modified by glycosylation, this mechanism is considered to be the most widespread posttranslational modification in eukaryotes. These surface modifications alter and regulate structure and biological activities/functions of proteins/biomolecules as they are largely involved in the recognition process of the appropriate structure in order to bind to the target cells. Consequently, the recognition of glycans on cellular surfaces plays a crucial role in the promotion or inhibition of various diseases and, therefore, glycosylation itself is considered to be a critical protein quality control attribute for commercial therapeutics, which is one of the fastest growing segments in the pharmaceutical industry. With the development of glycobiology as a separate discipline, a number of databases and tools became available in a similar way to other well-established "omics." Alleviating the recognized shortcomings of the available tools for data storage and retrieval is one of the highest priorities of the international glycoinformatics community. In the last decade, major efforts have been made, by leading scientific groups, towards the integration of a number of major databases and tools into a single portal, which would act as a centralized data repository for glycomics, equipped with a number of comprehensive analytical tools for data systematization, analysis, and comparison. This chapter provides an overview of the most important carbohydrate-related databases and glycoinformatic tools.

  7. Health informatics and analytics - building a program to integrate business analytics across clinical and administrative disciplines.

    PubMed

    Tremblay, Monica Chiarini; Deckard, Gloria J; Klein, Richard

    2016-07-01

    Health care organizations must develop integrated health information systems to respond to the numerous government mandates driving the movement toward reimbursement models emphasizing value-based and accountable care. Success in this transition requires integrated data analytics, supported by the combination of health informatics, interoperability, business process design, and advanced decision support tools. This case study presents the development of a master's level cross- and multidisciplinary informatics program offered through a business school. The program provides students from diverse backgrounds with the knowledge, leadership, and practical application skills of health informatics, information systems, and data analytics that bridge the interests of clinical and nonclinical professionals. This case presents the actions taken and challenges encountered in navigating intra-university politics, specifying curriculum, recruiting the requisite interdisciplinary faculty, innovating the educational format, managing students with diverse educational and professional backgrounds, and balancing multiple accreditation agencies.

  8. Analytical Chemistry Laboratory progress report for FY 1985

    SciTech Connect

    Green, D.W.; Heinrich, R.R.; Jensen, K.J.

    1985-12-01

    The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of technical support services to the scientific and engineering programs at ANL. In addition, ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques. The purpose of this report is to summarize the technical and administrative activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year 1985 (October 1984 through September 1985). This is the second annual report for the ACL. 4 figs., 1 tab.

  9. Tiered analytics for purity assessment of macrocyclic peptides in drug discovery: Analytical consideration and method development.

    PubMed

    Qian Cutrone, Jingfang Jenny; Huang, Xiaohua Stella; Kozlowski, Edward S; Bao, Ye; Wang, Yingzi; Poronsky, Christopher S; Drexler, Dieter M; Tymiak, Adrienne A

    2017-05-10

    Synthetic macrocyclic peptides with natural and unnatural amino acids have gained considerable attention from a number of pharmaceutical/biopharmaceutical companies in recent years as a promising approach to drug discovery, particularly for targets involving protein-protein or protein-peptide interactions. Analytical scientists charged with characterizing these leads face multiple challenges including dealing with a class of complex molecules with the potential for multiple isomers and variable charge states and no established standards for acceptable analytical characterization of materials used in drug discovery. In addition, due to the lack of intermediate purification during solid phase peptide synthesis, the final products usually contain a complex profile of impurities. In this paper, practical analytical strategies and methodologies were developed to address these challenges, including a tiered approach to assessing the purity of macrocyclic peptides at different stages of drug discovery. Our results also showed that successful progression and characterization of a new drug discovery modality benefited from active analytical engagement, focusing on fit-for-purpose analyses and leveraging a broad palette of analytical technologies and resources.

  10. System analysis: Developing tools for the future

    SciTech Connect

    De Jong, K.; clever, J.; Draper, J.V.; Davies, B.; Lonks, A.

    1996-02-01

    This report introduces and evaluates system analysis tools that were developed, or are under development, for the Robotics Technology Development Program (RTDP). Additionally, it discusses system analysis work, completed using these tools, on the retrieval of waste from underground storage tanks on the Hanford Reservation near Richland, Washington. The tools developed and evaluated include a mixture of commercially available tools adapted to RTDP requirements and some tools developed in-house. The tools covered in this report include a Process Diagramming Tool, a Cost Modeling Tool, an Amortization Modeling Tool, a graphical simulation linked to the Cost Modeling Tool, a decision assistance tool, and a system thinking tool. Additionally, the importance of performance testing to the RTDP is discussed, along with the results of the testing performed. Further, the results of the Tank Waste Retrieval (TWR) System Diagram, the TWR Operations Cost Model, and the TWR Amortization Model are presented, and the implications of the results are discussed. Finally, the RTDP system analysis tools are assessed and some recommendations are made regarding continuing development of the tools and process.

  11. CoNNECT: Data Analytics for Energy Efficient Communities

    SciTech Connect

    Omitaomu, Olufemi A; Bhaduri, Budhendra L; Kodysh, Jeffrey B

    2012-01-01

    Energy efficiency is the lowest-cost option being promoted for achieving a sustainable energy policy. Thus, there have been some innovations to reduce residential and commercial energy usage. There have also been calls to the utility companies to give customers access to timely, useful, and actionable information about their energy use, in order to unleash additional innovations in homes and businesses. Hence, some web-based tools have been developed for the public to access and compare energy usage data. To advance these efforts, we propose a data analytics framework called Citizen Engagement for Energy Efficient Communities (CoNNECT). On the one hand, CoNNECT will help households understand (i) the patterns in their energy consumption over time and how those patterns correlate with weather data, (ii) how their monthly consumption compares to that of other households living in houses of similar size and age within the same geographic areas, and (iii) what other customers are doing to reduce their energy consumption. We hope that the availability of such data and analysis to the public will facilitate energy efficiency efforts in residential buildings. These capabilities form the public portal of the CoNNECT framework. On the other hand, CoNNECT will help the utility companies better understand their customers by making available to the utilities additional datasets that they naturally do not have access to, which could help them develop focused services for their customers. These additional capabilities are part of the utility portal of the CoNNECT framework. In this paper, we describe the CoNNECT framework, the sources of the data used in its development, the functionalities of both the public and utility portals, and the application of empirical mode decomposition for decomposing usage signals into mode functions, with the hope that such mode functions could help in clustering customers into unique groups and in developing guidelines for energy
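
    A minimal sketch of that decomposition-and-clustering step (assuming the third-party PyEMD and scikit-learn packages; the per-IMF energy features and the cluster count are illustrative choices, not the paper's):

        import numpy as np
        from PyEMD import EMD
        from sklearn.cluster import KMeans

        def emd_features(usage, n_imfs=3):
            """Decompose one customer's usage series; return the first IMF energies."""
            imfs = EMD().emd(np.asarray(usage, dtype=float))
            energies = [float(np.sum(imf ** 2)) for imf in imfs[:n_imfs]]
            energies += [0.0] * (n_imfs - len(energies))  # pad if fewer IMFs found
            return np.array(energies)

        def cluster_customers(usage_matrix, k=4):
            """usage_matrix: customers x time steps; returns a cluster label per customer."""
            features = np.vstack([emd_features(row) for row in usage_matrix])
            return KMeans(n_clusters=k, n_init=10).fit_predict(features)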

  12. Visualizing Flow of Uncertainty through Analytical Processes.

    PubMed

    Wu, Yingcai; Yuan, Guo-Xun; Ma, Kwan-Liu

    2012-12-01

    Uncertainty can arise in any stage of a visual analytics process, especially in data-intensive applications with a sequence of data transformations. Additionally, throughout the process of multidimensional, multivariate data analysis, uncertainty due to data transformation and integration may split, merge, increase, or decrease. This dynamic characteristic, along with other features of uncertainty, poses a great challenge to effective uncertainty-aware visualization. This paper presents a new framework for modeling uncertainty and characterizing the evolution of the uncertainty information through analytical processes. Based on the framework, we have designed a visual metaphor called uncertainty flow to visually and intuitively summarize how uncertainty information propagates over the whole analysis pipeline. Our system allows analysts to interact with and analyze the uncertainty information at different levels of detail. Three experiments were conducted to demonstrate the effectiveness and intuitiveness of our design.

  13. Bibliometric mapping: eight decades of analytical chemistry, with special focus on the use of mass spectrometry.

    PubMed

    Waaijer, Cathelijn J F; Palmblad, Magnus

    2015-01-01

    In this Feature we use automatic bibliometric mapping tools to visualize the history of analytical chemistry from the 1920s until the present. In particular, we have focused on the application of mass spectrometry in different fields. The analysis shows major shifts in research focus and use of mass spectrometry. We conclude by discussing the application of bibliometric mapping and visualization tools in analytical chemists' research.

  14. Ball Bearing Analysis with the ORBIS Tool

    NASA Technical Reports Server (NTRS)

    Halpin, Jacob D.

    2016-01-01

    Ball bearing design is critical to the success of aerospace mechanisms. Key bearing performance parameters, such as load capability, stiffness, torque, and life, all depend on accurate determination of the internal load distribution. Hence, a good analytical bearing tool that provides both comprehensive capabilities and reliable results becomes a significant asset to the engineer. This paper introduces the ORBIS bearing tool. A discussion of key modeling assumptions and a technical overview is provided. Numerous validation studies and case studies using the ORBIS tool are presented. All results suggest that the ORBIS code correlates closely with established predictions of bearing internal load distributions, stiffness, deflections, and stresses.
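
    The core internal load distribution calculation that such tools perform can be sketched for the simplest case (a generic illustration, not the ORBIS implementation: Hertzian point contact, rigid rings, pure radial load, zero clearance; the load-deflection constant K is a made-up value):

        import math

        def ball_loads(F_r, n_balls, K=8.0e9):
            """Per-ball loads (N) satisfying radial equilibrium; Q = K * delta^1.5."""
            psi = [2 * math.pi * i / n_balls for i in range(n_balls)]

            def residual(delta_r):  # net radial force error for a trial ring deflection
                total = 0.0
                for p in psi:
                    d = delta_r * math.cos(p)      # normal approach of ball at angle p
                    if d > 0:                      # only balls in the loaded zone carry load
                        total += K * d ** 1.5 * math.cos(p)
                return total - F_r

            lo, hi = 0.0, 1e-3                     # bracket the ring deflection (m)
            for _ in range(60):                    # bisection on the equilibrium equation
                mid = 0.5 * (lo + hi)
                lo, hi = (mid, hi) if residual(mid) < 0 else (lo, mid)
            delta_r = 0.5 * (lo + hi)
            return [K * max(delta_r * math.cos(p), 0.0) ** 1.5 for p in psi]

        print(max(ball_loads(F_r=1000.0, n_balls=8)))  # peak ball load (N)

    Production codes such as ORBIS add contact angles, clearances, preload, ring compliance, and stress/life calculations on top of this equilibrium solve.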

  15. Prioritizing pesticide compounds for analytical methods development

    USGS Publications Warehouse

    Norman, Julia E.; Kuivila, Kathryn; Nowell, Lisa H.

    2012-01-01

    compounds are high priority as new analytes. The objective for analytical methods development is to design an integrated analytical strategy that includes as many of the Tier 1 pesticide compounds as possible in a relatively few, cost-effective methods. More than 60 percent of the Tier 1 compounds are high priority because they are anticipated to be present at concentrations approaching levels that could be of concern to human health or aquatic life in surface water or groundwater. An additional 17 percent of Tier 1 compounds were frequently detected in monitoring studies, but either were not measured at levels potentially relevant to humans or aquatic organisms, or do not have benchmarks available with which to compare concentrations. The remaining 21 percent are pesticide degradates that were included because their parent pesticides were in Tier 1. Tier 1 pesticide compounds for water span all major pesticide use groups and a diverse range of chemical classes, with herbicides and their degradates composing half of compounds. Many of the high priority pesticide compounds also are in several national regulatory programs for water, including those that are regulated in drinking water by the U.S. Environmental Protection Agency under the Safe Drinking Water Act and those that are on the latest Contaminant Candidate List. For sediment, a total of 175 pesticide compounds were classified as Tier 1 and, thus, are high priority for inclusion in analytical methods available for monitoring and studies. More than 60 percent of these compounds are included in some USGS analytical method; however, some are spread across several research methods that are expensive to perform, and monitoring data are not extensive for many compounds. The remaining Tier 1 compounds for sediment are high priority as new analytes. The objective for analytical methods development for sediment is to enhance an existing analytical method that currently includes nearly half of the pesticide compounds in Tier 1

  16. Coastal On-line Assessment and Synthesis Tool 2.0

    NASA Technical Reports Server (NTRS)

    Brown, Richard; Navard, Andrew; Nguyen, Beth

    2011-01-01

    COAST (Coastal On-line Assessment and Synthesis Tool) is a 3D, open-source Earth data browser developed by leveraging and enhancing previous NASA open-source tools. These tools use satellite imagery and elevation data in a way that allows any user to zoom from orbit view down into any place on Earth, and enables the user to experience Earth terrain in a visually rich 3D view. The benefits associated with taking advantage of an open-source geo-browser are that it is free, extensible, and offers a worldwide developer community that is available to provide additional development and improvement potential. What makes COAST unique is that it simplifies the process of locating and accessing data sources, and allows a user to combine them into a multi-layered and/or multi-temporal visual analytical look into possible data interrelationships and coeffectors for coastal environment phenomenology. COAST provides users with new data visual analytic capabilities. COAST has been upgraded to maximize use of open-source data access, viewing, and data manipulation software tools. The COAST 2.0 toolset has been developed to increase access to a larger realm of the most commonly implemented data formats used by the coastal science community. New and enhanced functionalities that upgrade COAST to COAST 2.0 include the development of the Temporal Visualization Tool (TVT) plug-in, the Recursive Online Remote Data-Data Mapper (RECORD-DM) utility, the Import Data Tool (IDT), and the Add Points Tool (APT). With these improvements, users can integrate their own data with other data sources, and visualize the resulting layers of different data types (such as spatial and spectral, for simultaneous visual analysis), and visualize temporal changes in areas of interest.

  17. Coherent pulsed excitation of degenerate multistate systems: Exact analytic solutions

    SciTech Connect

    Kyoseva, E. S.; Vitanov, N. V.

    2006-02-15

    We show that the solution of a multistate system composed of N degenerate lower (ground) states and one upper (excited) state can be reduced by using the Morris-Shore transformation to the solution of a two-state system involving only the excited state and a (bright) superposition of ground states. In addition, there are N-1 dark states composed of ground states. We use this decomposition to derive analytical solutions for degenerate extensions of the most popular exactly soluble models: the resonance solution, the Rabi, Landau-Zener, Rosen-Zener, Allen-Eberly, and Demkov-Kunike models. We suggest various applications of the multistate solutions, for example, as tools for creating multistate coherent superpositions by generalized resonant π pulses. We show that such generalized π pulses can occur even when the upper state is far off resonance, at specific detunings, which makes it possible to operate in the degenerate ground-state manifold without populating the (possibly lossy) upper state, even transiently.
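
    The reduction rests on the standard Morris-Shore construction: for N ground states coupled to the excited state by Rabi frequencies \Omega_i(t) with a shared time dependence, the single bright superposition

        |b\rangle = \frac{1}{\overline{\Omega}}\sum_{i=1}^{N}\Omega_i\,|g_i\rangle,
        \qquad
        \overline{\Omega} = \Bigl(\sum_{i=1}^{N}|\Omega_i|^{2}\Bigr)^{1/2},

    couples to the excited state with the root-mean-square Rabi frequency \overline{\Omega}, while the N-1 superpositions orthogonal to |b\rangle are dark.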

  18. Group Sparse Additive Models

    PubMed Central

    Yin, Junming; Chen, Xi; Xing, Eric P.

    2016-01-01

    We consider the problem of sparse variable selection in nonparametric additive models, with the prior knowledge of the structure among the covariates to encourage those variables within a group to be selected jointly. Previous works either study the group sparsity in the parametric setting (e.g., group lasso), or address the problem in the nonparametric setting without exploiting the structural information (e.g., sparse additive models). In this paper, we present a new method, called group sparse additive models (GroupSpAM), which can handle group sparsity in additive models. We generalize the ℓ1/ℓ2 norm to Hilbert spaces as the sparsity-inducing penalty in GroupSpAM. Moreover, we derive a novel thresholding condition for identifying the functional sparsity at the group level, and propose an efficient block coordinate descent algorithm for constructing the estimate. We demonstrate by simulation that GroupSpAM substantially outperforms the competing methods in terms of support recovery and prediction accuracy in additive models, and also conduct a comparative experiment on a real breast cancer dataset.
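
    The group-level thresholding at the heart of the method has a compact parametric analogue; below is a minimal sketch of block coordinate descent for the group lasso (the linear counterpart of GroupSpAM's functional updates; design blocks are assumed to have orthonormal columns):

        import numpy as np

        def group_lasso_bcd(X_groups, y, lam, n_iter=100):
            """X_groups: list of (n x d_g) arrays, each with orthonormal columns."""
            betas = [np.zeros(X.shape[1]) for X in X_groups]
            for _ in range(n_iter):
                for g, X in enumerate(X_groups):
                    # partial residual: remove the fit of all other groups
                    resid = y - sum(Xj @ bj for j, (Xj, bj)
                                    in enumerate(zip(X_groups, betas)) if j != g)
                    z = X.T @ resid
                    norm = np.linalg.norm(z)
                    # group soft-threshold: the whole block is zeroed if its fit is weak
                    betas[g] = np.zeros_like(z) if norm <= lam else (1 - lam / norm) * z
            return betas

    GroupSpAM generalizes this zero-or-shrink decision from coefficient blocks to groups of smooth component functions.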

  19. Additive Manufacturing Infrared Inspection

    NASA Technical Reports Server (NTRS)

    Gaddy, Darrell

    2014-01-01

    Additive manufacturing is a rapid prototyping technology that allows parts to be built in a series of thin layers from plastics, ceramics, and metallics. Metallic additive manufacturing is an emerging form of rapid prototyping that allows complex structures to be built using various metallic powders. Significant time and cost savings have also been observed using metallic additive manufacturing compared with traditional techniques. Development of metallic additive manufacturing technology has advanced significantly over the last decade, although many of the techniques used to inspect parts made by these processes have not advanced significantly or have limitations. Several external geometry inspection techniques exist, such as Coordinate Measurement Machines (CMM), laser scanners, structured light scanning systems, or even traditional calipers and gages. All of the aforementioned techniques are limited to external geometry and contours, or must use a contact probe to inspect limited internal dimensions. This presentation documents the development of a real-time dimensional inspection technique and digital quality record for the additive manufacturing process using infrared camera imaging and processing techniques.

  20. Additional considerations on electrolysis in electromembrane extraction.

    PubMed

    Šlampová, Andrea; Kubáň, Pavel; Boček, Petr

    2016-01-15

    Optimized acceptor solutions, which eliminate electrolytically induced variations in their pH values, have been shown to improve electromembrane extraction (EME) performance. Acceptor solutions containing 500 mM formic acid (pH 1.97) ensured a stable EME process for three basic drugs extracted at 50 V across 1-ethyl-2-nitrobenzene, and constant extraction recoveries (66-89%) were achieved for 40-80 min EMEs. Back-extraction of analytes into donor solutions was eliminated by application of the optimized acceptor solutions; moreover, saturation of acceptor solutions with analytes had no additional effect on their back-extraction, as the presence of up to a 300-fold excess of analytes in optimized acceptor solutions led to slightly reduced but stable enrichment of analytes over the entire extraction time. Stable EME performance was also achieved for extractions into 100 mM HCl; note, however, that seriously compromised performance of subsequent capillary electrophoretic analyses was observed due to the high conductivities of the resulting acceptor solutions. Electrolytically produced H+ and OH- ions mostly remained in the corresponding operating solutions, determined their final pH values, and were not transferred across the selective phase interfaces during EME, as was experimentally verified by pH measurements of anolytes and catholytes at various EME times.