Sample records for nuclear methods quantification

  1. Nuclear Data Uncertainty Quantification: Past, Present and Future

    NASA Astrophysics Data System (ADS)

    Smith, D. L.

    2015-01-01

    An historical overview is provided of the mathematical foundations of uncertainty quantification and the roles played in the more recent past by nuclear data uncertainties in nuclear data evaluations and nuclear applications. Significant advances that have established the mathematical framework for contemporary nuclear data evaluation methods, as well as the use of uncertainty information in nuclear data evaluation and nuclear applications, are described. This is followed by a brief examination of the current status concerning nuclear data evaluation methodology, covariance data generation, and the application of evaluated nuclear data uncertainties in contemporary nuclear technology. A few possible areas for future investigation of this subject are also suggested.

  2. Nuclear Data Uncertainty Quantification: Past, Present and Future

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, D. L.

    2015-01-01

    An historical overview is provided of the mathematical foundations of uncertainty quantification and the roles played in the more recent past by nuclear data uncertainties in nuclear data evaluations and nuclear applications. Significant advances that have established the mathematical framework for contemporary nuclear data evaluation methods, as well as the use of uncertainty information in nuclear data evaluation and nuclear applications, are described. This is followed by a brief examination of the current status concerning nuclear data evaluation methodology, covariance data generation, and the application of evaluated nuclear data uncertainties in contemporary nuclear technology. A few possible areas for future investigation of this subject are also suggested.

  3. Nuclear Data Uncertainty Quantification: Past, Present and Future

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, D.L., E-mail: Donald.L.Smith@anl.gov

    2015-01-15

    An historical overview is provided of the mathematical foundations of uncertainty quantification and the roles played in the more recent past by nuclear data uncertainties in nuclear data evaluations and nuclear applications. Significant advances that have established the mathematical framework for contemporary nuclear data evaluation methods, as well as the use of uncertainty information in nuclear data evaluation and nuclear applications, are described. This is followed by a brief examination of the current status concerning nuclear data evaluation methodology, covariance data generation, and the application of evaluated nuclear data uncertainties in contemporary nuclear technology. A few possible areas for future investigation of this subject are also suggested.

  4. OR14-V-Uncertainty-PD2La Uncertainty Quantification for Nuclear Safeguards and Nondestructive Assay Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nicholson, Andrew D.; Croft, Stephen; McElroy, Robert Dennis

    2017-08-01

    The various methods of nondestructive assay (NDA) of special nuclear material (SNM) have applications in nuclear nonproliferation, including detection and identification of illicit SNM at border crossings and quantifying SNM at nuclear facilities for safeguards. No assay method is complete without “error bars,” which provide one way of expressing confidence in the assay result. Consequently, NDA specialists typically provide error bars and also partition total uncertainty into “random” and “systematic” components so that, for example, an error bar can be developed for the total mass estimate in multiple items. Uncertainty Quantification (UQ) for NDA has always been important, but it is recognized that greater rigor is needed and achievable using modern statistical methods.

  5. Novel approach for the simultaneous detection of DNA from different fish species based on a nuclear target: quantification potential.

    PubMed

    Prado, Marta; Boix, Ana; von Holst, Christoph

    2012-07-01

    The development of DNA-based methods for the identification and quantification of fish in food and feed samples is frequently focused on a specific fish species and/or on the detection of mitochondrial DNA of fish origin. However, a quantitative method for the most common fish species used by the food and feed industry is needed for official control purposes, and such a method should rely on the use of a single-copy nuclear DNA target owing to its more stable copy number in different tissues. In this article, we report on the development of a real-time PCR method based on the use of a nuclear gene as a target for the simultaneous detection of fish DNA from different species and on the evaluation of its quantification potential. The method was tested in 22 different fish species, including those most commonly used by the food and feed industry, and in negative control samples, which included 15 animal species and nine feed ingredients. The results show that the method reported here complies with the requirements concerning specificity and with the criteria required for real-time PCR methods with high sensitivity.

  6. Comparison of deterministic and stochastic approaches for isotopic concentration and decay heat uncertainty quantification on elementary fission pulse

    NASA Astrophysics Data System (ADS)

    Lahaye, S.; Huynh, T. D.; Tsilanizara, A.

    2016-03-01

    Uncertainty quantification of outputs of interest in the nuclear fuel cycle is an important issue for nuclear safety, from nuclear facilities to long-term waste repositories. Most of those outputs are functions of the isotopic vector density, which is estimated by fuel cycle codes such as DARWIN/PEPIN2, MENDEL, ORIGEN or FISPACT. The CEA code systems DARWIN/PEPIN2 and MENDEL propagate the uncertainty from nuclear data inputs to isotopic concentrations and decay heat by two different methods. This paper compares those two codes on a Uranium-235 thermal fission pulse. The effect of the choice of nuclear data evaluation (ENDF/B-VII.1, JEFF-3.1.1 and JENDL-2011) is also examined. All results show good agreement between the two codes and methods, supporting the reliability of both approaches for a given evaluation.
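
    The stochastic (Monte Carlo) side of such a comparison can be illustrated with a minimal sketch: sample an uncertain decay constant, propagate each sample through the analytic single-nuclide decay law, and read off the mean and spread of the resulting activity. The decay constant, its 2% relative uncertainty, and the initial nuclide count below are hypothetical illustration values; production codes such as DARWIN/PEPIN2 or MENDEL handle full depletion chains and covariance matrices.

```python
import math
import random

def activity_samples(n0, lam_mean, lam_rel_unc, t, n_samples=10000, seed=0):
    """Monte Carlo propagation of a decay-constant uncertainty to the
    activity A(t) = lambda * N0 * exp(-lambda * t) of a single nuclide."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n_samples):
        lam = rng.gauss(lam_mean, lam_mean * lam_rel_unc)  # sampled decay constant
        samples.append(lam * n0 * math.exp(-lam * t))
    return samples

# hypothetical inputs: 1e10 atoms, lambda = 1e-3 1/s +/- 2%, t = 500 s
samples = activity_samples(n0=1e10, lam_mean=1e-3, lam_rel_unc=0.02, t=500.0)
mean = sum(samples) / len(samples)
std = (sum((x - mean) ** 2 for x in samples) / len(samples)) ** 0.5
```

The analytic relative sensitivity here is d ln A / d ln lambda = 1 - lambda*t = 0.5, so the 2% input uncertainty is propagated to roughly a 1% output uncertainty, which the sampled standard deviation recovers.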

  7. Optically transmitted and inductively coupled electric reference to access in vivo concentrations for quantitative proton-decoupled ¹³C magnetic resonance spectroscopy.

    PubMed

    Chen, Xing; Pavan, Matteo; Heinzer-Schweizer, Susanne; Boesiger, Peter; Henning, Anke

    2012-01-01

    This report describes our efforts on quantification of tissue metabolite concentrations in mM by nuclear Overhauser enhanced and proton-decoupled ¹³C magnetic resonance spectroscopy and the Electric Reference To access In vivo Concentrations (ERETIC) method. Previous work showed that a calibrated synthetic magnetic resonance spectroscopy-like signal transmitted through an optical fiber and inductively coupled into a transmit/receive coil represents a reliable reference standard for in vivo ¹H magnetic resonance spectroscopy quantification on a clinical platform. In this work, we introduce a related implementation that enables simultaneous proton decoupling and ERETIC-based metabolite quantification and hence extends the applicability of the ERETIC method to nuclear Overhauser enhanced and proton-decoupled in vivo ¹³C magnetic resonance spectroscopy. In addition, ERETIC signal stability under the influence of simultaneous proton decoupling is investigated. The proposed quantification method was cross-validated against internal and external reference standards on human skeletal muscle. The ERETIC signal intensity stability was 100.65 ± 4.18% over 3 months, including measurements with and without proton decoupling. Glycogen and unsaturated fatty acid concentrations measured with the ERETIC method were in excellent agreement with internal creatine and external phantom reference methods, showing a difference of 1.85 ± 1.21% for glycogen and 1.84 ± 1.00% for unsaturated fatty acid between ERETIC and creatine-based quantification, whereas the deviations between external reference and creatine-based quantification were 6.95 ± 9.52% and 3.19 ± 2.60%, respectively. Copyright © 2011 Wiley Periodicals, Inc.

  8. Multiscale Modeling and Uncertainty Quantification for Nuclear Fuel Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Estep, Donald; El-Azab, Anter; Pernice, Michael

    2017-03-23

    In this project, we will address the challenges associated with constructing high-fidelity multiscale models of nuclear fuel performance. We (*) propose a novel approach for coupling mesoscale and macroscale models, (*) devise efficient numerical methods for simulating the coupled system, and (*) devise and analyze effective numerical approaches for error and uncertainty quantification for the coupled multiscale system. As an integral part of the project, we will carry out analysis of the effects of upscaling and downscaling, investigate efficient methods for stochastic sensitivity analysis of the individual macroscale and mesoscale models, and carry out a posteriori error analysis for computed results. We will pursue development and implementation of solutions in software used at Idaho National Laboratory on models of interest to the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program.

  9. Proceedings of the 2013 International Conference on Mathematics and Computational Methods Applied to Nuclear Science and Engineering - M and C 2013

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    2013-07-01

    The Mathematics and Computation Division of the American Nuclear Society (ANS) and the Idaho Section of the ANS hosted the 2013 International Conference on Mathematics and Computational Methods Applied to Nuclear Science and Engineering (M and C 2013). These proceedings contain over 250 full papers, with topics including reactor physics; radiation transport; materials science; nuclear fuels; core performance and optimization; reactor systems and safety; fluid dynamics; medical applications; analytical and numerical methods; algorithms for advanced architectures; and validation, verification, and uncertainty quantification.

  10. Nuclear Forensics and Radiochemistry: Radiation Detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rundberg, Robert S.

    Radiation detection is necessary for isotope identification and assay in nuclear forensic applications. The principles of operation of gas proportional counters, scintillation counters, germanium and silicon semiconductor counters will be presented. Methods for calibration and potential pitfalls in isotope quantification will be described.

  11. DICOM image quantification secondary capture (DICOM IQSC) integrated with numeric results, regions, and curves: implementation and applications in nuclear medicine

    NASA Astrophysics Data System (ADS)

    Cao, Xinhua; Xu, Xiaoyin; Voss, Stephan

    2017-03-01

    In this paper, we describe an enhanced DICOM Secondary Capture (SC) that integrates Image Quantification (IQ) results, Regions of Interest (ROIs), and Time Activity Curves (TACs) with screen shots by embedding extra medical imaging information into a standard DICOM header. A software toolkit for DICOM IQSC has been developed to implement this SC-centered integration of quantitative analysis information for routine nuclear medicine practice. Preliminary experiments show that the DICOM IQSC method is simple and easy to implement, seamlessly integrating post-processing workstations with PACS for archiving and retrieving IQ information. Additional DICOM IQSC applications in routine nuclear medicine and clinical research are also discussed.

  12. Inventory Uncertainty Quantification using TENDL Covariance Data in Fispact-II

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eastwood, J.W.; Morgan, J.G.; Sublet, J.-Ch., E-mail: jean-christophe.sublet@ccfe.ac.uk

    2015-01-15

    The new inventory code Fispact-II provides predictions of inventory, radiological quantities and their uncertainties using nuclear data covariance information. Central to the method is a novel fast pathways-search algorithm using directed graphs. The pathways output provides (1) an aid to identifying important reactions, (2) fast estimates of uncertainties, and (3) reduced models that retain important nuclides and reactions for use in the code's Monte Carlo sensitivity analysis module. Described are the methods that are being implemented for improving uncertainty predictions, quantification and propagation using the covariance data that the recent nuclear data libraries contain. In the TENDL library, above the upper energy of the resolved resonance range, a Monte Carlo method in which the covariance data come from uncertainties of the nuclear model calculations is used. The nuclear data files are read directly by FISPACT-II without any further intermediate processing. Variance and covariance data are processed and used by FISPACT-II to compute uncertainties in collapsed cross sections, and these are in turn used to predict uncertainties in inventories and all derived radiological data.
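
    Propagating group-wise cross-section covariance to a collapsed (flux-weighted) cross section is commonly done with the first-order "sandwich" rule, var = sᵀ C s, where s holds the sensitivities of the collapsed value to each group. A minimal sketch with a hypothetical two-group sensitivity vector and relative covariance matrix (not data from FISPACT-II or TENDL):

```python
def collapsed_variance(sens, cov):
    """Sandwich rule: var = s^T C s for sensitivity vector s and covariance C."""
    n = len(sens)
    return sum(sens[i] * cov[i][j] * sens[j]
               for i in range(n) for j in range(n))

s = [0.6, 0.4]                        # hypothetical flux-weight sensitivities
C = [[0.01, 0.002], [0.002, 0.04]]    # hypothetical relative covariance matrix
var = collapsed_variance(s, C)        # relative variance of the collapsed value
```

The off-diagonal covariance terms matter: dropping them here would change the result from 0.01096 to 0.01, i.e. understate the relative standard deviation.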

  13. Accuracy and Precision of Radioactivity Quantification in Nuclear Medicine Images

    PubMed Central

    Frey, Eric C.; Humm, John L.; Ljungberg, Michael

    2012-01-01

    The ability to reliably quantify activity in nuclear medicine has a number of increasingly important applications. Dosimetry for targeted therapy treatment planning or for approval of new imaging agents requires accurate estimation of the activity in organs, tumors, or voxels at several imaging time points. Another important application is the use of quantitative metrics derived from images, such as the standard uptake value commonly used in positron emission tomography (PET), to diagnose and follow treatment of tumors. These measures require quantification of organ or tumor activities in nuclear medicine images. However, there are a number of physical, patient, and technical factors that limit the quantitative reliability of nuclear medicine images. There have been a large number of improvements in instrumentation, including the development of hybrid single-photon emission computed tomography/computed tomography and PET/computed tomography systems, and reconstruction methods, including the use of statistical iterative reconstruction methods, which have substantially improved the ability to obtain reliable quantitative information from planar, single-photon emission computed tomography, and PET images. PMID:22475429
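
    The standard uptake value mentioned above is a simple normalized ratio. A minimal sketch of the body-weight-normalized form (SUVbw), assuming tissue density of about 1 g/mL so that Bq/mL ≈ Bq/g; the activity concentration, injected dose, and patient weight are hypothetical example numbers:

```python
def suv_body_weight(c_tissue_bq_per_ml, injected_dose_bq, body_weight_kg):
    """Body-weight-normalized standard uptake value (SUVbw):
    tissue concentration divided by injected dose per gram of body weight."""
    return c_tissue_bq_per_ml / (injected_dose_bq / (body_weight_kg * 1000.0))

# e.g. 5 kBq/mL tissue uptake, 370 MBq injected, 70 kg patient
suv = suv_body_weight(5000.0, 370e6, 70.0)
```

Both quantities must be decay-corrected to the same time point in practice; that bookkeeping is omitted here.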

  14. A Short Interspersed Nuclear Element (SINE)-Based Real-Time PCR Approach to Detect and Quantify Porcine Component in Meat Products.

    PubMed

    Zhang, Chi; Fang, Xin; Qiu, Haopu; Li, Ning

    2015-01-01

    Real-time PCR amplification of mitochondrial genes could not be used for DNA quantification, and amplification of single-copy DNA did not allow ideal sensitivity. Moreover, cross-reactions among similar species were commonly observed in published methods amplifying repetitive sequences, which hindered their further application. The purpose of this study was to establish a short interspersed nuclear element (SINE)-based real-time PCR approach with high specificity for species detection that could also be used for DNA quantification. After massive screening of candidate Sus scrofa SINEs, one optimal combination of primers and probe was selected, which had no cross-reaction with other common meat species. The LOD of the method was 44 fg DNA per reaction. Further, quantification tests showed that this approach was practical for DNA estimation without tissue-dependent variance. Thus, this study provides a new tool for qualitative detection of the porcine component, which could be promising in the QC of meat products.
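
    Quantification with a real-time PCR assay of this kind typically runs through a standard curve: Ct values from a dilution series are regressed against log10 template amount, amplification efficiency is derived from the slope (a perfect 100%-efficiency assay has a slope near -3.32), and unknowns are read off the fitted line. A minimal sketch with idealized, hypothetical Ct values, not data from this paper:

```python
def fit_standard_curve(log10_dna, ct):
    """Least-squares line Ct = slope * log10(DNA amount) + intercept."""
    n = len(ct)
    mx = sum(log10_dna) / n
    my = sum(ct) / n
    sxx = sum((x - mx) ** 2 for x in log10_dna)
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_dna, ct))
    slope = sxy / sxx
    return slope, my - slope * mx

def quantify(ct_unknown, slope, intercept):
    """Invert the standard curve to estimate template amount."""
    return 10 ** ((ct_unknown - intercept) / slope)

# hypothetical 10-fold dilution series (log10 template amount vs. Ct)
logq = [1, 2, 3, 4, 5]
cts = [33.2, 29.9, 26.6, 23.3, 20.0]
slope, intercept = fit_standard_curve(logq, cts)
efficiency = 10 ** (-1.0 / slope) - 1.0  # ~1.0 means ~100% per-cycle doubling
```

A measured Ct of 26.6 then maps back to 10³ units of template on this curve.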

  15. A rapid and accurate quantification method for real-time dynamic analysis of cellular lipids during microalgal fermentation processes in Chlorella protothecoides with low field nuclear magnetic resonance.

    PubMed

    Wang, Tao; Liu, Tingting; Wang, Zejian; Tian, Xiwei; Yang, Yi; Guo, Meijin; Chu, Ju; Zhuang, Yingping

    2016-05-01

    Rapid, real-time lipid determination can provide valuable information for process regulation and optimization in algal lipid mass production. In this study, a rapid, accurate and precise quantification method for in vivo cellular lipids of Chlorella protothecoides using low-field nuclear magnetic resonance (LF-NMR) was developed. LF-NMR was extremely sensitive to the algal lipids, with limits of detection (LOD) of 0.0026 g and 0.32 g/L in dry lipid samples and algal broth, respectively, and limits of quantification (LOQ) of 0.0093 g and 1.18 g/L. Moreover, the LF-NMR signal was specifically proportional to the cellular lipids of C. protothecoides; excellent regression curves were obtained over a wide detection range, from 0.02 to 0.42 g for dry lipids and from 1.12 to 8.97 g/L of lipid concentration for in vivo lipid quantification, all with R² higher than 0.99, irrespective of variations in lipid content and fatty acid profile. The accuracy of this novel method was further verified as reliable by comparing its lipid quantification results to those obtained by GC-MS, and the relative standard deviation (RSD) of the LF-NMR results was smaller than 2%, demonstrating the precision of the method. Finally, the method was successfully used for on-line lipid monitoring during algal lipid fermentation processes, enabling better understanding of the lipid accumulation mechanism and dynamic bioprocess control. Copyright © 2016 Elsevier B.V. All rights reserved.
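
    LOD and LOQ figures like those quoted above usually follow the calibration-based (ICH-style) definitions LOD = 3.3·σ/slope and LOQ = 10·σ/slope, where σ is the standard deviation of the response and the slope comes from the calibration line. A minimal sketch with hypothetical slope and σ values (chosen only to illustrate the characteristic 3.3:10 ratio, not taken from this paper):

```python
def lod_loq(slope, sd_response):
    """ICH-style detection limits from a calibration line:
    LOD = 3.3 * sigma / slope, LOQ = 10 * sigma / slope."""
    return 3.3 * sd_response / slope, 10.0 * sd_response / slope

# hypothetical calibration: slope 500 signal-units/g, response sd 0.4 units
lod, loq = lod_loq(slope=500.0, sd_response=0.4)
```

Note that LOQ/LOD is always 10/3.3 ≈ 3.0 under these definitions, which is roughly the ratio seen in the reported 0.0026 g / 0.0093 g pair.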

  16. Comparison of Calibration of Sensors Used for the Quantification of Nuclear Energy Rate Deposition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brun, J.; Reynard-Carette, C.; Tarchalski, M.

    The present work is part of a collaborative program called GAMMA-MAJOR, 'Development and qualification of a deterministic scheme for the evaluation of GAMMA heating in MTR reactors with exploitation as example MARIA reactor and Jules Horowitz Reactor', between the National Centre for Nuclear Research of Poland, the French Atomic Energy and Alternative Energies Commission and Aix-Marseille University. One of the main objectives of this program is to optimize nuclear heating quantification through calculations validated against experimental measurements of radiation energy deposition carried out in irradiation reactors. The quantification of nuclear heating is a key datum, especially for the thermal and mechanical design and sizing of irradiation experimental devices under specific irradiation conditions and locations. This quantity is usually determined by differential calorimeters and gamma thermometers such as those used in the experimental multi-sensor device called CARMEN ('Calorimetrie en Reacteur et Mesures des Emissions Nucleaires'). In the framework of the GAMMA-MAJOR program, a new calorimeter, called KAROLINA, was designed for nuclear energy deposition quantification. It is a single-cell calorimeter and was recently tested during an irradiation campaign inside the MARIA reactor in Poland. This new single-cell calorimeter differs from previous CALMOS- or CARMEN-type differential calorimeters on three main points: its geometry, its preliminary out-of-pile calibration, and its in-pile measurement method. The differential calorimeter, which is made of two identical cells containing heaters, has a calibration method based on the use of steady thermal states reached by simulating the nuclear energy deposition into the calorimeter sample by the Joule effect; whereas the single-cell calorimeter, which has no heater, is calibrated using the transient thermal response of the sensor (heating and cooling steps).
    The paper will concern these two kinds of calorimetric sensors, focusing in particular on studies of their out-of-pile calibrations. Firstly, the characteristics of the sensor designs will be detailed (geometry, dimensions, sample material, assembly, instrumentation). Then the out-of-pile calibration methods will be described. Furthermore, numerical results obtained from 2D axisymmetric thermal simulations (Finite Element Method, CAST3M) and experimental results will be presented for each sensor. The behaviours of the two thermal sensors will be compared. To conclude, the advantages and drawbacks of each sensor will be discussed, especially regarding measurement methods. (authors)

  17. Roundness variation in JPEG images affects the automated process of nuclear immunohistochemical quantification: correction with a linear regression model.

    PubMed

    López, Carlos; Jaén Martinez, Joaquín; Lejeune, Marylène; Escrivà, Patricia; Salvadó, Maria T; Pons, Lluis E; Alvaro, Tomás; Baucells, Jordi; García-Rojo, Marcial; Cugat, Xavier; Bosch, Ramón

    2009-10-01

    The volume of digital image (DI) storage continues to be an important problem in computer-assisted pathology. DI compression enables file sizes to be reduced, at the cost of some loss of quality. Previous results indicated that the efficiency of computer-assisted quantification of immunohistochemically stained cell nuclei may be significantly reduced when compressed DIs are used. This study attempts to show, for immunohistochemically stained nuclei, which morphometric parameters may be altered by different levels of JPEG compression and what the implications of these alterations are for automated nuclear counts, and, further, to develop a method for correcting this discrepancy in the nuclear count. For this purpose, 47 DIs from different tissues were captured in uncompressed TIFF format and converted to 1:3, 1:23 and 1:46 compression JPEG images. Sixty-five positive objects were selected from these images, and six morphological parameters were measured and compared for each object in the TIFF images and those of the different compression levels, using a set of previously developed and tested macros. Roundness proved to be the only morphological parameter that was significantly affected by image compression. Factors to correct the discrepancy in the roundness estimate were derived from linear regression models for each compression level, thereby eliminating the statistically significant differences between measurements in the equivalent images. These correction factors were incorporated into the automated macros, where they reduced the nuclear quantification differences arising from image compression. Our results demonstrate that it is possible to carry out unbiased automated immunohistochemical nuclear quantification in compressed DIs with a methodology that could easily be incorporated into different systems of digital image analysis.
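
    The correction described above amounts to applying a per-compression-level linear map to the measured roundness before it feeds into the nuclear count. A minimal sketch; the regression coefficients below are hypothetical placeholders (the paper derives its own factors from its 65 measured objects):

```python
# Hypothetical per-level (slope, intercept) pairs from regressing
# uncompressed-TIFF roundness on JPEG-measured roundness.
CORRECTION = {
    "1:3":  (1.00, 0.000),
    "1:23": (0.95, 0.020),
    "1:46": (0.90, 0.045),
}

def corrected_roundness(measured, level):
    """Map a roundness value measured on a JPEG back toward its TIFF estimate."""
    slope, intercept = CORRECTION[level]
    return slope * measured + intercept

r = corrected_roundness(0.80, "1:23")  # 0.95*0.80 + 0.02 = 0.78
```

The corrected value, not the raw JPEG measurement, would then be compared against the roundness threshold used to accept or reject candidate nuclei.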

  18. Modeling and Quantification of Team Performance in Human Reliability Analysis for Probabilistic Risk Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joe, Jeffrey C.; Boring, Ronald L.

    Probabilistic Risk Assessment (PRA) and Human Reliability Assessment (HRA) are important technical contributors to the United States (U.S.) Nuclear Regulatory Commission's (NRC) risk-informed and performance-based approach to regulating U.S. commercial nuclear activities. Furthermore, all currently operating commercial nuclear power plants (NPPs) in the U.S. are required by federal regulation to be staffed with crews of operators. Yet, aspects of team performance are underspecified in most HRA methods that are widely used in the nuclear industry. There are a variety of "emergent" team cognition and teamwork errors (e.g., communication errors) that are (1) distinct from individual human errors, and (2) important to understand from a PRA perspective. The lack of robust models or quantification of team performance is an issue that affects the accuracy and validity of HRA methods and models, leading to significant uncertainty in estimating human error probabilities (HEPs). This paper describes research that aims to model and quantify team dynamics and teamwork within NPP control room crews for risk-informed applications, thereby improving the technical basis of HRA, which in turn improves the risk-informed approach the NRC uses to regulate the U.S. commercial nuclear industry.

  19. Advanced Stochastic Collocation Methods for Polynomial Chaos in RAVEN

    NASA Astrophysics Data System (ADS)

    Talbot, Paul W.

    As experiment complexity in fields such as nuclear engineering continually increases, so does the demand for robust computational methods to simulate them. In many simulations, input design parameters and intrinsic experiment properties are sources of uncertainty. Often small perturbations in uncertain parameters have a significant impact on the experiment outcome. For instance, in nuclear fuel performance, small changes in fuel thermal conductivity can greatly affect the maximum stress on the surrounding cladding. The difficulty of quantifying the impact of input uncertainty in such systems has grown with the complexity of numerical models. Traditionally, uncertainty quantification has been approached using random sampling methods like Monte Carlo. For some models, the input parametric space and corresponding response output space is sufficiently explored with a few low-cost calculations. For other models, it is computationally costly to obtain a good understanding of the output space. To combat the expense of random sampling, this research explores the possibilities of using advanced methods in Stochastic Collocation for generalized Polynomial Chaos (SCgPC) as an alternative to traditional uncertainty quantification techniques such as Monte Carlo (MC) and Latin Hypercube Sampling (LHS) for applications in nuclear engineering. We consider traditional SCgPC construction strategies as well as truncated polynomial spaces using Total Degree and Hyperbolic Cross constructions. We also consider applying anisotropy (unequal treatment of different dimensions) to the polynomial space, and offer methods whereby optimal levels of anisotropy can be approximated. We contribute developments to existing adaptive polynomial construction strategies. Finally, we consider High-Dimensional Model Reduction (HDMR) expansions, using SCgPC representations for the subspace terms, and contribute new adaptive methods to construct them.
We apply these methods on a series of models of increasing complexity. We use analytic models of various levels of complexity, then demonstrate performance on two engineering-scale problems: a single-physics nuclear reactor neutronics problem, and a multiphysics fuel cell problem coupling fuels performance and neutronics. Lastly, we demonstrate sensitivity analysis for a time-dependent fuels performance problem. We demonstrate the application of all the algorithms in RAVEN, a production-level uncertainty quantification framework.
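
    The core idea of stochastic collocation is to evaluate the deterministic model only at quadrature nodes chosen to match the input distribution and then combine the results with the quadrature weights. A minimal one-dimensional sketch for a standard normal input, using the 3-point Gauss-Hermite rule (exact for polynomial responses up to degree 5); the quadratic toy response is a hypothetical stand-in for an expensive simulation:

```python
import math

# 3-point Gauss-Hermite rule for xi ~ N(0,1): nodes +/-sqrt(3), 0
# with weights 1/6, 2/3, 1/6 (exact for polynomials up to degree 5).
NODES = [-math.sqrt(3.0), 0.0, math.sqrt(3.0)]
WEIGHTS = [1.0 / 6.0, 2.0 / 3.0, 1.0 / 6.0]

def collocation_moments(model):
    """Mean and variance of model(xi), xi ~ N(0,1), via collocation."""
    mean = sum(w * model(x) for w, x in zip(WEIGHTS, NODES))
    second = sum(w * model(x) ** 2 for w, x in zip(WEIGHTS, NODES))
    return mean, second - mean ** 2

# hypothetical response: nominal value plus linear and quadratic perturbations
mean, var = collocation_moments(lambda x: 1.0 + 0.1 * x + 0.05 * x * x)
```

Three model evaluations recover the exact mean (1.05) and variance (0.015) of this quadratic response, where Monte Carlo would need thousands of samples for comparable accuracy; the anisotropic and adaptive strategies in the dissertation extend this idea to many input dimensions.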

  20. EPRI/NRC-RES fire human reliability analysis guidelines.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lewis, Stuart R.; Cooper, Susan E.; Najafi, Bijan

    2010-03-01

    During the 1990s, the Electric Power Research Institute (EPRI) developed methods for fire risk analysis to support its utility members in the preparation of responses to Generic Letter 88-20, Supplement 4, 'Individual Plant Examination - External Events' (IPEEE). This effort produced a Fire Risk Assessment methodology for operations at power that was used by the majority of U.S. nuclear power plants (NPPs) in support of the IPEEE program and several NPPs overseas. Although these methods were acceptable for accomplishing the objectives of the IPEEE, EPRI and the U.S. Nuclear Regulatory Commission (NRC) recognized that they required upgrades to support current requirements for risk-informed, performance-based (RI/PB) applications. In 2001, EPRI and the USNRC's Office of Nuclear Regulatory Research (RES) embarked on a cooperative project to improve the state-of-the-art in fire risk assessment to support a new risk-informed environment in fire protection. This project produced a consensus document, NUREG/CR-6850 (EPRI 1011989), entitled 'Fire PRA Methodology for Nuclear Power Facilities', which addressed fire risk for at-power operations. NUREG/CR-6850 developed high-level guidance on the process for identification and inclusion of human failure events (HFEs) into the fire PRA (FPRA), and a methodology for assigning quantitative screening values to these HFEs. It outlined the initial considerations of performance shaping factors (PSFs) and related fire effects that may need to be addressed in developing best-estimate human error probabilities (HEPs). However, NUREG/CR-6850 did not describe a methodology to develop best-estimate HEPs given the PSFs and the fire-related effects. In 2007, EPRI and RES embarked on another cooperative project to develop explicit guidance for estimating HEPs for human failure events under fire-generated conditions, building upon existing human reliability analysis (HRA) methods.
    This document provides a methodology and guidance for conducting a fire HRA. The process includes identification and definition of post-fire human failure events, qualitative analysis, quantification, recovery, dependency, and uncertainty. The document provides three approaches to quantification: screening, scoping, and detailed HRA. Screening is based on the guidance in NUREG/CR-6850, with some additional guidance for scenarios with long time windows. Scoping is a new approach to quantification developed specifically to support the iterative nature of fire PRA quantification; it is intended to provide less conservative HEPs than screening while requiring fewer resources than a detailed HRA analysis. For detailed HRA quantification, guidance has been developed on how to apply existing methods to assess post-fire HEPs.

  1. A Pulse Coupled Neural Network Segmentation Algorithm for Reflectance Confocal Images of Epithelial Tissue

    PubMed Central

    Malik, Bilal H.; Jabbour, Joey M.; Maitland, Kristen C.

    2015-01-01

    Automatic segmentation of nuclei in reflectance confocal microscopy images is critical for visualization and rapid quantification of nuclear-to-cytoplasmic ratio, a useful indicator of epithelial precancer. Reflectance confocal microscopy can provide three-dimensional imaging of epithelial tissue in vivo with sub-cellular resolution. Changes in nuclear density or nuclear-to-cytoplasmic ratio as a function of depth obtained from confocal images can be used to determine the presence or stage of epithelial cancers. However, low nuclear to background contrast, low resolution at greater imaging depths, and significant variation in reflectance signal of nuclei complicate segmentation required for quantification of nuclear-to-cytoplasmic ratio. Here, we present an automated segmentation method to segment nuclei in reflectance confocal images using a pulse coupled neural network algorithm, specifically a spiking cortical model, and an artificial neural network classifier. The segmentation algorithm was applied to an image model of nuclei with varying nuclear to background contrast. Greater than 90% of simulated nuclei were detected for contrast of 2.0 or greater. Confocal images of porcine and human oral mucosa were used to evaluate application to epithelial tissue. Segmentation accuracy was assessed using manual segmentation of nuclei as the gold standard. PMID:25816131

  2. Mitochondrial DNA as a non-invasive biomarker: Accurate quantification using real time quantitative PCR without co-amplification of pseudogenes and dilution bias

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malik, Afshan N., E-mail: afshan.malik@kcl.ac.uk; Shahni, Rojeen; Rodriguez-de-Ledesma, Ana

    2011-08-19

    Highlights: Mitochondrial dysfunction is central to many diseases of oxidative stress. 95% of the mitochondrial genome is duplicated in the nuclear genome. Dilution of untreated genomic DNA leads to dilution bias. Unique primers and template pretreatment are needed to accurately measure mitochondrial DNA content. -- Abstract: Circulating mitochondrial DNA (MtDNA) is a potential non-invasive biomarker of cellular mitochondrial dysfunction, the latter known to be central to a wide range of human diseases. Changes in MtDNA are usually determined by quantification of MtDNA relative to nuclear DNA (Mt/N) using real time quantitative PCR. We propose that the methodology for measuring Mt/N needs to be improved, and we have identified that current methods have at least one of the following three problems: (1) as much of the mitochondrial genome is duplicated in the nuclear genome, many commonly used MtDNA primers co-amplify homologous pseudogenes found in the nuclear genome; (2) use of regions from genes such as β-actin and 18S rRNA, which are repetitive and/or highly variable, for qPCR of the nuclear genome leads to errors; and (3) the size difference of the mitochondrial and nuclear genomes causes a 'dilution bias' when template DNA is diluted. We describe a PCR-based method that uses unique regions of the human mitochondrial genome not duplicated in the nuclear genome, a unique single-copy region of the nuclear genome, and template treatment to remove dilution bias, to accurately quantify MtDNA from human samples.
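
    The relative Mt/N measurement described in this record is commonly computed with the delta-Ct method. The sketch below is illustrative only: the function name, Ct values, and the assumption of perfect doubling efficiency are ours, not the paper's.

```python
# Minimal sketch of relative mtDNA quantification (Mt/N) from qPCR threshold
# cycles, assuming equal amplification efficiency E = 2 for both amplicons.
# All names and numbers are illustrative, not taken from the paper.

def mt_n_ratio(ct_mito: float, ct_nuclear: float, efficiency: float = 2.0) -> float:
    """Mitochondrial-to-nuclear DNA ratio via the delta-Ct method.

    A mitochondrial amplicon crossing the threshold earlier (lower Ct) than a
    single-copy nuclear amplicon indicates more mtDNA copies per cell.
    """
    delta_ct = ct_nuclear - ct_mito
    return efficiency ** delta_ct

# mtDNA crossing threshold 8 cycles before the nuclear locus:
print(mt_n_ratio(18.0, 26.0))  # 2**8 = 256 mtDNA copies per nuclear genome copy
```

    The pseudogene and dilution-bias pitfalls listed in the abstract do not change this arithmetic; they bias the Ct values that feed into it.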

  3. Automatic quantification of morphological features for hepatic trabeculae analysis in stained liver specimens

    PubMed Central

    Ishikawa, Masahiro; Murakami, Yuri; Ahi, Sercan Taha; Yamaguchi, Masahiro; Kobayashi, Naoki; Kiyuna, Tomoharu; Yamashita, Yoshiko; Saito, Akira; Abe, Tokiya; Hashiguchi, Akinori; Sakamoto, Michiie

    2016-01-01

    Abstract. This paper proposes a digital image analysis method to support quantitative pathology by automatically segmenting the hepatocyte structure and quantifying its morphological features. To structurally analyze histopathological hepatic images, we isolate the trabeculae by extracting the sinusoids, fat droplets, and stromata. We then measure the morphological features of the extracted trabeculae, divide the image into cords, and calculate the feature values of the local cords. We propose a method of calculating the nuclear–cytoplasmic ratio, nuclear density, and number of layers using the local cords. Furthermore, we evaluate the effectiveness of the proposed method using surgical specimens. The proposed method was found to be an effective method for the quantification of the Edmondson grade. PMID:27335894

  4. Uncertainty quantification in application of the enrichment meter principle for nondestructive assay of special nuclear material

    DOE PAGES

    Burr, Tom; Croft, Stephen; Jarman, Kenneth D.

    2015-09-05

    The various methods of nondestructive assay (NDA) of special nuclear material (SNM) have applications in nuclear nonproliferation, including detection and identification of illicit SNM at border crossings and quantifying SNM at nuclear facilities for safeguards. No assay method is complete without "error bars," which provide one way of expressing confidence in the assay result. Consequently, NDA specialists typically quantify total uncertainty in terms of "random" and "systematic" components, and then specify error bars for the total mass estimate in multiple items. Uncertainty quantification (UQ) for NDA has always been important, but it is recognized that greater rigor is needed and achievable using modern statistical methods. To this end, we describe the extent to which the Guide to the Expression of Uncertainty in Measurement (GUM) can be used for NDA. We also propose improvements over the GUM for NDA by illustrating UQ challenges that it does not address, including calibration with errors in predictors, model error, and item-specific biases. A case study is presented using low-resolution NaI spectra and applying the enrichment meter principle to estimate the U-235 mass in an item. The case study illustrates how to update the current American Society for Testing and Materials guide for application of the enrichment meter principle using gamma spectra from a NaI detector.
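
    The "random plus systematic" accounting mentioned in this record is, in GUM terms, a combination of uncorrelated standard uncertainties in quadrature followed by an expanded uncertainty with a coverage factor. A minimal sketch, with illustrative values that are not from the case study:

```python
import math

# Hedged sketch of GUM-style uncertainty combination: uncorrelated "random"
# and "systematic" standard uncertainties add in quadrature, and a coverage
# factor k expands the result. Values are illustrative, not from the paper.

def combined_standard_uncertainty(u_random: float, u_systematic: float) -> float:
    """Root-sum-of-squares combination of uncorrelated components."""
    return math.hypot(u_random, u_systematic)

def expanded_uncertainty(u_combined: float, k: float = 2.0) -> float:
    """k = 2 gives roughly 95% coverage when the result is near-normal."""
    return k * u_combined

u_c = combined_standard_uncertainty(0.3, 0.4)  # ≈ 0.5, in the assay's mass units
print(expanded_uncertainty(u_c))               # ≈ 1.0
```

    The GUM challenges the authors highlight (errors in predictors, model error, item-specific biases) are precisely the cases where this simple quadrature picture breaks down.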

  5. qPCR-based mitochondrial DNA quantification: Influence of template DNA fragmentation on accuracy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jackson, Christopher B., E-mail: Christopher.jackson@insel.ch; Gallati, Sabina, E-mail: sabina.gallati@insel.ch; Schaller, Andre, E-mail: andre.schaller@insel.ch

    2012-07-06

    Highlights: Serial qPCR accurately determines the fragmentation state of any given DNA sample. Serial qPCR demonstrates different preservation of the nuclear and mitochondrial genomes. Serial qPCR provides a diagnostic tool to validate the integrity of bioptic material. Serial qPCR excludes degradation-induced erroneous quantification. -- Abstract: Real-time PCR (qPCR) is the method of choice for quantification of mitochondrial DNA (mtDNA) by relative comparison of a nuclear to a mitochondrial locus. Quantitatively abnormal mtDNA content is indicative of mitochondrial disorders and mostly manifests in a tissue-specific manner. Thus handling of degradation-prone bioptic material is inevitable. We established a serial qPCR assay based on increasing amplicon size to measure the degradation status of any DNA sample. Using this approach we can exclude erroneous mtDNA quantification due to degraded samples (e.g. long post-excision time, autolytic processes, freeze-thaw cycles) and ensure abnormal DNA content measurements (e.g. depletion) in non-degraded patient material. By preparing degraded DNA under controlled conditions using sonication and DNaseI digestion, we show that erroneous quantification is due to the different preservation qualities of the nuclear and the mitochondrial genome. This disparate degradation of the two genomes results in over- or underestimation of mtDNA copy number in degraded samples. Moreover, as analysis of defined archival tissue would allow the molecular pathomechanism of mitochondrial disorders presenting with abnormal mtDNA content to be defined more precisely, we compared fresh frozen (FF) with formalin-fixed paraffin-embedded (FFPE) skeletal muscle tissue of the same sample. By extrapolation of the measured decay constants for nuclear DNA (λ(nDNA)) and mtDNA (λ(mtDNA)), we present an approach to possibly correct measurements in degraded samples in the future. To our knowledge this is the first time that the differing degradation of the two genomes has been demonstrated and that the impact of DNA degradation on quantification of mtDNA copy number has been systematically evaluated.
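
    The decay-constant idea in this record can be illustrated with a toy model that assumes single-exponential loss of amplifiable template with amplicon length. The function names, amplicon sizes, and constants below are hypothetical, not the paper's:

```python
import math

# Hedged sketch: estimate a per-base decay constant from a serial qPCR with
# increasing amplicon sizes (q = q0 * exp(-lam * L)), then undo the
# degradation bias between mitochondrial and nuclear amplicons.
# All numbers and names are illustrative.

def fit_decay_constant(sizes_bp, quantities):
    """Least-squares slope of ln(quantity) vs amplicon size; returns lam (per bp)."""
    n = len(sizes_bp)
    mean_x = sum(sizes_bp) / n
    mean_y = sum(math.log(q) for q in quantities) / n
    num = sum((x - mean_x) * (math.log(q) - mean_y)
              for x, q in zip(sizes_bp, quantities))
    den = sum((x - mean_x) ** 2 for x in sizes_bp)
    return -num / den

def corrected_mt_n(measured_ratio, lam_mt, lam_n, amplicon_bp):
    """Correct an Mt/N ratio for disparate decay of the two genomes,
    assuming both amplicons have roughly the same length."""
    return measured_ratio * math.exp((lam_mt - lam_n) * amplicon_bp)
```

    With equal decay constants the correction is the identity, matching the intuition that the bias only appears when the two genomes degrade at different rates.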

  6. Multivariate Analysis for Quantification of Plutonium(IV) in Nitric Acid Based on Absorption Spectra

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lines, Amanda M.; Adami, Susan R.; Sinkov, Sergey I.

    Development of more effective, reliable, and fast methods for monitoring process streams is a growing opportunity for analytical applications. Many fields can benefit from on-line monitoring, including the nuclear fuel cycle, where improved methods for monitoring radioactive materials will facilitate maintenance of proper safeguards and ensure safe and efficient processing of materials. On-line process monitoring with a focus on optical spectroscopy can provide a fast, non-destructive method for monitoring chemical species. However, identification and quantification of species can be hindered by the complexity of the solutions if bands overlap or show condition-dependent spectral features. Plutonium(IV) is one example of a species which displays significant spectral variation with changing nitric acid concentration. Univariate analysis (i.e., Beer's law) is difficult to apply to the quantification of Pu(IV) unless the nitric acid concentration is known and separate calibration curves have been made for all possible acid strengths. Multivariate, or chemometric, analysis is an approach that allows for the accurate quantification of Pu(IV) without a priori knowledge of the nitric acid concentration.
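
    As a toy illustration of why multivariate calibration can recover an analyte concentration without knowing the acid strength, here is a classical least-squares sketch on synthetic, noise-free "spectra". The paper's actual chemometric model is not specified here, and every matrix below is invented:

```python
import numpy as np

# Hedged classical least-squares (CLS) sketch of multivariate calibration:
# absorbance A = C @ K (Beer's law at each wavelength). K is estimated from
# calibration mixtures, then new concentrations are recovered by least
# squares. Data are synthetic; real work would likely use PLS/PCR, which
# handle collinearity and unmodeled components better.

rng = np.random.default_rng(0)
K_true = np.array([[0.9, 0.4, 0.1],    # "Pu(IV)" response at 3 wavelengths
                   [0.2, 0.7, 0.5]])   # "HNO3" spectral contribution

C_cal = rng.uniform(0.1, 1.0, size=(8, 2))   # calibration concentrations
A_cal = C_cal @ K_true                       # calibration spectra (noise-free)

K_est, *_ = np.linalg.lstsq(C_cal, A_cal, rcond=None)   # step 1: estimate K

c_unknown = np.array([[0.5, 0.3]])
a_unknown = c_unknown @ K_true
c_pred, *_ = np.linalg.lstsq(K_est.T, a_unknown.T, rcond=None)  # step 2: invert
print(np.round(c_pred.ravel(), 3))   # ≈ [0.5, 0.3]: both species recovered
```

    Because both components are modeled jointly, the "Pu(IV)" estimate is correct even though the "acid" contribution was never supplied separately, which is the essence of the chemometric argument in the abstract.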

  7. Rapid flow cytometric measurement of protein inclusions and nuclear trafficking

    PubMed Central

    Whiten, D. R.; San Gil, R.; McAlary, L.; Yerbury, J. J.; Ecroyd, H.; Wilson, M. R.

    2016-01-01

    Proteinaceous cytoplasmic inclusions are an indicator of dysfunction in normal cellular proteostasis and a hallmark of many neurodegenerative diseases. We describe a simple and rapid new flow cytometry-based method to enumerate, characterise and, if desired, physically recover protein inclusions from cells. This technique can analyse and resolve a broad variety of inclusions differing in both size and protein composition, making it applicable to essentially any model of intracellular protein aggregation. The method also allows rapid quantification of the nuclear trafficking of fluorescently labelled molecules. PMID:27516358

  8. Dakota Uncertainty Quantification Methods Applied to the CFD code Nek5000

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Delchini, Marc-Olivier; Popov, Emilian L.; Pointer, William David

    This report presents the state of advancement of a Nuclear Energy Advanced Modeling and Simulation (NEAMS) project to characterize the uncertainty of the computational fluid dynamics (CFD) code Nek5000 using the Dakota package for flows encountered in the nuclear engineering industry. Nek5000 is a high-order spectral element CFD code developed at Argonne National Laboratory for high-resolution spectral-filtered large eddy simulations (LESs) and unsteady Reynolds-averaged Navier-Stokes (URANS) simulations.

  9. Species identification and quantification in meat and meat products using droplet digital PCR (ddPCR).

    PubMed

    Floren, C; Wiedemann, I; Brenig, B; Schütz, E; Beck, J

    2015-04-15

    Species fraud and product mislabelling in processed food, albeit not a direct health issue, often result in consumer distrust. Therefore, methods for quantification of undeclared species are needed. Targeting mitochondrial DNA (e.g. the CYTB gene) for species quantification is unsuitable, because a fivefold inter-tissue variation in mtDNA content per cell results in either an under- (-70%) or overestimation (+160%) of species DNA contents. Here, we describe a reliable two-step droplet digital PCR (ddPCR) assay targeting the nuclear F2 gene for precise quantification of cattle, horse, and pig in processed meat products. The ddPCR assay is advantageous over qPCR, showing a limit of quantification (LOQ) and detection (LOD) in different meat products of 0.01% and 0.001%, respectively. The specificity was verified in 14 different species. Hence, determining F2 in food by ddPCR can be recommended for quality assurance and control in production systems. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
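
    ddPCR yields absolute copy numbers from the fraction of negative droplets via Poisson statistics. A hedged sketch of that arithmetic, with droplet counts and function names that are illustrative rather than taken from the assay:

```python
import math

# Hedged sketch of ddPCR Poisson quantification: the mean number of target
# copies per droplet is inferred from the fraction of negative droplets.
# Counts and names are illustrative, not from the published assay.

def copies_per_droplet(n_positive: int, n_total: int) -> float:
    """lambda = -ln(fraction negative); requires 0 < n_positive < n_total."""
    return -math.log((n_total - n_positive) / n_total)

def species_fraction(pos_species: int, pos_reference: int, n_total: int) -> float:
    """Share of one species' target copies relative to a universal reference
    target measured from the same partitioned sample."""
    return (copies_per_droplet(pos_species, n_total)
            / copies_per_droplet(pos_reference, n_total))

# Half the droplets positive corresponds to ln(2) ≈ 0.693 copies per droplet:
print(copies_per_droplet(5000, 10000))
```

    Targeting a single-copy nuclear gene such as F2 makes this fraction track cell (and hence meat) content directly, sidestepping the fivefold mtDNA-per-cell variation the abstract criticizes.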

  10. Correlation of X-ray computed tomography with quantitative nuclear magnetic resonance methods for pre-clinical measurement of adipose and lean tissues in living mice.

    PubMed

    Metzinger, Matthew N; Miramontes, Bernadette; Zhou, Peng; Liu, Yueying; Chapman, Sarah; Sun, Lucy; Sasser, Todd A; Duffield, Giles E; Stack, M Sharon; Leevy, W Matthew

    2014-10-08

    Numerous obesity studies have coupled murine models with non-invasive methods to quantify body composition in longitudinal experiments, including X-ray computed tomography (CT) or quantitative nuclear magnetic resonance (QMR). Both microCT and QMR have been separately validated with invasive techniques of adipose tissue quantification, like post-mortem fat extraction and measurement. Here we report a head-to-head study of both protocols using oil phantoms and mouse populations to determine the parameters that best align CT data with that from QMR. First, an in vitro analysis of oil/water mixtures was used to calibrate and assess the overall accuracy of microCT vs. QMR data. Next, experiments were conducted with two cohorts of living mice (either homogenous or heterogeneous by sex, age and genetic backgrounds) to assess the microCT imaging technique for adipose tissue segmentation and quantification relative to QMR. Adipose mass values were obtained from microCT data with three different resolutions, after which the data were analyzed with different filter and segmentation settings. Strong linearity was noted between the adipose mass values obtained with microCT and QMR, with optimal parameters and scan conditions reported herein. Lean tissue (muscle, internal organs) was also segmented and quantified using the microCT method relative to the analogous QMR values. Overall, the rigorous calibration and validation of the microCT method for murine body composition, relative to QMR, ensures its validity for segmentation, quantification and visualization of both adipose and lean tissues.

  11. Study of boron detection limit using the in-air PIGE set-up at LAMFI-USP

    NASA Astrophysics Data System (ADS)

    Moro, M. V.; Silva, T. F.; Trindade, G. F.; Added, N.; Tabacniks, M. H.

    2014-11-01

    The quantification of small amounts of boron in materials is of extreme importance in different areas of materials science. Boron is an important contaminant and also a silicon dopant in the semiconductor industry; it is extensively used in nuclear power plants, for neutron shielding and safety control, and it is an essential nutrient for life, whether plant or animal. The production of silicon solar cells by refining metallurgical-grade silicon (MG-Si) requires the control and reduction of several silicon contaminants to very low concentration levels. Boron is one of the contaminants of solar-grade silicon (SG-Si) that must be controlled and quantified at sub-ppm levels. In the metallurgical purification, boron quantification is usually made by Inductively Coupled Plasma Mass Spectrometry (ICP-MS), but the results need to be verified by an independent analytical method. In this work we present the results of the analysis of silicon samples by Particle Induced Gamma-Ray Emission (PIGE) aiming at the quantification of low concentrations of boron. PIGE analysis was carried out using the in-air external beam line of the Laboratory for Materials Analysis with Ion Beams (LAMFI-USP), by means of the 10B(p,αγ)7Be nuclear reaction, measuring the 429 keV γ-ray. The in-air PIGE measurements at LAMFI have a quantification limit of the order of 10^16 at/cm^2.

  12. Reduced Order Modeling Methods for Turbomachinery Design

    DTIC Science & Technology

    2009-03-01

    and Materials Conference, May 2006. [45] A. Gelman, J. B. Carlin, H. S. Stern, and D. B. Rubin, Bayesian Data Analysis. New York, NY: Chapman & Hall... Macian-Juan, and R. Chawla, "A statistical methodology for quantification of uncertainty in best estimate code physical models," Annals of Nuclear Energy

  13. Quantification of Forecasting and Change-Point Detection Methods for Predictive Maintenance

    DTIC Science & Technology

    2015-08-19

    industries to manage the service life of equipment, and also to detect precursors to the failure of components found in nuclear power plants, wind turbines... Contract FA2386-14-1-4096; Grant 14IOA015 (AOARD-144096)... sensitive to changes related to abnormality. Subject terms: predictive maintenance, forecasting

  14. Assessment of the announced North Korean nuclear test using long-range atmospheric transport and dispersion modelling.

    PubMed

    De Meutter, Pieter; Camps, Johan; Delcloo, Andy; Termonia, Piet

    2017-08-18

    On 6 January 2016, the Democratic People's Republic of Korea announced that it had conducted its fourth nuclear test. Analysis of the corresponding seismic waves from the Punggye-ri nuclear test site indeed showed that an underground man-made explosion took place, although the nuclear origin of the explosion needs confirmation. Seven weeks after the announced nuclear test, radioactive xenon was observed in Japan by a noble gas measurement station of the International Monitoring System. In this paper, atmospheric transport modelling is used to show that the measured radioactive xenon is compatible with a delayed release from the Punggye-ri nuclear test site. An uncertainty quantification of the modelling results is given using the ensemble method. The latter is important for policy makers and helps advance data fusion, where different nuclear Test-Ban-Treaty monitoring techniques are combined.

  15. Uncertainty quantification and propagation in nuclear density functional theory

    DOE PAGES

    Schunck, N.; McDonnell, J. D.; Higdon, D.; ...

    2015-12-23

    Nuclear density functional theory (DFT) is one of the main theoretical tools used to study the properties of heavy and superheavy elements, or to describe the structure of nuclei far from stability. While on-going efforts seek to better root nuclear DFT in the theory of nuclear forces, energy functionals remain semi-phenomenological constructions that depend on a set of parameters adjusted to experimental data in finite nuclei. In this study, we review recent efforts to quantify the related uncertainties and propagate them to model predictions. In particular, we cover the topics of parameter estimation for inverse problems, statistical analysis of model uncertainties, and Bayesian inference methods. Illustrative examples are taken from the literature.

  16. Quantification of Soluble Sugars and Sugar Alcohols by LC-MS/MS.

    PubMed

    Feil, Regina; Lunn, John Edward

    2018-01-01

    Sugars are simple carbohydrates composed primarily of carbon, hydrogen, and oxygen. They play a central role in metabolism as sources of energy and as building blocks for synthesis of structural and nonstructural polymers. Many different techniques have been used to measure sugars, including refractometry, colorimetric and enzymatic assays, gas chromatography, high-performance liquid chromatography, and nuclear magnetic resonance spectroscopy. In this chapter we describe a method that combines an initial separation of sugars by high-performance anion-exchange chromatography (HPAEC) with detection and quantification by tandem mass spectrometry (MS/MS). This combination of techniques provides exquisite specificity, allowing measurement of a diverse range of high- and low-abundance sugars in biological samples. This method can also be used for isotopomer analysis in stable-isotope labeling experiments to measure metabolic fluxes.

  17. Uncertainty quantification for nuclear density functional theory and information content of new measurements.

    PubMed

    McDonnell, J D; Schunck, N; Higdon, D; Sarich, J; Wild, S M; Nazarewicz, W

    2015-03-27

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. The example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.

  18. A comparative uncertainty study of the calibration of macrolide antibiotic reference standards using quantitative nuclear magnetic resonance and mass balance methods.

    PubMed

    Liu, Shu-Yu; Hu, Chang-Qin

    2007-10-17

    This study introduces a general method of quantitative nuclear magnetic resonance (qNMR) for the calibration of reference standards of macrolide antibiotics. Several qNMR experimental conditions were optimized, including the delay time, an important parameter for quantification. Three kinds of macrolide antibiotics were used to validate the accuracy of the qNMR method by comparison with results obtained by the high performance liquid chromatography (HPLC) method. The purities of five common reference standards of macrolide antibiotics were measured by the 1H qNMR method and the mass balance method, respectively, and the results of the two methods were compared. qNMR is quick and simple to use. In new drug research and development, qNMR provides a new and reliable method for purity analysis of reference standards.
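
    For context, purity by internal-standard 1H qNMR follows a standard relation between signal integrals, proton counts, molar masses, and weighed masses. The sketch below uses common textbook notation, not necessarily the authors' own symbols or procedure:

```python
# Hedged sketch of the standard internal-standard 1H qNMR purity relation.
# I = integral, n = number of protons under the integrated signal,
# mw = molar mass, m = weighed mass; all arguments are illustrative.

def qnmr_purity(i_sample, i_std, n_sample, n_std,
                m_sample_g, m_std_g, mw_sample, mw_std, purity_std):
    """Analyte purity from integral ratios against a certified standard."""
    return (i_sample / i_std) * (n_std / n_sample) \
         * (mw_sample / mw_std) * (m_std_g / m_sample_g) * purity_std
```

    The mass balance method the abstract compares against instead subtracts quantified impurities (water, solvents, related substances) from 100%, so agreement between the two is a strong cross-check.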

  19. Determination of the purity of pharmaceutical reference materials by 1H NMR using the standardless PULCON methodology.

    PubMed

    Monakhova, Yulia B; Kohl-Himmelseher, Matthias; Kuballa, Thomas; Lachenmeier, Dirk W

    2014-11-01

    A fast and reliable nuclear magnetic resonance spectroscopic method for quantitative determination (qNMR) of targeted molecules in reference materials has been established using the ERETIC2 methodology (electronic reference to access in vivo concentrations) based on the PULCON principle (pulse length based concentration determination). The developed approach was validated for the analysis of pharmaceutical samples in the context of official medicines control, including ibandronic acid, amantadine, ambroxol and lercanidipine. The PULCON recoveries were above 94.3%, and coefficients of variation (CVs) obtained by quantification of different targeted resonances ranged between 0.7% and 2.8%, demonstrating that the qNMR method is a precise tool for rapid quantification (approximately 15 min) of reference materials and medicinal products. Generally, the values were within the specification (certified values) provided by the manufacturers. The results were in agreement with NMR quantification using an internal standard and with validated reference HPLC analysis. The PULCON method was found to be a practical alternative with competitive precision and accuracy to the classical internal reference method, and it proved to be applicable to different solvent conditions. The method can be recommended for routine use in medicines control laboratories, especially when the availability and costs of reference compounds are problematic. Copyright © 2014 Elsevier B.V. All rights reserved.

  20. Quantification In Situ of Crystalline Cholesterol and Calcium Phosphate Hydroxyapatite in Human Atherosclerotic Plaques by Solid-State Magic Angle Spinning NMR

    PubMed Central

    Guo, Wen; Morrisett, Joel D.; DeBakey, Michael E.; Lawrie, Gerald M.; Hamilton, James A.

    2010-01-01

    Because of renewed interest in the progression, stabilization, and regression of atherosclerotic plaques, it has become important to develop methods for characterizing structural features of plaques in situ and noninvasively. We present a nondestructive method for ex vivo quantification of 2 solid-phase components of plaques: crystalline cholesterol and calcium phosphate salts. Magic angle spinning (MAS) nuclear magnetic resonance (NMR) spectra of human carotid endarterectomy plaques revealed 13C resonances of crystalline cholesterol monohydrate and a 31P resonance of calcium phosphate hydroxyapatite (CPH). The spectra were obtained under conditions in which there was little or no interference from other chemical components and were suitable for quantification in situ of the crystalline cholesterol and CPH. Carotid atherosclerotic plaques showed a wide variation in their crystalline cholesterol content. The calculated molar ratio of liquid-crystalline cholesterol to phospholipid ranged from 1.1 to 1.7, demonstrating different capabilities of the phospholipids to reduce crystallization of cholesterol. The spectral properties of the phosphate groups in CPH in carotid plaques were identical to those of CPH in bone. 31P MAS NMR is a simple, rapid method for quantification of calcium phosphate salts in tissue without extraction and time-consuming chemical analysis. Crystalline phases in intact atherosclerotic plaques (ex vivo) can be quantified accurately by solid-state 13C and 31P MAS NMR spectroscopy. PMID:10845882

  1. Quantification of lithium at ppm level in geological samples using nuclear reaction analysis.

    PubMed

    De La Rosa, Nathaly; Kristiansson, Per; Nilsson, E J Charlotta; Ros, Linus; Pallon, Jan; Skogby, Henrik

    2018-01-01

    Proton-induced (p,α) reactions are one type of nuclear reaction analysis (NRA) especially suitable for light element quantification. In the case of the lithium quantification presented in this work, accelerated protons with an energy of about 850 keV were used to induce the 7Li(p,α)4He reaction in standard reference and geological samples such as tourmaline and other Li-minerals. It is shown that this technique for lithium quantification allowed measurement of concentrations down to below one ppm. The possibility of relating the lithium content to the boron content in a single analysis was also demonstrated using tourmaline samples, both in absolute concentration and in lateral distribution. In addition, particle induced X-ray emission (PIXE) was utilized as a complementary IBA technique for simultaneous mapping of elements heavier than sodium.

  2. Reproducibility of Lobar Perfusion and Ventilation Quantification Using SPECT/CT Segmentation Software in Lung Cancer Patients.

    PubMed

    Provost, Karine; Leblond, Antoine; Gauthier-Lemire, Annie; Filion, Édith; Bahig, Houda; Lord, Martin

    2017-09-01

    Planar perfusion scintigraphy with 99mTc-labeled macroaggregated albumin is often used for pretherapy quantification of regional lung perfusion in lung cancer patients, particularly those with poor respiratory function. However, subdividing lung parenchyma into rectangular regions of interest, as done on planar images, is a poor reflection of true lobar anatomy. New tridimensional methods using SPECT and SPECT/CT have been introduced, including semiautomatic lung segmentation software. The present study evaluated inter- and intraobserver agreement on quantification using SPECT/CT software and compared the results for regional lung contribution obtained with SPECT/CT and planar scintigraphy. Methods: Thirty lung cancer patients underwent ventilation-perfusion scintigraphy with 99mTc-macroaggregated albumin and 99mTc-Technegas. The regional lung contribution to perfusion and ventilation was measured on both planar scintigraphy and SPECT/CT using semiautomatic lung segmentation software by 2 observers. Interobserver and intraobserver agreement for the SPECT/CT software was assessed using the intraclass correlation coefficient, Bland-Altman plots, and absolute differences in measurements. Measurements from the planar and tridimensional methods were compared using the paired-sample t test and mean absolute differences. Results: Intraclass correlation coefficients were in the excellent range (above 0.9) for both interobserver and intraobserver agreement using the SPECT/CT software. Bland-Altman analyses showed very narrow limits of agreement. Absolute differences were below 2.0% in 96% of both interobserver and intraobserver measurements. There was a statistically significant difference between the planar and SPECT/CT methods (P < 0.001) for quantification of perfusion and ventilation for all right lung lobes, with a maximal mean absolute difference of 20.7% for the right middle lobe. There was no statistically significant difference in quantification of perfusion and ventilation for the left lung lobes using either method; however, absolute differences reached 12.0%. The total right and left lung contributions were similar for the two methods, with a mean difference of 1.2% for perfusion and 2.0% for ventilation. Conclusion: Quantification of regional lung perfusion and ventilation using SPECT/CT-based lung segmentation software is highly reproducible. This tridimensional method yields statistically significant differences in measurements for right lung lobes when compared with planar scintigraphy. We recommend that SPECT/CT-based quantification be used for all lung cancer patients undergoing pretherapy evaluation of regional lung function. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.
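
    The Bland-Altman analysis used in this study reduces to a bias (mean difference) and 95% limits of agreement. A minimal sketch with invented paired observer readings, not the study's data:

```python
import statistics

# Hedged sketch of Bland-Altman agreement statistics: bias (mean difference)
# and 95% limits of agreement for paired measurements by two observers.
# The readings below are invented for illustration.

def bland_altman(obs1, obs2):
    """Return (bias, (lower_loa, upper_loa)) for paired measurements."""
    diffs = [a - b for a, b in zip(obs1, obs2)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)          # sample standard deviation
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Two observers' lobar perfusion percentages (hypothetical):
bias, loa = bland_altman([10.1, 12.4, 11.0, 13.2], [9.9, 12.6, 10.8, 13.5])
print(bias, loa)
```

    "Very narrow limits of agreement", as reported above, simply means the standard deviation of the paired differences is small relative to the measured quantities.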

  3. On-line high-performance liquid chromatography-ultraviolet-nuclear magnetic resonance method of the markers of nerve agents for verification of the Chemical Weapons Convention.

    PubMed

    Mazumder, Avik; Gupta, Hemendra K; Garg, Prabhat; Jain, Rajeev; Dubey, Devendra K

    2009-07-03

    This paper details an on-flow liquid chromatography-ultraviolet-nuclear magnetic resonance (LC-UV-NMR) method for the retrospective detection and identification of alkyl alkylphosphonic acids (AAPAs) and alkylphosphonic acids (APAs), the markers of toxic nerve agents, for verification of the Chemical Weapons Convention (CWC). Initially, the LC-UV-NMR parameters were optimized for benzyl derivatives of the APAs and AAPAs. The optimized parameters include a C18 stationary phase, a methanol:water 78:22 (v/v) mobile phase, UV detection at 268 nm, and the 1H NMR acquisition conditions. The protocol described herein allowed the detection of analytes through acquisition of high-quality NMR spectra from aqueous solutions of the APAs and AAPAs containing high concentrations of interfering background chemicals, which were removed by the preceding sample preparation. Quantification was based on the UV detector, which showed relative standard deviations (RSDs) within ±1.1%, while the NMR detector showed a lower limit of detection of up to 16 μg (absolute). Finally, the developed LC-UV-NMR method was applied to identify the APAs and AAPAs in real water samples, following solid-phase extraction and derivatization. The method is fast (total experiment time approximately 2 h), sensitive, rugged and efficient.

  4. Adaptive polynomial chaos techniques for uncertainty quantification of a gas cooled fast reactor transient

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perko, Z.; Gilli, L.; Lathouwers, D.

    2013-07-01

    Uncertainty quantification plays an increasingly important role in the nuclear community, especially with the rise of Best Estimate Plus Uncertainty methodologies. Sensitivity analysis, surrogate models, Monte Carlo sampling and several other techniques can be used to propagate input uncertainties. In recent years, however, polynomial chaos expansion has become a popular alternative providing high accuracy at affordable computational cost. This paper presents such polynomial chaos (PC) methods using adaptive sparse grids and adaptive basis set construction, together with an application to a Gas Cooled Fast Reactor transient. A comparison is made between a new sparse grid algorithm and the traditionally used technique proposed by Gerstner. An adaptive basis construction method is also introduced and is shown to be advantageous from both an accuracy and a computational point of view. As a demonstration, the uncertainty quantification of a 50% loss of flow transient in the GFR2400 Gas Cooled Fast Reactor design was performed using the CATHARE code system. The results are compared to direct Monte Carlo sampling and show the superior convergence and high accuracy of the polynomial chaos expansion. Since PC techniques are easy to implement, they can offer an attractive alternative to traditional techniques for the uncertainty quantification of large scale problems.
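
    As background to the polynomial chaos machinery, a one-dimensional PC expansion in the probabilists' Hermite basis can be sketched in a few lines. The paper's adaptive sparse-grid and basis-construction algorithms go far beyond this toy, and all choices below (quadrature order, test function) are ours:

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as He

# Hedged sketch of a 1-D polynomial chaos expansion for a model with a single
# standard-normal input, using the probabilists' Hermite basis He_k and
# Gauss-Hermite(e) quadrature. Purely illustrative.

def pce_coefficients(model, order, n_quad=32):
    nodes, weights = He.hermegauss(n_quad)        # weight function exp(-x^2/2)
    norm = math.sqrt(2.0 * math.pi)               # weights sum to sqrt(2*pi)
    coeffs = []
    for k in range(order + 1):
        basis = He.hermeval(nodes, [0] * k + [1])  # He_k evaluated at the nodes
        ck = np.sum(weights * model(nodes) * basis) / (norm * math.factorial(k))
        coeffs.append(ck)                          # c_k = E[f He_k] / k!
    return coeffs

def pce_mean_var(coeffs):
    mean = coeffs[0]                               # E[f] = c_0
    var = sum(math.factorial(k) * c ** 2
              for k, c in enumerate(coeffs) if k > 0)
    return mean, var

c = pce_coefficients(lambda x: x ** 2, order=4)
print(pce_mean_var(c))   # ≈ (1.0, 2.0) for f(xi) = xi**2, xi ~ N(0, 1)
```

    The point, as in the paper, is that once the coefficients are known, moments of the output follow analytically from the orthogonality of the basis, with no further model evaluations.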

  5. Quantification of 235U and 238U activity concentrations for undeclared nuclear materials by a digital gamma-gamma coincidence spectroscopy.

    PubMed

    Zhang, Weihua; Yi, Jing; Mekarski, Pawel; Ungar, Kurt; Hauck, Barry; Kramer, Gary H

    2011-06-01

    The purpose of this study is to investigate the possibility of verifying depleted uranium (DU), natural uranium (NU), low enriched uranium (LEU) and high enriched uranium (HEU) with a newly developed digital gamma-gamma coincidence spectroscopy system. The system consists of two NaI(Tl) scintillators and the XIA LLC Digital Gamma Finder (DGF)/Pixie-4 software and card package. The results demonstrate that the system provides an effective method of (235)U and (238)U quantification based on the count rate of their gamma-gamma coincidence counting signatures. The main advantages of this approach over conventional gamma spectrometry include the low background continuum near the coincidence signatures of (235)U and (238)U, reduced interference from other radionuclides owing to the gamma-gamma coincidence counting, and region-of-interest (ROI) image analysis for uranium enrichment determination. Compared to conventional gamma spectrometry, the method offers the additional advantage of requiring minimal calibrations for (235)U and (238)U quantification at different sample geometries. Crown Copyright © 2011. Published by Elsevier Ltd. All rights reserved.

  6. Interim reliability-evaluation program: analysis of the Browns Ferry, Unit 1, nuclear plant. Appendix C - sequence quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mays, S.E.; Poloski, J.P.; Sullivan, W.H.

    1982-07-01

    This report describes a risk study of the Browns Ferry, Unit 1, nuclear plant. The study is one of four such studies sponsored by the NRC Office of Research, Division of Risk Assessment, as part of its Interim Reliability Evaluation Program (IREP), Phase II. This report is contained in four volumes: a main report and three appendixes. Appendix C generally describes the methods used to estimate accident sequence frequency values. Information is presented concerning the approach, example collection, failure data, candidate dominant sequences, uncertainty analysis, and sensitivity analysis.

  7. Determination of boron in uranium aluminum silicon alloy by spectrophotometry and estimation of expanded uncertainty in measurement

    NASA Astrophysics Data System (ADS)

    Ramanjaneyulu, P. S.; Sayi, Y. S.; Ramakumar, K. L.

    2008-08-01

    Quantification of boron in diverse materials of relevance in nuclear technology is essential in view of its high thermal neutron absorption cross section. A simple and sensitive method has been developed for the determination of boron in uranium-aluminum-silicon alloy, based on leaching of boron with 6 M HCl and H2O2, its selective separation by solvent extraction with 2-ethyl-1,3-hexanediol and quantification by spectrophotometry using curcumin. The method has been evaluated by the standard addition method and validated by inductively coupled plasma-atomic emission spectroscopy. The relative standard deviation and absolute detection limit of the method are 3.0% (at the 1σ level) and 12 ng, respectively. All possible sources of uncertainty in the methodology have been individually assessed, following the International Organization for Standardization guidelines. The combined uncertainty is calculated employing uncertainty propagation formulae. The expanded uncertainty in the measurement at the 95% confidence level (coverage factor 2) is 8.840%.
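The ISO-GUM propagation step described above can be sketched for a multiplicative measurement model, where relative standard uncertainties combine in quadrature and the expanded uncertainty applies a coverage factor. Every budget entry below is a hypothetical placeholder, not a value from the paper.

```python
import math

# Hypothetical uncertainty budget for a spectrophotometric boron
# determination (illustrative components, not the paper's figures):
components = {
    "weighing":            0.0010,  # relative u, balance calibration
    "calibration_curve":   0.0300,  # relative u, curcumin spectrophotometry
    "extraction_recovery": 0.0250,  # relative u, solvent extraction step
    "volumetric":          0.0080,  # relative u, glassware and dilution
}

# Combined relative standard uncertainty: quadrature sum of the components
u_rel = math.sqrt(sum(u ** 2 for u in components.values()))

# Expanded uncertainty at ~95% confidence, coverage factor k = 2
k = 2
U_rel = k * u_rel
print(f"combined u = {u_rel:.4f}, expanded U = {100 * U_rel:.2f}%")
```

With these made-up components the expanded uncertainty comes out near 8%, illustrating how a figure like the paper's 8.840% arises from the dominant budget entries.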

  8. QconCATs: design and expression of concatenated protein standards for multiplexed protein quantification.

    PubMed

    Simpson, Deborah M; Beynon, Robert J

    2012-09-01

    Systems biology requires knowledge of the absolute amounts of proteins in order to model biological processes and simulate the effects of changes in specific model parameters. Quantification concatamers (QconCATs) are established as a method to provide multiplexed absolute peptide standards for a set of target proteins in isotope dilution standard experiments. Two or more quantotypic peptides representing each of the target proteins are concatenated into a designer gene that is metabolically labelled with stable isotopes in Escherichia coli or other cellular or cell-free systems. Co-digestion of a known amount of QconCAT with the target proteins generates a set of labelled reference peptide standards for the unlabelled analyte counterparts, and, using an appropriate mass spectrometry platform, comparison of the peptide intensity ratios delivers absolute quantification of the encoded peptides and, in turn, of the target proteins for which they are surrogates. In this review, we discuss the criteria and difficulties associated with surrogate peptide selection and provide examples of the design of QconCATs for quantification of the proteins of the nuclear factor κB pathway.
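The core quantification step, scaling the known QconCAT spike by the light-to-heavy intensity ratio of each surrogate peptide, can be sketched as follows; all intensities and the spike amount are invented for illustration.

```python
# Sketch of absolute quantification from a QconCAT co-digestion. A known
# amount of labelled QconCAT yields heavy reference peptides; the light/
# heavy intensity ratio scales that known amount. Numbers are invented.

qconcat_spike_fmol = 50.0   # labelled QconCAT co-digested with the sample

# Observed MS intensities for two quantotypic peptides of one target protein
peptides = [
    {"light": 8.4e6, "heavy": 4.2e6},   # analyte vs. labelled standard
    {"light": 7.9e6, "heavy": 4.0e6},
]

# Each heavy peptide is present at the spiked amount (1:1 stoichiometry
# within the QconCAT gene), so each ratio gives an independent estimate
estimates = [qconcat_spike_fmol * p["light"] / p["heavy"] for p in peptides]
protein_fmol = sum(estimates) / len(estimates)
print(f"target protein ≈ {protein_fmol:.1f} fmol")
```

In practice, agreement between the surrogate-peptide estimates is itself a useful check on digestion completeness and peptide quantotypicity.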

  9. Nuclear dynamics of radiation-induced foci in euchromatin and heterochromatin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chiolo, Irene; Tang, Jonathan; Georgescu, Walter

    2013-10-01

    Repair of double strand breaks (DSBs) is essential for cell survival and genome integrity. While much is known about the molecular mechanisms involved in DSB repair and checkpoint activation, the roles of nuclear dynamics of radiation-induced foci (RIF) in DNA repair are just beginning to emerge. Here, we summarize results from recent studies that point to distinct features of these dynamics in two different chromatin environments: heterochromatin and euchromatin. We also discuss how nuclear architecture and chromatin components might control these dynamics, and the need for novel quantification methods for a better description and interpretation of these phenomena. These studies are expected to provide new biomarkers for radiation risk and new strategies for cancer detection and treatment.

  10. New Trends in Nuclear Data Research for Medical Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qaim, S. M.

    2005-05-24

    Nuclear methods play an important role in medicine, both in diagnosis and therapy. The status of nuclear data with regard to the production of commonly used diagnostic and therapeutic radionuclides is discussed. The new trends and data needs in the development of potentially useful radionuclides are outlined. They pertain to longer-lived positron emitters (e.g., 64Cu, 76Br, 124I) for studying slow metabolic processes, to positron emitters needed for quantification purposes (94mTc, 86Y, etc.) and to soft radiation emitting therapeutic radionuclides (103Pd, 186Re, 225Ac, etc.). The imaging problems with new positron emitters are discussed. As regards radiation therapy, data on the formation of short-lived activation products in proton therapy are reported.

  11. New Trends in Nuclear Data Research for Medical Applications

    NASA Astrophysics Data System (ADS)

    Qaim, S. M.

    2005-05-01

    Nuclear methods play an important role in medicine, both in diagnosis and therapy. The status of nuclear data with regard to the production of commonly used diagnostic and therapeutic radionuclides is discussed. The new trends and data needs in the development of potentially useful radionuclides are outlined. They pertain to longer-lived positron emitters (e.g., 64Cu, 76Br, 124I) for studying slow metabolic processes, to positron emitters needed for quantification purposes (94mTc, 86Y, etc.) and to soft radiation emitting therapeutic radionuclides (103Pd, 186Re, 225Ac, etc.). The imaging problems with new positron emitters are discussed. As regards radiation therapy, data on the formation of short-lived activation products in proton therapy are reported.

  12. Assessment of a 1H high-resolution magic angle spinning NMR spectroscopy procedure for free sugars quantification in intact plant tissue.

    PubMed

    Delgado-Goñi, Teresa; Campo, Sonia; Martín-Sitjar, Juana; Cabañas, Miquel E; San Segundo, Blanca; Arús, Carles

    2013-08-01

    In most plants, sucrose is the primary product of photosynthesis, the transport form of assimilated carbon, and also one of the main factors determining sweetness in fresh fruits. Traditional methods for sugar quantification (mainly sucrose, glucose and fructose) require obtaining crude plant extracts, which sometimes involves substantial sample manipulation, making the process time-consuming and increasing the risk of sample degradation. Here, we describe and validate a fast method to determine sugar content in intact plant tissue by using high-resolution magic angle spinning nuclear magnetic resonance spectroscopy (HR-MAS NMR). The HR-MAS NMR method was used for quantifying sucrose, glucose and fructose in mesocarp tissues from melon fruits (Cucumis melo var. reticulatus and Cucumis melo var. cantalupensis). The resulting sugar content varied among individual melons, ranging from 1.4 to 7.3 g of sucrose, 0.4 to 2.5 g of glucose, and 0.73 to 2.83 g of fructose (values per 100 g fresh weight). These values were in agreement with those described in the literature for melon fruit tissue, and no significant differences were found when comparing them with those obtained using the traditional enzymatic procedure on melon tissue extracts. The HR-MAS NMR method offers a fast (usually <30 min) and sensitive way to quantify sugars in intact plant tissues; it requires a small amount of tissue (typically 50 mg fw) and avoids the interferences and risks associated with obtaining plant extracts. Furthermore, this method might also allow the quantification of additional metabolites detectable in the plant tissue NMR spectrum.

  13. Uncertainty quantification for nuclear density functional theory and information content of new measurements

    DOE PAGES

    McDonnell, J. D.; Schunck, N.; Higdon, D.; ...

    2015-03-24

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. In addition, the example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.

  14. Stochastic approach for radionuclides quantification

    NASA Astrophysics Data System (ADS)

    Clement, A.; Saurel, N.; Perrin, G.

    2018-01-01

    Gamma spectrometry is a passive non-destructive assay used to quantify radionuclides present in more or less complex objects. Basic methods using empirical calibration with a standard in order to quantify the activity of nuclear materials by determining a calibration coefficient are not applicable to non-reproducible, complex and unique nuclear objects such as waste packages. Package specifications such as composition or geometry change from one package to another and involve a high variability of objects. The current quantification process uses numerical modelling of the measured scene with the few available data, such as geometry or composition. These data are density, material, screen, geometric shape, matrix composition, and matrix and source distribution. Some of them depend strongly on the knowledge of the package data and on the operator's background. The French Commissariat à l'Energie Atomique (CEA) is developing a new methodology to quantify nuclear materials in waste packages and waste drums without operator adjustment and without knowledge of the internal package configuration. This method combines a global stochastic approach, which uses, among others, surrogate models to simulate the gamma attenuation behaviour, a Bayesian approach, which considers conditional probability densities of the problem inputs, and Markov Chain Monte Carlo (MCMC) algorithms, which solve the inverse problem, using the gamma-ray emission spectrum of the radionuclides and the outside dimensions of the objects of interest. The methodology is being tested by quantifying actinide activity in standard sources of different matrices, compositions and configurations, in terms of actinide masses, locations and distributions. Activity uncertainties are taken into account by this adjustment methodology.
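The combination of a simple forward model, Bayesian priors and a Markov Chain Monte Carlo sampler can be sketched as follows; the Poisson counting model, the priors and all numbers are illustrative assumptions, not the CEA surrogate models.

```python
import math
import random

# Toy Bayesian inversion in the spirit of the stochastic approach above:
# observed net counts ~ Poisson(eps * A * t * exp(-mu_x)), where A is the
# activity sought and mu_x an uncertain attenuation term. All illustrative.

eps, t = 0.02, 600.0          # detection efficiency, counting time (s)
observed_counts = 9000        # measured net counts in the peak region

def log_posterior(A, mu_x):
    if A <= 0.0 or mu_x < 0.0:
        return -math.inf
    lam = eps * A * t * math.exp(-mu_x)
    log_like = observed_counts * math.log(lam) - lam      # Poisson (no const.)
    log_prior = -0.5 * ((math.log(A) - math.log(1000.0)) / 1.0) ** 2
    log_prior += -0.5 * ((mu_x - 0.5) / 0.2) ** 2         # prior on attenuation
    return log_like + log_prior

# Random-walk Metropolis over (activity A, attenuation mu_x)
rng = random.Random(1)
A, mu_x = 1000.0, 0.5
samples = []
for i in range(20000):
    A_prop = A + rng.gauss(0.0, 20.0)
    mu_prop = mu_x + rng.gauss(0.0, 0.02)
    if math.log(max(rng.random(), 1e-300)) < log_posterior(A_prop, mu_prop) - log_posterior(A, mu_x):
        A, mu_x = A_prop, mu_prop
    if i >= 5000:                  # discard burn-in
        samples.append(A)

mean_A = sum(samples) / len(samples)
print(f"posterior mean activity ≈ {mean_A:.0f} (arbitrary units)")
```

The posterior spread in A here is driven mostly by the attenuation prior, which mirrors how matrix-composition uncertainty dominates waste-drum assay.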

  15. Uncertainty quantification for nuclear density functional theory and information content of new measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDonnell, J. D.; Schunck, N.; Higdon, D.

    2015-03-24

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. As a result, the example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.

  16. From cutting-edge pointwise cross-section to groupwise reaction rate: A primer

    NASA Astrophysics Data System (ADS)

    Sublet, Jean-Christophe; Fleming, Michael; Gilbert, Mark R.

    2017-09-01

    The nuclear research and development community has a history of using both integral and differential experiments to support accurate lattice-reactor, nuclear reactor criticality and shielding simulations, as well as verification and validation efforts of cross sections and emitted particle spectra. An important aspect of this type of analysis is the proper consideration of the contribution of the neutron spectrum in its entirety, with correct propagation of uncertainties and standard deviations derived from Monte Carlo simulations, to the local and total uncertainty in the simulated reaction rates (RRs), which usually only apply to one application at a time. This paper identifies deficiencies in the traditional treatment, and discusses correct handling of the RR uncertainty quantification and propagation, including details of the cross section components in the RR uncertainty estimates, which are verified for relevant applications. The methodology that rigorously captures the spectral shift and cross section contributions to the uncertainty in the RR is discussed with quantified examples that demonstrate the importance of the proper treatment of the spectrum profile and cross section contributions to the uncertainty in the RR and subsequent response functions. The recently developed inventory code FISPACT-II, when connected to the processed nuclear data libraries TENDL-2015, ENDF/B-VII.1, JENDL-4.0u or JEFF-3.2, forms an enhanced multi-physics platform providing a wide variety of advanced simulation methods for modelling activation, transmutation and burnup protocols, and for simulating radiation damage source terms. The system has extended cutting-edge nuclear data forms, uncertainty quantification and propagation methods, which have been the subject of recent integral and differential, fission, fusion and accelerator validation efforts.
The simulation system is used to accurately and predictively probe, understand and underpin a modern and sustainable understanding of the nuclear physics that is so important for many areas of science and technology; advanced fission and fuel systems, magnetic and inertial confinement fusion, high energy, accelerator physics, medical application, isotope production, earth exploration, astrophysics and homeland security.
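The pointwise-to-groupwise collapse underlying the discussion above, flux-weighted group cross sections whose product with the group fluxes reproduces the pointwise reaction rate, can be sketched with toy data (the cross section, spectrum and group structure are invented, not TENDL/ENDF values).

```python
import numpy as np

# Collapse a pointwise cross section onto a coarse group structure:
#   sigma_g = int_g sigma(E) phi(E) dE / int_g phi(E) dE
#   RR      = sum_g sigma_g * phi_g

def trapz(y, x):
    """Composite trapezoidal rule."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

E = np.logspace(-5, 7, 2000)            # pointwise energy grid (eV)
sigma = 5.0 / np.sqrt(E) + 0.2          # toy 1/v cross section (barn)
phi = E * np.exp(-E / 1.0e6)            # toy spectrum shape (arbitrary units)

group_bounds = np.logspace(-5, 7, 13)   # 12 coarse groups, decade-wide
sigma_g, phi_g = [], []
for lo, hi in zip(group_bounds[:-1], group_bounds[1:]):
    m = (E >= lo) & (E < hi)
    w = trapz(phi[m], E[m])             # group flux
    sigma_g.append(trapz(sigma[m] * phi[m], E[m]) / w)
    phi_g.append(w)

rr_group = sum(s * p for s, p in zip(sigma_g, phi_g))
rr_point = trapz(sigma * phi, E)        # reference pointwise reaction rate
print(rr_group / rr_point)              # close to 1: the collapse preserves RR
```

The small departure from unity comes only from integration segments straddling group boundaries; the spectral-shift and covariance effects the paper addresses appear when the weighting spectrum itself is uncertain.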

  17. Neutron Resonance Densitometry for Particle-like Debris of Melted Fuel

    NASA Astrophysics Data System (ADS)

    Harada, H.; Kitatani, F.; Koizumi, M.; Takamine, J.; Kureta, M.; Tsutiya, H.; Iimura, H.; Seya, M.; Becker, B.; Kopecky, S.; Schillebeeckx, P.

    2014-04-01

    Neutron Resonance Densitometry (NRD) is proposed for the quantification of nuclear materials in particle-like debris of melted fuel from the reactors of the Fukushima Daiichi nuclear power plant. The method is based on a combination of neutron resonance transmission analysis (NRTA) and neutron resonance capture analysis (NRCA). It uses the neutron time-of-flight (TOF) technique with a pulsed white neutron source and a neutron flight path as short as 5 m. The spectrometer for NRCA is made of LaBr3(Ce) detectors. The achievable uncertainty due to counting statistics alone is less than 1% for the determination of Pu and U isotopes.

  18. Image Analysis Algorithms for Immunohistochemical Assessment of Cell Death Events and Fibrosis in Tissue Sections

    PubMed Central

    Krajewska, Maryla; Smith, Layton H.; Rong, Juan; Huang, Xianshu; Hyer, Marc L.; Zeps, Nikolajs; Iacopetta, Barry; Linke, Steven P.; Olson, Allen H.; Reed, John C.; Krajewski, Stan

    2009-01-01

    Cell death is of broad physiological and pathological importance, making quantification of biochemical events associated with cell demise a high priority for experimental pathology. Fibrosis is a common consequence of tissue injury involving necrotic cell death. Using tissue specimens from experimental mouse models of traumatic brain injury, cardiac fibrosis, and cancer, as well as human tumor specimens assembled in tissue microarray (TMA) format, we undertook computer-assisted quantification of specific immunohistochemical and histological parameters that characterize processes associated with cell death. In this study, we demonstrated the utility of image analysis algorithms for color deconvolution, colocalization, and nuclear morphometry to characterize cell death events in tissue specimens: (a) subjected to immunostaining for detecting cleaved caspase-3, cleaved poly(ADP-ribose)-polymerase, cleaved lamin-A, phosphorylated histone H2AX, and Bcl-2; (b) analyzed by terminal deoxyribonucleotidyl transferase–mediated dUTP nick end labeling assay to detect DNA fragmentation; and (c) evaluated with Masson's trichrome staining. We developed novel algorithm-based scoring methods and validated them using TMAs as a high-throughput format. The proposed computer-assisted scoring methods for digital images by brightfield microscopy permit linear quantification of immunohistochemical and histochemical stainings. Examples are provided of digital image analysis performed in automated or semiautomated fashion for successful quantification of molecular events associated with cell death in tissue sections. (J Histochem Cytochem 57:649–663, 2009) PMID:19289554
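The colour-deconvolution step mentioned above can be sketched with the classic Ruifrok–Johnston optical-density unmixing; the stain vectors used here are commonly quoted defaults, not the calibrated values of the study's algorithms.

```python
import numpy as np

# Minimal colour deconvolution in the manner of Ruifrok & Johnston,
# separating haematoxylin from DAB immunostaining (illustrative vectors).

stains = np.array([
    [0.650, 0.704, 0.286],   # haematoxylin OD vector (R, G, B)
    [0.268, 0.570, 0.776],   # DAB OD vector
    [0.0,   0.0,   0.0],     # third channel, filled in below
])
# Complete the basis with the cross product so the matrix is invertible
stains[2] = np.cross(stains[0], stains[1])
stains /= np.linalg.norm(stains, axis=1, keepdims=True)

def unmix(rgb):
    """rgb: (..., 3) intensities in 1..255 -> per-stain OD contributions."""
    od = -np.log10(np.clip(rgb, 1.0, 255.0) / 255.0)   # optical density
    return od @ np.linalg.inv(stains)                  # stain "concentrations"

pixel = np.array([[120.0, 90.0, 60.0]])                # one brownish pixel
conc = unmix(pixel)
print(conc)   # columns: haematoxylin, DAB, residual
```

Thresholding or integrating the per-stain maps over nuclei then yields the kind of algorithm-based scores the study validates on tissue microarrays.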

  19. Validation of a quantitative NMR method for suspected counterfeit products exemplified on determination of benzethonium chloride in grapefruit seed extracts.

    PubMed

    Bekiroglu, Somer; Myrberg, Olle; Ostman, Kristina; Ek, Marianne; Arvidsson, Torbjörn; Rundlöf, Torgny; Hakkarainen, Birgit

    2008-08-05

    A 1H-nuclear magnetic resonance (NMR) spectroscopy method for the quantitative determination of benzethonium chloride (BTC) as a constituent of grapefruit seed extract was developed. The method was validated, assessing its specificity, linearity, range, and precision, as well as accuracy, limit of quantification and robustness. The method includes quantification using an internal reference standard, 1,3,5-trimethoxybenzene, and is regarded as simple, rapid, and easy to implement. A commercial grapefruit seed extract was studied and the experiments were performed on spectrometers operating at two different fields, 300 and 600 MHz proton frequencies, the former with a broad band (BB) probe and the latter equipped with both a BB probe and a CryoProbe. The average concentration for the product sample was 78.0, 77.8 and 78.4 mg/ml using the 300 MHz BB probe, the 600 MHz BB probe and the CryoProbe, respectively. The standard deviation and relative standard deviation (R.S.D., in parentheses) for the average concentrations were 0.2 (0.3%), 0.3 (0.4%) and 0.3 mg/ml (0.4%), respectively.
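Internal-standard qNMR quantification reduces to scaling the integral ratio by proton counts and molar masses. A sketch with invented integrals and standard mass follows; the molar masses are literature values for benzethonium chloride and 1,3,5-trimethoxybenzene.

```python
# Internal-standard qNMR: the analyte amount follows from the integral
# ratio scaled by proton counts and molar masses. Integrals and the
# standard mass are invented for illustration.

M_btc = 448.08     # g/mol, benzethonium chloride (analyte)
M_tmb = 168.19     # g/mol, 1,3,5-trimethoxybenzene (internal standard)

m_tmb_mg = 10.0                    # mass of internal standard added
I_analyte, n_analyte = 1.52, 2     # integral and protons of the BTC signal
I_std, n_std = 3.00, 3             # integral and protons of the TMB signal

# m_a = (I_a / I_s) * (n_s / n_a) * (M_a / M_s) * m_s
m_btc_mg = (I_analyte / I_std) * (n_std / n_analyte) * (M_btc / M_tmb) * m_tmb_mg
print(f"BTC in sample ≈ {m_btc_mg:.2f} mg")
```

Because the result depends only on an intramolecular integral ratio, no response-factor calibration is needed, which is what makes qNMR attractive for suspected counterfeits.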

  20. PET Quantification of the Norepinephrine Transporter in Human Brain with (S,S)-18F-FMeNER-D2.

    PubMed

    Moriguchi, Sho; Kimura, Yasuyuki; Ichise, Masanori; Arakawa, Ryosuke; Takano, Harumasa; Seki, Chie; Ikoma, Yoko; Takahata, Keisuke; Nagashima, Tomohisa; Yamada, Makiko; Mimura, Masaru; Suhara, Tetsuya

    2017-07-01

    Norepinephrine transporter (NET) in the brain plays important roles in human cognition and the pathophysiology of psychiatric disorders. Two radioligands, (S,S)-11C-MRB and (S,S)-18F-FMeNER-D2, have been used for imaging NETs in the thalamus and midbrain (including locus coeruleus) using PET in humans. However, NET density in the equally important cerebral cortex has not been well quantified because of unfavorable kinetics with (S,S)-11C-MRB and defluorination with (S,S)-18F-FMeNER-D2, which can complicate NET quantification in the cerebral cortex adjacent to the skull containing defluorinated 18F radioactivity. In this study, we have established analysis methods for quantification of NET density in the brain, including the cerebral cortex, using (S,S)-18F-FMeNER-D2 PET. Methods: We analyzed our previous (S,S)-18F-FMeNER-D2 PET data of 10 healthy volunteers dynamically acquired for 240 min with arterial blood sampling. The effect of defluorination on the NET quantification in the superficial cerebral cortex was evaluated by establishing the time stability of NET density estimations with an arterial input 2-tissue-compartment model, which guided the less-invasive reference tissue model and area under the time-activity curve methods to accurately quantify NET density in all brain regions including the cerebral cortex. Results: Defluorination of (S,S)-18F-FMeNER-D2 became prominent toward the latter half of the 240-min scan. Total distribution volumes in the superficial cerebral cortex increased with scan durations beyond 120 min. We verified that 90-min dynamic scans provided a sufficient amount of data for quantification of NET density unaffected by defluorination. Reference tissue model binding potential values from the 90-min scan data and area under the time-activity curve ratios of 70- to 90-min data allowed for the accurate quantification of NET density in the cerebral cortex.
Conclusion: We have established methods for quantification of NET densities in the brain, including the cerebral cortex, unaffected by defluorination using (S,S)-18F-FMeNER-D2. These results suggest that we can accurately quantify NET density with a 90-min (S,S)-18F-FMeNER-D2 scan in broad brain areas. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.

  1. Evaluation of Neutron-induced Cross Sections and their Related Covariances with Physical Constraints

    NASA Astrophysics Data System (ADS)

    De Saint Jean, C.; Archier, P.; Privas, E.; Noguère, G.; Habert, B.; Tamagno, P.

    2018-02-01

    Nuclear data, along with numerical methods and the associated calculation schemes, continue to play a key role in reactor design, reactor core operating parameters calculations, fuel cycle management and criticality safety calculations. Due to the intensive use of Monte-Carlo calculations reducing numerical biases, the final accuracy of neutronic calculations increasingly depends on the quality of nuclear data used. This paper gives a broad picture of all ingredients treated by nuclear data evaluators during their analyses. After giving an introduction to nuclear data evaluation, we present implications of using the Bayesian inference to obtain evaluated cross sections and related uncertainties. In particular, a focus is made on systematic uncertainties appearing in the analysis of differential measurements as well as advantages and drawbacks one may encounter by analyzing integral experiments. The evaluation work is in general done independently in the resonance and in the continuum energy ranges giving rise to inconsistencies in evaluated files. For future evaluations on the whole energy range, we call attention to two innovative methods used to analyze several nuclear reaction models and impose constraints. Finally, we discuss suggestions for possible improvements in the evaluation process to master the quantification of uncertainties. These are associated with experiments (microscopic and integral), nuclear reaction theories and the Bayesian inference.

  2. Development of validated high-performance thin layer chromatography for quantification of aristolochic acid in different species of the Aristolochiaceae family.

    PubMed

    Agrawal, Poonam; Laddha, Kirti

    2017-04-01

    This study was undertaken to isolate and quantify aristolochic acid in Aristolochia indica stem and Apama siliquosa root. Aristolochic acid is an important biomarker component present in the Aristolochiaceae family. The isolation method involved simple solvent extraction, precipitation and further purification using recrystallization. The structure of the compound was confirmed using infrared spectroscopy, mass spectrometry and nuclear magnetic resonance. A specific and rapid high-performance thin layer chromatography (HPTLC) method was developed for analysis of aristolochic acid. The method involved separation on silica gel 60 F254 plates using the single solvent system n-hexane:chloroform:methanol. The method showed a good linear relationship in the range 0.4-2.0 μg/spot with r2 = 0.998. The limit of detection and limit of quantification were 62.841 ng/spot and 209.47 ng/spot, respectively. The proposed validated HPTLC method was found to be an easy to use, accurate and convenient method that could be successfully used for standardization and quality assessment of herbal material as well as formulations containing different species of the Aristolochiaceae family. Copyright © 2016. Published by Elsevier B.V.
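Linearity and LOD/LOQ figures such as those above are typically derived from the calibration line via the ICH formulas LOD = 3.3 s/S and LOQ = 10 s/S; a sketch with invented calibration data follows.

```python
import numpy as np

# ICH-style estimation of LOD and LOQ from a calibration line, as used in
# HPTLC method validation: s is the residual standard deviation of the
# regression and S its slope. The calibration points are invented.

amount = np.array([0.4, 0.8, 1.2, 1.6, 2.0])                 # µg/spot
response = np.array([410.0, 820.0, 1185.0, 1610.0, 2005.0])  # peak area (a.u.)

slope, intercept = np.polyfit(amount, response, 1)
residuals = response - (slope * amount + intercept)
s = np.sqrt(np.sum(residuals ** 2) / (len(amount) - 2))      # residual std. dev.

lod = 3.3 * s / slope
loq = 10.0 * s / slope
print(f"LOD ≈ {1000 * lod:.0f} ng/spot, LOQ ≈ {1000 * loq:.0f} ng/spot")
```

Note that LOQ/LOD is fixed at 10/3.3 ≈ 3.03 by construction, consistent with the roughly threefold gap between the paper's reported limits.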

  3. A Pragmatic Smoothing Method for Improving the Quality of the Results in Atomic Spectroscopy

    NASA Astrophysics Data System (ADS)

    Bennun, Leonardo

    2017-07-01

    A new smoothing method is presented that improves the identification and quantification of spectral functions by exploiting prior knowledge of the signals expected to be quantified. These signals are used as weighting coefficients in the smoothing algorithm. The smoothing method was conceived for application in atomic and nuclear spectroscopies, preferably in techniques where net counts are proportional to acquisition time, such as particle induced X-ray emission (PIXE) and other X-ray fluorescence spectroscopic methods. This algorithm, when properly applied, distorts neither the form nor the intensity of the signal, so it is well suited for all kinds of spectroscopic techniques. The method is extremely effective at reducing high-frequency noise in the signal, much more so than a single rectangular smooth of the same width. As with all smoothing techniques, the proposed method improves the precision of the results, but in this case we also found a systematic improvement in their accuracy. The improvement in the quality of the results when the method is applied to real experimental data remains to be evaluated; we expect better characterization of the net peak areas and smaller detection and quantification limits. We have applied this method to signals that obey Poisson statistics, but with the same ideas and criteria it could be applied to time series. In the general case, when this algorithm is applied to experimental results, the characteristic functions required for the weighted smoothing should be obtained from a system with strong stability; if the sought signals are not perfectly clean, the method should be applied with care.
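A minimal sketch of such signal-shape-weighted smoothing follows, assuming a Gaussian line shape as the known reference profile; the spectrum, peak and kernel parameters are all invented for illustration.

```python
import numpy as np

# Signal-shape-weighted smoothing of a Poisson counting spectrum: the
# expected line shape supplies the weights of the moving average instead
# of a flat rectangular window. All parameters are illustrative.

rng = np.random.default_rng(42)
x = np.arange(512)
true = 40.0 * np.exp(-0.5 * ((x - 256) / 6.0) ** 2) + 5.0   # peak + background
spectrum = rng.poisson(true).astype(float)                  # counting noise

# Weights: the known/expected line shape, normalized to preserve net area
half = 12
kernel = np.exp(-0.5 * (np.arange(-half, half + 1) / 6.0) ** 2)
kernel /= kernel.sum()

smoothed = np.convolve(spectrum, kernel, mode="same")
rect = np.convolve(spectrum, np.full(2 * half + 1, 1.0 / (2 * half + 1)),
                   mode="same")

# Compare both smooths against the known truth
for name, y in (("weighted", smoothed), ("rectangular", rect)):
    rmse = float(np.sqrt(np.mean((y - true) ** 2)))
    print(name, round(rmse, 3))
```

Because the kernel sums to one, net areas are preserved, which matters for the peak-area quantification the abstract emphasizes.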

  4. Quantification of aquifer properties with surface nuclear magnetic resonance in the Platte River valley, central Nebraska, using a novel inversion method

    USGS Publications Warehouse

    Irons, Trevor P.; Hobza, Christopher M.; Steele, Gregory V.; Abraham, Jared D.; Cannia, James C.; Woodward, Duane D.

    2012-01-01

    Surface nuclear magnetic resonance, a noninvasive geophysical method, measures a signal directly related to the amount of water in the subsurface. This allows for low-cost quantitative estimates of hydraulic parameters. In practice, however, additional factors influence the signal, complicating interpretation. The U.S. Geological Survey, in cooperation with the Central Platte Natural Resources District, evaluated whether hydraulic parameters derived from surface nuclear magnetic resonance data could provide valuable input into groundwater models used for evaluating water-management practices. Two calibration sites in Dawson County, Nebraska, were chosen based on previous detailed hydrogeologic and geophysical investigations. At both sites, surface nuclear magnetic resonance data were collected, and derived parameters were compared with results from four constant-discharge aquifer tests previously conducted at those same sites. Additionally, borehole electromagnetic-induction flowmeter data were analyzed as a less-expensive surrogate for traditional aquifer tests. Building on recent work, a novel surface nuclear magnetic resonance modeling and inversion method was developed that incorporates electrical conductivity and effects due to magnetic-field inhomogeneities, both of which can have a substantial impact on the data. After comparing surface nuclear magnetic resonance inversions at the two calibration sites, the nuclear magnetic-resonance-derived parameters were compared with previously performed aquifer tests in the Central Platte Natural Resources District. This comparison served as a blind test for the developed method. The nuclear magnetic-resonance-derived aquifer parameters were in agreement with results of aquifer tests where the environmental noise allowed data collection and the aquifer test zones overlapped with the surface nuclear magnetic resonance testing. 
In some cases, the previously performed aquifer tests were not designed to fully characterize the aquifer, and surface nuclear magnetic resonance was able to provide the missing data. In favorable locations, surface nuclear magnetic resonance can provide valuable noninvasive information about aquifer parameters and should be a useful tool for groundwater managers in Nebraska.

  5. Solid-state evaluation and polymorphic quantification of venlafaxine hydrochloride raw materials using the Rietveld method.

    PubMed

    Bernardi, Larissa S; Ferreira, Fábio F; Cuffini, Silvia L; Campos, Carlos E M; Monti, Gustavo A; Kuminek, Gislaine; Oliveira, Paulo R; Cardoso, Simone G

    2013-12-15

    Venlafaxine hydrochloride (VEN) is an antidepressant drug widely used for the treatment of depression. The purpose of this study was to carry out the preparation and solid state characterization of the pure polymorphs (Forms 1 and 2) and the polymorphic identification and quantification of four commercially-available VEN raw materials. These two polymorphic forms were obtained from different crystallization methods and characterized by X-ray Powder Diffraction (XRPD), Diffuse Reflectance Infrared Fourier Transform (DRIFT), Raman Spectroscopy (RS), liquid and solid state Nuclear Magnetic Resonance (NMR and ssNMR) spectroscopies, Differential Scanning Calorimetry (DSC), and Scanning Electron Microscopy (SEM) techniques. The main differences were observed by DSC and XRPD, and the latter was chosen as the standard technique for the identification and quantification studies in combination with the Rietveld method for the commercial raw materials (VEN1-VEN4) acquired from different manufacturers. Additionally, Form 1 and Form 2 can be clearly distinguished from their 13C ssNMR spectra. Through the analysis, it was possible to conclude that VEN1 and VEN2 were composed only of Form 1, while VEN3 and VEN4 were a mixture of Forms 1 and 2. Additionally, the Rietveld refinement was successfully applied to quantify the polymorphic ratio for VEN3 and VEN4. Copyright © 2013 Elsevier B.V. All rights reserved.
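
    In Rietveld quantitative phase analysis, the weight fraction of each polymorph follows from the refined scale factors via the standard relation W_i = S_i(ZMV)_i / Σ_j S_j(ZMV)_j. A minimal sketch of that calculation; the scale factors and unit-cell data below are invented for illustration and are not the refined VEN values.

```python
# Hedged sketch: polymorph weight fractions from Rietveld scale factors,
# W_i = S_i * (Z M V)_i / sum_j S_j * (Z M V)_j.
# All numerical inputs below are illustrative, not actual VEN refinement results.

def rietveld_weight_fractions(phases):
    """phases: list of dicts with refined scale factor S, formula units Z,
    molecular mass M (g/mol), and unit-cell volume V (cubic angstroms)."""
    szmv = [p["S"] * p["Z"] * p["M"] * p["V"] for p in phases]
    total = sum(szmv)
    return [x / total for x in szmv]

# Two hypothetical polymorphs of the same molecule (identical Z and M).
form1 = {"S": 2.0e-6, "Z": 4, "M": 313.86, "V": 1600.0}
form2 = {"S": 1.0e-6, "Z": 4, "M": 313.86, "V": 1650.0}
w1, w2 = rietveld_weight_fractions([form1, form2])
print(round(w1, 3), round(w2, 3))
```

    The fractions always sum to one, so a two-phase mixture like VEN3 or VEN4 is fully described by a single refined ratio.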

  6. Probing cytoskeletal pre-stress and nuclear mechanics in endothelial cells with spatiotemporally controlled (de-)adhesion kinetics on micropatterned substrates

    PubMed Central

    Versaevel, Marie; Riaz, Maryam; Corne, Tobias; Grevesse, Thomas; Lantoine, Joséphine; Mohammed, Danahe; Bruyère, Céline; Alaimo, Laura; De Vos, Winnok H.; Gabriele, Sylvain

    2017-01-01

    The mechanical properties of living cells reflect their propensity to migrate and respond to external forces. Both cellular and nuclear stiffnesses are strongly influenced by the rigidity of the extracellular matrix (ECM) through reorganization of the cyto- and nucleoskeletal protein connections. Changes in this architectural continuum affect cell mechanics and underlie many pathological conditions. In this context, an accurate and combined quantification of the mechanical properties of both cells and nuclei can contribute to a better understanding of cellular (dys-)function. To address this challenge, we have established a robust method for probing cellular and nuclear deformation during spreading and detachment from micropatterned substrates. We show that (de-)adhesion kinetics of endothelial cells are modulated by substrate stiffness and rely on the actomyosin network. We combined this approach with measurements of cell stiffness by magnetic tweezers to show that relaxation dynamics can be considered as a reliable parameter of cellular pre-stress in adherent cells. During the adhesion stage, large cellular and nuclear deformations occur over a long time span (>60 min). Conversely, nuclear deformation and condensed chromatin are relaxed in a few seconds after detachment. Finally, our results show that accumulation of farnesylated prelamin leads to modifications of the nuclear viscoelastic properties, as reflected by increased nuclear relaxation times. Our method offers an original and non-intrusive way of simultaneously gauging cellular and nuclear mechanics, which can be extended to high-throughput screens of pathological conditions and potential countermeasures. PMID:27111836

  7. Probing cytoskeletal pre-stress and nuclear mechanics in endothelial cells with spatiotemporally controlled (de-)adhesion kinetics on micropatterned substrates.

    PubMed

    Versaevel, Marie; Riaz, Maryam; Corne, Tobias; Grevesse, Thomas; Lantoine, Joséphine; Mohammed, Danahe; Bruyère, Céline; Alaimo, Laura; De Vos, Winnok H; Gabriele, Sylvain

    2017-01-02

    The mechanical properties of living cells reflect their propensity to migrate and respond to external forces. Both cellular and nuclear stiffnesses are strongly influenced by the rigidity of the extracellular matrix (ECM) through reorganization of the cyto- and nucleoskeletal protein connections. Changes in this architectural continuum affect cell mechanics and underlie many pathological conditions. In this context, an accurate and combined quantification of the mechanical properties of both cells and nuclei can contribute to a better understanding of cellular (dys-)function. To address this challenge, we have established a robust method for probing cellular and nuclear deformation during spreading and detachment from micropatterned substrates. We show that (de-)adhesion kinetics of endothelial cells are modulated by substrate stiffness and rely on the actomyosin network. We combined this approach with measurements of cell stiffness by magnetic tweezers to show that relaxation dynamics can be considered as a reliable parameter of cellular pre-stress in adherent cells. During the adhesion stage, large cellular and nuclear deformations occur over a long time span (>60 min). Conversely, nuclear deformation and condensed chromatin are relaxed in a few seconds after detachment. Finally, our results show that accumulation of farnesylated prelamin leads to modifications of the nuclear viscoelastic properties, as reflected by increased nuclear relaxation times. Our method offers an original and non-intrusive way of simultaneously gauging cellular and nuclear mechanics, which can be extended to high-throughput screens of pathological conditions and potential countermeasures.

  8. Quantification of Lignin and Its Structural Features in Plant Biomass Using 13C Lignin as Internal Standard for Pyrolysis-GC-SIM-MS.

    PubMed

    van Erven, Gijs; de Visser, Ries; Merkx, Donny W H; Strolenberg, Willem; de Gijsel, Peter; Gruppen, Harry; Kabel, Mirjam A

    2017-10-17

    Understanding the mechanisms underlying plant biomass recalcitrance at the molecular level can only be achieved by accurate analyses of both the content and structural features of the molecules involved. Current quantification of lignin is, however, majorly based on unspecific gravimetric analysis after sulfuric acid hydrolysis. Hence, our research aimed at specific lignin quantification with concurrent characterization of its structural features. Hereto, for the first time, a polymeric 13C lignin was used as internal standard (IS) for lignin quantification via analytical pyrolysis coupled to gas chromatography with mass-spectrometric detection in selected ion monitoring mode (py-GC-SIM-MS). In addition, relative response factors (RRFs) for the various pyrolysis products obtained were determined and applied. First, 12C and 13C lignin were isolated from nonlabeled and uniformly 13C labeled wheat straw, respectively, and characterized by heteronuclear single quantum coherence (HSQC), nuclear magnetic resonance (NMR), and py-GC/MS. The two lignin isolates were found to have identical structures. Second, 13C-IS based lignin quantification by py-GC-SIM-MS was validated in reconstituted biomass model systems with known contents of the 12C lignin analogue and was shown to be extremely accurate (>99.9%, R2 > 0.999) and precise (RSD < 1.5%). Third, 13C-IS based lignin quantification was applied to four common poaceous biomass sources (wheat straw, barley straw, corn stover, and sugar cane bagasse), and lignin contents were in good agreement with the total gravimetrically determined lignin contents. Our robust method proves to be a promising alternative for the high-throughput quantification of lignin in milled biomass samples directly and simultaneously provides a direct insight into the structural features of lignin.
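
    The core of an isotope-labelled internal-standard quantification is a ratio: the response-corrected areas of the unlabelled (sample) pyrolysis products divided by those of the 13C-labelled (spiked) products, scaled by the known mass of internal standard added. A minimal sketch under assumed inputs; the product names, peak areas, and response factors below are invented for illustration.

```python
# Hedged sketch of 13C internal-standard (IS) quantification as described above.
# Lignin content = (sum of RRF-corrected 12C areas / sum of RRF-corrected
# 13C areas) * mass of 13C lignin IS spiked into the sample.
# All peak areas and RRF values are illustrative assumptions.

def lignin_content(areas_12c, areas_13c, rrf, is_added_mg):
    """Return lignin mass (mg) in the sample from paired 12C/13C pyrograms."""
    s12 = sum(areas_12c[k] / rrf[k] for k in areas_12c)  # sample products
    s13 = sum(areas_13c[k] / rrf[k] for k in areas_13c)  # labelled IS products
    return is_added_mg * s12 / s13

rrf = {"guaiacol": 1.0, "syringol": 1.3, "4-vinylphenol": 0.8}
a12 = {"guaiacol": 5.0e5, "syringol": 3.9e5, "4-vinylphenol": 2.0e5}
a13 = {"guaiacol": 2.5e5, "syringol": 1.95e5, "4-vinylphenol": 1.0e5}
print(lignin_content(a12, a13, rrf, is_added_mg=0.5))  # 12C/13C ratio = 2 -> 1.0 mg
```

    Because the labelled IS is structurally identical to the analyte, matrix and pyrolysis effects cancel in the ratio, which is what makes the approach accurate.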

  9. Quantification of Lignin and Its Structural Features in Plant Biomass Using 13C Lignin as Internal Standard for Pyrolysis-GC-SIM-MS

    PubMed Central

    2017-01-01

    Understanding the mechanisms underlying plant biomass recalcitrance at the molecular level can only be achieved by accurate analyses of both the content and structural features of the molecules involved. Current quantification of lignin is, however, majorly based on unspecific gravimetric analysis after sulfuric acid hydrolysis. Hence, our research aimed at specific lignin quantification with concurrent characterization of its structural features. Hereto, for the first time, a polymeric 13C lignin was used as internal standard (IS) for lignin quantification via analytical pyrolysis coupled to gas chromatography with mass-spectrometric detection in selected ion monitoring mode (py-GC-SIM-MS). In addition, relative response factors (RRFs) for the various pyrolysis products obtained were determined and applied. First, 12C and 13C lignin were isolated from nonlabeled and uniformly 13C labeled wheat straw, respectively, and characterized by heteronuclear single quantum coherence (HSQC), nuclear magnetic resonance (NMR), and py-GC/MS. The two lignin isolates were found to have identical structures. Second, 13C-IS based lignin quantification by py-GC-SIM-MS was validated in reconstituted biomass model systems with known contents of the 12C lignin analogue and was shown to be extremely accurate (>99.9%, R2 > 0.999) and precise (RSD < 1.5%). Third, 13C-IS based lignin quantification was applied to four common poaceous biomass sources (wheat straw, barley straw, corn stover, and sugar cane bagasse), and lignin contents were in good agreement with the total gravimetrically determined lignin contents. Our robust method proves to be a promising alternative for the high-throughput quantification of lignin in milled biomass samples directly and simultaneously provides a direct insight into the structural features of lignin. PMID:28926698

  10. Digital pathology: elementary, rapid and reliable automated image analysis.

    PubMed

    Bouzin, Caroline; Saini, Monika L; Khaing, Kyi-Kyi; Ambroise, Jérôme; Marbaix, Etienne; Grégoire, Vincent; Bol, Vanesa

    2016-05-01

    Slide digitalization has brought pathology to a new era, including powerful image analysis possibilities. However, despite being a powerful prognostic tool, automated analysis of immunostaining on digital images is still not implemented worldwide in routine clinical practice. Digitalized biopsy sections from two independent cohorts of patients, immunostained for membrane or nuclear markers, were quantified with two automated methods. The first was based on stained cell counting through tissue segmentation, while the second relied upon stained area proportion within tissue sections. Different steps of image preparation, such as automated tissue detection, folds exclusion and scanning magnification, were also assessed and validated. Quantification of either stained cells or the stained area was found to be highly correlated for all tested markers. Both methods were also correlated with visual scoring performed by a pathologist. For an equivalent reliability, quantification of the stained area is, however, faster and easier to fine-tune and is therefore more compatible with time constraints for prognosis. This work provides an incentive for the implementation of automated immunostaining analysis with a stained area method in routine laboratory practice. © 2015 John Wiley & Sons Ltd.
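
    The "stained area proportion" readout reduces to counting pixels above a staining-intensity threshold within the detected tissue region. A minimal sketch on a synthetic image; the threshold and intensity values are assumptions for illustration, not the parameters used by the authors.

```python
import numpy as np

# Hedged sketch of a stained-area-proportion measurement: threshold a
# staining-intensity image inside a tissue mask and report stained pixels
# as a fraction of all tissue pixels. Synthetic data, illustrative threshold.

def stained_area_fraction(intensity, tissue_mask, threshold):
    stained = (intensity >= threshold) & tissue_mask
    return stained.sum() / tissue_mask.sum()

# Synthetic 4x4 "image": 12 tissue pixels, 3 of them at or above threshold.
img = np.array([[0.9, 0.1, 0.2, 0.8],
                [0.1, 0.1, 0.2, 0.7],
                [0.2, 0.3, 0.1, 0.2],
                [0.0, 0.0, 0.1, 0.3]])
mask = np.ones((4, 4), dtype=bool)
mask[3, :] = False  # bottom row treated as background, excluded from tissue
print(stained_area_fraction(img, mask, threshold=0.7))  # 3 of 12 -> 0.25
```

    Unlike cell counting, this measure needs no segmentation of individual cells, which is one reason it is faster and easier to tune.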

  11. Fifty Years of THERP and Human Reliability Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ronald L. Boring

    2012-06-01

    In 1962 at a Human Factors Society symposium, Alan Swain presented a paper introducing a Technique for Human Error Rate Prediction (THERP). This was followed in 1963 by a Sandia Laboratories monograph outlining basic human error quantification using THERP and, in 1964, by a special journal edition of Human Factors on quantification of human performance. Throughout the 1960s, Swain and his colleagues focused on collecting human performance data for the Sandia Human Error Rate Bank (SHERB), primarily in connection with supporting the reliability of nuclear weapons assembly in the US. In 1969, Swain met with Jens Rasmussen of Risø National Laboratory and discussed the applicability of THERP to nuclear power applications. By 1975, in WASH-1400, Swain had articulated the use of THERP for nuclear power applications, and the approach was finalized in the watershed publication of the NUREG/CR-1278 in 1983. THERP is now 50 years old, and remains the most well known and most widely used HRA method. In this paper, the author discusses the history of THERP, based on published reports and personal communication and interviews with Swain. The author also outlines the significance of THERP. The foundations of human reliability analysis are found in THERP: human failure events, task analysis, performance shaping factors, human error probabilities, dependence, event trees, recovery, and pre- and post-initiating events were all introduced in THERP. While THERP is not without its detractors, and it is showing signs of its age in the face of newer technological applications, the longevity of THERP is a testament to its tremendous significance. THERP started the field of human reliability analysis. This paper concludes with a discussion of THERP in the context of newer methods, which can be seen as extensions of or departures from Swain’s pioneering work.
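
    Among the concepts the paper credits to THERP, the dependence model is easy to make concrete: the conditional probability of failing a task, given failure of the preceding task, is adjusted by the assessed dependence level using the standard THERP equations from NUREG/CR-1278. A minimal sketch; the nominal HEP of 0.001 is an arbitrary example value.

```python
# Hedged sketch of the THERP dependence model: conditional human error
# probability (HEP) for task N given failure of task N-1, at the five
# standard dependence levels (NUREG/CR-1278). The input HEP is illustrative.

def conditional_hep(p, dependence):
    """Conditional HEP for a nominal HEP p under a given dependence level."""
    formulas = {
        "zero":     p,                   # tasks independent
        "low":      (1 + 19 * p) / 20,
        "moderate": (1 + 6 * p) / 7,
        "high":     (1 + p) / 2,
        "complete": 1.0,                 # failure of N-1 guarantees failure of N
    }
    return formulas[dependence]

# A nominal HEP of 0.001 under increasing dependence on the preceding task:
for level in ("zero", "low", "moderate", "high", "complete"):
    print(level, conditional_hep(0.001, level))
```

    Even low dependence raises a 1E-3 HEP by more than an order of magnitude, which is why dependence assessment dominates many THERP quantifications.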

  12. Philosophy of ATHEANA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bley, D.C.; Cooper, S.E.; Forester, J.A.

    ATHEANA, a second-generation Human Reliability Analysis (HRA) method, integrates advances in psychology with engineering, human factors, and Probabilistic Risk Analysis (PRA) disciplines to provide an HRA quantification process and PRA modeling interface that can accommodate and represent human performance in real nuclear power plant events. The method uses the characteristics of serious accidents identified through retrospective analysis of serious operational events to set priorities in a search process for significant human failure events, unsafe acts, and error-forcing context (unfavorable plant conditions combined with negative performance-shaping factors). ATHEANA has been tested in a demonstration project at an operating pressurized water reactor.

  13. Extension of the International Atomic Energy Agency phantom study in image quantification: results of multicentre evaluation in Croatia.

    PubMed

    Grošev, Darko; Gregov, Marin; Wolfl, Miroslava Radić; Krstonošić, Branislav; Debeljuh, Dea Dundara

    2018-06-07

    To make quantitative methods of nuclear medicine more available, four centres in Croatia participated in the national intercomparison study, following the materials and methods used in the previous international study organized by the International Atomic Energy Agency (IAEA). The study task was to calculate the activities of four 133Ba sources (T1/2=10.54 years; Eγ=356 keV) using planar and single-photon emission computed tomography (SPECT) or SPECT/CT acquisitions of the sources inside a water-filled cylindrical phantom. The sources were previously calibrated by the US National Institute of Standards and Technology. A triple-energy window was used for scatter correction. Planar studies were corrected for attenuation (AC) using the conjugate-view method. For SPECT/CT studies, data from X-ray computed tomography were used for attenuation correction (CT-AC), whereas for SPECT-only acquisition, the Chang-AC method was applied. Using the lessons learned from the IAEA study, data were acquired according to the harmonized data acquisition protocol, and the acquired images were then processed using centralized data analysis. The accuracy of the activity quantification was evaluated as the ratio R between the calculated activity and the value obtained from the National Institute of Standards and Technology. For planar studies, R=1.06±0.08; for SPECT/CT study using CT-AC, R=1.00±0.08; and for Chang-AC, R=0.89±0.12. The results are in accordance with those obtained within the larger IAEA study and confirm that the SPECT/CT method is the most appropriate for accurate activity quantification.
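
    The conjugate-view method used for the planar studies takes the geometric mean of anterior and posterior counts, corrects it for attenuation along the phantom, and converts counts to activity with a camera calibration factor. A minimal sketch under stated assumptions: the attenuation coefficient, calibration factor, count values, and phantom thickness below are all illustrative, not the study's figures.

```python
import math

# Hedged sketch of conjugate-view planar quantification: activity from the
# geometric mean of opposed-view counts with a mid-depth attenuation factor.
# MU and all numeric inputs are illustrative assumptions.

MU_356KEV_WATER = 0.011  # assumed linear attenuation coefficient, per mm

def conjugate_view_activity(counts_ant, counts_post, thickness_mm,
                            cal_cps_per_mbq, t_acq_s):
    geometric_mean = math.sqrt(counts_ant * counts_post)          # counts
    attenuation = math.exp(-MU_356KEV_WATER * thickness_mm / 2)   # mid-depth factor
    cps = geometric_mean / t_acq_s                                # count rate
    return cps / (attenuation * cal_cps_per_mbq)                  # MBq

# Hypothetical 10-minute acquisition through a 200 mm water phantom:
act = conjugate_view_activity(1.2e5, 0.8e5, 200.0,
                              cal_cps_per_mbq=90.0, t_acq_s=600.0)
print(round(act, 3))
```

    The geometric mean makes the result depend on total phantom thickness rather than on the (unknown) source depth, which is the method's key property.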

  14. MeV per Nucleon Ion Irradiation of Nuclear Materials with High Energy Synchrotron X-ray Characterization

    DOE PAGES

    Pellin, M. J.; Yacout, Abdellatif M.; Mo, Kun; ...

    2016-01-14

    The combination of MeV/Nucleon ion irradiation (e.g. 133 MeV Xe) and high energy synchrotron x-ray characterization (e.g. at the Argonne Advanced Photon Source, APS) provides a powerful characterization method to understand radiation effects and to rapidly screen materials for the nuclear reactor environment. Ions in this energy range penetrate ~10 μm into materials. Over this range, the physical interactions vary (electronic stopping, nuclear stopping and added interstitials). Spatially specific x-ray (and TEM and nanoindentation) analysis allow individual quantification of these various effects. Hard x-rays provide the penetration depth needed to analyze even nuclear fuels. Here, this combination of synchrotron x-ray and MeV/Nucleon ion irradiation is demonstrated on U-Mo fuels. A preliminary look at HT-9 steels is also presented. We suggest that a hard x-ray facility with in situ MeV/nucleon irradiation capability would substantially accelerate the rate of discovery for extreme materials.

  15. Demetalation of Fe, Mn, and Cu chelates and complexes: application to the NMR analysis of micronutrient fertilizers.

    PubMed

    López-Rayo, Sandra; Lucena, Juan J; Laghi, Luca; Cremonini, Mauro A

    2011-12-28

    The application of nuclear magnetic resonance (NMR) for the quality control of fertilizers based on Fe(3+), Mn(2+), and Cu(2+) chelates and complexes is precluded by the strong paramagnetism of metals. Recently, a method based on the use of ferrocyanide has been described to remove iron from commercial iron chelates based on the o,o-EDDHA [ethylenediamine-N,N'bis(2-hydroxyphenylacetic)acid] chelating agent for their analysis and quantification by NMR. The present work extended that procedure to other paramagnetic ions (manganese and copper), to other chelating agents, EDTA (ethylenediaminetetraacetic acid) and IDHA [N-(1,2-dicarboxyethyl)-d,l-aspartic acid], and to the complexing agents gluconate and heptagluconate. Results showed that removal of the paramagnetic ions was complete, yielding 1H NMR spectra characterized by narrow peaks. Quantification of the ligands by NMR and high-performance liquid chromatography confirmed their complete recovery. The NMR analysis enabled detection and quantification of unknown impurities without the need for pure compounds as internal standards.

  16. Highly Effective DNA Extraction Method for Nuclear Short Tandem Repeat Testing of Skeletal Remains from Mass Graves

    PubMed Central

    Davoren, Jon; Vanek, Daniel; Konjhodzić, Rijad; Crews, John; Huffine, Edwin; Parsons, Thomas J.

    2007-01-01

    Aim: To quantitatively compare a silica extraction method with a commonly used phenol/chloroform extraction method for DNA analysis of specimens exhumed from mass graves. Methods: DNA was extracted from twenty randomly chosen femur samples, using the International Commission on Missing Persons (ICMP) silica method, based on the Qiagen Blood Maxi Kit, and compared with the DNA extracted by the standard phenol/chloroform-based method. The efficacy of the extraction methods was compared by real time polymerase chain reaction (PCR) to measure DNA quantity and the presence of inhibitors and by amplification with the PowerPlex 16 (PP16) multiplex nuclear short tandem repeat (STR) kit. Results: DNA quantification showed that the silica-based method extracted on average 1.94 ng of DNA per gram of bone (range 0.25-9.58 ng/g), compared with only 0.68 ng/g for the organic method (range 0.0016-4.4880 ng/g). Inhibition tests showed that there were on average significantly lower levels of PCR inhibitors in DNA isolated by the organic method. When amplified with PP16, all samples extracted by the silica-based method produced 16 full loci profiles, while only 75% of the DNA extracts obtained by the organic technique amplified 16 loci profiles. Conclusions: The silica-based extraction method showed better results in nuclear STR typing from degraded bone samples than a commonly used phenol/chloroform method. PMID:17696302

  17. Challenges in leveraging existing human performance data for quantifying the IDHEAS HRA method

    DOE PAGES

    Liao, Huafei N.; Groth, Katrina; Stevens-Adams, Susan

    2015-07-29

    Our article documents an exploratory study for collecting and using human performance data to inform human error probability (HEP) estimates for a new human reliability analysis (HRA) method, the IntegrateD Human Event Analysis System (IDHEAS). The method was based on cognitive models and mechanisms underlying human behaviour and employs a framework of 14 crew failure modes (CFMs) to represent human failures typical for human performance in nuclear power plant (NPP) internal, at-power events [1]. A decision tree (DT) was constructed for each CFM to assess the probability of the CFM occurring in different contexts. Data needs for IDHEAS quantification are discussed. Then, the data collection framework and process is described and how the collected data were used to inform HEP estimation is illustrated with two examples. Next, five major technical challenges are identified for leveraging human performance data for IDHEAS quantification. Furthermore, these challenges reflect the data needs specific to IDHEAS. More importantly, they also represent the general issues with current human performance data and can provide insight for a path forward to support HRA data collection, use, and exchange for HRA method development, implementation, and validation.

  18. Comparisons of Wilks’ and Monte Carlo Methods in Response to the 10CFR50.46(c) Proposed Rulemaking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Hongbin; Szilard, Ronaldo; Zou, Ling

    The Nuclear Regulatory Commission (NRC) is proposing a new rulemaking on emergency core system/loss-of-coolant accident (LOCA) performance analysis. In the proposed rulemaking, designated as 10CFR50.46(c), the US NRC put forward an equivalent cladding oxidation criterion as a function of cladding pre-transient hydrogen content. The proposed rulemaking imposes more restrictive and burnup-dependent cladding embrittlement criteria; consequently, nearly all the fuel rods in a reactor core need to be analyzed under LOCA conditions to demonstrate compliance to the safety limits. New analysis methods are required to provide a thorough characterization of the reactor core in order to identify the locations of the limiting rods as well as to quantify the safety margins under LOCA conditions. With the new analysis method presented in this work, the limiting transient case and the limiting rods can be easily identified to quantify the safety margins in response to the proposed new rulemaking. In this work, the best-estimate plus uncertainty (BEPU) analysis capability for large break LOCA with the new cladding embrittlement criteria using the RELAP5-3D code is established and demonstrated with a reduced set of uncertainty parameters. Both the direct Monte Carlo method and the Wilks’ nonparametric statistical method can be used to perform uncertainty quantification. Wilks’ method has become the de-facto industry standard to perform uncertainty quantification in BEPU LOCA analyses. Despite its widespread adoption by the industry, the use of small sample sizes to infer a statement of compliance with the existing 10CFR50.46 rule has been a major cause of unrealized operational margin in today’s BEPU methods. Moreover, the debate on the proper interpretation of the Wilks’ theorem in the context of safety analyses is not fully resolved yet, even more than two decades after its introduction in the frame of safety analyses in the nuclear industry.
This represents both a regulatory and application risk in rolling out new methods. With the 10CFR50.46(c) proposed rulemaking, the deficiencies of the Wilks’ approach are further exacerbated. The direct Monte Carlo approach offers a robust alternative to perform uncertainty quantification within the context of BEPU analyses. In this work, the Monte Carlo method is compared with the Wilks’ method in response to the NRC 10CFR50.46(c) proposed rulemaking.
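
    The small sample sizes at issue come from the first-order Wilks criterion: the smallest number of code runs n such that the maximum of the n outputs bounds the desired coverage quantile with the desired confidence, i.e. 1 - coverage^n >= confidence. A minimal sketch of that calculation, which reproduces the classic 95%/95% answer of 59 runs.

```python
import math

# First-order Wilks sample size: smallest n with 1 - coverage**n >= confidence,
# i.e. the probability that the max of n runs exceeds the 'coverage' quantile
# is at least 'confidence'.

def wilks_first_order_n(coverage=0.95, confidence=0.95):
    return math.ceil(math.log(1 - confidence) / math.log(coverage))

print(wilks_first_order_n())            # classic 95/95 criterion: 59 runs
print(wilks_first_order_n(0.95, 0.99))  # tighter confidence needs more runs
```

    A direct Monte Carlo estimate of the same quantile instead uses thousands of runs, trading computational cost for a much less conservative (and less debated) margin statement, which is the comparison the work above carries out.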

  19. Advanced Quantification of Plutonium Ionization Potential to Support Nuclear Forensic Evaluations by Resonance Ionization Mass Spectrometry

    DTIC Science & Technology

    2015-06-01

    Figure 1. Uranium-235 fission chain reaction, from [1].

  20. Rapid quantification of viable Legionella in nuclear cooling tower waters using filter cultivation, fluorescent in situ hybridization and solid-phase cytometry.

    PubMed

    Baudart, J; Guillaume, C; Mercier, A; Lebaron, P; Binet, M

    2015-05-01

    To develop a rapid and sensitive method to quantify viable Legionella spp. in cooling tower water samples. A rapid, culture-based method capable of quantifying as few as 600 Legionella microcolonies per litre within 2 days in industrial waters was developed. The method combines a short cultivation step of microcolonies on GVPC agar plate, specific detection of Legionella cells by a fluorescent in situ hybridization (FISH) approach, and a sensitive enumeration using a solid-phase cytometer. Following optimization of the cultivation conditions, the qualitative and quantitative performance of the method was assessed and the method was applied to 262 nuclear power plant cooling water samples. The performance of this method was in accordance with the culture method (NF-T 90-431) for Legionella enumeration. The rapid detection of viable Legionella in water is essential for the effective monitoring of this pathogenic bacterium in the main water sources involved in the transmission of legionellosis (Legionnaires' disease). The new method proposed here appears to be a robust, efficient and innovative means for rapidly quantifying cultivable Legionella in cooling tower water samples within 48 h. © 2015 The Society for Applied Microbiology.

  1. Influence of the colloidal structure of dairy gels on milk fat fusion behavior: quantification of the liquid fat content by in situ quantitative proton nuclear magnetic resonance spectroscopy (isq 1H NMR).

    PubMed

    Bouteille, Romain; Perez, Jeanne; Khifer, Farid; Jouan-Rimbaud-Bouveresse, Delphine; Lecanu, Bruno; This, Hervé

    2013-04-01

    Dairy gels (DG), such as yoghurts, contain both solid and liquid fats at the time of consumption, as their temperature rises to between 10 and 24 °C after being introduced into the mouth at 4 °C. The mass ratio between solid and liquid fats, which depends on the temperature, impacts the organoleptic properties of DG. As the ordinary methods for determining this ratio can only be applied to samples consisting mainly of fat materials, a fat extraction step needs to be added into the analytical process when applied to DG, which prevents the study of the potential impact of their colloidal structure on milk fat fusion behavior. In situ quantitative proton nuclear magnetic resonance spectroscopy (isq 1H NMR) was investigated as a method for direct measurements in DG: at temperatures between 20.0 and 70.0 °C, the liquid fat content and the composition of triacylglycerols of the liquid phase (in terms of alkyl chain length) were determined. Spectra of isolated milk fat also enable the quantification of the double bonds of triacylglycerols. Statistical tests showed no significant difference between isolated milk fat and milk fat inside a DG in terms of melting behavior: the fat globule membrane does not seem to have a significant influence on the fat melting behavior. © 2013 Institute of Food Technologists®

  2. Assessment of 1H NMR-based metabolomics analysis for normalization of urinary metals against creatinine.

    PubMed

    Cassiède, Marc; Nair, Sindhu; Dueck, Meghan; Mino, James; McKay, Ryan; Mercier, Pascal; Quémerais, Bernadette; Lacy, Paige

    2017-01-01

    Proton nuclear magnetic resonance (1H NMR, or NMR) spectroscopy and inductively coupled plasma-mass spectrometry (ICP-MS) are commonly used for metabolomics and metal analysis in urine samples. However, creatinine quantification by NMR for the purpose of normalization of urinary metals has not been validated. We assessed the validity of using NMR analysis for creatinine quantification in human urine samples in order to allow normalization of urinary metal concentrations. NMR and ICP-MS techniques were used to measure metabolite and metal concentrations in urine samples from 10 healthy subjects. For metabolite analysis, two magnetic field strengths (600 and 700 MHz) were utilized. In addition, creatinine concentrations were determined by using the Jaffe method. Creatinine levels were strongly correlated (R2=0.99) between NMR and Jaffe methods. The NMR spectra were deconvoluted with a target database containing 151 metabolites that are present in urine. A total of 50 metabolites showed good correlation (R2=0.7-1.0) at 600 and 700 MHz. Metal concentrations determined after NMR-measured creatinine normalization were comparable to previous reports. NMR analysis provided robust urinary creatinine quantification, and was sufficient for normalization of urinary metal concentrations. We found that NMR-measured creatinine-normalized urinary metal concentrations in our control subjects were similar to general population levels in Canada and the United Kingdom. Copyright © 2016 Elsevier B.V. All rights reserved.
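
    The normalization step the study validates is a simple ratio: dividing a urinary metal concentration by the creatinine concentration expresses the metal per gram of creatinine, correcting for urine dilution. A minimal sketch; the concentrations below are invented for illustration.

```python
# Hedged sketch of creatinine normalization of a urinary metal measurement.
# Dividing ug/L (metal, e.g. from ICP-MS) by g/L (creatinine, e.g. from NMR
# or the Jaffe method) yields ug metal per g creatinine. Values are invented.

def normalize_to_creatinine(metal_ug_per_l, creatinine_g_per_l):
    """Return metal concentration in micrograms per gram creatinine."""
    return metal_ug_per_l / creatinine_g_per_l

# e.g. a hypothetical 2.4 ug/L metal level in urine with 1.2 g/L creatinine:
print(normalize_to_creatinine(2.4, 1.2))  # -> 2.0 ug/g creatinine
```

    Because both dilute and concentrated urine samples scale metal and creatinine together, the ratio is far more comparable across spot samples than the raw ug/L value.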

  3. Age of heart disease presentation and dysmorphic nuclei in patients with LMNA mutations.

    PubMed

    Core, Jason Q; Mehrabi, Mehrsa; Robinson, Zachery R; Ochs, Alexander R; McCarthy, Linda A; Zaragoza, Michael V; Grosberg, Anna

    2017-01-01

    Nuclear shape defects are a distinguishing characteristic in laminopathies, cancers, and other pathologies. Correlating these defects to the symptoms, mechanisms, and progression of disease requires unbiased, quantitative, and high-throughput means of quantifying nuclear morphology. To accomplish this, we developed a method of automatically segmenting fluorescently stained nuclei in 2D microscopy images and then classifying them as normal or dysmorphic based on three geometric features of the nucleus, using a package of Matlab codes. As a test case, cultured skin-fibroblast nuclei from individuals with an LMNA splice-site mutation (c.357-2A>G), an LMNA nonsense mutation (c.736 C>T, pQ246X) in exon 4, an LMNA missense mutation (c.1003C>T, pR335W) in exon 6, Hutchinson-Gilford Progeria Syndrome, or no LMNA mutations were analyzed. For each cell type, the percentage of dysmorphic nuclei and other morphological features, such as average nuclear area and average eccentricity, were obtained. Our Matlab-based procedure matched the accuracy of manual counting of dysmorphic nuclei by blind observers while being significantly more consistent. The automatic quantification of nuclear defects revealed a correlation between in vitro results and age of patients at initial symptom onset. Our results demonstrate the method's utility in experimental studies of diseases affecting nuclear shape through automated, unbiased, and accurate identification of dysmorphic nuclei.
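
    The geometric-feature step can be made concrete: given a binary mask of one segmented nucleus, area and eccentricity follow from the pixel coordinates and their second central moments. A minimal sketch, assuming an eccentricity threshold of 0.9 as the dysmorphism criterion; that threshold and the feature choice are illustrative assumptions, not the authors' actual three-feature classifier.

```python
import numpy as np

# Hedged sketch: area and eccentricity of a segmented nucleus from the
# covariance of its pixel coordinates, then a simple elongation-based flag.
# The 0.9 threshold is an assumption for illustration.

def nucleus_features(mask):
    ys, xs = np.nonzero(mask)
    area = len(xs)                                   # pixel count
    cov = np.cov(np.stack([xs, ys]).astype(float))   # 2x2 coordinate covariance
    eigvals = np.sort(np.linalg.eigvalsh(cov))       # ascending axis variances
    eccentricity = np.sqrt(1.0 - eigvals[0] / eigvals[1])
    return area, eccentricity

def is_dysmorphic(mask, ecc_threshold=0.9):
    _, ecc = nucleus_features(mask)
    return ecc > ecc_threshold

round_blob = np.zeros((9, 9), dtype=bool); round_blob[3:6, 3:6] = True  # compact
thin_blob = np.zeros((9, 9), dtype=bool);  thin_blob[4, 1:8] = True     # elongated
print(is_dysmorphic(round_blob), is_dysmorphic(thin_blob))  # False True
```

    A real pipeline would combine several such shape descriptors, but each reduces to the same pattern: segment, extract coordinates, compute a scalar feature, threshold.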

  4. Radiostrontium accumulation in animal bones: development of a radiochemical method by ultra low-level liquid scintillation counting for its quantification.

    PubMed

    Iammarino, Marco; Dell'Oro, Daniela; Bortone, Nicola; Mangiacotti, Michele; Chiaravalle, Antonio Eugenio

    2018-03-31

    Strontium-90 (90Sr) is a fission product resulting from the use of uranium and plutonium in nuclear reactors and weapons. Consequently, it may be found in the environment as a result of nuclear fallout, nuclear weapons testing, and improper waste management. When present in the environment, strontium-90 may be taken into the animal body through drinking water, food, or inhaled air. The primary health effects are bone tumors and tumors of the blood-cell-forming organs, due to the beta particles emitted by both 90Sr and its decay product yttrium-90 (90Y). Another health concern is the inhibition of calcification and bone deformities in animals. Radiometric methods for the determination of 90Sr in animal bones are currently lacking. This article describes a radiochemical method for the determination of 90Sr in animal bones by ultra low-level liquid scintillation counting. The method's precision and trueness were demonstrated through validation tests (CV% = 12.4%; mean recovery = 98.4%). A detection limit and decision threshold of 8 and 3 mBecquerel (Bq) kg-1, respectively, are another strong point of this analytical procedure. The new radiochemical method permits the selective extraction of 90Sr without interferences; it is suitable for radiocontamination surveillance programs and also represents an improvement with respect to food safety controls.

  5. Linear Array Ultrasonic Test Results from Alkali-Silica Reaction (ASR) Specimens

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clayton, Dwight A; Khazanovich, Dr. Lev; Salles, Lucio

    2016-04-01

    The purpose of the U.S. Department of Energy Office of Nuclear Energy’s Light Water Reactor Sustainability (LWRS) Program is to develop technologies and other solutions that can improve the reliability, sustain the safety, and extend the operating lifetimes of nuclear power plants (NPPs) beyond 60 years. Since many important safety structures in an NPP are constructed of concrete, inspection techniques must be developed and tested to evaluate their internal condition. In-service containment structures generally do not allow for the destructive measures necessary to validate the accuracy of these inspection techniques. This creates a need for comparative testing of the various nondestructive evaluation (NDE) measurement techniques on concrete specimens with known material properties, voids, internal microstructure flaws, and reinforcement locations. This report presents results of the ultrasound evaluation of four concrete slabs with varying levels of ASR damage present. This included an investigation of the experimental results, as well as a supplemental simulation considering the effect of ASR damage by elasto-dynamic wave propagation using a finite integration technique method. It was found that the Hilbert Transform Indicator (HTI), developed for quantification of freeze/thaw damage in concrete structures, could also be successfully utilized for quantification of ASR damage.
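    Envelope-based indicators such as the HTI rest on the Hilbert transform of the ultrasonic signal; a minimal numpy-only sketch of that envelope computation, on a synthetic waveform rather than the report's data:

```python
# Sketch: envelope of an ultrasonic A-scan via the analytic signal
# (FFT-based Hilbert transform). The Gaussian tone burst is synthetic.
import numpy as np

def envelope(signal):
    """Magnitude of the analytic signal (numpy-only Hilbert transform)."""
    n = len(signal)
    spec = np.fft.fft(signal)
    h = np.zeros(n)
    h[0] = 1
    if n % 2 == 0:
        h[n // 2] = 1
        h[1:n // 2] = 2
    else:
        h[1:(n + 1) // 2] = 2
    return np.abs(np.fft.ifft(spec * h))

t = np.linspace(0, 1, 1024, endpoint=False)
burst = np.exp(-((t - 0.5) ** 2) / 0.005) * np.sin(2 * np.pi * 100 * t)
env = envelope(burst)
# the envelope should recover the Gaussian: unit peak centered at t = 0.5
print(abs(env.max() - 1.0) < 0.1, abs(t[env.argmax()] - 0.5) < 0.02)
```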

  6. Entropy based quantification of Ki-67 positive cell images and its evaluation by a reader study

    NASA Astrophysics Data System (ADS)

    Niazi, M. Khalid Khan; Pennell, Michael; Elkins, Camille; Hemminger, Jessica; Jin, Ming; Kirby, Sean; Kurt, Habibe; Miller, Barrie; Plocharczyk, Elizabeth; Roth, Rachel; Ziegler, Rebecca; Shana'ah, Arwa; Racke, Fred; Lozanski, Gerard; Gurcan, Metin N.

    2013-03-01

    Presence of Ki-67, a nuclear protein, is typically used to measure cell proliferation. The quantification of the Ki-67 proliferation index is performed visually by the pathologist; however, this is subject to inter- and intra-reader variability. Automated techniques utilizing digital image analysis by computers have emerged. The large variations in specimen preparation, staining, and imaging, as well as true biological heterogeneity of tumor tissue, often result in variable intensities in Ki-67 stained images. These variations affect the performance of currently developed methods. To optimize the segmentation of Ki-67 stained cells, one should define a data-dependent transformation that accounts for these color variations instead of a fixed linear transformation to separate different hues. To address these issues in images of tissue stained with Ki-67, we propose a methodology that exploits the intrinsic properties of CIE L∗a∗b∗ color space to translate this complex problem into an automatic entropy-based thresholding problem. The developed method was evaluated through two reader studies with pathology residents and expert hematopathologists. Agreement between the proposed method and the expert pathologists was good (CCC = 0.80).
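    Entropy-based histogram thresholding, the core operation named above, is commonly realized as Kapur's maximum-entropy method; the sketch below illustrates that idea on a toy histogram and is not the authors' exact CIE L∗a∗b∗ pipeline:

```python
# Sketch of entropy-based (Kapur-style) thresholding on a 1-D intensity
# histogram: pick the threshold maximizing the summed Shannon entropies
# of the foreground and background distributions.
import numpy as np

def max_entropy_threshold(hist):
    p = hist.astype(float) / hist.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, len(p)):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        p0, p1 = p[:t] / w0, p[t:] / w1
        h = -np.sum(p0[p0 > 0] * np.log(p0[p0 > 0])) \
            - np.sum(p1[p1 > 0] * np.log(p1[p1 > 0]))
        if h > best_h:
            best_t, best_h = t, h
    return best_t

# Bimodal toy histogram: dark peak near bin 2, bright peak near bin 11
hist = np.array([0, 5, 30, 5, 0, 0, 0, 0, 0, 0, 5, 30, 5, 0, 0, 0])
print(max_entropy_threshold(hist))  # a bin between the two modes
```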

  7. Nuclear RNA quantification in protoplast cell-cycle phases.

    PubMed

    Bergounioux, C; Perennes, C; Brown, S C; Gadal, P

    1988-01-01

    Using acridine orange staining and flow cytometry, the DNA and RNA levels (arbitrary units) of individual cells may be established. Here, this method has been applied to nuclei isolated from plant protoplasts during culture. The specificity of the technique was validated for such plant material: ribonuclease markedly reduced nuclear staining without modifying the DNA histogram; ribonuclease inhibitor prevented the action of released cell nucleases; and protoplasts cultivated with actinomycin D did not synthesize RNA. RNA synthesis was first evident 18 h after Petunia hybrida protoplasts had been put into culture. An increase of RNA above a critical level was required for cells to be able to initiate DNA replication from G1; this state was termed G1B. G2 nuclei had an RNA:DNA ratio similar to that of G1 nuclei.

  8. A proteomic insight into vitellogenesis during tick ovary maturation.

    PubMed

    Xavier, Marina Amaral; Tirloni, Lucas; Pinto, Antônio F M; Diedrich, Jolene K; Yates, John R; Mulenga, Albert; Logullo, Carlos; da Silva Vaz, Itabajara; Seixas, Adriana; Termignoni, Carlos

    2018-03-16

    Ticks are arthropod ectoparasites of importance for public and veterinary health. The understanding of tick oogenesis and embryogenesis could contribute to the development of novel control methods. However, to date, studies on the temporal dynamics of proteins during ovary development have not been reported. In the present study we followed the protein profile during ovary maturation. Proteomic analysis of ovary extracts was performed by liquid chromatography-tandem mass spectrometry (LC-MS/MS) using a shotgun strategy, in addition to dimethyl labelling-based protein quantification. A total of 3,756 proteins were identified, which were functionally annotated into 30 categories. About 80% of the annotated proteins belong to categories related to basal metabolism, such as protein synthesis and modification machineries, nuclear regulation, cytoskeleton, proteasome machinery, transcriptional machinery, energetic metabolism, extracellular matrix/cell adhesion, immunity, oxidation/detoxification metabolism, signal transduction, and storage. The abundance of selected proteins involved in yolk uptake and degradation, as well as vitellin accumulation during ovary maturation, was assessed using dimethyl-labelling quantification. In conclusion, the proteins identified in this study provide a framework for future studies to elucidate tick development and validate candidate targets for novel control methods.

  9. Magic Angle Spinning NMR Metabolomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhi Hu, Jian

    Nuclear Magnetic Resonance (NMR) spectroscopy is a non-destructive, quantitative, reproducible, untargeted and unbiased method that requires no or minimal sample preparation, and is one of the leading analytical tools for metabonomics research [1-3]. The ease of quantification and the lack of need for prior knowledge about the compounds present in a sample make NMR advantageous over other techniques [1,4]. 1H NMR is especially attractive because protons are present in virtually all metabolites and its NMR sensitivity is high, enabling the simultaneous identification and monitoring of a wide range of low molecular weight metabolites.

  10. Methods for external event screening quantification: Risk Methods Integration and Evaluation Program (RMIEP) methods development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ravindra, M.K.; Banon, H.

    1992-07-01

    In this report, the scoping quantification procedures for external events in probabilistic risk assessments of nuclear power plants are described. External event analysis in a PRA has three important goals: (1) the analysis should be complete in that all events are considered; (2) by following selected screening criteria, the more significant events are identified for detailed analysis; (3) the selected events are analyzed in depth by taking into account the unique features of the events: hazard, fragility of structures and equipment, external-event initiated accident sequences, etc. Based on the above goals, external event analysis may be considered a three-stage process: Stage I: Identification and Initial Screening of External Events; Stage II: Bounding Analysis; Stage III: Detailed Risk Analysis. In the present report, first, a review of published PRAs is given to focus on the significance and treatment of external events in full-scope PRAs. Except for seismic, flooding, fire, and extreme wind events, the contributions of other external events to plant risk have been found to be negligible. Second, scoping methods for external events not covered in detail in the NRC's PRA Procedures Guide are provided. For this purpose, bounding analyses for transportation accidents, extreme winds and tornadoes, aircraft impacts, turbine missiles, and chemical release are described.

  11. Evaluation of phosphorus characterization in broiler ileal digesta, manure, and litter samples: (31)P-NMR vs. HPLC.

    PubMed

    Leytem, A B; Kwanyuen, P; Plumstead, P W; Maguire, R O; Brake, J

    2008-01-01

    Using 31-phosphorus nuclear magnetic resonance spectroscopy ((31)P-NMR) to characterize phosphorus (P) in animal manures and litter has become a popular technique in the area of nutrient management. To date, there has been no published work evaluating P quantification in manure/litter samples with (31)P-NMR compared to other accepted methods such as high performance liquid chromatography (HPLC). To evaluate the use of (31)P-NMR to quantify myo-inositol hexakisphosphate (phytate) in ileal digesta, manure, and litter from broilers, we compared results obtained from both (31)P-NMR and a more traditional HPLC method. The quantification of phytate in all samples was very consistent between the two methods, with linear regressions having slopes ranging from 0.94 to 1.07 and r² values of 0.84 to 0.98. We compared the concentration of total monoester P determined with (31)P-NMR with the total inositol P content determined with HPLC and found a strong linear relationship between the two measurements, with slopes ranging from 0.91 to 1.08 and r² values of 0.73 to 0.95. This suggests that (31)P-NMR is a very reliable method for quantifying P compounds in manure/litter samples.
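    The method comparison reported above boils down to an ordinary least-squares slope and r²; a sketch with made-up data values standing in for the paired measurements:

```python
# Sketch: agreement between two analytical methods (e.g., HPLC vs. NMR
# phytate measurements) via OLS slope, intercept, and r^2. Data are
# illustrative, not the study's measurements.
import numpy as np

def method_agreement(x, y):
    slope, intercept = np.polyfit(x, y, 1)
    r = np.corrcoef(x, y)[0, 1]
    return slope, intercept, r ** 2

hplc = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # phytate P, arbitrary units
nmr = np.array([1.1, 1.9, 3.2, 3.9, 5.1])    # same samples, second method
slope, intercept, r2 = method_agreement(hplc, nmr)
print(round(slope, 2), round(r2, 2))  # 1.0 0.99
```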

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burr, Tom; Croft, Stephen; Jarman, Kenneth D.

    The various methods of nondestructive assay (NDA) of special nuclear material (SNM) have applications in nuclear nonproliferation, including detection and identification of illicit SNM at border crossings, and quantifying SNM at nuclear facilities for safeguards. No assay method is complete without “error bars,” which provide one way of expressing confidence in the assay result. Consequently, NDA specialists typically quantify total uncertainty in terms of “random” and “systematic” components, and then specify error bars for the total mass estimate in multiple items. Uncertainty quantification (UQ) for NDA has always been important, but it is recognized that greater rigor is needed and achievable using modern statistical methods. To this end, we describe the extent to which the guideline for expressing uncertainty in measurements (GUM) can be used for NDA. Also, we propose improvements over GUM for NDA by illustrating UQ challenges that it does not address, including calibration with errors in predictors, model error, and item-specific biases. A case study is presented using low-resolution NaI spectra and applying the enrichment meter principle to estimate the U-235 mass in an item. The case study illustrates how to update the current American Society for Testing and Materials guide for application of the enrichment meter principle using gamma spectra from a NaI detector.
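    In its simplest single-standard form, the enrichment meter principle mentioned in the case study reduces to a proportionality between the net 185.7 keV count rate and enrichment; a hedged sketch with hypothetical calibration values:

```python
# Sketch of the enrichment meter principle: for an "infinitely thick"
# uranium item, the net 185.7 keV gamma count rate is proportional to
# U-235 enrichment, so a one-point calibration converts rate to wt%.
# Calibration numbers below are hypothetical.

def enrichment_from_rate(net_rate_cps, cal_rate_cps, cal_enrichment):
    """Single-standard calibration: count rate -> wt% U-235."""
    return cal_enrichment * net_rate_cps / cal_rate_cps

# calibration standard: 4.95 wt% measured at 200 cps; unknown item at 120 cps
print(round(enrichment_from_rate(120.0, 200.0, 4.95), 2))  # 2.97
```

A real application would also propagate the counting statistics and calibration uncertainty into the enrichment estimate, which is exactly where the UQ issues discussed above enter.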

  13. Computational nuclear quantum many-body problem: The UNEDF project

    NASA Astrophysics Data System (ADS)

    Bogner, S.; Bulgac, A.; Carlson, J.; Engel, J.; Fann, G.; Furnstahl, R. J.; Gandolfi, S.; Hagen, G.; Horoi, M.; Johnson, C.; Kortelainen, M.; Lusk, E.; Maris, P.; Nam, H.; Navratil, P.; Nazarewicz, W.; Ng, E.; Nobre, G. P. A.; Ormand, E.; Papenbrock, T.; Pei, J.; Pieper, S. C.; Quaglioni, S.; Roche, K. J.; Sarich, J.; Schunck, N.; Sosonkina, M.; Terasaki, J.; Thompson, I.; Vary, J. P.; Wild, S. M.

    2013-10-01

    The UNEDF project was a large-scale collaborative effort that applied high-performance computing to the nuclear quantum many-body problem. The primary focus of the project was on constructing, validating, and applying an optimized nuclear energy density functional, which entailed a wide range of pioneering developments in microscopic nuclear structure and reactions, algorithms, high-performance computing, and uncertainty quantification. UNEDF demonstrated that close associations among nuclear physicists, mathematicians, and computer scientists can lead to novel physics outcomes built on algorithmic innovations and computational developments. This review showcases a wide range of UNEDF science results to illustrate this interplay.

  14. Uncertainty quantification applied to the radiological characterization of radioactive waste.

    PubMed

    Zaffora, B; Magistris, M; Saporta, G; Chevalier, J-P

    2017-09-01

    This paper describes the process adopted at the European Organization for Nuclear Research (CERN) to quantify uncertainties affecting the characterization of very-low-level radioactive waste. Radioactive waste is a by-product of the operation of high-energy particle accelerators. Radioactive waste must be characterized to ensure its safe disposal in final repositories. Characterizing radioactive waste means establishing the list of radionuclides together with their activities. The estimated activity levels are compared to the limits given by the national authority of the waste disposal. The quantification of the uncertainty affecting the concentration of the radionuclides is therefore essential to estimate the acceptability of the waste in the final repository but also to control the sorting, volume reduction and packaging phases of the characterization process. The characterization method consists of estimating the activity of produced radionuclides either by experimental methods or statistical approaches. The uncertainties are estimated using classical statistical methods and uncertainty propagation. A mixed multivariate random vector is built to generate random input parameters for the activity calculations. The random vector is a robust tool to account for the unknown radiological history of legacy waste. This analytical technique is also particularly useful to generate random chemical compositions of materials when the trace element concentrations are not available or cannot be measured. The methodology was validated using a waste population of legacy copper activated at CERN. The methodology introduced here represents a first approach for the uncertainty quantification (UQ) of the characterization process of waste produced at particle accelerators. Copyright © 2017 Elsevier Ltd. All rights reserved.
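    The Monte Carlo propagation of random input parameters into an activity estimate can be sketched as below; the distributions and nominal value are illustrative stand-ins, not CERN's actual characterization model:

```python
# Sketch: Monte Carlo uncertainty propagation for an activity estimate.
# Sample uncertain inputs (hypothetical relative flux and production
# factors) and report the mean and a 95% interval of the output.
import random

random.seed(1)

def sampled_activity(n_samples=50_000):
    activities = []
    for _ in range(n_samples):
        flux = random.gauss(1.0, 0.10)       # relative particle flux
        prod = random.gauss(1.0, 0.05)       # relative production factor
        activities.append(100.0 * flux * prod)   # nominal activity 100 Bq/g
    activities.sort()
    mean = sum(activities) / len(activities)
    lo = activities[int(0.025 * len(activities))]
    hi = activities[int(0.975 * len(activities))]
    return mean, (lo, hi)

mean, (lo, hi) = sampled_activity()
print(lo < 100.0 < hi)  # the interval brackets the nominal activity
```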

  15. COMPARISON OF BIOASSAY AND ENZYME-LINKED IMMUNOSORBENT ASSAY FOR QUANTIFICATION OF 'SPODOPTERA FRUGIPERDA' NUCLEAR POLYHEDROSIS VIRUS IN SOIL

    EPA Science Inventory

    Standard curves with known amounts of Spodoptera frugiperda nuclear polyhedrosis virus (NPV) in soil were established with a bioassay and with an enzyme-linked immunosorbent assay (ELISA). The bioassay detected as few as 4 x 10 to the 4th power polyhedral inclusion bodies (PIB)/g...

  16. STEM VQ Method, Using Scanning Transmission Electron Microscopy (STEM) for Accurate Virus Quantification

    DTIC Science & Technology

    2017-02-02

    Accurate virus quantification is sought, but a perfect method still eludes the scientific community. Electron microscopy (EM) ... provides morphology data and counts all viral particles, including partial or noninfectious particles; however, EM methods ... consistent, reproducible virus quantification method called Scanning Transmission Electron Microscopy – Virus Quantification (STEM-VQ), which simplifies ...

  17. Determination of Oversulphated Chondroitin Sulphate and Dermatan Sulphate in unfractionated heparin by (1)H-NMR - Collaborative study for quantification and analytical determination of LoD.

    PubMed

    McEwen, I; Mulloy, B; Hellwig, E; Kozerski, L; Beyer, T; Holzgrabe, U; Wanko, R; Spieser, J-M; Rodomonte, A

    2008-12-01

    Oversulphated Chondroitin Sulphate (OSCS) and Dermatan Sulphate (DS) in unfractionated heparins can be identified by nuclear magnetic resonance spectrometry (NMR). The limit of detection (LoD) of OSCS is 0.1% relative to the heparin content. This LoD is obtained at a signal-to-noise ratio (S/N) of 2000:1 of the heparin methyl signal. Quantification is best obtained by comparing peak heights of the OSCS and heparin methyl signals. Reproducibility of less than 10% relative standard deviation (RSD) has been obtained. The accuracy of quantification was good.
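    Quantification by peak-height ratio, as described above, is a one-line computation; a hedged sketch with a hypothetical response factor:

```python
# Sketch: estimating OSCS content from NMR methyl peak heights, assuming
# simple proportionality between the height ratio and the weight ratio.
# `response_factor` is a hypothetical calibration constant.

def oscs_percent(height_oscs, height_heparin, response_factor=1.0):
    """Percent OSCS relative to heparin from methyl peak heights."""
    return 100.0 * response_factor * height_oscs / height_heparin

# An OSCS peak 1/1000 the height of the heparin methyl signal -> 0.1 %,
# i.e. the reported limit of detection at S/N 2000:1
print(oscs_percent(1.0, 1000.0))  # 0.1
```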

  18. Integral Full Core Multi-Physics PWR Benchmark with Measured Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forget, Benoit; Smith, Kord; Kumar, Shikhar

    In recent years, the importance of modeling and simulation has been highlighted extensively in the DOE research portfolio, with concrete examples in nuclear engineering in the CASL and NEAMS programs. These research efforts and similar efforts worldwide aim at the development of high-fidelity multi-physics analysis tools for the simulation of current and next-generation nuclear power reactors. Like all analysis tools, verification and validation is essential to guarantee proper functioning of the software and methods employed. The current approach relies mainly on the validation of single-physics phenomena (e.g. critical experiments, flow loops, etc.), and there is a lack of the relevant multiphysics benchmark measurements that are necessary to validate the high-fidelity methods being developed today. This work introduces a new multi-cycle full-core Pressurized Water Reactor (PWR) depletion benchmark based on two operational cycles of a commercial nuclear power plant that provides a detailed description of fuel assemblies, burnable absorbers, in-core fission detectors, core loading and re-loading patterns. This benchmark enables analysts to develop extremely detailed reactor core models that can be used for testing and validation of coupled neutron transport, thermal-hydraulics, and fuel isotopic depletion. The benchmark also provides measured reactor data for Hot Zero Power (HZP) physics tests, boron letdown curves, and three-dimensional in-core flux maps from 58 instrumented assemblies. The benchmark description is now available online and has been used by many groups. However, much work remains to be done on the quantification of uncertainties and modeling sensitivities. This work aims to address these deficiencies and make this benchmark a true non-proprietary international benchmark for the validation of high-fidelity tools. This report details the BEAVRS uncertainty quantification for the first two cycles of operation and serves as the final report of the project.

  19. Improved uncertainty quantification in nondestructive assay for nonproliferation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burr, Tom; Croft, Stephen; Jarman, Ken

    2016-12-01

    This paper illustrates methods to improve uncertainty quantification (UQ) for non-destructive assay (NDA) measurements used in nuclear nonproliferation. First, it is shown that current bottom-up UQ applied to calibration data is not always adequate, for three main reasons: (1) because there are errors in both the predictors and the response, calibration involves a ratio of random quantities, and calibration data sets in NDA usually consist of only a modest number of samples (3–10); therefore, asymptotic approximations involving quantities needed for UQ, such as means and variances, are often not sufficiently accurate; (2) common practice overlooks that calibration implies a partitioning of total error into random and systematic error; and (3) in many NDA applications, test items exhibit non-negligible departures in physical properties from calibration items, so model-based adjustments are used, but item-specific bias remains in some data. Therefore, improved bottom-up UQ using calibration data should predict the typical magnitude of item-specific bias, and the suggestion is to do so by including sources of item-specific bias in synthetic calibration data that is generated using a combination of modeling and real calibration data. Second, for measurements of the same nuclear material item by both the facility operator and international inspectors, current empirical (top-down) UQ is described for estimating operator and inspector systematic and random error variance components. A Bayesian alternative is introduced that easily accommodates constraints on variance components, and is more robust than current top-down methods to the underlying measurement error distributions.
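    One standard top-down ingredient, estimating operator and inspector random-error variances from paired measurements of the same items, can be sketched with a Grubbs-type estimator on simulated data (the numbers are synthetic, not real NDA results):

```python
# Sketch: Grubbs-type estimation of per-method random-error variances from
# paired operator/inspector measurements. The paired covariance estimates
# the item-to-item variance; subtracting it isolates each method's noise.
import random

random.seed(7)
truth = [random.uniform(90, 110) for _ in range(2000)]   # item masses
oper = [t + random.gauss(0, 1.0) for t in truth]         # operator, sd = 1.0
insp = [t + random.gauss(0, 2.0) for t in truth]         # inspector, sd = 2.0

n = len(truth)
mo = sum(oper) / n
mi = sum(insp) / n
cov = sum((o - mo) * (i - mi) for o, i in zip(oper, insp)) / (n - 1)
var_o = sum((o - mo) ** 2 for o in oper) / (n - 1) - cov
var_i = sum((i - mi) ** 2 for i in insp) / (n - 1) - cov
# var_o and var_i estimate the injected error variances (1.0 and 4.0)
print(var_o < var_i)
```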

  20. Additive Manufacturing and High-Performance Computing: a Disruptive Latent Technology

    NASA Astrophysics Data System (ADS)

    Goodwin, Bruce

    2015-03-01

    This presentation will discuss the relationship between recent advances in Additive Manufacturing (AM) technology, High-Performance Computing (HPC) simulation and design capabilities, and related advances in Uncertainty Quantification (UQ), and then examine their impacts upon national and international security. The presentation surveys how AM accelerates the fabrication process, while HPC combined with UQ provides a fast track for the engineering design cycle. The combination of AM and HPC/UQ almost eliminates the iterative engineering design and prototype cycle, thereby dramatically reducing cost of production and time-to-market. These methods thereby present significant benefits for US national interests, both civilian and military, in an age of austerity. Finally, considering cyber security issues and the advent of the "cloud," these disruptive, currently latent technologies may well enable proliferation and so challenge both nuclear and non-nuclear aspects of international security.

  1. Advanced Technology and Mitigation (ATDM) SPARC Re-Entry Code Fiscal Year 2017 Progress and Accomplishments for ECP.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crozier, Paul; Howard, Micah; Rider, William J.

    The SPARC (Sandia Parallel Aerodynamics and Reentry Code) will provide nuclear weapon qualification evidence for the random vibration and thermal environments created by re-entry of a warhead into the earth’s atmosphere. SPARC incorporates the innovative approaches of ATDM projects on several fronts, including: effective harnessing of heterogeneous compute nodes using Kokkos, exascale-ready parallel scalability through asynchronous multi-tasking, uncertainty quantification through Sacado integration, implementation of state-of-the-art reentry physics and multiscale models, use of advanced verification and validation methods, and enabling of improved workflows for users. SPARC is being developed primarily for the Department of Energy nuclear weapon program, with additional development and use of the code supported by the Department of Defense for conventional weapons programs.

  2. Rapid quantification and sex determination of forensic evidence materials.

    PubMed

    Andréasson, Hanna; Allen, Marie

    2003-11-01

    DNA quantification of forensic evidence is very valuable for an optimal use of the available biological material. Moreover, sex determination is of great importance as additional information in criminal investigations as well as in identification of missing persons, no suspect cases, and ancient DNA studies. While routine forensic DNA analysis based on short tandem repeat markers includes a marker for sex determination, analysis of samples containing scarce amounts of DNA is often based on mitochondrial DNA, and sex determination is not performed. In order to allow quantification and simultaneous sex determination on minute amounts of DNA, an assay based on real-time PCR analysis of a marker within the human amelogenin gene has been developed. The sex determination is based on melting curve analysis, while an externally standardized kinetic analysis allows quantification of the nuclear DNA copy number in the sample. This real-time DNA quantification assay has proven to be highly sensitive, enabling quantification of single DNA copies. Although certain limitations were apparent, the system is a rapid, cost-effective, and flexible assay for analysis of forensic casework samples.
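    Externally standardized kinetic quantification follows the usual standard-curve arithmetic: fit Ct against log10 copy number for the standards, then invert for unknowns. A sketch with illustrative Ct values, not the assay's data:

```python
# Sketch: real-time PCR quantification from an external standard curve.
# Standards span 1e2..1e5 copies; an ideal assay loses ~3.32 Ct per decade.
import numpy as np

std_copies = np.array([1e2, 1e3, 1e4, 1e5])
std_ct = np.array([33.2, 29.9, 26.6, 23.3])   # illustrative, -3.3 Ct/decade

slope, intercept = np.polyfit(np.log10(std_copies), std_ct, 1)

def copies_from_ct(ct):
    """Invert the standard curve: Ct -> starting copy number."""
    return 10 ** ((ct - intercept) / slope)

efficiency = 10 ** (-1 / slope) - 1   # 1.0 means perfect doubling per cycle
print(round(copies_from_ct(29.9)), round(efficiency, 2))  # 1000 1.01
```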

  3. Generic method for the absolute quantification of glutathione S-conjugates: Application to the conjugates of acetaminophen, clozapine and diclofenac.

    PubMed

    den Braver, Michiel W; Vermeulen, Nico P E; Commandeur, Jan N M

    2017-03-01

    Modification of cellular macromolecules by reactive drug metabolites is considered to play an important role in the initiation of tissue injury by many drugs. Detection and identification of reactive intermediates is often performed by analyzing the conjugates formed after trapping by glutathione (GSH). Although the sensitivity of modern mass spectrometric methods is extremely high, absolute quantification of GSH-conjugates is critically dependent on the availability of authentic references. Although ¹H NMR is currently the method of choice for quantification of metabolites formed biosynthetically, its intrinsically low sensitivity can be a limiting factor in quantification of GSH-conjugates, which generally are formed at low levels. In the present study, a simple but sensitive and generic method for absolute quantification of GSH-conjugates is presented. The method is based on quantitative alkaline hydrolysis of GSH-conjugates and subsequent quantification of glutamic acid and glycine by HPLC after precolumn derivatization with o-phthaldialdehyde/N-acetylcysteine (OPA/NAC). Because of the lower stability of the glycine OPA/NAC-derivative, quantification of the glutamic acid OPA/NAC-derivative appeared most suitable for quantification of GSH-conjugates. The novel method was used to quantify the concentrations of GSH-conjugates of diclofenac, clozapine and acetaminophen, and quantification was consistent with ¹H NMR but with a more than 100-fold lower detection limit for absolute quantification. Copyright © 2017. Published by Elsevier B.V.
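    Since alkaline hydrolysis releases one glutamic acid per conjugate molecule, the back-calculation is direct; a hedged sketch with hypothetical dilution and recovery parameters:

```python
# Sketch: a GSH-conjugate releases one glutamic acid per molecule on
# alkaline hydrolysis, so the conjugate concentration equals the measured
# glutamate concentration corrected for dilution (and, hypothetically,
# hydrolysis recovery). Values are illustrative.

def gsh_conjugate_conc(glu_conc_um, dilution_factor=1.0, recovery=1.0):
    """Conjugate concentration (uM) from HPLC-measured glutamic acid."""
    return glu_conc_um * dilution_factor / recovery

# 12.5 uM glutamate in a 4-fold diluted hydrolysate -> 50 uM conjugate
print(gsh_conjugate_conc(12.5, dilution_factor=4.0))  # 50.0
```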

  4. Quantification of transuranic elements by time interval correlation spectroscopy of the detected neutrons

    PubMed

    Baeten; Bruggeman; Paepen; Carchon

    2000-03-01

    The non-destructive quantification of transuranic elements in nuclear waste management or in safeguards verifications is commonly performed by passive neutron assay techniques. To minimise the number of unknown sample-dependent parameters, Neutron Multiplicity Counting (NMC) is applied. We developed a new NMC-technique, called Time Interval Correlation Spectroscopy (TICS), which is based on the measurement of Rossi-alpha time interval distributions. Compared to other NMC-techniques, TICS offers several advantages.
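    The Rossi-alpha time-interval distribution underlying TICS can be sketched as a histogram of forward time differences between detected pulses; the pulse times below are synthetic stand-ins for counter output:

```python
# Sketch: Rossi-alpha time-interval histogram from a sorted list of neutron
# detection times. For each event, forward differences within `gate_width`
# are binned; correlated (fission-chain) neutrons pile up at short times.

def rossi_alpha_hist(times, gate_width, n_bins):
    bin_w = gate_width / n_bins
    hist = [0] * n_bins
    for i, t0 in enumerate(times):
        for t1 in times[i + 1:]:
            dt = t1 - t0
            if dt >= gate_width:
                break          # times are sorted; later events are further
            hist[int(dt / bin_w)] += 1
    return hist

times = [0.0, 1.0, 1.5, 5.0, 5.2, 9.0]   # microseconds, sorted
print(rossi_alpha_hist(times, gate_width=2.0, n_bins=4))  # [1, 1, 1, 1]
```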

  5. CASL Dakota Capabilities Summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Simmons, Chris; Williams, Brian J.

    2017-10-10

    The Dakota software project serves the mission of Sandia National Laboratories and supports a worldwide user community by delivering state-of-the-art research and robust, usable software for optimization and uncertainty quantification. These capabilities enable advanced exploration and risk-informed prediction with a wide range of computational science and engineering models. Dakota is the verification and validation (V&V) / uncertainty quantification (UQ) software delivery vehicle for CASL, allowing analysts across focus areas to apply these capabilities to myriad nuclear engineering analyses.

  6. Multi-fidelity uncertainty quantification in large-scale predictive simulations of turbulent flow

    NASA Astrophysics Data System (ADS)

    Geraci, Gianluca; Jofre-Cruanyes, Lluis; Iaccarino, Gianluca

    2017-11-01

    The performance characterization of complex engineering systems often relies on accurate, but computationally intensive, numerical simulations. It is also well recognized that in order to obtain a reliable numerical prediction the propagation of uncertainties needs to be included. Therefore, Uncertainty Quantification (UQ) plays a fundamental role in building confidence in predictive science. Despite great improvements in recent years, even the more advanced UQ algorithms are still limited to fairly simplified applications and only moderate parameter dimensionality. Moreover, in the case of extremely large dimensionality, sampling methods, i.e. Monte Carlo (MC) based approaches, appear to be the only viable alternative. In this talk we describe and compare a family of approaches which aim to accelerate the convergence of standard MC simulations. These methods are based on hierarchies of generalized numerical resolutions (multi-level) or model fidelities (multi-fidelity), and attempt to leverage the correlation between Low- and High-Fidelity (HF) models to obtain a more accurate statistical estimator without introducing additional HF realizations. The performance of these methods is assessed on an irradiated particle-laden turbulent flow (PSAAP II solar energy receiver). This investigation was funded by the United States Department of Energy's (DoE) National Nuclear Security Administration (NNSA) under the Predictive Science Academic Alliance Program (PSAAP) II at Stanford University.
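    A two-fidelity control-variate estimator of the kind described can be sketched with toy model functions standing in for the HF and LF solvers (the functions, sample sizes, and seed are illustrative):

```python
# Sketch: two-fidelity control-variate Monte Carlo. A few expensive
# "high-fidelity" samples are corrected using many cheap "low-fidelity"
# samples; correlation between the models reduces estimator variance.
import random

random.seed(0)

def hf(x):          # expensive model (toy stand-in)
    return x ** 2

def lf(x):          # cheap, highly correlated surrogate (toy stand-in)
    return x ** 2 + 0.1 * x

def mean(v):
    return sum(v) / len(v)

xs_hf = [random.gauss(0, 1) for _ in range(100)]          # paired samples
xs_lf_extra = [random.gauss(0, 1) for _ in range(10_000)] # LF-only samples

y_hf = [hf(x) for x in xs_hf]
y_lf_paired = [lf(x) for x in xs_hf]
y_lf_all = y_lf_paired + [lf(x) for x in xs_lf_extra]

# classical control-variate form, with coefficient alpha = 1 for simplicity
estimate = mean(y_hf) + (mean(y_lf_all) - mean(y_lf_paired))
print(abs(estimate - 1.0) < 0.3)  # E[x^2] = 1 for a standard normal input
```

In production multilevel/multifidelity methods the coefficient and the sample allocation across fidelities are chosen to minimize variance for a fixed cost budget, rather than fixed at 1 as here.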

  7. Syn/anti isomerization of 2,4-dinitrophenylhydrazones in the determination of airborne unsymmetrical aldehydes and ketones using 2,4-dinitrophenylhydrazine derivation.

    PubMed

    Binding, N; Müller, W; Witting, U

    1996-10-01

    Aldehydes and ketones readily react with 2,4-dinitrophenylhydrazine (2,4-DNPH) to form the corresponding hydrazones. This reaction has been frequently used for the quantification of airborne carbonyl compounds. Since unsymmetrical aldehydes and ketones are known to form isomeric 2,4-dinitrophenylhydrazones (syn/anti-isomers), the influence of isomerization on the practicability and accuracy of the 2,4-DNPH method using 2,4-dinitrophenylhydrazine-coated solid sorbent samplers has been studied with three ketones (methyl ethyl ketone (MEK), methyl isopropyl ketone (MIPK), and methyl isobutyl ketone (MIBK)). With all three ketones, the reaction with 2,4-DNPH resulted in mixtures of the isomeric hydrazones, which were separated by HPLC and GC and identified by mass spectrometry and (1)H nuclear magnetic resonance spectroscopy. The isomers show similar chromatographic behaviour in HPLC as well as in GC, thus leading to problems in quantification and interpretation of chromatographic results.

  8. Experimental set up for the irradiation of biological samples and nuclear track detectors with UV C

    PubMed Central

    Portu, Agustina Mariana; Rossini, Andrés Eugenio; Gadan, Mario Alberto; Bernaola, Omar Alberto; Thorp, Silvia Inés; Curotto, Paula; Pozzi, Emiliano César Cayetano; Cabrini, Rómulo Luis; Martin, Gisela Saint

    2016-01-01

    Aim: In this work we present a methodology to produce an “imprint” of cells cultivated on a polycarbonate detector by exposure of the detector to UV C radiation. Background: The distribution and concentration of 10B atoms in tissue samples coming from BNCT (Boron Neutron Capture Therapy) protocols can be determined through the quantification and analysis of the tracks forming its autoradiography image on a nuclear track detector. The location of boron atoms in the cell structure could be known more accurately by the simultaneous observation of the nuclear tracks and the sample image on the detector. Materials and Methods: A UV C irradiator was constructed. The irradiance was measured along the lamp direction and at different distances. Melanoma cells were cultured on polycarbonate foils, incubated with borophenylalanine, irradiated with thermal neutrons and exposed to UV C radiation. The samples were then chemically etched with a KOH solution. Results: A uniform irradiation field was established to expose the detector foils to UV C light. Cells could be seeded on the polycarbonate surface. Both imprints from cells and nuclear tracks were obtained after chemical etching. Conclusions: It is possible to yield cellular imprints in polycarbonate. The nuclear tracks were mostly present inside the cells, indicating a preferential boron uptake. PMID:26933396

  9. Imaging and quantification of amyloid fibrillation in the cell nucleus.

    PubMed

    Arnhold, Florian; Scharf, Andrea; von Mikecz, Anna

    2015-01-01

    Xenobiotics, as well as intrinsic processes such as cellular aging, contribute to an environment that constantly challenges nuclear organization and function. While it becomes increasingly clear that proteasome-dependent proteolysis is a major player, the topology and molecular mechanisms of nuclear protein homeostasis remain largely unknown. We have shown previously that (1) proteasome-dependent protein degradation is organized in focal microenvironments throughout the nucleoplasm and (2) heavy metals as well as nanoparticles induce nuclear protein fibrillation with amyloid characteristics. Here, we describe methods to characterize the landscape of intranuclear amyloid on the global and local level in different systems such as cultures of mammalian cells and the soil nematode Caenorhabditis elegans. Application of discrete mathematics to imaging data is introduced as a tool to develop pattern recognition of intracellular protein fibrillation. Since stepwise fibrillation of otherwise soluble proteins into insoluble amyloid-like protein aggregates is a hallmark of neurodegenerative protein-misfolding disorders including Alzheimer's disease, CAG repeat diseases, and the prion encephalopathies, investigation of intracellular amyloid may likewise contribute to a better understanding of the pathomechanisms involved. We consider aggregate profiling an important experimental approach to determine whether nuclear amyloid has toxic or protective roles in various disease processes.

  10. Bioanalytical methods for determination of tamoxifen and its phase I metabolites: a review.

    PubMed

    Teunissen, S F; Rosing, H; Schinkel, A H; Schellens, J H M; Beijnen, J H

    2010-12-17

    The selective estrogen receptor modulator tamoxifen is used in the treatment of early and advanced breast cancer and in selected cases for breast cancer prevention in high-risk subjects. The cytochrome P450 enzyme system and flavin-containing monooxygenase are responsible for the extensive metabolism of tamoxifen into several phase I metabolites that vary in toxicity and potency towards estrogen receptor (ER) alpha and ER beta. An extensive overview of publications on the determination of tamoxifen and its phase I metabolites in biological samples is presented. In these publications techniques were used such as capillary electrophoresis and liquid, gas and thin layer chromatography coupled with various detection techniques (mass spectrometry, ultraviolet or fluorescence detection, liquid scintillation counting and nuclear magnetic resonance spectroscopy). A trend is seen towards the use of liquid chromatography coupled to mass spectrometry (LC-MS). State-of-the-art LC-MS equipment allowed for identification of unknown metabolites and quantification of known metabolites, reaching lower limits of quantification in the sub-pg mL(-1) range. Although tamoxifen is also metabolized into phase II metabolites, publications reporting on phase II metabolism of tamoxifen are scarce. Therefore the focus of this review is on phase I metabolites of tamoxifen. We conclude that in the past decades tamoxifen metabolism has been studied extensively and numerous metabolites have been identified. Assays have been developed for both the identification and quantification of tamoxifen and its metabolites in an array of biological samples. This review can be used as a resource for method transfer and development of analytical methods used to support pharmacokinetic and pharmacodynamic studies of tamoxifen and its phase I metabolites. Copyright © 2010 Elsevier B.V. All rights reserved.

  11. Methods for external event screening quantification: Risk Methods Integration and Evaluation Program (RMIEP) methods development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ravindra, M.K.; Banon, H.

    1992-07-01

    In this report, the scoping quantification procedures for external events in probabilistic risk assessments of nuclear power plants are described. External event analysis in a PRA has three important goals: (1) the analysis should be complete in that all events are considered; (2) by following some selected screening criteria, the more significant events are identified for detailed analysis; (3) the selected events are analyzed in depth by taking into account the unique features of the events: hazard, fragility of structures and equipment, external-event initiated accident sequences, etc. Based on the above goals, external event analysis may be considered as a three-stage process: Stage I: Identification and Initial Screening of External Events; Stage II: Bounding Analysis; Stage III: Detailed Risk Analysis. In the present report, first, a review of published PRAs is given to focus on the significance and treatment of external events in full-scope PRAs. Except for seismic, flooding, fire, and extreme wind events, the contributions of other external events to plant risk have been found to be negligible. Second, scoping methods for external events not covered in detail in the NRC's PRA Procedures Guide are provided. For this purpose, bounding analyses for transportation accidents, extreme winds and tornadoes, aircraft impacts, turbine missiles, and chemical release are described.

  12. Uncertainty quantification for discrimination of nuclear events as violations of the comprehensive nuclear-test-ban treaty.

    PubMed

    Sloan, Jamison; Sun, Yunwei; Carrigan, Charles

    2016-05-01

    Enforcement of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) will involve monitoring for radiologic indicators of underground nuclear explosions (UNEs). A UNE produces a variety of radioisotopes which then decay through connected radionuclide chains. A particular species of interest is xenon, namely the four isotopes (131m)Xe, (133m)Xe, (133)Xe, and (135)Xe. Due to their half-lives, some of these isotopes can exist in the subsurface for more than 100 days. This convenient timescale, combined with modern detection capabilities, makes the xenon family a desirable candidate for UNE detection. Ratios of these isotopes as a function of time have been studied in the past for distinguishing nuclear explosions from civilian nuclear applications. However, the initial yields from UNEs have been treated as fixed values. In reality, these independent yields are uncertain to a large degree. This study quantifies the uncertainty in xenon ratios as a result of these uncertain initial conditions to better bound the values that xenon ratios can assume. We have successfully used a combination of analytical and sampling-based statistical methods to reliably bound xenon isotopic ratios. We have also conducted a sensitivity analysis and found that xenon isotopic ratios are primarily sensitive to only a few of many uncertain initial conditions. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
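
The sampling-based part of such a study can be illustrated with a deliberately simplified sketch: propagate two xenon isotopes by pure exponential decay (ignoring chain ingrowth, which the real analysis must include), draw the uncertain independent initial yields from a distribution, and bound the resulting activity ratio. The half-lives are approximate literature values; the 20% lognormal yield spread is a hypothetical choice, not evaluated nuclear data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Approximate half-lives in days; decay constants lambda = ln(2)/T_half.
T_HALF = {"Xe133": 5.25, "Xe135": 0.38}
LAM = {k: np.log(2) / v for k, v in T_HALF.items()}

def ratio_samples(t_days, n=10_000, rel_sigma=0.2):
    """Sample the 135Xe/133Xe activity ratio at time t, treating the
    independent initial yields as lognormally uncertain (hypothetical
    20% relative spread). Chain ingrowth is neglected for brevity."""
    n0_133 = rng.lognormal(mean=0.0, sigma=rel_sigma, size=n)
    n0_135 = rng.lognormal(mean=0.0, sigma=rel_sigma, size=n)
    a133 = LAM["Xe133"] * n0_133 * np.exp(-LAM["Xe133"] * t_days)
    a135 = LAM["Xe135"] * n0_135 * np.exp(-LAM["Xe135"] * t_days)
    return a135 / a133

# 95% interval on the ratio two days after a hypothetical release.
r = ratio_samples(t_days=2.0)
lo, hi = np.percentile(r, [2.5, 97.5])
```

The width of the (lo, hi) interval is exactly the kind of bound on an isotopic ratio that the study quantifies.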

  13. Nuclear data uncertainty propagation by the XSUSA method in the HELIOS2 lattice code

    NASA Astrophysics Data System (ADS)

    Wemple, Charles; Zwermann, Winfried

    2017-09-01

    Uncertainty quantification has been extensively applied to nuclear criticality analyses for many years and has recently begun to be applied to depletion calculations. However, regulatory bodies worldwide are trending toward requiring such analyses for reactor fuel cycle calculations, which also requires uncertainty propagation for isotopics and nuclear reaction rates. XSUSA is a proven methodology for cross section uncertainty propagation based on random sampling of the nuclear data according to covariance data in multi-group representation; HELIOS2 is a lattice code widely used for commercial and research reactor fuel cycle calculations. This work describes a technique to automatically propagate the nuclear data uncertainties via the XSUSA approach through fuel lattice calculations in HELIOS2. Application of the XSUSA methodology in HELIOS2 presented some unusual challenges because of the highly-processed multi-group cross section data used in commercial lattice codes. Currently, uncertainties based on the SCALE 6.1 covariance data file are being used, but the implementation can be adapted to other covariance data in multi-group structure. Pin-cell and assembly depletion calculations, based on models described in the UAM-LWR Phase I and II benchmarks, are performed and uncertainties in multiplication factor, reaction rates, isotope concentrations, and delayed-neutron data are calculated. With this extension, it will be possible for HELIOS2 users to propagate nuclear data uncertainties directly from the microscopic cross sections to subsequent core simulations.
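
The core of the XSUSA-style approach, random sampling of multi-group nuclear data according to covariance information, can be sketched in a few lines. The 3-group cross sections, the covariance matrix, and the scalar "response" below are all invented stand-ins for real covariance libraries and a real lattice calculation; each sampled cross-section set would in practice drive one HELIOS2 run.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical 3-group cross sections (barns) and a relative covariance
# matrix; evaluated covariance data (e.g. SCALE 6.1) would replace these.
sigma_nominal = np.array([1.20, 0.85, 0.40])
rel_cov = np.array([[0.0004, 0.0002, 0.0001],
                    [0.0002, 0.0009, 0.0003],
                    [0.0001, 0.0003, 0.0016]])

# Absolute covariance, then random sampling of perturbed cross-section sets.
cov = rel_cov * np.outer(sigma_nominal, sigma_nominal)
samples = rng.multivariate_normal(sigma_nominal, cov, size=5000)

# Stand-in response (a weighted sum) in place of a lattice calculation;
# its spread is the propagated nuclear data uncertainty.
weights = np.array([0.5, 0.3, 0.2])
response = samples @ weights
rel_uncertainty = response.std(ddof=1) / response.mean()
```

With real data the sampled sets are written back into the code's library format, which is exactly the step the abstract notes is nontrivial for highly processed multi-group libraries.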

  14. Prognostic significance of anaplasia and angiogenesis in childhood medulloblastoma: a pediatric oncology group study.

    PubMed

    Ozer, Erdener; Sarialioglu, Faik; Cetingoz, Riza; Yüceer, Nurullah; Cakmakci, Handan; Ozkal, Sermin; Olgun, Nur; Uysal, Kamer; Corapcioglu, Funda; Canda, Serefettin

    2004-01-01

    The purpose of this study was to investigate whether quantitative assessment of cytologic anaplasia and angiogenesis may predict the clinical prognosis in medulloblastoma and stratify the patients to avoid both undertreatment and overtreatment. Medulloblastomas from 23 patients belonging to the Pediatric Oncology Group were evaluated with respect to some prognostic variables, including histologic assessment of nodularity and desmoplasia, grading of anaplasia, measurement of nuclear size, mitotic cell count, quantification of angiogenesis, including vascular surface density (VSD) and microvessel number (NVES), and immunohistochemical scoring of vascular endothelial growth factor (VEGF) expression. Univariate and multivariate analyses for prognostic indicators for survival were performed. Univariate analysis revealed that extensive nodularity was a significant favorable prognostic factor, whereas the presence of anaplasia, increased nuclear size, mitotic rate, VSD, and NVES were significant unfavorable prognostic factors. Using multivariate analysis, increased nuclear size was found to be an independent unfavorable prognostic factor for survival. Neither the presence of desmoplasia nor VEGF expression was significantly related to patient survival. Although care must be taken not to overstate the importance of the results of this single-institution preliminary report, pathologic grading of medulloblastomas with respect to grading of anaplasia and quantification of nodularity, nuclear size, and microvessel profiles may be clinically useful for the treatment of medulloblastomas. Further validation of the independent prognostic significance of nuclear size in stratifying patients is required.

  15. Interest of fluorine-19 nuclear magnetic resonance spectroscopy in the detection, identification and quantification of metabolites of anticancer and antifungal fluoropyrimidine drugs in human biofluids.

    PubMed

    Martino, Robert; Gilard, Véronique; Desmoulin, Franck; Malet-Martino, Myriam

    2006-01-01

    The metabolism of fluorouracil and fluorocytosine, two 5-fluoropyrimidine drugs in clinical use, was investigated. (19)F nuclear magnetic resonance (NMR) spectroscopy was used as an analytical technique for the detection, identification and quantification of fluorinated metabolites of these drugs in intact human biofluids as well as fluorinated degradation compounds of fluorouracil in commercial vials. (19)F NMR provides a highly specific tool for the detection and absolute quantification, in a single run, of all the fluorinated species, including unexpected substances, present in biofluids of patients treated with fluorouracil or fluorocytosine. Besides the parent drug and the already known fluorinated metabolites, nine new metabolites were identified for the first time with (19)F NMR in human biofluids. Six of them can only be observed with this technique: fluoride ion, N-carboxy-alpha-fluoro-beta-alanine, alpha-fluoro-beta-alanine conjugate with deoxycholic acid, 2-fluoro-3-hydroxypropanoic acid, fluoroacetic acid, O(2)-beta-glucuronide of fluorocytosine. (19)F NMR studies of biological fluids of patients treated with anticancer fluorouracil or antifungal fluorocytosine have furthered the understanding of their catabolic pathways.

  16. Methylene blue dyeing of cellular nuclei during salpingoscopy, a new in-vivo method to evaluate vitality of tubal epithelium.

    PubMed

    Marconi, G; Quintana, R

    1998-12-01

    The Fallopian tube can be damaged by different noxious substances that may change cellular ultrastructure and function. Alteration of the cell membrane allows the passage of certain aniline dyes, which can stain the nucleus. A total of 310 Fallopian tubes from 163 patients who underwent a surgical or diagnostic laparoscopy during fertility studies were analysed by salpingoscopy. Cellular nuclei were stained by injection of 20 ml of a 10% solution of methylene blue in saline solution (NaCl 10%) through the cervical cannula prior to salpingoscopy. Evaluation of nuclear staining with methylene blue, adhesions, vascular alterations, and the flattening of folds in relation to pregnancy outcome was undertaken. Quantification of salpingoscopic findings was carried out according to a score. Flattening of folds and vascular alterations showed no difference between the pregnant and non-pregnant groups. On the other hand, adhesions and nuclear dyeing were significantly greater in the non-pregnant group (adhesions: 13.6 versus 26.8%, P < 0.004; nuclear dyeing: 25 versus 41.7%, P < 0.009, pregnant versus non-pregnant). Methylene blue dye is a new tool to evaluate in vivo cyto-histological tubal damage, and is a useful and simple method to provide a prognosis of tubal function.

  17. Uncertainty quantification for optical model parameters

    DOE PAGES

    Lovell, A. E.; Nunes, F. M.; Sarich, J.; ...

    2017-02-21

    Although uncertainty quantification has been making its way into nuclear theory, these methods have yet to be explored in the context of reaction theory. For example, it is well known that different parameterizations of the optical potential can result in different cross sections, but these differences have not been systematically studied and quantified. The purpose of our work is to investigate the uncertainties in nuclear reactions that result from fitting a given model to elastic-scattering data, as well as to study how these uncertainties propagate to the inelastic and transfer channels. We use statistical methods to determine a best fit and create corresponding 95% confidence bands. A simple model of the process is fit to elastic-scattering data and used to predict either inelastic or transfer cross sections. In this initial work, we assume that our model is correct, and the only uncertainties come from the variation of the fit parameters. Here, we study a number of reactions involving neutron and deuteron projectiles with energies in the range of 5–25 MeV/u, on targets with mass A=12–208. We investigate the correlations between the parameters in the fit. The case of deuterons on 12C is discussed in detail: the elastic-scattering fit and the prediction of 12C(d,p)13C transfer angular distributions, using both uncorrelated and correlated χ2 minimization functions. The general features for all cases are compiled in a systematic manner to identify trends. This work shows that, in many cases, the correlated χ2 functions (in comparison to the uncorrelated χ2 functions) provide a more natural parameterization of the process. These correlated functions do, however, produce broader confidence bands. Further optimization may require improvement in the models themselves and/or more information included in the fit.
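
A correlated-χ2 fit with confidence bands can be sketched on a toy linear model. The data, covariance matrix, and model below are hypothetical, not optical-model quantities: the point is the generalized least-squares machinery, where minimizing r^T C^{-1} r with a full data covariance C yields both the best-fit parameters and the parameter covariance from which confidence bands follow.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "data" from a linear model y = a + b*x, with correlated errors
# described by a covariance matrix C (hypothetical stand-in for real data).
x = np.linspace(0.0, 1.0, 20)
true = 2.0 + 3.0 * x
C = 0.01 * (0.2 * np.ones((20, 20)) + 0.8 * np.eye(20))  # common-mode + diagonal
y = rng.multivariate_normal(true, C)

# Correlated chi^2 (generalized least squares): minimize r^T C^{-1} r.
J = np.column_stack([np.ones_like(x), x])        # Jacobian of the linear model
Cinv = np.linalg.inv(C)
cov_params = np.linalg.inv(J.T @ Cinv @ J)       # parameter covariance matrix
params = cov_params @ (J.T @ Cinv @ y)           # best-fit (a, b)

# Pointwise ~95% confidence band on the fitted line (1.96 sigma).
band = 1.96 * np.sqrt(np.einsum("ij,jk,ik->i", J, cov_params, J))
```

Dropping the off-diagonal terms of C recovers the uncorrelated χ2 fit; comparing the two bands reproduces, in miniature, the broader-band behaviour the abstract reports for correlated fits.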

  18. Flow Quantification by Nuclear Magnetic Resonance Imaging

    NASA Astrophysics Data System (ADS)

    Vu, Anthony Tienhuan

    1994-01-01

    In this dissertation, a robust method for the measurement and visualization of flow fields in laminar, complex and turbulent flows by Nuclear Magnetic Resonance Imaging utilizing the flow-induced Adiabatic Fast Passage (AFP) principle is presented. This dissertation focuses on the application of AFP in vessels of spatially resolvable size. We first review two main flow effects in NMR: time-of-flight and phase dispersion. A discussion of NMR flow imaging applications, flow measurement and NMR angiography, is given. The theoretical framework of adiabatic passage is discussed in order to explain the principle of flow-induced adiabatic passage tagging for flow imaging applications. From a knowledge of the basic flow-induced adiabatic passage principle, we propose a multi-zone AFP excitation scheme to deal with flow in a curved tube, branches and constrictions, i.e. complex and turbulent flow regimes. The technique provides a quick and simple way to acquire flow profiles simultaneously at several locations and arbitrary orientations inside the field-of-view. The flow profile is the time-averaged evolution of the labeled flowing material. Results obtained using carotid bifurcation and circular jet phantoms are similar to previous experimental studies employing laser Doppler anemometry and other flow visualization techniques. In addition, preliminary results obtained with a human volunteer support the feasibility of the technique for in vivo flow quantification. Finally, a quantitative comparison of flow measurements by the proposed technique with the more established Phase Contrast MRA was performed. The results show excellent correlation between the two methods and with the standard volumetric flow rate measurement, indicating that the flow measurements obtained using this technique are reliable and accurate under various flow regimes.

  19. Spectral X-ray Radiography for Safeguards at Nuclear Fuel Fabrication Facilities: A Feasibility Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, Andrew J.; McDonald, Benjamin S.; Smith, Leon E.

    The methods currently used by the International Atomic Energy Agency to account for nuclear materials at fuel fabrication facilities are time consuming and require in-field chemistry and operation by experts. Spectral X-ray radiography, along with advanced inverse algorithms, is an alternative inspection that could be completed noninvasively, without any in-field chemistry, with inspections of tens of seconds. The proposed inspection system and algorithms are presented here. The inverse algorithm uses total variation regularization and adaptive regularization parameter selection with the unbiased predictive risk estimator. Performance of the system is quantified with simulated X-ray inspection data and sensitivity of the output is tested against various inspection system instabilities. Material quantification from a fully-characterized inspection system is shown to be very accurate, with biases on nuclear material estimations of < 0.02%. It is shown that the results are sensitive to variations in the fuel powder sample density and detector pixel gain, which increase biases to 1%. Options to mitigate these inaccuracies are discussed.
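
The adaptive regularization-parameter step can be illustrated on a toy denoising problem. To keep the sketch in closed form, ridge shrinkage with an identity forward model replaces the paper's total variation functional; only the unbiased-predictive-risk-estimator (UPRE) selection of the regularization strength is shown, and all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy inverse problem: recover a smooth signal from noisy observations.
n, sigma = 400, 0.5
x_true = np.sin(np.linspace(0.0, 2.0 * np.pi, n))
y = x_true + rng.normal(0.0, sigma, n)

def upre(lam):
    """UPRE for the ridge estimate x_hat = y / (1 + lam): residual norm
    plus a trace (degrees-of-freedom) penalty, minus the noise floor."""
    x_hat = y / (1.0 + lam)
    influence_trace = n / (1.0 + lam)   # trace of the influence matrix
    return (np.sum((x_hat - y) ** 2)
            + 2.0 * sigma**2 * influence_trace) / n - sigma**2

# Pick the regularization strength that minimizes the estimated risk.
lams = np.logspace(-3, 2, 200)
lam_best = lams[np.argmin([upre(l) for l in lams])]
x_hat = y / (1.0 + lam_best)
```

The same recipe, with the forward model and TV functional swapped in, is what adaptively tunes the reconstruction against the simulated radiographs.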

  20. Inverse modelling of radionuclide release rates using gamma dose rate observations

    NASA Astrophysics Data System (ADS)

    Hamburger, Thomas; Evangeliou, Nikolaos; Stohl, Andreas; von Haustein, Christoph; Thummerer, Severin; Wallner, Christian

    2015-04-01

    Severe accidents in nuclear power plants, such as the Chernobyl accident in 1986 or the more recent disaster at the Fukushima Dai-ichi nuclear power plant in 2011, have drastic impacts on the population and environment. Observations and dispersion modelling of the released radionuclides help to assess the regional impact of such nuclear accidents. Modelling the increase of regional radionuclide activity concentrations that results from nuclear accidents is subject to a multiplicity of uncertainties. One of the most significant is the estimation of the source term, that is, the time-dependent quantification of the spectrum of radionuclides released during the course of the nuclear accident. The quantification of the source term may either remain uncertain (e.g. Chernobyl, Devell et al., 1995) or rely on estimates given by the operators of the nuclear power plant. Precise measurements are mostly missing due to practical limitations during the accident. The release rates of radionuclides at the accident site can be estimated using inverse modelling (Davoine and Bocquet, 2007). The accuracy of the method depends, amongst others, on the availability, reliability and the resolution in time and space of the observations used. Radionuclide activity concentrations are observed on a relatively sparse grid and the temporal resolution of available data may be low, within the order of hours or a day. Gamma dose rates, on the other hand, are observed routinely on a much denser grid and with higher temporal resolution and therefore provide a wider basis for inverse modelling (Saunier et al., 2013). We present a new inversion approach, which combines an atmospheric dispersion model and observations of radionuclide activity concentrations and gamma dose rates to obtain the source term of radionuclides. We use the Lagrangian particle dispersion model FLEXPART (Stohl et al., 1998; Stohl et al., 2005) to model the atmospheric transport of the released radionuclides.
The inversion method uses a Bayesian formulation considering uncertainties for the a priori source term and the observations (Eckhardt et al., 2008, Stohl et al., 2012). The a priori information on the source term is a first guess. The gamma dose rate observations are used to improve the first guess and to retrieve a reliable source term. The details of this method will be presented at the conference. This work is funded by the Bundesamt für Strahlenschutz BfS, Forschungsvorhaben 3612S60026. References Davoine, X. and Bocquet, M., Atmos. Chem. Phys., 7, 1549-1564, 2007. Devell, L., et al., OCDE/GD(96)12, 1995. Eckhardt, S., et al., Atmos. Chem. Phys., 8, 3881-3897, 2008. Saunier, O., et al., Atmos. Chem. Phys., 13, 11403-11421, 2013. Stohl, A., et al., Atmos. Environ., 32, 4245-4264, 1998. Stohl, A., et al., Atmos. Chem. Phys., 5, 2461-2474, 2005. Stohl, A., et al., Atmos. Chem. Phys., 12, 2313-2343, 2012.
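
The Bayesian formulation with an a priori source term can be sketched as regularized least squares. The source-receptor matrix, observation noise, and prior below are all hypothetical toy values (a real application would take the matrix from FLEXPART output); the closed-form posterior mean balances the misfit to the observations against departure from the first guess.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical source-receptor matrix A: rows are gamma dose rate
# observations, columns are hourly release rates (FLEXPART stand-in).
n_obs, n_src = 40, 10
A = rng.uniform(0.0, 1.0, (n_obs, n_src))

s_true = np.array([0., 0., 5., 20., 12., 3., 1., 0., 0., 0.])  # toy release, Bq/h
y = A @ s_true + rng.normal(0.0, 0.5, n_obs)                   # noisy observations

s_prior = np.full(n_src, 5.0)        # a priori (first-guess) source term
sigma_obs, sigma_prior = 0.5, 10.0   # assumed uncertainties

# Bayesian (Tikhonov-regularized) least squares: posterior mean release rates.
H = A.T @ A / sigma_obs**2 + np.eye(n_src) / sigma_prior**2
s_post = np.linalg.solve(H, A.T @ y / sigma_obs**2 + s_prior / sigma_prior**2)
```

With a dense, high-frequency observation network the data term dominates and the posterior recovers the release history even from a poor first guess, which is exactly the argument for using gamma dose rates.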

  1. Quantitative analysis of sitagliptin using the (19)F-NMR method: a universal technique for fluorinated compound detection.

    PubMed

    Zhang, Fen-Fen; Jiang, Meng-Hong; Sun, Lin-Lin; Zheng, Feng; Dong, Lei; Shah, Vishva; Shen, Wen-Bin; Ding, Ya

    2015-01-07

    To expand the application scope of nuclear magnetic resonance (NMR) technology in the quantitative analysis of pharmaceutical ingredients, (19)F nuclear magnetic resonance ((19)F-NMR) spectroscopy has been employed as a simple, rapid, and reproducible approach for the detection of a fluorine-containing model drug, sitagliptin phosphate monohydrate (STG). Ciprofloxacin (Cipro) has been used as the internal standard (IS). Influential factors, including the relaxation delay time (d1) and pulse angle, impacting the accuracy and precision of spectral data are systematically optimized. Method validation has been carried out in terms of precision and intermediate precision, linearity, limit of detection (LOD) and limit of quantification (LOQ), robustness, and stability. To validate the reliability and feasibility of the (19)F-NMR technology in quantitative analysis of pharmaceutical analytes, the assay result has been compared with that of (1)H-NMR. The statistical F-test and Student's t-test at the 95% confidence level indicate that there is no significant difference between these two methods. Due to the advantages of (19)F-NMR, such as higher resolution and suitability for biological samples, it can be used as a universal technology for the quantitative analysis of other fluorine-containing pharmaceuticals and analytes.
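
The underlying qNMR arithmetic is simple: signal area scales with the number of moles of resonating fluorine nuclei, so the analyte amount follows from the integral ratio against a weighed internal standard. The integrals, weighed mass, and molecular weights below are illustrative values, not numbers from the paper (sitagliptin phosphate monohydrate carries six fluorine atoms, ciprofloxacin one).

```python
def quantify_by_internal_standard(integral_analyte, integral_is,
                                  n_f_analyte, n_f_is,
                                  mass_is_mg, mw_analyte, mw_is):
    """qNMR quantification against an internal standard: peak area is
    proportional to moles of fluorine nuclei, so correct the integral
    ratio for the number of F atoms per molecule. Returns mass in mg."""
    mmol_is = mass_is_mg / mw_is
    mmol_analyte = mmol_is * (integral_analyte / integral_is) * (n_f_is / n_f_analyte)
    return mmol_analyte * mw_analyte

# Hypothetical example: 10.0 mg of ciprofloxacin IS, integral ratio 1.5,
# approximate molecular weights (g/mol) for STG and ciprofloxacin.
mass_stg = quantify_by_internal_standard(
    integral_analyte=1.50, integral_is=1.00,
    n_f_analyte=6, n_f_is=1,
    mass_is_mg=10.0, mw_analyte=523.3, mw_is=331.3)
```

The per-fluorine normalization is the step most easily forgotten; without it, a six-fluorine analyte would be overestimated sixfold.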

  2. Novel Method of Quantifying Radioactive Cesium-Rich Microparticles (CsMPs) in the Environment from the Fukushima Daiichi Nuclear Power Plant.

    PubMed

    Ikehara, Ryohei; Suetake, Mizuki; Komiya, Tatsuki; Furuki, Genki; Ochiai, Asumi; Yamasaki, Shinya; Bower, William R; Law, Gareth T W; Ohnuki, Toshihiko; Grambow, Bernd; Ewing, Rodney C; Utsunomiya, Satoshi

    2018-06-05

    Highly radioactive cesium-rich microparticles (CsMPs) were released from the Fukushima Daiichi nuclear power plant (FDNPP) to the surrounding environment at an early stage of the nuclear disaster in March of 2011; however, the quantity of released CsMPs remains undetermined. Here, we report a novel method to quantify the number of CsMPs in surface soils at or around Fukushima and the fraction of radioactivity they contribute, which we call "quantification of CsMPs" (QCP); it is based on autoradiography. Here, photostimulated luminescence (PSL) is linearly correlated to the radioactivity of various microparticles, with a regression coefficient of 0.0523 becquerel/PSL/h (Bq/PSL/h). In soil collected from Nagadoro, Fukushima, Japan, CsMPs were detected in soil sieved with a 114 μm mesh. There was no overlap between the radioactivities of CsMPs and clay particles adsorbing Cs. Based on the distribution of radioactivity of CsMPs, the threshold radioactivity of CsMPs in the size fraction of <114 μm was determined to be 0.06 Bq. Based on this method, the number and radioactivity fraction of CsMPs in four surface soils collected from the vicinity of the FDNPP were determined to be 48-318 particles per gram and 8.53-31.8%, respectively. The QCP method is applicable to soils with a total radioactivity as high as ∼10(6) Bq/kg. This novel method is critically important and can be used to quantitatively understand the distribution and migration of the highly radioactive CsMPs in near-surface environments surrounding Fukushima.
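
The counting step of QCP follows directly from the two calibration numbers quoted above: convert each autoradiography spot's PSL rate to Bq with the regression coefficient, then keep spots above the 0.06 Bq threshold. The spot readings and total soil activity below are hypothetical; only the two constants come from the abstract.

```python
import numpy as np

# Calibration values quoted in the abstract.
BQ_PER_PSL_H = 0.0523   # Bq per (PSL/h)
THRESHOLD_BQ = 0.06     # CsMP threshold for the <114 um size fraction

def count_csmps(psl_per_hour, total_soil_bq):
    """Convert per-spot PSL rates to activities (Bq), count spots above
    the CsMP threshold, and report the fraction of the soil sample's
    total radioactivity those particles carry."""
    activities = np.asarray(psl_per_hour) * BQ_PER_PSL_H
    is_csmp = activities > THRESHOLD_BQ
    return int(is_csmp.sum()), activities[is_csmp].sum() / total_soil_bq

# Hypothetical spot readings (PSL/h) from one soil aliquot of 10 Bq total.
n_csmp, activity_fraction = count_csmps([0.5, 2.0, 10.0, 40.0, 0.8],
                                        total_soil_bq=10.0)
```

Spots below the threshold are interpreted as Cs sorbed on clay, which is why the method needs the reported non-overlap between the two activity distributions.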

  3. [Comparison of three histometric methods for the comprehension of stimulating effects on the rat thyroid gland].

    PubMed

    Herrmann, F; Hambsch, K; Wolf, T; Rother, P; Müller, P

    1989-01-01

    Several histometric methods exist for the morphological quantification of strong stimulating effects on the thyroid gland induced by drugs and/or other chemical substances, depending on dose and duration of application. However, these recording procedures differ considerably in technical and temporal expense as well as in diagnostic value. We therefore applied the three most commonly used methods in parallel (determination of thyroid epithelial cell height, nuclear volume of thyrocytes, and estimation of relative volume fractions of the thyroid gland by the point-counting method) to the thyroid glands of methylthiouracil (MTU)-stimulated rats and corresponding controls, in order to compare diagnostic value and time required. Nuclear volume determination required the most time, the point-counting method the least. In principle, all three procedures allow the detection of hypertrophic alterations, but only the point-counting method also reveals hyperplastic changes. With nuclear volume determination, we found significant differences between central and peripheral parts of the thyroid gland; to avoid subjective error it is therefore necessary to measure a large number of nuclei in many planes of the gland. Determination of epithelial cell height likewise amplifies the subjective error because of the heterogeneous structure, especially of the unstimulated thyroid gland. If the number of counting points is exactly determined and sensibly limited, the point-counting method allows nearly complete measurement of the whole specimen within an acceptable investigation time. In this way, the heterogeneous structure of the thyroid gland is taken into account, and comparability and reproducibility are guaranteed at a high level.

  4. High-Throughput Live-Cell Microscopy Analysis of Association Between Chromosome Domains and the Nucleolus in S. cerevisiae.

    PubMed

    Wang, Renjie; Normand, Christophe; Gadal, Olivier

    2016-01-01

    Spatial organization of the genome has important impacts on all aspects of chromosome biology, including transcription, replication, and DNA repair. Frequent interactions of some chromosome domains with specific nuclear compartments, such as the nucleolus, are now well documented using genome-scale methods. However, direct measurement of distance and interaction frequency between loci requires microscopic observation of specific genomic domains and the nucleolus, followed by image analysis to allow quantification. The fluorescent repressor operator system (FROS) is an invaluable method to fluorescently tag DNA sequences and investigate chromosome position and dynamics in living cells. This chapter describes a combination of methods to define the motion and region of confinement of a locus relative to the nucleolus in the cell nucleus, from fluorescence acquisition to automated image analysis using two dedicated pipelines.
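
Once the locus and the nucleolus have been segmented in each frame, the quantification step reduces to simple geometry. The sketch below is not the chapter's pipeline; it only illustrates, on simulated coordinates, the two summary quantities such an analysis typically reports: the mean locus-nucleolus distance and the fraction of time spent within a chosen confinement radius (all positions and the radius are hypothetical).

```python
import numpy as np

rng = np.random.default_rng(5)

def confinement_stats(locus_xy, nucleolus_xy, radius_um):
    """Per-frame distance of a FROS-tagged locus to the nucleolus centroid,
    returning the mean distance and the fraction of frames within radius."""
    d = np.linalg.norm(np.asarray(locus_xy) - np.asarray(nucleolus_xy), axis=1)
    return d.mean(), (d <= radius_um).mean()

# Simulated tracked positions (um) over 200 frames: nucleolus centroid at the
# origin, locus fluctuating around a point 0.5 um away.
nucleolus = np.zeros((200, 2))
locus = rng.normal(0.0, 0.3, (200, 2)) + np.array([0.5, 0.0])

mean_d, frac_inside = confinement_stats(locus, nucleolus, radius_um=1.0)
```

Comparing `frac_inside` across loci, or against randomized positions, is one simple way to score preferential association with the nucleolar compartment.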

  5. [Principles of PET].

    PubMed

    Beuthien-Baumann, B

    2018-05-01

    Positron emission tomography (PET) is a nuclear medicine procedure applied predominantly in oncological diagnostics. In the form of modern hybrid machines, such as PET computed tomography (PET/CT) and PET magnetic resonance imaging (PET/MRI), it has found wide acceptance and availability. PET is more than just another imaging technique: it is a functional method with the capability for quantification in addition to showing the distribution pattern of the radiopharmaceutical, and its results are used for therapeutic decisions. A profound knowledge of the principles of PET, including the correct indications, patient preparation, and possible artifacts, is mandatory for the correct interpretation of PET results.

  6. Development and evaluation of a technique for in vivo monitoring of 60Co in human liver

    NASA Astrophysics Data System (ADS)

    Gomes, GH; Silva, MC; Mello, JQ; Dantas, ALA; Dantas, BM

    2018-03-01

    60Co is an artificial radioactive metal produced by neutron activation of iron. It decays by emission of beta particles and gamma radiation and represents a risk of internal exposure for workers involved in the maintenance of nuclear power reactors. Intakes can be quantified through in vivo monitoring. This work describes the development of a technique for the quantification of 60Co in the human liver. The sensitivity of the method is evaluated on the basis of minimum detectable effective doses. The results show that the technique is suitable both for routine monitoring of occupational exposures and for the evaluation of accidental intakes.

  7. Quantification of left to right atrial shunts with velocity-encoded cine nuclear magnetic resonance imaging.

    PubMed

    Brenner, L D; Caputo, G R; Mostbeck, G; Steiman, D; Dulce, M; Cheitlin, M D; O'Sullivan, M; Higgins, C B

    1992-11-01

    The purpose of this study was to evaluate the ability of velocity-encoded nuclear magnetic resonance (NMR) imaging to quantify left to right intracardiac shunts in patients with an atrial septal defect. Quantification of intracardiac shunts is clinically important in planning therapy. Velocity-encoded NMR imaging was used to quantify stroke flow in the aorta and in the main pulmonary artery in a group of patients who were known to have an increased pulmonary to systemic flow ratio (Qp/Qs). The velocity-encoded NMR flow data were used to calculate Qp/Qs, and these values were compared with measurements of Qp/Qs obtained with oximetric data derived from cardiac catheterization and from stroke volume measurements of the two ventricles by using volumetric data from biphasic spin echo and cine NMR images obtained at end-diastole and end-systole. Two independent observers measured Qp/Qs by using velocity-encoded NMR imaging in 11 patients and found Qp/Qs ranging from 1.4:1 to 3.9:1. These measurements correlated well with both oximetric data (r = 0.91, SEE = 0.35) and ventricular volumetric data (r = 0.94, SEE = 0.30). Interobserver reproducibility for Qp/Qs by velocity-encoded NMR imaging was good (r = 0.97, SEE = 0.20). Velocity-encoded NMR imaging is an accurate and reproducible method for measuring Qp/Qs in left to right shunts. Because it is completely noninvasive, it can be used to monitor shunt volume over time.
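    The shunt quantification above reduces to a simple ratio of the two stroke flows. A minimal sketch of that arithmetic; the `qp_qs` helper and the flow values are illustrative, not data from the study:

    ```python
    # Qp/Qs from stroke flows measured (e.g.) by velocity-encoded NMR.
    # Flow values below are hypothetical, not taken from the study.

    def qp_qs(pulmonary_stroke_flow_ml: float, aortic_stroke_flow_ml: float) -> float:
        """Pulmonary-to-systemic flow ratio for a left-to-right shunt."""
        if aortic_stroke_flow_ml <= 0:
            raise ValueError("aortic stroke flow must be positive")
        return pulmonary_stroke_flow_ml / aortic_stroke_flow_ml

    ratio = qp_qs(140.0, 70.0)  # hypothetical mL per beat in each vessel
    print(f"Qp/Qs = {ratio:.1f}:1")  # → Qp/Qs = 2.0:1
    ```

    The same ratio is what the oximetric and ventricular-volumetric reference methods estimate by other means, which is why the study reports correlations between the three.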

  8. A method for limiting data acquisition in a high-resolution gamma-ray spectrometer during On-Site Inspection activities under the Comprehensive Nuclear-Test-Ban Treaty

    NASA Astrophysics Data System (ADS)

    Aviv, O.; Lipshtat, A.

    2018-05-01

    On-Site Inspection (OSI) activities under the Comprehensive Nuclear-Test-Ban Treaty (CTBT) allow limitations to be imposed on measurement equipment. Certain detectors therefore require modifications to operate in a restricted mode, and the accuracy and reliability of results obtained by a restricted device may be impaired. We present here a method for limiting data acquisition during OSI. Limitations are applied to a high-resolution high-purity germanium detector system, in which the vast majority of the acquired data that is not relevant to the inspection is filtered out. The limited spectrum is displayed to the user and allows analysis using standard gamma spectrometry procedures. The proposed method can be incorporated into commercial gamma-ray spectrometers, both stationary and mobile. By applying this procedure to more than 1000 spectra representing various scenarios, we show that partial data are sufficient for reaching reliable conclusions. A comprehensive survey of potential false-positive identifications of various radionuclides is presented as well. The results show that analysis of a limited spectrum is practically identical to that of a standard spectrum in terms of detection and quantification of OSI-relevant radionuclides. A future limited system can be developed using the principles outlined by the suggested method.

  9. Systematic Comparison of Label-Free, Metabolic Labeling, and Isobaric Chemical Labeling for Quantitative Proteomics on LTQ Orbitrap Velos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Zhou; Adams, Rachel M; Chourey, Karuna

    2012-01-01

    A variety of quantitative proteomics methods have been developed, including label-free, metabolic labeling, and isobaric chemical labeling using iTRAQ or TMT. Here, these methods were compared in terms of the depth of proteome coverage, quantification accuracy, precision, and reproducibility using a high-performance hybrid mass spectrometer, LTQ Orbitrap Velos. Our results show that (1) the spectral counting method provides the deepest proteome coverage for identification, but its quantification performance is worse than labeling-based approaches, especially the quantification reproducibility; (2) metabolic labeling and isobaric chemical labeling are capable of accurate, precise, and reproducible quantification and provide deep proteome coverage for quantification. Isobaric chemical labeling surpasses metabolic labeling in terms of quantification precision and reproducibility; (3) iTRAQ and TMT perform similarly in all aspects compared in the current study using a CID-HCD dual scan configuration. Based on the unique advantages of each method, we provide guidance for selection of the appropriate method for a quantitative proteomics study.

  10. Species Identification of Fox-, Mink-, Dog-, and Rabbit-Derived Ingredients by Multiplex PCR and Real-Time PCR Assay.

    PubMed

    Wu, Qingqing; Xiang, Shengnan; Wang, Wenjun; Zhao, Jinyan; Xia, Jinhua; Zhen, Yueran; Liu, Bang

    2018-05-01

    Various detection methods have been developed for the identification of animal species. New techniques based on the PCR approach have raised the hope of developing better identification methods that can overcome the limitations of existing ones. PCR-based methods use mitochondrial DNA (mtDNA) as well as nuclear DNA sequences. In this study, targeting nuclear DNA, multiplex PCR and real-time PCR methods were developed for qualitative and quantitative analysis. The multiplex PCR was found to simultaneously and effectively distinguish ingredients from four species (fox, dog, mink, and rabbit) by electrophoretic bands of different sizes: 480, 317, 220, and 209 bp. The amplification profiles and standard curves of the real-time fluorescent PCR showed good quantitative measurement response and linearity, as indicated by good repeatability and a coefficient of determination R² > 0.99. The quantitative results for quaternary DNA mixtures of mink, fox, dog, and rabbit DNA were in line with expectations: the relative deviation (R.D.) varied between 1.98 and 12.23% and the relative standard deviation (R.S.D.) varied between 3.06 and 11.51%, both well within the acceptance criterion of ≤ 25%. The combination of the two methods is suitable for the rapid identification and accurate quantification of fox-, dog-, mink-, and rabbit-derived ingredients in animal products.
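    The acceptance statistics quoted above (R.D. for bias against the expected value, R.S.D. for precision across replicates, both ≤ 25%) can be computed directly from replicate measurements. A minimal sketch with made-up replicate values:

    ```python
    import statistics

    # Relative deviation (R.D.) and relative standard deviation (R.S.D.),
    # the acceptance criteria used in quantitative PCR validation.
    # The replicate values below are illustrative only.

    def relative_deviation_pct(measured_mean: float, expected: float) -> float:
        """Bias of the measured mean relative to the expected value, in %."""
        return abs(measured_mean - expected) / expected * 100

    def rsd_pct(replicates: list) -> float:
        """Sample standard deviation as a percentage of the mean."""
        return statistics.stdev(replicates) / statistics.mean(replicates) * 100

    replicates = [9.8, 10.4, 10.1]  # hypothetical % content of one species
    print(round(relative_deviation_pct(statistics.mean(replicates), 10.0), 2))  # → 1.0
    print(round(rsd_pct(replicates), 2))  # → 2.97
    ```

    Both values fall well under the 25% criterion, which is the check the study applies to each species in the quaternary mixtures.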

  11. Age of heart disease presentation and dysmorphic nuclei in patients with LMNA mutations

    PubMed Central

    Core, Jason Q.; Mehrabi, Mehrsa; Robinson, Zachery R.; Ochs, Alexander R.; McCarthy, Linda A.; Zaragoza, Michael V.

    2017-01-01

    Nuclear shape defects are a distinguishing characteristic in laminopathies, cancers, and other pathologies. Correlating these defects with the symptoms, mechanisms, and progression of disease requires unbiased, quantitative, and high-throughput means of quantifying nuclear morphology. To accomplish this, we developed a method for automatically segmenting fluorescently stained nuclei in 2D microscopy images and then classifying them as normal or dysmorphic based on three geometric features of the nucleus, using a package of Matlab codes. As a test case, cultured skin-fibroblast nuclei from individuals possessing an LMNA splice-site mutation (c.357-2A>G), an LMNA nonsense mutation (c.736 C>T, pQ246X) in exon 4, an LMNA missense mutation (c.1003C>T, pR335W) in exon 6, Hutchinson-Gilford Progeria Syndrome, or no LMNA mutation were analyzed. For each cell type, the percentage of dysmorphic nuclei and other morphological features, such as average nuclear area and average eccentricity, were obtained. Compared with blind observers, our Matlab procedure possessed similar accuracy to manual counting of dysmorphic nuclei while being significantly more consistent. The automatic quantification of nuclear defects revealed a correlation between in vitro results and patients' age at initial symptom onset. Our results demonstrate the method's utility in experimental studies of diseases affecting nuclear shape through automated, unbiased, and accurate identification of dysmorphic nuclei. PMID:29149195
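    As a rough illustration of geometric-feature classification of nuclei, the toy function below flags a nucleus as dysmorphic when any feature leaves a "normal" range. The feature set and all thresholds here are hypothetical, not those of the published Matlab pipeline:

    ```python
    import math

    # Toy dysmorphic-nucleus classifier: area, best-fit-ellipse eccentricity,
    # and solidity, each checked against a hypothetical "normal" range.

    def eccentricity(major_axis: float, minor_axis: float) -> float:
        """Eccentricity of the best-fit ellipse (0 = circle, near 1 = elongated)."""
        return math.sqrt(1 - (minor_axis / major_axis) ** 2)

    def is_dysmorphic(area_um2: float, major: float, minor: float,
                      solidity: float) -> bool:
        ecc = eccentricity(major, minor)
        # All cut-offs below are invented for illustration.
        return area_um2 < 50 or area_um2 > 400 or ecc > 0.9 or solidity < 0.92

    print(is_dysmorphic(180.0, 16.0, 12.0, 0.97))  # roughly elliptical → False
    print(is_dysmorphic(180.0, 30.0, 9.0, 0.85))   # elongated and concave → True
    ```

    A real pipeline would extract such features from segmented masks and calibrate the thresholds against observer-labeled nuclei, as the study does against blind manual counts.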

  12. A validated Fourier transform infrared spectroscopy method for quantification of total lactones in Inula racemosa and Andrographis paniculata.

    PubMed

    Shivali, Garg; Praful, Lahorkar; Vijay, Gadgil

    2012-01-01

    Fourier transform infrared (FT-IR) spectroscopy is a technique widely used for detection and quantification of various chemical moieties. This paper describes the use of FT-IR spectroscopy for the quantification of total lactones present in Inula racemosa and Andrographis paniculata, with the objective of validating the method for both species. Dried and powdered I. racemosa roots and A. paniculata plant were extracted with ethanol and dried to remove the ethanol completely. The ethanol extract was analysed in a KBr pellet by FT-IR spectroscopy. The FT-IR spectroscopy method was validated and compared with a known spectrophotometric method for quantification of lactones in A. paniculata. By FT-IR spectroscopy, the amount of total lactones was found to be 2.12 ± 0.47% (n = 3) in I. racemosa and 8.65 ± 0.51% (n = 3) in A. paniculata. The method gave results comparable with those of the known spectrophotometric method for such lactones: 8.42 ± 0.36% (n = 3) in A. paniculata. Limits of detection and quantification were 1 µg and 10 µg respectively for isoalantolactone, and 1.5 µg and 15 µg respectively for andrographolide. Recoveries were over 98%, with good intra- and interday repeatability (RSD ≤ 2%). The FT-IR spectroscopy method proved linear, accurate, precise and specific, with low limits of detection and quantification, for estimation of total lactones, and is less tedious than the UV spectrophotometric method for the compounds tested. This validated FT-IR spectroscopy method is readily applicable to the quality control of I. racemosa and A. paniculata. Copyright © 2011 John Wiley & Sons, Ltd.

  13. Alkylpyridiniums. 2. Isolation and quantification in roasted and ground coffees.

    PubMed

    Stadler, Richard H; Varga, Natalia; Milo, Christian; Schilter, Benoit; Vera, Francia Arce; Welti, Dieter H

    2002-02-27

    Recent model studies on trigonelline decomposition have identified nonvolatile alkylpyridiniums as major reaction products under certain physicochemical conditions. The quaternary base 1-methylpyridinium was isolated from roasted and ground coffee and purified by ion exchange and thin-layer chromatography. The compound was characterized by nuclear magnetic resonance spectroscopy ((1)H, (13)C) and mass spectrometry techniques. A liquid chromatography-electrospray ionization tandem mass spectrometry method was developed to quantify the alkaloid in coffee by isotope dilution mass spectrometry. The formation of alkylpyridiniums is positively correlated to the roasting degree in arabica coffee, and highest levels of 1-methylpyridinium, reaching up to 0.25% on a per weight basis, were found in dark roasted coffee beans. Analyses of coffee extracts also showed the presence of dimethylpyridinium, at concentrations ranging from 5 to 25 mg/kg. This is the first report on the isolation and quantification of alkylpyridiniums in coffee. These compounds, described here in detail for the first time, may have an impact on the flavor/aroma profile of coffee directly (e.g., bitterness), or indirectly as precursors, and potentially open new avenues in the flavor/aroma modulation of coffee.

  14. Availability of Neutronics Benchmarks in the ICSBEP and IRPhEP Handbooks for Computational Tools Testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bess, John D.; Briggs, J. Blair; Ivanova, Tatiana

    2017-02-01

    In the past several decades, numerous experiments have been performed worldwide to support reactor operations, measurements, design, and nuclear safety. Those experiments represent an extensive international investment in infrastructure, expertise, and cost, and constitute significantly valuable resources of data supporting past, current, and future research activities. These assets form the basis for recording, developing, and validating our nuclear methods and integral nuclear data [1]. The loss of such experimental data, which has occurred all too often in recent years, is tragic: the high cost of repeating many of these measurements can be prohibitive, if not impossible, to surmount. Two international projects were developed, under the direction of the Organisation for Economic Co-operation and Development Nuclear Energy Agency (OECD NEA), to address the challenges not just of data preservation, but of evaluating the data to determine its merit for modern and future use. The International Criticality Safety Benchmark Evaluation Project (ICSBEP) was established to identify and verify comprehensive critical benchmark data sets; evaluate the data, including quantification of biases and uncertainties; compile the data and calculations in a standardized format; and formally document the effort into a single source of verified benchmark data [2]. Similarly, the International Reactor Physics Experiment Evaluation Project (IRPhEP) was established to preserve integral reactor physics experimental data, including separate- or special-effects data for nuclear energy and technology applications [3]. Annually, contributors from around the world continue to collaborate in the evaluation and review of select benchmark experiments for preservation and dissemination. The extensively peer-reviewed integral benchmark data can then be utilized by nuclear design and safety analysts to validate the analytical tools, methods, and data needed for next-generation reactor design, safety analysis requirements, and all other front- and back-end activities contributing to the overall nuclear fuel cycle where quality neutronics calculations are paramount.

  15. Validating neural-network refinements of nuclear mass models

    NASA Astrophysics Data System (ADS)

    Utama, R.; Piekarewicz, J.

    2018-01-01

    Background: Nuclear astrophysics centers on the role of nuclear physics in the cosmos. In particular, nuclear masses at the limits of stability are critical in the development of stellar structure and the origin of the elements. Purpose: We aim to test and validate the predictions of recently refined nuclear mass models against the newly published AME2016 compilation. Methods: The basic paradigm underlying the recently refined nuclear mass models is based on existing state-of-the-art models that are subsequently refined through the training of an artificial neural network. Bayesian inference is used to determine the parameters of the neural network so that statistical uncertainties are provided for all model predictions. Results: We observe a significant improvement in the Bayesian neural network (BNN) predictions relative to the corresponding "bare" models when compared to the nearly 50 new masses reported in the AME2016 compilation. Further, AME2016 estimates for the handful of impactful isotopes in the determination of r-process abundances are found to be in fairly good agreement with our theoretical predictions. Indeed, the BNN-improved Duflo-Zuker model predicts a root-mean-square deviation relative to experiment of σrms ≃ 400 keV. Conclusions: Given the excellent performance of the BNN refinement in confronting the recently published AME2016 compilation, we are confident of its critical role in our quest for mass models of the highest quality. Moreover, as uncertainty quantification is at the core of the BNN approach, the improved mass models are in a unique position to identify those nuclei that will have the strongest impact in resolving some of the outstanding questions in nuclear astrophysics.
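    The quoted figure of merit, σrms, is the root-mean-square deviation between model-predicted and experimental masses. A minimal sketch of that computation; the mass values are invented, not AME2016 data:

    ```python
    import math

    # Root-mean-square deviation between predicted and experimental nuclear
    # masses, the figure of merit quoted for mass models (e.g. ~400 keV for
    # the BNN-improved Duflo-Zuker model).  Mass values below are made up.

    def sigma_rms_kev(predicted: list, experimental: list) -> float:
        residuals = [p - e for p, e in zip(predicted, experimental)]
        return math.sqrt(sum(r * r for r in residuals) / len(residuals))

    pred = [61100.0, 52900.0, 47480.0]   # hypothetical mass excesses, keV
    expt = [61400.0, 52400.0, 47880.0]
    print(f"sigma_rms = {sigma_rms_kev(pred, expt):.0f} keV")  # → sigma_rms = 408 keV
    ```

    In the BNN approach each prediction also carries a statistical uncertainty, so the comparison against new compilations can weight residuals accordingly; the sketch above shows only the unweighted deviation.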

  16. Recent application of quantification II in Japanese medical research.

    PubMed Central

    Suzuki, T; Kudo, A

    1979-01-01

    Hayashi's Quantification II is a method of multivariate discriminant analysis that handles attribute data as predictor variables. It is very useful in medical research for estimation, diagnosis, prognosis, evaluation of epidemiological factors, and other problems based on multiple attribute data. In Japan, the method is so well known that most computer program packages include the Hayashi Quantification, but it still seems unfamiliar to researchers outside Japan. In view of this situation, we introduce 19 selected articles illustrating recent applications of Quantification II in Japanese medical research. In reviewing these papers, special attention is paid to how well the findings provided by the method served the researchers. At the same time, some recommendations are made regarding terminology and program packages. A brief discussion of the background of the quantification methods is also given, with special reference to the Behaviormetric Society of Japan. PMID:540587

  17. Quantification of the Spatial Organization of the Nuclear Lamina as a Tool for Cell Classification

    PubMed Central

    Righolt, Christiaan H.; Zatreanu, Diana A.; Raz, Vered

    2013-01-01

    The nuclear lamina is the structural scaffold of the nuclear envelope that plays multiple regulatory roles in chromatin organization and gene expression, as well as a structural role in nuclear stability. The lamina proteins, also referred to as lamins, determine nuclear lamina organization and define the nuclear shape and the structural integrity of the cell nucleus. In addition, lamins are connected with both nuclear and cytoplasmic structures, forming a dynamic cellular structure whose shape changes in response to external and internal signals. When bound to the nuclear lamina, the lamins are mobile, have an impact on the nuclear envelope structure, and may induce changes in their regulatory functions. Changes in nuclear lamina shape cause changes in cellular functions. A quantitative description of these structural changes could provide an unbiased description of changes in cellular function. In this review, we describe how changes in the nuclear lamina can be measured from three-dimensional images of lamins at the nuclear envelope, and we discuss how structural changes of the nuclear lamina can be used for cell classification. PMID:27335676

  18. Quantification of the Spatial Organization of the Nuclear Lamina as a Tool for Cell Classification.

    PubMed

    Righolt, Christiaan H; Zatreanu, Diana A; Raz, Vered

    2013-01-01

    The nuclear lamina is the structural scaffold of the nuclear envelope that plays multiple regulatory roles in chromatin organization and gene expression, as well as a structural role in nuclear stability. The lamina proteins, also referred to as lamins, determine nuclear lamina organization and define the nuclear shape and the structural integrity of the cell nucleus. In addition, lamins are connected with both nuclear and cytoplasmic structures, forming a dynamic cellular structure whose shape changes in response to external and internal signals. When bound to the nuclear lamina, the lamins are mobile, have an impact on the nuclear envelope structure, and may induce changes in their regulatory functions. Changes in nuclear lamina shape cause changes in cellular functions. A quantitative description of these structural changes could provide an unbiased description of changes in cellular function. In this review, we describe how changes in the nuclear lamina can be measured from three-dimensional images of lamins at the nuclear envelope, and we discuss how structural changes of the nuclear lamina can be used for cell classification.

  19. Quantification of Training and Competition Loads in Endurance Sports: Methods and Applications.

    PubMed

    Mujika, Iñigo

    2017-04-01

    Quantification of training is fundamental to evaluating an endurance athlete's responses to training loads, ensuring an adequate stress/recovery balance, and determining the relationship between training and performance. Quantifying both external and internal workload is important, because external workload does not measure the biological stress imposed by the exercise sessions. Commonly used quantification methods include retrospective questionnaires, diaries, direct observation, and physiological monitoring, often based on the measurement of oxygen uptake, heart rate, and blood lactate concentration. Other methods in use in endurance sports include the measurement of speed and of power output, the latter made possible by recent technological advances such as power meters in cycling and triathlon. Among subjective methods of quantification, the rating of perceived exertion stands out because of its wide use. Concurrent assessment of the various quantification methods allows researchers and practitioners to evaluate stress/recovery balance, adjust individual training programs, and determine the relationships between external load, internal load, and athletes' performance. This brief review summarizes the most relevant external- and internal-workload-quantification methods in endurance sports and provides practical examples of their implementation to adjust the training programs of elite athletes in accordance with their individualized stress/recovery balance.
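    Among the subjective methods mentioned, the session-RPE approach (Foster's method) is a common way to turn the rating of perceived exertion into an internal-load number: session duration in minutes multiplied by the CR-10 RPE, giving load in arbitrary units (AU). A sketch with illustrative sessions:

    ```python
    # Session-RPE internal training load: duration (min) x CR-10 RPE, in AU.
    # The weekly session list below is illustrative, not from the review.

    def session_rpe_load(duration_min: float, rpe_cr10: float) -> float:
        if not 0 <= rpe_cr10 <= 10:
            raise ValueError("CR-10 RPE must be between 0 and 10")
        return duration_min * rpe_cr10

    week = [(90, 4), (60, 7), (120, 3), (45, 8)]  # (minutes, RPE) per session
    weekly_load = sum(session_rpe_load(d, r) for d, r in week)
    print(f"weekly load = {weekly_load:.0f} AU")  # → weekly load = 1500 AU
    ```

    Summed over a training week, such internal-load values can be compared against external-load measures (distance, power) when evaluating the stress/recovery balance the review discusses.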

  20. Automated measurement of estrogen receptor in breast cancer: a comparison of fluorescent and chromogenic methods of measurement

    PubMed Central

    Zarrella, Elizabeth; Coulter, Madeline; Welsh, Allison; Carvajal, Daniel; Schalper, Kurt; Harigopal, Malini; Rimm, David; Neumeister, Veronique

    2016-01-01

    While FDA-approved methods of assessing the estrogen receptor (ER) are "fit for purpose", they represent a 30-year-old technology. New quantitative methods, both chromogenic and fluorescent, have been developed, and studies have shown that these methods increase the accuracy of ER assessment. Here, we compare three methods of ER detection and assessment on two retrospective tissue microarray cohorts of breast cancer patients: estimates of percent positive nuclei by pathologists and by Aperio's nuclear algorithm (standard chromogenic immunostaining), and immunofluorescence as quantified with the AQUA® method of quantitative immunofluorescence (QIF). Reproducibility was excellent (R² > 0.95) between users for both automated analysis methods, and the Aperio and QIF scoring results were also highly correlated, despite the different detection systems. The subjective readings showed lower reproducibility and a discontinuous, bimodal distribution of scores not seen with either mechanized method. Kaplan-Meier analysis of 10-year disease-free survival was significant for each method (pathologist, P = 0.0019; Aperio, P = 0.0053; AQUA, P = 0.0026), but there were discrepancies in patient classification in 19 of the 233 cases analyzed. Of these, 11 were visually positive by both chromogenic and fluorescent detection: in 10 cases the Aperio nuclear algorithm labeled the nuclei as negative, and in 1 case the AQUA score was just under the cutoff for positivity (determined by an Index TMA). In contrast, 8 of the 19 discrepant cases had clear nuclear positivity by fluorescence that could not be visualized by chromogenic detection, perhaps because low positivity was masked by the hematoxylin counterstain. These results demonstrate that automated systems enable objective, precise quantification of ER. Furthermore, immunofluorescence detection offers the additional advantage of a signal that cannot be masked by a counterstaining agent. These data support the use of automated methods for measurement of this and other biomarkers that may be used in companion diagnostic tests. PMID:27348626

  1. Quantitative proteome analysis using isobaric peptide termini labeling (IPTL).

    PubMed

    Arntzen, Magnus O; Koehler, Christian J; Treumann, Achim; Thiede, Bernd

    2011-01-01

    The quantitative comparison of proteome level changes across biological samples has become an essential feature in proteomics that remains challenging. We have recently introduced isobaric peptide termini labeling (IPTL), a novel strategy for isobaric quantification based on the derivatization of peptide termini with complementary isotopically labeled reagents. Unlike non-isobaric quantification methods, sample complexity at the MS level is not increased, providing improved sensitivity and protein coverage. The distinguishing feature of IPTL when comparing it to more established isobaric labeling methods (iTRAQ and TMT) is the presence of quantification signatures in all sequence-determining ions in MS/MS spectra, not only in the low mass reporter ion region. This makes IPTL a quantification method that is accessible to mass spectrometers with limited capabilities in the low mass range. Also, the presence of several quantification points in each MS/MS spectrum increases the robustness of the quantification procedure.

  2. Normal values and standardization of parameters in nuclear cardiology: Japanese Society of Nuclear Medicine working group database.

    PubMed

    Nakajima, Kenichi; Matsumoto, Naoya; Kasai, Tokuo; Matsuo, Shinro; Kiso, Keisuke; Okuda, Koichi

    2016-04-01

    As a 2-year project of the Japanese Society of Nuclear Medicine working group, normal myocardial imaging databases were accumulated and summarized. Stress-rest gated and non-gated image sets were accumulated for myocardial perfusion imaging and can be used for perfusion defect scoring and normal left ventricular (LV) function analysis. For single-photon emission computed tomography (SPECT) with a multi-focal collimator design, databases for supine and prone positions and for computed tomography (CT)-based attenuation correction were created. The CT-based correction provided similar perfusion patterns between genders. In phase analysis of gated myocardial perfusion SPECT, a new approach for analyzing dyssynchrony, normal ranges of the phase bandwidth, standard deviation, and entropy parameters were determined for four software programs. Although the results were not interchangeable, dependence on gender, ejection fraction, and volumes was a common characteristic of these parameters. Standardization of (123)I-MIBG sympathetic imaging was performed for the heart-to-mediastinum ratio (HMR) using a calibration phantom method, so that HMRs from any collimator type can be converted to values comparable with those of medium-energy collimators. Appropriate quantification based on common normal databases and standardized technology can play a pivotal role in clinical practice and research.
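    The heart-to-mediastinum ratio standardized above is itself a simple quotient: mean counts per pixel in a cardiac region of interest divided by mean counts per pixel in an upper-mediastinal region of interest. A minimal sketch with hypothetical count densities (the calibration step that harmonizes collimators is not shown):

    ```python
    # Heart-to-mediastinum ratio (HMR) for 123I-MIBG planar imaging.
    # Count densities below are hypothetical, not from the database.

    def hmr(heart_counts_per_pixel: float, mediastinum_counts_per_pixel: float) -> float:
        """Mean cardiac ROI counts/pixel over mean mediastinal ROI counts/pixel."""
        if mediastinum_counts_per_pixel <= 0:
            raise ValueError("mediastinal count density must be positive")
        return heart_counts_per_pixel / mediastinum_counts_per_pixel

    print(round(hmr(52.0, 20.0), 2))  # → 2.6
    ```

    The working group's phantom calibration then maps raw HMRs measured with different collimator types onto a common medium-energy-comparable scale, which is what makes values from different cameras interchangeable.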

  3. Effects of Image Compression on Automatic Count of Immunohistochemically Stained Nuclei in Digital Images

    PubMed Central

    López, Carlos; Lejeune, Marylène; Escrivà, Patricia; Bosch, Ramón; Salvadó, Maria Teresa; Pons, Lluis E.; Baucells, Jordi; Cugat, Xavier; Álvaro, Tomás; Jaén, Joaquín

    2008-01-01

    This study investigates the effects of digital image compression on the automatic quantification of immunohistochemical nuclear markers. We examined 188 images with a previously validated computer-assisted analysis system. The first group comprised 47 images captured in TIFF format, and the other three groups contained the same images converted from TIFF to JPEG format with 3×, 23× and 46× compression. Counts from the TIFF-format images were compared with those from the other three groups. Overall, the differences in counts increased with the degree of compression. Low-complexity images (≤100 cells/field, without clusters or with small-area clusters) showed small differences (<5 cells/field in 95–100% of cases) and high-complexity images showed substantial differences (<35–50 cells/field in 95–100% of cases). Compression does not compromise the accuracy of immunohistochemical nuclear marker counts obtained by computer-assisted analysis systems for digital images of low complexity, and could be an efficient method for storing these images. PMID:18755997

  4. A novel algorithm for solving the true coincident counting issues in Monte Carlo simulations for radiation spectroscopy.

    PubMed

    Guan, Fada; Johns, Jesse M; Vasudevan, Latha; Zhang, Guoqing; Tang, Xiaobin; Poston, John W; Braby, Leslie A

    2015-06-01

    Coincident counts can be observed in experimental radiation spectroscopy. Accurate quantification of the radiation source requires the detection efficiency of the spectrometer, which is often experimentally determined. However, Monte Carlo analysis can be used to supplement experimental approaches to determine the detection efficiency a priori. The traditional Monte Carlo method overestimates the detection efficiency as a result of omitting coincident counts caused mainly by multiple cascade source particles. In this study, a novel "multi-primary coincident counting" algorithm was developed using the Geant4 Monte Carlo simulation toolkit. A high-purity Germanium detector for ⁶⁰Co gamma-ray spectroscopy problems was accurately modeled to validate the developed algorithm. The simulated pulse height spectrum agreed well qualitatively with the measured spectrum obtained using the high-purity Germanium detector. The developed algorithm can be extended to other applications, with a particular emphasis on challenging radiation fields, such as counting multiple types of coincident radiations released from nuclear fission or used nuclear fuel.

  5. Synthesis and characterisation of PuPO4 - a potential analytical standard for EPMA actinide quantification

    NASA Astrophysics Data System (ADS)

    Wright, K. E.; Popa, K.; Pöml, P.

    2018-01-01

    Transmutation nuclear fuels contain weight-percent quantities of actinide elements, including Pu, Am and Np. Because of the complex spectra presented by actinide elements in electron probe microanalysis (EPMA), relatively pure actinide element standards are necessary to facilitate overlap correction and accurate quantitation. Synthesis of actinide oxide standards is complicated by their multiple oxidation states, which can result in standards that are inhomogeneous or unstable under atmospheric conditions. Synthesis of PuPO4 yields a specimen that exhibits stable oxidation-reduction chemistry and is sufficiently homogeneous to serve as an EPMA standard. This approach shows promise as a method for producing viable actinide standards for microanalysis.

  6. Quantitative Evaluation of Segmentation- and Atlas-Based Attenuation Correction for PET/MR on Pediatric Patients.

    PubMed

    Bezrukov, Ilja; Schmidt, Holger; Gatidis, Sergios; Mantlik, Frédéric; Schäfer, Jürgen F; Schwenzer, Nina; Pichler, Bernd J

    2015-07-01

    Pediatric imaging is regarded as a key application for combined PET/MR imaging systems. Because existing MR-based attenuation-correction methods were not designed specifically for pediatric patients, we assessed the impact of 2 potentially influential factors: inter- and intrapatient variability of attenuation coefficients and anatomic variability. Furthermore, we evaluated the quantification accuracy of 3 methods for MR-based attenuation correction without (SEGbase) and with bone prediction using an adult and a pediatric atlas (SEGwBONEad and SEGwBONEpe, respectively) on PET data of pediatric patients. The variability of attenuation coefficients between and within pediatric (5-17 y, n = 17) and adult (27-66 y, n = 16) patient collectives was assessed on volumes of interest (VOIs) in CT datasets for different tissue types. Anatomic variability was assessed on SEGwBONEad/pe attenuation maps by computing mean differences to CT-based attenuation maps for regions of bone tissue, lungs, and soft tissue. PET quantification was evaluated on VOIs with physiologic uptake and on 80% isocontour VOIs with elevated uptake in the thorax and abdomen/pelvis. Inter- and intrapatient variability of the bias was assessed for each VOI group and method. Statistically significant differences in mean VOI Hounsfield unit values and linear attenuation coefficients between adult and pediatric collectives were found in the lungs and femur. The prediction of attenuation maps using the pediatric atlas showed a reduced error in bone tissue and better delineation of bone structure. Evaluation of PET quantification accuracy showed statistically significant mean errors in mean standardized uptake values of -14% ± 5% and -23% ± 6% in bone marrow and femur-adjacent VOIs with physiologic uptake for SEGbase, which could be reduced to 0% ± 4% and -1% ± 5% using SEGwBONEpe attenuation maps. Bias in soft-tissue VOIs was less than 5% for all methods. 
Lung VOIs showed high SDs in the range of 15% for all methods. For VOIs with elevated uptake, mean and SD were less than 5% except in the thorax. The use of a dedicated atlas for the pediatric patient collective resulted in improved attenuation map prediction in osseous regions and reduced interpatient bias variation in femur-adjacent VOIs. For the lungs, in which intrapatient variation was higher for the pediatric collective, a patient- or group-specific attenuation coefficient might improve attenuation map accuracy. Mean errors of -14% and -23% in bone marrow and femur-adjacent VOIs can affect PET quantification in these regions when bone tissue is ignored. © 2015 by the Society of Nuclear Medicine and Molecular Imaging, Inc.

  7. Quantifying construction and demolition waste: An analytical review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Zezhou; Yu, Ann T.W., E-mail: bsannyu@polyu.edu.hk; Shen, Liyin

    2014-09-15

Highlights: • Prevailing C and D waste quantification methodologies are identified and compared. • One specific methodology cannot fulfill all waste quantification scenarios. • A relevance tree for appropriate quantification methodology selection is proposed. • More attention should be paid to civil and infrastructural works. • Classified information is suggested for making an effective waste management plan. - Abstract: Quantifying construction and demolition (C and D) waste generation is regarded as a prerequisite for the implementation of successful waste management. In the literature, various methods have been employed to quantify C and D waste generation at both regional and project levels. However, an integrated review that systematically describes and analyses all the existing methods has yet to be conducted. To bridge this research gap, an analytical review is conducted. Fifty-seven papers are retrieved based on a set of rigorous procedures. The characteristics of the selected papers are classified according to the following criteria: waste generation activity, estimation level and quantification methodology. Six categories of existing C and D waste quantification methodologies are identified, including the site visit method, waste generation rate method, lifetime analysis method, classification system accumulation method, variables modelling method and other particular methods. A critical comparison of the identified methods is given according to their characteristics and implementation constraints. Moreover, a decision tree is proposed for aiding the selection of the most appropriate quantification method in different scenarios. Based on the analytical review, limitations of previous studies and recommendations of potential future research directions are further suggested.
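The waste generation rate method named above scales a per-area generation rate by each project's gross floor area. A minimal sketch of that arithmetic; all rates and project figures below are invented for illustration and are not taken from the review.

```python
# Sketch of the waste generation rate method: estimated waste equals the
# gross floor area of a project multiplied by a per-area generation rate
# for that activity. All rates and project data are hypothetical.

WASTE_RATES_KG_PER_M2 = {
    "construction": 30.0,    # hypothetical construction-waste rate
    "demolition": 1000.0,    # hypothetical demolition-waste rate
}

def estimate_waste_kg(floor_area_m2: float, activity: str) -> float:
    """Return the estimated C and D waste mass for one project."""
    return floor_area_m2 * WASTE_RATES_KG_PER_M2[activity]

projects = [("office construction", 5000.0, "construction"),
            ("housing demolition", 1200.0, "demolition")]
total_kg = sum(estimate_waste_kg(area, act) for _, area, act in projects)
```

Regional estimates follow by summing project-level estimates, which is why rate selection dominates the method's accuracy.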

  8. Spot quantification in two dimensional gel electrophoresis image analysis: comparison of different approaches and presentation of a novel compound fitting algorithm

    PubMed Central

    2014-01-01

Background Various computer-based methods exist for the detection and quantification of protein spots in two dimensional gel electrophoresis images. Area-based methods are commonly used for spot quantification: an area is assigned to each spot and the sum of the pixel intensities in that area, the so-called volume, is used as a measure for spot signal. Other methods use the optical density, i.e. the intensity of the most intense pixel of a spot, or calculate the volume from the parameters of a fitted function. Results In this study we compare the performance of different spot quantification methods using synthetic and real data. We propose a ready-to-use algorithm for spot detection and quantification that uses fitting of two dimensional Gaussian function curves for the extraction of data from two dimensional gel electrophoresis (2-DE) images. The algorithm implements fitting using logical compounds and is computationally efficient. The applicability of the compound fitting algorithm was evaluated for various simulated data and compared with other quantification approaches. We provide evidence that even if an incorrect bell-shaped function is used, the fitting method is superior to other approaches, especially when spots overlap. Finally, we validated the method with experimental data of urea-based 2-DE of Aβ peptides and re-analyzed published data sets. Our methods showed higher precision and accuracy than other approaches when applied to exposure time series and standard gels. Conclusion Compound fitting as a quantification method for 2-DE spots shows several advantages over other approaches and could be combined with various spot detection methods. The algorithm was scripted in MATLAB (Mathworks) and is available as a supplemental file. PMID:24915860
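The function-fitting approach described above can be illustrated with a single-spot example: fit a two-dimensional Gaussian and read the spot volume off the fitted parameters rather than summing pixels over an assigned area. This is a sketch in Python/SciPy, not the authors' MATLAB compound-fitting implementation; the spot parameters are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

# Sketch: fit a 2-D Gaussian to a synthetic spot; the analytic volume of
# the fitted surface is 2*pi*amp*sx*sy. Not the paper's compound fitter.

def gauss2d(xy, amp, x0, y0, sx, sy):
    x, y = xy
    return amp * np.exp(-((x - x0) ** 2 / (2 * sx ** 2)
                          + (y - y0) ** 2 / (2 * sy ** 2)))

# synthetic noiseless spot with known parameters (amp=100, sx=3, sy=4)
x, y = np.meshgrid(np.arange(40.0), np.arange(40.0))
image = gauss2d((x, y), 100.0, 20.0, 18.0, 3.0, 4.0)

p0 = (80.0, 19.0, 19.0, 2.0, 2.0)  # rough initial guess
popt, _ = curve_fit(gauss2d, (x.ravel(), y.ravel()), image.ravel(), p0=p0)
volume = 2.0 * np.pi * popt[0] * abs(popt[3]) * abs(popt[4])
```

For overlapping spots, the paper's point is that a sum of such functions can be fitted jointly, so each spot's volume is disentangled from its neighbours.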

  9. Analysis of sesquiterpene lactones, lignans, and flavonoids in wormwood (Artemisia absinthium L.) using high-performance liquid chromatography (HPLC)-mass spectrometry, reversed phase HPLC, and HPLC-solid phase extraction-nuclear magnetic resonance.

    PubMed

    Aberham, Anita; Cicek, Serhat Sezai; Schneider, Peter; Stuppner, Hermann

    2010-10-27

Today, the medicinal use of wormwood (Artemisia absinthium) is enjoying a resurgence of popularity. This study presents a specific and validated high-performance liquid chromatography (HPLC)-diode array detection method for the simultaneous determination and quantification of bioactive compounds in wormwood and commercial preparations thereof. Five sesquiterpene lactones, two lignans, and a polymethoxylated flavonoid were baseline separated on RP-18 material, using a solvent gradient consisting of 0.085% (v/v) o-phosphoric acid and acetonitrile. The flow rate was 1.0 mL/min, and chromatograms were recorded at 205 nm. The stability of absinthin was tested by exposing samples to light, moisture, and different temperatures. Methanolic and aqueous solutions of absinthin were found to be stable for up to 6 months. This was also the case when the solid compound was kept in the refrigerator at -35 °C. In contrast, the colorless needles, when stored at room temperature, turned yellow. Three degradation compounds (anabsin, anabsinthin, and the new dimer 3'-hydroxyanabsinthin) were identified by HPLC-mass spectrometry and HPLC-solid-phase extraction-nuclear magnetic resonance and quantified by the established HPLC method.

  10. 237Np analytical method using 239Np tracers and application to a contaminated nuclear disposal facility

    DOE PAGES

    Snow, Mathew S.; Morrison, Samuel S.; Clark, Sue B.; ...

    2017-03-21

In this study, environmental 237Np analyses are challenged by low 237Np concentrations and the lack of an available yield tracer; we report a rapid, inexpensive 237Np analytical approach employing the short-lived 239Np (t1/2 = 2.3 days) as a chemical yield tracer, followed by 237Np quantification using inductively coupled plasma-mass spectrometry. 239Np tracer is obtained via separation from a 243Am stock solution and standardized using gamma spectrometry immediately prior to sample processing. Rapid digestions using a commercial, 900 W "Walmart" microwave and Parr microwave vessels result in 99.8 ± 0.1% digestion yields, while chromatographic separations enable Np/U separation factors on the order of 10^6 and total Np yields of 95 ± 4% (2σ). Application of this method to legacy soil samples surrounding a radioactive disposal facility (the Subsurface Disposal Area at Idaho National Laboratory) reveals the presence of low-level 237Np contamination within 600 m of this site, with maximum 237Np concentrations on the order of 10^3 times greater than nuclear weapons testing fallout levels.

11. 237Np analytical method using 239Np tracers and application to a contaminated nuclear disposal facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snow, Mathew S.; Morrison, Samuel S.; Clark, Sue B.

    2017-06-01

Environmental 237Np analyses are challenged by low 237Np concentrations and the lack of an available yield tracer; we report a rapid, inexpensive 237Np analytical approach employing the short-lived 239Np (t1/2 = 2.3 days) as a chemical yield tracer, followed by 237Np quantification using inductively coupled plasma-mass spectrometry. 239Np tracer is obtained via separation from a 243Am stock solution and standardized using gamma spectrometry immediately prior to sample processing. Rapid digestions using a commercial, 900 watt "Walmart" microwave and Parr microwave vessels result in 99.8 ± 0.1% digestion yields, while chromatographic separations enable Np/U separation factors on the order of 10^6 and total Np yields of 95 ± 4% (2σ). Application of this method to legacy soil samples surrounding a radioactive disposal facility (the Subsurface Disposal Area at Idaho National Laboratory) reveals the presence of low-level 237Np contamination within 600 meters of this site, with maximum 237Np concentrations on the order of 10^3 times greater than nuclear weapons testing fallout levels.

  12. 237Np analytical method using 239Np tracers and application to a contaminated nuclear disposal facility.

    PubMed

    Snow, Mathew S; Morrison, Samuel S; Clark, Sue B; Olson, John E; Watrous, Matthew G

    2017-06-01

Environmental 237Np analyses are challenged by low 237Np concentrations and the lack of an available yield tracer; we report a rapid, inexpensive 237Np analytical approach employing the short-lived 239Np (t1/2 = 2.3 days) as a chemical yield tracer, followed by 237Np quantification using inductively coupled plasma-mass spectrometry. 239Np tracer is obtained via separation from a 243Am stock solution and standardized using gamma spectrometry immediately prior to sample processing. Rapid digestions using a commercial, 900 W "Walmart" microwave and Parr microwave vessels result in 99.8 ± 0.1% digestion yields, while chromatographic separations enable Np/U separation factors on the order of 10^6 and total Np yields of 95 ± 4% (2σ). Application of this method to legacy soil samples surrounding a radioactive disposal facility (the Subsurface Disposal Area at Idaho National Laboratory) reveals the presence of low-level 237Np contamination within 600 m of this site, with maximum 237Np concentrations on the order of 10^3 times greater than nuclear weapons testing fallout levels. Copyright © 2017 Elsevier Ltd. All rights reserved.
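The yield-tracer logic in the records above is simple arithmetic: the 239Np activity is gamma-counted at spike time and again after separation, decay-corrected back to the spike time (t1/2 = 2.3 d), and the ratio gives the chemical yield used to correct the 237Np ICP-MS result. A sketch with invented activities and masses:

```python
import math

# Sketch of chemical-yield correction with a short-lived 239Np tracer.
# All activities and masses below are invented illustration values.

T_HALF_DAYS = 2.3  # 239Np half-life

def decay_correct(activity_bq: float, elapsed_days: float) -> float:
    """Back-correct a measured activity to the spike time."""
    return activity_bq * math.exp(math.log(2.0) * elapsed_days / T_HALF_DAYS)

spiked_bq = 50.0       # 239Np activity added to the sample
recovered_bq = 16.0    # 239Np activity measured after separation
elapsed_days = 2.3     # one half-life between spike and measurement

chem_yield = decay_correct(recovered_bq, elapsed_days) / spiked_bq
np237_measured_pg = 12.0                       # ICP-MS result, separated fraction
np237_in_sample_pg = np237_measured_pg / chem_yield
```

Because the tracer and analyte are the same element, losses during digestion and chromatography cancel in the ratio, which is the point of an isotopic yield tracer.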

  13. A Novel ImageJ Macro for Automated Cell Death Quantitation in the Retina

    PubMed Central

    Maidana, Daniel E.; Tsoka, Pavlina; Tian, Bo; Dib, Bernard; Matsumoto, Hidetaka; Kataoka, Keiko; Lin, Haijiang; Miller, Joan W.; Vavvas, Demetrios G.

    2015-01-01

    Purpose TUNEL assay is widely used to evaluate cell death. Quantification of TUNEL-positive (TUNEL+) cells in tissue sections is usually performed manually, ideally by two masked observers. This process is time consuming, prone to measurement errors, and not entirely reproducible. In this paper, we describe an automated quantification approach to address these difficulties. Methods We developed an ImageJ macro to quantitate cell death by TUNEL assay in retinal cross-section images. The script was coded using IJ1 programming language. To validate this tool, we selected a dataset of TUNEL assay digital images, calculated layer area and cell count manually (done by two observers), and compared measurements between observers and macro results. Results The automated macro segmented outer nuclear layer (ONL) and inner nuclear layer (INL) successfully. Automated TUNEL+ cell counts were in-between counts of inexperienced and experienced observers. The intraobserver coefficient of variation (COV) ranged from 13.09% to 25.20%. The COV between both observers was 51.11 ± 25.83% for the ONL and 56.07 ± 24.03% for the INL. Comparing observers' results with macro results, COV was 23.37 ± 15.97% for the ONL and 23.44 ± 18.56% for the INL. Conclusions We developed and validated an ImageJ macro that can be used as an accurate and precise quantitative tool for retina researchers to achieve repeatable, unbiased, fast, and accurate cell death quantitation. We believe that this standardized measurement tool could be advantageous to compare results across different research groups, as it is freely available as open source. PMID:26469755
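The intra- and interobserver comparisons above rest on the coefficient of variation, COV = SD / mean × 100. A minimal sketch of that statistic; the count values are invented, not the study's data.

```python
import statistics

# Sketch of the coefficient of variation used to compare manual and
# automated TUNEL+ cell counts. Counts below are hypothetical.

def cov_percent(values: list[float]) -> float:
    """Sample-SD coefficient of variation, as a percentage of the mean."""
    return statistics.stdev(values) / statistics.mean(values) * 100.0

observer_counts = [42.0, 48.0, 45.0]   # repeated counts by one observer
intraobserver_cov = cov_percent(observer_counts)
```

A lower COV between the macro and each observer than between the two observers is the sense in which automated counting is the more reproducible measurement.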

  14. Computational analysis of PET by AIBL (CapAIBL): a cloud-based processing pipeline for the quantification of PET images

    NASA Astrophysics Data System (ADS)

    Bourgeat, Pierrick; Dore, Vincent; Fripp, Jurgen; Villemagne, Victor L.; Rowe, Chris C.; Salvado, Olivier

    2015-03-01

With the advances of PET tracers for β-Amyloid (Aβ) detection in neurodegenerative diseases, automated quantification methods are desirable. For clinical use, there is a great need for a PET-only quantification method, as MR images are not always available. In this paper, we validate a previously developed PET-only quantification method against MR-based quantification using 6 tracers: 18F-Florbetaben (N=148), 18F-Florbetapir (N=171), 18F-NAV4694 (N=47), 18F-Flutemetamol (N=180), 11C-PiB (N=381) and 18F-FDG (N=34). The results show an overall mean absolute percentage error of less than 5% for each tracer. The method has been implemented as a remote service called CapAIBL (http://milxcloud.csiro.au/capaibl). PET images are uploaded to a cloud platform where they are spatially normalised to a standard template and quantified. A report containing global as well as local quantification, along with a surface projection of the β-Amyloid deposition, is automatically generated at the end of the pipeline and emailed to the user.

  15. Sulfur-based absolute quantification of proteins using isotope dilution inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Lee, Hyun-Seok; Heun Kim, Sook; Jeong, Ji-Seon; Lee, Yong-Moon; Yim, Yong-Hyeon

    2015-10-01

    An element-based reductive approach provides an effective means of realizing International System of Units (SI) traceability for high-purity biological standards. Here, we develop an absolute protein quantification method using double isotope dilution (ID) inductively coupled plasma mass spectrometry (ICP-MS) combined with microwave-assisted acid digestion for the first time. We validated the method and applied it to certify the candidate protein certified reference material (CRM) of human growth hormone (hGH). The concentration of hGH was determined by analysing the total amount of sulfur in hGH. Next, the size-exclusion chromatography method was used with ICP-MS to characterize and quantify sulfur-containing impurities. By subtracting the contribution of sulfur-containing impurities from the total sulfur content in the hGH CRM, we obtained a SI-traceable certification value. The quantification result obtained with the present method based on sulfur analysis was in excellent agreement with the result determined via a well-established protein quantification method based on amino acid analysis using conventional acid hydrolysis combined with an ID liquid chromatography-tandem mass spectrometry. The element-based protein quantification method developed here can be generally used for SI-traceable absolute quantification of proteins, especially pure-protein standards.
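The sulfur-based arithmetic described above reduces to: subtract impurity sulfur (from SEC-ICP-MS) from total sulfur (from ID ICP-MS), then convert protein-bound sulfur to protein mass via the number of sulfur atoms per molecule. A sketch with invented measurement values; the hGH molar mass and sulfur count used here are assumptions for illustration, not certified figures.

```python
# Sketch of sulfur-based protein quantification. All measurement values
# are invented; molar mass and sulfur count per molecule are assumptions.

M_S = 32.06            # g/mol, sulfur
M_PROTEIN = 22125.0    # g/mol, approximate hGH molar mass (assumption)
N_S_PER_MOLECULE = 7   # sulfur atoms per hGH molecule (assumption)

total_s_mg_per_kg = 10.0     # total sulfur via double ID ICP-MS
impurity_s_mg_per_kg = 0.4   # sulfur in impurities via SEC-ICP-MS

protein_s_mg_per_kg = total_s_mg_per_kg - impurity_s_mg_per_kg
protein_mg_per_kg = protein_s_mg_per_kg / (N_S_PER_MOLECULE * M_S) * M_PROTEIN
```

SI traceability comes from the sulfur measurement itself: the protein value inherits the traceability of the elemental determination, provided impurity sulfur is fully accounted for.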

  16. RNA-Skim: a rapid method for RNA-Seq quantification at transcript level

    PubMed Central

    Zhang, Zhaojun; Wang, Wei

    2014-01-01

Motivation: The RNA-Seq technique has been demonstrated as a revolutionary means for exploring the transcriptome because it provides deep coverage and base pair-level resolution. RNA-Seq quantification has proven to be an efficient alternative to the Microarray technique in gene expression studies, and it is a critical component of RNA-Seq differential expression analysis. Most existing RNA-Seq quantification tools require the alignment of fragments to either a genome or a transcriptome, entailing a time-consuming and intricate alignment step. To improve the performance of RNA-Seq quantification, an alignment-free method, Sailfish, has recently been proposed to quantify transcript abundances using all k-mers in the transcriptome, demonstrating the feasibility of designing an efficient alignment-free method for transcriptome quantification. Even though Sailfish is substantially faster than alternative alignment-dependent methods such as Cufflinks, using all k-mers in the transcriptome quantification impedes the scalability of the method. Results: We propose a novel RNA-Seq quantification method, RNA-Skim, which partitions the transcriptome into disjoint transcript clusters based on sequence similarity, and introduces the notion of sig-mers, which are a special type of k-mer uniquely associated with each cluster. We demonstrate that the sig-mer counts within a cluster are sufficient for estimating transcript abundances with accuracy comparable to any state-of-the-art method. This enables RNA-Skim to perform transcript quantification on each cluster independently, reducing a complex optimization problem into smaller optimization tasks that can be run in parallel. As a result, RNA-Skim uses <4% of the k-mers and <10% of the CPU time required by Sailfish.
It is able to finish transcriptome quantification in <10 min per sample using just a single thread on a commodity computer, representing a >100× speedup over state-of-the-art alignment-based methods, while delivering comparable or higher accuracy. Availability and implementation: The software is available at http://www.csbio.unc.edu/rs. Contact: weiwang@cs.ucla.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24931995
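The sig-mer idea above can be sketched directly: given transcript clusters, a sig-mer is a k-mer that occurs in exactly one cluster, so counting only sig-mers in the reads is enough to apportion abundance between clusters. Toy sequences and k=4; this illustrates only the selection step, not RNA-Skim's abundance estimation.

```python
from collections import defaultdict

# Sketch of sig-mer selection: keep k-mers unique to a single transcript
# cluster. Sequences are toy examples, k = 4.

K = 4

def kmers(seq: str) -> set:
    return {seq[i:i + K] for i in range(len(seq) - K + 1)}

clusters = {
    "c1": ["ACGTACGTAA", "ACGTACGAAT"],
    "c2": ["TTGGCCAATT"],
}

# map each k-mer to the set of clusters it appears in
owner = defaultdict(set)
for cid, transcripts in clusters.items():
    for t in transcripts:
        for km in kmers(t):
            owner[km].add(cid)

# sig-mers: k-mers owned by exactly one cluster
sig_mers = {cid: {km for km, cs in owner.items() if cs == {cid}}
            for cid in clusters}
```

Because sig-mers never cross cluster boundaries, each cluster's counts can be processed independently, which is what allows the per-cluster parallelism described in the abstract.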

  17. Differential Nuclear and Mitochondrial DNA Preservation in Post-Mortem Teeth with Implications for Forensic and Ancient DNA Studies

    PubMed Central

    Higgins, Denice; Rohrlach, Adam B.; Kaidonis, John; Townsend, Grant; Austin, Jeremy J.

    2015-01-01

Major advances in genetic analysis of skeletal remains have been made over the last decade, primarily due to improvements in post-DNA-extraction techniques. Despite this, a key challenge for DNA analysis of skeletal remains is the limited yield of DNA recovered from these poorly preserved samples. Enhanced DNA recovery by improved sampling and extraction techniques would allow further advancements. However, little is known about the post-mortem kinetics of DNA degradation and whether the rate of degradation varies between nuclear and mitochondrial DNA or across different skeletal tissues. This knowledge, along with information regarding ante-mortem DNA distribution within skeletal elements, would inform sampling protocols, facilitating development of improved extraction processes. Here we present a combined genetic and histological examination of DNA content and rates of DNA degradation in the different tooth tissues of 150 human molars over short to medium post-mortem intervals. DNA was extracted from coronal dentine, root dentine, cementum and pulp of 114 teeth via a silica column method, and the remaining 36 teeth were examined histologically. Real-time quantification assays based on two nuclear DNA fragments (67 bp and 156 bp) and one mitochondrial DNA fragment (77 bp) showed nuclear and mitochondrial DNA degraded exponentially, but at different rates, depending on post-mortem interval and soil temperature. In contrast to previous studies, we identified differential survival of nuclear and mtDNA in different tooth tissues. Furthermore, histological examination showed pulp and dentine were rapidly affected by loss of structural integrity, and pulp was completely destroyed in a relatively short time period. Conversely, cementum showed little structural change over the same time period.
Finally, we confirm that targeted sampling of cementum from teeth buried for up to 16 months can provide a reliable source of nuclear DNA for STR-based genotyping using standard extraction methods, without the need for specialised equipment or large-volume demineralisation steps. PMID:25992635
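Exponential degradation at target-specific rates, as reported above, means amplifiable template follows N(t) = N0 · exp(−k·t) with a separate decay constant per target. A sketch with invented rate constants (the study's fitted rates are not reproduced here):

```python
import math

# Sketch of the exponential degradation model: remaining amplifiable
# copies N(t) = N0 * exp(-k * t). Decay constants below are hypothetical.

def remaining(n0: float, k_per_month: float, months: float) -> float:
    return n0 * math.exp(-k_per_month * months)

k_nuclear = 0.30   # hypothetical decay constant for a nuclear target
k_mito = 0.10      # hypothetical, slower decay for the mtDNA target

n_nuc = remaining(1000.0, k_nuclear, 12.0)
n_mit = remaining(1000.0, k_mito, 12.0)   # mtDNA persists longer
```

Under this model, the ratio of surviving mitochondrial to nuclear copies grows exponentially with post-mortem interval, which is why target choice matters more for older samples.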

  18. Assessment of cardiac fibrosis: a morphometric method comparison for collagen quantification.

    PubMed

    Schipke, Julia; Brandenberger, Christina; Rajces, Alexandra; Manninger, Martin; Alogna, Alessio; Post, Heiner; Mühlfeld, Christian

    2017-04-01

    Fibrotic remodeling of the heart is a frequent condition linked to various diseases and cardiac dysfunction. Collagen quantification is an important objective in cardiac fibrosis research; however, a variety of different histological methods are currently used that may differ in accuracy. Here, frequently applied collagen quantification techniques were compared. A porcine model of early stage heart failure with preserved ejection fraction was used as an example. Semiautomated threshold analyses were imprecise, mainly due to inclusion of noncollagen structures or failure to detect certain collagen deposits. In contrast, collagen assessment by automated image analysis and light microscopy (LM)-stereology was more sensitive. Depending on the quantification method, the amount of estimated collagen varied and influenced intergroup comparisons. PicroSirius Red, Masson's trichrome, and Azan staining protocols yielded similar results, whereas the measured collagen area increased with increasing section thickness. Whereas none of the LM-based methods showed significant differences between the groups, electron microscopy (EM)-stereology revealed a significant collagen increase between cardiomyocytes in the experimental group, but not at other localizations. In conclusion, in contrast to the staining protocol, section thickness and the quantification method being used directly influence the estimated collagen content and thus, possibly, intergroup comparisons. EM in combination with stereology is a precise and sensitive method for collagen quantification if certain prerequisites are considered. For subtle fibrotic alterations, consideration of collagen localization may be necessary. Among LM methods, LM-stereology and automated image analysis are appropriate to quantify fibrotic changes, the latter depending on careful control of algorithm and comparable section staining. 
NEW & NOTEWORTHY Direct comparison of frequently applied histological fibrosis assessment techniques revealed a distinct relation of measured collagen and utilized quantification method as well as section thickness. Besides electron microscopy-stereology, which was precise and sensitive, light microscopy-stereology and automated image analysis proved to be appropriate for collagen quantification. Moreover, consideration of collagen localization might be important in revealing minor fibrotic changes. Copyright © 2017 the American Physiological Society.

  19. A phase quantification method based on EBSD data for a continuously cooled microalloyed steel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, H.; Wynne, B.P.; Palmiere, E.J., E-mail: e.j

    2017-01-15

Mechanical properties of steels depend on the phase constitutions of the final microstructures, which can be related to the processing parameters. Therefore, accurate quantification of the different phases is necessary to investigate the relationships between processing parameters, final microstructures and mechanical properties. Point counting on micrographs observed by optical or scanning electron microscopy is widely used as a phase quantification method, with different phases discriminated according to their morphological characteristics. However, it is difficult to differentiate some phase constituents with similar morphology. In contrast, for EBSD-based phase quantification methods, parameters derived from the orientation information can be used for discrimination in addition to morphological characteristics. In this research, a phase quantification method based on EBSD data in the unit of grains was proposed to identify and quantify the complex phase constitutions of a microalloyed steel subjected to accelerated cooling. Characteristics of polygonal ferrite/quasi-polygonal ferrite, acicular ferrite and bainitic ferrite in terms of grain-averaged misorientation angles, aspect ratios, high-angle grain boundary fractions and grain sizes were analysed and used to develop identification criteria for each phase. Comparing the results obtained by this EBSD-based method and point counting, it was found that the EBSD-based method can provide accurate and reliable phase quantification results for microstructures with relatively slow cooling rates. - Highlights: •A phase quantification method based on EBSD data in the unit of grains was proposed. •The critical grain area above which GAM angles are valid parameters was obtained. •Grain size and grain boundary misorientation were used to identify acicular ferrite. •High cooling rates deteriorate the accuracy of this EBSD-based method.
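Grain-wise identification of the kind described above amounts to threshold rules over per-grain EBSD parameters. A sketch of that structure; the thresholds and grain values below are hypothetical, not the criteria developed in the paper.

```python
# Sketch of grain-wise phase identification from EBSD-derived parameters:
# each grain carries its grain-averaged misorientation (GAM), aspect
# ratio, high-angle grain boundary (HAGB) fraction and size, and simple
# threshold rules assign a phase label. All thresholds are hypothetical.

def classify_grain(gam_deg: float, aspect_ratio: float,
                   hagb_fraction: float, size_um: float) -> str:
    if gam_deg < 0.6 and aspect_ratio < 2.5:
        return "polygonal/quasi-polygonal ferrite"
    if hagb_fraction > 0.5 and size_um < 10.0:
        return "acicular ferrite"
    return "bainitic ferrite"

grains = [
    {"gam_deg": 0.4, "aspect_ratio": 1.8, "hagb_fraction": 0.7, "size_um": 25.0},
    {"gam_deg": 1.2, "aspect_ratio": 3.0, "hagb_fraction": 0.8, "size_um": 6.0},
    {"gam_deg": 1.5, "aspect_ratio": 4.0, "hagb_fraction": 0.2, "size_um": 30.0},
]
labels = [classify_grain(**g) for g in grains]
```

Phase fractions then follow by summing grain areas per label, which is what makes the grain (rather than the pixel) the natural unit for this method.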

  20. WGA-based lectin affinity gel electrophoresis: A novel method for the detection of O-GlcNAc-modified proteins.

    PubMed

    Kubota, Yuji; Fujioka, Ko; Takekawa, Mutsuhiro

    2017-01-01

    Post-translational modification with O-linked β-N-acetylglucosamine (O-GlcNAc) occurs selectively on serine and/or threonine residues of cytoplasmic and nuclear proteins, and dynamically regulates their molecular functions. Since conventional strategies to evaluate the O-GlcNAcylation level of a specific protein require time-consuming steps, the development of a rapid and easy method for the detection and quantification of an O-GlcNAcylated protein has been a challenging issue. Here, we describe a novel method in which O-GlcNAcylated and non-O-GlcNAcylated forms of proteins are separated by lectin affinity gel electrophoresis using wheat germ agglutinin (WGA), which primarily binds to N-acetylglucosamine residues. Electrophoresis of cell lysates through a gel containing copolymerized WGA selectively induced retardation of the mobility of O-GlcNAcylated proteins, thereby allowing the simultaneous visualization of both the O-GlcNAcylated and the unmodified forms of proteins. This method is therefore useful for the quantitative detection of O-GlcNAcylated proteins.

  1. Impact of time-of-flight PET on quantification errors in MR imaging-based attenuation correction.

    PubMed

    Mehranian, Abolfazl; Zaidi, Habib

    2015-04-01

    Time-of-flight (TOF) PET/MR imaging is an emerging imaging technology with great capabilities offered by TOF to improve image quality and lesion detectability. We assessed, for the first time, the impact of TOF image reconstruction on PET quantification errors induced by MR imaging-based attenuation correction (MRAC) using simulation and clinical PET/CT studies. Standard 4-class attenuation maps were derived by segmentation of CT images of 27 patients undergoing PET/CT examinations into background air, lung, soft-tissue, and fat tissue classes, followed by the assignment of predefined attenuation coefficients to each class. For each patient, 4 PET images were reconstructed: non-TOF and TOF both corrected for attenuation using reference CT-based attenuation correction and the resulting 4-class MRAC maps. The relative errors between non-TOF and TOF MRAC reconstructions were compared with their reference CT-based attenuation correction reconstructions. The bias was locally and globally evaluated using volumes of interest (VOIs) defined on lesions and normal tissues and CT-derived tissue classes containing all voxels in a given tissue, respectively. The impact of TOF on reducing the errors induced by metal-susceptibility and respiratory-phase mismatch artifacts was also evaluated using clinical and simulation studies. Our results show that TOF PET can remarkably reduce attenuation correction artifacts and quantification errors in the lungs and bone tissues. Using classwise analysis, it was found that the non-TOF MRAC method results in an error of -3.4% ± 11.5% in the lungs and -21.8% ± 2.9% in bones, whereas its TOF counterpart reduced the errors to -2.9% ± 7.1% and -15.3% ± 2.3%, respectively. The VOI-based analysis revealed that the non-TOF and TOF methods resulted in an average overestimation of 7.5% and 3.9% in or near lung lesions (n = 23) and underestimation of less than 5% for soft tissue and in or near bone lesions (n = 91). 
Simulation results showed that as TOF resolution improves, artifacts and quantification errors are substantially reduced. TOF PET substantially reduces artifacts and significantly improves the quantitative accuracy of standard MRAC methods. Therefore, MRAC should be less of a concern on future TOF PET/MR scanners with improved timing resolution. © 2015 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
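The bias figures quoted above are relative errors of each MRAC reconstruction against the CT-based reference, 100 · (MRAC − CTAC) / CTAC. A trivial sketch; the uptake values are invented to reproduce the reported lung-class errors, not taken from the patient data.

```python
# Sketch of the classwise bias metric: percentage relative error of an
# MRAC reconstruction against the CT-based reference. Uptakes invented.

def relative_error_percent(mrac_uptake: float, ctac_uptake: float) -> float:
    return 100.0 * (mrac_uptake - ctac_uptake) / ctac_uptake

# e.g. a lung tissue class reconstructed non-TOF vs TOF, same reference
err_non_tof = relative_error_percent(0.966, 1.0)   # about -3.4%
err_tof = relative_error_percent(0.971, 1.0)       # about -2.9%
```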

  2. Uncertainty quantification and sensitivity analysis with CASL Core Simulator VERA-CS

    DOE PAGES

    Brown, C. S.; Zhang, Hongbin

    2016-05-24

Uncertainty quantification and sensitivity analysis are important for nuclear reactor safety design and analysis. A 2x2 fuel assembly core design was developed and simulated by the Virtual Environment for Reactor Applications, Core Simulator (VERA-CS), a coupled neutronics and thermal-hydraulics code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). An approach to uncertainty quantification and sensitivity analysis with VERA-CS was developed, and a new toolkit was created to perform both analyses with fourteen uncertain input parameters. Furthermore, the minimum departure from nucleate boiling ratio (MDNBR), maximum fuel center-line temperature, and maximum outer clad surface temperature were chosen as the figures of merit. Pearson, Spearman, and partial correlation coefficients were considered for all figures of merit in the sensitivity analysis, and coolant inlet temperature was consistently the most influential parameter. Parameters used as inputs to the critical heat flux calculation with the W-3 correlation were shown to be the most influential on the MDNBR, maximum fuel center-line temperature, and maximum outer clad surface temperature.
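Correlation-based sensitivity ranking of the kind used above works by sampling the uncertain inputs, evaluating the figure of merit, and ranking inputs by the magnitude of their correlation coefficients. A sketch with a toy stand-in model (not VERA-CS); the input distributions and response coefficients are invented.

```python
import numpy as np
from scipy import stats

# Sketch of correlation-based sensitivity analysis on a toy model that
# mimics MDNBR falling with inlet temperature and power. All numbers
# are hypothetical; this is not the VERA-CS model.

rng = np.random.default_rng(0)
n = 500
inlet_temp = rng.normal(290.0, 2.0, n)   # hypothetical uncertain input (C)
power = rng.normal(100.0, 1.0, n)        # hypothetical uncertain input (%)

# toy figure of merit with a strong inlet-temperature dependence
mdnbr = (3.0 - 0.05 * (inlet_temp - 290.0) - 0.01 * (power - 100.0)
         + rng.normal(0.0, 0.02, n))

pearson_t, _ = stats.pearsonr(inlet_temp, mdnbr)
spearman_t, _ = stats.spearmanr(inlet_temp, mdnbr)
pearson_p, _ = stats.pearsonr(power, mdnbr)
```

The larger |r| for inlet temperature than for power mirrors the abstract's finding that coolant inlet temperature dominates the MDNBR sensitivity.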

  3. Sensitivity and Uncertainty Analysis of Plutonium and Cesium Isotopes in Modeling of BR3 Reactor Spent Fuel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Conant, Andrew; Erickson, Anna; Robel, Martin

Nuclear forensics has the broad task of characterizing recovered nuclear or radiological material and interpreting the results of investigation. One approach to isotopic characterization of nuclear material obtained from a reactor is to chemically separate and perform isotopic measurements on the sample and verify the results with modeling of the sample history, for example, operation of a nuclear reactor. The major actinide plutonium and fission product cesium are commonly measured signatures of the fuel history in a reactor core. This study investigates the uncertainty of the plutonium and cesium isotope ratios of a fuel rod discharged from a research pressurized water reactor when the location of the sample is not known a priori. A sensitivity analysis showed overpredicted values for the 240Pu/239Pu ratio toward the axial center of the rod and revealed a lower probability of the rod of interest (ROI) being on the periphery of the assembly. The uncertainty analysis found the relative errors due only to the rod position and boron concentration to be 17% to 36% and 7% to 15% for the 240Pu/239Pu and 137Cs/135Cs ratios, respectively. Lastly, this study provides a method for uncertainty quantification of isotope concentrations due to the location of the ROI. Similar analyses can be performed to verify future chemical and isotopic analyses.

  4. Sensitivity and Uncertainty Analysis of Plutonium and Cesium Isotopes in Modeling of BR3 Reactor Spent Fuel

    DOE PAGES

    Conant, Andrew; Erickson, Anna; Robel, Martin; ...

    2017-02-03

    Nuclear forensics has a broad task: to characterize recovered nuclear or radiological material and interpret the results of the investigation. One approach to isotopic characterization of nuclear material obtained from a reactor is to chemically separate and perform isotopic measurements on the sample and verify the results with modeling of the sample history, for example, operation of a nuclear reactor. The major actinide plutonium and the fission product cesium are commonly measured signatures of the fuel history in a reactor core. This study investigates the uncertainty of the plutonium and cesium isotope ratios of a fuel rod discharged from a research pressurized water reactor when the location of the sample is not known a priori. A sensitivity analysis showed overpredicted values for the 240Pu/239Pu ratio toward the axial center of the rod and revealed a lower probability of the rod of interest (ROI) being on the periphery of the assembly. The uncertainty analysis found the relative errors due to only the rod position and boron concentration to be 17% to 36% and 7% to 15% for the 240Pu/239Pu and 137Cs/135Cs ratios, respectively. Lastly, this study provides a method for uncertainty quantification of isotope concentrations due to the location of the ROI. Similar analyses can be performed to verify future chemical and isotopic analyses.

  5. 3D Analysis of HCMV Induced-Nuclear Membrane Structures by FIB/SEM Tomography: Insight into an Unprecedented Membrane Morphology

    PubMed Central

    Villinger, Clarissa; Neusser, Gregor; Kranz, Christine; Walther, Paul; Mertens, Thomas

    2015-01-01

    We show that focused ion beam/scanning electron microscopy (FIB/SEM) tomography is an excellent method to analyze the three-dimensional structure of a fibroblast nucleus infected with human cytomegalovirus (HCMV). We found that the previously described infoldings of the inner nuclear membrane, which are unique of their kind, form an extremely complex network of membrane structures not predictable by previous two-dimensional studies. In all cases they contained further invaginations (2nd and 3rd order infoldings). Quantification revealed 5498 HCMV capsids within two nuclear segments, allowing an estimate of 15,000 to 30,000 capsids in the entire nucleus five days post infection. Only 0.8% proved to be enveloped capsids, which were exclusively detected in 1st order infoldings (perinuclear space). The distribution of the capsids between 1st, 2nd and 3rd order infoldings is in complete agreement with the envelopment/de-envelopment model for egress of HCMV capsids from the nucleus, and we confirm that capsid budding does occur at the large infoldings. Based on our results we propose the pushing-membrane model: HCMV infection induces local disruption of the nuclear lamina and synthesis of new membrane material, which is pushed into the nucleoplasm, forming abundant, complex membrane infoldings that may then also be used by nucleocapsids for budding. PMID:26556360

  6. Quantification of trans-1,4-polyisoprene in Eucommia ulmoides by fourier transform infrared spectroscopy and pyrolysis-gas chromatography/mass spectrometry.

    PubMed

    Takeno, Shinya; Bamba, Takeshi; Nakazawa, Yoshihisa; Fukusaki, Eiichiro; Okazawa, Atsushi; Kobayashi, Akio

    2008-04-01

    Commercial development of trans-1,4-polyisoprene from Eucommia ulmoides Oliver (EU-rubber) requires specific knowledge for the selection of high-rubber-content lines and the establishment of agronomic cultivation methods that achieve maximum EU-rubber yield. Development can be facilitated by high-throughput, highly sensitive analytical techniques for EU-rubber extraction and quantification. In this paper, we describe an efficient EU-rubber extraction method and validate that its accuracy is equivalent to that of the conventional Soxhlet extraction method. We also describe a highly sensitive quantification method for EU-rubber by Fourier transform infrared spectroscopy (FT-IR) and pyrolysis-gas chromatography/mass spectrometry (Py-GC/MS). We successfully applied the extraction/quantification method to study seasonal changes in EU-rubber content and molecular weight distribution.

  7. Adjoint-Based Uncertainty Quantification with MCNP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seifried, Jeffrey E.

    2011-09-01

    This work serves to quantify the instantaneous uncertainties in neutron transport simulations born from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.
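
    Propagating nuclear data covariances through sensitivities like those above follows the first-order "sandwich rule", relative variance = S^T C S. A minimal sketch with invented sensitivity and covariance values (not the LIFE blanket data):

```python
import numpy as np

# Hypothetical relative sensitivities S of a figure of merit to three nuclear
# data parameters, and an assumed relative covariance matrix C for those data.
S = np.array([0.9, -0.3, 0.1])
C = np.array([[4.0e-4, 1.0e-4, 0.0],
              [1.0e-4, 2.5e-4, 0.0],
              [0.0,    0.0,    1.0e-4]])

rel_variance = S @ C @ S            # first-order "sandwich rule"
rel_uncertainty = np.sqrt(rel_variance)
```

    With these made-up numbers the propagated relative uncertainty is about 1.7%, i.e. of the same small order (< 2%) as the estimates reported in the abstract.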

  8. Selective Distance-Based K+ Quantification on Paper-Based Microfluidics.

    PubMed

    Gerold, Chase T; Bakker, Eric; Henry, Charles S

    2018-04-03

    In this study, paper-based microfluidic devices (μPADs) capable of K+ quantification in aqueous samples, as well as in human serum, using both colorimetric and distance-based methods are described. A lipophilic phase containing potassium ionophore I (valinomycin) was utilized to achieve highly selective quantification of K+ in the presence of Na+, Li+, and Mg2+ ions. Successful addition of a suspended lipophilic phase to a wax-printed paper-based device is described and offers a solution to current approaches that rely on organic solvents, which damage wax barriers. The approach provides an avenue for future alkali/alkaline quantification utilizing μPADs. Colorimetric spot tests allowed for K+ quantification from 0.1-5.0 mM using only 3.00 μL of sample solution. Selective distance-based quantification required small sample volumes (6.00 μL) and gave responses sensitive enough to distinguish between 1.0 and 2.5 mM of sample K+. μPADs using distance-based methods were also capable of differentiating between 4.3 and 6.9 mM K+ in human serum samples. Distance-based methods required no digital analysis, electronic hardware, or pumps; any steps required for quantification could be carried out using the naked eye.

  9. Simultaneous quantification of oil and protein in cottonseed by low-field time-domain nuclear magnetic resonance

    USDA-ARS's Scientific Manuscript database

    Modification of cottonseed quality traits is likely to be achieved through a combination of genetic modification, manipulation of nutrient allocation and selective breeding. Oil and protein stores comprise the majority of mass of cottonseed embryos. A more comprehensive understanding of the relation...

  10. Non-destructive quantification of oil and protein in cottonseed by time-domain nuclear magnetic resonance

    USDA-ARS's Scientific Manuscript database

    Modification of cotton seed quality traits is likely to be achieved through a combination of genetic modification, nutrient allocation, and selective breeding. Oil and protein stores comprise the majority of mass of cottonseed embryos. A more comprehensive understanding of the relationship between f...

  11. A new analytical method for quantification of olive and palm oil in blends with other vegetable edible oils based on the chromatographic fingerprints from the methyl-transesterified fraction.

    PubMed

    Jiménez-Carvelo, Ana M; González-Casado, Antonio; Cuadros-Rodríguez, Luis

    2017-03-01

    A new analytical method for the quantification of olive oil and palm oil in blends with other vegetable edible oils (canola, safflower, corn, peanut, seeds, grapeseed, linseed, sesame and soybean) using normal-phase liquid chromatography and chemometric tools was developed. The procedure for obtaining the chromatographic fingerprint of the methyl-transesterified fraction of each blend is described. The multivariate quantification methods used were Partial Least Squares Regression (PLS-R) and Support Vector Regression (SVR). The quantification results were evaluated by several parameters, such as the Root Mean Square Error of Validation (RMSEV), Mean Absolute Error of Validation (MAEV) and Median Absolute Error of Validation (MdAEV). It should be highlighted that with the newly proposed analytical method, the chromatographic analysis takes only eight minutes; the results obtained showed the potential of this method and allowed quantification of mixtures of olive oil and palm oil with other vegetable oils. Copyright © 2016 Elsevier B.V. All rights reserved.
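
    The three validation metrics named above (RMSEV, MAEV, MdAEV) reduce to simple formulas on the validation residuals; a sketch with made-up true/predicted oil percentages (not the paper's data):

```python
import numpy as np

# Hypothetical validation set: true vs. predicted oil percentages in blends.
y_true = np.array([10.0, 20.0, 35.0, 50.0, 75.0])
y_pred = np.array([11.2, 18.5, 36.1, 49.0, 77.3])

err = y_pred - y_true
rmsev = np.sqrt(np.mean(err ** 2))    # Root Mean Square Error of Validation
maev = np.mean(np.abs(err))           # Mean Absolute Error of Validation
mdaev = np.median(np.abs(err))        # Median Absolute Error of Validation
```

    MdAEV is the most robust of the three to a single badly predicted blend, which is why it is reported alongside the mean-based metrics.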

  12. Quantification of the increase in thyroid cancer prevalence in Fukushima after the nuclear disaster in 2011—a potential overdiagnosis?

    PubMed Central

    Katanoda, Kota; Kamo, Ken-Ichi; Tsugane, Shoichiro

    2016-01-01

    A thyroid ultrasound examination programme has been conducted in Fukushima Prefecture, Japan, after the nuclear disaster in 2011. Although remarkably high prevalence of thyroid cancer was observed, no relevant quantitative evaluation was conducted. We calculated the observed/expected (O/E) ratio of thyroid cancer prevalence for the residents aged ≤20 years. Observed prevalence was the number of thyroid cancer cases detected by the programme through the end of April 2015. Expected prevalence was calculated as cumulative incidence by a life-table method using the national estimates of thyroid cancer incidence rate in 2001–10 (prior to the disaster) and the population of Fukushima Prefecture. The underlying assumption was that there was neither nuclear accident nor screening intervention. The observed and estimated prevalence of thyroid cancer among residents aged ≤20 years was 160.1 and 5.2, respectively, giving an O/E ratio of 30.8 [95% confidence interval (CI): 26.2, 35.9]. When the recent increasing trend in thyroid cancer was considered, the overall O/E ratio was 22.2 (95% CI: 18.9, 25.9). The cumulative number of thyroid cancer deaths in Fukushima Prefecture, estimated with the same method (annual average in 2009–13), was 0.6 under age 40. Combined with the existing knowledge about radiation effect on thyroid cancer, our descriptive analysis suggests the possibility of overdiagnosis. Evaluation including individual-level analysis is required to further clarify the contribution of underlying factors. PMID:26755830
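
    The headline O/E ratio above can be reproduced directly from the stated counts; the confidence interval below uses a standard Poisson log-normal approximation, which is an assumption of this sketch and only approximately matches the interval quoted in the abstract:

```python
import math

observed = 160.1   # screening-detected prevalent cases (age <= 20, from abstract)
expected = 5.2     # expected prevalence from pre-disaster national incidence

oe = observed / expected                       # observed/expected ratio, ~30.8
# Approximate 95% CI, treating the observed count as Poisson (log scale):
half_width = 1.96 / math.sqrt(observed)
ci = (oe * math.exp(-half_width), oe * math.exp(half_width))
```

    With these counts the approximation gives roughly (26.4, 35.9), close to the (26.2, 35.9) reported.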

  13. A Novel In-Beam Delayed Neutron Counting Technique for Characterization of Special Nuclear Materials

    NASA Astrophysics Data System (ADS)

    Bentoumi, G.; Rogge, R. B.; Andrews, M. T.; Corcoran, E. C.; Dimayuga, I.; Kelly, D. G.; Li, L.; Sur, B.

    2016-12-01

    A delayed neutron counting (DNC) system, where the sample to be analyzed remains stationary in a thermal neutron beam outside of the reactor, has been developed at the National Research Universal (NRU) reactor of the Canadian Nuclear Laboratories (CNL) at Chalk River. The new in-beam DNC is a novel approach for non-destructive characterization of special nuclear materials (SNM) that could enable identification and quantification of fissile isotopes within a large and shielded sample. Despite the orders of magnitude reduction in neutron flux, the in-beam DNC method can be as informative as the conventional in-core DNC for most cases while offering practical advantages and mitigated risk when dealing with large radioactive samples of unknown origin. This paper addresses (1) the qualification of in-beam DNC using a monochromatic thermal neutron beam in conjunction with a proven counting apparatus designed originally for in-core DNC, and (2) application of in-beam DNC to an examination of large sealed capsules containing unknown radioactive materials. Initial results showed that the in-beam DNC setup permits non-destructive analysis of bulky and gamma shielded samples. The method does not lend itself to trace analysis, and at best could only reveal the presence of a few milligrams of 235U via the assay of in-beam DNC total counts. Through analysis of DNC count rates, the technique could be used in combination with other neutron or gamma techniques to quantify isotopes present within samples.

  14. Optimization of a Widefield Structured Illumination Microscope for Non-Destructive Assessment and Quantification of Nuclear Features in Tumor Margins of a Primary Mouse Model of Sarcoma

    PubMed Central

    Fu, Henry L.; Mueller, Jenna L.; Javid, Melodi P.; Mito, Jeffrey K.; Kirsch, David G.; Ramanujam, Nimmi; Brown, J. Quincy

    2013-01-01

    Cancer is associated with specific cellular morphological changes, such as increased nuclear size and crowding from rapidly proliferating cells. In situ tissue imaging using fluorescent stains may be useful for intraoperative detection of residual cancer in surgical tumor margins. We developed a widefield fluorescence structured illumination microscope (SIM) system with a single-shot FOV of 2.1×1.6 mm (3.4 mm2) and sub-cellular resolution (4.4 µm). The objectives of this work were to measure the relationship between illumination pattern frequency and optical sectioning strength and signal-to-noise ratio in turbid (i.e., thick) samples for selection of the optimum frequency, and to determine feasibility for detecting residual cancer on tumor resection margins, using a genetically engineered primary mouse model of sarcoma. The SIM system was tested in tissue-mimicking solid phantoms with various scattering levels to determine the impact of both turbidity and illumination frequency on two SIM metrics, optical section thickness and modulation depth. To demonstrate preclinical feasibility, ex vivo 50 µm frozen sections and fresh intact thick tissue samples excised from a primary mouse model of sarcoma were stained with acridine orange, which stains cell nuclei, skeletal muscle, and collagenous stroma. The cell nuclei were segmented using a high-pass filter algorithm, which allowed quantification of nuclear density. The results showed that the optimal illumination frequency was 31.7 µm−1, used in conjunction with a 4× 0.1 NA objective. This yielded an optical section thickness of 128 µm and an 8.9× contrast enhancement over uniform illumination. We successfully demonstrated the ability to resolve cell nuclei in situ via SIM, which allowed segmentation of nuclei from heterogeneous tissues in the presence of considerable background fluorescence. Specifically, we demonstrate that optical sectioning of fresh intact thick tissues performed equivalently to physical frozen sectioning and standard microscopy with regard to nuclear density quantification. PMID:23894357

  15. Interpretation of biological and mechanical variations between the Lowry versus Bradford method for protein quantification.

    PubMed

    Lu, Tzong-Shi; Yiao, Szu-Yu; Lim, Kenneth; Jensen, Roderick V; Hsiao, Li-Li

    2010-07-01

    The identification of differences in protein expression resulting from methodical variations is an essential component of the interpretation of true, biologically significant results. We used the Lowry and Bradford methods, the two most commonly used methods for protein quantification, to assess whether differential protein expressions are a result of true biological or methodical variations. Material & Methods: Differential protein expression patterns were assessed by western blot following protein quantification by the Lowry and Bradford methods. We observed significant variations in protein concentrations following assessment with the Lowry versus Bradford methods, using identical samples. Greater variations in protein concentration readings were observed over time and in samples with higher concentrations with the Bradford method. Identical samples quantified using both methods yielded significantly different expression patterns on Western blot. We show for the first time that the methodical variations observed in these protein assay techniques can potentially translate into differential protein expression patterns that can be falsely taken to be biologically significant. Our study therefore highlights the pivotal need to carefully consider methodical approaches to protein quantification in techniques that report quantitative differences.

  16. Detection and quantification of proteins and cells by use of elemental mass spectrometry: progress and challenges.

    PubMed

    Yan, Xiaowen; Yang, Limin; Wang, Qiuquan

    2013-07-01

    Much progress has been made in identification of the proteins in proteomes, and quantification of these proteins has attracted much interest. In addition to popular tandem mass spectrometric methods based on soft ionization, inductively coupled plasma mass spectrometry (ICPMS), a typical example of mass spectrometry based on hard ionization, usually used for analysis of elements, has unique advantages in absolute quantification of proteins by determination of an element with a definite stoichiometry in a protein or attached to the protein. In this Trends article, we briefly describe state-of-the-art ICPMS-based methods for quantification of proteins, emphasizing protein-labeling and element-tagging strategies developed on the basis of chemically selective reactions and/or biospecific interactions. Recent progress from protein to cell quantification by use of ICPMS is also discussed, and the possibilities and challenges of ICPMS-based protein quantification for universal, selective, or targeted quantification of proteins and cells in a biological sample are also discussed critically. We believe ICPMS-based protein quantification will become ever more important in targeted quantitative proteomics and bioanalysis in the near future.

  17. Single-photon emitting radiotracers produced by cyclotrons for myocardial imaging

    NASA Astrophysics Data System (ADS)

    Kulkarni, Padmakar V.

    1989-04-01

    Radionuclides produced by cyclotrons have played an important role in clinical nuclear medicine. Among these, 201Tl, 123I, 111In and 67Ga in various chemical forms have important applications in the diagnosis of cancer and heart disease using scintigraphic imaging techniques. Cardiac imaging using nuclear scintigraphy and echocardiography has been among the fastest growing diagnostic technologies in medicine during the past 15 years. The development of new tracers, in conjunction with new equipment with better resolution, has contributed to better quantification and analysis of test results. The development of new biomolecules, monoclonal antibodies to myosin, platelets, fibrin and other receptor-binding agents has added a new dimension to nuclear imaging studies.

  18. Multicenter evaluation of stress-first myocardial perfusion image triage by nuclear technologists and automated quantification

    PubMed Central

    Chaudhry, Waseem; Hussain, Nasir; Ahlberg, Alan W.; Croft, Lori B.; Fernandez, Antonio B.; Parker, Mathew W.; Swales, Heather H.; Slomka, Piotr J.; Henzlova, Milena J.; Duvall, W. Lane

    2016-01-01

    Background A stress-first myocardial perfusion imaging (MPI) protocol saves time, is cost effective, and decreases radiation exposure. A limitation of this protocol is the requirement for physician review of the stress images to determine the need for rest images. This hurdle could be eliminated if an experienced technologist and/or automated computer quantification could make this determination. Methods Images from consecutive patients who were undergoing a stress-first MPI with attenuation correction at two tertiary care medical centers were prospectively reviewed independently by a technologist and cardiologist blinded to clinical and stress test data. Their decision on the need for rest imaging along with automated computer quantification of perfusion results was compared with the clinical reference standard of an assessment of perfusion images by a board-certified nuclear cardiologist that included clinical and stress test data. Results A total of 250 patients (mean age 61 years and 55% female) who underwent a stress-first MPI were studied. According to the clinical reference standard, 42 (16.8%) and 208 (83.2%) stress-first images were interpreted as “needing” and “not needing” rest images, respectively. The technologists correctly classified 229 (91.6%) stress-first images as either “needing” (n = 28) or “not needing” (n = 201) rest images. Their sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) were 66.7%, 96.6%, 80.0%, and 93.5%, respectively. An automated stress TPD score ≥1.2 was associated with optimal sensitivity and specificity and correctly classified 179 (71.6%) stress-first images as either “needing” (n = 31) or “not needing” (n = 148) rest images. Its sensitivity, specificity, PPV, and NPV were 73.8%, 71.2%, 34.1%, and 93.1%, respectively. In a model whereby the computer or technologist could correct for the other's incorrect classification, 242 (96.8%) stress-first images were correctly classified. 
The composite sensitivity, specificity, PPV, and NPV were 83.3%, 99.5%, 97.2%, and 96.7%, respectively. Conclusion Technologists and automated quantification software had a high degree of agreement with the clinical reference standard for determining the need for rest images in a stress-first imaging protocol. Utilizing an experienced technologist and automated systems to screen stress-first images could expand the use of stress-first MPI to sites where the cardiologist is not immediately available for interpretation. PMID:26566774
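
    The technologists' performance figures above can be reconstructed from the counts in the abstract: of 42 studies "needing" and 208 "not needing" rest images, 229 were classified correctly (28 needing + 201 not needing), implying 14 false negatives and 7 false positives.

```python
# Counts inferred from the abstract's totals and correct-classification figures.
tp, fn, tn, fp = 28, 14, 201, 7

sensitivity = tp / (tp + fn)   # 28/42  -> 66.7%
specificity = tn / (tn + fp)   # 201/208 -> 96.6%
ppv = tp / (tp + fp)           # 28/35  -> 80.0%
npv = tn / (tn + fn)           # 201/215 -> 93.5%
```

    These four values match the percentages quoted for the technologists, confirming the underlying confusion matrix.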

  19. Parallel Reaction Monitoring: A Targeted Experiment Performed Using High Resolution and High Mass Accuracy Mass Spectrometry

    PubMed Central

    Rauniyar, Navin

    2015-01-01

    The parallel reaction monitoring (PRM) assay has emerged as an alternative method of targeted quantification. The PRM assay is performed in a high resolution and high mass accuracy mode on a mass spectrometer. This review presents the features that make PRM a highly specific and selective method for targeted quantification using quadrupole-Orbitrap hybrid instruments. In addition, this review discusses the label-based and label-free methods of quantification that can be performed with the targeted approach. PMID:26633379

  20. Surface Enhanced Raman Spectroscopy (SERS) methods for endpoint and real-time quantification of miRNA assays

    NASA Astrophysics Data System (ADS)

    Restaino, Stephen M.; White, Ian M.

    2017-03-01

    Surface Enhanced Raman spectroscopy (SERS) provides significant improvements over conventional methods for single- and multi-analyte quantification. Specifically, the spectroscopic fingerprint provided by Raman scattering allows for a direct multiplexing potential far beyond that of fluorescence and colorimetry. Additionally, SERS carries a comparatively low financial and spatial footprint compared with common fluorescence-based systems. Despite these advantages, SERS has remained largely an academic pursuit. In the field of biosensing, techniques to apply SERS to molecular diagnostics are constantly under development, but most often assay protocols are redesigned around the use of SERS as a quantification method, which ultimately complicates existing protocols. Our group has sought to rethink common SERS methodologies in order to produce translational technologies capable of allowing SERS to compete in the evolving, yet often inflexible, biosensing field. This work will discuss the development of two techniques for quantification of microRNA, a promising biomarker for homeostatic and disease conditions ranging from cancer to HIV. First, an inkjet-printed paper SERS sensor has been developed to allow on-demand production of a customizable and multiplexable single-step lateral flow assay for miRNA quantification. Second, because miRNAs commonly exist at relatively low concentrations, amplification methods (e.g., PCR) are required to facilitate quantification. This work presents a novel miRNA assay alongside a novel technique for quantification of nuclease-driven nucleic acid amplification strategies that will allow SERS to be used directly with common amplification strategies for quantification of miRNA and other nucleic acid biomarkers.

  1. Quantification of maltol in Korean ginseng (Panax ginseng) products by high-performance liquid chromatography-diode array detector

    PubMed Central

    Jeong, Hyun Cheol; Hong, Hee-Do; Kim, Young-Chan; Rhee, Young Kyoung; Choi, Sang Yoon; Kim, Kyung-Tack; Kim, Sung Soo; Lee, Young-Chul; Cho, Chang-Won

    2015-01-01

    Background: Maltol, a phenolic compound, is produced by the browning reaction during the high-temperature treatment of ginseng. Thus, maltol can be used as a marker for the quality control of various ginseng products manufactured by high-temperature treatment, including red ginseng. For the quantification of maltol in Korean ginseng products, an effective high-performance liquid chromatography-diode array detector (HPLC-DAD) method was developed. Materials and Methods: The HPLC-DAD method for maltol quantification, coupled with a liquid-liquid extraction (LLE) method, was developed and validated in terms of linearity, precision, and accuracy. The HPLC separation was performed on a C18 column. Results: The LLE methods and HPLC running conditions for maltol quantification were optimized. The calibration curve of maltol exhibited good linearity (R2 = 1.00). The limit of detection of maltol was 0.26 μg/mL, and the limit of quantification was 0.79 μg/mL. The relative standard deviations (RSDs) of the intra- and inter-day experiments were <1.27% and 0.61%, respectively. The results of the recovery test were 101.35–101.75% with an RSD value of 0.21–1.65%. The developed method was applied successfully to quantify the maltol in three ginseng products manufactured by different methods. Conclusion: The results of validation demonstrated that the proposed HPLC-DAD method is useful for the quantification of maltol in various ginseng products. PMID:26246746
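
    A calibration-curve validation of this kind can be sketched numerically. The standard concentrations and peak areas below are invented, and the LOD/LOQ formulas use the common ICH-style 3.3σ/S and 10σ/S convention, which is an assumption (the paper may have used a different one):

```python
import numpy as np

# Hypothetical calibration data: maltol standards (ug/mL) vs. HPLC-DAD peak areas.
conc = np.array([1.0, 5.0, 10.0, 25.0, 50.0])
area = np.array([12.1, 60.4, 119.8, 301.0, 601.5])

slope, intercept = np.polyfit(conc, area, 1)     # linear calibration fit
r = np.corrcoef(conc, area)[0, 1]                # linearity check
residuals = area - (slope * conc + intercept)
sd = residuals.std(ddof=2)                       # residual standard deviation

# ICH-style estimates (assumed convention):
lod = 3.3 * sd / slope
loq = 10.0 * sd / slope
```

    An unknown sample's concentration is then recovered as (peak area - intercept) / slope, provided it falls above the LOQ.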

  2. Artifacts Quantification of Metal Implants in MRI

    NASA Astrophysics Data System (ADS)

    Vrachnis, I. N.; Vlachopoulos, G. F.; Maris, T. G.; Costaridou, L. I.

    2017-11-01

    The presence of materials with different magnetic properties, such as metal implants, causes local distortion of the magnetic field, resulting in signal voids and pile-ups, i.e., susceptibility artifacts, in MRI. Quantitative and unbiased measurement of the artifact is a prerequisite for optimization of acquisition parameters. In this study an image-gradient-based segmentation method is proposed for susceptibility artifact quantification. The method captures abrupt signal alterations by calculation of the image gradient. The artifact is then quantified in terms of its extent, expressed as an image area percentage, by an automated cross-entropy thresholding method. The proposed method for artifact quantification was tested in phantoms containing two orthopedic implants with significantly different magnetic permeabilities. The method was compared against a method proposed in the literature, considered as a reference, demonstrating moderate to good correlation (Spearman's rho = 0.62 and 0.802 in the case of titanium and stainless steel implants, respectively). The automated character of the proposed quantification method seems promising for MRI acquisition parameter optimization.
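
    The gradient-then-threshold pipeline above can be sketched on a toy image. The synthetic image and the fixed threshold are assumptions; the paper's automated cross-entropy thresholding is replaced here by a hard-coded cutoff for simplicity:

```python
import numpy as np

# Toy "MR image": uniform background with a synthetic signal-void region
# standing in for a susceptibility artifact (all values invented).
img = np.full((64, 64), 100.0)
img[20:30, 20:30] = 0.0                      # signal void

gy, gx = np.gradient(img)                    # abrupt signal alterations
grad_mag = np.hypot(gx, gy)

# Fixed cutoff standing in for the paper's cross-entropy thresholding step:
mask = grad_mag > 10.0
artifact_pct = 100.0 * mask.sum() / mask.size   # extent as image area percentage
```

    Only the sharp void boundary survives the threshold, so the reported extent is a small percentage of the image area, as expected for a localized artifact.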

  3. Prediction of autosomal STR typing success in ancient and Second World War bone samples.

    PubMed

    Zupanič Pajnič, Irena; Zupanc, Tomaž; Balažic, Jože; Geršak, Živa Miriam; Stojković, Oliver; Skadrić, Ivan; Črešnar, Matija

    2017-03-01

    Human-specific quantitative PCR (qPCR) has been developed for forensic use in the last 10 years and is the preferred DNA quantification technique, since it is very accurate, sensitive, objective, time-effective and automatable. The amount of information that can be gleaned from a single quantification reaction using commercially available quantification kits has increased from the quantity of nuclear DNA to the amount of male DNA, the presence of inhibitors and, most recently, the degree of DNA degradation. In skeletal remains from disaster victims, missing persons and war conflict victims, the DNA is usually degraded. Therefore the new commercial qPCR kits able to assess the degree of degradation are potentially able to predict the success of downstream short tandem repeat (STR) typing. The goal of this study was to verify the quantification step using the PowerQuant kit with regard to its suitability as a screening method for autosomal STR typing success on ancient and Second World War (WWII) skeletal remains. We analysed 60 skeletons excavated from five archaeological sites and four WWII mass graves in Slovenia. The bones were cleaned, surface contamination was removed and the bones were ground to a powder. Genomic DNA was obtained from 0.5 g of bone powder after total demineralization. The DNA was purified using a Biorobot EZ1 device. Following PowerQuant quantification, DNA samples were subjected to autosomal STR amplification using the NGM kit. Up to 2.51 ng DNA/g of powder were extracted. No inhibition was detected in any of the bones analysed. 82% of the WWII bones gave full profiles, while 73% of the ancient bones gave profiles not suitable for interpretation. Four bone extracts yielded no detectable amplification or zero quantification results, and no profiles were obtained from any of them. Full or useful partial profiles were produced only from bone extracts in which the short autosomal (Auto) and long degradation (Deg) PowerQuant targets were both detected.
    It is concluded that STR typing of old bones after quantification with the PowerQuant kit should be performed only when both the Auto and Deg targets are detected simultaneously, irrespective of the [Auto]/[Deg] ratio. Prediction of STR typing success can be made according to successful amplification of the Deg fragment. The PowerQuant kit is capable of identifying bone DNA samples that will not yield useful STR profiles using the NGM kit, and it can be used as a predictor of autosomal STR typing success of bone extracts obtained from ancient and WWII skeletal remains. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
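
    The study's screening rule is simple enough to state as a predicate; the function name is a hypothetical stand-in, but the logic follows the conclusion above:

```python
def predict_str_typing_success(auto_detected: bool, deg_detected: bool) -> bool:
    """Screening rule from the study: attempt autosomal STR typing only when
    both the short (Auto) and long (Deg) PowerQuant targets amplify,
    irrespective of the [Auto]/[Deg] ratio."""
    return auto_detected and deg_detected
```

    A sample where only the short Auto target amplifies (Deg absent) is screened out, consistent with the finding that useful profiles came only from extracts with both targets detected.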

  4. Quantitative Proteomics via High Resolution MS Quantification: Capabilities and Limitations

    PubMed Central

    Higgs, Richard E.; Butler, Jon P.; Han, Bomie; Knierman, Michael D.

    2013-01-01

    Recent improvements in the mass accuracy and resolution of mass spectrometers have led to renewed interest in label-free quantification using data from the primary mass spectrum (MS1) acquired from data-dependent proteomics experiments. The capacity for higher specificity quantification of peptides from samples enriched for proteins of biological interest offers distinct advantages for hypothesis generating experiments relative to immunoassay detection methods or prespecified peptide ions measured by multiple reaction monitoring (MRM) approaches. Here we describe an evaluation of different methods to post-process peptide level quantification information to support protein level inference. We characterize the methods by examining their ability to recover a known dilution of a standard protein in background matrices of varying complexity. Additionally, the MS1 quantification results are compared to a standard, targeted, MRM approach on the same samples under equivalent instrument conditions. We show the existence of multiple peptides with MS1 quantification sensitivity similar to the best MRM peptides for each of the background matrices studied. Based on these results we provide recommendations on preferred approaches to leveraging quantitative measurements of multiple peptides to improve protein level inference. PMID:23710359

  5. Quantifying NMR relaxation correlation and exchange in articular cartilage with time domain analysis

    NASA Astrophysics Data System (ADS)

    Mailhiot, Sarah E.; Zong, Fangrong; Maneval, James E.; June, Ronald K.; Galvosas, Petrik; Seymour, Joseph D.

    2018-02-01

Measured nuclear magnetic resonance (NMR) transverse relaxation data in articular cartilage have been shown to be multi-exponential and correlated with the health of the tissue. The observed relaxation rates depend on experimental parameters such as solvent, data acquisition methods, data analysis methods, and alignment with the magnetic field. In this study, we show that diffusive exchange occurs in porcine articular cartilage and impacts the observed relaxation rates in T1-T2 correlation experiments. By using time-domain analysis of T2-T2 exchange spectroscopy, the diffusive exchange time can be quantified from measurements that use a single mixing time. The measured characteristic times for exchange are commensurate with T1 in this material and so impact the observed T1 behavior. The approach used here allows reliable quantification of NMR relaxation behavior in cartilage in the presence of diffusive fluid exchange between two environments.
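
    As a toy illustration of the multi-exponential relaxation underlying such measurements, the sketch below fits a noiseless synthetic two-pool T2 decay. All amplitudes and T2 values are hypothetical, and this is a plain biexponential fit, not the paper's time-domain exchange analysis:

```python
import numpy as np
from scipy.optimize import curve_fit

# Two-pool transverse relaxation: a fast- and a slow-relaxing water
# environment, as observed in cartilage (parameters hypothetical).
def biexp(t, a_fast, t2_fast, a_slow, t2_slow):
    return a_fast * np.exp(-t / t2_fast) + a_slow * np.exp(-t / t2_slow)

t = np.linspace(0.001, 0.4, 200)             # echo times in seconds
signal = biexp(t, 0.7, 0.025, 0.3, 0.150)    # noiseless synthetic decay

popt, _ = curve_fit(biexp, t, signal, p0=[0.5, 0.01, 0.5, 0.1])
t2s = sorted([popt[1], popt[3]])
print(f"recovered T2s: {t2s[0]*1e3:.1f} ms and {t2s[1]*1e3:.1f} ms")
```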

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steven Larson MD

This project, funded since 1986, serves as a core project for cancer research throughout MSKCC, producing key radiotracers as well as basic knowledge about the physics of radiation decay and imaging for nuclear medicine applications to cancer diagnosis and therapy. In recent years this research application has broadened to include experiments intended to lead to an improved understanding of cancer biology and to the discovery and testing of new cancer drugs. Advances in immune-based radiotargeting form the basis for this project. Both antibody- and cellular-based immune targeting methods have been explored. The multi-step targeting methodologies (MST) developed by NeoRex (Seattle, Washington) have been adapted for use with positron-emitting isotopes and PET, allowing the quantification and optimization of targeted delivery. In addition, novel methods for radiolabeling immune T-cells with PET tracers have advanced our ability to track these cells over prolonged periods of time.

  7. 1H NMR determination of beta-N-methylamino-L-alanine (L-BMAA) in environmental and biological samples.

    PubMed

    Moura, Sidnei; Ultramari, Mariah de Almeida; de Paula, Daniela Mendes Louzada; Yonamine, Mauricio; Pinto, Ernani

    2009-04-01

    A nuclear magnetic resonance (1H NMR) method for the determination of beta-N-methylamino-L-alanine (L-BMAA) in environmental aqueous samples was developed and validated. L-BMAA is a neurotoxic modified amino acid that can be produced by cyanobacteria in aqueous environments. This toxin was extracted from samples by means of solid-phase extraction (SPE) and identified and quantified by 1H NMR without further derivatization steps. The lower limit of quantification (LLOQ) was 5 microg/mL. Good inter and intra-assay precision was also observed (relative standard deviation <8.5%) with the use of 4-nitro-DL-phenylalanine as an internal standard (IS). This method of 1H NMR analysis is not time consuming and can be readily utilized to monitor L-BMAA and confirm its presence in environmental and biological samples.
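
    The internal-standard logic of such 1H NMR quantification is a standard qNMR calculation: integrated peak areas scale with concentration times the number of contributing protons, so ratioing against the internal standard cancels instrument factors. A minimal sketch, with hypothetical integrals, proton counts, and IS concentration:

```python
def nmr_concentration(area_analyte, area_is, n_h_analyte, n_h_is, conc_is):
    """Analyte concentration from 1H NMR integrals relative to an
    internal standard (IS); areas ~ concentration * proton count."""
    return (area_analyte / area_is) * (n_h_is / n_h_analyte) * conc_is

# Hypothetical integrals: an L-BMAA 3H signal vs. a 2H IS signal,
# with the IS spiked at 50 ug/mL
c = nmr_concentration(area_analyte=1.8, area_is=1.2,
                      n_h_analyte=3, n_h_is=2, conc_is=50.0)
print(f"L-BMAA ~ {c:.1f} ug/mL")
```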

  8. Final Technical Report on Quantifying Dependability Attributes of Software Based Safety Critical Instrumentation and Control Systems in Nuclear Power Plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smidts, Carol; Huang, Funqun; Li, Boyuan

With the current transition from analog to digital instrumentation and control systems in nuclear power plants, the number and variety of software-based systems have significantly increased. The sophisticated nature and increasing complexity of software make trust in these systems a significant challenge. The trust placed in a software system is typically termed software dependability. Software dependability analysis faces uncommon challenges because software systems' characteristics differ from those of hardware systems. The lack of systematic, science-based methods for quantifying the dependability attributes of software-based instrumentation and control systems in safety-critical applications has proved to be a significant inhibitor to the expanded use of modern digital technology in the nuclear industry. Dependability refers to the ability of a system to deliver a service that can be trusted. Dependability is commonly considered a general concept that encompasses different attributes, e.g., reliability, safety, security, availability and maintainability. Dependability research has progressed significantly over the last few decades. For example, various assessment models and/or design approaches have been proposed for software reliability, software availability and software maintainability. Advances have also been made to integrate multiple dependability attributes, e.g., integrating security with other dependability attributes, measuring availability and maintainability, modeling reliability and availability, quantifying reliability and security, exploring the dependencies between security and safety and developing integrated analysis models. However, there is still a lack of understanding of the dependencies between the various dependability attributes as a whole and of how such dependencies are formed.
To address the need for quantification and give a more objective basis to the review process -- therefore reducing regulatory uncertainty -- measures and methods are needed to assess dependability attributes early on, as well as throughout the life-cycle process of software development. In this research, extensive expert opinion elicitation is used to identify the measures and methods for assessing software dependability. Semi-structured questionnaires were designed to elicit expert knowledge. A new notation system, Causal Mechanism Graphing, was developed to extract and represent such knowledge. The Causal Mechanism Graphs were merged, thus, obtaining the consensus knowledge shared by the domain experts. In this report, we focus on how software contributes to dependability. However, software dependability is not discussed separately from the context of systems or socio-technical systems. Specifically, this report focuses on software dependability, reliability, safety, security, availability, and maintainability. Our research was conducted in the sequence of stages found below. Each stage is further examined in its corresponding chapter. Stage 1 (Chapter 2): Elicitation of causal maps describing the dependencies between dependability attributes. These causal maps were constructed using expert opinion elicitation. This chapter describes the expert opinion elicitation process, the questionnaire design, the causal map construction method and the causal maps obtained. Stage 2 (Chapter 3): Elicitation of the causal map describing the occurrence of the event of interest for each dependability attribute. The causal mechanisms for the “event of interest” were extracted for each of the software dependability attributes. The “event of interest” for a dependability attribute is generally considered to be the “attribute failure”, e.g. security failure. The extraction was based on the analysis of expert elicitation results obtained in Stage 1. 
Stage 3 (Chapter 4): Identification of relevant measurements. Measures for the "events of interest" and their causal mechanisms were obtained from expert opinion elicitation for each of the software dependability attributes. The measures extracted are presented in this chapter. Stage 4 (Chapter 5): Assessment of the coverage of the causal maps via measures. Coverage was assessed to determine whether the measures obtained were sufficient to quantify software dependability, and which measures are further required. Stage 5 (Chapter 6): Identification of "missing" measures and measurement approaches for concepts not covered. New measures, for concepts that had not been covered sufficiently as determined in Stage 4, were identified using supplementary expert opinion elicitation as well as literature reviews. Stage 6 (Chapter 7): Building of a detailed quantification model based on the causal maps and measurements obtained. The ability to derive such a quantification model shows that the causal models and measurements derived from the previous stages (Stages 1 to 5) can form the technical basis for developing dependability quantification models. Scope restrictions led us to prioritize this demonstration effort. The demonstration focused on a critical system, i.e., the reactor protection system. For this system, a ranking of the software dependability attributes by nuclear stakeholders was developed. As expected for this application, the stakeholder ranking identified safety as the most critical attribute to be quantified. A safety quantification model limited to the requirements phase of development was built. Two case studies were conducted for verification. A preliminary control gate for software safety at the requirements stage was proposed and applied to the first case study. The control gate allows a cost-effective selection of the duration of the requirements phase.

  9. Matrix suppression as a guideline for reliable quantification of peptides by matrix-assisted laser desorption ionization.

    PubMed

    Ahn, Sung Hee; Bae, Yong Jin; Moon, Jeong Hee; Kim, Myung Soo

    2013-09-17

We propose to divide matrix suppression in matrix-assisted laser desorption ionization into two parts, normal and anomalous. In the quantification of peptides, the normal effect can be accounted for by constructing the calibration curve in the form of peptide-to-matrix ion abundance ratio versus concentration. The anomalous effect precludes reliable quantification and becomes noticeable when matrix suppression is larger than 70%. With this 70% rule, matrix suppression becomes a guideline for reliable quantification rather than a nuisance. A peptide in a complex mixture can be quantified even in the presence of large amounts of contaminants, as long as matrix suppression is below 70%. The theoretical basis for the quantification method using a peptide as an internal standard is presented together with its weaknesses. A systematic method to improve quantification of high-concentration analytes has also been developed.
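
    The 70% guideline can be expressed directly. A minimal sketch, with hypothetical matrix ion abundances:

```python
def matrix_suppression(matrix_ion_alone, matrix_ion_with_analyte):
    """Fractional drop in matrix ion abundance caused by the analyte."""
    return 1.0 - matrix_ion_with_analyte / matrix_ion_alone

def reliable_for_quantification(suppression, threshold=0.70):
    """The 70% rule from the abstract: quantify only when suppression
    stays below the threshold."""
    return suppression < threshold

s = matrix_suppression(1.0e6, 4.0e5)   # hypothetical abundances
print(f"suppression = {s:.0%}, reliable: {reliable_for_quantification(s)}")
```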

  10. Detection and quantification of beef and pork materials in meat products by duplex droplet digital PCR.

    PubMed

    Cai, Yicun; He, Yuping; Lv, Rong; Chen, Hongchao; Wang, Qiang; Pan, Liangwen

    2017-01-01

Meat products often consist of meat from multiple animal species, and adulteration and mislabeling of food products can negatively affect consumers. Therefore, a cost-effective and reliable method for the identification and quantification of animal species in meat products is required. In this study, we developed a duplex droplet digital PCR (dddPCR) detection and quantification system to simultaneously identify and quantify the source of meat in samples containing a mixture of beef (Bos taurus) and pork (Sus scrofa) in a single digital PCR reaction tube. Mixed meat samples of known composition were used to test the accuracy and applicability of this method. The limit of detection (LOD) and the limit of quantification (LOQ) of this detection and quantification system were also determined. We conclude that our dddPCR detection and quantification system is suitable for quality control and routine analyses of meat products.
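
    Droplet digital PCR converts positive/negative droplet counts into absolute concentrations through Poisson statistics. A minimal sketch with invented droplet counts; the 0.85 nL droplet volume is a typical ddPCR value, not a figure from this paper:

```python
import math

DROPLET_VOLUME_UL = 0.00085   # ~0.85 nL per droplet (typical, assumed)

def copies_per_ul(positive, total, droplet_volume_ul=DROPLET_VOLUME_UL):
    """Poisson-corrected target concentration from droplet counts:
    lambda = -ln(fraction of negative droplets)."""
    lam = -math.log(1.0 - positive / total)
    return lam / droplet_volume_ul

# Hypothetical duplex counts from one reaction: beef and pork channels
beef = copies_per_ul(positive=4200, total=15000)
pork = copies_per_ul(positive=1100, total=15000)
print(f"beef fraction ~ {beef / (beef + pork):.1%}")
```

Note that the droplet volume cancels in the species fraction, so the mixture ratio is insensitive to that assumption.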

  11. Macular pigment optical density measurements: evaluation of a device using heterochromatic flicker photometry

    PubMed Central

    de Kinkelder, R; van der Veen, R L P; Verbaak, F D; Faber, D J; van Leeuwen, T G; Berendschot, T T J M

    2011-01-01

Purpose: Accurate assessment of the amount of macular pigment (MPOD) is necessary to investigate the role of carotenoids and their assumed protective functions. High repeatability and reliability are important to monitor patients in studies investigating the influence of diet and supplements on MPOD. We evaluated the Macuscope (Macuvision Europe Ltd., Lapworth, Solihull, UK), a recently introduced device for measuring MPOD using the technique of heterochromatic flicker photometry (HFP). We determined agreement with another HFP device (QuantifEye; MPS 9000 series: Tinsley Precision Instruments Ltd., Croydon, Essex, UK) and a fundus reflectance method. Methods: The right eyes of 23 healthy subjects (mean age 33.9±15.1 years) were measured. We determined agreement with QuantifEye and correlation with a fundus reflectance method. Repeatability of QuantifEye was assessed in 20 other healthy subjects (mean age 32.1±7.3 years). Repeatability was also compared with measurements by a fundus reflectance method in 10 subjects. Results: We found low agreement between test and retest measurements with Macuscope. The average difference and the limits of agreement were −0.041±0.32. We found high agreement between test and retest measurements of QuantifEye (−0.02±0.18) and the fundus reflectance method (−0.04±0.18). MPOD data obtained by Macuscope and QuantifEye showed poor agreement: −0.017±0.44. For Macuscope and the fundus reflectance method, the correlation coefficient was r=0.05 (P=0.83). A significant correlation of r=0.87 (P<0.001) was found between QuantifEye and the fundus reflectance method. Conclusions: Because repeatability of Macuscope measurements was low (i.e., wide limits of agreement) and MPOD values correlated poorly with the fundus reflectance method, and agreed poorly with QuantifEye, the tested Macuscope protocol seems less suitable for studying MPOD. PMID:21057522
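
    The "average difference and limits of agreement" figures reported above are Bland-Altman statistics. A minimal sketch with invented test/retest MPOD readings:

```python
from statistics import mean, stdev

def bland_altman(method_a, method_b):
    """Mean difference (bias) and 95% limits of agreement between two
    sets of paired measurements (bias +/- 1.96 * SD of differences)."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    d, s = mean(diffs), stdev(diffs)
    return d, (d - 1.96 * s, d + 1.96 * s)

# Hypothetical test/retest MPOD readings for six subjects
test1 = [0.41, 0.35, 0.52, 0.28, 0.44, 0.39]
test2 = [0.39, 0.38, 0.49, 0.30, 0.47, 0.36]
bias, (lo, hi) = bland_altman(test1, test2)
print(f"bias {bias:+.3f}, limits of agreement [{lo:+.3f}, {hi:+.3f}]")
```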

  12. Chiral EFT based nuclear forces: achievements and challenges

    NASA Astrophysics Data System (ADS)

    Machleidt, R.; Sammarruca, F.

    2016-08-01

    During the past two decades, chiral effective field theory has become a popular tool to derive nuclear forces from first principles. Two-nucleon interactions have been worked out up to sixth order of chiral perturbation theory and three-nucleon forces up to fifth order. Applications of some of these forces have been conducted in nuclear few- and many-body systems—with a certain degree of success. But in spite of these achievements, we are still faced with great challenges. Among them is the issue of a proper uncertainty quantification of predictions obtained when applying these forces in ab initio calculations of nuclear structure and reactions. A related problem is the order by order convergence of the chiral expansion. We start this review with a pedagogical introduction and then present the current status of the field of chiral nuclear forces. This is followed by a discussion of representative examples for the application of chiral two- and three-body forces in the nuclear many-body system including convergence issues.

  13. Fluorescent quantification of melanin.

    PubMed

    Fernandes, Bruno; Matamá, Teresa; Guimarães, Diana; Gomes, Andreia; Cavaco-Paulo, Artur

    2016-11-01

    Melanin quantification is reportedly performed by absorption spectroscopy, commonly at 405 nm. Here, we propose the implementation of fluorescence spectroscopy for melanin assessment. In a typical in vitro assay to assess melanin production in response to an external stimulus, absorption spectroscopy clearly overvalues melanin content. This method is also incapable of distinguishing non-melanotic/amelanotic control cells from those that are actually capable of performing melanogenesis. Therefore, fluorescence spectroscopy is the best method for melanin quantification as it proved to be highly specific and accurate, detecting even small variations in the synthesis of melanin. This method can also be applied to the quantification of melanin in more complex biological matrices like zebrafish embryos and human hair. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  14. Methods for quantification of soil-transmitted helminths in environmental media: current techniques and recent advances

    PubMed Central

    Collender, Philip A.; Kirby, Amy E.; Addiss, David G.; Freeman, Matthew C.; Remais, Justin V.

    2015-01-01

    Limiting the environmental transmission of soil-transmitted helminths (STH), which infect 1.5 billion people worldwide, will require sensitive, reliable, and cost effective methods to detect and quantify STH in the environment. We review the state of the art of STH quantification in soil, biosolids, water, produce, and vegetation with respect to four major methodological issues: environmental sampling; recovery of STH from environmental matrices; quantification of recovered STH; and viability assessment of STH ova. We conclude that methods for sampling and recovering STH require substantial advances to provide reliable measurements for STH control. Recent innovations in the use of automated image identification and developments in molecular genetic assays offer considerable promise for improving quantification and viability assessment. PMID:26440788

  15. Creating NDA working standards through high-fidelity spent fuel modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skutnik, Steven E; Gauld, Ian C; Romano, Catherine E

    2012-01-01

The Next Generation Safeguards Initiative (NGSI) is developing advanced non-destructive assay (NDA) techniques for spent nuclear fuel assemblies to advance the state-of-the-art in safeguards measurements. These measurements aim beyond the capabilities of existing methods to include the evaluation of plutonium and fissile material inventory, independent of operator declarations. Testing and evaluation of advanced NDA performance will require reference assemblies with well-characterized compositions to serve as working standards against which the NDA methods can be benchmarked and for uncertainty quantification. To support the development of standards for the NGSI spent fuel NDA project, high-fidelity modeling of irradiated fuel assemblies is being performed to characterize fuel compositions and radiation emission data. The assembly depletion simulations apply detailed operating history information and core simulation data as it is available to perform high fidelity axial and pin-by-pin fuel characterization for more than 1600 nuclides. The resulting pin-by-pin isotopic inventories are used to optimize the NDA measurements and provide information necessary to unfold and interpret the measurement data, e.g., passive gamma emitters, neutron emitters, neutron absorbers, and fissile content. A key requirement of this study is the analysis of uncertainties associated with the calculated compositions and signatures for the standard assemblies; uncertainties introduced by the calculation methods, nuclear data, and operating information. An integral part of this assessment involves the application of experimental data from destructive radiochemical assay to assess the uncertainty and bias in computed inventories, the impact of parameters such as assembly burnup gradients and burnable poisons, and the influence of neighboring assemblies on periphery rods.
This paper will present the results of high fidelity assembly depletion modeling and uncertainty analysis from independent calculations performed using SCALE and MCNP. This work is supported by the Next Generation Safeguards Initiative, Office of Nuclear Safeguards and Security, National Nuclear Security Administration.« less

  16. An accurate proteomic quantification method: fluorescence labeling absolute quantification (FLAQ) using multidimensional liquid chromatography and tandem mass spectrometry.

    PubMed

    Liu, Junyan; Liu, Yang; Gao, Mingxia; Zhang, Xiangmin

    2012-08-01

A facile proteomic quantification method, fluorescence labeling absolute quantification (FLAQ), was developed. Instead of using MS for quantification, FLAQ performs chromatography-based quantification combined with MS for identification. Multidimensional liquid chromatography (MDLC) with high-accuracy laser-induced fluorescence (LIF) detection and a tandem MS system were employed for FLAQ. Several requirements must be met for fluorescent labeling in MS identification: complete labeling, minimal side-reactions, simple MS spectra, and no extra tandem MS fragmentation complicating structure elucidation. A fluorescent dye, 5-iodoacetamidofluorescein, was chosen to label proteins on all cysteine residues. The dye was compatible with trypsin digestion and MALDI MS identification. Quantitative labeling was achieved by optimizing the reaction conditions. A synthesized peptide and the model proteins BSA (35 cysteines) and OVA (five cysteines) were used to verify the completeness of labeling. Proteins were separated by MDLC and quantified based on fluorescence intensities, followed by MS identification. High accuracy (RSD < 1.58%) and a wide linear quantification range (1-10^5) were achieved by LIF detection. The limit of quantitation for the model protein was as low as 0.34 amol. A portion of the proteins in the human liver proteome was quantified to demonstrate FLAQ. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Analysis of short-chain fatty acids in human feces: A scoping review.

    PubMed

    Primec, Maša; Mičetić-Turk, Dušanka; Langerholc, Tomaž

    2017-06-01

Short-chain fatty acids (SCFAs) play a crucial role in maintaining homeostasis in humans; consequently, the importance of good and reliable analytical detection of SCFAs has grown considerably in the past few years. The aim of this scoping review is to show the trends in the development of different methods for SCFA analysis in feces, based on the literature published in the last eleven years in all major indexing databases. The search criteria included analytical quantification techniques for SCFAs in different human clinical and in vivo studies. SCFA analysis is still predominantly performed using gas chromatography (GC), followed by high-performance liquid chromatography (HPLC), nuclear magnetic resonance (NMR) and capillary electrophoresis (CE). The performance, drawbacks and advantages of these methods are discussed, especially in light of choosing a proper pretreatment, as feces is a complex biological material. Further optimization to develop a simple, cost-effective and robust method for routine use is needed. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. Integration of Dakota into the NEAMS Workbench

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swiler, Laura Painton; Lefebvre, Robert A.; Langley, Brandon R.

    2017-07-01

This report summarizes a NEAMS (Nuclear Energy Advanced Modeling and Simulation) project focused on integrating Dakota into the NEAMS Workbench. The NEAMS Workbench, developed at Oak Ridge National Laboratory, is a new software framework that provides a graphical user interface, input file creation, parsing, validation, job execution, workflow management, and output processing for a variety of nuclear codes. Dakota is a tool developed at Sandia National Laboratories that provides a suite of uncertainty quantification and optimization algorithms. Providing Dakota within the NEAMS Workbench allows users of nuclear simulation codes to perform uncertainty and optimization studies on their nuclear codes from within a common, integrated environment. Details of the integration and parsing are provided, along with an example of Dakota running a sampling study on the fuels performance code, BISON, from within the NEAMS Workbench.
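
    In the spirit of the sampling study mentioned above, the sketch below propagates input uncertainty through a model by simple Monte Carlo sampling. The model function and parameter distributions are invented for illustration; Dakota itself offers far richer algorithms (LHS, polynomial chaos, optimization, etc.):

```python
import random

def fuel_temp_model(conductivity, power):
    """Stand-in for an expensive simulation (e.g. a fuels code);
    this algebraic form is purely illustrative."""
    return 500.0 + 120.0 * power / conductivity

random.seed(0)
samples = []
for _ in range(10000):
    k = random.gauss(4.0, 0.2)    # thermal conductivity, hypothetical
    q = random.gauss(20.0, 1.0)   # linear power, hypothetical
    samples.append(fuel_temp_model(k, q))

mean = sum(samples) / len(samples)
std = (sum((x - mean) ** 2 for x in samples) / (len(samples) - 1)) ** 0.5
print(f"output mean {mean:.1f}, std {std:.1f}")
```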

  19. Validation of Non-Invasive Tracer Kinetic Analysis of 18F-Florbetaben PET Using a Dual Time-Window Acquisition Protocol.

    PubMed

    Bullich, Santiago; Barthel, Henryk; Koglin, Norman; Becker, Georg A; De Santi, Susan; Jovalekic, Aleksandar; Stephens, Andrew W; Sabri, Osama

    2017-11-24

Accurate amyloid PET quantification is necessary for monitoring amyloid-beta accumulation and response to therapy. Currently, most studies are analyzed using the static standardized uptake value ratio (SUVR) approach because of its simplicity. However, this approach may be influenced by changes in cerebral blood flow (CBF) or radiotracer clearance. Full tracer kinetic models require arterial blood sampling and dynamic image acquisition. The objectives of this work were: (1) to validate a non-invasive kinetic modeling approach for 18F-florbetaben PET using an acquisition protocol offering the best compromise between quantification accuracy and simplicity, and (2) to assess the impact of CBF changes and radiotracer clearance on SUVRs and non-invasive kinetic modeling data in 18F-florbetaben PET. Methods: Data from twenty subjects (10 patients with probable Alzheimer's dementia and 10 healthy volunteers) were used to compare the binding potential (BP_ND) obtained from the full kinetic analysis to the SUVR and to non-invasive tracer kinetic methods (simplified reference tissue model (SRTM) and multilinear reference tissue model 2 (MRTM2)). Different approaches using shortened or interrupted acquisitions were compared to the results of the full acquisition (0-140 min). Simulations were carried out to assess the effect of CBF and radiotracer clearance changes on SUVRs and non-invasive kinetic modeling outputs. Results: A 0-30 plus 120-140 min dual time-window acquisition protocol with appropriate interpolation of the missing time points provided the best compromise between patient comfort and quantification accuracy. Excellent agreement was found between BP_ND obtained using the full and dual time-window (2TW) acquisition protocols (BP_ND,2TW = 0.01 + 1.00·BP_ND,FULL, R² = 0.97 (MRTM2); BP_ND,2TW = 0.05 + 0.92·BP_ND,FULL, R² = 0.93 (SRTM)). Simulations showed a limited impact of CBF and radiotracer clearance changes on MRTM parameters and SUVRs.
Conclusion: This study demonstrates accurate non-invasive kinetic modeling of 18F-florbetaben PET data using a dual time-window acquisition protocol, providing a good compromise between quantification accuracy, scan duration, and patient burden. The influence of CBF and radiotracer clearance changes on amyloid-beta load estimates was small. For most clinical research applications, the SUVR approach is appropriate. However, for longitudinal studies in which maximum quantification accuracy is desired, this non-invasive dual time-window acquisition protocol and kinetic analysis are recommended. Copyright © 2017 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
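
    The SUVR measure discussed above is a simple ratio of regional uptakes. The sketch below also shows the idealized relation BP_ND ≈ DVR - 1, which holds only at true equilibrium (the abstract's point is precisely that the static SUVR can deviate from kinetic estimates); the uptake values are hypothetical:

```python
def suvr(target, reference):
    """Static SUVR: mean target-region uptake divided by uptake in a
    reference region (e.g. cerebellar cortex)."""
    return target / reference

def bp_nd_from_dvr(dvr):
    """Non-displaceable binding potential from a distribution volume
    ratio. At true equilibrium SUVR approximates DVR, so BP_ND ~ SUVR - 1;
    this idealization is what SRTM/MRTM2 on dynamic data improve upon."""
    return dvr - 1.0

s = suvr(target=1.85, reference=1.00)   # hypothetical late-frame uptakes
print(f"SUVR = {s:.2f}, approx BP_ND = {bp_nd_from_dvr(s):.2f}")
```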

  20. 1,1-dimethylhydrazine as a high purity nitrogen source for MOVPE-water reduction and quantification using nuclear magnetic resonance, gas chromatography-atomic emission detection spectroscopy and cryogenic-mass spectroscopy analytical techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Odedra, R.; Smith, L.M.; Rushworth, S.A.

    2000-01-01

Hydrazine derivatives are attractive low-temperature nitrogen sources for use in MOVPE due to their low thermal stability. However, their purification and subsequent analysis have not previously been investigated in depth for this application. A detailed study of 1,1-dimethylhydrazine [(CH3)2N-NH2] purified by eight different methods, and the subsequent quantitative measurement of water present in the samples obtained, is reported here. A correlation between 1H nuclear magnetic resonance spectroscopy (NMR), gas chromatography-atomic emission detection (GC-AED) and cryogenic mass spectroscopy (Cryogenic-MS) has been performed. All three analysis techniques can be used to measure water in the samples, and with the best purification the water content can be lowered well below 100 ppm. The high purity of this material has been demonstrated by growth results and the state-of-the-art performance of laser diodes.

  1. A SIMPLE METHOD FOR THE EXTRACTION AND QUANTIFICATION OF PHOTOPIGMENTS FROM SYMBIODINIUM SPP.

    EPA Science Inventory

    John E. Rogers and Dragoslav Marcovich. Submitted. Simple Method for the Extraction and Quantification of Photopigments from Symbiodinium spp.. Limnol. Oceanogr. Methods. 19 p. (ERL,GB 1192).

    We have developed a simple, mild extraction procedure using methanol which, when...

  2. Simultaneous digital quantification and fluorescence-based size characterization of massively parallel sequencing libraries.

    PubMed

    Laurie, Matthew T; Bertout, Jessica A; Taylor, Sean D; Burton, Joshua N; Shendure, Jay A; Bielas, Jason H

    2013-08-01

    Due to the high cost of failed runs and suboptimal data yields, quantification and determination of fragment size range are crucial steps in the library preparation process for massively parallel sequencing (or next-generation sequencing). Current library quality control methods commonly involve quantification using real-time quantitative PCR and size determination using gel or capillary electrophoresis. These methods are laborious and subject to a number of significant limitations that can make library calibration unreliable. Herein, we propose and test an alternative method for quality control of sequencing libraries using droplet digital PCR (ddPCR). By exploiting a correlation we have discovered between droplet fluorescence and amplicon size, we achieve the joint quantification and size determination of target DNA with a single ddPCR assay. We demonstrate the accuracy and precision of applying this method to the preparation of sequencing libraries.

  3. freeQuant: A Mass Spectrometry Label-Free Quantification Software Tool for Complex Proteome Analysis.

    PubMed

    Deng, Ning; Li, Zhenye; Pan, Chao; Duan, Huilong

    2015-01-01

The study of complex proteomes places greater demands on quantification methods based on mass spectrometry. In this paper, we present a mass spectrometry label-free quantification tool for complex proteomes, called freeQuant, which effectively integrates quantification with functional analysis. freeQuant consists of two well-integrated modules: label-free quantification, and functional analysis with biomedical knowledge. freeQuant supports label-free quantitative analysis that makes full use of tandem mass spectrometry (MS/MS) spectral counts, protein sequence length, shared peptides, and ion intensity. It adopts spectral counting for quantitative analysis and introduces a new method for shared peptides to accurately evaluate the abundance of isoforms. For proteins with low abundance, MS/MS total ion count coupled with spectral count is included to ensure accurate protein quantification. Furthermore, freeQuant supports large-scale functional annotation for complex proteomes. Mitochondrial proteomes from the mouse heart, the mouse liver, and the human heart were used to evaluate the usability and performance of freeQuant. The evaluation showed that the quantitative algorithms implemented in freeQuant can improve quantification accuracy with a better dynamic range.
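
    The spectral-count side of such quantification can be sketched with a normalized spectral abundance factor (NSAF)-style calculation: counts divided by protein length, normalized across the sample. Counts and lengths below are hypothetical, and this is only the core idea, not freeQuant's full algorithm (which additionally handles shared peptides and ion intensities):

```python
def nsaf(spectral_counts, lengths):
    """Normalized spectral abundance factor per protein:
    (spectral count / sequence length), rescaled to sum to 1."""
    saf = {p: spectral_counts[p] / lengths[p] for p in spectral_counts}
    total = sum(saf.values())
    return {p: v / total for p, v in saf.items()}

counts = {"P1": 60, "P2": 10, "P3": 30}       # hypothetical MS/MS counts
lengths = {"P1": 400, "P2": 100, "P3": 500}   # sequence lengths (residues)
abundances = nsaf(counts, lengths)
print(abundances)
```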

  4. Comparative quantification of dietary supplemented neural creatine concentrations with (1)H-MRS peak fitting and basis spectrum methods.

    PubMed

    Turner, Clare E; Russell, Bruce R; Gant, Nicholas

    2015-11-01

Magnetic resonance spectroscopy (MRS) is an analytical procedure that can be used to non-invasively measure the concentration of a range of neural metabolites. Creatine is an important neurometabolite, with dietary supplementation offering therapeutic potential for neurological disorders with dysfunctional energetic processes. Neural creatine concentrations can be probed using proton MRS and quantified using a range of software packages based on different analytical methods. This experiment examines the differences in quantification performance of two commonly used analysis packages following a creatine supplementation strategy with potential therapeutic application. Human participants followed a seven-day dietary supplementation regime in a placebo-controlled, cross-over design interspersed with a five-week wash-out period. Spectroscopy data were acquired the day immediately following supplementation and analyzed with two commonly used software packages that employ vastly different quantification methods. Results demonstrate that neural creatine concentration was augmented following creatine supplementation when analyzed using the peak-fitting method of quantification (105.9%±10.1). In contrast, no change in neural creatine levels was detected with supplementation when analysis was conducted using the basis-spectrum method of quantification (102.6%±8.6). Results suggest that software packages that employ the peak-fitting procedure for spectral quantification are possibly more sensitive to subtle changes in neural creatine concentrations. The relative simplicity of the spectroscopy sequence and the data analysis procedure suggests that peak-fitting procedures may be the most effective means of metabolite quantification when detection of subtle alterations in neural metabolites is necessary. This straightforward technique can be used on a clinical magnetic resonance imaging system. Copyright © 2015 Elsevier Inc. All rights reserved.
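
    The peak-fitting idea can be illustrated by fitting a single Lorentzian line to a synthetic, noiseless spectrum and reading off its area; real MRS fitting additionally handles baselines, phase, and overlapping resonances, and all values below are hypothetical:

```python
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(f, area, f0, fwhm):
    """Single resonance line; its integral over f equals `area`."""
    hwhm = fwhm / 2.0
    return (area / np.pi) * hwhm / ((f - f0) ** 2 + hwhm ** 2)

# Synthetic creatine-CH3 peak near 3.03 ppm (all values hypothetical)
f = np.linspace(2.8, 3.3, 500)
spectrum = lorentzian(f, area=2.0, f0=3.03, fwhm=0.04)

popt, _ = curve_fit(lorentzian, f, spectrum, p0=[1.0, 3.0, 0.05])
print(f"fitted peak area: {popt[0]:.3f} (true value 2.0)")
```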

  5. Methods for Quantification of Soil-Transmitted Helminths in Environmental Media: Current Techniques and Recent Advances.

    PubMed

    Collender, Philip A; Kirby, Amy E; Addiss, David G; Freeman, Matthew C; Remais, Justin V

    2015-12-01

    Limiting the environmental transmission of soil-transmitted helminths (STHs), which infect 1.5 billion people worldwide, will require sensitive, reliable, and cost-effective methods to detect and quantify STHs in the environment. We review the state-of-the-art of STH quantification in soil, biosolids, water, produce, and vegetation with regard to four major methodological issues: environmental sampling; recovery of STHs from environmental matrices; quantification of recovered STHs; and viability assessment of STH ova. We conclude that methods for sampling and recovering STHs require substantial advances to provide reliable measurements for STH control. Recent innovations in the use of automated image identification and developments in molecular genetic assays offer considerable promise for improving quantification and viability assessment. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. A new era for Nuclear Medicine neuroimaging in Spain: Where do we start from in Spain?

    PubMed

    Balsa, M A; Camacho, V; Garrastachu, P; García-Solís, D; Gómez-Río, M; Rubí, S; Setoain, X; Arbizu, J

    To determine the status of Nuclear Medicine neuroimaging studies in Spain during 2013 and the first quarter of 2014, in order to define the activities of the neuroimaging group of the Spanish Society of Nuclear Medicine and Molecular Imaging (SEMNIM). A questionnaire of 14 questions was designed, divided into 3 parts: characteristics of the departments (equipment and professionals involved); type of scans and clinical indications; and evaluation methods. The questionnaire was sent to 166 Nuclear Medicine departments. A total of 54 departments distributed among all regions completed the questionnaire. Most departments performed between 300 and 800 neuroimaging examinations per year, representing more than 25 scans per month. Departments had an average of three pieces of equipment; half had a PET/CT scanner and SPECT/CT equipment. The most frequently performed scans were brain SPECT with 123I-FP-CIT, followed by brain perfusion SPECT and PET with 18F-FDG. The most frequent clinical indications were cognitive impairment, followed by movement disorders. Most sites evaluated images by visual assessment alone; among quantitative approaches, region-of-interest quantification was the most widely used. These results reflect the clinical activity of 2013 and the first quarter of 2014. The main indications of the studies were cognitive impairment and movement disorders. Variability in the evaluation of the studies is among the challenges to be faced in the coming years. Copyright © 2017 Elsevier España, S.L.U. y SEMNIM. All rights reserved.

  7. Review of analytical methods for the quantification of iodine in complex matrices.

    PubMed

    Shelor, C Phillip; Dasgupta, Purnendu K

    2011-09-19

    Iodine is an essential element of human nutrition. Nearly a third of the global population has insufficient iodine intake and is at risk of developing Iodine Deficiency Disorders (IDD). Most countries have iodine supplementation and monitoring programs. Urinary iodide (UI) is the biomarker used for epidemiological studies; only a few methods are currently used routinely for analysis. These methods either require expensive instrumentation with qualified personnel (inductively coupled plasma-mass spectrometry, instrumental nuclear activation analysis) or oxidative sample digestion to remove potential interferences prior to analysis by a kinetic colorimetric method originally introduced by Sandell and Kolthoff ~75 years ago. The Sandell-Kolthoff (S-K) method is based on the catalytic effect of iodide on the reaction between Ce(IV) and As(III). No available technique fully fits the needs of developing countries; research into inexpensive, reliable methods and instrumentation is needed. There have been multiple reviews of methods used for epidemiological studies and specific techniques. However, a general review of iodine determination in a wide-ranging set of complex matrices is not available. While this review is not comprehensive, we cover the principal developments since the original development of the S-K method. Copyright © 2011 Elsevier B.V. All rights reserved.

  8. Quantification of mixed chimerism by real time PCR on whole blood-impregnated FTA cards.

    PubMed

    Pezzoli, N; Silvy, M; Woronko, A; Le Treut, T; Lévy-Mozziconacci, A; Reviron, D; Gabert, J; Picard, C

    2007-09-01

    This study investigated the quantification of chimerism in sex-mismatched transplantations by quantitative real-time PCR (RQ-PCR) using FTA paper for blood sampling. First, we demonstrate that the quantification of DNA from EDTA-blood deposited on FTA cards is accurate and reproducible. Second, we show that the fraction of recipient cells detected by RQ-PCR was concordant between the FTA method and the salting-out method, the reference DNA extraction method. Furthermore, the sensitivity of detection of recipient cells was similar between the two methods. Our results show that this innovative method can be used for MC assessment by RQ-PCR.

  9. Direct quantification of lipopeptide biosurfactants in biological samples via HPLC and UPLC-MS requires sample modification with an organic solvent.

    PubMed

    Biniarz, Piotr; Łukaszewicz, Marcin

    2017-06-01

    The rapid and accurate quantification of biosurfactants in biological samples is challenging. In contrast to the orcinol method for rhamnolipids, no simple biochemical method is available for the rapid quantification of lipopeptides. Various liquid chromatography (LC) methods are promising tools for relatively fast and exact quantification of lipopeptides. Here, we report strategies for the quantification of the lipopeptides pseudofactin and surfactin in bacterial cultures using different high- (HPLC) and ultra-performance liquid chromatography (UPLC) systems. We tested three strategies for sample pretreatment prior to LC analysis. In direct analysis (DA), bacterial cultures were injected directly and analyzed via LC. As a modification, we diluted the samples with methanol and detected an increase in lipopeptide recovery in the presence of methanol. Therefore, we suggest this simple modification as a tool for increasing the accuracy of LC methods. We also tested freeze-drying followed by solvent extraction (FDSE) as an alternative for the analysis of "heavy" samples. In FDSE, the bacterial cultures were freeze-dried, and the resulting powder was extracted with different solvents. Then, the organic extracts were analyzed via LC. Here, we determined the influence of the extracting solvent on lipopeptide recovery. HPLC methods allowed us to quantify pseudofactin and surfactin with run times of 15 and 20 min per sample, respectively, whereas UPLC quantification was as fast as 4 and 5.5 min per sample, respectively. Our methods provide highly accurate measurements and high recovery levels for lipopeptides. At the same time, UPLC-MS provides the possibility to identify lipopeptides and their structural isoforms.

  10. Data-Independent MS/MS Quantification of Neuropeptides for Determination of Putative Feeding-Related Neurohormones in Microdialysate

    PubMed Central

    2015-01-01

    Food consumption is an important behavior that is regulated by an intricate array of neuropeptides (NPs). Although many feeding-related NPs have been identified in mammals, precise mechanisms are unclear and difficult to study in mammals, as current methods are not highly multiplexed and require extensive a priori knowledge about analytes. New advances in data-independent acquisition (DIA) MS/MS and the open-source quantification software Skyline have opened up the possibility to identify hundreds of compounds and quantify them from a single DIA MS/MS run. An untargeted DIA MSE quantification method using Skyline software for multiplexed, discovery-driven quantification was developed and found to produce linear calibration curves for peptides at physiologically relevant concentrations using a protein digest as internal standard. By using this method, preliminary relative quantification of the crab Cancer borealis neuropeptidome (<2 kDa, 137 peptides from 18 families) was possible in microdialysates from 8 replicate feeding experiments. Of these NPs, 55 were detected with an average mass error below 10 ppm. The time-resolved profiles of relative concentration changes for 6 are shown, and there is great potential for the use of this method in future experiments to aid in correlation of NP changes with behavior. This work presents an unbiased approach to winnowing candidate NPs related to a behavior of interest in a functionally relevant manner, and demonstrates the success of such a UPLC-MSE quantification method using the open source software Skyline. PMID:25552291

  11. Data-independent MS/MS quantification of neuropeptides for determination of putative feeding-related neurohormones in microdialysate.

    PubMed

    Schmerberg, Claire M; Liang, Zhidan; Li, Lingjun

    2015-01-21

    Food consumption is an important behavior that is regulated by an intricate array of neuropeptides (NPs). Although many feeding-related NPs have been identified in mammals, precise mechanisms are unclear and difficult to study in mammals, as current methods are not highly multiplexed and require extensive a priori knowledge about analytes. New advances in data-independent acquisition (DIA) MS/MS and the open-source quantification software Skyline have opened up the possibility to identify hundreds of compounds and quantify them from a single DIA MS/MS run. An untargeted DIA MS(E) quantification method using Skyline software for multiplexed, discovery-driven quantification was developed and found to produce linear calibration curves for peptides at physiologically relevant concentrations using a protein digest as internal standard. By using this method, preliminary relative quantification of the crab Cancer borealis neuropeptidome (<2 kDa, 137 peptides from 18 families) was possible in microdialysates from 8 replicate feeding experiments. Of these NPs, 55 were detected with an average mass error below 10 ppm. The time-resolved profiles of relative concentration changes for 6 are shown, and there is great potential for the use of this method in future experiments to aid in correlation of NP changes with behavior. This work presents an unbiased approach to winnowing candidate NPs related to a behavior of interest in a functionally relevant manner, and demonstrates the success of such a UPLC-MS(E) quantification method using the open source software Skyline.

  12. A new background subtraction method for Western blot densitometry band quantification through image analysis software.

    PubMed

    Gallo-Oller, Gabriel; Ordoñez, Raquel; Dotor, Javier

    2018-06-01

    Since its first description, Western blot has been widely used in molecular labs. It constitutes a multistep method that allows the detection and/or quantification of proteins from simple to complex protein mixtures. Western blot quantification method constitutes a critical step in order to obtain accurate and reproducible results. Due to the technical knowledge required for densitometry analysis together with the resources availability, standard office scanners are often used for the imaging acquisition of developed Western blot films. Furthermore, the use of semi-quantitative software as ImageJ (Java-based image-processing and analysis software) is clearly increasing in different scientific fields. In this work, we describe the use of office scanner coupled with the ImageJ software together with a new image background subtraction method for accurate Western blot quantification. The proposed method represents an affordable, accurate and reproducible approximation that could be used in the presence of limited resources availability. Copyright © 2018 Elsevier B.V. All rights reserved.
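The abstract above does not disclose the details of the proposed background subtraction; as an illustrative sketch only (not the paper's algorithm), a simple local-background approach for a densitometry lane profile subtracts a straight line interpolated between the band's flanking baseline points before integrating the band. All names and data here are hypothetical.

```python
import numpy as np

def band_volume(profile, left, right):
    """Integrated band intensity above a linear local background.

    profile: 1-D densitometry lane profile (e.g., column sums of the band ROI)
    left, right: indices bracketing the band, assumed to sit on baseline
    """
    profile = np.asarray(profile, dtype=float)
    x = np.arange(left, right + 1)
    # background modeled as a straight line between the two band boundaries
    bg = np.interp(x, [left, right], [profile[left], profile[right]])
    return float(np.sum(profile[left:right + 1] - bg))
```

With a flat baseline, this reduces to subtracting a constant; with a sloping baseline (common in film scans), the linear interpolation removes the trend that a global constant subtraction would miss.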

  13. A rapid Fourier-transform infrared (FTIR) spectroscopic method for direct quantification of paracetamol content in solid pharmaceutical formulations

    NASA Astrophysics Data System (ADS)

    Mallah, Muhammad Ali; Sherazi, Syed Tufail Hussain; Bhanger, Muhammad Iqbal; Mahesar, Sarfaraz Ahmed; Bajeer, Muhammad Ashraf

    2015-04-01

    A transmission FTIR spectroscopic method was developed for direct, inexpensive and fast quantification of paracetamol content in solid pharmaceutical formulations. In this method paracetamol content is analyzed directly, without solvent extraction. KBr pellets were formulated for the acquisition of FTIR spectra in transmission mode. Two chemometric models, simple Beer's law and partial least squares, employed over the spectral region of 1800-1000 cm-1 for quantification of paracetamol content, gave a regression coefficient (R2) of 0.999. The limits of detection and quantification using FTIR spectroscopy were 0.005 mg g-1 and 0.018 mg g-1, respectively. An interference study was also carried out to check the effect of the excipients; there was no significant interference from the sample matrix. The results clearly demonstrate the sensitivity of the transmission FTIR spectroscopic method for pharmaceutical analysis. This method is green in the sense that it does not require large volumes of hazardous solvents or long run times and avoids prior sample preparation.
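The Beer's-law branch of the analysis above is a univariate linear calibration: absorbance at the analytical band is regressed on standard concentrations, and unknowns are read back off the fitted line. A minimal sketch with invented calibration data (the paper's standards and absorbances are not given in the abstract):

```python
import numpy as np

# Hypothetical calibration standards (concentration in mg/g) and their
# measured peak absorbances -- illustrative values, not the paper's data.
conc = np.array([0.2, 0.4, 0.6, 0.8, 1.0])
absorb = np.array([0.11, 0.21, 0.30, 0.41, 0.50])

# Beer's law: A = slope * c + intercept (ordinary least squares)
slope, intercept = np.polyfit(conc, absorb, 1)
r2 = np.corrcoef(conc, absorb)[0, 1] ** 2   # regression coefficient R^2

def predict(absorbance):
    """Concentration of an unknown from its measured absorbance."""
    return (absorbance - intercept) / slope
```

The reported R2 of 0.999 corresponds to a calibration of this form; the PLS model uses the full 1800-1000 cm-1 region instead of a single band.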

  14. Quantitative Method for Simultaneous Analysis of Acetaminophen and 6 Metabolites.

    PubMed

    Lammers, Laureen A; Achterbergh, Roos; Pistorius, Marcel C M; Romijn, Johannes A; Mathôt, Ron A A

    2017-04-01

    Hepatotoxicity after ingestion of high-dose acetaminophen [N-acetyl-para-aminophenol (APAP)] is caused by the metabolites of the drug. To gain more insight into factors influencing susceptibility to APAP hepatotoxicity, quantification of APAP and metabolites is important. A few methods have been developed to simultaneously quantify APAP and its most important metabolites. However, these methods require comprehensive sample preparation and long run times. The aim of this study was to develop and validate a simplified but sensitive method for the simultaneous quantification of acetaminophen, the main metabolites acetaminophen glucuronide and acetaminophen sulfate, and 4 Cytochrome P450-mediated metabolites by using liquid chromatography with mass spectrometric (LC-MS) detection. The method was developed and validated in human plasma and entailed a single method for sample preparation, enabling quick processing of the samples, followed by an LC-MS method with a chromatographic run time of 9 minutes. The method was validated for selectivity, linearity, accuracy, imprecision, dilution integrity, recovery, process efficiency, ionization efficiency, and carryover effect. The method showed good selectivity without matrix interferences. For all analytes, the mean process efficiency was >86%, and the mean ionization efficiency was >94%. Furthermore, the accuracy was between 90.3% and 112% for all analytes, and the within- and between-run imprecision were <20% for the lower limit of quantification and <14.3% for the middle level and upper limit of quantification. The method presented here enables the simultaneous quantification of APAP and 6 of its metabolites. It is less time consuming than previously reported methods because it requires only a single and simple method for the sample preparation followed by an LC-MS method with a short run time. Therefore, this analytical method provides a useful method for both clinical and research purposes.

  15. 1H NMR quantification in very dilute toxin solutions: application to anatoxin-a analysis.

    PubMed

    Dagnino, Denise; Schripsema, Jan

    2005-08-01

    A complete procedure is described for the extraction, detection and quantification of anatoxin-a in biological samples. Anatoxin-a is extracted from biomass by a routine acid base extraction. The extract is analysed by GC-MS, without the need of derivatization, with a detection limit of 0.5 ng. A method was developed for the accurate quantification of anatoxin-a in the standard solution to be used for the calibration of the GC analysis. 1H NMR allowed the accurate quantification of microgram quantities of anatoxin-a. The accurate quantification of compounds in standard solutions is rarely discussed, but for compounds like anatoxin-a (toxins with prices in the range of a million dollar a gram), of which generally only milligram quantities or less are available, this factor in the quantitative analysis is certainly not trivial. The method that was developed can easily be adapted for the accurate quantification of other toxins in very dilute solutions.
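Quantitative 1H NMR of the kind used above rests on the proportionality between peak integral and the number of contributing protons: against an internal standard of known concentration, c_analyte = c_std x (I_a / I_s) x (N_s / N_a) for molar concentrations. A minimal sketch of that relation (the function name and example numbers are illustrative, not taken from the paper):

```python
def qnmr_concentration(c_std, i_analyte, i_std, n_analyte, n_std):
    """Molar concentration of an analyte by internal-standard qNMR.

    c_std: molar concentration of the internal standard
    i_analyte, i_std: integrals of the compared signals
    n_analyte, n_std: number of protons giving rise to each signal
    """
    return c_std * (i_analyte / i_std) * (n_std / n_analyte)
```

For a mass concentration, the result is multiplied by the ratio of molar masses M_analyte / M_std; the advantage for expensive toxins such as anatoxin-a is that the standard need not be the analyte itself.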

  16. Quantification of the increase in thyroid cancer prevalence in Fukushima after the nuclear disaster in 2011--a potential overdiagnosis?

    PubMed

    Katanoda, Kota; Kamo, Ken-Ichi; Tsugane, Shoichiro

    2016-03-01

    A thyroid ultrasound examination programme has been conducted in Fukushima Prefecture, Japan, after the nuclear disaster in 2011. Although remarkably high prevalence of thyroid cancer was observed, no relevant quantitative evaluation was conducted. We calculated the observed/expected (O/E) ratio of thyroid cancer prevalence for the residents aged ≤20 years. Observed prevalence was the number of thyroid cancer cases detected by the programme through the end of April 2015. Expected prevalence was calculated as cumulative incidence by a life-table method using the national estimates of thyroid cancer incidence rate in 2001-10 (prior to the disaster) and the population of Fukushima Prefecture. The underlying assumption was that there was neither nuclear accident nor screening intervention. The observed and estimated prevalence of thyroid cancer among residents aged ≤20 years was 160.1 and 5.2, respectively, giving an O/E ratio of 30.8 [95% confidence interval (CI): 26.2, 35.9]. When the recent increasing trend in thyroid cancer was considered, the overall O/E ratio was 22.2 (95% CI: 18.9, 25.9). The cumulative number of thyroid cancer deaths in Fukushima Prefecture, estimated with the same method (annual average in 2009-13), was 0.6 under age 40. Combined with the existing knowledge about radiation effect on thyroid cancer, our descriptive analysis suggests the possibility of overdiagnosis. Evaluation including individual-level analysis is required to further clarify the contribution of underlying factors. © The Author 2016. Published by Oxford University Press.
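The O/E ratio analysis above can be reproduced in outline: with the observed case count treated as Poisson, an approximate 95% CI follows from a log-normal approximation, SE[log(O/E)] ~ 1/sqrt(O). This is a generic epidemiological sketch, not the authors' code, and the inputs below are rounded illustrations of the reported figures.

```python
import math

def oe_ratio(observed, expected, z=1.96):
    """Observed/expected ratio with an approximate 95% CI.

    Assumes the observed count is Poisson-distributed, so the standard
    error of log(O/E) is roughly 1/sqrt(observed).
    """
    ratio = observed / expected
    half_width = z / math.sqrt(observed)
    return ratio, ratio * math.exp(-half_width), ratio * math.exp(half_width)

ratio, lo, hi = oe_ratio(160, 5.2)   # ~30.8, close to the reported O/E
```

With 160 observed cases against an expected 5.2, this yields an O/E near 30.8 with a CI close to the (26.2, 35.9) interval reported, the small difference plausibly reflecting a different interval method.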

  17. Quantified Gamow shell model interaction for psd-shell nuclei

    NASA Astrophysics Data System (ADS)

    Jaganathen, Y.; Betan, R. M. Id; Michel, N.; Nazarewicz, W.; Płoszajczak, M.

    2017-11-01

    Background: The structure of weakly bound and unbound nuclei close to particle drip lines is one of the major science drivers of nuclear physics. A comprehensive understanding of these systems goes beyond the traditional configuration interaction approach formulated in the Hilbert space of localized states (nuclear shell model) and requires an open quantum system description. The complex-energy Gamow shell model (GSM) provides such a framework as it is capable of describing resonant and nonresonant many-body states on equal footing. Purpose: To make reliable predictions, quality input is needed that allows for the full uncertainty quantification of theoretical results. In this study, we carry out the optimization of an effective GSM (one-body and two-body) interaction in the psdf shell-model space. The resulting interaction is expected to describe nuclei with 5 ≤ A ≲ 12 at the p-sd-shell interface. Method: The one-body potential of the 4He core is modeled by a Woods-Saxon + spin-orbit + Coulomb potential, and the finite-range nucleon-nucleon interaction between the valence nucleons consists of central, spin-orbit, tensor, and Coulomb terms. The GSM is used to compute key fit observables. The χ2 optimization is performed using the Gauss-Newton algorithm augmented by the singular value decomposition technique. The resulting covariance matrix enables quantification of statistical errors within the linear regression approach. Results: The optimized one-body potential reproduces nucleon-4He scattering phase shifts up to an excitation energy of 20 MeV. The two-body interaction built on top of the optimized one-body field is adjusted to the bound and unbound ground-state binding energies and selected excited states of the helium, lithium, and beryllium isotopes up to A = 9. Very good agreement with experimental binding energies was obtained. First applications of the optimized interaction include predictions for two-nucleon correlation densities and excitation spectra of light nuclei with quantified uncertainties. Conclusion: The new interaction will enable comprehensive and fully quantified studies of structure and reaction aspects of nuclei from the psd region of the nuclear chart.
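The optimization machinery described in this record, Gauss-Newton iterations stabilized by singular value decomposition, with parameter covariance estimated in the linear-regression approximation, can be sketched numerically. This is an illustrative implementation, not the authors' code; `residual_fn` and `jacobian_fn` are assumed user-supplied callables returning the weighted residual vector and its Jacobian.

```python
import numpy as np

def gauss_newton_svd(residual_fn, jacobian_fn, p0, n_iter=20, tol=1e-10):
    """Gauss-Newton chi^2 minimization with SVD-regularized steps.

    Returns the optimized parameters and the covariance matrix
    cov ~ s^2 (J^T J)^-1 used for statistical error bars.
    """
    p = np.asarray(p0, dtype=float)
    for _ in range(n_iter):
        r = residual_fn(p)
        J = jacobian_fn(p)
        # SVD pseudo-inverse: near-zero singular values are truncated,
        # which stabilizes the step for sloppy parameter directions
        step = np.linalg.pinv(J, rcond=1e-10) @ r
        p = p - step
        if np.linalg.norm(step) < tol:
            break
    r, J = residual_fn(p), jacobian_fn(p)
    dof = max(len(r) - len(p), 1)
    s2 = float(r @ r) / dof                 # residual variance estimate
    cov = s2 * np.linalg.inv(J.T @ J)       # linear-regression covariance
    return p, cov
```

For a linear model the first step already lands on the least-squares solution; for the nonlinear GSM fit, each iteration re-linearizes about the current parameters.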

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jaganathen, Y.; Betan, R. M. Id; Michel, N.

    Background: The structure of weakly bound and unbound nuclei close to particle drip lines is one of the major science drivers of nuclear physics. A comprehensive understanding of these systems goes beyond the traditional configuration interaction approach formulated in the Hilbert space of localized states (nuclear shell model) and requires an open quantum system description. The complex-energy Gamow shell model (GSM) provides such a framework as it is capable of describing resonant and nonresonant many-body states on equal footing. Purpose: To make reliable predictions, quality input is needed that allows for the full uncertainty quantification of theoretical results. In this study, we carry out the optimization of an effective GSM (one-body and two-body) interaction in the psdf shell-model space. The resulting interaction is expected to describe nuclei with 5 ≤ A ≲ 12 at the p-sd-shell interface. Method: The one-body potential of the 4He core is modeled by a Woods-Saxon + spin-orbit + Coulomb potential, and the finite-range nucleon-nucleon interaction between the valence nucleons consists of central, spin-orbit, tensor, and Coulomb terms. The GSM is used to compute key fit observables. The χ2 optimization is performed using the Gauss-Newton algorithm augmented by the singular value decomposition technique. The resulting covariance matrix enables quantification of statistical errors within the linear regression approach. Results: The optimized one-body potential reproduces nucleon-4He scattering phase shifts up to an excitation energy of 20 MeV. The two-body interaction built on top of the optimized one-body field is adjusted to the bound and unbound ground-state binding energies and selected excited states of the helium, lithium, and beryllium isotopes up to A = 9. Very good agreement with experimental binding energies was obtained. First applications of the optimized interaction include predictions for two-nucleon correlation densities and excitation spectra of light nuclei with quantified uncertainties. In conclusion, the new interaction will enable comprehensive and fully quantified studies of structure and reaction aspects of nuclei from the psd region of the nuclear chart.

  19. Quantified Gamow shell model interaction for psd-shell nuclei

    DOE PAGES

    Jaganathen, Y.; Betan, R. M. Id; Michel, N.; ...

    2017-11-20

    Background: The structure of weakly bound and unbound nuclei close to particle drip lines is one of the major science drivers of nuclear physics. A comprehensive understanding of these systems goes beyond the traditional configuration interaction approach formulated in the Hilbert space of localized states (nuclear shell model) and requires an open quantum system description. The complex-energy Gamow shell model (GSM) provides such a framework as it is capable of describing resonant and nonresonant many-body states on equal footing. Purpose: To make reliable predictions, quality input is needed that allows for the full uncertainty quantification of theoretical results. In this study, we carry out the optimization of an effective GSM (one-body and two-body) interaction in the psdf shell-model space. The resulting interaction is expected to describe nuclei with 5 ≤ A ≲ 12 at the p-sd-shell interface. Method: The one-body potential of the 4He core is modeled by a Woods-Saxon + spin-orbit + Coulomb potential, and the finite-range nucleon-nucleon interaction between the valence nucleons consists of central, spin-orbit, tensor, and Coulomb terms. The GSM is used to compute key fit observables. The χ2 optimization is performed using the Gauss-Newton algorithm augmented by the singular value decomposition technique. The resulting covariance matrix enables quantification of statistical errors within the linear regression approach. Results: The optimized one-body potential reproduces nucleon-4He scattering phase shifts up to an excitation energy of 20 MeV. The two-body interaction built on top of the optimized one-body field is adjusted to the bound and unbound ground-state binding energies and selected excited states of the helium, lithium, and beryllium isotopes up to A = 9. Very good agreement with experimental binding energies was obtained. First applications of the optimized interaction include predictions for two-nucleon correlation densities and excitation spectra of light nuclei with quantified uncertainties. In conclusion, the new interaction will enable comprehensive and fully quantified studies of structure and reaction aspects of nuclei from the psd region of the nuclear chart.

  20. A series of strategies for solving the shortage of reference standards for multi-components determination of traditional Chinese medicine, Mahoniae Caulis as a case.

    PubMed

    Wang, Wenguang; Ma, Xiaoli; Guo, Xiaoyu; Zhao, Mingbo; Tu, Pengfei; Jiang, Yong

    2015-09-18

    In order to solve the bottleneck of reference-standard shortage for comprehensive quality control of traditional Chinese medicines (TCMs), a series of strategies, including one single reference standard to determine multi-compounds (SSDMC), quantitative analysis by standardized reference extract (QASRE), and quantitative nuclear magnetic resonance spectroscopy (qNMR), were proposed, and Mahoniae Caulis was selected as an example to develop and validate these methods for simultaneous determination of four alkaloids: columbamine, jatrorrhizine, palmatine, and berberine. Comprehensive comparisons among these methods and with the conventional external standard method (ESM) were carried out. The relative expanded uncertainty of measurement was used for the first time to compare their credibility. The results showed that all three newly developed methods can accurately accomplish the quantification using only one purified reference standard, but each has its own advantages and disadvantages as well as a specific application scope, which are discussed in detail in this paper. Copyright © 2015 Elsevier B.V. All rights reserved.
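The SSDMC strategy amounts to calibrating one compound and converting the other analytes' peak areas through relative correction factors. As a heavily simplified sketch under the assumption of linear response through the origin (the paper's exact factor definitions and values are not given in the abstract, and all names here are illustrative):

```python
def ssdmc_concentration(peak_area, f_rel, k_ref):
    """Concentration of analyte i via a single reference standard.

    peak_area: chromatographic peak area of analyte i
    f_rel:     relative correction factor of analyte i vs. the reference
               (f = 1.0 for the reference compound itself)
    k_ref:     calibration slope (area per unit concentration) of the
               one purified reference standard
    """
    return peak_area / (f_rel * k_ref)

# e.g., with berberine as the assumed reference, other alkaloids are
# quantified from their own areas and pre-established f values.
```

The practical appeal is that only the reference compound needs to be purchased pure; the factors f can be established once, during method development.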

  1. 78 FR 52898 - Science-Based Methods for Entity-Scale Quantification of Greenhouse Gas Sources and Sinks From...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-27

    ... DEPARTMENT OF AGRICULTURE [Docket Number: USDA-2013-0003] Science-Based Methods for Entity-Scale Quantification of Greenhouse Gas Sources and Sinks From Agriculture and Forestry Practices AGENCY: Office of the... of Agriculture (USDA) has prepared a report containing methods for quantifying entity-scale...

  2. [DNA quantification of blood samples pre-treated with pyramidon].

    PubMed

    Zhu, Chuan-Hong; Zheng, Dao-Li; Ni, Rao-Zhi; Wang, Hai-Sheng; Ning, Ping; Fang, Hui; Liu, Yan

    2014-06-01

    To study DNA quantification and STR typing of samples pre-treated with pyramidon. The blood samples of ten unrelated individuals were anticoagulated in EDTA, and blood stains were made on filter paper. The samples were divided into six experimental groups according to storage time after pre-treatment with pyramidon: 30 min, 1 h, 3 h, 6 h, 12 h and 24 h. DNA was extracted by three methods: magnetic bead-based extraction, the QIAcube DNA purification method and the Chelex-100 method. DNA was quantified by fluorescent quantitative PCR, and STR typing was performed with PCR-STR fluorescent technology. For each DNA extraction method, the amount of sample DNA decreased gradually with time after pre-treatment with pyramidon. For the same storage time, DNA yields differed significantly among the extraction methods. Complete 16-locus STR profiles were obtained from 90.56% of the samples. Pyramidon pre-treatment causes DNA degradation, but effective STR typing can still be achieved within 24 h. Magnetic bead-based extraction was the best method for DNA extraction and STR profiling.

  3. Simple and rapid quantification of brominated vegetable oil in commercial soft drinks by LC–MS

    PubMed Central

    Chitranshi, Priyanka; da Costa, Gonçalo Gamboa

    2016-01-01

    We report here a simple and rapid method for the quantification of brominated vegetable oil (BVO) in soft drinks based upon liquid chromatography–electrospray ionization mass spectrometry. Unlike previously reported methods, this novel method does not require hydrolysis, extraction or derivatization steps, but rather a simple “dilute and shoot” sample preparation. The quantification is conducted by mass spectrometry in selected ion recording mode and a single point standard addition procedure. The method was validated in the range of 5–25 μg/mL BVO, encompassing the legal limit of 15 μg/mL established by the US FDA for fruit-flavored beverages in the US market. The method was characterized by excellent intra- and inter-assay accuracy (97.3–103.4%) and very low imprecision [0.5–3.6% (RSD)]. The direct nature of the quantification, simplicity, and excellent statistical performance of this methodology constitute clear advantages in relation to previously published methods for the analysis of BVO in soft drinks. PMID:27451219
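Single-point standard addition, as used above, infers the sample concentration from two measurements: the sample alone and the sample spiked with a known amount of analyte. Assuming a linear detector response and negligible dilution from the spike (an idealization; the paper's exact volume corrections are not given in the abstract), c_sample = c_added x S_sample / (S_spiked - S_sample):

```python
def standard_addition_single_point(s_sample, s_spiked, c_added):
    """Sample concentration from single-point standard addition.

    s_sample: signal of the unspiked sample
    s_spiked: signal of the sample after adding c_added of analyte
    c_added:  known concentration added by the spike
    Assumes linear response and negligible dilution by the spike.
    """
    if s_spiked <= s_sample:
        raise ValueError("spiked signal must exceed sample signal")
    return c_added * s_sample / (s_spiked - s_sample)
```

Because the calibration is performed in the sample's own matrix, this approach compensates for the matrix effects that plague external calibration in complex beverages.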

  4. Quantification of L-Citrulline and other physiologic amino acids in watermelon and selected cucurbits

    USDA-ARS?s Scientific Manuscript database

    High performance liquid chromatography of dabsyl derivatives of amino acids was employed for quantification of physiologic amino acids in cucurbits. This method is particularly useful because the dabsyl derivatives of glutamine and citrulline are sufficiently separated to allow quantification of ea...

  5. Are LOD and LOQ Reliable Parameters for Sensitivity Evaluation of Spectroscopic Methods?

    PubMed

    Ershadi, Saba; Shayanfar, Ali

    2018-03-22

    The limit of detection (LOD) and the limit of quantification (LOQ) are common parameters to assess the sensitivity of analytical methods. In this study, the LOD and LOQ of previously reported terbium sensitized analysis methods were calculated by different methods, and the results were compared with sensitivity parameters [lower limit of quantification (LLOQ)] of U.S. Food and Drug Administration guidelines. The details of the calibration curve and standard deviation of blank samples of three different terbium-sensitized luminescence methods for the quantification of mycophenolic acid, enrofloxacin, and silibinin were used for the calculation of LOD and LOQ. A comparison of LOD and LOQ values calculated by various methods and LLOQ shows a considerable difference. The significant difference of the calculated LOD and LOQ with various methods and LLOQ should be considered in the sensitivity evaluation of spectroscopic methods.
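Two of the common calculation routes compared in studies like the one above are the ICH-style formulas LOD = 3.3 sigma / S and LOQ = 10 sigma / S, with sigma taken either from the standard deviation of blank measurements or from the residual standard deviation of the calibration regression. A sketch of both (illustrative only; the paper's own data are not reproduced here):

```python
import numpy as np

def lod_loq_from_calibration(conc, signal, blank_signals):
    """Estimate LOD/LOQ two ways: from blank SD and from residual SD.

    Returns ((lod_blank, loq_blank), (lod_resid, loq_resid)),
    using LOD = 3.3*sigma/slope and LOQ = 10*sigma/slope.
    """
    slope, intercept = np.polyfit(conc, signal, 1)
    # Route 1: standard deviation of replicate blank measurements
    sigma_blank = np.std(blank_signals, ddof=1)
    lod_blank = 3.3 * sigma_blank / slope
    loq_blank = 10.0 * sigma_blank / slope
    # Route 2: residual standard deviation of the calibration line
    residuals = signal - (slope * conc + intercept)
    sigma_resid = np.sqrt(np.sum(residuals**2) / (len(conc) - 2))
    lod_resid = 3.3 * sigma_resid / slope
    loq_resid = 10.0 * sigma_resid / slope
    return (lod_blank, loq_blank), (lod_resid, loq_resid)
```

That the two routes (and a signal-to-noise criterion, not shown) can disagree substantially on the same data set is precisely the discrepancy with LLOQ that the study highlights.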

  6. New approach for the quantification of processed animal proteins in feed using light microscopy.

    PubMed

    Veys, P; Baeten, V

    2010-07-01

    A revision of European Union's total feed ban on animal proteins in feed will need robust quantification methods, especially for control analyses, if tolerance levels are to be introduced, as for fishmeal in ruminant feed. In 2006, a study conducted by the Community Reference Laboratory for Animal Proteins in feedstuffs (CRL-AP) demonstrated the deficiency of the official quantification method based on light microscopy. The study concluded that the method had to be revised. This paper puts forward an improved quantification method based on three elements: (1) the preparation of permanent slides with an optical adhesive preserving all morphological markers of bones necessary for accurate identification and precision counting; (2) the use of a counting grid eyepiece reticle; and (3) new definitions for correction factors for the estimated portions of animal particles in the sediment. This revised quantification method was tested on feeds adulterated at different levels with bovine meat and bone meal (MBM) and fishmeal, and it proved to be effortless to apply. The results obtained were very close to the expected values of contamination levels for both types of adulteration (MBM or fishmeal). Calculated values were not only replicable, but also reproducible. The advantages of the new approach, including the benefits of the optical adhesive used for permanent slide mounting and the experimental conditions that need to be met to implement the new method correctly, are discussed.

  7. Rapid quantification of inflammation in tissue samples using perfluorocarbon emulsion and fluorine-19 nuclear magnetic resonance

    PubMed Central

    Ahrens, Eric T.; Young, Won-Bin; Xu, Hongyan; Pusateri, Lisa K.

    2016-01-01

    Quantification of inflammation in tissue samples can be a time-intensive bottleneck in therapeutic discovery and preclinical endeavors. We describe a versatile and rapid approach to quantitatively assay macrophage burden in intact tissue samples. Perfluorocarbon (PFC) emulsion is injected intravenously, and the emulsion droplets are effectively taken up by monocytes and macrophages. These ‘in situ’ labeled cells participate in inflammatory events in vivo resulting in PFC accumulation at inflammatory loci. Necropsied tissues or intact organs are subjected to conventional fluorine-19 (19F) NMR spectroscopy to quantify the total fluorine content per sample, proportional to the macrophage burden. We applied these methods to a rat model of experimental allergic encephalomyelitis (EAE) exhibiting extensive inflammation and demyelination in the central nervous system (CNS), particularly in the spinal cord. In a cohort of EAE rats, we used 19F NMR to derive an inflammation index (IFI) in intact CNS tissues. Immunohistochemistry was used to confirm intracellular colocalization of the PFC droplets within CNS CD68+ cells having macrophage morphology. The IFI linearly correlated to mRNA levels of CD68 via real-time PCR analysis. This 19F NMR approach can accelerate tissue analysis by at least an order of magnitude compared with histological approaches. PMID:21548906

  8. Phenotypic feature quantification of patient derived 3D cancer spheroids in fluorescence microscopy image

    NASA Astrophysics Data System (ADS)

    Kang, Mi-Sun; Rhee, Seon-Min; Seo, Ji-Hyun; Kim, Myoung-Hee

    2017-03-01

    Patients' responses to a drug differ at the cellular level. Here, we present an image-based cell phenotypic feature quantification method for predicting the responses of patient-derived glioblastoma cells to a particular drug. We used high-content imaging to understand the features of patient-derived cancer cells. A 3D spheroid culture formation resembles the in vivo environment more closely than 2D adherent cultures do, and it allows for the observation of cellular aggregate characteristics. However, cell analysis at the individual level is more challenging. In this paper, we demonstrate image-based phenotypic screening of the nuclei of patient-derived cancer cells. We first stitched the images of each well of the 384-well plate acquired under the same conditions. We then used intensity information to detect the colonies. The nuclear intensity and morphological characteristics were used for the segmentation of individual nuclei. Next, we calculated the position of each nucleus to capture the spatial pattern of cells in the well environment. Finally, we compared the results obtained using 3D spheroid culture cells with those obtained using 2D adherent culture cells from the same patient being treated with the same drugs. This technique could be applied for image-based phenotypic screening of cells to determine a patient's response to a drug.

  9. Comparison of machine learning and semi-quantification algorithms for [123I]FP-CIT classification: the beginning of the end for semi-quantification?

    PubMed

    Taylor, Jonathan Christopher; Fenner, John Wesley

    2017-11-29

    Semi-quantification methods are well established in the clinic for assisted reporting of [123I]ioflupane images. Arguably, these are limited diagnostic tools. Recent research has demonstrated the potential for improved classification performance offered by machine learning algorithms. A direct comparison between methods is required to establish whether a move towards widespread clinical adoption of machine learning algorithms is justified. This study compared three machine learning algorithms with a range of semi-quantification methods, using the Parkinson's Progression Markers Initiative (PPMI) research database and a locally derived clinical database for validation. Machine learning algorithms were based on support vector machine classifiers with three different sets of features: (1) voxel intensities; (2) principal components of image voxel intensities; and (3) striatal binding ratios from the putamen and caudate. Semi-quantification methods were based on striatal binding ratios (SBRs) from both putamina, with and without consideration of the caudates. Normal limits for the SBRs were defined through four different methods: (1) minimum of age-matched controls; (2) mean minus 1/1.5/2 standard deviations from age-matched controls; (3) linear regression of normal patient data against age (minus 1/1.5/2 standard errors); and (4) selection of the optimum operating point on the receiver operating characteristic curve from normal and abnormal training data. Each machine learning and semi-quantification technique was evaluated with stratified, nested 10-fold cross-validation, repeated 10 times. The mean accuracy of the semi-quantitative methods for classification of local data into Parkinsonian and non-Parkinsonian groups varied from 0.78 to 0.87, contrasting with 0.89 to 0.95 for classifying PPMI data into healthy controls and Parkinson's disease groups. The machine learning algorithms gave mean accuracies between 0.88 and 0.92 and between 0.95 and 0.97 for local and PPMI data, respectively. 
Classification performance was lower for the local database than the research database for both semi-quantitative and machine learning algorithms. However, for both databases, the machine learning methods generated equal or higher mean accuracies (with lower variance) than any of the semi-quantification approaches. The gain in performance from using machine learning algorithms as compared to semi-quantification was relatively small and may be insufficient, when considered in isolation, to offer significant advantages in the clinical context.
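
    One of the semi-quantification normal-limit rules listed above (mean minus k standard deviations of age-matched controls) reduces to a few lines; the control SBR values below are hypothetical:

```python
import statistics

def normal_limit(control_sbrs, k=1.5):
    """Lower normal limit for the striatal binding ratio:
    mean - k * SD of age-matched controls."""
    return statistics.mean(control_sbrs) - k * statistics.stdev(control_sbrs)

def classify(sbr, limit):
    """Flag a scan as abnormal when its SBR falls below the normal limit."""
    return "abnormal" if sbr < limit else "normal"

controls = [2.9, 3.1, 3.0, 2.8, 3.2]      # hypothetical control SBRs
limit = normal_limit(controls, k=1.5)
print(classify(2.2, limit), classify(3.0, limit))  # abnormal normal
```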

  10. Quantification of polyhydroxyalkanoates in mixed and pure cultures biomass by Fourier transform infrared spectroscopy: comparison of different approaches.

    PubMed

    Isak, I; Patel, M; Riddell, M; West, M; Bowers, T; Wijeyekoon, S; Lloyd, J

    2016-08-01

    Fourier transform infrared (FTIR) spectroscopy was used in this study for the rapid quantification of polyhydroxyalkanoates (PHA) in mixed and pure culture bacterial biomass. Three different statistical analysis methods (regression, partial least squares (PLS) and nonlinear) were applied to the FTIR data and the results were plotted against the PHA values measured with the reference gas chromatography technique. All methods predicted PHA content in mixed culture biomass with comparable efficiency, indicated by similar residual values. The PHA in these cultures ranged from low to medium concentration (0-44 wt% of dried biomass content). However, for the analysis of the combined mixed and pure culture biomass with PHA concentration ranging from low to high (0-93% of dried biomass content), the PLS method was most efficient. This paper reports, for the first time, the use of a single calibration model constructed with a combination of mixed and pure cultures covering a wide PHA range, for predicting PHA content in biomass. Currently, no universal method exists for processing FTIR data for PHA quantification. This study compares three different methods of analysing FTIR data for quantification of PHAs in biomass. A new data-processing approach was proposed and the results were compared against existing literature methods. Most publications report PHA quantification over a medium concentration range in pure cultures. However, in our study we encompassed both mixed and pure culture biomass containing a broader range of PHA in the calibration curve. The resulting prediction model is useful for rapid quantification of a wider range of PHA content in biomass. © 2016 The Society for Applied Microbiology.

  11. Green coffee oil analysis by high-resolution nuclear magnetic resonance spectroscopy.

    PubMed

    D'Amelio, Nicola; De Angelis, Elisabetta; Navarini, Luciano; Schievano, Elisabetta; Mammi, Stefano

    2013-06-15

    In this work, we show how an extensive and fast quantification of the main components in green coffee oil can be achieved by NMR, with minimal sample manipulation and use of organic solvents. The approach is based on the integration of characteristic NMR signals, selected because of their similar relaxation properties and because they fall in similar spectral regions, which minimizes offset effects. Quantification of glycerides, together with their fatty acid components (oleic, linoleic, linolenic and saturated) and minor species (caffeine, cafestol, kahweol and 16-O-methylcafestol), is achieved in less than 1 h making use of (1)H and (13)C spectroscopy. The compositional data obtained are in reasonable agreement with classical chromatographic analyses. Copyright © 2013 Elsevier B.V. All rights reserved.

  12. A simple and efficient method for poly-3-hydroxybutyrate quantification in diazotrophic bacteria within 5 minutes using flow cytometry.

    PubMed

    Alves, L P S; Almeida, A T; Cruz, L M; Pedrosa, F O; de Souza, E M; Chubatsu, L S; Müller-Santos, M; Valdameri, G

    2017-01-16

    The conventional method for quantification of polyhydroxyalkanoates based on whole-cell methanolysis and gas chromatography (GC) is laborious and time-consuming. In this work, a method based on flow cytometry of Nile red stained bacterial cells was established to quantify poly-3-hydroxybutyrate (PHB) production by the diazotrophic and plant-associated bacteria, Herbaspirillum seropedicae and Azospirillum brasilense. The method consists of three steps: i) cell permeabilization, ii) Nile red staining, and iii) analysis by flow cytometry. The method was optimized step-by-step and can be carried out in less than 5 min. The final results indicated a high correlation coefficient (R2=0.99) compared to a standard method based on methanolysis and GC. This method was successfully applied to the quantification of PHB in epiphytic bacteria isolated from rice roots.
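
    The agreement with the GC reference (R2 = 0.99) is a squared Pearson correlation, which can be computed without external libraries; the paired values below are hypothetical stand-ins for GC and flow-cytometry measurements:

```python
def r_squared(x, y):
    """Squared Pearson correlation between two paired measurement series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)

gc = [5, 12, 20, 33, 41]   # hypothetical PHB wt% by methanolysis/GC
fc = [6, 11, 21, 32, 43]   # hypothetical Nile red flow-cytometry readout
print(round(r_squared(gc, fc), 3))
```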

  13. Recommendations for Improving Identification and Quantification in Non-Targeted, GC-MS-Based Metabolomic Profiling of Human Plasma

    PubMed Central

    Wang, Hanghang; Muehlbauer, Michael J.; O’Neal, Sara K.; Newgard, Christopher B.; Hauser, Elizabeth R.; Shah, Svati H.

    2017-01-01

    The field of metabolomics as applied to human disease and health is rapidly expanding. In recent efforts of metabolomics research, greater emphasis has been placed on quality control and method validation. In this study, we report an experience with quality control and a practical application of method validation. Specifically, we sought to identify and modify steps in gas chromatography-mass spectrometry (GC-MS)-based, non-targeted metabolomic profiling of human plasma that could influence metabolite identification and quantification. Our experimental design included two studies: (1) a limiting-dilution study, which investigated the effects of dilution on analyte identification and quantification; and (2) a concentration-specific study, which compared the optimal plasma extract volume established in the first study with the volume used in the current institutional protocol. We confirmed that contaminants, concentration, repeatability and intermediate precision are major factors influencing metabolite identification and quantification. In addition, we established methods for improved metabolite identification and quantification, which were summarized to provide recommendations for experimental design of GC-MS-based non-targeted profiling of human plasma. PMID:28841195

  14. Surface smoothness: cartilage biomarkers for knee OA beyond the radiologist

    NASA Astrophysics Data System (ADS)

    Tummala, Sudhakar; Dam, Erik B.

    2010-03-01

    Fully automatic imaging biomarkers may allow quantification of pathophysiological processes that a radiologist would not be able to assess reliably. This can introduce new insight but is problematic to validate due to the lack of meaningful ground truth expert measurements. Rather than quantification accuracy, such novel markers must therefore be validated against clinically meaningful end-goals such as the ability to allow correct diagnosis. We present a method for automatic cartilage surface smoothness quantification in the knee joint. The quantification is based on a curvature flow method applied to tibial and femoral cartilage compartments resulting from an automatic segmentation scheme. These smoothness estimates are validated for their ability to diagnose osteoarthritis and compared to smoothness estimates based on manual expert segmentations and to conventional cartilage volume quantification. We demonstrate that the fully automatic markers eliminate the time required for radiologist annotations and, in addition, provide a diagnostic marker superior to the evaluated semi-manual markers.

  15. Rapid and Easy Protocol for Quantification of Next-Generation Sequencing Libraries.

    PubMed

    Hawkins, Steve F C; Guest, Paul C

    2018-01-01

    The emergence of next-generation sequencing (NGS) over the last 10 years has increased the efficiency of DNA sequencing in terms of speed, ease, and price. However, exact quantification of an NGS library is crucial in order to obtain good data on sequencing platforms developed by the current market leader Illumina. Several approaches for DNA quantification are currently available; the most commonly used are based on analysis of the physical properties of the DNA through spectrophotometric or fluorometric methods. Although these methods are technically simple, they do not allow quantification as exact as that achieved using a real-time quantitative PCR (qPCR) approach. A qPCR protocol for DNA quantification with applications in NGS library preparation studies is presented here. This can be applied in various fields of study, such as medical disorders resulting from nutritional programming disturbances.
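
    The qPCR readout rests on a log-linear standard curve, Ct = slope * log10(conc) + intercept; inverting it, and deriving the amplification efficiency from the slope, follow the standard qPCR conventions sketched below (not steps quoted from this protocol):

```python
def qpcr_quantify(ct, slope, intercept):
    """Concentration of an unknown interpolated from the standard curve
    Ct = slope * log10(conc) + intercept."""
    return 10 ** ((ct - intercept) / slope)

def amplification_efficiency(slope):
    """Efficiency implied by the curve slope; -3.32 corresponds to ~100%."""
    return 10 ** (-1.0 / slope) - 1.0

# A library whose Ct equals the curve intercept sits at 1 unit of concentration
conc = qpcr_quantify(20.0, slope=-3.32, intercept=20.0)
```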

  16. SIMPLE METHOD FOR THE REPRESENTATION, QUANTIFICATION, AND COMPARISON OF THE VOLUMES AND SHAPES OF CHEMICAL COMPOUNDS

    EPA Science Inventory

    A conceptually and computationally simple method for the definition, display, quantification, and comparison of the shapes of three-dimensional mathematical molecular models is presented. Molecular or solvent-accessible volume and surface area can also be calculated. Algorithms, ...

  17. Development of a screening method for genetically modified soybean by plasmid-based quantitative competitive polymerase chain reaction.

    PubMed

    Shimizu, Eri; Kato, Hisashi; Nakagawa, Yuki; Kodama, Takashi; Futo, Satoshi; Minegishi, Yasutaka; Watanabe, Takahiro; Akiyama, Hiroshi; Teshima, Reiko; Furui, Satoshi; Hino, Akihiro; Kitta, Kazumi

    2008-07-23

    A novel type of quantitative competitive polymerase chain reaction (QC-PCR) system for the detection and quantification of Roundup Ready soybean (RRS) was developed. This system was designed to take advantage of a fully validated real-time PCR method used for the quantification of RRS in Japan. A plasmid was constructed as a competitor for the detection and quantification of the genetically modified soy RRS. The plasmid contained the construct-specific sequence of RRS and the taxon-specific sequence of lectin1 (Le1), each carrying a 21 bp oligonucleotide insertion. The plasmid DNA was used as a reference molecule instead of ground seeds, which enabled us to precisely and stably adjust the copy number of targets. The present study demonstrated that the novel plasmid-based QC-PCR method could be a simple and feasible alternative to the real-time PCR method used for the quantification of genetically modified organism contents.
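
    The abstract notes that the plasmid competitor lets the target copy number be adjusted precisely. Converting a measured dsDNA mass concentration to copies uses the standard molecular-biology formula below (general arithmetic, not taken from this paper; the plasmid length is hypothetical):

```python
AVOGADRO = 6.022e23
BP_MASS_G_PER_MOL = 660.0   # average molar mass of one double-stranded base pair

def plasmid_copies_per_ul(conc_ng_per_ul, plasmid_len_bp):
    """Copy number per microliter for a dsDNA plasmid of known length."""
    grams_per_ul = conc_ng_per_ul * 1e-9
    grams_per_mol = plasmid_len_bp * BP_MASS_G_PER_MOL
    return grams_per_ul / grams_per_mol * AVOGADRO

# 1 ng/uL of a hypothetical 3000 bp competitor plasmid is ~3e8 copies/uL
copies = plasmid_copies_per_ul(1.0, 3000)
```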

  18. Direct liquid chromatography method for the simultaneous quantification of hydroxytyrosol and tyrosol in red wines.

    PubMed

    Piñeiro, Zulema; Cantos-Villar, Emma; Palma, Miguel; Puertas, Belen

    2011-11-09

    A validated HPLC method with fluorescence detection for the simultaneous quantification of hydroxytyrosol and tyrosol in red wines is described. Detection conditions for both compounds were optimized (excitation at 279 and 278 nm and emission at 631 and 598 nm for hydroxytyrosol and tyrosol, respectively). The validation of the analytical method was based on selectivity, linearity, robustness, detection and quantification limits, repeatability, and recovery. The detection and quantification limits in red wines were set at 0.023 and 0.076 mg L(-1) for hydroxytyrosol and at 0.007 and 0.024 mg L(-1) for tyrosol determination, respectively. Precision values, both within-day and between-day (n = 5), remained below 3% for both compounds. In addition, a fractional factorial experimental design was developed to analyze the influence of six different conditions on analysis. The final optimized HPLC-fluorescence method allowed the analysis of 30 nonpretreated Spanish red wines to evaluate their hydroxytyrosol and tyrosol contents.

  19. Comparison of non-invasive assessment of liver fibrosis in patients with alpha1-antitrypsin deficiency using magnetic resonance elastography (MRE), acoustic radiation force impulse (ARFI) Quantification, and 2D-shear wave elastography (2D-SWE).

    PubMed

    Reiter, Rolf; Wetzel, Martin; Hamesch, Karim; Strnad, Pavel; Asbach, Patrick; Haas, Matthias; Siegmund, Britta; Trautwein, Christian; Hamm, Bernd; Klatt, Dieter; Braun, Jürgen; Sack, Ingolf; Tzschätzsch, Heiko

    2018-01-01

    Although it has been known for decades that patients with alpha1-antitrypsin deficiency (AATD) have an increased risk of cirrhosis and hepatocellular carcinoma, limited data exist on non-invasive imaging-based methods for assessing liver fibrosis such as magnetic resonance elastography (MRE) and acoustic radiation force impulse (ARFI) quantification, and no data exist on 2D-shear wave elastography (2D-SWE). Therefore, the purpose of this study was to evaluate and compare the applicability of different elastography methods for the assessment of AATD-related liver fibrosis. Fifteen clinically asymptomatic AATD patients (11 homozygous PiZZ, 4 heterozygous PiMZ) and 16 matched healthy volunteers were examined using MRE and ARFI quantification. Additionally, patients were examined with 2D-SWE. A high correlation is evident for the shear wave speed (SWS) determined with different elastography methods in AATD patients: 2D-SWE/MRE, ARFI quantification/2D-SWE, and ARFI quantification/MRE (R = 0.8587, 0.7425, and 0.6914, respectively; P≤0.0089). Four AATD patients with pathologically increased SWS were consistently identified with all three methods: MRE, ARFI quantification, and 2D-SWE. The high correlation and consistent identification of patients with pathologically increased SWS using MRE, ARFI quantification, and 2D-SWE suggest that elastography has the potential to become a suitable imaging tool for the assessment of AATD-related liver fibrosis. These promising results provide motivation for further investigation of non-invasive assessment of AATD-related liver fibrosis using elastography.

  20. Three-Dimensional Echocardiographic Assessment of Left Heart Chamber Size and Function with Fully Automated Quantification Software in Patients with Atrial Fibrillation.

    PubMed

    Otani, Kyoko; Nakazono, Akemi; Salgo, Ivan S; Lang, Roberto M; Takeuchi, Masaaki

    2016-10-01

    Echocardiographic determination of left heart chamber volumetric parameters by using manual tracings during multiple beats is tedious in atrial fibrillation (AF). The aim of this study was to determine the usefulness of fully automated left chamber quantification software with single-beat three-dimensional transthoracic echocardiographic data sets in patients with AF. Single-beat full-volume three-dimensional transthoracic echocardiographic data sets were prospectively acquired during consecutive multiple cardiac beats (≥10 beats) in 88 patients with AF. In protocol 1, left ventricular volumes, left ventricular ejection fraction, and maximal left atrial volume were validated using automated quantification against the manual tracing method in identical beats in 10 patients. In protocol 2, automated quantification-derived averaged values from multiple beats were compared with the corresponding values obtained from the indexed beat in all patients. Excellent correlations of left chamber parameters between automated quantification and the manual method were observed (r = 0.88-0.98) in protocol 1. The time required for the analysis with the automated quantification method (5 min) was significantly less compared with the manual method (27 min) (P < .0001). In protocol 2, there were excellent linear correlations between the averaged left chamber parameters and the corresponding values obtained from the indexed beat (r = 0.94-0.99), and test-retest variability of left chamber parameters was low (3.5%-4.8%). Three-dimensional transthoracic echocardiography with fully automated quantification software is a rapid and reliable way to measure averaged values of left heart chamber parameters during multiple consecutive beats. Thus, it is a potential new approach for left chamber quantification in patients with AF in daily routine practice. Copyright © 2016 American Society of Echocardiography. Published by Elsevier Inc. All rights reserved.

  1. Nuclear magnetic resonance and high-performance liquid chromatography techniques for the characterization of bioactive compounds from Humulus lupulus L. (hop).

    PubMed

    Bertelli, Davide; Brighenti, Virginia; Marchetti, Lucia; Reik, Anna; Pellati, Federica

    2018-06-01

    Humulus lupulus L. (hop) represents one of the most cultivated crops, as it is a key ingredient in the brewing process. Many health-related properties have been described for hop extracts, making this plant gain more interest in the field of pharmaceutical and nutraceutical research. Among the analytical tools available for the phytochemical characterization of plant extracts, quantitative nuclear magnetic resonance (qNMR) represents a new and powerful technique. In this context, the present study was aimed at the development of a new, simple, and efficient qNMR method for the metabolite fingerprinting of bioactive compounds in hop cones, taking advantage of the novel ERETIC 2 tool. To the best of our knowledge, this is the first attempt to apply this method to complex matrices of natural origin, such as hop extracts. The qNMR method set up in this study was applied to the quantification of both prenylflavonoids and bitter acids in eight hop cultivars. The performance of this analytical method was compared with that of HPLC-UV/DAD, which represents the most frequently used technique in the field of natural product analysis. The quantitative data obtained for hop samples by means of the two aforementioned techniques highlighted that the amount of bioactive compounds was slightly higher when qNMR was applied, although the order of magnitude of the values was the same. The accuracy of qNMR was comparable to that of the chromatographic method, thus proving to be a reliable tool for the analysis of these secondary metabolites in hop extracts. Graphical abstract: extraction and analytical methods applied in this work for the analysis of bioactive compounds in Humulus lupulus L. (hop) cones.

  2. Standardless quantification by parameter optimization in electron probe microanalysis

    NASA Astrophysics Data System (ADS)

    Limandri, Silvina P.; Bonetto, Rita D.; Josa, Víctor Galván; Carreras, Alejo C.; Trincavelli, Jorge C.

    2012-11-01

    A method for standardless quantification by parameter optimization in electron probe microanalysis is presented. The method consists in minimizing the quadratic differences between an experimental spectrum and an analytical function proposed to describe it, by optimizing the parameters involved in the analytical prediction. This algorithm, implemented in the software POEMA (Parameter Optimization in Electron Probe Microanalysis), allows the determination of the elemental concentrations, along with their uncertainties. The method was tested on a set of 159 elemental constituents corresponding to 36 spectra of standards (mostly minerals) that include trace elements. The results were compared with those obtained with the commercial software GENESIS Spectrum® for standardless quantification. The quantifications performed with the method proposed here are better in 74% of the cases studied. In addition, the performance of the method proposed is compared with the first-principles standardless analysis procedure DTSA for a different data set, which excludes trace elements. The relative deviations with respect to the nominal concentrations are lower than 0.04, 0.08 and 0.35 for 66% of the cases for POEMA, GENESIS and DTSA, respectively.
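
    The core idea, minimizing the quadratic difference between the experimental spectrum and an analytical model, can be reduced to a toy linear case with two fixed component shapes (a sketch of the principle only, with made-up data; POEMA optimizes many more, partly nonlinear, parameters):

```python
def lsq_two_components(obs, ref_a, ref_b):
    """Minimize sum((obs - ca*ref_a - cb*ref_b)**2) over (ca, cb) by solving
    the 2x2 normal equations of the linear least-squares problem."""
    saa = sum(a * a for a in ref_a)
    sbb = sum(b * b for b in ref_b)
    sab = sum(a * b for a, b in zip(ref_a, ref_b))
    soa = sum(o * a for o, a in zip(obs, ref_a))
    sob = sum(o * b for o, b in zip(obs, ref_b))
    det = saa * sbb - sab * sab
    ca = (soa * sbb - sob * sab) / det
    cb = (sob * saa - soa * sab) / det
    return ca, cb

# Hypothetical reference shapes; obs constructed as 2*ref_a + 3*ref_b
ca, cb = lsq_two_components([2.0, 3.0, 7.0], [1.0, 0.0, 2.0], [0.0, 1.0, 1.0])
```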

  3. Lamb Wave Damage Quantification Using GA-Based LS-SVM.

    PubMed

    Sun, Fuqiang; Wang, Ning; He, Jingjing; Guan, Xuefei; Yang, Jinsong

    2017-06-12

    Lamb waves have been reported to be an efficient tool for non-destructive evaluations (NDE) for various application scenarios. However, accurate and reliable damage quantification using the Lamb wave method is still a practical challenge, due to the complex underlying mechanism of Lamb wave propagation and damage detection. This paper presents a Lamb wave damage quantification method using a least square support vector machine (LS-SVM) and a genetic algorithm (GA). Three damage sensitive features, namely, normalized amplitude, phase change, and correlation coefficient, were proposed to describe changes of Lamb wave characteristics caused by damage. In view of commonly used data-driven methods, the GA-based LS-SVM model using the proposed three damage sensitive features was implemented to evaluate the crack size. The GA method was adopted to optimize the model parameters. The results of GA-based LS-SVM were validated using coupon test data and lap joint component test data with naturally developed fatigue cracks. Cases of different loading and manufacturer were also included to further verify the robustness of the proposed method for crack quantification.
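
    A minimal real-valued genetic algorithm of the sort used for tuning model parameters might look like the sketch below (a generic illustration, not the authors' implementation; the quadratic objective is a stand-in for the cross-validation error of the LS-SVM):

```python
import random

def ga_minimize(f, lo, hi, pop=20, gens=40, seed=1):
    """Toy GA: tournament selection, midpoint crossover, Gaussian mutation."""
    rng = random.Random(seed)
    xs = [rng.uniform(lo, hi) for _ in range(pop)]
    for _ in range(gens):
        nxt = []
        for _ in range(pop):
            a = min(rng.sample(xs, 3), key=f)    # tournament of size 3
            b = min(rng.sample(xs, 3), key=f)
            child = 0.5 * (a + b) + rng.gauss(0.0, 0.1 * (hi - lo))
            nxt.append(min(max(child, lo), hi))  # clamp to the search box
        xs = nxt
    return min(xs, key=f)

best = ga_minimize(lambda x: (x - 3.0) ** 2, 0.0, 10.0)
```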

  4. Lamb Wave Damage Quantification Using GA-Based LS-SVM

    PubMed Central

    Sun, Fuqiang; Wang, Ning; He, Jingjing; Guan, Xuefei; Yang, Jinsong

    2017-01-01

    Lamb waves have been reported to be an efficient tool for non-destructive evaluations (NDE) for various application scenarios. However, accurate and reliable damage quantification using the Lamb wave method is still a practical challenge, due to the complex underlying mechanism of Lamb wave propagation and damage detection. This paper presents a Lamb wave damage quantification method using a least square support vector machine (LS-SVM) and a genetic algorithm (GA). Three damage sensitive features, namely, normalized amplitude, phase change, and correlation coefficient, were proposed to describe changes of Lamb wave characteristics caused by damage. In view of commonly used data-driven methods, the GA-based LS-SVM model using the proposed three damage sensitive features was implemented to evaluate the crack size. The GA method was adopted to optimize the model parameters. The results of GA-based LS-SVM were validated using coupon test data and lap joint component test data with naturally developed fatigue cracks. Cases of different loading and manufacturer were also included to further verify the robustness of the proposed method for crack quantification. PMID:28773003

  5. Powder X-ray diffraction method for the quantification of cocrystals in the crystallization mixture.

    PubMed

    Padrela, Luis; de Azevedo, Edmundo Gomes; Velaga, Sitaram P

    2012-08-01

    The solid state purity of cocrystals critically affects their performance. Thus, it is important to accurately quantify the purity of cocrystals in the final crystallization product. The aim of this study was to develop a powder X-ray diffraction (PXRD) quantification method for investigating the purity of cocrystals. The method developed was employed to study the formation of indomethacin-saccharin (IND-SAC) cocrystals by mechanochemical methods. Pure IND-SAC cocrystals were geometrically mixed with 1:1 w/w mixture of indomethacin/saccharin in various proportions. An accurately measured amount (550 mg) of the mixture was used for the PXRD measurements. The most intense, non-overlapping, characteristic diffraction peak of IND-SAC was used to construct the calibration curve in the range 0-100% (w/w). This calibration model was validated and used to monitor the formation of IND-SAC cocrystals by liquid-assisted grinding (LAG). The IND-SAC cocrystal calibration curve showed excellent linearity (R(2) = 0.9996) over the entire concentration range, displaying limit of detection (LOD) and limit of quantification (LOQ) values of 1.23% (w/w) and 3.74% (w/w), respectively. Validation results showed excellent correlations between actual and predicted concentrations of IND-SAC cocrystals (R(2) = 0.9981). The accuracy and reliability of the PXRD quantification method depend on the methods of sample preparation and handling. The crystallinity of the IND-SAC cocrystals was higher when larger amounts of methanol were used in the LAG method. The PXRD quantification method is suitable and reliable for verifying the purity of cocrystals in the final crystallization product.

  6. Overview of the AVT-191 Project to Assess Sensitivity Analysis and Uncertainty Quantification Methods for Military Vehicle Design

    NASA Technical Reports Server (NTRS)

    Benek, John A.; Luckring, James M.

    2017-01-01

    A NATO symposium held in 2008 identified many promising sensitivity analysis and uncertainty quantification technologies, but the maturity and suitability of these methods for realistic applications was not known. The STO Task Group AVT-191 was established to evaluate the maturity and suitability of various sensitivity analysis and uncertainty quantification methods for application to realistic problems of interest to NATO. The program ran from 2011 to 2015, and the work was organized into four discipline-centric teams: external aerodynamics, internal aerodynamics, aeroelasticity, and hydrodynamics. This paper presents an overview of the AVT-191 program content.

  7. Summary Findings from the AVT-191 Project to Assess Sensitivity Analysis and Uncertainty Quantification Methods for Military Vehicle Design

    NASA Technical Reports Server (NTRS)

    Benek, John A.; Luckring, James M.

    2017-01-01

    A NATO symposium held in Greece in 2008 identified many promising sensitivity analysis and uncertainty quantification technologies, but the maturity and suitability of these methods for realistic applications was not clear. The NATO Science and Technology Organization, Task Group AVT-191 was established to evaluate the maturity and suitability of various sensitivity analysis and uncertainty quantification methods for application to realistic vehicle development problems. The program ran from 2011 to 2015, and the work was organized into four discipline-centric teams: external aerodynamics, internal aerodynamics, aeroelasticity, and hydrodynamics. This paper summarizes findings and lessons learned from the task group.

  8. [Detection of recombinant-DNA in foods from stacked genetically modified plants].

    PubMed

    Sorokina, E Iu; Chernyshova, O N

    2012-01-01

    A quantitative real-time multiplex polymerase chain reaction method was applied to the detection and quantification of MON863 and MON810 in the stacked genetically modified maize MON810×MON863. The limit of detection was approximately 0.1%. The accuracy of the quantification, measured as bias from the accepted value, and the relative repeatability standard deviation, which measures the intra-laboratory variability, were within 25% at each GM level. Method verification demonstrated that the MON863 and MON810 methods can be equally applied to the quantification of the respective events in stacked MON810×MON863 maize.

  9. Source separation on hyperspectral cube applied to dermatology

    NASA Astrophysics Data System (ADS)

    Mitra, J.; Jolivot, R.; Vabres, P.; Marzani, F. S.

    2010-03-01

    This paper proposes a method for quantification of the components underlying human skin that are assumed to be responsible for the effective reflectance spectrum of the skin over visible wavelengths. The method is based on independent component analysis, assuming that the epidermal melanin and dermal haemoglobin absorbance spectra are independent of each other. The method extracts the source spectra that correspond to the ideal absorbance spectra of melanin and haemoglobin. The noisy melanin spectrum is corrected using a polynomial fit and the quantifications associated with it are re-estimated. The results produce feasible quantifications of each source component in the examined skin patch.
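
    A minimal numpy-only sketch of the separation idea, not the authors' pipeline: two synthetic stand-in spectra are mixed with random per-pixel weights, the mixtures are whitened, and a single rotation angle is chosen to maximize non-Gaussianity, which is the two-source special case of ICA. The spectral shapes and all parameters below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
wl = np.linspace(450, 700, 200)

# Hypothetical stand-ins for the two chromophore absorbance spectra:
# a melanin-like exponential decay and a haemoglobin-like absorption band.
melanin = np.exp(-(wl - 450) / 120.0)
haemoglobin = np.exp(-((wl - 560) ** 2) / (2 * 15.0 ** 2))
S_true = np.vstack([melanin, haemoglobin])      # (2 sources, 200 wavelengths)

A = rng.uniform(0.2, 1.0, (100, 2))             # per-pixel source abundances
X = A @ S_true                                  # observed spectra (100, 200)

# 1) Whiten: project the 100 mixtures onto 2 decorrelated, unit-variance axes.
Xc = X - X.mean(axis=1, keepdims=True)
cov = Xc @ Xc.T / Xc.shape[1]
evals, evecs = np.linalg.eigh(cov)              # ascending eigenvalues
top = evecs[:, -2:] / np.sqrt(evals[-2:])       # whitening projection (100, 2)
Z = top.T @ Xc                                  # whitened mixtures (2, 200)

# 2) For two sources ICA reduces to one rotation angle: pick the angle whose
#    outputs are maximally non-Gaussian (largest summed squared kurtosis).
def kurt(y):
    y = (y - y.mean()) / y.std()
    return (y ** 4).mean() - 3.0

def rot(t):
    return np.array([[np.cos(t), np.sin(t)], [-np.sin(t), np.cos(t)]])

best = max(np.linspace(0, np.pi / 2, 1000),
           key=lambda t: sum(kurt(r) ** 2 for r in rot(t) @ Z))
S_est = rot(best) @ Z   # recovered source spectra, up to sign and scale
```

    The recovered rows match the true spectra only up to sign, scale, and permutation, which is the usual ICA ambiguity; the paper's polynomial-fit correction of the noisy melanin spectrum would happen after this stage.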

  10. Nucleocytoplasmic shuttling: the ins and outs of quantitative imaging.

    PubMed

    Molenaar, Chris; Weeks, Kate L

    2018-05-17

    Nucleocytoplasmic protein shuttling is integral to the transmission of signals between the nucleus and the cytoplasm. The nuclear/cytoplasmic distribution of proteins of interest can be determined via fluorescence microscopy, following labelling of the target protein with fluorophore-conjugated antibodies (immunofluorescence) or by tagging the target protein with an autofluorescent protein, such as green fluorescent protein (GFP). The latter enables live cell imaging, a powerful approach that precludes many of the artefacts associated with indirect immunofluorescence in fixed cells. In this review, we discuss important considerations for the design and implementation of fluorescence microscopy experiments to quantify the nuclear/cytoplasmic distribution of a protein of interest. We summarise the pros and cons of detecting endogenous proteins in fixed cells by immunofluorescence and ectopically-expressed fluorescent fusion proteins in living cells. We discuss the suitability of widefield fluorescence microscopy and of 2D, 3D and 4D imaging by confocal microscopy for different applications, and describe two different methods for quantifying the nuclear/cytoplasmic distribution of a protein of interest from the fluorescent signal. Finally, we discuss the importance of eliminating sources of bias and subjectivity during image acquisition and post-imaging analyses. This is critical for the accurate and reliable quantification of nucleocytoplasmic shuttling. This article is protected by copyright. All rights reserved.
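
    One simple ratio-of-means approach to the quantification discussed here (not necessarily either of the two methods the review describes) divides the background-subtracted mean nuclear intensity by the background-subtracted mean cytoplasmic intensity, given segmentation masks. The function name and the tiny synthetic image are illustrative assumptions:

```python
import numpy as np

def nuc_cyto_ratio(image, nuc_mask, cell_mask, bg_mask):
    """Mean nuclear / mean cytoplasmic intensity after background subtraction.

    image:     2D fluorescence intensity array
    nuc_mask:  True inside the nucleus
    cell_mask: True inside the whole cell
    bg_mask:   True in a cell-free background region
    """
    bg = image[bg_mask].mean()
    nuc = image[nuc_mask].mean() - bg
    cyto_mask = cell_mask & ~nuc_mask        # cytoplasm = cell minus nucleus
    cyto = image[cyto_mask].mean() - bg
    return nuc / cyto

# Tiny synthetic example: background 1, cytoplasm 3, nucleus 7.
img = np.ones((20, 20))
cell = np.zeros((20, 20), bool); cell[5:15, 5:15] = True
nuc = np.zeros((20, 20), bool); nuc[8:12, 8:12] = True
img[cell] = 3.0
img[nuc] = 7.0
ratio = nuc_cyto_ratio(img, nuc, cell, ~cell)   # (7-1)/(3-1) = 3.0
```

    On real images the masks would come from a nuclear counterstain and a cell outline, and the review's point about bias applies directly: mask generation and background selection should be automated or blinded rather than drawn by eye per cell.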

  11. A method for quantification of exportin-1 (XPO1) occupancy by Selective Inhibitor of Nuclear Export (SINE) compounds

    PubMed Central

    Crochiere, Marsha L.; Baloglu, Erkan; Klebanov, Boris; Donovan, Scott; del Alamo, Diego; Lee, Margaret; Kauffman, Michael; Shacham, Sharon; Landesman, Yosef

    2016-01-01

    Selective Inhibitor of Nuclear Export (SINE) compounds are a family of small molecules that inhibit nuclear export through covalent binding to cysteine 528 (Cys528) in the cargo-binding pocket of Exportin 1 (XPO1/CRM1) and promote cancer cell death. Selinexor is the lead SINE compound currently in phase I and II clinical trials for advanced solid and hematological malignancies. In an effort to understand selinexor-XPO1 interaction and to establish whether cancer cell response is a function of drug-target engagement, we developed a quantitative XPO1 occupancy assay. Biotinylated leptomycin B (b-LMB) was utilized as a tool compound to measure SINE-free XPO1. Binding to XPO1 was quantitated from SINE compound treated adherent and suspension cells in vitro, dosed ex vivo human peripheral blood mononuclear cells (PBMCs), and PBMCs from mice dosed orally with drug in vivo. Evaluation of a panel of selinexor sensitive and resistant cell lines revealed that resistance was not attributed to XPO1 occupancy by selinexor. Administration of a single dose of selinexor bound XPO1 for at least 72 hours both in vitro and in vivo. While XPO1 inhibition directly correlates with selinexor pharmacokinetics, the biological outcome of this inhibition depends on modulation of pathways downstream of XPO1, which ultimately determines cancer cell responsiveness. PMID:26654943

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Egorov, Oleg; O'Hara, Matthew J.; Grate, Jay W.

    An automated fluidic instrument is described that rapidly determines the total 99Tc content of aged nuclear waste samples, where the matrix is chemically and radiologically complex and the existing speciation of the 99Tc is variable. The monitor links microwave-assisted sample preparation with an automated anion exchange column separation and detection using a flow-through solid scintillator detector. The sample preparation steps acidify the sample, decompose organics, and convert all Tc species to the pertechnetate anion. The column-based anion exchange procedure separates the pertechnetate from the complex sample matrix, so that radiometric detection can provide accurate measurement of 99Tc. We developed a preprogrammed spike addition procedure to automatically determine matrix-matched calibration. The overall measurement efficiency that is determined simultaneously provides a self-diagnostic parameter for the radiochemical separation and overall instrument function. Continuous, automated operation was demonstrated over the course of 54 h, which resulted in the analysis of 215 samples plus 54 hourly spike-addition samples, with consistent overall measurement efficiency for the operation of the monitor. A sample can be processed and measured automatically in just 12.5 min with a detection limit of 23.5 Bq/mL of 99Tc in low activity waste (0.495 mL sample volume), with better than 10% RSD precision at concentrations above the quantification limit. This rapid automated analysis method was developed to support nuclear waste processing operations planned for the Hanford nuclear site.

  13. Novel isotopic N,N-Dimethyl Leucine (iDiLeu) Reagents Enable Absolute Quantification of Peptides and Proteins Using a Standard Curve Approach

    NASA Astrophysics Data System (ADS)

    Greer, Tyler; Lietz, Christopher B.; Xiang, Feng; Li, Lingjun

    2015-01-01

    Absolute quantification of protein targets using liquid chromatography-mass spectrometry (LC-MS) is a key component of candidate biomarker validation. One popular method combines multiple reaction monitoring (MRM) using a triple quadrupole instrument with stable isotope-labeled standards (SIS) for absolute quantification (AQUA). LC-MRM AQUA assays are sensitive and specific, but they are also expensive because of the cost of synthesizing stable isotope peptide standards. While the chemical modification approach using mass differential tags for relative and absolute quantification (mTRAQ) represents a more economical approach when quantifying large numbers of peptides, these reagents are costly and still suffer from lower throughput because only two concentration values per peptide can be obtained in a single LC-MS run. Here, we have developed and applied a set of five novel mass-difference reagents, isotopic N,N-dimethyl leucine (iDiLeu). These labels contain an amine-reactive group (a triazine ester), are cost effective because of their synthetic simplicity, and increase throughput compared with previous LC-MS quantification methods by allowing construction of a four-point standard curve in one run. iDiLeu-labeled peptides show remarkably similar retention time shifts, slightly lower energy thresholds for higher-energy collisional dissociation (HCD) fragmentation, and high quantification accuracy for trypsin-digested protein samples (median errors <15%). By spiking an iDiLeu-labeled neuropeptide, allatostatin, into mouse urine matrix, two quantification methods are validated. The first uses one labeled peptide as an internal standard to normalize labeled peptide peak areas across runs (<19% error), whereas the second enables standard curve creation and analyte quantification in one run (<8% error).

  14. PIV Uncertainty Methodologies for CFD Code Validation at the MIR Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sabharwall, Piyush; Skifton, Richard; Stoots, Carl

    2013-12-01

    Currently, computational fluid dynamics (CFD) is widely used in the nuclear thermal hydraulics field for design and safety analyses. To validate CFD codes, high quality multi-dimensional flow field data are essential. The Matched Index of Refraction (MIR) Flow Facility at Idaho National Laboratory has a unique capability to contribute to the development of validated CFD codes through the use of Particle Image Velocimetry (PIV). The significance of the MIR facility is that it permits non-intrusive velocity measurement techniques, such as PIV, through complex models without requiring probes and other instrumentation that disturb the flow. At the heart of any PIV calculation is the cross-correlation, which is used to estimate the displacement of particles in some small part of the image over the time span between two images. This image displacement is indicated by the location of the largest correlation peak. In the MIR facility, uncertainty quantification is a challenging task due to the use of optical measurement techniques. This study is developing a reliable method, and an accompanying computer code, to automatically analyze the uncertainty and sensitivity of the measured data. The main objective of this study is to develop a well-established uncertainty quantification method for the MIR Flow Facility, which involves many complicated uncertainty factors. In this study, the uncertainty sources are resolved in depth by categorizing them into uncertainties from the MIR flow loop and from the PIV system (including particle motion, image distortion, and data processing). Then, each uncertainty source is mathematically modeled or adequately defined. Finally, this study will provide a method and procedure to quantify the experimental uncertainty in the MIR Flow Facility with sample test results.
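
    The cross-correlation step described in this abstract can be sketched with FFT-based circular correlation of two interrogation windows. The window contents below are synthetic, and the integer-pixel peak search omits the sub-pixel refinement a real PIV code would add:

```python
import numpy as np

def displacement(win_a, win_b):
    """Integer-pixel displacement of win_b relative to win_a, from the
    location of the circular cross-correlation peak (computed with FFTs)."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    corr = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Indices above N/2 correspond to negative shifts (circular wrap-around).
    shift = [p if p <= n // 2 else p - n for p, n in zip(peak, corr.shape)]
    return tuple(shift)

# Synthetic check: a random particle pattern shifted by (3, -2) pixels.
rng = np.random.default_rng(1)
frame = rng.random((32, 32))
shifted = np.roll(frame, (3, -2), axis=(0, 1))
dy, dx = displacement(frame, shifted)   # → (3, -2)
```

    Real PIV processing adds windowing, sub-pixel peak fitting (e.g. a three-point Gaussian fit), and outlier validation; the uncertainty sources the abstract lists enter precisely at these stages.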

  15. Direct quantification of molar masses of copolymers by online liquid chromatography under critical conditions-nuclear magnetic resonance and size exclusion chromatography-nuclear magnetic resonance.

    PubMed

    Hehn, Mathias; Wagner, Thomas; Hiller, Wolf

    2014-01-07

    Online LCCC-NMR and SEC-NMR are compared regarding the determination of molar masses of block copolymers. Two different direct referencing methods are demonstrated for LCCC-NMR for a detailed characterization of diblock copolymers and their co-monomers. First, an intramolecular reference group was used for the direct determination of block lengths and molar masses. For the first time, it was shown that LCCC-NMR can be used for an accurate determination of Mw and Mn of copolymers. These data were in perfect agreement with SEC-NMR measurements using the same intramolecular referencing method. In contrast, the determination of molar masses with common relative methods based on calibrations with homopolymers delivered inaccurate results for all investigated diblock copolymers, due to the different hydrodynamic volumes of the diblock copolymers compared to their homopolymers. The intramolecular referencing method provided detailed insights into the co-monomer behavior during the chromatographic separation of LCCC. In particular, accurate chain lengths and chemical compositions of the "invisible" and "visible" blocks were quantified during elution under critical conditions and added new aspects to the concept of critical conditions. Second, an external reference NMR signal was used to directly determine concentrations and molar masses of the block copolymers from the chromatographic elution profile. Consequently, the intensity axes of the resulting chromatograms were converted to molar amounts and masses, allowing for determination of the amount of polymer chains with respect to elution volume, the evaluation of the limiting magnitude of concentration for LCCC-NMR, and determination of the molar masses of copolymers.

  16. Quantification of methionine and selenomethionine in biological samples using multiple reaction monitoring high performance liquid chromatography tandem mass spectrometry (MRM-HPLC-MS/MS).

    PubMed

    Vu, Dai Long; Ranglová, Karolína; Hájek, Jan; Hrouzek, Pavel

    2018-05-01

    Quantification of selenated amino acids currently relies on methods employing inductively coupled plasma mass spectrometry (ICP-MS). Although very accurate, these methods do not allow the simultaneous determination of standard amino acids, hampering the comparison of the content of selenated versus non-selenated species such as methionine (Met) and selenomethionine (SeMet). This paper reports two approaches for the simultaneous quantification of Met and SeMet. In the first approach, standard enzymatic hydrolysis employing Protease XIV was applied for the preparation of samples. The second approach utilized methanesulfonic acid (MA) for the hydrolysis of samples, either in a reflux system or in a microwave oven, followed by derivatization with diethyl ethoxymethylenemalonate. The prepared samples were then analyzed by multiple reaction monitoring high performance liquid chromatography tandem mass spectrometry (MRM-HPLC-MS/MS). Both approaches provided platforms for the accurate determination of the selenium/sulfur substitution rate in Met. Moreover, the second approach also provided accurate simultaneous quantification of Met and SeMet with a low limit of detection, low limit of quantification and wide linearity range, comparable to the commonly used gas chromatography mass spectrometry (GC-MS) method or ICP-MS. The novel method was validated using certified reference material in conjunction with the GC-MS reference method. Copyright © 2018. Published by Elsevier B.V.

  17. Capillary electrophoresis with contactless conductivity detection for the quantification of fluoride in lithium ion battery electrolytes and in ionic liquids-A comparison to the results gained with a fluoride ion-selective electrode.

    PubMed

    Pyschik, Marcelina; Klein-Hitpaß, Marcel; Girod, Sabrina; Winter, Martin; Nowak, Sascha

    2017-02-01

    In this study, an optimized capillary electrophoresis (CE) method with a contactless conductivity detector (C4D) is presented for a new application field: the quantification of fluoride in commonly used lithium ion battery (LIB) electrolytes based on LiPF6 in organic carbonate solvents, and in ionic liquids (ILs) after contact with Li metal. The method development, covering buffer selection and suitable CE conditions for the quantification of fluoride, is described. The fluoride concentrations determined in different LIB electrolyte samples were compared to the results from a fluoride ion-selective electrode (ISE). The relative standard deviations (RSDs) and recovery rates for fluoride were obtained with very high accuracy in both methods, and the fluoride concentrations in the LIB electrolytes were in very good agreement for both methods. In addition, the limit of detection (LOD) and limit of quantification (LOQ) values were determined for the CE method. The CE method was also applied to the quantification of fluoride in ILs. In the fresh IL sample, the concentration of fluoride was below the LOD. Another sample of the IL mixed with Li metal was investigated as well; it was possible to quantify the fluoride concentration in this sample. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Quantification of integrated HIV DNA by repetitive-sampling Alu-HIV PCR on the basis of Poisson statistics.

    PubMed

    De Spiegelaere, Ward; Malatinkova, Eva; Lynch, Lindsay; Van Nieuwerburgh, Filip; Messiaen, Peter; O'Doherty, Una; Vandekerckhove, Linos

    2014-06-01

    Quantification of integrated proviral HIV DNA by repetitive-sampling Alu-HIV PCR is a candidate virological tool to monitor the HIV reservoir in patients. However, the experimental procedures and data analysis of the assay are complex and hinder its widespread use. Here, we provide an improved and simplified data analysis method by adopting binomial and Poisson statistics. A modified analysis method on the basis of Poisson statistics was used to analyze the binomial data of positive and negative reactions from a 42-replicate Alu-HIV PCR, applied to dilutions of an integration standard and to samples from 57 HIV-infected patients. Results were compared with the quantitative output of the previously described Alu-HIV PCR method. Poisson-based quantification of the Alu-HIV PCR was linearly correlated with the standard dilution series, indicating that absolute quantification with the Poisson method is a valid alternative for data analysis of repetitive-sampling Alu-HIV PCR data. Quantitative outputs of patient samples assessed by the Poisson method correlated with the previously described Alu-HIV PCR analysis, indicating that this method is a valid alternative for quantifying integrated HIV DNA. Poisson-based analysis of the Alu-HIV PCR data enables absolute quantification without the need for a standard dilution curve. Implementation of confidence interval (CI) estimation permits improved qualitative analysis of the data and provides a statistical basis for the required minimal number of technical replicates. © 2014 The American Association for Clinical Chemistry.
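
    The Poisson step works because, at limiting dilution, the probability that a reaction receives zero templates is e^(-λ); inverting the observed negative fraction gives the maximum-likelihood mean copy number per reaction. A short sketch follows; the replicate counts are invented, and the delta-method confidence interval is one simple option, not necessarily the authors' exact procedure:

```python
import math

def copies_per_reaction(n_replicates, n_negative):
    """ML Poisson estimate of mean template copies per reaction:
    P(negative) = exp(-lam), so lam = -ln(n_negative / n_replicates)."""
    if n_negative == 0 or n_negative == n_replicates:
        raise ValueError("all-positive or all-negative runs cannot be quantified")
    return -math.log(n_negative / n_replicates)

def ci95(n_replicates, n_negative):
    """Approximate 95% CI for lam via the delta method on -ln(p0)."""
    p0 = n_negative / n_replicates
    lam = -math.log(p0)
    se = math.sqrt((1 - p0) / (n_replicates * p0))  # SE of -ln(p0)
    return lam - 1.96 * se, lam + 1.96 * se

# A 42-replicate run (as in the assay) with 14 negative reactions:
lam = copies_per_reaction(42, 14)   # -ln(1/3) ≈ 1.10 copies per reaction
lo, hi = ci95(42, 14)
```

    The guard clause reflects the quantifiable range of any digital/limiting-dilution assay: all-positive or all-negative replicate sets carry no finite point estimate, which is also why the number of technical replicates bounds the assay's dynamic range.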

  19. Uncertainty Quantification in Alchemical Free Energy Methods.

    PubMed

    Bhati, Agastya P; Wan, Shunzhou; Hu, Yuan; Sherborne, Brad; Coveney, Peter V

    2018-06-12

    Alchemical free energy methods have gained much importance recently from several reports of improved ligand-protein binding affinity predictions based on their implementation using molecular dynamics simulations. A large number of variants of such methods implementing different accelerated sampling techniques and free energy estimators are available, each claimed to be better than the others in its own way. However, the key features of reproducibility and quantification of associated uncertainties in such methods have barely been discussed. Here, we apply a systematic protocol for uncertainty quantification to a number of popular alchemical free energy methods, covering both absolute and relative free energy predictions. We show that a reliable measure of error estimation is provided by ensemble simulation, that is, an ensemble of independent MD simulations, and that this applies irrespective of the free energy method. The need to use ensemble methods is fundamental and holds regardless of the duration of the molecular dynamics simulations performed.
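
    The ensemble idea, taking the uncertainty from the spread across independent replica simulations rather than from within one long trajectory, can be sketched as below. The free-energy values are invented stand-ins, and the bootstrap interval is one common choice of error estimate, not the paper's prescribed estimator:

```python
import numpy as np

def ensemble_estimate(values, n_boot=10000, seed=0):
    """Ensemble mean with a bootstrap 95% confidence interval.

    `values` are per-replica free-energy estimates from independent runs;
    the spread across replicas, not the length of any single run, sets the
    reported uncertainty.
    """
    rng = np.random.default_rng(seed)
    values = np.asarray(values, dtype=float)
    # Resample replicas with replacement and collect the resampled means.
    boots = rng.choice(values, size=(n_boot, len(values))).mean(axis=1)
    lo, hi = np.percentile(boots, [2.5, 97.5])
    return values.mean(), (lo, hi)

# Hypothetical binding free energies (kcal/mol) from 5 independent replicas:
dG = [-7.2, -6.8, -7.5, -7.0, -6.9]
mean, (lo, hi) = ensemble_estimate(dG)
```

    The same resampling applies whether each replica value comes from thermodynamic integration, FEP, or any other estimator, which is the sense in which the ensemble approach is method-agnostic.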

  20. A validated ultra high pressure liquid chromatographic method for qualification and quantification of folic acid in pharmaceutical preparations.

    PubMed

    Deconinck, E; Crevits, S; Baten, P; Courselle, P; De Beer, J

    2011-04-05

    A fully validated UHPLC method for the identification and quantification of folic acid in pharmaceutical preparations was developed. The starting conditions for the development were derived from the HPLC conditions of a previously validated method. These starting conditions were tested on four different UHPLC columns: Grace Vision HT™ C18-P, C18, C18-HL and C18-B (2 mm × 100 mm, 1.5 μm). After selection of the stationary phase, the method was further optimised by testing two aqueous and two organic phases and by adapting it to a gradient method. The obtained method was fully validated based on its measurement uncertainty (accuracy profile) and robustness tests. The resulting UHPLC method for the identification and quantification of folic acid in pharmaceutical preparations will cut analysis times and solvent consumption. Copyright © 2010 Elsevier B.V. All rights reserved.

  1. A simple and efficient method for poly-3-hydroxybutyrate quantification in diazotrophic bacteria within 5 minutes using flow cytometry

    PubMed Central

    Alves, L.P.S.; Almeida, A.T.; Cruz, L.M.; Pedrosa, F.O.; de Souza, E.M.; Chubatsu, L.S.; Müller-Santos, M.; Valdameri, G.

    2017-01-01

    The conventional method for quantification of polyhydroxyalkanoates based on whole-cell methanolysis and gas chromatography (GC) is laborious and time-consuming. In this work, a method based on flow cytometry of Nile red stained bacterial cells was established to quantify poly-3-hydroxybutyrate (PHB) production by the diazotrophic and plant-associated bacteria, Herbaspirillum seropedicae and Azospirillum brasilense. The method consists of three steps: i) cell permeabilization, ii) Nile red staining, and iii) analysis by flow cytometry. The method was optimized step-by-step and can be carried out in less than 5 min. The final results indicated a high correlation coefficient (R² = 0.99) compared to a standard method based on methanolysis and GC. This method was successfully applied to the quantification of PHB in epiphytic bacteria isolated from rice roots. PMID:28099582

  2. Separation and quantification of monothiols and phytochelatins from a wide variety of cell cultures and tissues of trees and other plants using high performance liquid chromatography

    Treesearch

    Rakesh Minocha; P. Thangavel; Om Parkash Dhankher; Stephanie Long

    2008-01-01

    The HPLC method presented here for the quantification of metal-binding thiols is considerably shorter than most previously published methods. It is a sensitive and highly reproducible method that separates monobromobimane tagged monothiols (cysteine, glutathione, γ-glutamylcysteine) along with polythiols (PC2, PC3...

  3. An alternative method for irones quantification in iris rhizomes using headspace solid-phase microextraction.

    PubMed

    Roger, B; Fernandez, X; Jeannot, V; Chahboun, J

    2010-01-01

    The essential oil obtained from iris rhizomes is one of the most precious raw materials for the perfume industry. Its fragrance is due to irones, which are gradually formed by oxidative degradation of iridals during rhizome ageing. The aim of this work was to develop an alternative method for irone quantification in iris rhizomes using headspace solid-phase microextraction coupled to gas chromatography (HS-SPME-GC). The method was developed using the results obtained from a conventional method, i.e. a solid-liquid extraction (SLE) followed by irone quantification by GC. Among several calibration methods tested, internal calibration gave the best results and was the least sensitive to the matrix effect. The proposed HS-SPME-GC method is as accurate and reproducible as the conventional SLE method. These two methods were used to monitor and compare irone concentrations in iris rhizomes that had been stored for 6 months to 9 years. Irone quantification in iris rhizomes can thus be achieved using HS-SPME-GC, and the method can be used for the quality control of iris rhizomes. It offers the advantage of combining extraction and analysis in an automated device and thus allows a large number of rhizome batches to be analysed and compared in a limited amount of time. Copyright © 2010 John Wiley & Sons, Ltd.

  4. Quantification of immobilized Candida antarctica lipase B (CALB) using ICP-AES combined with Bradford method.

    PubMed

    Nicolás, Paula; Lassalle, Verónica L; Ferreira, María L

    2017-02-01

    The aim of this manuscript was to study the application of a new method of protein quantification to commercial Candida antarctica lipase B solutions. Error sources associated with the traditional Bradford technique were demonstrated. Eight biocatalysts based on C. antarctica lipase B (CALB) immobilized onto magnetite nanoparticles were used. Magnetite nanoparticles were coated with chitosan (CHIT) and modified with glutaraldehyde (GLUT) and aminopropyltriethoxysilane (APTS). CALB was then adsorbed on the modified support. The proposed novel protein quantification method included the determination of sulfur (from the protein in CALB solution) by inductively coupled plasma atomic emission spectroscopy (ICP-AES). Four different protocols were applied, combining ICP-AES and classical Bradford assays, besides carbon, hydrogen and nitrogen (CHN) analysis. The calculated error in protein content using the "classic" Bradford method with bovine serum albumin as standard ranged from 400 to 1200% when protein in CALB solution was quantified. These errors were calculated considering as "true protein content values" the results of the amount of immobilized protein obtained with the improved method. The optimum quantification procedure involved the combination of the Bradford method, ICP-AES and CHN analysis. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. Comparison of biochemical and microscopic methods for quantification of mycorrhizal fungi in soil and roots

    USDA-ARS?s Scientific Manuscript database

    Arbuscular mycorrhizal fungi (AMF) are well-known plant symbionts which provide enhanced phosphorus uptake as well as other benefits to their host plants. Quantification of mycorrhizal biomass and root colonization has traditionally been performed by root staining and microscopic examination methods...

  6. THE QUANTIFICATION OF AQUEOUS TRACERS IN LABORATORY AQUIFER MODELS USING LIGHT TRANSMISSION VISUALIZATION METHOD

    EPA Science Inventory

    The quantification of solute concentrations in laboratory aquifer models has been largely limited to the use of sampling ports, from which samples are collected for external analysis. One of the drawbacks to this method is that the act of sampling may disturb plume dynamics and ...

  7. 21 CFR 530.24 - Procedure for announcing analytical methods for drug residue quantification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) ANIMAL DRUGS, FEEDS, AND RELATED PRODUCTS EXTRALABEL DRUG USE IN ANIMALS Specific Provisions Relating to Extralabel Use of Animal and Human Drugs in Food-Producing Animals § 530.24 Procedure for announcing analytical methods for drug residue quantification. (a...

  8. Thermal, High Pressure, and Electric Field Processing Effects on Plant Cell Membrane Integrity and Relevance to Fruit and Vegetable Quality

    PubMed Central

    Gonzalez, Maria E; Barrett, Diane M

    2010-01-01

    Advanced food processing methods that accomplish inactivation of microorganisms but minimize adverse thermal exposure are of great interest to the food industry. High pressure (HP) and pulsed electric field (PEF) processing are commercially applied to produce high quality fruit and vegetable products in the United States, Europe, and Japan. Both microbial and plant cell membranes are significantly altered following exposure to heat, HP, or PEF. Our research group sought to quantify the degree of damage to plant cell membranes that occurs as a result of exposure to heat, HP, or PEF, using the same analytical methods. In order to evaluate whether new advanced processing methods are superior to traditional thermal processing methods, it is necessary to compare them. In this review, we describe the existing state of knowledge related to effects of heat, HP, and PEF on both microbial and plant cells. The importance and relevance of compartmentalization in plant cells as it relates to fruit and vegetable quality is described and various methods for quantification of plant cell membrane integrity are discussed. These include electrolyte leakage, cell viability, and proton nuclear magnetic resonance (1H-NMR). PMID:20492210
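
    One of the membrane-integrity measures mentioned above, electrolyte leakage, is commonly expressed as the conductivity rise of the bathing solution relative to the total conductivity released after full tissue disruption. A minimal sketch with invented conductivity values (the function name and numbers are illustrative assumptions):

```python
def electrolyte_leakage(cond_sample, cond_control, cond_total):
    """Leakage index in [0, 1]: conductivity rise of the bathing solution
    relative to the total electrolytes releasable by full tissue disruption
    (e.g. boiling or freeze-thaw of the same sample)."""
    if not (cond_control <= cond_sample <= cond_total):
        raise ValueError("expected cond_control <= cond_sample <= cond_total")
    return (cond_sample - cond_control) / (cond_total - cond_control)

# Hypothetical conductivities (uS/cm) for a heat-treated tissue sample:
# untreated control 40, after treatment 180, after full disruption 400.
leak = electrolyte_leakage(180.0, 40.0, 400.0)   # (180-40)/(400-40) ≈ 0.389
```

    Normalizing by the fully disrupted value is what makes treatments comparable across samples of different size and tissue type, which is the point of using a common analytical framework for heat, HP, and PEF.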

  9. Thermal, high pressure, and electric field processing effects on plant cell membrane integrity and relevance to fruit and vegetable quality.

    PubMed

    Gonzalez, Maria E; Barrett, Diane M

    2010-09-01

    Advanced food processing methods that accomplish inactivation of microorganisms but minimize adverse thermal exposure are of great interest to the food industry. High pressure (HP) and pulsed electric field (PEF) processing are commercially applied to produce high quality fruit and vegetable products in the United States, Europe, and Japan. Both microbial and plant cell membranes are significantly altered following exposure to heat, HP, or PEF. Our research group sought to quantify the degree of damage to plant cell membranes that occurs as a result of exposure to heat, HP, or PEF, using the same analytical methods. In order to evaluate whether new advanced processing methods are superior to traditional thermal processing methods, it is necessary to compare them. In this review, we describe the existing state of knowledge related to effects of heat, HP, and PEF on both microbial and plant cells. The importance and relevance of compartmentalization in plant cells as it relates to fruit and vegetable quality is described and various methods for quantification of plant cell membrane integrity are discussed. These include electrolyte leakage, cell viability, and proton nuclear magnetic resonance (¹H-NMR).

  10. A Probabilistic Framework for Peptide and Protein Quantification from Data-Dependent and Data-Independent LC-MS Proteomics Experiments

    PubMed Central

    Richardson, Keith; Denny, Richard; Hughes, Chris; Skilling, John; Sikora, Jacek; Dadlez, Michał; Manteca, Angel; Jung, Hye Ryung; Jensen, Ole Nørregaard; Redeker, Virginie; Melki, Ronald; Langridge, James I.; Vissers, Johannes P.C.

    2013-01-01

    A probability-based quantification framework is presented for the calculation of relative peptide and protein abundance in label-free and label-dependent LC-MS proteomics data. The results are accompanied by credible intervals and regulation probabilities. The algorithm takes into account data uncertainties via Poisson statistics modified by a noise contribution that is determined automatically during an initial normalization stage. Protein quantification relies on assignments of component peptides to the acquired data. These assignments are generally of variable reliability and may not be present across all of the experiments comprising an analysis. It is also possible for a peptide to be identified to more than one protein in a given mixture. For these reasons the algorithm accepts a prior probability of peptide assignment for each intensity measurement. The model is constructed in such a way that outliers of any type can be automatically reweighted. Two discrete normalization methods can be employed. The first method is based on a user-defined subset of peptides, while the second method relies on the presence of a dominant background of endogenous peptides for which the concentration is assumed to be unaffected. Normalization is performed using the same computational and statistical procedures employed by the main quantification algorithm. The performance of the algorithm will be illustrated on example data sets, and its utility demonstrated for typical proteomics applications. The quantification algorithm supports relative protein quantification based on precursor and product ion intensities acquired by means of data-dependent methods, originating from all common isotopically-labeled approaches, as well as label-free ion intensity-based data-independent methods. PMID:22871168

  11. Novel Quantitative Real-Time LCR for the Sensitive Detection of SNP Frequencies in Pooled DNA: Method Development, Evaluation and Application

    PubMed Central

    Psifidi, Androniki; Dovas, Chrysostomos; Banos, Georgios

    2011-01-01

Background: Single nucleotide polymorphisms (SNP) have proven to be powerful genetic markers for genetic applications in medicine, life science and agriculture. A variety of methods exist for SNP detection but few can quantify SNP frequencies when the mutated DNA molecules correspond to a small fraction of the wild-type DNA. Furthermore, there is no generally accepted gold standard for SNP quantification, and, in general, currently applied methods give inconsistent results in selected cohorts. In the present study we sought to develop a novel method for accurate detection and quantification of SNP in DNA pooled samples. Methods: The development and evaluation of a novel Ligase Chain Reaction (LCR) protocol that uses a DNA-specific fluorescent dye to allow quantitative real-time analysis is described. Different reaction components and thermocycling parameters affecting the efficiency and specificity of LCR were examined. Several protocols, including gap-LCR modifications, were evaluated using plasmid standard and genomic DNA pools. A protocol of choice was identified and applied for the quantification of a polymorphism at codon 136 of the ovine PRNP gene that is associated with susceptibility to a transmissible spongiform encephalopathy in sheep. Conclusions: The real-time LCR protocol developed in the present study showed high sensitivity, accuracy, reproducibility and a wide dynamic range of SNP quantification in different DNA pools. The limits of detection and quantification of SNP frequencies were 0.085% and 0.35%, respectively. Significance: The proposed real-time LCR protocol is applicable when sensitive detection and accurate quantification of low copy number mutations in DNA pools is needed. Examples include oncogenes and tumour suppressor genes, infectious diseases, pathogenic bacteria, fungal species, viral mutants, drug resistance resulting from point mutations, and genetically modified organisms in food. PMID:21283808

  12. Quantification of Kryptofix 2.2.2 in [18F]fluorine-labelled radiopharmaceuticals by rapid-resolution liquid chromatography.

    PubMed

    Lao, Yexing; Yang, Cuiping; Zou, Wei; Gan, Manquan; Chen, Ping; Su, Weiwei

    2012-05-01

The cryptand Kryptofix 2.2.2 is used extensively as a phase-transfer reagent in the preparation of [18F]fluoride-labelled radiopharmaceuticals. However, it has considerable acute toxicity. The aim of this study was to develop and validate a method for rapid (within 1 min), specific and sensitive quantification of Kryptofix 2.2.2 at trace levels. Chromatographic separations were carried out by rapid-resolution liquid chromatography (Agilent ZORBAX SB-C18 rapid-resolution column, 2.1 × 30 mm, 3.5 μm). Tandem mass spectra were acquired using a triple quadrupole mass spectrometer equipped with an electrospray ionization interface. Quantitative mass spectrometric analysis was conducted in positive ion mode and multiple reaction monitoring mode for the m/z 377.3 → 114.1 transition for Kryptofix 2.2.2. The external standard method was used for quantification. The method met the precision and efficiency requirements for PET radiopharmaceuticals, providing satisfactory results for specificity, matrix effect, stability, linearity (0.5-100 ng/ml, r²=0.9975), precision (coefficient of variation < 5%), accuracy (relative error < ± 3%), sensitivity (lower limit of quantification=0.5 ng) and detection time (<1 min). Fluorodeoxyglucose (n=6) was analysed, and the Kryptofix 2.2.2 content was found to be well below the maximum permissible levels approved by the US Food and Drug Administration. The developed method has a short analysis time (<1 min) and high sensitivity (lower limit of quantification=0.5 ng/ml) and can be successfully applied to rapid quantification of Kryptofix 2.2.2 at trace levels in fluorodeoxyglucose. This method could also be applied to other [18F]fluorine-labelled radiopharmaceuticals that use Kryptofix 2.2.2 as a phase-transfer reagent.
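The external-standard quantification used here reduces to a linear calibration curve: fit peak area against standard concentration, report r², and back-calculate the unknown. A generic sketch (the calibration values below are illustrative, not the paper's validation data):

```python
import numpy as np

def external_standard_quantify(std_conc, std_area, sample_area):
    """Fit a linear calibration curve (area vs. concentration), return the
    back-calculated sample concentration and the r-squared of the fit."""
    std_conc = np.asarray(std_conc, dtype=float)
    std_area = np.asarray(std_area, dtype=float)
    slope, intercept = np.polyfit(std_conc, std_area, 1)
    pred = slope * std_conc + intercept
    ss_res = np.sum((std_area - pred) ** 2)
    ss_tot = np.sum((std_area - std_area.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    # Invert the calibration line for the unknown sample
    return (sample_area - intercept) / slope, r2
```

In practice the calibration range would match the validated linear range (here 0.5-100 ng/ml) and r² would be checked against the acceptance criterion before reporting the result.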

  13. Novel approach in k0-NAA for highly concentrated REE Samples.

    PubMed

    Abdollahi Neisiani, M; Latifi, M; Chaouki, J; Chilian, C

    2018-04-01

This paper presents a new k0-NAA approach for accurate quantification with short turnaround analysis times for rare earth elements (REEs) in high-content mineral matrices. REE k0 and Q0 values, spectral interferences and nuclear interferences were experimentally evaluated and improved with Alfa Aesar Specpure Plasma Standard 1000 mg kg⁻¹ mono-rare-earth solutions. The new iterative gamma-ray self-attenuation and neutron self-shielding methods were investigated with powder standards prepared from 100 mg of 99.9% Alfa Aesar mono-rare-earth oxide diluted with silica. The overall performance of the new k0-NAA method for REEs was validated using a certified reference material (CRM) from the Canadian Certified Reference Materials Project (REE-2) with REE content ranging from 7.2 mg kg⁻¹ for Yb to 9610 mg kg⁻¹ for Ce. The REE concentrations were determined with uncertainty below 7% (at 95% confidence level) and showed good consistency with the CRM certified concentrations. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Methods in Clinical Pharmacology Series

    PubMed Central

    Beaumont, Claire; Young, Graeme C; Cavalier, Tom; Young, Malcolm A

    2014-01-01

    Human radiolabel studies are traditionally conducted to provide a definitive understanding of the human absorption, distribution, metabolism and excretion (ADME) properties of a drug. However, advances in technology over the past decade have allowed alternative methods to be employed to obtain both clinical ADME and pharmacokinetic (PK) information. These include microdose and microtracer approaches using accelerator mass spectrometry, and the identification and quantification of metabolites in samples from classical human PK studies using technologies suitable for non-radiolabelled drug molecules, namely liquid chromatography-mass spectrometry and nuclear magnetic resonance spectroscopy. These recently developed approaches are described here together with relevant examples primarily from experiences gained in support of drug development projects at GlaxoSmithKline. The advantages of these study designs together with their limitations are described. We also discuss special considerations which should be made for a successful outcome to these new approaches and also to the more traditional human radiolabel study in order to maximize knowledge around the human ADME properties of drug molecules. PMID:25041729

  15. Scalable Methods for Uncertainty Quantification, Data Assimilation and Target Accuracy Assessment for Multi-Physics Advanced Simulation of Light Water Reactors

    NASA Astrophysics Data System (ADS)

    Khuwaileh, Bassam

High fidelity simulation of nuclear reactors entails large scale applications characterized by high dimensionality and tremendous complexity, where various physics models are integrated in the form of coupled models (e.g. neutronics with thermal-hydraulic feedback). Each of the coupled modules represents a high fidelity formulation of the first principles governing the physics of interest. Therefore, new developments in high fidelity multi-physics simulation and the corresponding sensitivity/uncertainty quantification analysis are paramount to the development and competitiveness of reactors, achieved through enhanced understanding of the design and safety margins. Accordingly, this dissertation introduces efficient, scalable algorithms for performing Uncertainty Quantification (UQ), Data Assimilation (DA) and Target Accuracy Assessment (TAA) for large scale, multi-physics reactor design and safety problems. This dissertation builds upon previous efforts for adaptive core simulation and reduced order modeling algorithms and extends these efforts towards coupled multi-physics models with feedback. The core idea is to recast the reactor physics analysis in terms of reduced order models. This is achieved by identifying the important/influential degrees of freedom (DoF) via subspace analysis, such that the required analysis can be recast by considering the important DoF only. In this dissertation, efficient algorithms for lower dimensional subspace construction have been developed for single physics and multi-physics applications with feedback. Then the reduced subspace is used to solve realistic, large scale forward (UQ) and inverse problems (DA and TAA). Once the elite set of DoF is determined, the uncertainty/sensitivity/target accuracy assessment and data assimilation analysis can be performed accurately and efficiently for large scale, high dimensional multi-physics nuclear engineering applications.
Hence, in this work a Karhunen-Loeve (KL) based algorithm previously developed to quantify the uncertainty for single physics models is extended to large scale multi-physics coupled problems with feedback effects. Moreover, a non-linear surrogate-based UQ approach is developed and compared to the performance of the KL approach and a brute-force Monte Carlo (MC) approach. In addition, an efficient Data Assimilation (DA) algorithm is developed to assimilate information about model parameters: nuclear data cross-sections and thermal-hydraulics parameters. Two improvements are introduced in order to perform DA on high dimensional problems. First, a goal-oriented surrogate model can be used to replace the original models in the depletion sequence (MPACT, COBRA-TF, ORIGEN). Second, approximating the complex, high dimensional solution space with a lower dimensional subspace makes the sampling required for DA tractable for high dimensional problems. Moreover, safety analysis and design optimization depend on the accurate prediction of various reactor attributes. Predictions can be enhanced by reducing the uncertainty associated with the attributes of interest. Accordingly, an inverse problem can be defined and solved to assess the contributions from sources of uncertainty, and experimental effort can subsequently be directed to further improve the uncertainty associated with these sources. In this dissertation a subspace-based, gradient-free, nonlinear algorithm for inverse uncertainty quantification, namely Target Accuracy Assessment (TAA), has been developed and tested. The ideas proposed in this dissertation were first validated using lattice physics applications simulated with the SCALE6.1 package (Pressurized Water Reactor (PWR) and Boiling Water Reactor (BWR) lattice models).
Ultimately, the algorithms proposed here were applied to perform UQ and DA for assembly-level (CASL Progression Problem Number 6) and core-wide problems representing Watts Bar Nuclear 1 (WBN1) for cycle 1 of depletion (CASL Progression Problem Number 9), modeled and simulated with VERA-CS, which consists of several coupled multi-physics models. The analysis and algorithms developed in this dissertation were implemented in a newly developed toolkit, the Reduced Order Modeling based Uncertainty/Sensitivity Estimator (ROMUSE).
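The subspace construction at the heart of this reduced-order approach can be illustrated with a toy gradient-based example: collect gradient samples of a model output, take an SVD, and keep only the directions carrying most of the variance. The model, dimensions, and energy cutoff below are hypothetical, a sketch of the general idea rather than the dissertation's algorithms.

```python
import numpy as np

def influential_subspace(grad_samples, energy=0.99):
    """Return basis vectors for the influential degrees of freedom:
    the leading right singular vectors capturing `energy` of the
    squared singular-value mass of the gradient samples."""
    _, s, vt = np.linalg.svd(np.asarray(grad_samples), full_matrices=False)
    keep = np.searchsorted(np.cumsum(s**2) / np.sum(s**2), energy) + 1
    return vt[:keep].T  # shape (n_params, keep)

# Toy model: the output depends almost entirely on one input direction.
rng = np.random.default_rng(1)
w = np.zeros(10)
w[0], w[1] = 5.0, 0.1
grads = np.array([w + 0.01 * rng.standard_normal(10) for _ in range(50)])
basis = influential_subspace(grads)
```

Forward UQ or DA sampling would then be performed in the low-dimensional coordinates spanned by `basis` instead of the full parameter space.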

  16. A phytochemical comparison of saw palmetto products using gas chromatography and 1H nuclear magnetic resonance spectroscopy metabolomic profiling

    PubMed Central

    Booker, Anthony; Suter, Andy; Krnjic, Ana; Strassel, Brigitte; Zloh, Mire; Said, Mazlina; Heinrich, Michael

    2014-01-01

Objectives: Preparations containing saw palmetto berries are used in the treatment of benign prostatic hyperplasia (BPH). There are many products on the market, but relatively little is known about their chemical variability, and specifically about the composition and quality of different saw palmetto products, even though in 2000 an international consultation paper from the major urological associations of the five continents on treatments for BPH demanded further research on this topic. Here, we compare two analytical approaches and characterise 57 different saw palmetto products. Methods: An established method – gas chromatography – was used for the quantification of nine fatty acids, while a novel approach of metabolomic profiling using 1H nuclear magnetic resonance (NMR) spectroscopy was used as a fingerprinting tool to assess the overall composition of the extracts. Key findings: The phytochemical analysis of the fatty acids showed a high level of heterogeneity among the different products, both in the total amount and in the nine single fatty acids. A robust and reproducible 1H NMR spectroscopy method was established, and the results showed that it was possible to statistically differentiate between saw palmetto products that had been extracted under different conditions, but not between products that used a similar extraction method. Principal component analysis was able to identify those products with significantly different metabolites. Conclusions: The metabolomic approach developed here offers novel opportunities for quality control along the value chain of saw palmetto and needs to be pursued further, since with this method the complexity of a herbal extract can be assessed better than with the analysis of a single group of constituents. PMID:24417505

  17. BATMAN--an R package for the automated quantification of metabolites from nuclear magnetic resonance spectra using a Bayesian model.

    PubMed

    Hao, Jie; Astle, William; De Iorio, Maria; Ebbels, Timothy M D

    2012-08-01

Nuclear Magnetic Resonance (NMR) spectra are widely used in metabolomics to obtain metabolite profiles in complex biological mixtures. Common methods used to assign and estimate concentrations of metabolites involve either expert manual peak fitting or extra pre-processing steps, such as peak alignment and binning. Peak fitting is very time consuming and is subject to human error. Conversely, alignment and binning can introduce artefacts and limit immediate biological interpretation of models. We present the Bayesian automated metabolite analyser for NMR spectra (BATMAN), an R package that deconvolutes peaks from one-dimensional NMR spectra, automatically assigns them to specific metabolites from a target list and obtains concentration estimates. The Bayesian model incorporates information on characteristic peak patterns of metabolites and is able to account for shifts in the position of peaks commonly seen in NMR spectra of biological samples. It applies a Markov chain Monte Carlo algorithm to sample from a joint posterior distribution of the model parameters and obtains concentration estimates with reduced error compared with conventional numerical integration and comparable to manual deconvolution by experienced spectroscopists. Availability: http://www1.imperial.ac.uk/medicine/people/t.ebbels/ Contact: t.ebbels@imperial.ac.uk.
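The deconvolution step can be loosely illustrated with a much simpler linear analogue: non-negative least squares of a spectrum against known template peak patterns. The templates and mixture below are hypothetical, and BATMAN's actual model is Bayesian with MCMC sampling, not a point estimate like this.

```python
import numpy as np
from scipy.optimize import nnls

def deconvolve_concentrations(spectrum, templates):
    """Estimate metabolite concentrations by non-negative least squares
    against template peak patterns (a linear stand-in, not BATMAN)."""
    A = np.asarray(templates, dtype=float).T  # columns = metabolite templates
    coeffs, _ = nnls(A, np.asarray(spectrum, dtype=float))
    return coeffs

# Two hypothetical metabolite peak templates and a mixed spectrum
templates = [[1, 2, 1, 0, 0], [0, 0, 1, 2, 1]]
spectrum = [3, 6, 3.5, 1, 0.5]  # 3x the first template + 0.5x the second
conc = deconvolve_concentrations(spectrum, templates)
```

The Bayesian treatment adds what this sketch lacks: peak-position shifts, priors on peak patterns, and full posterior uncertainty on the concentration estimates.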

  18. Coupled Segmentation of Nuclear and Membrane-bound Macromolecules through Voting and Multiphase Level Set

    PubMed Central

    Wen, Quan

    2014-01-01

Membrane-bound macromolecules play an important role in tissue architecture and cell-cell communication, and are regulated by almost one-third of the genome. At the optical scale, one group of membrane proteins express themselves as linear structures along the cell surface boundaries, while others are sequestered; this paper targets the former group. Segmentation of these membrane proteins on a cell-by-cell basis enables the quantitative assessment of localization for comparative analysis. However, such membrane proteins typically lack continuity, and their intensity distributions are often very heterogeneous; moreover, nuclei can form large clumps, which further impedes the quantification of membrane signals on a cell-by-cell basis. To tackle these problems, we introduce a three-step process to (i) regularize the membrane signal through iterative tangential voting, (ii) constrain the location of surface proteins by nuclear features, where clumps of nuclei are segmented through a Delaunay triangulation approach, and (iii) assign membrane-bound macromolecules to individual cells through an application of multi-phase geodesic level sets. We have validated our method using both synthetic data and a dataset of 200 images, and are able to demonstrate the efficacy of our approach with superior performance. PMID:25530633

  19. Bayesian forecasting and uncertainty quantifying of stream flows using Metropolis-Hastings Markov Chain Monte Carlo algorithm

    NASA Astrophysics Data System (ADS)

    Wang, Hongrui; Wang, Cheng; Wang, Ying; Gao, Xiong; Yu, Chen

    2017-06-01

This paper presents a Bayesian approach using the Metropolis-Hastings Markov Chain Monte Carlo algorithm and applies it to daily river flow rate forecasting and uncertainty quantification for Zhujiachuan River, using data collected from Qiaotoubao Gage Station and 13 other gage stations in the Zhujiachuan watershed in China. The proposed method is also compared with conventional maximum likelihood estimation (MLE) for parameter estimation and quantification of the associated uncertainties. While the Bayesian method performs similarly in estimating the mean value of daily flow rate, it outperforms the conventional MLE method in uncertainty quantification, providing a narrower credible interval than the MLE confidence interval and thus a more precise estimate by using the related information from regional gage stations. The Bayesian MCMC method may therefore be more favorable for uncertainty analysis and risk management.
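The Metropolis-Hastings algorithm named above can be sketched generically: propose a random-walk step, then accept it with probability given by the posterior ratio. This is the textbook algorithm on a toy one-dimensional posterior, not the authors' hydrological implementation.

```python
import numpy as np

def metropolis_hastings(log_post, x0, n_steps=20_000, step=0.5, seed=0):
    """Random-walk Metropolis-Hastings sampler for a 1-D log-posterior."""
    rng = np.random.default_rng(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n_steps):
        prop = x + step * rng.standard_normal()
        lp_prop = log_post(prop)
        # Accept with probability min(1, posterior ratio)
        if np.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        samples.append(x)
    return np.array(samples)

# Toy posterior: standard normal, started away from the mode.
draws = metropolis_hastings(lambda x: -0.5 * x * x, x0=3.0)
```

After discarding burn-in, quantiles of the retained draws give the credible intervals that the paper contrasts with MLE confidence intervals.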

  20. Plasma protein absolute quantification by nano-LC Q-TOF UDMSE for clinical biomarker verification

    PubMed Central

    ILIES, MARIA; IUGA, CRISTINA ADELA; LOGHIN, FELICIA; DHOPLE, VISHNU MUKUND; HAMMER, ELKE

    2017-01-01

Background and aims: Proteome-based biomarker studies target proteins that could serve as diagnostic, prognostic, and predictive molecules. In the clinical routine, immunoassays are currently used for the absolute quantification of such biomarkers, with the major limitation that only one molecule can be targeted per assay. The aim of our study was to test a mass spectrometry based absolute quantification method for the verification of plasma protein sets which might serve as reliable biomarker panels for clinical practice. Methods: Six EDTA plasma samples were analyzed after tryptic digestion using a high throughput data independent acquisition nano-LC Q-TOF UDMSE proteomics approach. Synthetic Escherichia coli standard peptides were spiked in each sample for the absolute quantification. Data analysis was performed using ProgenesisQI v2.0 software (Waters Corporation). Results: Our method ensured absolute quantification of 242 non-redundant plasma proteins in a single-run analysis. The dynamic range covered was 10⁵. 86% were represented by classical plasma proteins. The overall median coefficient of variation was 0.36, while a set of 63 proteins was found to be highly stable. Absolute protein concentrations strongly correlated with values reviewed in the literature. Conclusions: Nano-LC Q-TOF UDMSE proteomic analysis can be used for a simple and rapid determination of absolute amounts of plasma proteins. A large number of plasma proteins could be analyzed, while a wide dynamic range was covered with a low coefficient of variation at the protein level. The method proved to be a reliable tool for the quantification of protein panels for biomarker verification in clinical practice. PMID:29151793
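The spiked-standard calculation behind this kind of absolute quantification can be sketched simply: scale the analyte signal by the known amount of the spiked standard, then summarize run-to-run variability as a coefficient of variation. The signals and standard amount below are hypothetical; the study's actual normalization is handled inside ProgenesisQI.

```python
import numpy as np

def absolute_amounts(analyte_signals, standard_signals, standard_amount_fmol):
    """Per-run absolute amounts from the intensity ratio to a spiked
    standard of known amount, plus the CV across runs (generic sketch)."""
    amounts = (standard_amount_fmol
               * np.asarray(analyte_signals, dtype=float)
               / np.asarray(standard_signals, dtype=float))
    cv = amounts.std(ddof=1) / amounts.mean()
    return amounts, cv
```

A protein would be counted among the "highly stable" set when its CV across replicate runs stays low.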

  1. Establishing Ion Ratio Thresholds Based on Absolute Peak Area for Absolute Protein Quantification using Protein Cleavage Isotope Dilution Mass Spectrometry

    PubMed Central

    Loziuk, Philip L.; Sederoff, Ronald R.; Chiang, Vincent L.; Muddiman, David C.

    2014-01-01

Quantitative mass spectrometry has become central to the field of proteomics and metabolomics. Selected reaction monitoring is a widely used method for the absolute quantification of proteins and metabolites. This method renders high specificity using several product ions measured simultaneously. With growing interest in quantification of molecular species in complex biological samples, confident identification and quantitation have been of particular concern. A method to confirm purity or contamination of product ion spectra has become necessary for achieving accurate and precise quantification. Ion abundance ratio assessments were introduced to alleviate some of these issues. Ion abundance ratios are based on the consistent relative abundance (RA) of specific product ions with respect to the total abundance of all product ions. To date, no standardized method of implementing ion abundance ratios has been established. Thresholds by which product ion contamination is confirmed vary widely and are often arbitrary. This study sought to establish criteria by which the relative abundance of product ions can be evaluated in an absolute quantification experiment. These findings suggest that evaluation of the absolute ion abundance for any given transition is necessary in order to effectively implement RA thresholds. Overall, the variation of the RA value was observed to be relatively constant beyond an absolute threshold ion abundance. Finally, these RA values were observed to fluctuate significantly over a 3-year period, suggesting that they should be assessed as close as possible to the time at which data are collected for quantification. PMID:25154770
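The RA bookkeeping described above amounts to normalizing each product ion's area by the summed area of all monitored transitions and flagging deviations from a reference pattern. A minimal sketch, with a placeholder tolerance; the paper's point is precisely that such thresholds should also account for absolute ion abundance rather than being fixed:

```python
import numpy as np

def relative_abundances(product_ion_areas):
    """RA of each product ion: its area over the summed area of all
    monitored transitions."""
    areas = np.asarray(product_ion_areas, dtype=float)
    return areas / areas.sum()

def flag_contamination(observed_areas, reference_ra, tol=0.10):
    """Flag transitions whose RA deviates from the reference pattern by
    more than `tol` (placeholder threshold for illustration)."""
    deviation = np.abs(relative_abundances(observed_areas) - np.asarray(reference_ra))
    return deviation > tol
```

A contaminated transition inflates its own RA and, because RAs sum to one, depresses the others, so a single interference can trip flags on more than one transition.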

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, C. S.; Zhang, Hongbin

Uncertainty quantification and sensitivity analysis are important for nuclear reactor safety design and analysis. A 2x2 fuel assembly core design was developed and simulated by the Virtual Environment for Reactor Applications, Core Simulator (VERA-CS) coupled neutronics and thermal-hydraulics code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). An approach to uncertainty quantification and sensitivity analysis with VERA-CS was developed, and a new toolkit was created to perform uncertainty quantification and sensitivity analysis with fourteen uncertain input parameters. The minimum departure from nucleate boiling ratio (MDNBR), maximum fuel center-line temperature, and maximum outer clad surface temperature were chosen as the figures of merit. Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in the sensitivity analysis, and coolant inlet temperature was consistently the most influential parameter. Parameters used as inputs to the critical heat flux calculation with the W-3 correlation were shown to be the most influential on the MDNBR, maximum fuel center-line temperature, and maximum outer clad surface temperature.
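The correlation-based screening used here can be sketched generically: correlate each sampled input parameter with a figure of merit and rank by magnitude. The toy model below is an assumption for illustration, not the VERA-CS study data.

```python
import numpy as np
from scipy import stats

def sensitivity_correlations(inputs, output):
    """Pearson and Spearman correlation of each uncertain input column
    with a figure of merit (generic screening sketch)."""
    X = np.asarray(inputs, dtype=float)   # shape (n_samples, n_params)
    y = np.asarray(output, dtype=float)
    pearson = np.array([stats.pearsonr(X[:, j], y)[0] for j in range(X.shape[1])])
    spearman = np.array([stats.spearmanr(X[:, j], y)[0] for j in range(X.shape[1])])
    return pearson, spearman
```

The most influential parameter is then the column with the largest absolute coefficient; partial correlations would additionally control for the other inputs.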

  3. A Posteriori Error Analysis and Uncertainty Quantification for Adaptive Multiscale Operator Decomposition Methods for Multiphysics Problems

    DTIC Science & Technology

    2014-04-01

TR-14-33: A Posteriori Error Analysis and Uncertainty Quantification for Adaptive Multiscale Operator Decomposition Methods for Multiphysics Problems. April 2014. HDTRA1-09-1-0036. Donald Estep and Michael. Approved for public release, distribution is unlimited. Related publication: Barrier methods for critical exponent problems in geometric analysis and mathematical physics, J. Erway and M. Holst, submitted for publication.

  4. Simple, Fast, and Sensitive Method for Quantification of Tellurite in Culture Media▿

    PubMed Central

    Molina, Roberto C.; Burra, Radhika; Pérez-Donoso, José M.; Elías, Alex O.; Muñoz, Claudia; Montes, Rebecca A.; Chasteen, Thomas G.; Vásquez, Claudio C.

    2010-01-01

A fast, simple, and reliable chemical method for tellurite quantification is described. The procedure is based on the NaBH₄-mediated reduction of TeO₃²⁻ followed by the spectrophotometric determination of elemental tellurium in solution. The method is highly reproducible, is stable at different pH values, and exhibits linearity over a broad range of tellurite concentrations. PMID:20525868

  5. Automated renal histopathology: digital extraction and quantification of renal pathology

    NASA Astrophysics Data System (ADS)

    Sarder, Pinaki; Ginley, Brandon; Tomaszewski, John E.

    2016-03-01

The branch of pathology concerned with excess blood serum proteins being excreted in the urine pays particular attention to the glomerulus, a small intertwined bunch of capillaries located at the beginning of the nephron. Normal glomeruli allow moderate amounts of blood proteins to be filtered; proteinuric glomeruli allow large amounts of blood proteins to be filtered. Diagnosis of proteinuric diseases requires time-intensive manual examination of the structural compartments of the glomerulus from renal biopsies. Pathological examination includes cellularity of individual compartments, Bowman's and luminal space segmentation, cellular morphology, glomerular volume, capillary morphology, and more. Long examination times may lead to increased diagnosis time and/or reduced precision of the diagnostic process. Automatic quantification holds strong potential to reduce renal diagnostic time. We have developed a computational pipeline capable of automatically segmenting relevant features from renal biopsies. Our method first segments glomerular compartments from renal biopsies by isolating regions with high nuclear density. Gabor texture segmentation is used to accurately define glomerular boundaries. Bowman's and luminal spaces are segmented using morphological operators. Nuclear structures are segmented using color deconvolution, morphological processing, and bottleneck detection. Average computation time of feature extraction for a typical biopsy, comprising ~12 glomeruli, is ~69 s using an Intel(R) Core(TM) i7-4790 CPU, ~65× faster than manual processing. Using images from rat renal tissue samples, automatic glomerular structural feature estimation was reproducibly demonstrated on 15 biopsy images containing 148 individual glomerulus images. The proposed method holds immense potential to enhance the information available while making clinical diagnoses.

  6. Automated quantification of renal interstitial fibrosis for computer-aided diagnosis: A comprehensive tissue structure segmentation method.

    PubMed

    Tey, Wei Keat; Kuang, Ye Chow; Ooi, Melanie Po-Leen; Khoo, Joon Joon

    2018-03-01

Interstitial fibrosis in renal biopsy samples is a scarring tissue structure that may be visually quantified by pathologists as an indicator of the presence and extent of chronic kidney disease. The standard method of quantification by visual evaluation presents reproducibility issues in the diagnoses due to the uncertainties in human judgement. This study proposes an automated quantification system for measuring the amount of interstitial fibrosis in renal biopsy images as a consistent basis of comparison among pathologists. The system extracts and segments the renal tissue structures based on colour information and structural assumptions of the tissue structures, identifying them through knowledge-based rules employing colour space transformations and structural feature extraction from the images. In particular, renal glomerulus identification is based on a multiscale textural feature analysis and a support vector machine. The regions in the biopsy representing interstitial fibrosis are deduced through the elimination of non-interstitial-fibrosis structures from the biopsy area and quantified as a percentage of the total area of the biopsy sample. 
The experiments conducted evaluate the system in terms of quantification accuracy, intra- and inter-observer variability in visual quantification by pathologists, and the effect introduced by the automated quantification system on the pathologists' diagnoses. A 40-image ground-truth dataset was manually prepared in consultation with an experienced pathologist for the validation of the segmentation algorithms. The results from experiments involving experienced pathologists demonstrated an average error of 9 percentage points between the automated system's quantification and the pathologists' visual evaluation. Experiments investigating the variability in pathologists, involving samples from 70 kidney patients, also showed the automated quantification error rate to be on par with the average intra-observer variability in pathologists' quantification. The accuracy of the proposed quantification system has been validated with the ground-truth dataset and compared against the pathologists' quantification results. It has been shown that the correlation between different pathologists' estimates of interstitial fibrosis area improved significantly, demonstrating the effectiveness of the quantification system as a diagnostic aid. Copyright © 2017 Elsevier B.V. All rights reserved.
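The subtractive quantification step reduces to mask arithmetic: remove every segmented non-fibrosis structure from the biopsy area and report what remains as a percentage. A toy boolean-mask version (the masks here are synthetic placeholders for the paper's segmentation outputs):

```python
import numpy as np

def fibrosis_percentage(biopsy_mask, exclusion_masks):
    """Interstitial fibrosis as a percentage of biopsy area after
    eliminating non-fibrosis structures (subtractive sketch)."""
    remaining = biopsy_mask.copy()
    for m in exclusion_masks:
        remaining &= ~m  # remove glomeruli, tubules, vessels, etc.
    return 100.0 * remaining.sum() / biopsy_mask.sum()
```

Overlapping exclusion masks are handled naturally, since a pixel removed twice is still removed once.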

  7. Rapid Development and Validation of Improved Reversed-Phase High-performance Liquid Chromatography Method for the Quantification of Mangiferin, a Polyphenol Xanthone Glycoside in Mangifera indica

    PubMed Central

    Naveen, P.; Lingaraju, H. B.; Prasad, K. Shyam

    2017-01-01

    Mangiferin, a polyphenolic xanthone glycoside from Mangifera indica, is used as a traditional medicine for the treatment of numerous diseases. The present study aimed to develop and validate a reversed-phase high-performance liquid chromatography (RP-HPLC) method for the quantification of mangiferin from the bark extract of M. indica. RP-HPLC analysis was performed by isocratic elution with a low-pressure gradient using 0.1% formic acid:acetonitrile (87:13) as the mobile phase at a flow rate of 1.5 ml/min. The separation was done at 26°C using a Kinetex XB-C18 column as the stationary phase, with detection at 256 nm. The proposed method was validated for linearity, precision, accuracy, limit of detection, limit of quantification, and robustness according to the International Conference on Harmonisation (ICH) guidelines. The correlation coefficient of more than 0.999 indicated good curve fitting and good linearity. The intra- and inter-day precision showed < 1% relative standard deviation of peak area, indicating high reliability and reproducibility of the method. The recovery values at three spiked levels (50%, 100%, and 150%) were 100.47%, 100.89%, and 100.99%, respectively, and the low standard deviation (< 1%) shows the high accuracy of the method. The results remained unaffected by small variations in the analytical parameters, demonstrating the robustness of the method. Liquid chromatography–mass spectrometry analysis confirmed the presence of mangiferin with an M/Z value of 421. The developed HPLC assay is simple, rapid, and reliable for the determination of mangiferin from M. indica. SUMMARY The present study was intended to develop and validate an RP-HPLC method for the quantification of mangiferin from the bark extract of M. indica. 
The developed method was validated for linearity, precision, accuracy, limit of detection, limit of quantification, and robustness according to the International Conference on Harmonisation guidelines. This study showed that the developed HPLC assay is simple, rapid, and reliable for the quantification of mangiferin from M. indica. Abbreviations Used: M. indica: Mangifera indica, RP-HPLC: Reversed-phase high-performance liquid chromatography, M/Z: Mass-to-charge ratio, ICH: International Conference on Harmonisation, % RSD: Percentage of relative standard deviation, ppm: Parts per million, LOD: Limit of detection, LOQ: Limit of quantification. PMID:28539748
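    The linearity and recovery criteria described in this record reduce to simple calculations. The sketch below uses hypothetical calibration points (not the paper's data) to show how the correlation coefficient and spike recovery are typically computed:

    ```python
    from math import sqrt

    def pearson_r(x, y):
        """Correlation coefficient of the calibration curve (linearity check)."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sxx = sum((a - mx) ** 2 for a in x)
        syy = sum((b - my) ** 2 for b in y)
        return sxy / sqrt(sxx * syy)

    def recovery_percent(found, added):
        """Spike recovery at a given level: found / added * 100."""
        return found / added * 100.0

    # Hypothetical mangiferin calibration points (ppm vs. peak area)
    conc = [5.0, 10.0, 20.0, 40.0, 80.0]
    area = [120.5, 242.0, 481.1, 960.9, 1923.8]
    r = pearson_r(conc, area)           # acceptance criterion: r > 0.999
    rec = recovery_percent(50.2, 50.0)  # e.g. a 100%-level spike
    ```

    A recovery near 100% with low scatter across the three spiked levels is what supports the accuracy claim above.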

  8. Rapid Development and Validation of Improved Reversed-Phase High-performance Liquid Chromatography Method for the Quantification of Mangiferin, a Polyphenol Xanthone Glycoside in Mangifera indica.

    PubMed

    Naveen, P; Lingaraju, H B; Prasad, K Shyam

    2017-01-01

    Mangiferin, a polyphenolic xanthone glycoside from Mangifera indica, is used as a traditional medicine for the treatment of numerous diseases. The present study aimed to develop and validate a reversed-phase high-performance liquid chromatography (RP-HPLC) method for the quantification of mangiferin from the bark extract of M. indica. RP-HPLC analysis was performed by isocratic elution with a low-pressure gradient using 0.1% formic acid:acetonitrile (87:13) as the mobile phase at a flow rate of 1.5 ml/min. The separation was done at 26°C using a Kinetex XB-C18 column as the stationary phase, with detection at 256 nm. The proposed method was validated for linearity, precision, accuracy, limit of detection, limit of quantification, and robustness according to the International Conference on Harmonisation (ICH) guidelines. The correlation coefficient of more than 0.999 indicated good curve fitting and good linearity. The intra- and inter-day precision showed < 1% relative standard deviation of peak area, indicating high reliability and reproducibility of the method. The recovery values at three spiked levels (50%, 100%, and 150%) were 100.47%, 100.89%, and 100.99%, respectively, and the low standard deviation (< 1%) shows the high accuracy of the method. The results remained unaffected by small variations in the analytical parameters, demonstrating the robustness of the method. Liquid chromatography-mass spectrometry analysis confirmed the presence of mangiferin with an M/Z value of 421. The developed HPLC assay is simple, rapid, and reliable for the determination of mangiferin from M. indica. The present study was intended to develop and validate an RP-HPLC method for the quantification of mangiferin from the bark extract of M. indica. 
The developed method was validated for linearity, precision, accuracy, limit of detection, limit of quantification, and robustness according to the International Conference on Harmonisation guidelines. This study showed that the developed HPLC assay is simple, rapid, and reliable for the quantification of mangiferin from M. indica. Abbreviations Used: M. indica: Mangifera indica, RP-HPLC: Reversed-phase high-performance liquid chromatography, M/Z: Mass-to-charge ratio, ICH: International Conference on Harmonisation, % RSD: Percentage of relative standard deviation, ppm: Parts per million, LOD: Limit of detection, LOQ: Limit of quantification.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haudebourg, Raphael; Fichet, Pascal; Goutelard, Florence

    The detection (location and quantification) of possible contamination with low-range particle emitters (³H, other low-energy β emitters, α emitters) in nuclear facilities to be dismantled remains a tedious and expensive task. Usual remote counters are not sensitive enough to these non-penetrating radiations, while conventional wipe tests are irrelevant for evaluating fixed radioactivity. The only way to measure activity levels accurately consists of sampling and running advanced laboratory analyses (spectroscopy, liquid scintillation counting, pyrolysis...). Such measurements generally involve sample preparation, waste production (destructive analyses, solvents), nuclear material transportation, long durations, and significant labor mobilization. The effort to limit their number and cost therefore easily conflicts with the need to perform dense screening for sampling (to maximize the representativeness of the samples) in installations of thousands of square meters (floors, wells, ceilings), plus furniture, pipes, and other wastes. To overcome this contradiction, digital autoradiography (DA) was re-routed from biomolecular research to the radiological mapping of nuclear installations under dismantling and to waste and sample analysis. After in situ exposure to the possibly contaminated areas under investigation, commercial reusable radiosensitive phosphor screens (of a few hundred cm²) were scanned in the appropriate laboratory device, and sharp quantitative images of the radioactivity were obtained. The implementation of geostatistical tools in the data-processing software enabled the exhaustive characterization of concrete floors at a rate of 2 weeks per 100 m², at minimal cost. Various samples, such as drilled cores or tank and wood pieces, were also successfully evaluated with this method, with decisive results. 
Thanks to the accurate location of potential contamination spots, this approach ensures relevant and representative sampling for further laboratory analyses and should be included among the common tools used in dismantling. (authors)

  10. Robust Online Monitoring for Calibration Assessment of Transmitters and Instrumentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramuhalli, Pradeep; Coble, Jamie B.; Shumaker, Brent

    Robust online monitoring (OLM) technologies are expected to enable the extension or elimination of periodic sensor calibration intervals in operating and new reactors. These advances in OLM technologies will improve the safety and reliability of current and planned nuclear power systems through improved accuracy and increased reliability of sensors used to monitor key parameters. In this article, we provide an overview of research being performed within the Nuclear Energy Enabling Technologies (NEET)/Advanced Sensors and Instrumentation (ASI) program on the development of OLM algorithms that use sensor outputs, in combination with other available information, to 1) determine whether one or more sensors are out of calibration or failing and 2) replace a failing sensor with reliable, accurate sensor outputs. Algorithm development is focused on the following OLM functions: • Signal validation • Virtual sensing • Sensor response-time assessment These algorithms incorporate, at their base, a Gaussian Process-based uncertainty quantification (UQ) method. Various plant models (using kernel regression, GP, or hierarchical models) may be used to predict sensor responses under various plant conditions. These predicted responses can then be applied in fault detection (sensor output and response time) and in computing the correct value (virtual sensing) of a failing physical sensor. The methods being evaluated in this work can compute confidence levels along with the predicted sensor responses, and as a result, may have the potential for compensating for sensor drift in real time (online recalibration). Evaluation was conducted using data from multiple sources (laboratory flow loops and plant data). 
Ongoing research in this project is focused on further evaluation of the algorithms, optimization for accuracy and computational efficiency, and integration into a suite of tools for robust OLM that are applicable to monitoring sensor calibration state in nuclear power plants.
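    The fault-detection idea in this record — predict a sensor's response from a plant model, then flag readings whose residual falls outside a confidence band — can be illustrated with one of the plant-model families mentioned above, kernel regression. This is only a sketch of the principle, not the program's actual algorithms; all names and data are hypothetical:

    ```python
    import math

    def kernel_predict(x_train, y_train, x, h=1.0):
        """Nadaraya-Watson kernel regression with a Gaussian kernel."""
        w = [math.exp(-0.5 * ((x - xi) / h) ** 2) for xi in x_train]
        return sum(wi * yi for wi, yi in zip(w, y_train)) / sum(w)

    def drift_flags(x_train, y_train, x_new, y_new, h=1.0, k=3.0):
        """Flag new readings whose residual against the plant model exceeds
        k times the residual spread observed during calibration."""
        resid = [y - kernel_predict(x_train, y_train, x, h)
                 for x, y in zip(x_train, y_train)]
        sigma = (sum(r * r for r in resid) / len(resid)) ** 0.5
        return [abs(y - kernel_predict(x_train, y_train, x, h)) > k * sigma
                for x, y in zip(x_new, y_new)]

    # Hypothetical calibration data: sensor output vs. a correlated process variable
    x_cal = [float(i) for i in range(10)]
    y_cal = [2.0 * x for x in x_cal]
    # Two new readings at the same operating point: one healthy, one drifted
    flags = drift_flags(x_cal, y_cal, [5.0, 5.0], [10.0, 15.0])
    ```

    A Gaussian-process model, as used in the program, would additionally supply the predictive variance directly, giving a principled confidence band instead of the empirical residual spread used here.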

  11. ADVANCED SEISMIC BASE ISOLATION METHODS FOR MODULAR REACTORS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    E. Blanford; E. Keldrauk; M. Laufer

    2010-09-20

    Advanced technologies for structural design and construction have the potential for major impact not only on nuclear power plant construction time and cost, but also on the design process and on the safety, security, and reliability of the next generation of nuclear power plants. In future Generation IV (Gen IV) reactors, structural and seismic design should be much more closely integrated with the design of nuclear and industrial safety systems, physical security systems, and international safeguards systems. Overall reliability will be increased through the use of replaceable and modular equipment, and through design to facilitate on-line monitoring, in-service inspection, maintenance, replacement, and decommissioning. Economics will also receive high design priority, through integrated engineering efforts to optimize building arrangements to minimize building heights and footprints. Finally, the licensing approach will be transformed by becoming increasingly performance-based and technology-neutral, using best-estimate simulation methods with uncertainty and margin quantification. In this context, two structural engineering technologies, seismic base isolation and modular steel-plate/concrete composite structural walls, are investigated. These technologies have major potential to (1) enable standardized reactor designs to be deployed across a wider range of sites, (2) reduce the impact of uncertainties related to site-specific seismic conditions, and (3) alleviate reactor equipment qualification requirements. For Gen IV reactors, the potential for deliberate crashes of large aircraft must also be considered in design. This report concludes that base-isolated structures should be decoupled from the reactor external event exclusion system. As an example, a scoping analysis is performed for a rectangular, decoupled external event shell designed as a grillage. 
This report also reviews modular construction technology, particularly steel-plate/concrete construction using factory-prefabricated structural modules, for application to external event shells and base-isolated structures.

  12. LaBr3 γ-ray spectrometer for detecting 10B in debris of melted nuclear fuel

    NASA Astrophysics Data System (ADS)

    Koizumi, Mitsuo; Tsuchiya, Harufumi; Kitatani, Fumito; Harada, Hideo; Heyse, Jan; Kopecky, Stefan; Mondelaers, Willy; Paradela, Carlos; Schillebeeckx, Peter

    2016-11-01

    Neutron resonance densitometry has been proposed as a nondestructive analytical method for quantifying special nuclear material (SNM) in the rock- and particle-like debris that is to be removed from the Fukushima Daiichi nuclear power plant. The method is based on neutron resonance transmission analysis (NRTA) and neutron resonance capture analysis combined with prompt-γ-ray analysis (NRCA/PGA). Although quantification of SNM will predominantly rely on NRTA, this will be hampered by the presence of strong neutron-absorbing matrix materials, in particular 10B. Results obtained with NRCA/PGA are used to improve the interpretation of NRTA data. Prompt γ rays originating from the 10B(n, αγ) reaction are used to assess the amount of 10B. The 478 keV γ rays from 10B, however, need to be measured under a high-radiation environment, especially because of 137Cs. To meet this requirement, we developed a well-shaped γ-ray spectrometer consisting of one cylindrical and four rectangular-cuboid LaBr3 scintillators combined with a fast data-acquisition system. Furthermore, to improve the gain stability of the main detector, a special high-voltage divider was developed. Because of the reduction in gain shift, a 3.8% resolution at 662 keV was obtained for long-term measurements. By using the data-acquisition system, which consists of eight 250 MHz digitizers, input signals of over 500 kHz per channel were recorded. The work reported herein demonstrates that, with such a spectrometer, the impact of the Compton edge of 662 keV γ rays from 137Cs is significantly reduced, which allows the 10B amount to be determined with greater sensitivity.

  13. Itô-SDE MCMC method for Bayesian characterization of errors associated with data limitations in stochastic expansion methods for uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Arnst, M.; Abello Álvarez, B.; Ponthot, J.-P.; Boman, R.

    2017-11-01

    This paper is concerned with the characterization and the propagation of errors associated with data limitations in polynomial-chaos-based stochastic methods for uncertainty quantification. Such an issue can arise in uncertainty quantification when only a limited amount of data is available. When the available information does not suffice to accurately determine the probability distributions that must be assigned to the uncertain variables, the Bayesian method for assigning these probability distributions becomes attractive because it allows the stochastic model to account explicitly for insufficiency of the available information. In previous work, such applications of the Bayesian method had already been implemented by using the Metropolis-Hastings and Gibbs Markov Chain Monte Carlo (MCMC) methods. In this paper, we present an alternative implementation, which uses an alternative MCMC method built around an Itô stochastic differential equation (SDE) that is ergodic for the Bayesian posterior. We draw together from the mathematics literature a number of formal properties of this Itô SDE that lend support to its use in the implementation of the Bayesian method, and we describe its discretization, including the choice of the free parameters, by using the implicit Euler method. We demonstrate the proposed methodology on a problem of uncertainty quantification in a complex nonlinear engineering application relevant to metal forming.
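    The construction described in this record can be illustrated with the simplest member of the family it builds on: an Euler-Maruyama discretization of the overdamped Langevin SDE dX = ½ ∇log π(X) dt + dW, which is ergodic for the posterior π. The paper itself uses an implicit Euler scheme for a more general Itô SDE; the explicit scheme below is only a sketch of the principle, shown on a toy standard-normal posterior:

    ```python
    import math
    import random

    def langevin_sampler(grad_log_post, x0, dt=0.05, n_steps=50000, seed=0):
        """Explicit Euler-Maruyama discretization of
        dX = 0.5 * grad(log pi)(X) dt + dW.
        For small dt, the chain's stationary law approximates the posterior pi."""
        rng = random.Random(seed)
        x = x0
        out = []
        for _ in range(n_steps):
            x = x + 0.5 * grad_log_post(x) * dt + math.sqrt(dt) * rng.gauss(0.0, 1.0)
            out.append(x)
        return out

    # Toy posterior: standard normal, so grad log pi(x) = -x
    chain = langevin_sampler(lambda x: -x, x0=3.0)
    tail = chain[10000:]                      # discard burn-in
    post_mean = sum(tail) / len(tail)         # should be near 0
    post_var = sum((s - post_mean) ** 2 for s in tail) / len(tail)  # near 1
    ```

    The implicit Euler discretization favored in the paper trades this scheme's simplicity for better stability, which matters when the posterior's gradient is stiff.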

  14. Single photon emission computed tomography and other selected computer topics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frey, G.D.

    1981-07-01

    This book, the proceedings of a meeting in January 1980, contains 21 papers. Thirteen are devoted to aspects of emission tomography, four to nuclear cardiology, and five to other topics. The initial set of papers consists of reviews of the single photon emission tomography process. These include transverse axial tomography using scintillation cameras and other devices, longitudinal section tomography, and pin-hole and slant-hole systems. These reviews are generally well done but, as might be expected, lack any coherence from paper to paper. The papers on nuclear cardiology include several on Fourier analysis in nuclear cardiology and one on shunt quantification. Other clinical papers are on quantifying Tc-99m glucoheptonate uptake in the brain and on iron-59 retention studies. A general criticism of the book is the poor quality of the photographic reproductions.

  15. Quantification of fungicides in snow-melt runoff from turf: A comparison of four extraction methods

    USDA-ARS?s Scientific Manuscript database

    A variety of pesticides are used to control diverse stressors to turf. These pesticides have a wide range in physical and chemical properties. The objective of this project was to develop an extraction and analysis method for quantification of chlorothalonil and PCNB (pentachloronitrobenzene), two p...

  16. THE QUANTIFICATION OF AQUEOUS TRACERS IN LABORATORY AQUIFER MODELS USING A LIGHT TRANSMISSION VISUALIZATION METHOD - 3

    EPA Science Inventory

    The quantification of solute concentrations in laboratory aquifer models has been largely limited to the use of sampling ports, from which samples are collected for external analysis. One of the drawbacks to this method is that the act of sampling may disturb plume dynamics and ...

  17. THE QUANTIFICATION OF AQUEOUS TRACERS IN LABORATORY AQUIFER MODELS USING A LIGHT TRANSMISSION VISUALIZATION METHOD - 2

    EPA Science Inventory

    The quantification of solute concentrations in laboratory aquifer models has been largely limited to the use of sampling ports, from which samples are collected for external analysis. One of the drawbacks to this method is that the act of sampling may disturb plume dynamics and ...

  18. THE QUANTIFICATION OF AQUEOUS TRACERS IN LABORATORY AQUIFER MODELS USING A LIGHT TRANSMISSION VISUALIZATION METHOD - 1

    EPA Science Inventory

    The quantification of solute concentrations in laboratory aquifer models has been largely limited to the use of sampling ports, from which samples are collected for external analysis. One of the drawbacks to this method is that the act of sampling may disturb plume dynamics and ...

  19. Miniature fiber optic spectrometer-based quantitative fluorescence resonance energy transfer measurement in single living cells.

    PubMed

    Chai, Liuying; Zhang, Jianwei; Zhang, Lili; Chen, Tongsheng

    2015-03-01

    Spectral measurement of fluorescence resonance energy transfer (FRET), spFRET, is a widely used FRET quantification method in living cells today. We set up a spectrometer-microscope platform that consists of a miniature fiber optic spectrometer and a widefield fluorescence microscope for the spectral measurement of absolute FRET efficiency (E) and acceptor-to-donor concentration ratio (R(C)) in single living cells. The microscope was used for guiding cells and the spectra were simultaneously detected by the miniature fiber optic spectrometer. Moreover, our platform has independent excitation and emission controllers, so different excitations can share the same emission channel. In addition, we developed a modified spectral FRET quantification method (mlux-FRET) for the multiple donors and multiple acceptors FRET construct (mD∼nA) sample, and we also developed a spectra-based 2-channel acceptor-sensitized FRET quantification method (spE-FRET). We implemented these modified FRET quantification methods on our platform to measure the absolute E and R(C) values of tandem constructs with different acceptor/donor stoichiometries in single living Huh-7 cells.

  20. Modeling and simulation challenges pursued by the Consortium for Advanced Simulation of Light Water Reactors (CASL)

    NASA Astrophysics Data System (ADS)

    Turinsky, Paul J.; Kothe, Douglas B.

    2016-05-01

    The Consortium for the Advanced Simulation of Light Water Reactors (CASL), the first Energy Innovation Hub of the Department of Energy, was established in 2010 with the goal of providing modeling and simulation (M&S) capabilities that support and accelerate the improvement of nuclear energy's economic competitiveness and the reduction of spent nuclear fuel volume per unit energy, all while assuring nuclear safety. Accomplishing this requires advances in M&S capabilities in radiation transport, thermal-hydraulics, fuel performance, and corrosion chemistry. To focus CASL's R&D, industry challenge problems have been defined, which correspond to long-standing issues of the nuclear power industry that M&S can help address. To date, CASL has developed a multi-physics "core simulator" based upon pin-resolved radiation transport and subchannel (within fuel assembly) thermal-hydraulics, capitalizing on the capabilities of high-performance computing. CASL's fuel performance M&S capability can also be optionally integrated into the core simulator, yielding a coupled multi-physics capability with untapped predictive potential. Material models have been developed to enhance predictive capabilities of fuel clad creep and growth, along with a deeper understanding of zirconium alloy clad oxidation and hydrogen pickup. Understanding of corrosion chemistry (e.g., CRUD formation) has evolved at all scales: micro, meso, and macro. CFD R&D has focused on improving closure models for subcooled boiling and bubbly flow, and on the formulation of robust numerical solution algorithms. For multiphysics integration, several iterative acceleration methods have been assessed, illuminating areas where further research is needed. Finally, uncertainty quantification and data assimilation techniques, based upon sampling approaches, have been made more feasible for practicing nuclear engineers via R&D on dimensional reduction and biased sampling. 
Industry adoption of CASL's evolving M&S capabilities, which is in progress, will assist in addressing long-standing and future operational and safety challenges of the nuclear industry.

  1. A novel liquid chromatography-tandem mass spectrometry method for determination of menadione in human plasma after derivatization with 3-mercaptopropionic acid.

    PubMed

    Liu, Ruijuan; Wang, Mengmeng; Ding, Li

    2014-10-01

    Menadione (VK3), an essential fat-soluble naphthoquinone, plays very important physiological and pathological roles, but its detection and quantification are challenging. Herein, a new method was developed for the quantification of VK3 in human plasma by liquid chromatography-tandem mass spectrometry (LC-MS/MS) after derivatization with 3-mercaptopropionic acid via a Michael addition reaction. The derivative was identified by its mass spectra, and the derivatization conditions were optimized by considering different parameters. The method demonstrated high sensitivity, with a low limit of quantification of 0.03 ng mL(-1) for VK3, about 33-fold better than that for direct analysis of the underivatized compound. The method also had good precision and reproducibility. It was applied to the determination of basal VK3 in human plasma and a clinical pharmacokinetic study of menadiol sodium diphosphate. This is the first reported LC-MS/MS method for the quantification of VK3, and it provides an important strategy for further research on VK3 and menadione analogs. Copyright © 2014 Elsevier B.V. All rights reserved.
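    Quantification limits like the one reported in this record are commonly estimated from the calibration curve using the ICH-style relations LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the standard deviation of the response and S the slope. The numbers below are hypothetical (chosen only to land near the reported LOQ), not taken from this paper:

    ```python
    def detection_limits(slope, sigma_response):
        """ICH-style calibration-curve estimates:
        LOD = 3.3 * sigma / S, LOQ = 10 * sigma / S."""
        lod = 3.3 * sigma_response / slope
        loq = 10.0 * sigma_response / slope
        return lod, loq

    # Hypothetical calibration: slope of 1000 area units per (ng/mL),
    # response standard deviation of 3 area units
    lod, loq = detection_limits(slope=1000.0, sigma_response=3.0)
    # lod ≈ 0.0099 ng/mL, loq ≈ 0.03 ng/mL
    ```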

  2. HPLC Quantification of astaxanthin and canthaxanthin in Salmonidae eggs.

    PubMed

    Tzanova, Milena; Argirova, Mariana; Atanasov, Vasil

    2017-04-01

    Astaxanthin and canthaxanthin are naturally occurring antioxidants referred to as xanthophylls. They are used as food additives in fish farms to improve the organoleptic qualities of salmonid products and to prevent reproductive diseases. This study reports the development and single-laboratory validation of a rapid method for quantification of astaxanthin and canthaxanthin in eggs of rainbow trout (Oncorhynchus mykiss) and brook trout (Salvelinus fontinalis M.). An advantage of the proposed method is the combination of selective extraction of the xanthophylls and analysis of the extract by high-performance liquid chromatography with photodiode array detection. The method validation was carried out in terms of linearity, accuracy, precision, recovery, and limits of detection and quantification. The method was applied for simultaneous quantification of the two xanthophylls in eggs of rainbow trout and brook trout after their selective extraction. The results show that astaxanthin accumulations in salmonid fish eggs are larger than those of canthaxanthin. As the levels of these two xanthophylls affect fish fertility, this method can be used to improve nutritional quality and to minimize the occurrence of the M74 syndrome in fish populations. Copyright © 2016 John Wiley & Sons, Ltd.

  3. Utility of magnetic resonance imaging and nuclear magnetic resonance-based metabolomics for quantification of inflammatory lung injury

    PubMed Central

    Serkova, Natalie J.; Van Rheen, Zachary; Tobias, Meghan; Pitzer, Joshua E.; Wilkinson, J. Erby; Stringer, Kathleen A.

    2008-01-01

    Magnetic resonance imaging (MRI) and metabolic nuclear magnetic resonance (NMR) spectroscopy are clinically available but have had little application in the quantification of experimental lung injury. There is a growing and unfulfilled need for predictive animal models that can improve our understanding of disease pathogenesis and therapeutic intervention. Integration of MRI and NMR could extend the application of experimental data into the clinical setting. This study investigated the ability of MRI and metabolic NMR to detect and quantify inflammation-mediated lung injury. Pulmonary inflammation was induced in male B6C3F1 mice by intratracheal administration of IL-1β and TNF-α under isoflurane anesthesia. Mice underwent MRI at 2, 4, 6, and 24 h after dosing. At 6 and 24 h lungs were harvested for metabolic NMR analysis. Data acquired from IL-1β+TNF-α-treated animals were compared with saline-treated control mice. The hyperintense-to-total lung volume (HTLV) ratio derived from MRI was higher in IL-1β+TNF-α-treated mice compared with control at 2, 4, and 6 h but returned to control levels by 24 h. The ability of MRI to detect pulmonary inflammation was confirmed by the association between HTLV ratio and histological and pathological end points. Principal component analysis of NMR-detectable metabolites also showed a temporal pattern for which energy metabolism-based biomarkers were identified. These data demonstrate that both MRI and metabolic NMR have utility in the detection and quantification of inflammation-mediated lung injury. Integration of these clinically available techniques into experimental models of lung injury could improve the translation of basic science knowledge and information to the clinic. PMID:18441091

  4. Critical points of DNA quantification by real-time PCR – effects of DNA extraction method and sample matrix on quantification of genetically modified organisms

    PubMed Central

    Cankar, Katarina; Štebih, Dejan; Dreo, Tanja; Žel, Jana; Gruden, Kristina

    2006-01-01

    Background Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs) quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve, therefore similarity of PCR efficiency for the sample and standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Results Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess variability of PCR performance within each sample matrix. 
Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results; it was therefore chosen as the primary criterion by which to evaluate the quality and performance of different matrixes and extraction techniques. The effect of PCR efficiency on the resulting GMO content is demonstrated. Conclusion The crucial influence of the extraction technique and sample matrix properties on the results of GMO quantification is demonstrated. Appropriate extraction techniques for each matrix need to be determined to achieve accurate DNA quantification. Nevertheless, since it is shown that in the area of food and feed testing it is impossible to define a matrix with fixed specificities, strict quality controls need to be introduced to monitor PCR. The results of our study are also applicable to other fields of quantitative testing by real-time PCR. PMID:16907967
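    The PCR-efficiency criterion at the heart of this study comes from the standard curve: a slope of −3.32 Cq units per 10-fold dilution corresponds to perfect doubling each cycle (efficiency E = 1). A self-contained sketch with made-up Cq values (not data from this study) shows the standard computation:

    ```python
    def fit_standard_curve(log10_copies, cq):
        """Least-squares fit of Cq = slope * log10(copies) + intercept."""
        n = len(cq)
        mx = sum(log10_copies) / n
        my = sum(cq) / n
        slope = (sum((x - mx) * (y - my) for x, y in zip(log10_copies, cq))
                 / sum((x - mx) ** 2 for x in log10_copies))
        return slope, my - slope * mx

    def amplification_efficiency(slope):
        """E = 10**(-1/slope) - 1; E == 1.0 means perfect doubling each cycle."""
        return 10.0 ** (-1.0 / slope) - 1.0

    def copies_from_cq(cq, slope, intercept):
        """Invert the standard curve to estimate target copies in an unknown."""
        return 10.0 ** ((cq - intercept) / slope)

    # Made-up 10-fold dilution series with the ideal slope of -3.32
    logs = [5.0, 4.0, 3.0, 2.0, 1.0]
    cqs = [18.0, 21.32, 24.64, 27.96, 31.28]
    slope, intercept = fit_standard_curve(logs, cqs)
    eff = amplification_efficiency(slope)  # ~1.0 for this ideal series
    # GMO content as transgene copies relative to taxon-specific copies
    gmo_pct = (100.0 * copies_from_cq(28.5, slope, intercept)
               / copies_from_cq(24.0, slope, intercept))
    ```

    The study's point is that this quantification is only exact when sample and reference material amplify with matching efficiency, which is why efficiency was used to score extraction techniques and matrixes.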

  5. A multi-center study benchmarks software tools for label-free proteome quantification

    PubMed Central

    Gillet, Ludovic C; Bernhardt, Oliver M.; MacLean, Brendan; Röst, Hannes L.; Tate, Stephen A.; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I.; Aebersold, Ruedi; Tenzer, Stefan

    2016-01-01

    The consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from SWATH-MS (sequential window acquisition of all theoretical fragment ion spectra), a method that uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test datasets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation windows setups. For consistent evaluation we developed LFQbench, an R-package to calculate metrics of precision and accuracy in label-free quantitative MS, and report the identification performance, robustness and specificity of each software tool. Our reference datasets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics. PMID:27701404
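    The benchmark's core scoring idea — comparing observed protein log-ratios of the hybrid-proteome species against the known mixing ratios — can be sketched generically. The function and data below are illustrative only, not LFQbench's actual API:

    ```python
    def ratio_metrics(expected_log2, observed_log2):
        """Accuracy = median deviation from the expected log2 ratio;
        precision = standard deviation of the observed log2 ratios."""
        n = len(observed_log2)
        srt = sorted(observed_log2)
        median = srt[n // 2] if n % 2 else 0.5 * (srt[n // 2 - 1] + srt[n // 2])
        mean = sum(observed_log2) / n
        sd = (sum((r - mean) ** 2 for r in observed_log2) / (n - 1)) ** 0.5
        return median - expected_log2, sd

    # Hypothetical protein log2 ratios for a species spiked at 2:1
    # between the two hybrid samples (expected log2 ratio = 1.0)
    bias, spread = ratio_metrics(1.0, [0.9, 1.0, 1.1, 1.05, 0.95])
    ```

    A software tool scores well when the bias is near zero (accuracy) and the spread is small (precision) for every species in the defined mixture.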

  6. Development, optimization, and single laboratory validation of an event-specific real-time PCR method for the detection and quantification of Golden Rice 2 using a novel taxon-specific assay.

    PubMed

    Jacchia, Sara; Nardini, Elena; Savini, Christian; Petrillo, Mauro; Angers-Loustau, Alexandre; Shim, Jung-Hyun; Trijatmiko, Kurniawan; Kreysa, Joachim; Mazzara, Marco

    2015-02-18

    In this study, we developed, optimized, and in-house validated a real-time PCR method for the event-specific detection and quantification of Golden Rice 2, a genetically modified rice with provitamin A in the grain. We optimized and evaluated the performance of the taxon (targeting rice Phospholipase D α2 gene)- and event (targeting the 3' insert-to-plant DNA junction)-specific assays that compose the method as independent modules, using haploid genome equivalents as unit of measurement. We verified the specificity of the two real-time PCR assays and determined their dynamic range, limit of quantification, limit of detection, and robustness. We also confirmed that the taxon-specific DNA sequence is present in single copy in the rice genome and verified its stability of amplification across 132 rice varieties. A relative quantification experiment evidenced the correct performance of the two assays when used in combination.

  7. Accurate proteome-wide protein quantification from high-resolution 15N mass spectra

    PubMed Central

    2011-01-01

    In quantitative mass spectrometry-based proteomics, the metabolic incorporation of a single source of 15N-labeled nitrogen has many advantages over using stable isotope-labeled amino acids. However, the lack of a robust computational framework for analyzing the resulting spectra has impeded wide use of this approach. We have addressed this challenge by introducing a new computational methodology for analyzing 15N spectra in which quantification is integrated with identification. Application of this method to an Escherichia coli growth transition reveals significant improvement in quantification accuracy over previous methods. PMID:22182234

  8. [Evaluation study on a new method of dietary assessment with instant photography applied in urban pregnant women in Nanjing city].

    PubMed

    Jiang, Tingting; Dai, Yongmei; Miao, Miao; Zhang, Yue; Song, Chenglin; Wang, Zhixu

    2015-07-01

To evaluate the usefulness and efficiency of a novel dietary assessment method among urban pregnant women, sixty-one pregnant women were recruited from the ward and provided with a meal accurately weighed before cooking. The meal was photographed from three different angles before and after eating. The subjects were also interviewed by the investigators for 24 h dietary recall. Food weighing, image quantification and 24 h dietary recall were conducted by investigators from three different groups, blinded to each other's results. Food consumption was analyzed on the basis of classification and total summation, and nutrient intake from the meal was calculated for each subject. The data obtained from the dietary recall and the image quantification were compared with the actual values, and correlation and regression analyses were carried out between the weighing method and both image quantification and dietary recall. Twenty-three kinds of food, including rice, vegetables, fish, meats and soy bean curd, were included in the experimental meal. Compared with data from 24 h dietary recall (r = 0.413, P < 0.05), food weights estimated by image quantification (r = 0.778, P < 0.05, n = 308) correlated better with the weighed data and showed a more concentrated linear distribution. The mean absolute difference between image quantification and the weighing method across all foods was 77.23 ± 56.02 (P < 0.05, n = 61), much smaller than the difference between 24 h recall and the weighing method (172.77 ± 115.18). Values of almost all nutrients, including energy, protein, fat, carbohydrate, vitamin A, vitamin C, calcium, iron and zinc, calculated from the image-quantification food weights were closer to the weighed data than those from 24 h dietary recall (P < 0.01). Bland-Altman analysis showed that the majority of the measurements of nutrient intake were scattered along the mean difference line and close to the line of equality (difference = 0). The plots show fairly good agreement between estimated and actual food consumption, indicating that the differences (including the outliers) were random, exhibited no systematic bias, and were consistent over different levels of mean food amount. In addition, fifty-six of the pregnant women considered image quantification less time-consuming and burdensome than 24 h recall, and fifty-eight would like to use image quantification to assess their dietary status. The novel instant-photography (image quantification) method for dietary assessment is thus more effective than conventional 24 h dietary recall and yields food intake values close to weighed data.
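The Bland-Altman agreement analysis used above amounts to computing the mean paired difference (bias) and its 95% limits of agreement. A minimal sketch; the food-weight values below are invented toy data, not the study's measurements:

```python
import statistics

def bland_altman(estimated, actual):
    """Return the bias (mean difference) and the 95% limits of
    agreement (bias +/- 1.96 SD of the paired differences)."""
    diffs = [e - a for e, a in zip(estimated, actual)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Toy data: photo-based estimates vs. weighed food amounts
est = [102, 95, 130, 78, 150]
act = [100, 90, 135, 80, 145]
bias, (low, high) = bland_altman(est, act)
```

Points falling inside the limits of agreement, scattered randomly around the bias line, are what the abstract describes as "fairly good agreement".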

  9. Curcuminoid content of Curcuma longa L. and Curcuma xanthorrhiza rhizome based on drying method with NMR and HPLC-UVD

    NASA Astrophysics Data System (ADS)

    Hadi, S.; Artanti, A. N.; Rinanto, Y.; Wahyuni, D. S. C.

    2018-04-01

Curcuminoid, consisting of curcumin, demethoxycurcumin and bisdemethoxycurcumin, is the major compound class in Curcuma longa L. and Curcuma xanthorrhiza rhizomes, and is known to have potent antioxidant, anticancer and antibacterial activities. The rhizomes need to be dried beforehand, which influences the concentration of the active compounds. The present work assessed the curcuminoid content of C. longa L. and C. xanthorrhiza as a function of drying method, using Nuclear Magnetic Resonance (NMR) and High Pressure Liquid Chromatography (HPLC)-UVD. Samples were collected and dried by freeze-drying and by oven drying, the latter being the common drying method in herbal medicine preparation procedures. All samples were extracted using 96% ethanol and analyzed using NMR and HPLC-UVD. By NMR, the curcuminoid content exhibited no significant difference between drying methods in C. xanthorrhiza and only a weakly significant difference in C. longa L. HPLC-UVD, a reliable analytical method for this quantification, was subsequently used to confirm the NMR data and showed no significant difference in either sample. This implies that the curcuminoid content of both samples is stable to the heating involved in drying. These results provide useful information for simplicia standardization in pharmaceutical products with regard to preparation procedure.

  10. Quantification of taurine in energy drinks using ¹H NMR.

    PubMed

    Hohmann, Monika; Felbinger, Christine; Christoph, Norbert; Wachter, Helmut; Wiest, Johannes; Holzgrabe, Ulrike

    2014-05-01

The consumption of so-called energy drinks is increasing, especially among adolescents. These beverages commonly contain considerable amounts of the amino sulfonic acid taurine, which is associated with a multitude of physiological effects. The customary method to control the legal limit of taurine in energy drinks is LC-UV/vis with postcolumn derivatization using ninhydrin. In this paper we describe the quantification of taurine in energy drinks by (1)H NMR as an alternative to existing methods of quantification. Variation of pH values revealed a distinct, well-separated taurine signal in (1)H NMR spectra, which was used for integration and quantification. Quantification was performed using external calibration (R(2)>0.9999; linearity verified by Mandel's fitting test with a 95% confidence level) and PULCON. Taurine concentrations in 20 different energy drinks were analyzed both by (1)H NMR and by LC-UV/vis. The deviation between (1)H NMR and LC-UV/vis results was always below the expanded measurement uncertainty of 12.2% for the LC-UV/vis method (95% confidence level) and was at worst 10.4%. Given the close agreement with the LC-UV/vis data and adequate recovery rates (ranging between 97.1% and 108.2%), (1)H NMR measurement presents a suitable method to quantify taurine in energy drinks. Copyright © 2013 Elsevier B.V. All rights reserved.
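External calibration as used here amounts to fitting a line of signal integral against standard concentration and inverting it for the sample. A minimal sketch with hypothetical taurine standards (the concentrations and integrals are invented, not the paper's data):

```python
import statistics

def fit_line(x, y):
    """Least-squares slope and intercept for an external calibration
    line (peak integral vs. analyte concentration)."""
    mx, my = statistics.mean(x), statistics.mean(y)
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

def quantify(integral, slope, intercept):
    """Back-calculate concentration from a measured peak integral."""
    return (integral - intercept) / slope

# Hypothetical taurine standards (g/L) and their 1H NMR peak integrals
conc = [0.5, 1.0, 2.0, 4.0]
area = [0.51, 1.02, 2.01, 4.03]
m, b = fit_line(conc, area)
c = quantify(2.01, m, b)  # sample concentration, ~2 g/L here
```

In practice the calibration would also be checked for linearity (e.g. Mandel's fitting test, as in the abstract) before inverting the line.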

  11. Introducing AAA-MS, a rapid and sensitive method for amino acid analysis using isotope dilution and high-resolution mass spectrometry.

    PubMed

    Louwagie, Mathilde; Kieffer-Jaquinod, Sylvie; Dupierris, Véronique; Couté, Yohann; Bruley, Christophe; Garin, Jérôme; Dupuis, Alain; Jaquinod, Michel; Brun, Virginie

    2012-07-06

    Accurate quantification of pure peptides and proteins is essential for biotechnology, clinical chemistry, proteomics, and systems biology. The reference method to quantify peptides and proteins is amino acid analysis (AAA). This consists of an acidic hydrolysis followed by chromatographic separation and spectrophotometric detection of amino acids. Although widely used, this method displays some limitations, in particular the need for large amounts of starting material. Driven by the need to quantify isotope-dilution standards used for absolute quantitative proteomics, particularly stable isotope-labeled (SIL) peptides and PSAQ proteins, we developed a new AAA assay (AAA-MS). This method requires neither derivatization nor chromatographic separation of amino acids. It is based on rapid microwave-assisted acidic hydrolysis followed by high-resolution mass spectrometry analysis of amino acids. Quantification is performed by comparing MS signals from labeled amino acids (SIL peptide- and PSAQ-derived) with those of unlabeled amino acids originating from co-hydrolyzed NIST standard reference materials. For both SIL peptides and PSAQ standards, AAA-MS quantification results were consistent with classical AAA measurements. Compared to AAA assay, AAA-MS was much faster and was 100-fold more sensitive for peptide and protein quantification. Finally, thanks to the development of a labeled protein standard, we also extended AAA-MS analysis to the quantification of unlabeled proteins.
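The core of the isotope-dilution calculation is a simple ratio of the unlabeled analyte's MS signal to that of a co-hydrolyzed labeled standard of known amount. A sketch with invented intensities (the amino acid, signal values, and units are illustrative assumptions):

```python
def isotope_dilution(intensity_sample, intensity_standard, standard_amount):
    """Amount of the unlabeled analyte inferred from the ratio of its
    MS signal to that of a co-analyzed labeled standard of known amount.
    Assumes equal ionization response for the two isotopologues."""
    return standard_amount * intensity_sample / intensity_standard

# Hypothetical: leucine signal from a hydrolyzed peptide vs. a labeled
# reference standard of 50 pmol
amount = isotope_dilution(intensity_sample=2.4e6,
                          intensity_standard=1.2e6,
                          standard_amount=50.0)  # pmol
```

The equal-response assumption is what makes co-hydrolysis with a certified labeled reference material attractive: both species experience the same hydrolysis and matrix effects.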

  12. Prediction of occult lymph node metastasis in squamous cell carcinoma of the oral cavity and the oropharynx using peritumoral Prospero homeobox protein 1 lymphatic nuclear quantification.

    PubMed

    Mermod, Maxime; Bongiovanni, Massimo; Petrova, Tatiana V; Dubikovskaya, Elena A; Simon, Christian; Tolstonog, Genrich; Monnier, Yan

    2016-09-01

The use of lymphatic vessel density as a predictor of occult lymph node metastasis (OLNM) in head and neck squamous cell carcinoma (HNSCC) has never been reported. Staining of the specific lymphatic endothelial cell nuclear marker PROX1, as an indicator of lymphatic vessel density, was assessed by counting the number of positive cells in squamous cell carcinomas (SCCs) of the oral cavity and the oropharynx with clinically negative necks, and correlation with histopathological data was established. Peritumoral PROX1 lymphatic nuclear count significantly correlated with the detection of OLNM in multivariate analysis (p < .005). The sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) of this parameter were 60%, 95%, 85%, and 90%, respectively. Peritumoral PROX1 lymphatic nuclear count in primary SCCs of the oral cavity and the oropharynx allows accurate prediction of occult lymph node metastasis. © 2016 Wiley Periodicals, Inc. Head Neck 38: 1407-1415, 2016.

  13. Whole-Body Computed Tomography-Based Body Mass and Body Fat Quantification: A Comparison to Hydrostatic Weighing and Air Displacement Plethysmography.

    PubMed

    Gibby, Jacob T; Njeru, Dennis K; Cvetko, Steve T; Heiny, Eric L; Creer, Andrew R; Gibby, Wendell A

We correlate and evaluate the accuracy of accepted anthropometric methods of percent body fat (%BF) quantification, namely hydrostatic weighing (HW) and air displacement plethysmography (ADP), against 2 automatic adipose tissue quantification methods using computed tomography (CT). Twenty volunteer subjects (14 men, 6 women) received head-to-toe CT scans. Hydrostatic weighing and ADP were obtained from 17 and 12 subjects, respectively. The CT data underwent conversion using 2 separate algorithms, namely the Schneider method and the Beam method, to convert Hounsfield units to their respective tissue densities. The overall mass and %BF from both methods were compared with HW and ADP. When comparing ADP to CT data using the Schneider method and the Beam method, correlations were r = 0.9806 and 0.9804, respectively. Paired t tests indicated there were no statistically significant biases. Additionally, the observed average differences in %BF between ADP and the Schneider method and the Beam method were 0.38% and 0.77%, respectively. The %BF measured from ADP, the Schneider method, and the Beam method all had significantly higher mean differences when compared with HW (3.05%, 2.32%, and 1.94%, respectively). We have shown that total body mass correlates remarkably well with both the Schneider method and the Beam method of mass quantification. Furthermore, %BF calculated with the Schneider method and Beam method CT algorithms correlates remarkably well with ADP. The application of these CT algorithms has utility in further research to accurately stratify risk factors associated with periorgan, visceral, and subcutaneous adipose tissue, and has the potential for significant clinical application.
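For illustration, adipose tissue is commonly segmented from CT by thresholding Hounsfield units. The sketch below counts voxels inside a widely used fat window; this is a deliberate simplification, not the Schneider or Beam density-conversion algorithms used in the study, and the HU window and voxel values are assumptions:

```python
def percent_fat_voxels(hounsfield_units, fat_range=(-190, -30)):
    """Fraction of voxels whose HU falls in a commonly used adipose
    window. A crude %BF proxy: real mass-based methods convert HU to
    tissue density per voxel rather than counting a binary mask."""
    low, high = fat_range
    fat = sum(1 for hu in hounsfield_units if low <= hu <= high)
    return 100.0 * fat / len(hounsfield_units)

# Toy voxel sample: air (-500), adipose (negative HU in window),
# and soft tissue (positive HU)
voxels = [-500, -100, -60, 40, 35, -150, 20, -80]
pbf = percent_fat_voxels(voxels)
```

A density-based method would instead map each HU value to mass via a piecewise-linear calibration and sum the fat-compartment mass over the whole scan.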

  14. Direct quantification of fatty acids in wet microalgal and yeast biomass via a rapid in situ fatty acid methyl ester derivatization approach.

    PubMed

    Dong, Tao; Yu, Liang; Gao, Difeng; Yu, Xiaochen; Miao, Chao; Zheng, Yubin; Lian, Jieni; Li, Tingting; Chen, Shulin

    2015-12-01

Accurate determination of fatty acid content is routinely required in microalgal and yeast biofuel studies. A method of rapid in situ fatty acid methyl ester (FAME) derivatization directly from wet fresh microalgal and yeast biomass was developed in this study. This method does not require prior solvent extraction or dehydration. FAMEs were prepared with a sequential alkaline hydrolysis (15 min at 85 °C) and acidic esterification (15 min at 85 °C) process. The resulting FAMEs were extracted into n-hexane and analyzed using gas chromatography. The effect of each processing parameter (temperature, reaction time, and water content) on lipid quantification in the alkaline hydrolysis step was evaluated with a full factorial design. The method could tolerate water content up to 20% (v/v) of the total reaction volume, which corresponds to up to 1.2 mL of water in the biomass slurry (with 0.05-25 mg of fatty acid). There were no significant differences in FAME quantification (p>0.05) between the standard AOAC 991.39 method and the proposed wet in situ FAME preparation method. This fatty acid quantification method is applicable to fresh wet biomass of a wide range of microalgae and yeast species.

  15. [Progress in stable isotope labeled quantitative proteomics methods].

    PubMed

    Zhou, Yuan; Shan, Yichu; Zhang, Lihua; Zhang, Yukui

    2013-06-01

Quantitative proteomics is an important research field in the post-genomics era. There are two strategies for proteome quantification: label-free methods and stable isotope labeling methods; the latter have become the most important strategy for quantitative proteomics at present. In the past few years, a number of quantitative methods have been developed that have supported rapid advances in biological research. In this work, we discuss the progress in stable isotope labeling methods for both relative and absolute quantitative proteomics, and then give our opinions on the outlook for proteome quantification methods.

  16. Probabilistic risk assessment for a loss of coolant accident in McMaster Nuclear Reactor and application of reliability physics model for modeling human reliability

    NASA Astrophysics Data System (ADS)

    Ha, Taesung

A probabilistic risk assessment (PRA) was conducted for a loss of coolant accident (LOCA) in the McMaster Nuclear Reactor (MNR). A level 1 PRA was completed, including event sequence modeling, system modeling, and quantification. To support the quantification of the accident sequences identified, data analysis using the Bayesian method and human reliability analysis (HRA) using the accident sequence evaluation procedure (ASEP) approach were performed. Since human performance in research reactors differs significantly from that in power reactors, a time-oriented HRA model (reliability physics model) was applied to estimate the human error probability (HEP) for the core relocation. This model is based on two competing random variables: phenomenological time and performance time. Response surface methods and direct Monte Carlo simulation with Latin Hypercube sampling were applied to estimate the phenomenological time, whereas the performance time was obtained from interviews with operators. An appropriate probability distribution for the phenomenological time was assigned using statistical goodness-of-fit tests, and the HEP for the core relocation was estimated from the two competing quantities. The sensitivity of each probability distribution in the human reliability estimate was investigated. To quantify the uncertainty in the predicted HEPs, a Bayesian approach was selected because of its capability of incorporating uncertainties in the model itself and in its parameters. The HEP from the current time-oriented model was compared with that from the ASEP approach, and both results were used to evaluate the sensitivity of alternative human reliability modeling for the manual core relocation in the LOCA risk model. This exercise demonstrated the applicability of a reliability physics model supplemented with a Bayesian approach for modeling human reliability, and its potential usefulness for quantifying model uncertainty as a sensitivity analysis in the PRA model.
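In a reliability physics model of this kind, the HEP is the probability that the operators' performance time exceeds the phenomenological time. That probability can be estimated by plain Monte Carlo; the lognormal parameters below are illustrative assumptions, not the distributions fitted in the study, and simple random sampling stands in for the Latin Hypercube design:

```python
import random

def estimate_hep(n=100_000, seed=1):
    """Monte Carlo estimate of HEP = P(performance time > phenomenological
    time), with both times drawn from illustrative lognormal distributions."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        t_phen = rng.lognormvariate(3.0, 0.3)  # time until core damage (assumed)
        t_perf = rng.lognormvariate(2.5, 0.5)  # operator response time (assumed)
        if t_perf > t_phen:
            failures += 1
    return failures / n

hep = estimate_hep()
```

With these parameters the log-difference is normal with mean -0.5 and SD about 0.58, so the estimate lands near 0.2; the study's actual distributions and sampling design would of course give different numbers.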

  17. Development and Evaluation of Event-Specific Quantitative PCR Method for Genetically Modified Soybean MON87701.

    PubMed

    Tsukahara, Keita; Takabatake, Reona; Masubuchi, Tomoko; Futo, Satoshi; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Nishimaki-Mogami, Tomoko; Kurashima, Takeyo; Mano, Junichi; Kitta, Kazumi

    2016-01-01

A real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) soybean event, MON87701. First, a standard plasmid for MON87701 quantification was constructed. The conversion factor (Cf) required to calculate the amount of genetically modified organism (GMO) was experimentally determined for a real-time PCR instrument; the determined Cf was 1.24. For the evaluation of the developed method, a blind test was carried out in an inter-laboratory trial. The trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSDr), respectively. The determined biases and RSDr values were less than 30% and 13%, respectively, at all evaluated concentrations. The limit of quantitation of the method was 0.5%, and the developed method is thus applicable for practical analyses for the detection and quantification of MON87701.
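The Cf-based conversion from a copy-number ratio to a GMO weight ratio can be sketched as follows. The copy numbers below are invented, while Cf = 1.24 is the value the study reports for its instrument:

```python
def gmo_weight_percent(event_copies, taxon_copies, cf):
    """Convert the measured copy-number ratio (event / taxon) into a
    GMO weight percentage using the experimentally determined
    conversion factor Cf (copy ratio per unit weight ratio)."""
    return (event_copies / taxon_copies) / cf * 100.0

# Hypothetical measurement: 62 event copies vs. 1000 taxon copies
pct = gmo_weight_percent(event_copies=62, taxon_copies=1000, cf=1.24)
```

Here a 6.2% copy ratio corresponds to a 5% GMO weight fraction, which is why Cf must be re-determined per event and per instrument.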

  18. Development and interlaboratory validation of quantitative polymerase chain reaction method for screening analysis of genetically modified soybeans.

    PubMed

    Takabatake, Reona; Onishi, Mari; Koiwa, Tomohiro; Futo, Satoshi; Minegishi, Yasutaka; Akiyama, Hiroshi; Teshima, Reiko; Kurashima, Takeyo; Mano, Junichi; Furui, Satoshi; Kitta, Kazumi

    2013-01-01

    A novel real-time polymerase chain reaction (PCR)-based quantitative screening method was developed for three genetically modified soybeans: RRS, A2704-12, and MON89788. The 35S promoter (P35S) of cauliflower mosaic virus is introduced into RRS and A2704-12 but not MON89788. We then designed a screening method comprised of the combination of the quantification of P35S and the event-specific quantification of MON89788. The conversion factor (Cf) required to convert the amount of a genetically modified organism (GMO) from a copy number ratio to a weight ratio was determined experimentally. The trueness and precision were evaluated as the bias and reproducibility of relative standard deviation (RSDR), respectively. The determined RSDR values for the method were less than 25% for both targets. We consider that the developed method would be suitable for the simple detection and approximate quantification of GMO.

  19. Bayesian forecasting and uncertainty quantifying of stream flows using Metropolis–Hastings Markov Chain Monte Carlo algorithm

    DOE PAGES

    Wang, Hongrui; Wang, Cheng; Wang, Ying; ...

    2017-04-05

This paper presents a Bayesian approach using the Metropolis-Hastings Markov Chain Monte Carlo algorithm and applies it to daily river flow rate forecasting and uncertainty quantification for the Zhujiachuan River, using data collected from Qiaotoubao Gage Station and 13 other gage stations in the Zhujiachuan watershed in China. The proposed method is also compared with conventional maximum likelihood estimation (MLE) for parameter estimation and quantification of the associated uncertainties. While the Bayesian method performs similarly in estimating the mean daily flow rate, it outperforms the conventional MLE method in uncertainty quantification, providing a narrower credible interval than the MLE confidence interval and thus a more precise estimate by using the related information from regional gage stations. As a result, the Bayesian MCMC method may be more favorable for uncertainty analysis and risk management.
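A random-walk Metropolis-Hastings sampler of the kind underlying this approach can be sketched on a toy one-dimensional posterior (a standard normal); this is illustrative only, not the paper's hydrological model, and the step size and chain length are arbitrary choices:

```python
import math
import random

def metropolis_hastings(log_post, x0, n=20_000, step=0.5, seed=0):
    """Random-walk Metropolis-Hastings: propose x' ~ N(x, step) and
    accept with probability min(1, post(x') / post(x)). The proposal is
    symmetric, so no proposal-density correction is needed."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n):
        prop = x + rng.gauss(0.0, step)
        if math.log(rng.random()) < log_post(prop) - log_post(x):
            x = prop  # accept; otherwise keep the current state
        samples.append(x)
    return samples

# Toy posterior: standard normal, log-density up to a constant
chain = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0)
burned = chain[5000:]               # discard burn-in
mean = sum(burned) / len(burned)    # posterior mean estimate, ~0
```

Credible intervals like those discussed in the abstract are then read off as quantiles of the retained samples.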

  20. Lowering the quantification limit of the QubitTM RNA HS assay using RNA spike-in.

    PubMed

    Li, Xin; Ben-Dov, Iddo Z; Mauro, Maurizio; Williams, Zev

    2015-05-06

RNA quantification is often a prerequisite for most RNA analyses such as RNA sequencing. However, the relatively low sensitivity and large sample consumption of traditional RNA quantification methods such as UV spectrophotometry, and even of the much more sensitive fluorescence-based assays such as the Qubit™ RNA HS Assay, are often inadequate for measuring minute levels of RNA isolated from limited cell and tissue samples and biofluids. Thus, there is a pressing need for a more sensitive method to reliably and robustly detect trace levels of RNA without interference from DNA. To improve the quantification limit of the Qubit™ RNA HS Assay, we spiked in a known quantity of RNA to achieve the minimum reading required by the assay. Samples containing trace amounts of RNA were then added to the spike-in and measured as a reading increase over the RNA spike-in baseline. We determined the accuracy and precision of reading increases between 1 and 20 pg/μL, as well as the RNA specificity in this range, and compared them to those of RiboGreen(®), another sensitive fluorescence-based RNA quantification assay. We then applied the Qubit™ Assay with RNA spike-in to quantify plasma RNA samples. The RNA spike-in improved the quantification limit of the Qubit™ RNA HS Assay 5-fold, from 25 pg/μL down to 5 pg/μL, while maintaining high specificity to RNA. This enabled quantification of RNA samples with original concentrations as low as 55.6 pg/μL, compared with 250 pg/μL for the standard assay, and decreased sample consumption from 5 to 1 ng. Plasma RNA samples that were not measurable by the standard Qubit™ RNA HS Assay were measurable by our modified method. The Qubit™ RNA HS Assay with RNA spike-in is thus able to quantify RNA with high specificity at 5-fold lower concentration, and uses 5-fold less sample, than the standard assay.

  1. Quantification of chitinase and thaumatin-like proteins in grape juices and wines.

    PubMed

    Le Bourse, D; Conreux, A; Villaume, S; Lameiras, P; Nuzillard, J-M; Jeandet, P

    2011-09-01

Chitinases and thaumatin-like proteins are important grape proteins as they have a great influence on wine quality. The quantification of these proteins in grape juices and wines, along with their purification, is therefore crucial to study their intrinsic characteristics and the exact role they play in wines. The main isoforms of these two proteins from Chardonnay grape juice were thus purified by liquid chromatography. Two fast protein liquid chromatography (FPLC) steps allowed the fractionation and purification of the juice proteins, using cation exchange and hydrophobic interaction media. A further high-performance liquid chromatography (HPLC) step was used to achieve higher purity levels. Fraction assessment was achieved by mass spectrometry. Fraction purity was determined by HPLC to detect the presence of protein contaminants, and by nuclear magnetic resonance (NMR) spectroscopy to detect the presence of organic contaminants. Once pure fractions of lyophilized chitinase and thaumatin-like protein were obtained, ultra-HPLC (UHPLC) and enzyme-linked immunosorbent assay (ELISA) calibration curves were constructed. The quantification of these proteins in different grape juice and wine samples was thus achieved for the first time with both techniques through comparison with the purified protein calibration curves. UHPLC and ELISA showed very consistent results (less than 16% deviation for both proteins) and either could be considered to provide an accurate and reliable quantification of proteins in the oenology field.

  2. Downregulation of peroxisome proliferator-activated receptors (PPARs) in nasal polyposis

    PubMed Central

    Cardell, Lars-Olaf; Hägge, Magnus; Uddman, Rolf; Adner, Mikael

    2005-01-01

Background Peroxisome proliferator-activated receptors (PPAR) α, β/δ and γ are nuclear receptors activated by fatty acid metabolites. An anti-inflammatory role for these receptors in airway inflammation has been suggested. Methods Nasal biopsies were obtained from 10 healthy volunteers and 10 patients with symptomatic allergic rhinitis. Nasal polyps were obtained from 22 patients, before and after 4 weeks of local steroid treatment (fluticasone). Real-time RT-PCR was used for mRNA quantification and immunohistochemistry for protein localization and quantification. Results mRNA expression of PPARα, PPARβ/δ and PPARγ was found in all specimens. No differences in the expression of PPARs were found between nasal biopsies from patients with allergic rhinitis and healthy volunteers. Nasal polyps exhibited lower levels of PPARα and PPARγ than normal nasal mucosa, and the PPARγ levels were further reduced following steroid treatment. PPARγ immunoreactivity was detected in the epithelium, but was also found in smooth muscle of blood vessels, glandular acini and inflammatory cells. Quantitative evaluation of the epithelial immunostaining revealed no differences between nasal biopsies from patients with allergic rhinitis and healthy volunteers. In polyps, the PPARγ immunoreactivity was lower than in nasal mucosa and further decreased after steroid treatment. Conclusion The down-regulation of PPARγ in nasal polyposis, but not in turbinates during symptomatic seasonal rhinitis, suggests that PPARγ might be of importance in long-standing inflammations. PMID:16271155

  3. Novel quantitative real-time LCR for the sensitive detection of SNP frequencies in pooled DNA: method development, evaluation and application.

    PubMed

    Psifidi, Androniki; Dovas, Chrysostomos; Banos, Georgios

    2011-01-19

    Single nucleotide polymorphisms (SNP) have proven to be powerful genetic markers for genetic applications in medicine, life science and agriculture. A variety of methods exist for SNP detection but few can quantify SNP frequencies when the mutated DNA molecules correspond to a small fraction of the wild-type DNA. Furthermore, there is no generally accepted gold standard for SNP quantification, and, in general, currently applied methods give inconsistent results in selected cohorts. In the present study we sought to develop a novel method for accurate detection and quantification of SNP in DNA pooled samples. The development and evaluation of a novel Ligase Chain Reaction (LCR) protocol that uses a DNA-specific fluorescent dye to allow quantitative real-time analysis is described. Different reaction components and thermocycling parameters affecting the efficiency and specificity of LCR were examined. Several protocols, including gap-LCR modifications, were evaluated using plasmid standard and genomic DNA pools. A protocol of choice was identified and applied for the quantification of a polymorphism at codon 136 of the ovine PRNP gene that is associated with susceptibility to a transmissible spongiform encephalopathy in sheep. The real-time LCR protocol developed in the present study showed high sensitivity, accuracy, reproducibility and a wide dynamic range of SNP quantification in different DNA pools. The limits of detection and quantification of SNP frequencies were 0.085% and 0.35%, respectively. The proposed real-time LCR protocol is applicable when sensitive detection and accurate quantification of low copy number mutations in DNA pools is needed. Examples include oncogenes and tumour suppressor genes, infectious diseases, pathogenic bacteria, fungal species, viral mutants, drug resistance resulting from point mutations, and genetically modified organisms in food.

  4. Zero-Echo-Time and Dixon Deep Pseudo-CT (ZeDD CT): Direct Generation of Pseudo-CT Images for Pelvic PET/MRI Attenuation Correction Using Deep Convolutional Neural Networks with Multiparametric MRI.

    PubMed

    Leynes, Andrew P; Yang, Jaewon; Wiesinger, Florian; Kaushik, Sandeep S; Shanbhag, Dattesh D; Seo, Youngho; Hope, Thomas A; Larson, Peder E Z

    2018-05-01

Accurate quantification of uptake on PET images depends on accurate attenuation correction in reconstruction. Current MR-based attenuation correction methods for body PET use a fat and water map derived from a 2-echo Dixon MRI sequence in which bone is neglected. Ultrashort-echo-time or zero-echo-time (ZTE) pulse sequences can capture bone information. We propose the use of patient-specific multiparametric MRI consisting of Dixon MRI and proton-density-weighted ZTE MRI to directly synthesize pseudo-CT images with a deep learning model: we call this method ZTE and Dixon deep pseudo-CT (ZeDD CT). Methods: Twenty-six patients were scanned using an integrated 3-T time-of-flight PET/MRI system. Helical CT images of the patients were acquired separately. A deep convolutional neural network was trained to transform ZTE and Dixon MR images into pseudo-CT images. Ten patients were used for model training, and 16 patients were used for evaluation. Bone and soft-tissue lesions were identified, and the SUVmax was measured. The root-mean-squared error (RMSE) was used to compare the MR-based attenuation correction with the ground-truth CT attenuation correction. Results: In total, 30 bone lesions and 60 soft-tissue lesions were evaluated. The RMSE in PET quantification was reduced by a factor of 4 for bone lesions (10.24% for Dixon PET and 2.68% for ZeDD PET) and by a factor of 1.5 for soft-tissue lesions (6.24% for Dixon PET and 4.07% for ZeDD PET). Conclusion: ZeDD CT produces natural-looking and quantitatively accurate pseudo-CT images and reduces error in pelvic PET/MRI attenuation correction compared with standard methods. © 2018 by the Society of Nuclear Medicine and Molecular Imaging.

  5. The determination and quantification of photosynthetic pigments by reverse phase high-performance liquid chromatography, thin-layer chromatography, and spectrophotometry.

    PubMed

    Pocock, Tessa; Król, Marianna; Huner, Norman P A

    2004-01-01

Chlorophylls and carotenoids are functionally important pigment molecules in photosynthetic organisms. Methods for the determination of chlorophylls a and b, beta-carotene, neoxanthin, and the pigments that are involved in photoprotective cycles such as the xanthophylls are discussed. These cycles involve the reversible de-epoxidation of violaxanthin into antheraxanthin and zeaxanthin, as well as the reversible de-epoxidation of lutein-5,6-epoxide into lutein. This chapter describes pigment extraction procedures from higher plants and green algae. Methods for the determination and quantification using high-performance liquid chromatography (HPLC) are described, as well as methods for the separation and purification of pigments for use as standards using thin-layer chromatography (TLC). In addition, several spectrophotometric methods for the quantification of chlorophylls a and b are described.
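Spectrophotometric chlorophyll quantification of the kind referenced here solves simultaneous equations on absorbances at the two chlorophyll absorption maxima. The sketch below uses coefficients commonly attributed to Porra et al. (1989) for buffered 80% acetone extracts; the coefficients are quoted from memory and should be verified against the original before any real use, and the absorbance inputs are invented:

```python
def chlorophylls_ug_per_ml(a663_6, a646_6):
    """Chlorophyll a and b (ug/mL) from the absorbances of a buffered
    80% acetone extract at 663.6 nm and 646.6 nm, using coefficients
    commonly attributed to Porra et al. (1989). Verify before use."""
    chl_a = 12.25 * a663_6 - 2.55 * a646_6
    chl_b = 20.31 * a646_6 - 4.91 * a663_6
    return chl_a, chl_b

# Hypothetical extract absorbances (path length 1 cm assumed)
chl_a, chl_b = chlorophylls_ug_per_ml(0.80, 0.35)
```

Each wavelength receives contributions from both pigments, which is why two absorbance readings and a pair of linear equations are needed rather than a single Beer-Lambert calculation.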

  6. Metering error quantification under voltage and current waveform distortion

    NASA Astrophysics Data System (ADS)

    Wang, Tao; Wang, Jia; Xie, Zhi; Zhang, Ran

    2017-09-01

With the integration of more and more renewable energy sources and distorting loads into the power grid, voltage and current waveform distortion causes metering errors in smart meters. Because of its negative effects on metering accuracy and fairness, the combined energy metering error is an important subject of study. In this paper, after comparing theoretical and recorded metering values under different meter modes for linear and nonlinear loads, a quantification method for the metering mode error under waveform distortion is proposed. Based on the metering and time-division multiplier principles, a quantification method for the metering accuracy error is also proposed. From the analysis of the mode error and the accuracy error, a comprehensive error analysis method is presented that is suitable for new energy sources and nonlinear loads. The proposed method is validated by simulation.

  7. A simple and fast method for extraction and quantification of cryptophyte phycoerythrin.

    PubMed

    Thoisen, Christina; Hansen, Benni Winding; Nielsen, Søren Laurentius

    2017-01-01

    The microalgal pigment phycoerythrin (PE) is of commercial interest as a natural colorant in food and cosmetics, as well as a fluoroprobe for laboratory analysis. Several methods for extraction and quantification of PE are available, but they typically involve various extraction buffers, repeated freeze-thaw cycles, and liquid nitrogen, which complicates the extraction procedure. A simple method for extraction of PE from cryptophytes using standard laboratory materials and equipment is described. The cryptophyte cells on the filters were disrupted at -80 °C, phosphate buffer was added for extraction at 4 °C, and absorbance was then measured. The cryptophyte Rhodomonas salina was used as a model organism. •Simple method for extraction and quantification of phycoerythrin from cryptophytes. •Minimal usage of equipment and chemicals, and low labor costs. •Applicable for industrial and biological purposes.
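
The absorbance measurement above converts to a pigment concentration via Beer-Lambert. A sketch; the absorption coefficient of 12.6 mL mg^-1 cm^-1 is an assumed, illustrative value, not one taken from this paper:

```python
def pe_mg_per_ml(a_peak, a_750, eps=12.6, path_cm=1.0):
    """Beer-Lambert estimate of phycoerythrin in a buffer extract:
    blank-corrected peak absorbance divided by a mass-specific absorption
    coefficient eps (mL mg^-1 cm^-1). The default eps is an assumed,
    illustrative value - use one determined for the pigment, buffer,
    and instrument at hand."""
    return (a_peak - a_750) / (eps * path_cm)

# e.g. peak absorbance 0.315 against a turbidity blank A750 = 0.063:
print(round(pe_mg_per_ml(0.315, 0.063), 4))
```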

  8. Evaluation of the performance of quantitative detection of the Listeria monocytogenes prfA locus with droplet digital PCR.

    PubMed

    Witte, Anna Kristina; Fister, Susanne; Mester, Patrick; Schoder, Dagmar; Rossmanith, Peter

    2016-11-01

    Fast and reliable pathogen detection is an important issue for human health. Since conventional microbiological methods are rather slow, there is growing interest in detection and quantification using molecular methods. Droplet digital polymerase chain reaction (ddPCR) is a relatively new PCR method for absolute and accurate quantification without external standards. Using the Listeria monocytogenes-specific prfA assay, we focused on whether the assay was directly transferable to ddPCR and whether ddPCR was suitable for samples derived from heterogeneous matrices, such as foodstuffs, which often contain inhibitors and a non-target bacterial background flora. Although the prfA assay showed suboptimal cluster formation, use of ddPCR for quantification of L. monocytogenes from pure bacterial cultures, artificially contaminated cheese, and naturally contaminated foodstuff was satisfactory over a relatively broad dynamic range. Moreover, the results demonstrated an outstanding detection limit of one copy. However, while poorer DNA quality, such as that resulting from longer storage, can impair ddPCR, quantification by ddPCR of a prfA internal amplification control (IAC) integrated in the genome of L. monocytogenes ΔprfA showed even slightly better performance over a broader dynamic range. Graphical Abstract Evaluating the absolute quantification potential of ddPCR targeting Listeria monocytogenes prfA.
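
The "absolute quantification without external standards" above rests on Poisson statistics over droplet partitions: a droplet is negative only if it received zero target copies. A sketch; the 0.85 nL droplet volume is an assumed, instrument-specific value:

```python
import math

def ddpcr_copies_per_ul(positive, total, droplet_nl=0.85):
    """Absolute target concentration from droplet counts via Poisson statistics.

    A droplet is negative only if it received zero copies, so the mean copies
    per droplet is lambda = -ln(1 - p), where p is the positive fraction.
    droplet_nl is an assumed partition volume (instrument-specific)."""
    p = positive / total
    lam = -math.log(1.0 - p)            # mean copies per droplet
    return lam / (droplet_nl * 1e-3)    # copies per microlitre of reaction

# e.g. 4,000 positive droplets out of 15,000 accepted:
print(round(ddpcr_copies_per_ul(4000, 15000), 1))
```

Because the positive fraction alone fixes the concentration, no standard curve is needed, which is what distinguishes ddPCR from qPCR here.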

  9. Identification and absolute quantification of enzymes in laundry detergents by liquid chromatography tandem mass spectrometry.

    PubMed

    Gaubert, Alexandra; Jeudy, Jérémy; Rougemont, Blandine; Bordes, Claire; Lemoine, Jérôme; Casabianca, Hervé; Salvador, Arnaud

    2016-07-01

    In a stricter legislative context, greener detergent formulations are being developed. Accordingly, synthetic surfactants are frequently replaced by bio-sourced surfactants and/or used at lower concentrations in combination with enzymes. In this paper, an LC-MS/MS method was developed for the identification and quantification of enzymes in laundry detergents. Prior to the LC-MS/MS analyses, a specific sample preparation protocol was developed to deal with matrix complexity (high surfactant percentages). Then, for each enzyme family mainly used in detergent formulations (protease, amylase, cellulase, and lipase), specific peptides were identified on a high-resolution platform. An LC-MS/MS method was then developed in selected reaction monitoring (SRM) mode for the light and corresponding heavy peptides. The method was linear over the peptide concentration ranges 25-1000 ng/mL for protease, lipase, and cellulase; 50-1000 ng/mL for amylase; and 5-1000 ng/mL for cellulase in both water and laundry detergent matrices. The application of the developed analytical strategy to real commercial laundry detergents enabled enzyme identification and absolute quantification. For the first time, identification and absolute quantification of enzymes in laundry detergent were realized by LC-MS/MS in a single run. Graphical Abstract Identification and quantification of enzymes by LC-MS/MS.
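
Quantification against light/heavy peptide pairs such as those above reduces to a peak-area ratio against the spiked heavy standard. A generic stable-isotope-dilution calculation with hypothetical areas, not the paper's calibration model:

```python
def sid_concentration(area_light, area_heavy, heavy_conc_ng_ml):
    """Stable-isotope-dilution quantification: the analyte ('light') peptide
    concentration is inferred from its SRM peak-area ratio to a co-eluting
    heavy-labelled internal standard spiked at a known concentration.
    Generic SID arithmetic, not this paper's exact calibration."""
    return (area_light / area_heavy) * heavy_conc_ng_ml

# Hypothetical run: protease peptide area 5.2e5 vs heavy standard area 2.0e5,
# heavy standard spiked at 200 ng/mL:
print(sid_concentration(5.2e5, 2.0e5, 200.0))   # ng/mL
```

Because light and heavy peptides co-elute and ionize identically, the ratio cancels matrix effects, which is why this works in a surfactant-rich matrix.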

  10. A phytochemical comparison of saw palmetto products using gas chromatography and 1H nuclear magnetic resonance spectroscopy metabolomic profiling.

    PubMed

    Booker, Anthony; Suter, Andy; Krnjic, Ana; Strassel, Brigitte; Zloh, Mire; Said, Mazlina; Heinrich, Michael

    2014-06-01

    Preparations containing saw palmetto berries are used in the treatment of benign prostatic hyperplasia (BPH). There are many products on the market, and relatively little is known about their chemical variability, and specifically about the composition and quality of different saw palmetto products, even though in 2000 an international consultation paper from the major urological associations of the five continents on treatments for BPH demanded further research on this topic. Here, we compare two analytical approaches and characterise 57 different saw palmetto products. An established method, gas chromatography, was used for the quantification of nine fatty acids, while a novel approach of metabolomic profiling using 1H nuclear magnetic resonance (NMR) spectroscopy was used as a fingerprinting tool to assess the overall composition of the extracts. The phytochemical analysis of the fatty acids showed a high level of heterogeneity among the different products, both in the total amount and in the nine single fatty acids. A robust and reproducible 1H NMR spectroscopy method was established, and the results showed that it was possible to statistically differentiate between saw palmetto products that had been extracted under different conditions, but not between products that used a similar extraction method. Principal component analysis was able to determine those products that had significantly different metabolites. The metabolomic approach developed here offers novel opportunities for quality control along the value chain of saw palmetto and needs to be followed up, as with this method the complexity of a herbal extract can be better assessed than with the analysis of a single group of constituents. © 2014 The Authors. Journal of Pharmacy and Pharmacology published by John Wiley & Sons Ltd on behalf of Royal Pharmaceutical Society.
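
Principal component analysis of the kind used above to separate extraction methods can be sketched on a toy NMR bucket table. The data below are synthetic and purely illustrative; real fingerprints have far more bins:

```python
import numpy as np

# Toy PCA of an NMR fingerprint table: rows = products, columns = binned
# spectral intensities. Two (hypothetical) extraction types are given
# different mean spectra, so they should separate along the leading component.
rng = np.random.default_rng(1)
ethanolic = rng.normal(0.0, 0.1, (6, 20)) + np.linspace(1.0, 2.0, 20)
hexane    = rng.normal(0.0, 0.1, (6, 20)) + np.linspace(2.0, 1.0, 20)
X = np.vstack([ethanolic, hexane])

Xc = X - X.mean(axis=0)                      # mean-centre the bucket table
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s                               # PC scores for each product
explained = s**2 / np.sum(s**2)              # variance explained per component
print(f"PC1 explains {100 * explained[0]:.0f}% of the variance")
```

Products extracted under different conditions land on opposite sides of PC1, mirroring the statistical differentiation reported in the abstract.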

  11. Methods for the quantification of coarse woody debris and an examination of its spatial patterning: A study from the Tenderfoot Creek Experimental Forest, MT

    Treesearch

    Paul B. Alaback; Duncan C. Lutes

    1997-01-01

    Methods for the quantification of coarse woody debris volume and the description of spatial patterning were studied in the Tenderfoot Creek Experimental Forest, Montana. The line transect method was found to be an accurate, unbiased estimator of down debris volume (> 10cm diameter) on 1/4 hectare fixed-area plots, when perpendicular lines were used. The Fischer...
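
The line transect (line-intersect) estimator referred to above has a simple closed form. A sketch using Van Wagner's formula with invented piece diameters; the study's own field protocol may differ:

```python
import math

def cwd_volume_m3_per_ha(diameters_cm, transect_m):
    """Line-intersect estimator of down coarse woody debris volume
    (Van Wagner 1968): V = pi^2 * sum(d^2) / (8 L), with piece diameters d
    in cm measured where each piece crosses the line and transect length L
    in m. The units work out to m^3 per hectare. Illustrative sketch, not
    the exact Tenderfoot Creek protocol."""
    return math.pi ** 2 * sum(d * d for d in diameters_cm) / (8.0 * transect_m)

# e.g. pieces of 12, 25 and 40 cm diameter crossing a 100 m transect:
print(round(cwd_volume_m3_per_ha([12, 25, 40], 100.0), 2))
```

The estimator is unbiased when pieces are randomly oriented; perpendicular line pairs, as used in the study, guard against orientation bias.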

  12. Uncertainty quantification of effective nuclear interactions

    DOE PAGES

    Pérez, R. Navarro; Amaro, J. E.; Arriola, E. Ruiz

    2016-03-02

    We give a brief review of the development of phenomenological NN interactions and the corresponding quantification of statistical uncertainties. We look into the uncertainty of effective interactions broadly used in mean field calculations through the Skyrme parameters and effective field theory counter-terms by estimating both statistical and systematic uncertainties stemming from the NN interaction. We also comment on the role played by different fitting strategies in the light of recent developments.

  13. Uncertainty quantification of effective nuclear interactions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pérez, R. Navarro; Amaro, J. E.; Arriola, E. Ruiz

    We give a brief review of the development of phenomenological NN interactions and the corresponding quantification of statistical uncertainties. We look into the uncertainty of effective interactions broadly used in mean field calculations through the Skyrme parameters and effective field theory counter-terms by estimating both statistical and systematic uncertainties stemming from the NN interaction. We also comment on the role played by different fitting strategies in the light of recent developments.

  14. Inverse modelling of radionuclide release rates using gamma dose rate observations

    NASA Astrophysics Data System (ADS)

    Hamburger, Thomas; Stohl, Andreas; von Haustein, Christoph; Thummerer, Severin; Wallner, Christian

    2014-05-01

    Severe accidents in nuclear power plants, such as the historical accident in Chernobyl in 1986 or the more recent disaster in the Fukushima Dai-ichi nuclear power plant in 2011, have drastic impacts on the population and environment. The hazardous consequences extend to national and continental scales. Environmental measurements and methods to model the transport and dispersion of the released radionuclides serve as a platform to assess the regional impact of nuclear accidents - both for research purposes and, more importantly, to determine the immediate threat to the population. However, the assessments of the regional radionuclide activity concentrations and the individual exposure to radiation dose are subject to several uncertainties, for example the accurate model representation of wet and dry deposition. One of the most significant uncertainties, however, results from the estimation of the source term, that is, the time-dependent quantification of the released spectrum of radionuclides during the course of the nuclear accident. The quantification of the source terms of severe nuclear accidents may either remain uncertain (e.g. Chernobyl, Devell et al., 1995) or rely on rather rough estimates of released key radionuclides given by the operators. Precise measurements are mostly missing due to practical limitations during the accident. Inverse modelling can be used to realise a feasible estimation of the source term (Davoine and Bocquet, 2007). Existing point measurements of radionuclide activity concentrations are therefore combined with atmospheric transport models. The release rates of radionuclides at the accident site are then obtained by improving the agreement between the modelled and observed concentrations (Stohl et al., 2012). The accuracy of the method, and hence of the resulting source term, depends amongst others on the availability, reliability and the resolution in time and space of the observations. 
Radionuclide activity concentrations are observed on a relatively sparse grid, and the temporal resolution of available data may be low, on the order of hours or a day. Gamma dose rates, on the other hand, are observed routinely on a much denser grid and with higher temporal resolution. Gamma dose rate measurements contain no explicit information on the observed spectrum of radionuclides and have to be interpreted carefully. Nevertheless, they provide valuable information for the inverse evaluation of the source term due to their availability (Saunier et al., 2013). We present a new inversion approach combining an atmospheric dispersion model and observations of radionuclide activity concentrations and gamma dose rates to obtain the source term of radionuclides. We use the Lagrangian particle dispersion model FLEXPART (Stohl et al., 1998; Stohl et al., 2005) to model the atmospheric transport of the released radionuclides. The gamma dose rates are calculated from the modelled activity concentrations. The inversion method uses a Bayesian formulation considering uncertainties for the a priori source term and the observations (Eckhardt et al., 2008). The a priori information on the source term is a first guess. The gamma dose rate observations will be used with inverse modelling to improve this first guess and to retrieve a reliable source term. The details of this method will be presented at the conference. This work is funded by the Bundesamt für Strahlenschutz BfS, Forschungsvorhaben 3612S60026. References Davoine, X. and Bocquet, M., Atmos. Chem. Phys., 7, 1549-1564, 2007. Devell, L., et al., OCDE/GD(96)12, 1995. Eckhardt, S., et al., Atmos. Chem. Phys., 8, 3881-3897, 2008. Saunier, O., et al., Atmos. Chem. Phys., 13, 11403-11421, 2013. Stohl, A., et al., Atmos. Environ., 32, 4245-4264, 1998. Stohl, A., et al., Atmos. Chem. Phys., 5, 2461-2474, 2005. Stohl, A., et al., Atmos. Chem. Phys., 12, 2313-2343, 2012.
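
The Bayesian inversion step described above, blending an a priori source term with observations through a transport model, can be sketched as a linear-Gaussian problem. The dimensions, sensitivities and error levels below are toy values; the operational FLEXPART setup is far larger:

```python
import numpy as np

# Toy Bayesian source-term inversion (in the spirit of Eckhardt et al., 2008):
# y = M x + noise, where x holds release rates per time interval, M the
# model-derived source-receptor sensitivities, and the posterior blends the
# observations with an a priori first guess x_a.
rng = np.random.default_rng(0)
n_obs, n_src = 50, 5
M = rng.uniform(0.0, 1.0, (n_obs, n_src))          # source-receptor matrix
x_true = np.array([1.0, 4.0, 2.0, 0.5, 0.0])       # "true" release rates
y = M @ x_true + rng.normal(0.0, 0.05, n_obs)      # observed concentrations/doses

x_a = np.full(n_src, 1.0)                          # rough a priori first guess
R_inv = np.eye(n_obs) / 0.05**2                    # inverse observation covariance
B_inv = np.eye(n_src) / 2.0**2                     # inverse a priori covariance

# Posterior mean of the Gaussian problem (normal equations):
A = M.T @ R_inv @ M + B_inv
x_hat = x_a + np.linalg.solve(A, M.T @ R_inv @ (y - M @ x_a))
print(np.round(x_hat, 2))
```

With many well-characterised observations, the posterior is pulled almost entirely toward the data, which is why denser gamma dose rate networks improve the retrieved source term.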

  15. High-performance Thin-layer Chromatographic-densitometric Quantification and Recovery of Bioactive Compounds for Identification of Elite Chemotypes of Gloriosa superba L. Collected from Sikkim Himalayas (India)

    PubMed Central

    Misra, Ankita; Shukla, Pushpendra Kumar; Kumar, Bhanu; Chand, Jai; Kushwaha, Poonam; Khalid, Md.; Singh Rawat, Ajay Kumar; Srivastava, Sharad

    2017-01-01

    Background: Gloriosa superba L. (Colchicaceae) is used as adjuvant therapy in gout for its potential antimitotic activity due to its high content of colchicine alkaloids. Objective: This study aimed to develop an easy, cheap, precise, and accurate validated high-performance thin-layer chromatographic (HPTLC) method for the simultaneous quantification of the bioactive alkaloids colchicine and gloriosine in G. superba L. and to identify its elite chemotype(s) from the Sikkim Himalayas (India). Methods: The HPTLC method was developed using a mobile phase of chloroform:acetone:diethylamine (5:4:1) at λmax of 350 nm. Results: Five germplasms were collected from the targeted region, and on morpho-anatomical inspection, no significant variation was observed among them. Quantification data reveal that the contents of colchicine (Rf: 0.72) and gloriosine (Rf: 0.61) vary from 0.035% to 0.150% and from 0.006% to 0.032% (dry wt. basis), respectively. Linearity of the method was obtained in the concentration range of 100–400 ng/spot of the marker(s), exhibiting regression coefficients of 0.9987 (colchicine) and 0.9983 (gloriosine) with optimum recoveries of 97.79% ± 3.86% and 100.023% ± 0.01%, respectively. The limits of detection and quantification were 6.245 and 18.926 ng (colchicine) and 8.024 and 24.316 ng (gloriosine), respectively. Two germplasms, namely NBG-27 and NBG-26, were found to be elite chemotypes for both markers. Conclusion: The developed method is validated in terms of accuracy, recovery, and precision as per the ICH guidelines (2005) and can be adopted for the simultaneous quantification of colchicine and gloriosine in phytopharmaceuticals. In addition, this study is relevant for exploring chemotypic variability in metabolite content for commercial and medicinal purposes. SUMMARY An easy, cheap, precise, and accurate validated high-performance thin-layer chromatographic (HPTLC) method for the simultaneous quantification of bioactive alkaloids (colchicine and gloriosine) in G. superba L. Five germplasms were collected from the targeted region, and on morpho-anatomical inspection, no significant variation was observed among them. Quantification data reveal that the contents of colchicine (Rf: 0.72) and gloriosine (Rf: 0.61) vary from 0.035% to 0.150% and from 0.006% to 0.032% (dry wt. basis), respectively. Two germplasms, namely NBG-27 and NBG-26, were found to be elite chemotypes for both markers. PMID:29142436
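
Validation figures such as the LOD and LOQ above are typically derived from the calibration line following ICH guidance (LOD = 3.3σ/S, LOQ = 10σ/S, with σ the residual standard deviation and S the slope). A sketch with made-up densitometric responses, not the paper's raw data:

```python
import numpy as np

# ICH-style LOD/LOQ estimation from a calibration line (synthetic peak areas):
conc = np.array([100.0, 200.0, 300.0, 400.0])     # ng/spot
resp = np.array([1020.0, 2050.0, 2980.0, 4010.0]) # densitometric peak areas (AU)

slope, intercept = np.polyfit(conc, resp, 1)
resid = resp - (slope * conc + intercept)
sigma = resid.std(ddof=2)            # SD of residuals about the regression line

lod = 3.3 * sigma / slope            # limit of detection (ng)
loq = 10.0 * sigma / slope           # limit of quantification (ng)
print(f"slope = {slope:.2f} AU/ng, LOD = {lod:.1f} ng, LOQ = {loq:.1f} ng")
```

Note the fixed 10/3.3 ratio between LOQ and LOD under this definition, which the paper's reported pairs (18.926/6.245 and 24.316/8.024) also follow.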

  16. Development and validation of a rapid and simple LC-MS/MS method for quantification of vemurafenib in human plasma: application to a human pharmacokinetic study.

    PubMed

    Bihan, Kevin; Sauzay, Chloé; Goldwirt, Lauriane; Charbonnier-Beaupel, Fanny; Hulot, Jean-Sebastien; Funck-Brentano, Christian; Zahr, Noël

    2015-02-01

    Vemurafenib (Zelboraf) is a new tyrosine kinase inhibitor that selectively targets the BRAF V600E mutant kinase and is indicated for the treatment of advanced BRAF mutation-positive melanoma. We developed a simple method for vemurafenib quantification using liquid chromatography-tandem mass spectrometry. A stability study of vemurafenib in human plasma was also performed. (13)C(6)-vemurafenib was used as the internal standard. A single-step protein precipitation was used for plasma sample preparation. Chromatography was performed on an Acquity UPLC system (Waters) with chromatographic separation on an Acquity UPLC BEH C18 column (2.1 × 50 mm, 1.7-μm particle size; Waters). Quantification was performed using multiple reaction monitoring of the following transitions: m/z 488.2 → 381.0 for vemurafenib and m/z 494.2 → 387.0 for the internal standard. The method was linear over the range from 1.0 to 100.0 mcg/mL. The lower limit of quantification was 0.1 mcg/mL for vemurafenib in plasma. Vemurafenib remained stable for 1 month at all levels tested, whether stored at room temperature (20 °C), at +4 °C, or at -20 °C. This method was used successfully to perform a plasma pharmacokinetic study of vemurafenib in a patient after oral administration at steady state. This liquid chromatography-tandem mass spectrometry method for vemurafenib quantification in human plasma is simple, rapid, specific, sensitive, accurate, precise, and reliable.

  17. Three-dimensional microstructural characterization of bulk plutonium and uranium metals using focused ion beam technique

    NASA Astrophysics Data System (ADS)

    Chung, Brandon W.; Erler, Robert G.; Teslich, Nick E.

    2016-05-01

    Nuclear forensics requires accurate quantification of discriminating microstructural characteristics of bulk nuclear material to identify its process history and provenance. Conventional metallographic preparation techniques for bulk plutonium (Pu) and uranium (U) metals are limited to providing information in two dimensions (2D) and do not allow a depth profile of the material to be obtained. In this contribution, the use of dual-beam focused ion beam/scanning electron microscopy (FIB-SEM) to investigate the internal microstructure of bulk Pu and U metals is demonstrated. Our results show that the dual-beam methodology optimally elucidates microstructural features without preparation artifacts, and that three-dimensional (3D) characterization of inner microstructures can reveal salient microstructural features that cannot be observed with conventional metallographic techniques. Examples are shown to demonstrate the benefit of FIB-SEM in improving microstructural characterization of microscopic inclusions, particularly with respect to nuclear forensics.

  18. Three-dimensional microstructural characterization of bulk plutonium and uranium metals using focused ion beam technique

    DOE PAGES

    Chung, Brandon W.; Erler, Robert G.; Teslich, Nick E.

    2016-03-03

    Nuclear forensics requires accurate quantification of discriminating microstructural characteristics of bulk nuclear material to identify its process history and provenance. Conventional metallographic preparation techniques for bulk plutonium (Pu) and uranium (U) metals are limited to providing information in two dimensions (2D) and do not allow a depth profile of the material to be obtained. In this contribution, the use of dual-beam focused ion beam/scanning electron microscopy (FIB-SEM) to investigate the internal microstructure of bulk Pu and U metals is demonstrated. Our results show that the dual-beam methodology optimally elucidates microstructural features without preparation artifacts, and that three-dimensional (3D) characterization of inner microstructures can reveal salient microstructural features that cannot be observed with conventional metallographic techniques. As a result, examples are shown to demonstrate the benefit of FIB-SEM in improving microstructural characterization of microscopic inclusions, particularly with respect to nuclear forensics.

  19. Modeling the potential radionuclide transport by the Ob and Yenisey Rivers to the Kara Sea.

    PubMed

    Paluszkiewicz, T; Hibler, L F; Richmond, M C; Bradley, D J; Thomas, S A

    2001-01-01

    A major portion of the former Soviet Union (FSU) nuclear program is located in the West Siberian Basin. Among the many nuclear facilities are three production reactors and the spent nuclear fuel reprocessing sites, Mayak, Tomsk-7, and Krasnoyarsk-26, which together are probably responsible for the majority of the radioactive contamination found in the Ob and Yenisey River systems that feed into the Arctic Ocean through the Kara Sea. This manuscript describes ongoing research to estimate radionuclide fluxes to the Kara Sea from these river systems. Our approach is to apply a hierarchy of simple models that use existing and forthcoming data to quantify the transport and fate of radionuclide contaminants via various environmental pathways. We present an initial quantification of the contaminant inventory, hydrology, meteorology, and sedimentology of the Ob River system and preliminary conclusions from portions of the Ob River model.

  20. Uncertainty Quantification for Robust Control of Wind Turbines using Sliding Mode Observer

    NASA Astrophysics Data System (ADS)

    Schulte, Horst

    2016-09-01

    A new quantification method for uncertain models in robust wind turbine control using sliding-mode techniques is presented, with the objective of improving active load mitigation. The approach is based on the so-called equivalent output injection signal, which corresponds to the average behavior of the discontinuous switching term establishing and maintaining motion on a so-called sliding surface. The injection signal is evaluated directly to obtain estimates of the uncertainty bounds of external disturbances and parameter uncertainties. The applicability of the proposed method is illustrated by the quantification of a four degree-of-freedom model of the NREL 5-MW reference turbine containing uncertainties.
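
The equivalent-output-injection idea can be demonstrated on a first-order toy plant: once sliding motion is established, the low-pass-filtered switching term reproduces the unknown disturbance, and its extremes give an uncertainty bound. A sketch, not the paper's wind-turbine model:

```python
import math

# Toy equivalent-output-injection experiment: plant xdot = -a*x + d(t) with x
# measured; the observer's discontinuous term, low-pass filtered, reconstructs
# the unknown disturbance d(t). All parameter values are illustrative.
a, k, dt, tau = 1.0, 5.0, 1e-4, 5e-3   # plant pole, gain, step size, filter constant
x = xh = v_eq = 0.0
est = []
for n in range(int(2.0 / dt)):
    t = n * dt
    d = 1.5 * math.sin(2 * math.pi * t)          # unknown disturbance, |d| <= 1.5
    v = -k if xh > x else k                      # discontinuous injection term
    x += dt * (-a * x + d)                       # plant (Euler step)
    xh += dt * (-a * xh + v)                     # sliding-mode observer
    v_eq += (dt / tau) * (v - v_eq)              # low-pass -> equivalent injection
    if t > 0.5:                                  # after sliding motion settles
        est.append(v_eq)
bound = max(abs(z) for z in est)
print(f"estimated disturbance bound ~ {bound:.2f} (true bound 1.5)")
```

On the sliding surface the estimation error stays at zero only if the injection cancels the disturbance on average, so its filtered value tracks d(t); the switching gain k must exceed the disturbance bound for sliding to be reached.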

  1. In situ DNA hybridized chain reaction (FISH-HCR) as a better method for quantification of bacteria and archaea within marine sediment

    NASA Astrophysics Data System (ADS)

    Buongiorno, J.; Lloyd, K. G.; Shumaker, A.; Schippers, A.; Webster, G.; Weightman, A.; Turner, S.

    2015-12-01

    Nearly 75% of the Earth's surface is covered by marine sediment that is home to an estimated 2.9 × 10^29 microbial cells. A substantial impediment to understanding the abundance and distribution of cells within marine sediment is the lack of a consistent and reliable method for their taxon-specific quantification. Catalyzed reporter deposition fluorescence in situ hybridization (CARD-FISH) provides taxon-specific enumeration, but this process requires passing a large enzyme through cell membranes, decreasing its precision relative to general cell counts using a small DNA stain. In 2015, Yamaguchi et al. developed FISH hybridization chain reaction (FISH-HCR) as an in situ whole-cell detection method for environmental microorganisms. FISH-HCR amplifies the fluorescent signal, as does CARD-FISH, but it allows for milder cell permeation methods that might prevent yield loss. To compare FISH-HCR to CARD-FISH, we examined bacterial and archaeal cell counts within two sediment cores, Lille Belt (~78 meters deep) and Landsort Deep (90 meters deep), which were retrieved from the Baltic Sea Basin during IODP Expedition 347. Preliminary analysis shows that CARD-FISH counts are below the quantification limit at most depths in both cores. By contrast, quantification of cells was possible with FISH-HCR at all examined depths. Where quantification with CARD-FISH was above the limit of detection, counts with FISH-HCR were up to 11-fold higher for Bacteria and 3-fold higher for Archaea from the same sediment sample. Further, FISH-HCR counts closely follow the trends of shipboard counts, indicating that FISH-HCR may better reflect cellular abundance within marine sediment than other quantification methods, including qPCR. Using FISH-HCR, we found that archaeal cell counts were on average greater than bacterial cell counts, but within the same order of magnitude.
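
Converting microscope field counts such as these to cells per volume of sediment is straightforward bookkeeping. A generic sketch with assumed parameter names and example numbers, not the expedition's exact protocol:

```python
def cells_per_cm3(mean_count_per_field, filter_area_mm2, field_area_mm2,
                  sediment_cm3, dilution):
    """Convert mean microscope-field counts to cells per cm^3 of wet sediment:
    scale one field up to the whole filter, then back through the filtered
    slurry volume and its dilution factor. Generic FISH/direct-count
    bookkeeping; parameter names and values are illustrative."""
    per_filter = mean_count_per_field * (filter_area_mm2 / field_area_mm2)
    return per_filter * dilution / sediment_cm3

# e.g. 25 cells/field, 201 mm^2 filter, 0.01 mm^2 field of view,
# 0.1 cm^3 of sediment fixed in a 1:40 slurry:
print(f"{cells_per_cm3(25, 201.0, 0.01, 0.1, 40):.2e}")
```

The same arithmetic applies to CARD-FISH, FISH-HCR and total (DNA-stain) counts, which is what makes the fold-change comparisons above possible.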

  2. Optimized, Fast-Throughput UHPLC-DAD Based Method for Carotenoid Quantification in Spinach, Serum, Chylomicrons, and Feces.

    PubMed

    Eriksen, Jane N; Madsen, Pia L; Dragsted, Lars O; Arrigoni, Eva

    2017-02-01

    An improved UHPLC-DAD-based method was developed and validated for quantification of major carotenoids present in spinach, serum, chylomicrons, and feces. Separation was achieved with gradient elution within 12.5 min for six dietary carotenoids and the internal standard, echinenone. The proposed method provides, for all standard components, resolution > 1.1, linearity covering the target range (R > 0.99), LOQ < 0.035 mg/L, and intraday and interday RSDs < 2 and 10%, respectively. Suitability of the method was tested on biological matrices. Method precision (RSD%) for carotenoid quantification in serum, chylomicrons, and feces was below 10% for intra- and interday analysis, except for lycopene. Method accuracy was consistent with mean recoveries ranging from 78.8 to 96.9% and from 57.2 to 96.9% for all carotenoids, except for lycopene, in serum and feces, respectively. Additionally, an interlaboratory validation study on spinach at two institutions showed no significant differences in lutein or β-carotene content, when evaluated on four occasions.

  3. Validated Method for the Quantification of Baclofen in Human Plasma Using Solid-Phase Extraction and Liquid Chromatography–Tandem Mass Spectrometry

    PubMed Central

    Nahar, Limon Khatun; Cordero, Rosa Elena; Nutt, David; Lingford-Hughes, Anne; Turton, Samuel; Durant, Claire; Wilson, Sue; Paterson, Sue

    2016-01-01

    A highly sensitive and fully validated method was developed for the quantification of baclofen in human plasma. After adjusting the pH of the plasma samples using a phosphate buffer solution (pH 4), baclofen was purified using mixed-mode (C8/cation exchange) solid-phase extraction (SPE) cartridges. Endogenous water-soluble compounds and lipids were removed from the cartridges before the samples were eluted and concentrated. The samples were analyzed using triple-quadrupole liquid chromatography–tandem mass spectrometry (LC–MS-MS) with triggered dynamic multiple reaction monitoring mode for simultaneous quantification and confirmation. The assay was linear from 25 to 1,000 ng/mL (r2 > 0.999; n = 6). Intraday (n = 6) and interday (n = 15) imprecisions (% relative standard deviation) were <5%, and the average recovery was 30%. The limit of detection of the method was 5 ng/mL, and the limit of quantification was 25 ng/mL. Plasma samples from healthy male volunteers (n = 9, median age: 22) given two single oral doses of baclofen (10 and 60 mg) on nonconsecutive days were analyzed to demonstrate method applicability. PMID:26538544

  4. Highly sensitive quantification for human plasma-targeted metabolomics using an amine derivatization reagent.

    PubMed

    Arashida, Naoko; Nishimoto, Rumi; Harada, Masashi; Shimbo, Kazutaka; Yamada, Naoyuki

    2017-02-15

    Amino acids and their related metabolites play important roles in various physiological processes and have consequently become biomarkers for diseases. However, accurate quantification methods have only been established for major compounds, such as amino acids and a limited number of target metabolites. We previously reported a highly sensitive high-throughput method for the simultaneous quantification of amines using 3-aminopyridyl-N-succinimidyl carbamate as a derivatization reagent combined with liquid chromatography-tandem mass spectrometry (LC-MS/MS). Herein, we report the successful development of a practical and accurate LC-MS/MS method to analyze low concentrations of 40 physiological amines in 19 min. Thirty-five of these amines showed good linearity, limits of quantification, accuracy, precision, and recovery characteristics in plasma, with scheduled selected reaction monitoring acquisitions. Plasma samples from 10 healthy volunteers were evaluated using our newly developed method. The results revealed that 27 amines were detected in one of the samples, and that 24 of these compounds could be quantified. Notably, this new method successfully quantified metabolites with high accuracy across three orders of magnitude, with lowest and highest averaged concentrations of 31.7 nM (for spermine) and 18.3 μM (for α-aminobutyric acid), respectively. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. Volumetric adsorptive microsampling-liquid chromatography tandem mass spectrometry assay for the simultaneous quantification of four antibiotics in human blood: Method development, validation and comparison with dried blood spot.

    PubMed

    Barco, Sebastiano; Castagnola, Elio; Moscatelli, Andrea; Rudge, James; Tripodi, Gino; Cangemi, Giuliana

    2017-10-25

    In this paper we show the development and validation of a volumetric absorptive microsampling (VAMS™)-LC-MS/MS method for the simultaneous quantification of four antibiotics: piperacillin-tazobactam, meropenem, linezolid and ceftazidime in 10 μL human blood. The novel VAMS-LC-MS/MS method has been compared with a dried blood spot (DBS)-based method in terms of the impact of hematocrit (HCT) on accuracy, reproducibility, recovery and matrix effect. Antibiotics were extracted from VAMS and DBS by protein precipitation with methanol after a re-hydration step at 37 °C for 10 min. LC-MS/MS was carried out on a Thermo Scientific™ TSQ Quantum™ Access MAX triple quadrupole coupled to an Accela™ UHPLC system. The VAMS-LC-MS/MS method is selective, precise and reproducible. In contrast to DBS, it allows accurate quantification without any HCT influence. It has been applied to samples derived from pediatric patients under therapy. VAMS is a valid alternative sampling strategy for the quantification of antibiotics and is valuable in support of clinical PK/PD studies and consequently therapeutic drug monitoring (TDM) in pediatrics. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. GMO quantification: valuable experience and insights for the future.

    PubMed

    Milavec, Mojca; Dobnik, David; Yang, Litao; Zhang, Dabing; Gruden, Kristina; Zel, Jana

    2014-10-01

    Cultivation and marketing of genetically modified organisms (GMOs) have been unevenly adopted worldwide. To facilitate international trade and to provide information to consumers, labelling requirements have been set up in many countries. Quantitative real-time polymerase chain reaction (qPCR) is currently the method of choice for detection, identification and quantification of GMOs. This has been critically assessed and the requirements for the method performance have been set. Nevertheless, there are challenges that should still be highlighted, such as measuring the quantity and quality of DNA, and determining the qPCR efficiency, possible sequence mismatches, characteristics of taxon-specific genes and appropriate units of measurement, as these remain potential sources of measurement uncertainty. To overcome these problems and to cope with the continuous increase in the number and variety of GMOs, new approaches are needed. Statistical strategies of quantification have already been proposed and expanded with the development of digital PCR. The first attempts have been made to use new generation sequencing also for quantitative purposes, although accurate quantification of the contents of GMOs using this technology is still a challenge for the future, and especially for mixed samples. New approaches are needed also for the quantification of stacks, and for potential quantification of organisms produced by new plant breeding techniques.
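
For the qPCR quantification discussed above, GMO content is commonly expressed as the copy-number ratio of an event-specific target to a taxon-specific reference gene, each read off a standard curve. A sketch with assumed calibration values (slope -3.32, intercept 40), not any official protocol:

```python
def copies_from_ct(ct, slope, intercept):
    """Invert a qPCR standard curve Ct = slope*log10(N0) + intercept.
    A slope of -3.32 corresponds to 100% amplification efficiency."""
    return 10 ** ((ct - intercept) / slope)

def gmo_percent(ct_event, ct_taxon, slope=-3.32, intercept=40.0):
    """GMO content as the copy-number ratio of the event-specific target to
    the taxon-specific reference gene. Calibration values are illustrative;
    real assays use separate curves and efficiencies per target."""
    return 100.0 * copies_from_ct(ct_event, slope, intercept) / \
                   copies_from_ct(ct_taxon, slope, intercept)

# An event target detected ~3.32 cycles later than the taxon gene -> ~10% GMO:
print(round(gmo_percent(ct_event=29.32, ct_taxon=26.0), 1))
```

The mismatched-efficiency and unit-of-measurement issues raised in the abstract enter exactly here: if the two curves have different slopes, the simple ratio is biased, which is one motivation for digital PCR.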

  7. Modeling and simulation challenges pursued by the Consortium for Advanced Simulation of Light Water Reactors (CASL)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turinsky, Paul J., E-mail: turinsky@ncsu.edu; Kothe, Douglas B., E-mail: kothe@ornl.gov

    The Consortium for Advanced Simulation of Light Water Reactors (CASL), the first Energy Innovation Hub of the Department of Energy, was established in 2010 with the goal of providing modeling and simulation (M&S) capabilities that support and accelerate the improvement of nuclear energy's economic competitiveness and the reduction of spent nuclear fuel volume per unit energy, all while assuring nuclear safety. Accomplishing this requires advances in M&S capabilities in radiation transport, thermal-hydraulics, fuel performance and corrosion chemistry. To focus CASL's R&D, industry challenge problems have been defined, which equate with long-standing issues of the nuclear power industry that M&S can assist in addressing. To date, CASL has developed a multi-physics "core simulator" based upon pin-resolved radiation transport and subchannel (within fuel assembly) thermal-hydraulics, capitalizing on the capabilities of high-performance computing. CASL's fuel performance M&S capability can also be optionally integrated into the core simulator, yielding a coupled multi-physics capability with untapped predictive potential. Material models have been developed to enhance predictive capabilities of fuel clad creep and growth, along with deeper understanding of zirconium alloy clad oxidation and hydrogen pickup. Understanding of corrosion chemistry (e.g., CRUD formation) has evolved at all scales: micro, meso and macro. CFD R&D has focused on improvement of closure models for subcooled boiling and bubbly flow, and on the formulation of robust numerical solution algorithms. For multi-physics integration, several iterative acceleration methods have been assessed, illuminating areas where further research is needed. Finally, uncertainty quantification and data assimilation techniques, based upon sampling approaches, have been made more feasible for practicing nuclear engineers via R&D on dimensional reduction and biased sampling. Industry adoption of CASL's evolving M&S capabilities, which is in progress, will assist in addressing long-standing and future operational and safety challenges of the nuclear industry. - Highlights: • Complexity of physics-based modeling of light water reactor cores being addressed. • Capability developed to help address problems that have challenged the nuclear power industry. • Simulation capabilities that take advantage of high-performance computing developed.

  8. In-line UV spectroscopy for the quantification of low-dose active ingredients during the manufacturing of pharmaceutical semi-solid and liquid formulations.

    PubMed

    Bostijn, N; Hellings, M; Van Der Veen, M; Vervaet, C; De Beer, T

    2018-07-12

    Ultraviolet (UV) spectroscopy was evaluated as an innovative Process Analytical Technology (PAT) tool for the in-line, real-time quantitative determination of low-dosed active pharmaceutical ingredients (APIs) in a semi-solid (gel) and a liquid (suspension) pharmaceutical formulation during their batch production processes. The performance of this new PAT tool was compared with an already more established PAT method based on Raman spectroscopy. In-line UV measurements were carried out with an immersion probe, while a non-contact PhAT probe was used for the Raman measurements. For both formulations studied, an in-line API quantification model was developed and validated per spectroscopic technique. The known API concentrations (Y) were correlated with the corresponding in-line collected preprocessed spectra (X) through Partial Least Squares (PLS) regression. Each quantification method was validated by calculating the accuracy profile from the validation experiments, and the measurement uncertainty was determined from the same data. From the accuracy profiles for the gel, it was concluded that at the target API concentration of 2% (w/w), 95 out of 100 future routine measurements given by the Raman method will not deviate more than 10% (relative error) from the true API concentration, whereas the UV method exceeded the 10% acceptance limits. For the liquid formulation, the Raman method was not able to quantify the API in the low-dosed suspension (0.09% (w/w) API), whereas the in-line UV method quantified it adequately. This study demonstrated that UV spectroscopy can be adopted as a novel in-line PAT technique for low-dose quantification in pharmaceutical processes. Importantly, neither spectroscopic technique was superior for both formulations: the Raman method was more accurate in quantifying the API in the gel (2% (w/w) API), while the UV method performed better for API quantification in the suspension (0.09% (w/w) API). Copyright © 2018 Elsevier B.V. All rights reserved.
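    The PLS calibration step described in this record (known API concentrations regressed on preprocessed spectra) can be sketched as follows. Everything here is a synthetic illustration, not data from the study: the spectra, the concentration range, the absorption band, and the number of latent variables are all assumptions, and the PLS1 routine is a minimal NIPALS implementation.

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """Minimal NIPALS PLS1: returns coefficients b and the centering
    terms so that y ~= (X - x_mean) @ b + y_mean."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xr, yr = X - x_mean, y - y_mean
    W, P, Q = [], [], []
    for _ in range(n_components):
        w = Xr.T @ yr                    # weight vector for this component
        w = w / np.linalg.norm(w)
        t = Xr @ w                       # scores
        p = Xr.T @ t / (t @ t)           # X loadings
        q = (yr @ t) / (t @ t)           # y loading
        Xr = Xr - np.outer(t, p)         # deflate
        yr = yr - q * t
        W.append(w); P.append(p); Q.append(q)
    W, P = np.array(W).T, np.array(P).T
    b = W @ np.linalg.solve(P.T @ W, np.array(Q))
    return b, x_mean, y_mean

# Synthetic "spectra": one absorption band whose height scales with
# API concentration, plus instrument noise.
rng = np.random.default_rng(0)
conc = rng.uniform(1.6, 2.4, 30)                     # % w/w, around a 2% target
band = np.exp(-0.5 * ((np.arange(200) - 100) / 8) ** 2)
X = np.outer(conc, band) + rng.normal(0, 0.01, (30, 200))

b, x_mean, y_mean = pls1_fit(X, conc, n_components=2)
pred = (X - x_mean) @ b + y_mean
rmse = float(np.sqrt(np.mean((pred - conc) ** 2)))
print(f"calibration RMSE: {rmse:.4f} % w/w")
```

    In the study itself the model would be validated against independent samples via accuracy profiles; the in-sample RMSE above only shows the mechanics of the X-to-Y regression.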

  9. Fast microwave-assisted extraction of rotenone for its quantification in seeds of yam bean (Pachyrhizus sp.).

    PubMed

    Lautié, Emmanuelle; Rasse, Catherine; Rozet, Eric; Mourgues, Claire; Vanhelleputte, Jean-Paul; Quetin-Leclercq, Joëlle

    2013-02-01

    The aim of this study was to determine whether fast microwave-assisted extraction could be an alternative to conventional Soxhlet extraction for the quantification of rotenone in yam bean seeds by SPE and HPLC-UV. For this purpose, an experimental design was used to determine the optimal conditions of the microwave extraction. The quantification values for three accessions from two different species of yam bean seeds were then compared using the two kinds of extraction. A microwave extraction of 11 min at 55 °C using methanol/dichloromethane (50:50) extracted rotenone as efficiently as, or more efficiently than, the 8 h Soxhlet extraction method, and was less sensitive to moisture content. The selectivity, precision, trueness, accuracy, and limit of quantification of the method with microwave extraction were also demonstrated. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Quantification of Fibrosis and Osteosclerosis in Myeloproliferative Neoplasms: A Computer-Assisted Image Study

    PubMed Central

    Teman, Carolin J.; Wilson, Andrew R.; Perkins, Sherrie L.; Hickman, Kimberly; Prchal, Josef T.; Salama, Mohamed E.

    2010-01-01

    Evaluation of bone marrow fibrosis and osteosclerosis in myeloproliferative neoplasms (MPN) is subject to interobserver inconsistency. Performance data for currently utilized fibrosis grading systems are lacking, and classification scales for osteosclerosis do not exist. Digital imaging can serve as a quantification method for fibrosis and osteosclerosis. We used digital imaging techniques for trabecular area assessment and reticulin fiber quantification. Patients with all Philadelphia-negative MPN subtypes had higher trabecular volume than controls (p ≤ 0.0015). The results suggest that the degree of osteosclerosis helps differentiate primary myelofibrosis from other MPN. Numerical quantification of fibrosis correlated highly with subjective scores, and interobserver correlation was satisfactory. Digital imaging provides accurate quantification of osteosclerosis and fibrosis. PMID:20122729
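    The trabecular-area assessment above reduces to measuring an area fraction on a segmented image. A toy illustration on a synthetic binary mask follows; the mask and the simple pixel-fraction measure are stand-ins for the actual digital imaging pipeline, which the abstract does not specify:

```python
import numpy as np

# Synthetic binary mask standing in for a thresholded biopsy image:
# 1.0 = pixel classified as bone (trabecula), 0.0 = marrow space.
img = np.zeros((100, 100), dtype=float)
img[20:40, :] = 1.0          # a horizontal trabecula (20 rows)
img[:, 70:80] = 1.0          # a vertical trabecula (10 columns)

# Area fraction: positive pixels over total pixels in the field.
trabecular_fraction = img.mean()
print(f"trabecular area: {trabecular_fraction:.1%}")
```

    Comparing such fractions between patient groups is what drives the p ≤ 0.0015 result reported above; the thresholding step (not shown) is where observer-independent reproducibility is gained.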

  11. Light Water Reactor Sustainability Program FY13 Status Update for EPRI - RISMC Collaboration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Curtis

    2013-09-01

    The purpose of the Risk Informed Safety Margin Characterization (RISMC) Pathway research and development (R&D) is to support plant decisions for risk-informed margins management, with the aim of improving the economics and reliability, and sustaining the safety, of current NPPs. The goals of the RISMC Pathway are twofold: (1) develop and demonstrate a risk-assessment method coupled to safety margin quantification that can be used by NPP decision makers as part of their margin recovery strategies; (2) create an advanced "RISMC toolkit" that enables more accurate representation of NPP safety margin. In order to carry out the R&D needed for the Pathway, the Idaho National Laboratory (INL) is collaborating with the Electric Power Research Institute (EPRI) to focus on applications of interest to the U.S. nuclear power industry. This report documents the collaboration activities performed between INL and EPRI during FY2013.

  12. A Strategy for Simultaneous Isolation of Less Polar Ginsenosides, Including a Pair of New 20-Methoxyl Isomers, from Flower Buds of Panax ginseng.

    PubMed

    Li, Sha-Sha; Li, Ke-Ke; Xu, Fei; Tao, Li; Yang, Li; Chen, Shu-Xiao; Gong, Xiao-Jie

    2017-03-10

    The present study was designed to simultaneously isolate the less polar ginsenosides from the flower buds of Panax ginseng (FBPG). Five ginsenosides, including a pair of new 20-methoxyl isomers, were extracted from FBPG and purified through a five-step integrated strategy combining ultrasonic extraction, Diaion HP-20 macroporous resin column enrichment, solid-phase extraction (SPE), reversed-phase high-performance liquid chromatography (RP-HPLC) analysis and preparation, and nuclear magnetic resonance (NMR) analysis. The five ginsenosides were also quantified by a newly developed method, with validation parameters within acceptable limits. Ginsenoside Rg5 was present at a content of about 1% in FBPG. The results indicated that FBPG may contain many different ginsenosides with diverse chemical structures, and that the less polar ginsenosides are also important to the quality control and standardization of FBPG.

  13. Presence, segregation and reactivity of H, C and N dissolved in some refractory oxides

    NASA Technical Reports Server (NTRS)

    Freund, F.

    1986-01-01

    The sources of impurities, particularly carbon, in high-melting oxides and silicates are discussed, along with detection and quantification methods. The impurities are important for their effects on bulk material properties through media such as surface or grain boundary characteristics. The impurities usually enter through contact of the oxide (refractory) material with volatiles such as H2O and CO2, which become incorporated in the material and form anion complexes with oxygen acting as a covalently bonded ligand. The specific processes by which MgO assimilates C impurities are delineated, using data obtained with X-ray photoelectron spectroscopy, Auger electron spectroscopy, secondary ion mass spectrometry, and nuclear reaction profiling. Finally, maintenance of a supersaturated solid solution of C impurities by space charge control is described as a means of offsetting impurity effects.

  14. BioShuttle-mediated Plasmid Transfer

    PubMed Central

    Braun, Klaus; von Brasch, Leonie; Pipkorn, Ruediger; Ehemann, Volker; Jenne, Juergen; Spring, Herbert; Debus, Juergen; Didinger, Bernd; Rittgen, Werner; Waldeck, Waldemar

    2007-01-01

    Efficient gene transfer into target tissues and cells is needed for the safe and effective treatment of genetic diseases such as cancer. In this paper, we describe the development of a transport system and show its ability to transport plasmids. This non-viral, peptide-based, BioShuttle-mediated transfer system uses a nuclear localization address sequence to deliver the plasmid phNIS-IRES-EGFP, coding for two independent reporter genes, into the nuclei of HeLa cells. The transfer efficiency was quantified by measurements of sodium iodide symporter activity. EGFP gene expression was measured with confocal laser scanning microscopy (CLSM) and quantified with biostatistical methods by analysis of the frequency of the amplitude distribution in the CLSM images. The results demonstrate that the "BioShuttle" technology is an appropriate tool for the effective transfer of genetic material carried by a plasmid. PMID:18026568

  15. A comprehensive NMR methodology to assess the composition of biobased and biodegradable polymers in contact with food.

    PubMed

    Gratia, Audrey; Merlet, Denis; Ducruet, Violette; Lyathaud, Cédric

    2015-01-01

    A nuclear magnetic resonance (NMR) methodology was assessed for the identification and quantification of additives in three types of polylactide (PLA) intended as food contact materials. Additives were identified using the LNE/NMR database, which clusters NMR datasets on more than 130 substances authorized by European Regulation No. 10/2011. Of the 12 additives spiked into the three types of PLA pellets, 10 were rapidly identified by the database and confirmed by spectral comparison. The levels of the 12 additives were estimated using quantitative NMR combined with graphical computation. A comparison with chromatographic methods supported the sensitivity of NMR, demonstrating an analytical difference of less than 15%. Our results therefore demonstrate the efficiency of the proposed NMR methodology for rapid assessment of the composition of PLA. Copyright © 2014 Elsevier B.V. All rights reserved.

  16. Targeted Quantification of Isoforms of a Thylakoid-Bound Protein: MRM Method Development.

    PubMed

    Bru-Martínez, Roque; Martínez-Márquez, Ascensión; Morante-Carriel, Jaime; Sellés-Marchart, Susana; Martínez-Esteso, María José; Pineda-Lucas, José Luis; Luque, Ignacio

    2018-01-01

    Targeted mass spectrometric methods such as selected/multiple reaction monitoring (SRM/MRM) have found intense application in protein detection and quantification, where they compete with classical immunoaffinity techniques. SRM/MRM provides a universal procedure for developing fast, highly specific, sensitive, accurate, and inexpensive targeted detection and quantification of proteins, based on the direct analysis of surrogate peptides typically generated by tryptic digestion. This methodology can be advantageously applied in the field of plant proteomics, and particularly to non-model species, for which immunoreagents are scarcely available. Here, we describe the issues to take into consideration when developing an MRM method to detect and quantify isoforms of the thylakoid-bound protein polyphenol oxidase from the non-model, database-underrepresented species Eriobotrya japonica Lindl.

  17. Validation of a Sulfuric Acid Digestion Method for Inductively Coupled Plasma Mass Spectrometry Quantification of TiO2 Nanoparticles.

    PubMed

    Watkins, Preston S; Castellon, Benjamin T; Tseng, Chiyen; Wright, Moncie V; Matson, Cole W; Cobb, George P

    2018-04-13

    A consistent analytical method incorporating sulfuric acid (H2SO4) digestion and ICP-MS quantification has been developed for TiO2 quantification in biotic and abiotic environmentally relevant matrices. Sample digestion in H2SO4 at 110 °C provided consistent results without using hydrofluoric acid or microwave digestion. Analysis of seven replicate samples for four matrices on each of 3 days produced Ti recoveries of 97% ± 2.5%, 91% ± 4.0%, 94% ± 1.8%, and 73% ± 2.6% (mean ± standard deviation) from water, fish tissue, periphyton, and sediment, respectively. The method demonstrated consistent performance in the analysis of water samples collected over a 1-month period.

  18. Model Uncertainty Quantification Methods In Data Assimilation

    NASA Astrophysics Data System (ADS)

    Pathiraja, S. D.; Marshall, L. A.; Sharma, A.; Moradkhani, H.

    2017-12-01

    Data assimilation involves utilizing observations to improve model predictions in a seamless and statistically optimal fashion. Its applications are wide-ranging, from improving weather forecasts to tracking targets such as in the Apollo 11 mission. The use of data assimilation methods in high-dimensional complex geophysical systems is an active area of research, where there exist many opportunities to enhance existing methodologies. One of the central challenges is model uncertainty quantification; the outcome of any data assimilation study is strongly dependent on the uncertainties assigned to both observations and models. I focus on developing improved model uncertainty quantification methods that are applicable to challenging real-world scenarios. These include methods for cases where the system states are only partially observed, where there is little prior knowledge of the model errors, and where the model error statistics are likely to be highly non-Gaussian.

  19. Quantification of N-acetyl- and N-glycolylneuraminic acids by a stable isotope dilution assay using high-performance liquid chromatography-tandem mass spectrometry.

    PubMed

    Allevi, Pietro; Femia, Eti Alessandra; Costa, Maria Letizia; Cazzola, Roberta; Anastasia, Mario

    2008-11-28

    The present report describes a method for the quantification of N-acetyl- and N-glycolylneuraminic acids without any derivatization, using their (13)C(3)-isotopologues as internal standards and a C(18) reversed-phase column modified by decylboronic acid, which allows for the first time a complete chromatographic separation of the two analytes. The method is based on high-performance liquid chromatography coupled with electrospray ion-trap mass spectrometry. The limit of quantification of the method is 0.1 mg/L (2.0 ng on column) for both analytes. The calibration curves are linear for both sialic acids over the range of 0.1-80 mg/L (2.0-1600 ng on column), with a correlation coefficient greater than 0.997. The proposed method was applied to the quantitative determination of sialic acids released from fetuin as a model glycoprotein.
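    Stable isotope dilution, as used here, quantifies by normalizing the analyte signal to the co-eluting (13)C(3) internal standard and reading concentration off a linear calibration of that ratio. A sketch with hypothetical peak areas (the numbers are invented for illustration; only the 0.1-80 mg/L range echoes the abstract):

```python
import numpy as np

# Calibration standards: known concentrations, each spiked with the
# same amount of isotopically labeled internal standard (IS).
cal_conc = np.array([0.1, 1.0, 10.0, 40.0, 80.0])        # mg/L
analyte_area = np.array([210., 2050., 20400., 81000., 163000.])
istd_area = np.full(5, 10000.0)                          # constant IS response

# The calibration variable is the analyte/IS area ratio, which cancels
# out injection-to-injection and matrix-related signal variation.
ratio = analyte_area / istd_area
slope, intercept = np.polyfit(cal_conc, ratio, 1)
r = np.corrcoef(cal_conc, ratio)[0, 1]

# Quantify an unknown from its measured area ratio:
unknown_ratio = 515.0 / 10000.0
unknown_conc = (unknown_ratio - intercept) / slope
print(f"r = {r:.4f}, unknown ~ {unknown_conc:.2f} mg/L")
```

    The correlation coefficient computed this way is what the abstract's "greater than 0.997" criterion refers to.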

  20. Interferences in the direct quantification of bisphenol S in paper by means of thermochemolysis.

    PubMed

    Becerra, Valentina; Odermatt, Jürgen

    2013-02-01

    This article analyses the interferences in the quantification of traces of bisphenol S in paper by the direct analytical method analytical pyrolysis-gas chromatography/mass spectrometry (Py-GC/MS) in conjunction with on-line derivatisation with tetramethylammonium hydroxide (TMAH). As the analytes are analysed simultaneously with the matrix, the interferences derive from the matrix. The investigated interferences are found in the analysis of paper samples that include bisphenol S derivative compounds. As free bisphenol S is the hydrolysis product of the bisphenol S derivative compounds, the detected amount of bisphenol S in the sample may be overestimated. It is found that the formation of free bisphenol S from the bisphenol S derivative compounds is enhanced in the presence of TMAH under pyrolytic conditions. To avoid this formation, trimethylsulphonium hydroxide (TMSH) is introduced. Different parameters were optimised in the development of the quantification method with TMSH, and the TMSH-based thermochemolysis method has been validated in terms of reproducibility and accuracy. Copyright © 2012 Elsevier B.V. All rights reserved.

  1. Multiplex quantification of 12 European Union authorized genetically modified maize lines with droplet digital polymerase chain reaction.

    PubMed

    Dobnik, David; Spilsberg, Bjørn; Bogožalec Košir, Alexandra; Holst-Jensen, Arne; Žel, Jana

    2015-08-18

    The presence of genetically modified organisms (GMOs) in food and feed products is regulated in many countries. The European Union (EU) has implemented a threshold for labelling of products containing more than 0.9% of authorized GMOs per ingredient. As the number of GMOs has increased over time, standard-curve-based simplex quantitative polymerase chain reaction (qPCR) analyses are no longer sufficiently cost-effective, despite widespread use of initial PCR-based screenings. Newly developed GMO detection methods, including multiplex methods, are mostly focused on screening and detection rather than quantification. On the basis of droplet digital PCR (ddPCR) technology, multiplex assays were developed for quantification of all 12 EU-authorized GM maize lines (as of 1 April 2015). Because of the high sequence similarity of some of the 12 GM targets, two separate multiplex assays were needed. In both assays (4-plex and 10-plex), the transgenes were labeled with one fluorescence reporter and the endogene with another (GMO concentration = transgene/endogene ratio). It was shown that both multiplex assays produce specific results and that performance parameters such as limit of quantification, repeatability, and trueness comply with international recommendations for GMO quantification methods. Moreover, for samples containing GMOs, the throughput and cost-effectiveness are significantly improved compared to qPCR. It was therefore concluded that the multiplex ddPCR assays could be applied for routine quantification of the 12 EU-authorized GM maize lines. In the case of new authorizations, the events can easily be added to the existing multiplex assays. The presented principle of quantitative multiplexing can be applied to any other domain.
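    The transgene/endogene ratio above is obtained from droplet counts via the standard ddPCR Poisson correction (this is the generic calculation, not code from the paper); the droplet numbers below are hypothetical:

```python
import math

def copies_per_partition(positive, total):
    """Poisson correction: mean target copies per droplet inferred from
    the fraction of positive droplets."""
    return -math.log(1.0 - positive / total)

def gmo_percent(pos_transgene, pos_endogene, total):
    """GMO content as the transgene/endogene copy ratio, in percent.
    Assumes both targets were partitioned into the same droplet count."""
    tg = copies_per_partition(pos_transgene, total)
    en = copies_per_partition(pos_endogene, total)
    return 100.0 * tg / en

# Hypothetical droplet counts for one sample:
print(f"{gmo_percent(120, 11500, 20000):.2f} % GMO")
```

    With these invented counts the estimate comes out around 0.7% GMO, i.e. below the EU 0.9% labelling threshold mentioned above; no standard curve is needed, which is the cost advantage over simplex qPCR.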

  2. Neutron-Encoded Protein Quantification by Peptide Carbamylation

    NASA Astrophysics Data System (ADS)

    Ulbrich, Arne; Merrill, Anna E.; Hebert, Alexander S.; Westphall, Michael S.; Keller, Mark P.; Attie, Alan D.; Coon, Joshua J.

    2014-01-01

    We describe a chemical tag for duplex proteome quantification using neutron encoding (NeuCode). The method utilizes the straightforward, efficient, and inexpensive carbamylation reaction. We demonstrate the utility of NeuCode carbamylation by accurately measuring quantitative ratios from tagged yeast lysates mixed in known ratios and by applying this method to quantify differential protein expression in mice fed either a control or a high-fat diet.

  3. Comparative study between extraction techniques and column separation for the quantification of sinigrin and total isothiocyanates in mustard seed.

    PubMed

    Cools, Katherine; Terry, Leon A

    2012-07-15

    Glucosinolates are β-thioglycosides found naturally in the Cruciferae, including the genus Brassica. When enzymatically hydrolysed, glucosinolates yield isothiocyanates, which give a pungent taste. Both glucosinolates and isothiocyanates have been linked with anticancer activity as well as antifungal and antibacterial properties; the quantification of these compounds is therefore scientifically important. A wide range of literature exists on glucosinolates; however, the extraction and quantification procedures differ greatly, resulting in discrepancies between studies. The aim of this study was therefore to compare the most popular extraction procedures, to identify the most efficacious method, and to determine whether each extract can also be used for the quantification of total isothiocyanates. Four extraction techniques were compared for the quantification of sinigrin from mustard cv. Centennial (Brassica juncea L.) seed: boiling water, boiling 50% (v/v) aqueous acetonitrile, boiling 100% methanol, and 70% (v/v) aqueous methanol at 70 °C. Prior to injection into the HPLC, the extracts obtained with solvents (acetonitrile or methanol) were freeze-dried and resuspended in water. To identify whether the same extract could be used to measure total isothiocyanates, a dichloromethane extraction was carried out on the sinigrin extracts. For the quantification of sinigrin alone, boiling 50% (v/v) acetonitrile was found to be the most efficacious extraction solvent of the four tested, yielding 15% more sinigrin than the water extraction. However, the removal of the acetonitrile by freeze-drying had a negative impact on the isothiocyanate content. Quantification of both sinigrin and total isothiocyanates was possible when the sinigrin was extracted using boiling water. Two columns were compared for the quantification of sinigrin, revealing the Zorbax Eclipse to be the better column for this particular method. Copyright © 2012 Elsevier B.V. All rights reserved.

  4. Quantification of VX Nerve Agent in Various Food Matrices by Solid-Phase Extraction Ultra-Performance Liquid Chromatography-Time-of-Flight Mass Spectrometry

    DTIC Science & Technology

    2016-04-01

    The mixed-mode cation exchange (MCX) sorbent and Quick, Easy, Cheap, Effective, Rugged, and Safe (QuEChERS) methods were used for ...

  5. Targeted methods for quantitative analysis of protein glycosylation

    PubMed Central

    Goldman, Radoslav; Sanda, Miloslav

    2018-01-01

    Quantification of proteins by LC-MS/MS-MRM has become a standard method with broad projected clinical applicability. MRM quantification of protein modifications is, however, far less utilized, especially in the case of glycoproteins. This review summarizes current methods for quantitative analysis of protein glycosylation with a focus on MRM methods. We describe advantages of this quantitative approach, analytical parameters that need to be optimized to achieve reliable measurements, and point out the limitations. Differences between major classes of N- and O-glycopeptides are described and class-specific glycopeptide assays are demonstrated. PMID:25522218

  6. An analysis of potassium iodide (KI) prophylaxis for the general public in the event of a nuclear accident

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Behling, H.; Behling, K.; Amarasooriya, H.

    1995-02-01

    A generic difficulty encountered in cost-benefit analyses is the quantification of major elements that define the costs and the benefits in commensurate units. In this study, the costs of making KI available for public use, and the avoidance of thyroidal health effects predicted to be realized from the availability of that KI (i.e., the benefits), are defined in the commensurate units of dollars.

  7. Development and validation of an event-specific quantitative PCR method for genetically modified maize MIR162.

    PubMed

    Takabatake, Reona; Masubuchi, Tomoko; Futo, Satoshi; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Teshima, Reiko; Kurashima, Takeyo; Mano, Junichi; Kitta, Kazumi

    2014-01-01

    A novel real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) maize event, MIR162. We first prepared a standard plasmid for MIR162 quantification. The conversion factor (Cf) required to calculate the genetically modified organism (GMO) amount was empirically determined for two real-time PCR instruments, the Applied Biosystems 7900HT (ABI7900) and the Applied Biosystems 7500 (ABI7500), for which the determined Cf values were 0.697 and 0.635, respectively. To validate the developed method, a blind test was carried out in an interlaboratory study. The trueness and precision were evaluated as the bias and the reproducibility expressed as relative standard deviation (RSDr). The determined biases were less than 25% and the RSDr values were less than 20% at all evaluated concentrations. These results suggested that the limit of quantitation of the method was 0.5%, and that the developed method would thus be suitable for practical analyses for the detection and quantification of MIR162.
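    In event-specific assays of this kind, the GM content is typically computed as the event-to-taxon copy-number ratio divided by Cf, where Cf is the copy ratio measured in 100% GM material. A minimal sketch of that conversion follows; only the Cf values come from the abstract, and the copy numbers are hypothetical:

```python
def gmo_amount_percent(event_copies, taxon_copies, cf):
    """GM content (%) from measured copy numbers and the empirically
    determined conversion factor Cf (event/taxon copy ratio in 100% GM
    material for the instrument in use)."""
    return (event_copies / taxon_copies) / cf * 100.0

# Cf values reported in the abstract for the two instruments:
CF_ABI7900, CF_ABI7500 = 0.697, 0.635

# Hypothetical measurement: 35 MIR162 copies per 10,000 maize genome copies.
print(f"{gmo_amount_percent(35, 10_000, CF_ABI7900):.2f} % GM maize")
```

    This hypothetical sample comes out near the 0.5% limit of quantitation reported above; using the instrument-specific Cf is what keeps results comparable between the ABI7900 and ABI7500.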

  8. Image-guided spatial localization of heterogeneous compartments for magnetic resonance

    PubMed Central

    An, Li; Shen, Jun

    2015-01-01

    Purpose: Image-guided SPectral Localization Achieved by Sensitivity Heterogeneity (SPLASH) allows rapid measurement of signals from irregularly shaped anatomical compartments without using phase-encoding gradients. Here, the authors propose a novel method to address the issue of heterogeneous signal distribution within the localized compartments. Methods: Each compartment was subdivided into multiple subcompartments, and their spectra were solved for by Tikhonov regularization to enforce smoothness within each compartment. The spectrum of a given compartment was generated by combining the spectra of its subcompartments. The proposed method was first tested using Monte Carlo simulations and then applied to reconstructing in vivo spectra from irregularly shaped ischemic stroke and normal tissue compartments. Results: Monte Carlo simulations demonstrate that the proposed regularized SPLASH method significantly reduces localization and metabolite quantification errors. In vivo results show that the intracompartment regularization results in an ∼40% reduction of error in metabolite quantification. Conclusions: The proposed method significantly reduces localization errors and metabolite quantification errors caused by intracompartment heterogeneous signal distribution. PMID:26328977
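    Tikhonov regularization with a smoothness penalty, as invoked above for the subcompartment spectra, has the closed form x = (A^T A + lam L^T L)^{-1} A^T b, where L is a difference operator over neighboring subcompartments. A generic numerical sketch (the matrices, sizes, and lam are illustrative and are not the SPLASH sensitivity model):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 8                                    # subcompartments in one compartment
A = rng.normal(size=(6, n))              # fewer measurements than unknowns
x_true = np.ones(n)                      # a smooth (constant) true signal
b = A @ x_true + rng.normal(0.0, 0.01, size=6)

# First-difference operator: penalizes jumps between neighboring
# subcompartments, i.e. enforces smoothness within the compartment.
L = np.diff(np.eye(n), axis=0)
lam = 1.0

# Closed-form Tikhonov solution of min ||A x - b||^2 + lam ||L x||^2
x = np.linalg.solve(A.T @ A + lam * (L.T @ L), A.T @ b)
print(np.round(x, 3))
```

    Without the L^T L term the 6-by-8 system is underdetermined; the smoothness prior is what makes the subcompartment solution unique and stable, which is the role regularization plays in the abstract.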

  9. Microfluidics-based digital quantitative PCR for single-cell small RNA quantification.

    PubMed

    Yu, Tian; Tang, Chong; Zhang, Ying; Zhang, Ruirui; Yan, Wei

    2017-09-01

    Quantitative analyses of small RNAs at the single-cell level have been challenging because of limited sensitivity and specificity of conventional real-time quantitative PCR methods. A digital quantitative PCR (dqPCR) method for miRNA quantification has been developed, but it requires the use of proprietary stem-loop primers and only applies to miRNA quantification. Here, we report a microfluidics-based dqPCR (mdqPCR) method, which takes advantage of the Fluidigm BioMark HD system for both template partition and the subsequent high-throughput dqPCR. Our mdqPCR method demonstrated excellent sensitivity and reproducibility suitable for quantitative analyses of not only miRNAs but also all other small RNA species at the single-cell level. Using this method, we discovered that each sperm has a unique miRNA profile. © The Authors 2017. Published by Oxford University Press on behalf of Society for the Study of Reproduction. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  10. Optimization of PCR for quantification of simian immunodeficiency virus (SIV) genomic RNA in plasma of rhesus macaques (Macaca mulatta) using armored RNA

    PubMed Central

    Monjure, C. J.; Tatum, C. D.; Panganiban, A. T.; Arainga, M.; Traina-Dorge, V.; Marx, P. A.; Didier, E. S.

    2014-01-01

    Introduction Quantification of plasma viral load (PVL) is used to monitor disease progression in SIV-infected macaques. This study aimed to optimize the performance characteristics of the quantitative PCR (qPCR) PVL assay. Methods The PVL quantification procedure was optimized by inclusion of an exogenous control, hepatitis C virus armored RNA (aRNA), a plasma concentration step, extended digestion with proteinase K, and a second RNA elution step. The efficiency of viral RNA (vRNA) extraction was compared using several commercial vRNA extraction kits. Various parameters of the qPCR targeting the gag region of SIVmac239 and SIVsmE660 and the LTR region of SIVagmSAB were also optimized. Results Modifications of the SIV PVL qPCR procedure increased vRNA recovery, reduced inhibition, and improved analytical sensitivity. The PVL values determined by this SIV PVL qPCR correlated with quantification results for SIV RNA in the same samples obtained using the "industry standard" method of branched-DNA (bDNA) signal amplification. Conclusions Quantification of SIV genomic RNA in plasma of rhesus macaques using this optimized SIV PVL qPCR is equivalent to the bDNA signal amplification method, less costly, and more versatile. Use of a heterologous aRNA as an internal control is useful for optimizing performance characteristics of PVL qPCRs. PMID:24266615

  11. Simultaneous quantification of Δ9-tetrahydrocannabinol, 11-hydroxy-Δ9-tetrahydrocannabinol, and 11-nor-Δ9-tetrahydrocannabinol-9-carboxylic acid in human plasma using two-dimensional gas chromatography, cryofocusing, and electron impact-mass spectrometry

    PubMed Central

    Lowe, Ross H.; Karschner, Erin L.; Schwilke, Eugene W.; Barnes, Allan J.; Huestis, Marilyn A.

    2009-01-01

    A two-dimensional (2D) gas chromatography/electron impact-mass spectrometry (GC/EI-MS) method for simultaneous quantification of Δ9-tetrahydrocannabinol (THC), 11-hydroxy-Δ9-tetrahydrocannabinol (11-OH-THC), and 11-nor-Δ9-tetrahydrocannabinol-9-carboxylic acid (THCCOOH) in human plasma was developed and validated. The method employs 2D capillary GC and cryofocusing for enhanced resolution and sensitivity. THC, 11-OH-THC, and THCCOOH were extracted by precipitation with acetonitrile followed by solid-phase extraction. GC separation of trimethylsilyl derivatives of analytes was accomplished with two capillary columns in series coupled via a pneumatic Deans switch system. Detection and quantification were accomplished with a bench-top single quadrupole mass spectrometer operated in electron impact-selected ion monitoring mode. Limits of quantification (LOQ) were 0.125, 0.25 and 0.125 ng/mL for THC, 11-OH-THC, and THCCOOH, respectively. Accuracy ranged from 86.0 to 113.0% for all analytes. Intra- and inter-assay precision, as percent relative standard deviation, was less than 14.1% for THC, 11-OH-THC, and THCCOOH. The method was successfully applied to quantification of THC and its 11-OH-THC and THCCOOH metabolites in plasma specimens following controlled administration of THC. PMID:17640656

  12. Quantification of DNA using the luminescent oxygen channeling assay.

    PubMed

    Patel, R; Pollner, R; de Keczer, S; Pease, J; Pirio, M; DeChene, N; Dafforn, A; Rose, S

    2000-09-01

    Simplified and cost-effective methods for the detection and quantification of nucleic acid targets are still a challenge in molecular diagnostics. Luminescent oxygen channeling assay (LOCI™) latex particles can be conjugated to synthetic oligodeoxynucleotides and hybridized, via linking probes, to different DNA targets. These oligomer-conjugated LOCI particles survive thermocycling in a PCR reaction and allow quantified detection of DNA targets in both real-time and endpoint formats. The endpoint DNA quantification format utilized two sensitizer bead types that are sensitive to separate illumination wavelengths. These two bead types were uniquely annealed to target or control amplicons, and separate illuminations generated time-resolved chemiluminescence, which distinguished the two amplicon types. In the endpoint method, ratios of the two signals allowed determination of the target DNA concentration over a three-log range. The real-time format allowed quantification of the DNA target over a six-log range with a linear relationship between threshold cycle and log of the number of DNA targets. This is the first report of the use of an oligomer-labeled latex particle assay capable of producing DNA quantification and sequence-specific chemiluminescent signals in a homogeneous format. It is also the first report of the generation of two signals from a LOCI assay. The methods described here have been shown to be easily adaptable to new DNA targets because of the generic nature of the oligomer-labeled LOCI particles.
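The real-time format's linear relationship between threshold cycle and the log of the target number is the basis of standard-curve quantification. A minimal sketch in Python, with made-up calibrator values (the copy numbers and Ct values are illustrative, not data from the study):

```python
import numpy as np

# Hypothetical calibrator data: known target copy numbers and measured
# threshold cycles, illustrating the linear Ct vs. log10(copies) relationship.
copies = np.array([1e2, 1e3, 1e4, 1e5, 1e6, 1e7])
ct = np.array([33.1, 29.8, 26.4, 23.1, 19.7, 16.4])

# Fit the standard curve Ct = m*log10(N) + b by least squares.
m, b = np.polyfit(np.log10(copies), ct, 1)

def copies_from_ct(ct_obs):
    """Invert the standard curve to estimate target copies from an observed Ct."""
    return 10 ** ((ct_obs - b) / m)

# Amplification efficiency implied by the slope (E = 10^(-1/m) - 1);
# a slope near -3.32 corresponds to ~100% efficiency.
efficiency = 10 ** (-1.0 / m) - 1.0
```

With a slope near -3.34 per decade, each additional tenfold of starting target shifts the threshold cycle earlier by about 3.3 cycles.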

  13. Quantification of free fatty acids in human stratum corneum using tandem mass spectrometry and surrogate analyte approach.

    PubMed

    Dapic, Irena; Kobetic, Renata; Brkljacic, Lidija; Kezic, Sanja; Jakasa, Ivone

    2018-02-01

    The free fatty acids (FFAs) are one of the major components of the lipids in the stratum corneum (SC), the uppermost layer of the skin. Relative composition of FFAs has been proposed as a biomarker of the skin barrier status in patients with atopic dermatitis (AD). Here, we developed an LC-ESI-MS/MS method for simultaneous quantification of a range of FFAs with long and very long chain length in the SC collected by adhesive tape (D-Squame). The method, based on derivatization with 2-bromo-1-methylpyridinium iodide and 3-carbinol-1-methylpyridinium iodide, allowed highly sensitive detection and quantification of FFAs using multiple reaction monitoring. For the quantification, we applied a surrogate analyte approach and internal standardization using isotope labeled derivatives of FFAs. Adhesive tapes showed the presence of several FFAs, which are also present in the SC, a problem encountered in previous studies. Therefore, the levels of FFAs in the SC were corrected using C12:0, which was present on the adhesive tape, but not detected in the SC. The method was applied to SC samples from patients with atopic dermatitis and healthy subjects. Quantification using multiple reaction monitoring allowed sufficient sensitivity to analyze FFAs of chain lengths C16-C28 in the SC collected on only one tape strip. Copyright © 2017 John Wiley & Sons, Ltd.

  14. Ranking Fragment Ions Based on Outlier Detection for Improved Label-Free Quantification in Data-Independent Acquisition LC-MS/MS

    PubMed Central

    Bilbao, Aivett; Zhang, Ying; Varesio, Emmanuel; Luban, Jeremy; Strambio-De-Castillia, Caterina; Lisacek, Frédérique; Hopfgartner, Gérard

    2016-01-01

    Data-independent acquisition LC-MS/MS techniques complement supervised methods for peptide quantification. However, due to the wide precursor isolation windows, these techniques are prone to interference at the fragment ion level, which in turn is detrimental for accurate quantification. The “non-outlier fragment ion” (NOFI) ranking algorithm has been developed to assign low priority to fragment ions affected by interference. By using the optimal subset of high priority fragment ions these interfered fragment ions are effectively excluded from quantification. NOFI represents each fragment ion as a vector of four dimensions related to chromatographic and MS fragmentation attributes and applies multivariate outlier detection techniques. Benchmarking conducted on a well-defined quantitative dataset (i.e. the SWATH Gold Standard), indicates that NOFI on average is able to accurately quantify 11-25% more peptides than the commonly used Top-N library intensity ranking method. The sum of the area of the Top3-5 NOFIs produces similar coefficients of variation as compared to the library intensity method but with more accurate quantification results. On a biologically relevant human dendritic cell digest dataset, NOFI properly assigns low priority ranks to 85% of annotated interferences, resulting in sensitivity values between 0.92 and 0.80 against 0.76 for the Spectronaut interference detection algorithm. PMID:26412574
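The ranking idea can be sketched as robust multivariate outlier scoring over per-fragment feature vectors. Everything below is an illustrative stand-in, not the published NOFI algorithm: the four feature names, the median/MAD score, and all numbers are assumptions.

```python
import numpy as np

# Each row is one fragment ion of a peptide, described by four illustrative
# features (hypothetical names, not the paper's actual dimensions):
# [RT shift, peak-width ratio, intensity-rank deviation, shape correlation].
features = np.array([
    [0.02, 1.05, 0.1, 0.99],
    [0.01, 0.98, 0.2, 0.98],
    [0.03, 1.10, 0.3, 0.97],
    [0.85, 2.40, 3.1, 0.41],   # interfered ion: deviates in every dimension
    [0.02, 1.02, 0.1, 0.99],
])
areas = np.array([1.0e5, 8.0e4, 6.0e4, 9.0e5, 5.0e4])  # interference inflates area

# Robust multivariate outlier score: median/MAD z-score per dimension,
# combined by Euclidean norm (a simple stand-in for the paper's detector).
med = np.median(features, axis=0)
mad = np.median(np.abs(features - med), axis=0) + 1e-9
score = np.linalg.norm((features - med) / mad, axis=1)

# Rank ions by score (low = trustworthy) and quantify from the top-3 non-outliers,
# so the interfered ion is excluded from the summed area.
order = np.argsort(score)
top3 = order[:3]
quant = areas[top3].sum()
```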

  15. A method to characterize the roughness of 2-D line features: recrystallization boundaries.

    PubMed

    Sun, J; Zhang, Y B; Dahl, A B; Conradsen, K; Juul Jensen, D

    2017-03-01

    A method is presented, which allows quantification of the roughness of nonplanar boundaries of objects for which the neutral plane is not known. The method provides quantitative descriptions of both the local and global characteristics. How the method can be used to estimate the sizes of rough features and local curvatures is also presented. The potential of the method is illustrated by quantification of the roughness of two recrystallization boundaries in a pure Al specimen characterized by scanning electron microscopy. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.

  16. Quantifying construction and demolition waste: an analytical review.

    PubMed

    Wu, Zezhou; Yu, Ann T W; Shen, Liyin; Liu, Guiwen

    2014-09-01

    Quantifying construction and demolition (C&D) waste generation is regarded as a prerequisite for the implementation of successful waste management. In literature, various methods have been employed to quantify the C&D waste generation at both regional and project levels. However, an integrated review that systemically describes and analyses all the existing methods has yet to be conducted. To bridge this research gap, an analytical review is conducted. Fifty-seven papers are retrieved based on a set of rigorous procedures. The characteristics of the selected papers are classified according to the following criteria - waste generation activity, estimation level and quantification methodology. Six categories of existing C&D waste quantification methodologies are identified, including site visit method, waste generation rate method, lifetime analysis method, classification system accumulation method, variables modelling method and other particular methods. A critical comparison of the identified methods is given according to their characteristics and implementation constraints. Moreover, a decision tree is proposed for aiding the selection of the most appropriate quantification method in different scenarios. Based on the analytical review, limitations of previous studies and recommendations of potential future research directions are further suggested. Copyright © 2014 Elsevier Ltd. All rights reserved.
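Of the six identified categories, the waste generation rate method is the simplest to sketch: multiply gross floor area by an empirical per-area rate for each material category. The rates below are hypothetical placeholders, not values from the review.

```python
# Hypothetical regional generation rates (kg of waste per m^2 of floor area).
generation_rates_kg_per_m2 = {
    "concrete": 32.0,
    "brick": 11.5,
    "timber": 3.2,
    "metal": 1.8,
}

def estimate_cd_waste(floor_area_m2, rates):
    """Return per-category and total C&D waste estimates in tonnes."""
    per_category = {k: floor_area_m2 * r / 1000.0 for k, r in rates.items()}
    return per_category, sum(per_category.values())

# Example: a 5000 m^2 new-build project.
per_cat, total_t = estimate_cd_waste(5000.0, generation_rates_kg_per_m2)
```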

  17. Development and validation of an ultra high performance liquid chromatography-electrospray tandem mass spectrometry method using selective derivatisation, for the quantification of two reactive aldehydes produced by lipid peroxidation, HNE (4-hydroxy-2(E)-nonenal) and HHE (4-hydroxy-2(E)-hexenal) in faecal water.

    PubMed

    Chevolleau, S; Noguer-Meireles, M-H; Jouanin, I; Naud, N; Pierre, F; Gueraud, F; Debrauwer, L

    2018-04-15

    Diets rich in red or processed meat have been shown to be associated with an elevated risk of colorectal cancer (CRC). One major hypothesis involves dietary heme iron, which induces lipid peroxidation. The quantification of the resulting reactive aldehydes (e.g. HNE and HHE) in the colon lumen is therefore of great concern, since these compounds are known for their cytotoxic and genotoxic properties. A UHPLC-ESI-MS/MS method was developed and validated for HNE and HHE quantification in rat faeces. Samples were derivatised using a brominated reagent (BBHA) in the presence of pre-synthesized deuterated internal standards (HNE-d11/HHE-d5), extracted by solid phase extraction, and then analysed by LC-positive ESI-MS/MS (MRM) on a TSQ Vantage mass spectrometer. The use of BBHA allowed the efficient stabilisation of the unstable and reactive hydroxy-alkenals HNE and HHE. The MRM method allowed selective detection of HNE and HHE on the basis of characteristic transitions monitored from both the 79 and 81 bromine isotopic peaks. This method was validated according to the European Medicines Agency (EMEA) guidelines, by determining selectivity, sensitivity, linearity, carry-over effect, recovery, matrix effect, repeatability, trueness and intermediate precision. The performance of the method enabled the quantification of HNE and HHE at concentrations of 0.10-0.15 μM in faecal water. Results are presented on the application to the quantification of HNE and HHE in different faecal waters obtained from faeces of rats fed diets with various fatty acid compositions, corresponding to different pro-oxidative features. Copyright © 2018 Elsevier B.V. All rights reserved.

  18. Development and validation of a bioanalytical LC-MS method for the quantification of GHRP-6 in human plasma.

    PubMed

    Gil, Jeovanis; Cabrales, Ania; Reyes, Osvaldo; Morera, Vivian; Betancourt, Lázaro; Sánchez, Aniel; García, Gerardo; Moya, Galina; Padrón, Gabriel; Besada, Vladimir; González, Luis Javier

    2012-02-23

    Growth hormone-releasing peptide 6 (GHRP-6, His-(DTrp)-Ala-Trp-(DPhe)-Lys-NH₂, MW=872.44 Da) is a potent growth hormone secretagogue that exhibits a cytoprotective effect, maintaining tissue viability during acute ischemia/reperfusion episodes in different organs like small bowel, liver and kidneys. In the present work a quantitative method to analyze GHRP-6 in human plasma was developed and fully validated following FDA guidelines. The method uses an internal standard (IS) of GHRP-6 with ¹³C-labeled alanine for quantification. Sample processing includes a precipitation step with cold acetone to remove the most abundant plasma proteins, recovering the GHRP-6 peptide with a high yield. Quantification was achieved by LC-MS in positive full scan mode in a Q-Tof mass spectrometer. The sensitivity of the method was evaluated, establishing the lower limit of quantification at 5 ng/mL and a range for the calibration curve from 5 ng/mL to 50 ng/mL. A dilution integrity test was performed to analyze samples at higher concentrations of GHRP-6. The validation process involved five calibration curves and the analysis of quality control samples to determine accuracy and precision. The calibration curves showed R² higher than 0.988. The stability of the analyte and the IS was demonstrated under all conditions the samples would experience during real sample analyses. This method was applied to the quantification of GHRP-6 in plasma from nine healthy volunteers participating in a phase I clinical trial. Copyright © 2011 Elsevier B.V. All rights reserved.

  19. Quantification of susceptibility change at high-concentrated SPIO-labeled target by characteristic phase gradient recognition.

    PubMed

    Zhu, Haitao; Nie, Binbin; Liu, Hua; Guo, Hua; Demachi, Kazuyuki; Sekino, Masaki; Shan, Baoci

    2016-05-01

    Phase map cross-correlation detection and quantification can produce highlighted signal at superparamagnetic iron oxide nanoparticles and distinguish them from other hypointensities. The method quantifies susceptibility change by performing a least squares analysis between a theoretically generated magnetic field template and an experimentally scanned phase image. Because characteristic phase recognition requires the removal of phase wrap and phase background, the additional phase unwrapping and filtering steps increase the chance of computing error and enlarge the inconsistency among algorithms. To solve this problem, a phase gradient cross-correlation and quantification method was developed that recognizes the characteristic phase gradient pattern instead of the phase image, because the phase gradient operation inherently includes unwrapping and filtering functions. However, few studies have addressed the detectable limit of currently used phase gradient calculation algorithms. This limit may lead to an underestimation of the large magnetic susceptibility change caused by high-concentrated iron accumulation. In this study, mathematical derivation establishes the maximum phase gradient detectable by the differential chain algorithm in both the spatial and Fourier domains. To break through this limit, a modified quantification method is proposed that uses unwrapped forward differentiation for phase gradient generation. The method enlarges the detectable range of phase gradient measurement and avoids the underestimation of magnetic susceptibility. Simulation and phantom experiments were used to quantitatively compare the different methods. For in vivo application, MRI scans were performed on nude mice implanted with iron-labeled human cancer cells. Results validate the limit of detectable phase gradient and the consequent susceptibility underestimation, and demonstrate the advantage of unwrapped forward differentiation over differential chain algorithms for susceptibility quantification at high-concentrated iron accumulation. Copyright © 2015 Elsevier Inc. All rights reserved.
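The detectable-limit argument can be illustrated numerically: a finite difference computed from wrapped phase is itself confined to (-pi, pi] per pixel, so a steeper true gradient is aliased, whereas forward differentiation of the unwrapped phase is not limited. A 1-D toy example (the geometry and gradient value are illustrative only):

```python
import numpy as np

# True phase ramp with a gradient deliberately above pi rad/pixel.
true_grad = 3.5                       # rad/pixel
x = np.arange(16)
phase = true_grad * x                 # unwrapped (true) phase
wrapped = np.angle(np.exp(1j * phase))  # what the scanner phase image shows

# Differential-chain style estimate from wrapped phase: each finite
# difference is re-wrapped into (-pi, pi], aliasing 3.5 to 3.5 - 2*pi.
chain = np.angle(np.exp(1j * np.diff(wrapped)))

# Forward differentiation of the unwrapped phase recovers the true gradient.
forward = np.diff(phase)
```

Here `chain` reports a negative gradient of about -2.78 rad/pixel for a true +3.5 rad/pixel ramp, which is the underestimation mechanism the abstract describes.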

  20. Rapid quantification of vesicle concentration for DOPG/DOPC and Cardiolipin/DOPC mixed lipid systems of variable composition.

    PubMed

    Elmer-Dixon, Margaret M; Bowler, Bruce E

    2018-05-19

    A novel approach to quantify mixed lipid systems is described. Traditional approaches to lipid vesicle quantification are time-consuming, require large amounts of material and are destructive. We extend our recently described method for quantification of pure lipid systems to mixed lipid systems. The method only requires a UV-Vis spectrometer and does not destroy the sample. Mie scattering data from absorbance measurements are used as input into a Matlab program to calculate the total vesicle concentration and the concentrations of each lipid in the mixed lipid system. The technique is fast and accurate, which is essential for analytical lipid binding experiments. Copyright © 2018. Published by Elsevier Inc.
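The calculation step can be sketched as fitting the measured spectrum with per-component basis spectra by least squares. The power-law basis shapes below are synthetic stand-ins for the Mie-theory curves used by the authors' Matlab program, and all numbers are illustrative:

```python
import numpy as np

# Synthetic per-lipid scattering basis spectra over a UV-Vis wavelength grid
# (hypothetical shapes standing in for Mie-theory predictions).
wavelengths = np.linspace(300.0, 500.0, 50)
basis_dopg = (400.0 / wavelengths) ** 4     # Rayleigh-like wavelength dependence
basis_dopc = (400.0 / wavelengths) ** 3.5
B = np.column_stack([basis_dopg, basis_dopc])

# Synthetic "measured" absorbance built from known concentrations (arb. units).
c_true = np.array([2.0, 5.0])
measured = B @ c_true

# Ordinary least squares recovers the two component concentrations.
c_fit, *_ = np.linalg.lstsq(B, measured, rcond=None)
```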

  1. Colorimetric protein determination in microalgae (Chlorophyta): association of milling and SDS treatment for total protein extraction.

    PubMed

    Mota, Maria Fernanda S; Souza, Marcella F; Bon, Elba P S; Rodrigues, Marcoaurelio A; Freitas, Suely Pereira

    2018-05-24

    The use of colorimetric methods for protein quantification in microalgae is hindered by their elevated amounts of membrane-embedded intracellular proteins. In this work, the protein content of three species of microalgae was determined by the Lowry method after the cells were dried, ball-milled, and treated with the detergent sodium dodecyl sulfate (SDS). Results demonstrated that the association of milling and SDS treatment resulted in a 3- to 7-fold increase in protein quantification. Milling promoted microalgal disaggregation and cell wall disruption enabling access of the SDS detergent to the microalgal intracellular membrane proteins and their efficient solubilization and quantification. © 2018 Phycological Society of America.

  2. GC-MS quantification of suspected volatile allergens in fragrances. 2. Data treatment strategies and method performances.

    PubMed

    Bassereau, Maud; Chaintreau, Alain; Duperrex, Stéphanie; Joulain, Daniel; Leijs, Hans; Loesing, Gerd; Owen, Neil; Sherlock, Alan; Schippa, Christine; Thorel, Pierre-Jean; Vey, Matthias

    2007-01-10

    The performance of the GC-MS determination of suspected allergens in fragrance concentrates has been investigated. The limit of quantification was experimentally determined (10 mg/L), and the variability was investigated for three different data treatment strategies: (1) two columns and three quantification ions; (2) two columns and one quantification ion; and (3) one column and three quantification ions. The first strategy best minimizes the risk of determination bias due to coelutions. This risk was evaluated by calculating the probability of coeluting a suspected allergen with perfume constituents exhibiting ions in common. For hydroxycitronellal, when using a two-column strategy, this may statistically occur more than once every 36 analyses for one ion in common, or once every 144 analyses for three ions in common.

  3. High-performance Thin-layer Chromatographic-densitometric Quantification and Recovery of Bioactive Compounds for Identification of Elite Chemotypes of Gloriosa superba L. Collected from Sikkim Himalayas (India).

    PubMed

    Misra, Ankita; Shukla, Pushpendra Kumar; Kumar, Bhanu; Chand, Jai; Kushwaha, Poonam; Khalid, Md; Singh Rawat, Ajay Kumar; Srivastava, Sharad

    2017-10-01

    Gloriosa superba L. (Colchicaceae) is used as adjuvant therapy in gout for its potential antimitotic activity due to high colchicine(s) alkaloids. This study aimed to develop an easy, cheap, precise, and accurate validated high-performance thin-layer chromatographic (HPTLC) method for simultaneous quantification of bioactive alkaloids (colchicine and gloriosine) in G. superba L. and to identify its elite chemotype(s) from the Sikkim Himalayas (India). The HPTLC method was developed using a mobile phase of chloroform: acetone: diethyl amine (5:4:1) at λmax of 350 nm. Five germplasms were collected from the targeted region, and on morpho-anatomical inspection, no significant variation was observed among them. Quantification data reveal that the content of colchicine (Rf: 0.72) and gloriosine (Rf: 0.61) varies from 0.035%-0.150% and 0.006%-0.032% (dry wt. basis), respectively. Linearity of the method was obtained in the concentration range of 100-400 ng/spot of marker(s), exhibiting regression coefficients of 0.9987 (colchicine) and 0.9983 (gloriosine) with optimum recoveries of 97.79% ± 3.86% and 100.023% ± 0.01%, respectively. The limits of detection and quantification were 6.245 and 18.926 ng for colchicine and 8.024 and 24.316 ng for gloriosine, respectively. Two germplasms, namely NBG-27 and NBG-26, were found to be elite chemotypes for both markers. The developed method is validated in terms of accuracy, recovery, and precision studies as per the ICH guidelines (2005) and can be adopted for the simultaneous quantification of colchicine and gloriosine in phytopharmaceuticals. In addition, this study is relevant to explore the chemotypic variability in metabolite content for commercial and medicinal purposes.

  4. Development and validation of an LC-ESI-MS/MS method for the quantification of D-84, reboxetine and citalopram for their use in MS Binding Assays addressing the monoamine transporters hDAT, hSERT and hNET.

    PubMed

    Neiens, Patrick; De Simone, Angela; Ramershoven, Anna; Höfner, Georg; Allmendinger, Lars; Wanner, Klaus T

    2018-03-03

    MS Binding Assays represent a label-free alternative to radioligand binding assays. In this study, we present an LC-ESI-MS/MS method for the quantification of (R,R)-4-(2-benzhydryloxyethyl)-1-(4-fluorobenzyl)piperidin-3-ol [(R,R)-D-84, (R,R)-1], (S,S)-reboxetine [(S,S)-2], and (S)-citalopram [(S)-3] employed as highly selective nonlabeled reporter ligands in MS Binding Assays addressing the dopamine [DAT, (R,R)-D-84], norepinephrine [NET, (S,S)-reboxetine] and serotonin transporter [SERT, (S)-citalopram], respectively. The developed LC-ESI-MS/MS method uses a pentafluorophenyl stationary phase in combination with a mobile phase composed of acetonitrile and ammonium formate buffer for chromatography and a triple quadrupole mass spectrometer in the multiple reaction monitoring mode for mass spectrometric detection. Quantification is based on deuterated derivatives of all three analytes serving as internal standards. The established LC-ESI-MS/MS method enables fast, robust, selective and highly sensitive quantification of all three reporter ligands in a single chromatographic run. The method was validated according to the Center for Drug Evaluation and Research (CDER) guideline for bioanalytical method validation regarding selectivity, accuracy, precision, calibration curve and sensitivity. Finally, filtration-based MS Binding Assays were performed for all three monoamine transporters with this LC-ESI-MS/MS quantification method as the readout. The affinities determined in saturation experiments for (R,R)-D-84 toward hDAT, for (S,S)-reboxetine toward hNET, and for (S)-citalopram toward hSERT were in good accordance with literature values, clearly demonstrating that the established MS Binding Assays have the potential to be an efficient alternative to the radioligand binding assays widely used for this purpose so far. Copyright © 2018 John Wiley & Sons, Ltd.

  5. Impact of Nuclear Data Uncertainties on Advanced Fuel Cycles and their Irradiated Fuel - a Comparison between Libraries

    NASA Astrophysics Data System (ADS)

    Díez, C. J.; Cabellos, O.; Martínez, J. S.

    2014-04-01

    The uncertainties on the isotopic composition throughout the burnup due to nuclear data uncertainties are analysed. The different sources of uncertainty (decay data, fission yields and cross sections) are propagated individually, and their effects assessed. Two applications are studied: EFIT (an ADS-like reactor) and ESFR (a sodium fast reactor). The impacts of the cross-section uncertainties provided by the EAF-2010, SCALE6.1 and COMMARA-2.0 libraries are compared. These uncertainty quantification (UQ) studies have been carried out with a Monte Carlo sampling approach implemented in the depletion/activation code ACAB. The implementation has been improved to handle depletion/activation problems with variations of the neutron spectrum.

  6. Quantitative 31P NMR for Simultaneous Trace Analysis of Organophosphorus Pesticides in Aqueous Media Using the Stir Bar Sorptive Extraction Method

    NASA Astrophysics Data System (ADS)

    Ansari, S.; Talebpour, Z.; Molaabasi, F.; Bijanzadeh, H. R.; Khazaeli, S.

    2016-09-01

    The analysis of pesticides in water samples is of primary concern for quality control laboratories due to the toxicity of these compounds and their associated public health risk. A novel analytical method based on stir bar sorptive extraction (SBSE), followed by 31P quantitative nuclear magnetic resonance (31P QNMR), has been developed for simultaneously monitoring and determining four organophosphorus pesticides (OPPs) in aqueous media. The effects of factors on the extraction efficiency of OPPs were investigated using a Draper-Lin small composite design. An optimal sample volume of 4.2 mL, extraction time of 96 min, extraction temperature of 42°C, and desorption time of 11 min were obtained. The results showed reasonable linearity ranges for all pesticides with correlation coefficients greater than 0.9920. The limit of quantification (LOQ) ranged from 0.1 to 2.60 mg/L, and the recoveries of spiked river water samples were from 82 to 94% with relative standard deviation (RSD) values less than 4%. The results show that this method is simple, selective, rapid, and can be applied to other sample matrices.

  7. Simultaneous determination of sucralose and related compounds by high-performance liquid chromatography with evaporative light scattering detection.

    PubMed

    Yan, Wenwu; Wang, Nani; Zhang, Peimin; Zhang, Jiajie; Wu, Shuchao; Zhu, Yan

    2016-08-01

    Sucralose is widely used in food and beverages as a sweetener. Current synthesis approaches typically provide sucralose products with varying levels of related chlorinated carbohydrates, which can affect the taste and flavor-modifying properties of sucralose. Quantification of related compounds in sucralose is often hampered by the lack of commercially available standards. In this work, nine related compounds were purified (purity >97%) and identified by liquid chromatography-mass spectrometry (LC-MS) and nuclear magnetic resonance (NMR); a rapid and simple HPLC method coupled with evaporative light scattering detection (ELSD) was then developed for the simultaneous determination of sucralose and related compounds. Under optimized conditions, the method showed good linearity in the range of 2-600 μg/mL with determination coefficients R² ≥ 0.9990. Moreover, low limits of detection in the range of 0.5-2.0 μg/mL and good repeatability (RSD < 3%, n = 6) were obtained. Recoveries were from 96.8% to 101.2%. Finally, the method has been successfully applied to sucralose quality control and purification process monitoring. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Quantitative Serum Nuclear Magnetic Resonance Metabolomics in Large-Scale Epidemiology: A Primer on -Omic Technologies

    PubMed Central

    Kangas, Antti J; Soininen, Pasi; Lawlor, Debbie A; Davey Smith, George; Ala-Korpela, Mika

    2017-01-01

    Abstract Detailed metabolic profiling in large-scale epidemiologic studies has uncovered novel biomarkers for cardiometabolic diseases and clarified the molecular associations of established risk factors. A quantitative metabolomics platform based on nuclear magnetic resonance spectroscopy has found widespread use, already profiling over 400,000 blood samples. Over 200 metabolic measures are quantified per sample; in addition to many biomarkers routinely used in epidemiology, the method simultaneously provides fine-grained lipoprotein subclass profiling and quantification of circulating fatty acids, amino acids, gluconeogenesis-related metabolites, and many other molecules from multiple metabolic pathways. Here we focus on applications of magnetic resonance metabolomics for quantifying circulating biomarkers in large-scale epidemiology. We highlight the molecular characterization of risk factors, use of Mendelian randomization, and the key issues of study design and analyses of metabolic profiling for epidemiology. We also detail how integration of metabolic profiling data with genetics can enhance drug development. We discuss why quantitative metabolic profiling is becoming widespread in epidemiology and biobanking. Although large-scale applications of metabolic profiling are still novel, it seems likely that comprehensive biomarker data will contribute to etiologic understanding of various diseases and abilities to predict disease risks, with the potential to translate into multiple clinical settings. PMID:29106475

  9. Lamb wave-based damage quantification and probability of detection modeling for fatigue life assessment of riveted lap joint

    NASA Astrophysics Data System (ADS)

    He, Jingjing; Wang, Dengjiang; Zhang, Weifang

    2015-03-01

    This study presents an experimental and modeling study for damage detection and quantification in riveted lap joints. Embedded lead zirconate titanate piezoelectric (PZT) ceramic wafer-type sensors are employed to perform in-situ non-destructive testing during fatigue cyclical loading. A multi-feature integration method is developed to quantify the crack size using the signal features of correlation coefficient, amplitude change, and phase change. In addition, a probability of detection (POD) model is constructed to quantify the reliability of the developed sizing method. Using the developed crack size quantification method and the resulting POD curve, probabilistic fatigue life prediction can be performed to provide comprehensive information for decision-making. The effectiveness of the overall methodology is demonstrated and validated using several aircraft lap joint specimens from different manufacturers and under different loading conditions.
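A common way to express a POD model is a lognormal hit/miss curve, where detection probability rises with crack size. The parameters below are illustrative assumptions, not values fitted to the paper's lap-joint data:

```python
import math

# Lognormal hit/miss POD model: POD(a) = Phi((ln a - mu) / sigma).
# mu is the log of the median detectable crack size; sigma sets the spread.
mu, sigma = math.log(2.0), 0.4   # hypothetical: median detectable size 2.0 mm

def pod(a_mm):
    """Probability of detecting a crack of size a (mm)."""
    z = (math.log(a_mm) - mu) / sigma
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# a90: the crack size detected with 90% probability (standard normal z = 1.2816),
# a quantity often quoted alongside POD curves for reliability assessment.
a90 = math.exp(mu + 1.2816 * sigma)
```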

  10. A Versatile Cell Death Screening Assay Using Dye-Stained Cells and Multivariate Image Analysis.

    PubMed

    Collins, Tony J; Ylanko, Jarkko; Geng, Fei; Andrews, David W

    2015-11-01

    A novel dye-based method for measuring cell death in image-based screens is presented. Unlike conventional high- and medium-throughput cell death assays that measure only one form of cell death accurately, using multivariate analysis of micrographs of cells stained with an inexpensive mix of the red dye nonyl acridine orange and a nuclear stain, it was possible to quantify cell death induced by a variety of different agonists even without a positive control. Surprisingly, using a single known cytotoxic agent as a positive control for training a multivariate classifier allowed accurate quantification of cytotoxicity for mechanistically unrelated compounds, enabling generation of dose-response curves. Comparison with low throughput biochemical methods suggested that cell death was accurately distinguished from cell stress induced by low concentrations of the bioactive compounds Tunicamycin and Brefeldin A. Analyses of more than 300 kinase inhibitors in high-throughput image-based format correctly identified 11 as cytotoxic with only 1 false positive. The simplicity and robustness of this dye-based assay makes it particularly suited to live cell screening for toxic compounds.

  11. A Versatile Cell Death Screening Assay Using Dye-Stained Cells and Multivariate Image Analysis

    PubMed Central

    Collins, Tony J.; Ylanko, Jarkko; Geng, Fei

    2015-01-01

    Abstract A novel dye-based method for measuring cell death in image-based screens is presented. Unlike conventional high- and medium-throughput cell death assays that measure only one form of cell death accurately, using multivariate analysis of micrographs of cells stained with an inexpensive mix of the red dye nonyl acridine orange and a nuclear stain, it was possible to quantify cell death induced by a variety of different agonists even without a positive control. Surprisingly, using a single known cytotoxic agent as a positive control for training a multivariate classifier allowed accurate quantification of cytotoxicity for mechanistically unrelated compounds, enabling generation of dose–response curves. Comparison with low throughput biochemical methods suggested that cell death was accurately distinguished from cell stress induced by low concentrations of the bioactive compounds Tunicamycin and Brefeldin A. Analyses of more than 300 kinase inhibitors in high-throughput image-based format correctly identified 11 as cytotoxic with only 1 false positive. The simplicity and robustness of this dye-based assay makes it particularly suited to live cell screening for toxic compounds. PMID:26422066

  12. A scoring metric for multivariate data for reproducibility analysis using chemometric methods

    PubMed Central

    Sheen, David A.; de Carvalho Rocha, Werickson Fortunato; Lippa, Katrice A.; Bearden, Daniel W.

    2017-01-01

    Process quality control and reproducibility in emerging measurement fields such as metabolomics is normally assured by interlaboratory comparison testing. As a part of this testing process, spectral features from a spectroscopic method such as nuclear magnetic resonance (NMR) spectroscopy are attributed to particular analytes within a mixture, and it is the metabolite concentrations that are returned for comparison between laboratories. However, data quality may also be assessed directly by using binned spectral data before the time-consuming identification and quantification. Use of the binned spectra has some advantages, including preserving information about trace constituents and enabling identification of process difficulties. In this paper, we demonstrate the use of binned NMR spectra to conduct a detailed interlaboratory comparison and composition analysis. Spectra of synthetic and biologically-obtained metabolite mixtures, taken from a previous interlaboratory study, are compared with cluster analysis using a variety of distance and entropy metrics. The individual measurements are then evaluated based on where they fall within their clusters, and a laboratory-level scoring metric is developed, which provides an assessment of each laboratory’s individual performance. PMID:28694553
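The laboratory-level scoring idea described above can be sketched in Python. This is an illustrative reconstruction under stated assumptions, not the paper's exact metric: the bin count, the total-intensity normalisation, and the choice of correlation distance to a median consensus spectrum are all assumptions.

```python
import numpy as np

def lab_scores(spectra):
    """Score each laboratory's binned spectrum against the group consensus.

    spectra: dict mapping lab id -> 1-D array of binned intensities
    (bins aligned across labs). Returns correlation distances to the
    median consensus spectrum; larger means further from the group.
    """
    labs = sorted(spectra)
    X = np.vstack([spectra[lab] for lab in labs])
    X = X / X.sum(axis=1, keepdims=True)       # normalise total intensity
    consensus = np.median(X, axis=0)
    return {lab: 1.0 - np.corrcoef(x, consensus)[0, 1]
            for lab, x in zip(labs, X)}

# Toy demo: lab C reports a heavily distorted spectrum
rng = np.random.default_rng(0)
base = rng.random(50)
data = {"A": base + rng.normal(0, 0.01, 50),
        "B": base + rng.normal(0, 0.01, 50),
        "C": base + rng.normal(0, 0.50, 50)}
scores = lab_scores(data)
```

On this toy data the outlying laboratory receives a clearly larger distance score, which is the behaviour the interlaboratory scoring metric relies on.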

  13. Determination of statin drugs in hospital effluent with dispersive liquid-liquid microextraction and quantification by liquid chromatography.

    PubMed

    Martins, Ayrton F; Frank, Carla da S; Altissimo, Joseline; de Oliveira, Júlia A; da Silva, Daiane S; Reichert, Jaqueline F; Souza, Darliana M

    2017-08-24

    Statins are amongst the most prescribed agents for treating hypercholesterolaemia and preventing vascular diseases. In this study, a rapid and effective liquid chromatography method, assisted by diode array detection, was designed and validated for the simultaneous quantification of atorvastatin (ATO) and simvastatin (SIM) in hospital effluent samples. The solid-phase extraction (SPE) of the analytes was optimized with regard to sorbent material and pH, and the dispersive liquid-liquid microextraction (DLLME) in terms of pH, ionic strength, and type and volume of extractor/disperser solvents. The performance of both extraction procedures was evaluated in terms of linearity, quantification limits, accuracy (recovery %), precision and matrix effects for each analyte. The methods proved to be linear in the concentration range considered; the quantification limits were 0.45 µg L⁻¹ for ATO and 0.75 µg L⁻¹ for SIM; the matrix effect was almost absent in both methods; the average recoveries remained between 81.5 and 90.0%; and the RSD values were <20%. The validated methods were applied to the quantification of the statins in real samples of hospital effluent; the concentrations ranged from 18.8 µg L⁻¹ to 35.3 µg L⁻¹ for ATO, and from 30.3 µg L⁻¹ to 38.5 µg L⁻¹ for SIM. Since the calculated risk quotient was ≤192, the occurrence of ATO and SIM in hospital effluent poses a potentially serious risk to human health and the aquatic ecosystem.
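For context, environmental risk quotients of the kind cited above are conventionally computed as the measured environmental concentration (MEC) divided by the predicted no-effect concentration (PNEC), with RQ ≥ 1 read as indicating potential risk. A minimal sketch follows; the PNEC value below is a hypothetical placeholder for illustration, not a figure from this study.

```python
def risk_quotient(mec_ug_per_l, pnec_ug_per_l):
    """Risk quotient = measured environmental concentration (MEC)
    divided by predicted no-effect concentration (PNEC).
    RQ >= 1 is conventionally read as indicating potential risk."""
    return mec_ug_per_l / pnec_ug_per_l

# Hypothetical PNEC of 1 ug/L, for illustration only (not from the study)
rq_high = risk_quotient(35.3, 1.0)   # ATO maximum reported above
rq_low = risk_quotient(0.5, 1.0)
```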

  14. Quantitative microscopy uncovers ploidy changes during mitosis in live Drosophila embryos and their effect on nuclear size.

    PubMed

    Puah, Wee Choo; Chinta, Rambabu; Wasser, Martin

    2017-03-15

    Time-lapse microscopy is a powerful tool to investigate cellular and developmental dynamics. In Drosophila melanogaster, it can be used to study division cycles in embryogenesis. To obtain quantitative information from 3D time-lapse data and track proliferating nuclei from the syncytial stage until gastrulation, we developed an image analysis pipeline consisting of nuclear segmentation, tracking, annotation and quantification. Image analysis of maternal-haploid (mh) embryos revealed that a fraction of haploid syncytial nuclei fused to give rise to nuclei of higher ploidy (2n, 3n, 4n). Moreover, nuclear densities in mh embryos at the mid-blastula transition varied over threefold. By tracking synchronized nuclei of different karyotypes side-by-side, we show that DNA content determines nuclear growth rate and size in early interphase, while the nuclear to cytoplasmic ratio constrains nuclear growth during late interphase. mh encodes the Drosophila ortholog of human Spartan, a protein involved in DNA damage tolerance. To explore the link between mh and chromosome instability, we fluorescently tagged Mh protein to study its subcellular localization. We show Mh-mKO2 localizes to nuclear speckles that increase in numbers as nuclei expand in interphase. In summary, quantitative microscopy can provide new insights into well-studied genes and biological processes. © 2017. Published by The Company of Biologists Ltd.

  15. Quantification of biofilm in microtiter plates: overview of testing conditions and practical recommendations for assessment of biofilm production by staphylococci.

    PubMed

    Stepanović, Srdjan; Vuković, Dragana; Hola, Veronika; Di Bonaventura, Giovanni; Djukić, Slobodanka; Cirković, Ivana; Ruzicka, Filip

    2007-08-01

    The details of all steps involved in the quantification of biofilm formation in microtiter plates are described. The presented protocol incorporates information on assessment of biofilm production by staphylococci, gained both by direct experience as well as by analysis of methods for assaying biofilm production. The obtained results should simplify quantification of biofilm formation in microtiter plates, and make it more reliable and comparable among different laboratories.
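As a hedged illustration of the assessment step, the cutoff scheme commonly associated with this microtiter protocol (ODc = mean + 3 SD of the negative controls, with weak/moderate/strong categories bounded at ODc, 2×ODc and 4×ODc) can be sketched as follows; the OD readings are hypothetical.

```python
import statistics

def classify_biofilm(od_sample, od_negative_controls):
    """Classify biofilm production from microtiter-plate OD readings.

    Uses the cutoff scheme commonly associated with this protocol:
    ODc = mean + 3 * SD of the negative controls, with category
    boundaries at ODc, 2*ODc and 4*ODc.
    """
    odc = (statistics.mean(od_negative_controls)
           + 3 * statistics.stdev(od_negative_controls))
    od = statistics.mean(od_sample)
    if od <= odc:
        return "non-producer"
    if od <= 2 * odc:
        return "weak"
    if od <= 4 * odc:
        return "moderate"
    return "strong"

neg_controls = [0.08, 0.09, 0.10, 0.09]   # hypothetical OD570 readings
result = classify_biofilm([0.50, 0.55, 0.52], neg_controls)
```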

  16. On the Confounding Effect of Temperature on Chemical Shift-Encoded Fat Quantification

    PubMed Central

    Hernando, Diego; Sharma, Samir D.; Kramer, Harald; Reeder, Scott B.

    2014-01-01

    Purpose To characterize the confounding effect of temperature on chemical shift-encoded (CSE) fat quantification. Methods The proton resonance frequency of water, unlike triglycerides, depends on temperature. This leads to a temperature dependence of the spectral models of fat (relative to water) that are commonly used by CSE-MRI methods. Simulation analysis was performed for 1.5 Tesla CSE fat–water signals at various temperatures and echo time combinations. Oil–water phantoms were constructed and scanned at temperatures between 0 and 40°C using spectroscopy and CSE imaging at three echo time combinations. An explanted human liver, rejected for transplantation due to steatosis, was scanned using spectroscopy and CSE imaging. Fat–water reconstructions were performed using four different techniques: magnitude and complex fitting, with standard or temperature-corrected signal modeling. Results In all experiments, magnitude fitting with standard signal modeling resulted in large fat quantification errors. Errors were largest for echo time combinations near TEinit ≈ 1.3 ms, ΔTE ≈ 2.2 ms. Errors in fat quantification caused by temperature-related frequency shifts were smaller with complex fitting, and were avoided using a temperature-corrected signal model. Conclusion Temperature is a confounding factor for fat quantification. If not accounted for, it can result in large errors in fat quantification in phantom and ex vivo acquisitions. PMID:24123362

  17. Localized 2D COSY sequences: Method and experimental evaluation for a whole metabolite quantification approach

    NASA Astrophysics Data System (ADS)

    Martel, Dimitri; Tse Ve Koon, K.; Le Fur, Yann; Ratiney, Hélène

    2015-11-01

    Two-dimensional spectroscopy offers the possibility to unambiguously distinguish metabolites by spreading out the multiplet structure of J-coupled spin systems into a second dimension. Quantification methods that perform parametric fitting of the 2D MRS signal have recently been proposed for J-resolved PRESS (JPRESS) but not explicitly for Localized Correlation Spectroscopy (LCOSY). Here, through a whole metabolite quantification approach, correlation spectroscopy quantification performance is studied. The ability to quantify metabolite relaxation time constants is studied for three localized 2D MRS sequences (LCOSY, LCTCOSY and JPRESS) in vitro on preclinical MR systems. The issues encountered during implementation and the quantification strategies are discussed with the help of the Fisher matrix formalism. The described parameterized models enable the computation of the lower bound on error variance - generally known as the Cramér-Rao bounds (CRBs), a standard of precision - on the parameters estimated from these 2D MRS signal fittings. LCOSY has a theoretical net signal loss of two per unit of acquisition time compared to JPRESS. A quick analysis might suggest that the relative CRBs of LCOSY compared to JPRESS (expressed as a percentage of the concentration values) should be doubled, but we show that this is not necessarily true. Finally, the LCOSY quantification procedure has been applied to data acquired in vivo on a mouse brain.
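The Cramér-Rao bound computation mentioned above can be illustrated with a toy model. The monoexponential signal, its parameter values, and the white-Gaussian-noise assumption below are illustrative stand-ins, not the paper's actual 2D MRS signal model; the general recipe (invert the Fisher information matrix built from the model Jacobian) is the standard one.

```python
import numpy as np

def cramer_rao_bounds(jacobian, sigma):
    """Lower bounds on the standard deviation of unbiased parameter
    estimates, from the Fisher information matrix F = J^T J / sigma^2
    (valid for white Gaussian noise)."""
    fisher = jacobian.T @ jacobian / sigma**2
    return np.sqrt(np.diag(np.linalg.inv(fisher)))

# Toy signal model (assumed for illustration): s(t) = a * exp(-t / T2)
t = np.linspace(0.01, 1.0, 100)
a, T2, sigma = 1.0, 0.3, 0.05
model = a * np.exp(-t / T2)
J = np.column_stack([np.exp(-t / T2),        # ds/da
                     model * t / T2**2])     # ds/dT2
crb = cramer_rao_bounds(J, sigma)            # bounds on (a, T2)
```

Note that the bounds scale linearly with the noise level, which is why CRBs are often reported relative to the estimated concentration values.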

  18. Quantification of intestinal bacterial populations by real-time PCR with a universal primer set and minor groove binder probes: a global approach to the enteric flora.

    PubMed

    Ott, Stephan J; Musfeldt, Meike; Ullmann, Uwe; Hampe, Jochen; Schreiber, Stefan

    2004-06-01

    The composition of the human intestinal flora is important for the health status of the host. The global composition and the presence of specific pathogens are relevant to the effects of the flora. Therefore, accurate quantification of all major bacterial populations of the enteric flora is needed. A TaqMan real-time PCR-based method for the quantification of 20 dominant bacterial species and groups of the intestinal flora has been established on the basis of 16S ribosomal DNA taxonomy. A PCR with conserved primers was used for all reactions. In each real-time PCR, a universal probe for quantification of total bacteria and a specific probe for the species in question were included. PCR with conserved primers and the universal probe for total bacteria allowed relative and absolute quantification. Minor groove binder probes increased the sensitivity of the assays 10- to 100-fold. The method was evaluated by cross-reaction experiments and quantification of bacteria in complex clinical samples from healthy patients. A sensitivity of 10¹ to 10³ bacterial cells per sample was achieved. No significant cross-reaction was observed. The real-time PCR assays presented may facilitate understanding of the intestinal bacterial flora through a normalized global estimation of the major contributing species.

  19. Simultaneous quantification of cholesterol sulfate, androgen sulfates, and progestagen sulfates in human serum by LC-MS/MS

    PubMed Central

    Sánchez-Guijo, Alberto; Oji, Vinzenz; Hartmann, Michaela F.; Traupe, Heiko; Wudy, Stefan A.

    2015-01-01

    Steroids are primarily present in human fluids in their sulfated forms. Profiling of these compounds is important from both diagnostic and physiological points of view. Here, we present a novel method for the quantification of 11 intact steroid sulfates in human serum by LC-MS/MS. The compounds analyzed in our method, some of which are quantified for the first time in blood, include cholesterol sulfate, pregnenolone sulfate, 17-hydroxy-pregnenolone sulfate, 16-α-hydroxy-dehydroepiandrosterone sulfate, dehydroepiandrosterone sulfate, androstenediol sulfate, androsterone sulfate, epiandrosterone sulfate, testosterone sulfate, epitestosterone sulfate, and dihydrotestosterone sulfate. The assay was conceived to quantify sulfated steroids in a broad range of concentrations, requiring only 300 μl of serum. The method has been validated and its performance was studied at three quality controls, selected for each compound according to its physiological concentration. The assay showed good linearity (R² > 0.99) and recovery for all the compounds, with limits of quantification ranging between 1 and 80 ng/ml. Averaged intra-day and between-day precisions (coefficient of variation) and accuracies (relative errors) were below 10%. The method has been successfully applied to study the sulfated steroidome in diseases such as steroid sulfatase deficiency, proving its diagnostic value. This is, to our best knowledge, the most comprehensive method available for the quantification of sulfated steroids in human blood. PMID:26239050

  20. Leveraging transcript quantification for fast computation of alternative splicing profiles.

    PubMed

    Alamancos, Gael P; Pagès, Amadís; Trincado, Juan L; Bellora, Nicolás; Eyras, Eduardo

    2015-09-01

    Alternative splicing plays an essential role in many cellular processes and bears major relevance in the understanding of multiple diseases, including cancer. High-throughput RNA sequencing allows genome-wide analyses of splicing across multiple conditions. However, the increasing number of available data sets represents a major challenge in terms of computation time and storage requirements. We describe SUPPA, a computational tool that calculates relative inclusion values of alternative splicing events by exploiting fast transcript quantification. On simulated as well as real RNA-sequencing data, and against experimentally validated events, SUPPA's accuracy is comparable and sometimes superior to that of standard methods. We assess the variability in terms of the choice of annotation and provide evidence that using complete transcripts rather than more transcripts per gene provides better estimates. Moreover, SUPPA coupled with de novo transcript reconstruction methods does not achieve accuracies as high as using quantification of known transcripts, but remains comparable to existing methods. Finally, we show that SUPPA is more than 1000 times faster than standard methods. Coupled with fast transcript quantification, SUPPA provides inclusion values at a much higher speed than existing methods without compromising accuracy, thereby facilitating the systematic splicing analysis of large data sets with limited computational resources. The software is implemented in Python 2.7 and is available under the MIT license at https://bitbucket.org/regulatorygenomicsupf/suppa. © 2015 Alamancos et al.; Published by Cold Spring Harbor Laboratory Press for the RNA Society.
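The relative inclusion value computed from transcript quantification reduces, in essence, to a ratio of transcript abundances. A minimal sketch follows, with hypothetical TPM values; this illustrates the quantity being computed, not SUPPA's actual implementation.

```python
def psi(inclusion_tpm, total_tpm):
    """Relative inclusion (PSI) of a splicing event: the summed
    abundance (e.g. TPM) of transcripts containing the inclusion form,
    divided by the summed abundance of all transcripts defining the
    event. Returns NaN when the event is not expressed."""
    total = sum(total_tpm)
    return sum(inclusion_tpm) / total if total > 0 else float("nan")

# Hypothetical abundances: two inclusion isoforms out of three total
value = psi([30.0, 10.0], [30.0, 10.0, 60.0])   # -> 0.4
```

Because the per-event arithmetic is this light once transcript abundances are known, almost all the computational cost sits in the transcript quantification step, which is what makes the approach fast.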

  1. miR-MaGiC improves quantification accuracy for small RNA-seq.

    PubMed

    Russell, Pamela H; Vestal, Brian; Shi, Wen; Rudra, Pratyaydipta D; Dowell, Robin; Radcliffe, Richard; Saba, Laura; Kechris, Katerina

    2018-05-15

    Many tools have been developed to profile microRNA (miRNA) expression from small RNA-seq data. These tools must contend with several issues: the small size of miRNAs, the small number of unique miRNAs, the fact that similar miRNAs can be transcribed from multiple loci, and the presence of miRNA isoforms known as isomiRs. Methods failing to address these issues can return misleading information. We propose a novel quantification method designed to address these concerns. We present miR-MaGiC, a novel miRNA quantification method, implemented as a cross-platform tool in Java. miR-MaGiC performs stringent mapping to a core region of each miRNA and defines a meaningful set of target miRNA sequences by collapsing the miRNA space to "functional groups". We hypothesize that these two features, mapping stringency and collapsing, provide more optimal quantification to a more meaningful unit (i.e., miRNA family). We test miR-MaGiC and several published methods on 210 small RNA-seq libraries, evaluating each method's ability to accurately reflect global miRNA expression profiles. We define accuracy as total counts close to the total number of input reads originating from miRNAs. We find that miR-MaGiC, which incorporates both stringency and collapsing, provides the most accurate counts.

  2. Ct shift: A novel and accurate real-time PCR quantification model for direct comparison of different nucleic acid sequences and its application for transposon quantifications.

    PubMed

    Kolacsek, Orsolya; Pergel, Enikő; Varga, Nóra; Apáti, Ágota; Orbán, Tamás I

    2017-01-20

    Quantitative PCR has numerous applications in both diagnostics and basic research. As in many other techniques, the basis of quantification is that comparisons are made between different (unknown and known or reference) specimens of the same entity. When the aim is to compare real quantities of different species in samples, their separate, precise absolute quantification is unavoidable. We have established a simple and reliable method for this purpose (the Ct shift method), which combines the absolute and the relative approaches. It requires a plasmid standard containing both sequences of the amplicons to be compared (e.g. the target of interest and the endogenous control). This plasmid can serve as a reference sample with equal template copies for both targets. Using the ΔΔCt formula, we can quantify the exact ratio of the two templates in each unknown sample. The Ct shift method has been successfully applied to transposon gene copy measurements, as well as to comparisons of different mRNAs in cDNA samples. This study provides the proof of concept and introduces some potential applications of the method; the absolute nature of the results, even without the need for real reference samples, can contribute to the universality of the method and the comparability of different studies. Copyright © 2016 Elsevier B.V. All rights reserved.
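The arithmetic behind the Ct shift idea can be sketched as follows, assuming roughly 100% amplification efficiency for both assays; the Ct values are hypothetical, and the function name is illustrative rather than taken from the paper.

```python
def target_to_control_ratio(ct_target_sample, ct_control_sample,
                            ct_target_plasmid, ct_control_plasmid):
    """Copy-number ratio of target to control in an unknown sample.

    Applies the delta-delta-Ct formula, using a plasmid that carries
    both amplicons in equal copies as the reference sample; the plasmid
    measurement corrects for the assay-to-assay Ct shift. Assumes
    ~100% amplification efficiency for both assays.
    """
    ddct = ((ct_target_sample - ct_control_sample)
            - (ct_target_plasmid - ct_control_plasmid))
    return 2.0 ** (-ddct)

# Hypothetical Ct values: target detected 1 cycle later than control,
# of which 0.5 cycles is an assay offset measured on the plasmid
ratio = target_to_control_ratio(25.0, 24.0, 20.5, 20.0)
```

Because the plasmid fixes the 1:1 reference point, the result is an absolute target-to-control ratio rather than a fold change relative to some calibrator sample.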

  3. Methods to Detect Nitric Oxide and its Metabolites in Biological Samples

    PubMed Central

    Bryan, Nathan S.; Grisham, Matthew B.

    2007-01-01

    Nitric oxide (NO) methodology is a complex and often confusing science and the focus of many debates and discussions concerning NO biochemistry. NO is involved in many physiological processes, including regulation of blood pressure, immune response and neural communication. Therefore, its accurate detection and quantification are critical to understanding health and disease. Because of the extremely short physiological half-life of this gaseous free radical, alternative strategies for the detection of reaction products of NO biochemistry have been developed. The quantification of NO metabolites in biological samples provides valuable information with regard to in vivo NO production, bioavailability and metabolism. Simply sampling a single compartment such as blood or plasma may not always provide an accurate assessment of whole-body NO status, particularly in tissues. Therefore, extrapolation of plasma or blood NO status to specific tissues of interest is no longer a valid approach. As a result, methods continue to be developed and validated that allow the detection and quantification of NO and NO-related products/metabolites in multiple compartments of experimental animals in vivo. This review is not an exhaustive or comprehensive discussion of all methods available for the detection of NO, but rather describes the most commonly used and practical methods that allow accurate and sensitive quantification of NO products/metabolites in multiple biological matrices under normal physiological conditions. PMID:17664129

  4. MRI-based methods for quantification of the cerebral metabolic rate of oxygen

    PubMed Central

    Rodgers, Zachary B; Detre, John A

    2016-01-01

    The brain depends almost entirely on oxidative metabolism to meet its significant energy requirements. As such, the cerebral metabolic rate of oxygen (CMRO2) represents a key measure of brain function. Quantification of CMRO2 has helped elucidate brain functional physiology and holds potential as a clinical tool for evaluating neurological disorders including stroke, brain tumors, Alzheimer’s disease, and obstructive sleep apnea. In recent years, a variety of magnetic resonance imaging (MRI)-based CMRO2 quantification methods have emerged. Unlike positron emission tomography – the current “gold standard” for measurement and mapping of CMRO2 – MRI is non-invasive, relatively inexpensive, and ubiquitously available in modern medical centers. All MRI-based CMRO2 methods are based on modeling the effect of paramagnetic deoxyhemoglobin on the magnetic resonance signal. The various methods can be classified in terms of the MRI contrast mechanism used to quantify CMRO2: T2*, T2′, T2, or magnetic susceptibility. This review article provides an overview of MRI-based CMRO2 quantification techniques. After a brief historical discussion motivating the need for improved CMRO2 methodology, current state-of-the-art MRI-based methods are critically appraised in terms of their respective tradeoffs between spatial resolution, temporal resolution, and robustness, all of critical importance given the spatially heterogeneous and temporally dynamic nature of brain energy requirements. PMID:27089912

  5. Identification of spectral regions for the quantification of red wine tannins with fourier transform mid-infrared spectroscopy.

    PubMed

    Jensen, Jacob S; Egebo, Max; Meyer, Anne S

    2008-05-28

    Accomplishment of fast tannin measurements is receiving increased interest as tannins are important for the mouthfeel and color properties of red wines. Fourier transform mid-infrared spectroscopy allows fast measurement of different wine components, but quantification of tannins is difficult due to interferences from spectral responses of other wine components. Four different variable selection tools were investigated for the identification of the most important spectral regions which would allow quantification of tannins from the spectra using partial least-squares regression. The study included the development of a new variable selection tool, iterative backward elimination of changeable size intervals PLS. The spectral regions identified by the different variable selection methods were not identical, but all included two regions (1485-1425 and 1060-995 cm⁻¹), which therefore were concluded to be particularly important for tannin quantification. The spectral regions identified from the variable selection methods were used to develop calibration models. All four variable selection methods identified regions that allowed an improved quantitative prediction of tannins (RMSEP = 69-79 mg of CE/L; r = 0.93-0.94) as compared to a calibration model developed using all variables (RMSEP = 115 mg of CE/L; r = 0.87). Only minor differences in the performance of the variable selection methods were observed.

  6. Quantitative interference by cysteine and N-acetylcysteine metabolites during the LC-MS/MS bioanalysis of a small molecule.

    PubMed

    Barricklow, Jason; Ryder, Tim F; Furlong, Michael T

    2009-08-01

    During LC-MS/MS quantification of a small molecule in human urine samples from a clinical study, an unexpected peak was observed to nearly co-elute with the analyte of interest in many study samples. Improved chromatographic resolution revealed the presence of at least 3 non-analyte peaks, which were identified as cysteine metabolites and N-acetyl (mercapturic acid) derivatives thereof. These metabolites produced artifact responses in the parent compound MRM channel due to decomposition in the ionization source of the mass spectrometer. Quantitative comparison of the analyte concentrations in study samples using the original chromatographic method and the improved chromatographic separation method demonstrated that the original method substantially over-estimated the analyte concentration in many cases. The substitution of electrospray ionization (ESI) for atmospheric pressure chemical ionization (APCI) nearly eliminated the source instability of these metabolites, which would have mitigated their interference in the quantification of the analyte, even without chromatographic separation. These results 1) demonstrate the potential for thiol metabolite interferences during the quantification of small molecules in pharmacokinetic samples, and 2) underscore the need to carefully evaluate LC-MS/MS methods for molecules that can undergo metabolism to thiol adducts to ensure that they are not susceptible to such interferences during quantification.

  7. Quantifying errors without random sampling.

    PubMed

    Phillips, Carl V; LaPole, Luwanna M

    2003-06-12

    All quantifications of mortality, morbidity, and other health measures involve numerous sources of error. The routine quantification of random sampling error makes it easy to forget that other sources of error can and should be quantified. When a quantification does not involve sampling, error is almost never quantified and results are often reported in ways that dramatically overstate their precision. We argue that the precision implicit in typical reporting is problematic and sketch methods for quantifying the various sources of error, building up from simple examples that can be solved analytically to more complex cases. There are straightforward ways to partially quantify the uncertainty surrounding a parameter that is not characterized by random sampling, such as limiting reported significant figures. We present simple methods for doing such quantifications, and for incorporating them into calculations. More complicated methods become necessary when multiple sources of uncertainty must be combined. We demonstrate that Monte Carlo simulation, using available software, can estimate the uncertainty resulting from complicated calculations with many sources of uncertainty. We apply the method to the current estimate of the annual incidence of foodborne illness in the United States. Quantifying uncertainty from systematic errors is practical. Reporting this uncertainty would more honestly represent study results, help show the probability that estimated values fall within some critical range, and facilitate better targeting of further research.
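The Monte Carlo approach advocated above can be sketched in a few lines. The toy model and the Beta priors below are hypothetical stand-ins chosen for illustration (they are not the paper's foodborne-illness calculation): an observed count is corrected by two uncertain multiplicative factors, and the resulting uncertainty is summarized by percentiles of the simulated distribution.

```python
import numpy as np

# Monte Carlo propagation of several non-sampling uncertainty sources
# through a calculation. Hypothetical model:
#   true_cases = reported_cases / (p_seek_care * p_reported)
rng = np.random.default_rng(42)
n = 100_000
reported_cases = 5000.0
p_seek_care = rng.beta(20, 30, n)    # assumed uncertainty, mean ~0.40
p_reported = rng.beta(15, 5, n)      # assumed uncertainty, mean ~0.75
estimate = reported_cases / (p_seek_care * p_reported)
lo, mid, hi = np.percentile(estimate, [2.5, 50.0, 97.5])
```

Reporting the resulting interval (lo, hi) rather than a single point value is precisely the honest-precision practice the abstract argues for.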

  8. Validated Method for the Quantification of Baclofen in Human Plasma Using Solid-Phase Extraction and Liquid Chromatography-Tandem Mass Spectrometry.

    PubMed

    Nahar, Limon Khatun; Cordero, Rosa Elena; Nutt, David; Lingford-Hughes, Anne; Turton, Samuel; Durant, Claire; Wilson, Sue; Paterson, Sue

    2016-03-01

    A highly sensitive and fully validated method was developed for the quantification of baclofen in human plasma. After adjusting the pH of the plasma samples using a phosphate buffer solution (pH 4), baclofen was purified using mixed mode (C8/cation exchange) solid-phase extraction (SPE) cartridges. Endogenous water-soluble compounds and lipids were removed from the cartridges before the samples were eluted and concentrated. The samples were analyzed using triple-quadrupole liquid chromatography-tandem mass spectrometry (LC-MS-MS) with triggered dynamic multiple reaction monitoring mode for simultaneous quantification and confirmation. The assay was linear from 25 to 1,000 ng/mL (r(2) > 0.999; n = 6). Intraday (n = 6) and interday (n = 15) imprecisions (% relative standard deviation) were <5%, and the average recovery was 30%. The limit of detection of the method was 5 ng/mL, and the limit of quantification was 25 ng/mL. Plasma samples from healthy male volunteers (n = 9, median age: 22) given two single oral doses of baclofen (10 and 60 mg) on nonconsecutive days were analyzed to demonstrate method applicability. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  9. A novel quadruplex real-time PCR method for simultaneous detection of Cry2Ae and two genetically modified cotton events (GHB119 and T304-40).

    PubMed

    Li, Xiang; Wang, Xiuxiu; Yang, Jielin; Liu, Yueming; He, Yuping; Pan, Liangwen

    2014-05-16

    To date, over 150 genetically modified (GM) crops are widely cultivated. To comply with regulations developed for genetically modified organisms (GMOs), including labeling policies, many detection methods for GMO identification and quantification have been developed. To detect unauthorized GM crop events entering or leaving China, we developed a novel quadruplex real-time PCR method for simultaneous detection and quantification of the GM cotton events GHB119 and T304-40 in cotton-derived products (based on the 5'-flanking sequence) and the insect-resistance gene Cry2Ae. The limit of detection was 10 copies for GHB119 and Cry2Ae and 25 copies for T304-40. The limit of quantification was 25 copies for GHB119 and Cry2Ae and 50 copies for T304-40. Moreover, low bias and acceptable standard deviation and relative standard deviation values were obtained in quantification analysis of six blind samples containing different GHB119 and T304-40 ingredients. The developed quadruplex quantitative method could be used for quantitative detection of the two GM cotton events (GHB119 and T304-40) and the Cry2Ae gene in cotton-derived products.

  10. A novel quadruplex real-time PCR method for simultaneous detection of Cry2Ae and two genetically modified cotton events (GHB119 and T304-40)

    PubMed Central

    2014-01-01

    Background To date, over 150 genetically modified (GM) crops are widely cultivated. To comply with regulations developed for genetically modified organisms (GMOs), including labeling policies, many detection methods for GMO identification and quantification have been developed. Results To detect unauthorized GM crop events entering or leaving China, we developed a novel quadruplex real-time PCR method for simultaneous detection and quantification of the GM cotton events GHB119 and T304-40 in cotton-derived products (based on the 5′-flanking sequence) and the insect-resistance gene Cry2Ae. The limit of detection was 10 copies for GHB119 and Cry2Ae and 25 copies for T304-40. The limit of quantification was 25 copies for GHB119 and Cry2Ae and 50 copies for T304-40. Moreover, low bias and acceptable standard deviation and relative standard deviation values were obtained in quantification analysis of six blind samples containing different GHB119 and T304-40 ingredients. Conclusions The developed quadruplex quantitative method could be used for quantitative detection of the two GM cotton events (GHB119 and T304-40) and the Cry2Ae gene in cotton-derived products. PMID:24884946

  11. In vivo quantification of amyloid burden in TTR-related cardiac amyloidosis

    PubMed Central

    Kollikowski, Alexander Marco; Kahles, Florian; Kintsler, Svetlana; Hamada, Sandra; Reith, Sebastian; Knüchel, Ruth; Röcken, Christoph; Mottaghy, Felix Manuel; Marx, Nikolaus; Burgmaier, Mathias

    2017-01-01

    Summary Cardiac transthyretin-related (ATTR) amyloidosis is a severe cardiomyopathy for which therapeutic approaches are currently under development. Because non-invasive imaging techniques such as cardiac magnetic resonance imaging and echocardiography are non-specific, the diagnosis of ATTR amyloidosis is still based on myocardial biopsy. Thus, diagnosis of ATTR amyloidosis is difficult in patients refusing myocardial biopsy. Furthermore, myocardial biopsy does not allow 3D-mapping and quantification of myocardial ATTR amyloid. In this report we describe a 99mTc-DPD-based molecular imaging technique for non-invasive single-step diagnosis, three-dimensional mapping and semiquantification of cardiac ATTR amyloidosis in a patient with suspected amyloid heart disease who initially rejected myocardial biopsy. This report underlines the clinical value of SPECT-based nuclear medicine imaging to enable non-invasive diagnosis of cardiac ATTR amyloidosis, particularly in patients rejecting biopsy. PMID:29259858

  12. Raman spectroscopy for DNA quantification in cell nucleus.

    PubMed

    Okotrub, K A; Surovtsev, N V; Semeshin, V F; Omelyanchuk, L V

    2015-01-01

    Here we demonstrate the feasibility of a novel approach to quantify DNA in cell nuclei. This approach is based on spectroscopic analysis of Raman light scattering and avoids the problem of nonstoichiometric binding of dyes to DNA, as it directly measures the signal from DNA. Quantitative analysis of the nuclear DNA contribution to the Raman spectrum could be reliably performed using the intensity of the phosphate mode at 1096 cm(-1). When compared to known DNA standards from cells of different animals, our results matched those values within an error of 10%. We therefore suggest that this approach will be useful to expand the list of DNA standards, to properly adjust the duration of hydrolysis in Feulgen staining, to assay the applicability of fuchsines for DNA quantification, and to measure DNA content in cells with complex hydrolysis patterns, where Feulgen densitometry is inappropriate. © 2014 International Society for Advancement of Cytometry.
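The intensity-ratio logic behind this kind of measurement can be sketched generically: the baseline-corrected intensity of the 1096 cm(-1) phosphate band in a sample is scaled against a standard of known DNA content. A simplified illustration on a synthetic spectrum (hypothetical values; real work requires proper baseline fitting and spectral normalization):

```python
def band_intensity(wavenumbers, intensities, center=1096.0, half_width=10.0):
    """Crudely baseline-corrected mean intensity in a window around the band."""
    window = [i for w, i in zip(wavenumbers, intensities)
              if abs(w - center) <= half_width]
    baseline = min(intensities)  # crude; real work uses polynomial baseline fits
    return sum(window) / len(window) - baseline

def dna_content(sample_band, standard_band, standard_pg):
    """Scale a standard of known DNA content (pg/nucleus) by the band ratio."""
    return standard_pg * sample_band / standard_band
```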

  13. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system’s stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. Finally, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.

  14. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system’s stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. Finally, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.

  15. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    NASA Astrophysics Data System (ADS)

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; Geraci, Gianluca; Eldred, Michael S.; Vane, Zachary P.; Lacaze, Guilhem; Oefelein, Joseph C.; Najm, Habib N.

    2018-03-01

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system's stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. These methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.

  16. Multiplex quantification of protein toxins in human biofluids and food matrices using immunoextraction and high-resolution targeted mass spectrometry.

    PubMed

    Dupré, Mathieu; Gilquin, Benoit; Fenaille, François; Feraudet-Tarisse, Cécile; Dano, Julie; Ferro, Myriam; Simon, Stéphanie; Junot, Christophe; Brun, Virginie; Becher, François

    2015-08-18

    The development of rapid methods for unambiguous identification and precise quantification of protein toxins in various matrices is essential for public health surveillance. Nowadays, analytical strategies classically rely on sensitive immunological assays, but mass spectrometry constitutes an attractive complementary approach thanks to its direct-measurement and protein-characterization capabilities. We developed here an innovative multiplex immuno-LC-MS/MS method for the simultaneous and specific quantification of the three potential biological warfare agents ricin, staphylococcal enterotoxin B, and epsilon toxin in complex human biofluids and food matrices. At least 7 peptides were targeted for each toxin (43 peptides in total) with a quadrupole-Orbitrap high-resolution instrument for exquisite detection specificity. Quantification was performed using stable isotope-labeled toxin standards spiked early into the sample. Lower limits of quantification were determined at or close to 1 ng·mL(-1). The whole process was successfully applied to the quantitative analysis of toxins in complex samples such as milk, human urine, and plasma. Finally, we report new data on toxin stability, with no evidence of toxin degradation in milk over a 48 h time frame, allowing relevant quantitative toxin analysis for samples collected in this time range.

  17. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    DOE PAGES

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; ...

    2018-02-09

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system’s stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. Finally, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.

  18. Detection and Quantification of Human Fecal Pollution with Real-Time PCR

    EPA Science Inventory

    ABSTRACT Assessment of health risk and fecal bacteria loads associated with human fecal pollution requires a reliable host-specific genetic marker and a rapid quantification method. We report the development of quantitative PCR assays for enumeration of two recently described ...

  19. The effect of applied transducer force on acoustic radiation force impulse quantification within the left lobe of the liver.

    PubMed

    Porra, Luke; Swan, Hans; Ho, Chien

    2015-08-01

    Introduction: Acoustic Radiation Force Impulse (ARFI) quantification measures shear wave velocities (SWVs) within the liver. It is a reliable method for predicting the severity of liver fibrosis and has the potential to assess fibrosis in any part of the liver, but previous research has found ARFI quantification in the right lobe more accurate than in the left lobe. A lack of standardised applied transducer force when performing ARFI quantification in the left lobe of the liver may account for some of this inaccuracy. The research hypothesis of the present study predicted that an increase in applied transducer force would result in an increase in measured SWVs. Methods: ARFI quantification within the left lobe of the liver was performed in a group of healthy volunteers (n = 28). During each examination, each participant was subjected to ARFI quantification at six different levels of transducer force applied to the epigastric abdominal wall. Results: A repeated measures ANOVA test showed that ARFI quantification was significantly affected by applied transducer force (p = 0.002). Pairwise comparisons with Bonferroni correction for multiple comparisons showed that SWVs decreased as applied transducer force increased. Conclusion: Applied transducer force has a significant effect on SWVs within the left lobe of the liver, and it may explain some of the less accurate and less reliable results in previous studies where transducer force was not taken into consideration. Future studies in the left lobe of the liver should take this into account and control for applied transducer force.

  20. Accuracy of Rhenium-188 SPECT/CT activity quantification for applications in radionuclide therapy using clinical reconstruction methods.

    PubMed

    Esquinas, Pedro L; Uribe, Carlos F; Gonzalez, M; Rodríguez-Rodríguez, Cristina; Häfeli, Urs O; Celler, Anna

    2017-07-20

    The main applications of 188Re in radionuclide therapies include trans-arterial liver radioembolization and palliation of painful bone metastases. In order to optimize 188Re therapies, the accurate determination of the radiation dose delivered to tumors and organs at risk is required. Single photon emission computed tomography (SPECT) can be used to perform such dosimetry calculations. However, the accuracy of dosimetry estimates strongly depends on the accuracy of activity quantification in 188Re images. In this study, we performed a series of phantom experiments aiming to investigate the accuracy of activity quantification for 188Re SPECT using high-energy and medium-energy collimators. Objects of different shapes and sizes were scanned in air, non-radioactive water (cold water) and water with activity (hot water). The ordered-subset expectation-maximization algorithm with clinically available corrections (CT-based attenuation, triple-energy window (TEW) scatter, and resolution recovery) was used. For high activities, dead-time corrections were applied. The accuracy of activity quantification was evaluated using the ratio of the reconstructed activity in each object to that object's true activity. Each object's activity was determined with three segmentation methods: a 1% fixed threshold (for cold background), a 40% fixed threshold and a CT-based segmentation. Additionally, the activity recovered in the entire phantom, as well as the average activity concentration of the phantom background, were compared to their true values. Finally, Monte Carlo simulations of a commercial γ-camera were performed to investigate the accuracy of the TEW method. Good quantification accuracy (errors <10%) was achieved for the entire phantom, the hot-background activity concentration, and for objects in cold background segmented with a 1% threshold. However, the accuracy of activity quantification for objects segmented with the 40% threshold or CT-based methods decreased (errors >15%), mostly due to partial-volume effects. The Monte Carlo simulations confirmed that the TEW scatter correction applied to 188Re, although practical, yields only approximate estimates of the true scatter.
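The accuracy figure used in phantom studies like this one, reconstructed activity relative to true activity, reduces to a signed percent error per segmentation method. A toy sketch with hypothetical phantom numbers (not the study's data):

```python
def quantification_error(recovered, true_activity):
    """Signed percent error of reconstructed vs. true activity."""
    return 100.0 * (recovered - true_activity) / true_activity

# Hypothetical recovered activities (MBq) for a 100 MBq object,
# one value per segmentation method.
recovered = {"1% threshold": 96.0, "40% threshold": 82.0, "CT-based": 84.0}
errors = {m: quantification_error(a, 100.0) for m, a in recovered.items()}
# |error| < 10% would meet the "good accuracy" bar quoted in the record;
# the larger thresholded errors mimic partial-volume losses.
```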

  1. [Performance evaluation of a fluorescamine-HPLC method for determination of histamine in fish and fish products].

    PubMed

    Kikuchi, Hiroyuki; Tsutsumi, Tomoaki; Matsuda, Rieko

    2012-01-01

    A method for the quantification of histamine in fish and fish products using tandem solid-phase extraction and fluorescence derivatization with fluorescamine was previously developed. In this study, we improved this analytical method to develop an official test method for the quantification of histamine in fish and fish products, and performed a single-laboratory study to validate it. Recovery tests of histamine were carried out on fillet (Thunnus obesus) spiked at 25 and 50 µg/g and on two fish products (fish sauce and salted-and-dried whole big-eye sardine) spiked at 50 and 100 µg/g. The recoveries of histamine from the three samples tested were 88.8-99.6%, with good repeatability (1.3-2.1%) and reproducibility (2.1-4.7%). Therefore, this method is acceptable for the quantification of histamine in fish and fish products. Moreover, surveillance of histamine content in food on the market was conducted using this method, and high levels of histamine were detected in some fish products.
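Recovery and precision figures of the kind reported above are computed directly from replicate measurements of spiked samples. A generic sketch (hypothetical replicate values, not the study's data):

```python
import statistics

def recovery_percent(measured, spiked):
    """Mean measured concentration as a percentage of the spiked level."""
    return 100.0 * statistics.mean(measured) / spiked

def rsd_percent(measured):
    """Relative standard deviation (repeatability) of replicate measurements."""
    return 100.0 * statistics.stdev(measured) / statistics.mean(measured)

# e.g. three replicates of a 25 ug/g histamine spike
replicates = [24.0, 25.0, 26.0]
```

A recovery in the 80-110% range with single-digit RSD is the usual acceptance window for single-laboratory validation of this kind.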

  2. Development and validation of a high-performance liquid chromatography method for the quantification of talazoparib in rat plasma: Application to plasma protein binding studies.

    PubMed

    Hidau, Mahendra Kumar; Kolluru, Srikanth; Palakurthi, Srinath

    2018-02-01

    A sensitive and selective RP-HPLC method has been developed and validated for the quantification of the highly potent poly ADP ribose polymerase inhibitor talazoparib (TZP) in rat plasma. Chromatographic separation was performed with an isocratic elution method. Absorbance of TZP was measured with a UV detector (SPD-20A UV-vis) at a λmax of 227 nm. Protein precipitation was used to extract the drug from plasma samples, using methanol-acetonitrile (65:35) as the precipitating solvent. The method proved to be sensitive and reproducible over a 100-2000 ng/mL linearity range, with a lower limit of quantification (LLOQ) of 100 ng/mL. TZP recovery was found to be >85%. Following analytical method development and validation, the method was successfully employed to determine the plasma protein binding of TZP. TZP is highly protein-bound in rat plasma (95.76 ± 0.38%) as determined by the dialysis method. Copyright © 2017 John Wiley & Sons, Ltd.
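The binding figure reported above follows directly from the total and free drug concentrations measured on either side of the dialysis membrane. A one-line sketch (hypothetical concentrations chosen only to match the reported magnitude):

```python
def fraction_bound_percent(total_conc, free_conc):
    """Percent protein-bound drug from equilibrium dialysis: (total - free) / total."""
    return 100.0 * (total_conc - free_conc) / total_conc
```

For example, a 1000 ng/mL total plasma concentration with 42.4 ng/mL free in the buffer chamber gives 95.76% bound.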

  3. An ultra-high pressure liquid chromatography-tandem mass spectrometry method for the quantification of teicoplanin in plasma of neonates.

    PubMed

    Begou, O; Kontou, A; Raikos, N; Sarafidis, K; Roilides, E; Papadoyannis, I N; Gika, H G

    2017-03-15

    The development and validation of an ultra-high pressure liquid chromatography (UHPLC) tandem mass spectrometry (MS/MS) method was performed with the aim of quantifying plasma teicoplanin concentrations in neonates. Pharmacokinetic data on teicoplanin in the neonatal population are very limited; therefore, a sensitive and reliable method for the determination of all isoforms of teicoplanin in a low sample volume is of real importance. Teicoplanin's main components were extracted by a simple acetonitrile precipitation step and analysed on a C18 chromatographic column by a triple-quadrupole MS with electrospray ionization. The method provides quantitative data over a linear range of 25-6400 ng/mL, with an LOD of 8.5 ng/mL and an LOQ of 25 ng/mL for total teicoplanin. The method was applied to plasma samples from neonates to support pharmacokinetic data and proved to be a reliable and fast method for the quantification of teicoplanin concentration levels in the plasma of infants during therapy in an Intensive Care Unit. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Computer-aided Assessment of Regional Abdominal Fat with Food Residue Removal in CT

    PubMed Central

    Makrogiannis, Sokratis; Caturegli, Giorgio; Davatzikos, Christos; Ferrucci, Luigi

    2014-01-01

    Rationale and Objectives: Separate quantification of abdominal subcutaneous and visceral fat regions is essential to understand the role of regional adiposity as a risk factor in epidemiological studies. Fat quantification is often based on computed tomography (CT) because fat density is distinct from other tissue densities in the abdomen. However, the presence of intestinal food residues with densities similar to fat may reduce fat quantification accuracy. We introduce an abdominal fat quantification method in CT with interest in food residue removal. Materials and Methods: Total fat was identified in the feature space of Hounsfield units and divided into subcutaneous and visceral components using model-based segmentation. Regions of food residues were identified and removed from visceral fat using a machine learning method integrating intensity, texture, and spatial information. Cost-weighting and bagging techniques were investigated to address class imbalance. Results: We validated our automated food residue removal technique against semimanual quantifications. Our feature selection experiments indicated that joint intensity and texture features produce the highest classification accuracy, at 95%. We explored generalization capability using k-fold cross-validation and receiver operating characteristic (ROC) analysis with variable k. Losses in accuracy and area under the ROC curve between maximum and minimum k were limited to 0.1% and 0.3%. We validated tissue segmentation against reference semimanual delineations. The Dice similarity scores were as high as 93.1 for subcutaneous fat and 85.6 for visceral fat. Conclusions: Computer-aided regional abdominal fat quantification is a reliable computational tool for large-scale epidemiological studies. Our proposed intestinal food residue reduction scheme is an original contribution of this work. Validation experiments indicate very good accuracy and generalization capability. PMID:24119354
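The Dice similarity score used for validation above is a standard overlap measure between two binary segmentations. A minimal sketch on flattened binary masks (hypothetical data, not the study's delineations):

```python
def dice_score(mask_a, mask_b):
    """Dice similarity coefficient: 2|A intersect B| / (|A| + |B|)."""
    inter = sum(a and b for a, b in zip(mask_a, mask_b))
    size = sum(mask_a) + sum(mask_b)
    # Two empty masks are conventionally treated as a perfect match.
    return 2.0 * inter / size if size else 1.0
```

A score of 1.0 means the automated and semimanual masks coincide exactly; the 93.1 and 85.6 values above are on the equivalent percentage scale.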

  5. Quantitative multi-color FRET measurements by Fourier lifetime excitation-emission matrix spectroscopy.

    PubMed

    Zhao, Ming; Huang, Run; Peng, Leilei

    2012-11-19

    Förster resonant energy transfer (FRET) is extensively used to probe macromolecular interactions and conformation changes. The established FRET lifetime analysis method measures the FRET process through its effect on the donor lifetime. In this paper we present a method that directly probes the time-resolved FRET signal with frequency domain Fourier lifetime excitation-emission matrix (FLEEM) measurements. FLEEM separates fluorescent signals by their different photon energy pathways from excitation to emission. The FRET process generates a unique signal channel that is initiated by donor excitation but ends with acceptor emission. Time-resolved analysis of the FRET EEM channel allows direct measurements on the FRET process, unaffected by free fluorophores that might be present in the sample. Together with time-resolved analysis on non-FRET channels, i.e. donor and acceptor EEM channels, time-resolved EEM analysis allows precise quantification of FRET in the presence of free fluorophores. The method is extended to three-color FRET processes, where quantification with traditional methods remains challenging because of the significantly increased complexity in the three-way FRET interactions. We demonstrate the time-resolved EEM analysis method with quantification of three-color FRET in incompletely hybridized triple-labeled DNA oligonucleotides. Quantitative measurements of the three-color FRET process in triple-labeled dsDNA are obtained in the presence of free single-labeled ssDNA and double-labeled dsDNA. The results establish a quantification method for studying multi-color FRET between multiple macromolecules in biochemical equilibrium.

  6. Quantitative multi-color FRET measurements by Fourier lifetime excitation-emission matrix spectroscopy

    PubMed Central

    Zhao, Ming; Huang, Run; Peng, Leilei

    2012-01-01

    Förster resonant energy transfer (FRET) is extensively used to probe macromolecular interactions and conformation changes. The established FRET lifetime analysis method measures the FRET process through its effect on the donor lifetime. In this paper we present a method that directly probes the time-resolved FRET signal with frequency domain Fourier lifetime excitation-emission matrix (FLEEM) measurements. FLEEM separates fluorescent signals by their different photon energy pathways from excitation to emission. The FRET process generates a unique signal channel that is initiated by donor excitation but ends with acceptor emission. Time-resolved analysis of the FRET EEM channel allows direct measurements on the FRET process, unaffected by free fluorophores that might be present in the sample. Together with time-resolved analysis on non-FRET channels, i.e. donor and acceptor EEM channels, time-resolved EEM analysis allows precise quantification of FRET in the presence of free fluorophores. The method is extended to three-color FRET processes, where quantification with traditional methods remains challenging because of the significantly increased complexity in the three-way FRET interactions. We demonstrate the time-resolved EEM analysis method with quantification of three-color FRET in incompletely hybridized triple-labeled DNA oligonucleotides. Quantitative measurements of the three-color FRET process in triple-labeled dsDNA are obtained in the presence of free single-labeled ssDNA and double-labeled dsDNA. The results establish a quantification method for studying multi-color FRET between multiple macromolecules in biochemical equilibrium. PMID:23187535

  7. Active Interrogation using Photofission Technique for Nuclear Materials Control and Accountability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Haori

    2016-03-31

    Innovative systems with increased sensitivity and resolution are in great demand to detect diversion and to prevent misuse in support of nuclear materials management for the U.S. fuel cycle. Nuclear fission is the most important multiplicative process involved in non-destructive active interrogation. This process produces the most easily recognizable signature for nuclear materials. In addition to thermal or high-energy neutrons, high-energy gamma rays can also excite a nucleus and cause fission through a process known as photofission. Electron linear accelerators (linacs) are widely used as the interrogating photon sources for inspection methods involving the photofission technique. After photofission reactions, prompt signals are much stronger than the delayed signals, but they are difficult to quantify in practical measurements. Delayed signals are easily distinguishable from the interrogating radiation. Linac-based, advanced inspection techniques utilizing the delayed signals after photofission have been extensively studied for homeland security applications. Previous research also showed that a unique delayed gamma-ray energy spectrum exists for each fissionable isotope. In this work, high-energy delayed γ-rays were demonstrated to be signatures for detection, identification, and quantification of special nuclear materials. Such γ-rays were measured in between linac pulses using independent data acquisition systems. A list-mode system was developed to measure low-energy delayed γ-rays after irradiation. Photofission product yields of 238U and 239Pu were determined based on the measured delayed γ-ray spectra. The differential yields of delayed γ-rays were also proven to be able to discriminate nuclear from non-nuclear materials. The measurement outcomes were compared with Monte Carlo simulation results. It was demonstrated that the currently available codes have capabilities and limitations in the simulation of the photofission process. A two-fold approach was used to address the high-rate challenge in used nuclear fuel assay based on the photofission technique. First, a standard HPGe preamplifier was modified to improve its capabilities in a high-rate pulsed photofission environment. Second, advanced pulse-processing algorithms were shown to greatly improve throughput rate without a large sacrifice in energy resolution at ultra-high input count rates. Two customized gamma spectroscopy systems were also developed in real time on FPGAs. They were shown to have promising performance matching available commercial units.

  8. European multicentre database of healthy controls for [123I]FP-CIT SPECT (ENC-DAT): age-related effects, gender differences and evaluation of different methods of analysis.

    PubMed

    Varrone, Andrea; Dickson, John C; Tossici-Bolt, Livia; Sera, Terez; Asenbaum, Susanne; Booij, Jan; Kapucu, Ozlem L; Kluge, Andreas; Knudsen, Gitte M; Koulibaly, Pierre Malick; Nobili, Flavio; Pagani, Marco; Sabri, Osama; Vander Borght, Thierry; Van Laere, Koen; Tatsch, Klaus

    2013-01-01

    Dopamine transporter (DAT) imaging with [(123)I]FP-CIT (DaTSCAN) is an established diagnostic tool in parkinsonism and dementia. Although qualitative assessment criteria are available, DAT quantification is important for research and for completion of a diagnostic evaluation. One critical aspect of quantification is the availability of normative data, considering possible age and gender effects on DAT availability. The aim of the European Normal Control Database of DaTSCAN (ENC-DAT) study was to generate a large database of [(123)I]FP-CIT SPECT scans in healthy controls. SPECT data from 139 healthy controls (74 men, 65 women; age range 20-83 years, mean 53 years) acquired in 13 different centres were included. Images were reconstructed using the ordered-subset expectation-maximization algorithm without correction (NOACSC), with attenuation correction (AC), and with both attenuation and scatter correction using the triple-energy window method (ACSC). Region-of-interest analysis was performed using the BRASS software (caudate and putamen), and the Southampton method (striatum). The outcome measure was the specific binding ratio (SBR). A significant effect of age on SBR was found for all data. Gender had a significant effect on SBR in the caudate and putamen for the NOACSC and AC data, and only in the left caudate for the ACSC data (BRASS method). Significant effects of age and gender on striatal SBR were observed for all data analysed with the Southampton method. Overall, there was a significant age-related decline in SBR of between 4% and 6.7% per decade. This study provides a large database of [(123)I]FP-CIT SPECT scans in healthy controls across a wide age range and with balanced gender representation. Higher DAT availability was found in women than in men. An average age-related decline in DAT availability of 5.5% per decade was found for both genders, in agreement with previous reports. The data collected in this study may serve as a reference database for nuclear medicine centres and for clinical trials using [(123)I]FP-CIT SPECT as the imaging marker.
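The outcome measure above, the specific binding ratio, and the per-decade decline can be sketched generically. A minimal illustration with hypothetical count densities and a synthetic age trend (not the ENC-DAT data):

```python
def specific_binding_ratio(striatal_counts, reference_counts):
    """SBR = (target - nondisplaceable reference) / reference."""
    return (striatal_counts - reference_counts) / reference_counts

def percent_decline_per_decade(ages, sbrs):
    """Linear fit of SBR vs. age; slope expressed as % of mean SBR per 10 years."""
    n = len(ages)
    ma, ms = sum(ages) / n, sum(sbrs) / n
    slope = (sum((a - ma) * (s - ms) for a, s in zip(ages, sbrs))
             / sum((a - ma) ** 2 for a in ages))
    return -100.0 * slope * 10.0 / ms
```

On a synthetic cohort whose SBR falls linearly with age, the function recovers the built-in decline rate, which is how figures like "5.5% per decade" are usually expressed.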

  9. Optimization and Verification of Droplet Digital PCR Event-Specific Methods for the Quantification of GM Maize DAS1507 and NK603.

    PubMed

    Grelewska-Nowotko, Katarzyna; Żurawska-Zajfert, Magdalena; Żmijewska, Ewelina; Sowa, Sławomir

    2018-05-01

    In recent years, digital polymerase chain reaction (dPCR), a new molecular biology technique, has been gaining in popularity. Among many other applications, this technique can also be used for the detection and quantification of genetically modified organisms (GMOs) in food and feed. It might replace the currently widely used real-time PCR method (qPCR) by overcoming problems related to PCR inhibition and the requirement for certified reference materials to be used as calibrants. In theory, validated qPCR methods can be easily transferred to the dPCR platform; however, optimization of the PCR conditions might be necessary. In this study, we report the transfer of two validated qPCR methods for quantification of maize DAS1507 and NK603 events to the droplet dPCR (ddPCR) platform. After some optimization, both methods were verified according to the guidance of the European Network of GMO Laboratories (ENGL) on analytical method verification (ENGL working group on "Method Verification" (2011), Verification of Analytical Methods for GMO Testing When Implementing Interlaboratory Validated Methods). The digital PCR methods performed equally well as, or better than, the qPCR methods. The optimized ddPCR methods confirm their suitability for GMO determination in food and feed.

  10. Evaluation of a Rapid One-step Real-time PCR Method as a High-throughput Screening for Quantification of Hepatitis B Virus DNA in a Resource-limited Setting.

    PubMed

    Rashed-Ul Islam, S M; Jahan, Munira; Tabassum, Shahina

    2015-01-01

    Virological monitoring is the best predictor for the management of chronic hepatitis B virus (HBV) infections. Consequently, it is important to use the most efficient, rapid and cost-effective testing systems for HBV DNA quantification. The present study compared the performance characteristics of a one-step HBV polymerase chain reaction (PCR) method vs the two-step HBV PCR method for quantification of HBV DNA from clinical samples. A total of 100 samples, consisting of 85 randomly selected samples from patients with chronic hepatitis B (CHB) and 15 samples from apparently healthy individuals, were enrolled in this study. Of the 85 CHB clinical samples tested, HBV DNA was detected in 81% of samples by the one-step PCR method, with a median HBV DNA viral load (VL) of 7.50 × 10^3 IU/ml. In contrast, 72% of samples were detected by the two-step PCR system, with a median HBV DNA of 3.71 × 10^3 IU/ml. The one-step method showed a strong linear correlation with the two-step PCR method (r = 0.89; p < 0.0001). Both methods showed good agreement on a Bland-Altman plot, with a mean difference of 0.61 log10 IU/ml and limits of agreement of -1.82 to 3.03 log10 IU/ml. The intra-assay and inter-assay coefficients of variation (CV%) of plasma samples (4-7 log10 IU/ml) for the one-step PCR method ranged from 0.33 to 0.59 and from 0.28 to 0.48, respectively, demonstrating a high level of concordance between the two methods. Moreover, elimination of the DNA extraction step in the one-step PCR kit allowed time-efficient and significant labor and cost savings for the quantification of HBV DNA in a resource-limited setting. Rashed-Ul Islam SM, Jahan M, Tabassum S. Evaluation of a Rapid One-step Real-time PCR Method as a High-throughput Screening for Quantification of Hepatitis B Virus DNA in a Resource-limited Setting. Euroasian J Hepato-Gastroenterol 2015;5(1):11-15.
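The agreement statistics quoted above, the mean difference with 95% limits of agreement on log10 viral loads and the assay CV%, follow the standard Bland-Altman and coefficient-of-variation constructions. A minimal sketch (hypothetical paired log10 values, not the study's data):

```python
import math
import statistics

def bland_altman(log_vl_a, log_vl_b):
    """Mean paired difference and 95% limits of agreement (mean +/- 1.96 SD)."""
    diffs = [a - b for a, b in zip(log_vl_a, log_vl_b)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    sd = math.sqrt(sum((d - mean_d) ** 2 for d in diffs) / (n - 1))
    return mean_d, (mean_d - 1.96 * sd, mean_d + 1.96 * sd)

def cv_percent(values):
    """Coefficient of variation, as used for intra-/inter-assay precision."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)
```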

  11. Evaluation of a Rapid One-step Real-time PCR Method as a High-throughput Screening for Quantification of Hepatitis B Virus DNA in a Resource-limited Setting

    PubMed Central

    Jahan, Munira; Tabassum, Shahina

    2015-01-01

Virological monitoring is the best predictor for the management of chronic hepatitis B virus (HBV) infections. Consequently, it is important to use the most efficient, rapid and cost-effective testing systems for HBV DNA quantification. The present study compared the performance characteristics of a one-step HBV polymerase chain reaction (PCR) vs the two-step HBV PCR method for quantification of HBV DNA from clinical samples. A total of 100 samples, consisting of 85 randomly selected samples from patients with chronic hepatitis B (CHB) and 15 samples from apparently healthy individuals, were enrolled in this study. Of the 85 CHB clinical samples tested, HBV DNA was detected in 81% of samples by the one-step PCR method, with a median HBV DNA viral load (VL) of 7.50 × 10³ IU/ml. In contrast, 72% of samples were detected by the two-step PCR system, with a median HBV DNA of 3.71 × 10³ IU/ml. The one-step method showed strong linear correlation with the two-step PCR method (r = 0.89; p < 0.0001). Both methods showed good agreement on the Bland-Altman plot, with a mean difference of 0.61 log10 IU/ml and limits of agreement of -1.82 to 3.03 log10 IU/ml. The intra-assay and interassay coefficients of variation (CV%) of plasma samples (4-7 log10 IU/ml) for the one-step PCR method ranged from 0.33 to 0.59 and from 0.28 to 0.48, respectively, demonstrating a high level of concordance between the two methods. Moreover, elimination of the DNA extraction step in the one-step PCR kit allowed time-efficient and significant labor and cost savings for the quantification of HBV DNA in a resource-limited setting. How to cite this article: Rashed-Ul Islam SM, Jahan M, Tabassum S. Evaluation of a Rapid One-step Real-time PCR Method as a High-throughput Screening for Quantification of Hepatitis B Virus DNA in a Resource-limited Setting. Euroasian J Hepato-Gastroenterol 2015;5(1):11-15. PMID:29201678

  12. DETECTION AND QUANTIFICATION OF COW FECAL POLLUTION WITH REAL-TIME PCR

    EPA Science Inventory

    Assessment of health risk and fecal bacteria loads associated with cow fecal pollution requires a reliable host-specific genetic marker and a rapid quantification method. We report the development of quantitative PCR assays for enumeration of two recently described cow-specific g...

  13. Volatile Organic Silicon Compounds in Biogases: Development of Sampling and Analytical Methods for Total Silicon Quantification by ICP-OES

    PubMed Central

    Julien, Jennifer; Dumont, Nathalie; Lebouil, David; Germain, Patrick

    2014-01-01

Current waste management policies favor the valorization of biogases (digester gases (DGs) and landfill gases (LFGs)) as they become part of energy policy. However, volatile organic silicon compounds (VOSiCs) contained in DGs/LFGs severely damage combustion engines and endanger their conversion into electricity by power plants, so a high level of purification is required. Assessing treatment efficiency is still difficult. No consensus has been reached on a standardized method for sampling and quantification of VOSiCs in gases, because of their diversity, their physicochemical properties, and the omnipresence of silicon in analytical chains. Usually, sampling is done by adsorption or absorption, and quantification by gas chromatography-mass spectrometry (GC-MS) or inductively coupled plasma-optical emission spectrometry (ICP-OES). With this objective, this paper presents and discusses the optimization of a patented method consisting of VOSiC sampling by absorption in 100% ethanol and quantification of total Si by ICP-OES. PMID:25379538

  14. Round robin test on quantification of amyloid-β 1-42 in cerebrospinal fluid by mass spectrometry.

    PubMed

    Pannee, Josef; Gobom, Johan; Shaw, Leslie M; Korecka, Magdalena; Chambers, Erin E; Lame, Mary; Jenkins, Rand; Mylott, William; Carrillo, Maria C; Zegers, Ingrid; Zetterberg, Henrik; Blennow, Kaj; Portelius, Erik

    2016-01-01

Cerebrospinal fluid (CSF) amyloid-β 1-42 (Aβ42) is an important biomarker for Alzheimer's disease, both in diagnostics and to monitor disease-modifying therapies. However, there is a great need for standardization of methods used for quantification. To overcome problems associated with immunoassays, liquid chromatography-tandem mass spectrometry (LC-MS/MS) has emerged as a critical orthogonal alternative. We compared results for CSF Aβ42 quantification in a round robin study performed in four laboratories using similar sample preparation methods and LC-MS instrumentation. The LC-MS results showed excellent correlation between laboratories (r² > 0.98), high analytical precision, and good correlation with enzyme-linked immunosorbent assay (r² > 0.85). The use of a common reference sample further decreased interlaboratory variation. Our results indicate that LC-MS is suitable for absolute quantification of Aβ42 in CSF and highlight the importance of developing a certified reference material. Copyright © 2016 The Alzheimer's Association. Published by Elsevier Inc. All rights reserved.

  15. Optimized methods for total nucleic acid extraction and quantification of the bat white-nose syndrome fungus, Pseudogymnoascus destructans, from swab and environmental samples.

    PubMed

    Verant, Michelle L; Bohuski, Elizabeth A; Lorch, Jeffery M; Blehert, David S

    2016-03-01

    The continued spread of white-nose syndrome and its impacts on hibernating bat populations across North America has prompted nationwide surveillance efforts and the need for high-throughput, noninvasive diagnostic tools. Quantitative real-time polymerase chain reaction (qPCR) analysis has been increasingly used for detection of the causative fungus, Pseudogymnoascus destructans, in both bat- and environment-associated samples and provides a tool for quantification of fungal DNA useful for research and monitoring purposes. However, precise quantification of nucleic acid from P. destructans is dependent on effective and standardized methods for extracting nucleic acid from various relevant sample types. We describe optimized methodologies for extracting fungal nucleic acids from sediment, guano, and swab-based samples using commercial kits together with a combination of chemical, enzymatic, and mechanical modifications. Additionally, we define modifications to a previously published intergenic spacer-based qPCR test for P. destructans to refine quantification capabilities of this assay. © 2016 The Author(s).
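
    The qPCR quantification described above ultimately rests on a standard curve of quantification cycle (Cq) versus log10 copy number; a generic sketch with hypothetical values (slope, intercept, and function names are illustrative, not taken from this assay):

    ```python
    import numpy as np

    # Hypothetical dilution series: log10 copies vs. observed Cq.
    log_copies = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    cq = -3.32 * log_copies + 38.0       # idealized, perfectly linear response

    slope, intercept = np.polyfit(log_copies, cq, 1)

    # Amplification efficiency from the slope; a slope of -3.32
    # corresponds to roughly 100% efficiency (doubling per cycle).
    efficiency = 10.0 ** (-1.0 / slope) - 1.0

    def quantify(cq_observed):
        """Back-calculate copy number from an observed Cq."""
        return 10.0 ** ((cq_observed - intercept) / slope)
    ```

    In practice the curve is fitted from serial dilutions of a quantified standard, and unknowns are read back off the fitted line.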

  16. A multicenter study benchmarks software tools for label-free proteome quantification.

    PubMed

    Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan

    2016-11-01

    Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.

  17. Volatile organic silicon compounds in biogases: development of sampling and analytical methods for total silicon quantification by ICP-OES.

    PubMed

    Chottier, Claire; Chatain, Vincent; Julien, Jennifer; Dumont, Nathalie; Lebouil, David; Germain, Patrick

    2014-01-01

Current waste management policies favor the valorization of biogases (digester gases (DGs) and landfill gases (LFGs)) as they become part of energy policy. However, volatile organic silicon compounds (VOSiCs) contained in DGs/LFGs severely damage combustion engines and endanger their conversion into electricity by power plants, so a high level of purification is required. Assessing treatment efficiency is still difficult. No consensus has been reached on a standardized method for sampling and quantification of VOSiCs in gases, because of their diversity, their physicochemical properties, and the omnipresence of silicon in analytical chains. Usually, sampling is done by adsorption or absorption, and quantification by gas chromatography-mass spectrometry (GC-MS) or inductively coupled plasma-optical emission spectrometry (ICP-OES). With this objective, this paper presents and discusses the optimization of a patented method consisting of VOSiC sampling by absorption in 100% ethanol and quantification of total Si by ICP-OES.

  18. Optimized methods for total nucleic acid extraction and quantification of the bat white-nose syndrome fungus, Pseudogymnoascus destructans, from swab and environmental samples

    USGS Publications Warehouse

    Verant, Michelle; Bohuski, Elizabeth A.; Lorch, Jeffrey M.; Blehert, David

    2016-01-01

The continued spread of white-nose syndrome and its impacts on hibernating bat populations across North America has prompted nationwide surveillance efforts and the need for high-throughput, noninvasive diagnostic tools. Quantitative real-time polymerase chain reaction (qPCR) analysis has been increasingly used for detection of the causative fungus, Pseudogymnoascus destructans, in both bat- and environment-associated samples and provides a tool for quantification of fungal DNA useful for research and monitoring purposes. However, precise quantification of nucleic acid from P. destructans is dependent on effective and standardized methods for extracting nucleic acid from various relevant sample types. We describe optimized methodologies for extracting fungal nucleic acids from sediment, guano, and swab-based samples using commercial kits together with a combination of chemical, enzymatic, and mechanical modifications. Additionally, we define modifications to a previously published intergenic spacer–based qPCR test for P. destructans to refine quantification capabilities of this assay.

  19. Quaternary ammonium isobaric tag for a relative and absolute quantification of peptides.

    PubMed

    Setner, Bartosz; Stefanowicz, Piotr; Szewczuk, Zbigniew

    2018-02-01

Isobaric labeling quantification of peptides has become a method of choice for mass spectrometry-based proteomics studies. However, despite the wide variety of commercially available isobaric tags, none of the currently available methods offers a significant improvement in detection sensitivity during the MS experiment. Recently, many strategies have been applied to increase the ionization efficiency of peptides, involving chemical modifications that introduce a quaternary ammonium fixed charge. Here, we present a novel quaternary ammonium-based isobaric tag for relative and absolute quantification of peptides (QAS-iTRAQ 2-plex). Upon collisional activation, a new stable benzylic-type cationic reporter ion is liberated from the tag. Deuterium atoms were used to offset the differential masses of the reporter group. We tested the applicability of the QAS-iTRAQ 2-plex reagent on a series of model peptides as well as a bovine serum albumin tryptic digest. The obtained results suggest the usefulness of this isobaric ionization tag for relative and absolute quantification of peptides. Copyright © 2017 John Wiley & Sons, Ltd.

  20. Chiral discrimination of sibutramine enantiomers by capillary electrophoresis and proton nuclear magnetic resonance spectroscopy.

    PubMed

    Lee, Yong-Jae; Choi, Seungho; Lee, Jinhoo; Nguyen, NgocVan Thi; Lee, Kyungran; Kang, Jong Seong; Mar, Woongchon; Kim, Kyeong Ho

    2012-03-01

Capillary electrophoresis (CE) and proton nuclear magnetic resonance spectroscopy (¹H-NMR) have been used to discriminate the enantiomers of sibutramine using cyclodextrin derivatives. A possible correlation between CE and ¹H-NMR was examined, and good correlation between the ¹H-NMR shift non-equivalence data for sibutramine and the degree of enantioseparation in CE was observed. In the CE study, a method for enantiomeric separation and quantitation of sibutramine was developed using enantiomeric standards. The method was based on the use of 50 mM phosphate buffer at pH 3.0 with 10 mM methyl-beta-cyclodextrin (M-β-CD). An LOD of 0.05% and an LOQ of 0.2% for the S-sibutramine enantiomer were achieved, and the method was validated and applied to the quantitative determination of sibutramine enantiomers in commercial drugs. In the 600 MHz ¹H-NMR analysis, enantiomer signal separation of sibutramine was obtained by fast diastereomeric interaction with the chiral selector M-β-CD. For chiral separation and quantification, the N-methyl proton peaks (at 2.18 ppm) were selected because they are singlets and simple to interpret for the diastereomeric interaction. The effects of temperature and chiral selector concentration on enantiomer signal separation were investigated. The optimum condition was 0.5 mg/mL of sibutramine and 10 mg/mL of M-β-CD at 10°C. Detection of 0.5% of S-sibutramine in R-sibutramine was found to be possible by ¹H-NMR with M-β-CD as chiral selector. Host-guest interaction between sibutramine and M-β-CD was confirmed by the ¹H-NMR and CE studies. A structure of the inclusion complex was proposed based on the ¹H-NMR and 2D ROESY studies.

  1. NMR high-resolution magic angle spinning rotor design for quantification of metabolic concentrations

    NASA Astrophysics Data System (ADS)

    Holly, R.; Damyanovich, A.; Peemoeller, H.

    2006-05-01

    A new high-resolution magic angle spinning nuclear magnetic resonance technique is presented to obtain absolute metabolite concentrations of solutions. The magnetic resonance spectrum of the sample under investigation and an internal reference are acquired simultaneously, ensuring both spectra are obtained under the same experimental conditions. The robustness of the technique is demonstrated using a solution of creatine, and it is shown that the technique can obtain solution concentrations to within 7% or better.
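
    Internal-reference quantification of this kind amounts to a peak-area ratio scaled by the number of contributing protons; a minimal sketch (function and argument names are illustrative, not from the paper):

    ```python
    def metabolite_conc(area_sample, area_ref, conc_ref, n_h_sample, n_h_ref):
        """Absolute concentration from a spectrum containing an internal
        reference: peak area scales with concentration times the number
        of protons contributing to the resonance."""
        return conc_ref * (area_sample / area_ref) * (n_h_ref / n_h_sample)
    ```

    For example, a 3-proton methyl peak integrating to twice the area of a 1-proton reference peak at 10 mM corresponds to 10 × 2 × (1/3) ≈ 6.7 mM. Acquiring sample and reference in the same spectrum, as the technique above does, keeps this ratio free of run-to-run instrumental variation.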

  2. Phylogenetic Quantification of Intra-tumour Heterogeneity

    PubMed Central

    Schwarz, Roland F.; Trinh, Anne; Sipos, Botond; Brenton, James D.; Goldman, Nick; Markowetz, Florian

    2014-01-01

    Intra-tumour genetic heterogeneity is the result of ongoing evolutionary change within each cancer. The expansion of genetically distinct sub-clonal populations may explain the emergence of drug resistance, and if so, would have prognostic and predictive utility. However, methods for objectively quantifying tumour heterogeneity have been missing and are particularly difficult to establish in cancers where predominant copy number variation prevents accurate phylogenetic reconstruction owing to horizontal dependencies caused by long and cascading genomic rearrangements. To address these challenges, we present MEDICC, a method for phylogenetic reconstruction and heterogeneity quantification based on a Minimum Event Distance for Intra-tumour Copy-number Comparisons. Using a transducer-based pairwise comparison function, we determine optimal phasing of major and minor alleles, as well as evolutionary distances between samples, and are able to reconstruct ancestral genomes. Rigorous simulations and an extensive clinical study show the power of our method, which outperforms state-of-the-art competitors in reconstruction accuracy, and additionally allows unbiased numerical quantification of tumour heterogeneity. Accurate quantification and evolutionary inference are essential to understand the functional consequences of tumour heterogeneity. The MEDICC algorithms are independent of the experimental techniques used and are applicable to both next-generation sequencing and array CGH data. PMID:24743184

  3. dPCR: A Technology Review

    PubMed Central

    Quan, Phenix-Lan; Sauzade, Martin

    2018-01-01

    Digital Polymerase Chain Reaction (dPCR) is a novel method for the absolute quantification of target nucleic acids. Quantification by dPCR hinges on the fact that the random distribution of molecules in many partitions follows a Poisson distribution. Each partition acts as an individual PCR microreactor and partitions containing amplified target sequences are detected by fluorescence. The proportion of PCR-positive partitions suffices to determine the concentration of the target sequence without a need for calibration. Advances in microfluidics enabled the current revolution of digital quantification by providing efficient partitioning methods. In this review, we compare the fundamental concepts behind the quantification of nucleic acids by dPCR and quantitative real-time PCR (qPCR). We detail the underlying statistics of dPCR and explain how it defines its precision and performance metrics. We review the different microfluidic digital PCR formats, present their underlying physical principles, and analyze the technological evolution of dPCR platforms. We present the novel multiplexing strategies enabled by dPCR and examine how isothermal amplification could be an alternative to PCR in digital assays. Finally, we determine whether the theoretical advantages of dPCR over qPCR hold true by perusing studies that directly compare assays implemented with both methods. PMID:29677144
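
    The Poisson correction described above can be written down directly; a minimal sketch (names and the example partition volume are illustrative):

    ```python
    import math

    def dpcr_concentration(n_positive, n_total, partition_volume_nl):
        """Estimate target concentration (copies/µL) from dPCR counts.

        With random partitioning, partition occupancy is Poisson-distributed,
        so the mean number of copies per partition is lambda = -ln(1 - p),
        where p is the fraction of PCR-positive partitions.
        """
        p = n_positive / n_total
        lam = -math.log(1.0 - p)                   # mean copies per partition
        return lam / (partition_volume_nl * 1e-3)  # nL -> µL
    ```

    For example, 5,000 positives among 20,000 partitions of 0.85 nL give about 338 copies/µL. The estimate diverges as p approaches 1 (all partitions positive), which is what bounds the upper end of the dynamic range.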

  4. In vivo quantitative evaluation of vascular parameters for angiogenesis based on sparse principal component analysis and aggregated boosted trees

    NASA Astrophysics Data System (ADS)

    Zhao, Fengjun; Liu, Junting; Qu, Xiaochao; Xu, Xianhui; Chen, Xueli; Yang, Xiang; Cao, Feng; Liang, Jimin; Tian, Jie

    2014-12-01

To solve the multicollinearity issue and the unequal contribution of vascular parameters to the quantification of angiogenesis, we developed a quantitative evaluation method of vascular parameters for angiogenesis based on in vivo micro-CT imaging of hindlimb ischemia model mice. Taking vascular volume as the ground truth parameter, nine vascular parameters were first assembled into sparse principal components (PCs) to reduce the multicollinearity issue. Aggregated boosted trees (ABTs) were then employed to analyze the importance of the vascular parameters for the quantification of angiogenesis via the loadings of the sparse PCs. The results demonstrated that vascular volume was mainly characterized by vascular area, vascular junction, connectivity density, segment number and vascular length, indicating that these were the key vascular parameters for the quantification of angiogenesis. The proposed quantitative evaluation method was compared with both the ABTs applied directly to the nine vascular parameters and Pearson correlation, and the results were consistent. In contrast to the ABTs applied directly to the vascular parameters, the proposed method can select all the key vascular parameters simultaneously, because all the key vascular parameters were assembled into the sparse PCs with the highest relative importance.

  5. A Cyber-Attack Detection Model Based on Multivariate Analyses

    NASA Astrophysics Data System (ADS)

    Sakai, Yuto; Rinsaka, Koichiro; Dohi, Tadashi

In the present paper, we propose a novel cyber-attack detection model that applies two multivariate-analysis methods to the audit data observed on a host machine. The statistical techniques used here are the well-known Hayashi's quantification method IV and the cluster analysis method. We quantify the observed qualitative audit event sequence via quantification method IV, and group similar audit event sequences together based on the cluster analysis. It is shown in simulation experiments that our model can improve cyber-attack detection accuracy in some realistic cases where both normal and attack activities are intermingled.

  6. Methods for detection of GMOs in food and feed.

    PubMed

    Marmiroli, Nelson; Maestri, Elena; Gullì, Mariolina; Malcevschi, Alessio; Peano, Clelia; Bordoni, Roberta; De Bellis, Gianluca

    2008-10-01

    This paper reviews aspects relevant to detection and quantification of genetically modified (GM) material within the feed/food chain. The GM crop regulatory framework at the international level is evaluated with reference to traceability and labelling. Current analytical methods for the detection, identification, and quantification of transgenic DNA in food and feed are reviewed. These methods include quantitative real-time PCR, multiplex PCR, and multiplex real-time PCR. Particular attention is paid to methods able to identify multiple GM events in a single reaction and to the development of microdevices and microsensors, though they have not been fully validated for application.

  7. A Demonstration of Concrete Structural Health Monitoring Framework for Degradation due to Alkali-Silica Reaction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mahadevan, Sankaran; Agarwal, Vivek; Neal, Kyle

Assessment and management of aging concrete structures in nuclear power plants require a more systematic approach than simple reliance on existing code margins of safety. Structural health monitoring of concrete structures aims to understand the current health condition of a structure based on heterogeneous measurements to produce high-confidence actionable information regarding structural integrity that supports operational and maintenance decisions. This ongoing research project seeks to develop a probabilistic framework for health diagnosis and prognosis of aging concrete structures in a nuclear power plant that are subjected to physical, chemical, environmental, and mechanical degradation. The proposed framework consists of four elements: monitoring, data analytics, uncertainty quantification, and prognosis. This report focuses on degradation caused by alkali-silica reaction (ASR). Controlled specimens were prepared to develop accelerated ASR degradation. Different monitoring techniques (thermography, digital image correlation (DIC), mechanical deformation measurements, nonlinear impact resonance acoustic spectroscopy (NIRAS), and vibro-acoustic modulation (VAM)) were used to detect the damage caused by ASR. Heterogeneous data from the multiple techniques were used for damage diagnosis and prognosis, and for quantification of the associated uncertainty, using a Bayesian network approach. Additionally, the MapReduce technique has been demonstrated with synthetic data; this technique can be used in the future to handle the large amounts of observation data obtained from online monitoring of realistic structures.

  8. Fourier Transform Infrared Spectroscopy and Multivariate Analysis for Online Monitoring of Dibutyl Phosphate Degradation Product in Tributyl Phosphate/n-Dodecane/Nitric Acid Solvent

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tatiana G. Levitskaia; James M. Peterson; Emily L. Campbell

    2013-12-01

In liquid-liquid extraction separation processes, accumulation of organic solvent degradation products is detrimental to process robustness, and frequent solvent analysis is warranted. Our research explores the feasibility of online monitoring of the organic solvents relevant to used nuclear fuel reprocessing. This paper describes the first phase of developing a system for monitoring the tributyl phosphate (TBP)/n-dodecane solvent commonly used to separate used nuclear fuel. In this investigation, the effect of extraction of nitric acid from aqueous solutions of variable concentrations on the quantification of TBP and its major degradation product, dibutyl phosphoric acid (HDBP), was assessed. Fourier transform infrared (FTIR) spectroscopy was used to discriminate between HDBP and TBP in the nitric acid-containing TBP/n-dodecane solvent. Multivariate analysis of the spectral data facilitated the development of regression models for HDBP and TBP quantification in real time, enabling online implementation of the monitoring system. The predictive regression models were validated using TBP/n-dodecane solvent samples subjected to high-dose external γ-irradiation. The predictive models were translated to flow conditions using a hollow fiber FTIR probe installed in a centrifugal contactor extraction apparatus, demonstrating the applicability of the FTIR technique coupled with multivariate analysis for the online monitoring of organic solvent degradation products.
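
    The abstract does not specify which multivariate algorithm was used; as an illustration of the general idea, here is a classical least-squares (Beer-Lambert) calibration sketch with synthetic spectra (all names and data are hypothetical, and the authors' actual regression models may differ):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    n_channels = 200                     # synthetic "wavenumber" grid
    pure_tbp = rng.random(n_channels)    # stand-in pure-component spectra
    pure_hdbp = rng.random(n_channels)
    K = np.column_stack([pure_tbp, pure_hdbp])   # channels x components

    true_conc = np.array([0.4, 0.1])     # [TBP, HDBP], arbitrary units
    mixture = K @ true_conc              # linear (Beer-Lambert) mixing

    # Recover component concentrations from the mixture spectrum
    # by linear least squares against the pure-component spectra.
    est, *_ = np.linalg.lstsq(K, mixture, rcond=None)
    ```

    Chemometric methods such as PLS generalize this idea to the case where pure-component spectra are unknown and the calibration is learned from mixture spectra of known composition.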

  9. Fourier Transform Infrared Spectroscopy and Multivariate Analysis for Online Monitoring of Dibutyl Phosphate Degradation Product in Tributyl Phosphate /n-Dodecane/Nitric Acid Solvent

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Levitskaia, Tatiana G.; Peterson, James M.; Campbell, Emily L.

    2013-11-05

In liquid-liquid extraction separation processes, accumulation of organic solvent degradation products is detrimental to process robustness, and frequent solvent analysis is warranted. Our research explores the feasibility of online monitoring of the organic solvents relevant to used nuclear fuel reprocessing. This paper describes the first phase of developing a system for monitoring the tributyl phosphate (TBP)/n-dodecane solvent commonly used to separate used nuclear fuel. In this investigation, the effect of extraction of nitric acid from aqueous solutions of variable concentrations on the quantification of TBP and its major degradation product, dibutyl phosphoric acid (HDBP), was assessed. Fourier transform infrared (FTIR) spectroscopy was used to discriminate between HDBP and TBP in the nitric acid-containing TBP/n-dodecane solvent. Multivariate analysis of the spectral data facilitated the development of regression models for HDBP and TBP quantification in real time, enabling online implementation of the monitoring system. The predictive regression models were validated using TBP/n-dodecane solvent samples subjected to high-dose external gamma irradiation. The predictive models were translated to flow conditions using a hollow fiber FTIR probe installed in a centrifugal contactor extraction apparatus, demonstrating the applicability of the FTIR technique coupled with multivariate analysis for the online monitoring of organic solvent degradation products.

  10. Quantitative Component Analysis of Solid Mixtures by Analyzing Time Domain 1H and 19F T1 Saturation Recovery Curves (qSRC).

    PubMed

    Stueber, Dirk; Jehle, Stefan

    2017-07-01

Prevalent polymorphism and complicated phase behavior of active pharmaceutical ingredients (APIs) often result in remarkable differences in the respective biochemical and physical API properties. Consequently, API form characterization and quantification play a central role in the pharmaceutical industry, from early drug development to manufacturing. Here we present a novel and proficient quantification protocol for solid mixtures (qSRC) based on the measurement and mathematical fitting of T1 nuclear magnetic resonance (NMR) saturation recovery curves collected on a benchtop time-domain NMR instrument. The saturation recovery curves of the relevant pure components are used as fingerprints. Employing a benchtop NMR instrument has clear benefits: these instruments have a small footprint, place no special requirements on lab space, and the required sample handling is simple and fast. The qSRC analysis can easily be conducted in a conventional laboratory setting as well as in an industrial production environment, making it a versatile tool with the potential for widespread application. The accuracy and efficiency of the qSRC method are illustrated using ¹H and ¹⁹F T1 data of selected pharmaceutical model compounds, as well as ¹H T1 data of an actual binary API anhydrous polymorph system of a Merck & Co., Inc. compound formerly developed as a hepatitis C virus drug. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
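
    The fingerprint-fitting idea behind a protocol like qSRC can be sketched as a least-squares decomposition of a mixture's saturation recovery curve, M(t) = M0(1 - exp(-t/T1)), into pure-component curves; all T1 values and weights below are invented for illustration and are not the paper's data:

    ```python
    import numpy as np

    t = np.linspace(0.05, 10.0, 50)   # recovery delays in seconds

    def sat_rec(t, t1):
        # Normalized saturation-recovery magnetization: M(t) = 1 - exp(-t/T1).
        return 1.0 - np.exp(-t / t1)

    # Pure-component "fingerprint" curves for two hypothetical solid forms
    # with distinct T1 relaxation times.
    A = np.column_stack([sat_rec(t, 0.8), sat_rec(t, 3.0)])

    mixture = 0.3 * A[:, 0] + 0.7 * A[:, 1]   # synthetic 30:70 mixture

    # Component fractions recovered by linear least squares.
    w, *_ = np.linalg.lstsq(A, mixture, rcond=None)
    ```

    Because the mixture's recovery curve is a weight-proportional sum of the component curves, the fitted weights are the component fractions, provided the components' T1 values differ enough for the curves to be distinguishable.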

  11. Total synthesis of isotopically enriched Si-29 silica NPs as potential spikes for isotope dilution quantification of natural silica NPs.

    PubMed

    Pálmai, Marcell; Szalay, Roland; Bartczak, Dorota; Varga, Zoltán; Nagy, Lívia Naszályi; Gollwitzer, Christian; Krumrey, Michael; Goenaga-Infante, Heidi

    2015-05-01

A new method was developed for the preparation of highly monodisperse isotopically enriched silicon-29 silica nanoparticles (²⁹Si-silica NPs), for use as spikes in isotope dilution mass spectrometry (IDMS) quantification of silica NPs with natural isotopic distribution. The silica precursor, ²⁹Si tetraethyl orthosilicate (²⁹Si-TEOS), was prepared in two steps starting from elemental silicon-29 pellets. In the first step, ²⁹Si silicon tetrachloride (²⁹SiCl4) was prepared by heating elemental silicon-29 in a chlorine gas stream. By using a multistep cooling system and diluting the volatile and moisture-sensitive ²⁹SiCl4 in carbon tetrachloride as an inert medium, we managed to reduce product loss caused by evaporation. ²⁹Si-TEOS was obtained by treating ²⁹SiCl4 with absolute ethanol. Structural characterisation of ²⁹Si-TEOS was performed using ¹H and ¹³C nuclear magnetic resonance (NMR) spectroscopy and Fourier transform infrared (FTIR) spectroscopy. For the NP preparation, a basic amino acid catalysis route was used, and the resulting NPs were analysed using transmission electron microscopy (TEM), small-angle X-ray scattering (SAXS), dynamic light scattering (DLS) and zeta potential measurements. Finally, the feasibility of using the enriched NPs for on-line field-flow fractionation coupled with multi-angle light scattering and inductively coupled plasma mass spectrometry (FFF/MALS/ICP-MS) has been demonstrated. Copyright © 2015 Elsevier Inc. All rights reserved.

  12. Development and validation of high-performance liquid chromatography and high-performance thin-layer chromatography methods for the quantification of khellin in Ammi visnaga seed

    PubMed Central

    Kamal, Abid; Khan, Washim; Ahmad, Sayeed; Ahmad, F. J.; Saleem, Kishwar

    2015-01-01

Objective: The present study was designed to develop simple, accurate and sensitive reversed-phase high-performance liquid chromatography (RP-HPLC) and high-performance thin-layer chromatography (HPTLC) methods for the quantification of khellin present in the seeds of Ammi visnaga. Materials and Methods: RP-HPLC analysis was performed on a C18 column with methanol:water (75:25, v/v) as the mobile phase. The HPTLC method involved densitometric evaluation of khellin after resolving it on a silica gel plate using ethyl acetate:toluene:formic acid (5.5:4.0:0.5, v/v/v) as the mobile phase. Results: The developed HPLC and HPTLC methods were validated for precision (interday, intraday and intersystem), robustness, accuracy, limit of detection and limit of quantification. The relationship between the concentration of standard solutions and the peak response was linear in both the HPLC and HPTLC methods, with concentration ranges of 10–80 μg/mL in HPLC and 25–1,000 ng/spot in HPTLC for khellin. The % relative standard deviation values for method precision were found to be 0.63–1.97% in HPLC and 0.62–2.05% in HPTLC for khellin. Accuracy of the methods was checked by recovery studies conducted at three different concentration levels, and the average percentage recovery was found to be 100.53% in HPLC and 100.08% in HPTLC for khellin. Conclusions: The developed HPLC and HPTLC methods for the quantification of khellin were found to be simple, precise, specific, sensitive and accurate, and can be used for routine analysis and quality control of A. visnaga and of formulations containing it as an ingredient. PMID:26681890
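
    The linearity and recovery figures reported above come from a standard calibration workflow: fit a line to standards, back-calculate spiked samples, and express recovery as a percentage. A minimal sketch with fabricated response values (slope, intercept, and spike level are illustrative, not the paper's data):

    ```python
    import numpy as np

    # Hypothetical khellin standards (µg/mL) and an idealized linear response.
    conc = np.array([10.0, 20.0, 40.0, 60.0, 80.0])
    area = 1.5 * conc + 0.2

    slope, intercept = np.polyfit(conc, area, 1)   # calibration line

    def back_calc(peak_area):
        """Concentration read back off the calibration line."""
        return (peak_area - intercept) / slope

    # Recovery study: a sample spiked at 50 µg/mL, measured, back-calculated.
    spiked_area = 1.5 * 50.0 + 0.2
    recovery_pct = 100.0 * back_calc(spiked_area) / 50.0
    ```

    With real, noisy responses the recovery deviates from 100%, and the spread of replicate back-calculated values gives the %RSD precision figures quoted above.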

  13. PCR technology for screening and quantification of genetically modified organisms (GMOs).

    PubMed

    Holst-Jensen, Arne; Rønning, Sissel B; Løvseth, Astrid; Berdal, Knut G

    2003-04-01

    Although PCR technology has obvious limitations, the potentially high degree of sensitivity and specificity explains why it has been the first choice of most analytical laboratories interested in detection of genetically modified (GM) organisms (GMOs) and derived materials. Because the products that laboratories receive for analysis are often processed and refined, the quality and quantity of target analyte (e.g. protein or DNA) frequently challenges the sensitivity of any detection method. Among the currently available methods, PCR methods are generally accepted as the most sensitive and reliable methods for detection of GM-derived material in routine applications. The choice of target sequence motif is the single most important factor controlling the specificity of the PCR method. The target sequence is normally a part of the modified gene construct, for example a promoter, a terminator, a gene, or a junction between two of these elements. However, the elements may originate from wildtype organisms, they may be present in more than one GMO, and their copy number may also vary from one GMO to another. They may even be combined in a similar way in more than one GMO. Thus, the choice of method should fit the purpose. Recent developments include event-specific methods, particularly useful for identification and quantification of GM content. Thresholds for labelling are now in place in many countries including those in the European Union. The success of the labelling schemes is dependent upon the efficiency with which GM-derived material can be detected. We will present an overview of currently available PCR methods for screening and quantification of GM-derived DNA, and discuss their applicability and limitations. In addition, we will discuss some of the major challenges related to determination of the limits of detection (LOD) and quantification (LOQ), and to validation of methods.
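
    The event-specific quantification mentioned above is commonly done by relative real-time PCR: copy numbers of the GM event target and of a taxon-specific reference gene are each read off a standard curve, and GM content is reported as their ratio. A hedged sketch under that assumption; the curve parameters and Ct values are illustrative only:

```python
# Hedged sketch of relative GMO quantification by real-time PCR: copy numbers
# for an event-specific target and a taxon-specific reference gene are read
# off a standard curve (Ct vs. log10 copies), and GM content is expressed as
# their ratio. The curve parameters and Ct values are illustrative only;
# a slope of -3.32 corresponds to 100% amplification efficiency.

def copies_from_ct(ct, slope=-3.32, intercept=38.0):
    """Invert a standard curve Ct = slope * log10(copies) + intercept."""
    return 10.0 ** ((ct - intercept) / slope)

def gm_percent(ct_event, ct_reference):
    return 100.0 * copies_from_ct(ct_event) / copies_from_ct(ct_reference)

# An event target amplifying ~5 cycles later than the reference gene implies
# roughly 2**5 = 32-fold fewer copies, i.e. about 3% GM material:
print(round(gm_percent(ct_event=30.0, ct_reference=25.0), 1))  # -> 3.1
```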

  14. Extraction Methodological Contributions Toward Ultra-Performance Liquid Chromatography-Time-of-Flight Mass Spectrometry: Quantification of Free GB from Various Food Matrices

    DTIC Science & Technology

    2016-02-01

    ECBC-TR-1351, by Sue Y. Bae and Mark D. Winemiller, Research and Technology Directorate. The report covers the quantification of free methylphosphonofluoridate (sarin, GB) in various food matrices, including the development of a solid-phase extraction method using a normal-phase silica gel column.

  15. Deuterium depth profile quantification in a ASDEX Upgrade divertor tile using secondary ion mass spectrometry

    NASA Astrophysics Data System (ADS)

    Ghezzi, F.; Caniello, R.; Giubertoni, D.; Bersani, M.; Hakola, A.; Mayer, M.; Rohde, V.; Anderle, M.; ASDEX Upgrade Team

    2014-10-01

    We present the results of a study in which secondary ion mass spectrometry (SIMS) has been used to obtain depth profiles of deuterium concentration on plasma-facing components of the first wall of the ASDEX Upgrade tokamak. The method uses primary and secondary standards to quantify the amount of deuterium retained. Samples of bulk graphite coated with tungsten or tantalum-doped tungsten are independently profiled with three different SIMS instruments, and their deuterium concentration profiles are compared, showing good agreement. To assess the validity of the method, the integrated deuterium concentrations in the coatings given by one of the SIMS devices are compared with nuclear reaction analysis (NRA) data. Although for tungsten the agreement between NRA and SIMS is satisfactory, for the tantalum-doped tungsten samples the discrepancy is significant because of a matrix effect induced by tantalum and the differently eroded surfaces (W + Ta always exposed to plasma, W largely shadowed). A further comparison, in which the SIMS deuterium concentration is obtained by calibrating the measurements against NRA values, is also presented. For the tungsten samples, where no Ta-induced matrix effects are present, the two methods are almost equivalent. The results presented show the potential of the method, provided that the standards used for the calibration faithfully reproduce the matrix nature of the samples.
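
    The NRA cross-calibration described above can be pictured as scaling the raw SIMS count profile by a single factor so that its depth integral matches the independently measured areal density. A minimal sketch under that assumption; the depths, counts, and NRA value are all invented:

```python
# Minimal sketch of the NRA cross-calibration described above: a raw SIMS
# count profile is scaled by one factor chosen so that its depth-integrated
# deuterium amount matches the areal density measured by NRA.
# Depths, counts, and the NRA value are invented for illustration.

def calibrate_profile(depths, counts, nra_areal_density):
    """Scale counts so their trapezoidal integral over depth equals the NRA value."""
    integral = sum(0.5 * (counts[i] + counts[i + 1]) * (depths[i + 1] - depths[i])
                   for i in range(len(depths) - 1))
    factor = nra_areal_density / integral
    return [c * factor for c in counts]

depths = [0.0, 100.0, 200.0, 300.0]    # depth points, nm (hypothetical)
raw    = [400.0, 300.0, 200.0, 100.0]  # raw deuterium signal, counts/s (hypothetical)
profile = calibrate_profile(depths, raw, nra_areal_density=1.0e17)
```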

  16. Nuclear Energy Knowledge and Validation Center (NEKVaC) Needs Workshop Summary Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gougar, Hans

    2015-02-01

    The Department of Energy (DOE) has made significant progress in developing simulation tools that predict the behavior of nuclear systems with greater accuracy and in increasing our capability to predict the behavior of these systems outside of the standard range of applications. These analytical tools require a more complex array of validation tests to accurately simulate the physics and multiple length and time scales. Results from modern simulations will allow experiment designers to narrow the range of conditions needed to bound system behavior and to optimize the deployment of instrumentation to limit the breadth and cost of the campaign. Modern validation, verification and uncertainty quantification (VVUQ) techniques enable analysts to extract information from experiments in a systematic manner and provide the users with a quantified uncertainty estimate. Unfortunately, the capability to perform experiments that would enable taking full advantage of the formalisms of these modern codes has progressed relatively little (with some notable exceptions in fuels and thermal-hydraulics); the majority of the experimental data available today is the "historic" data accumulated over the last decades of nuclear systems R&D. A validated code-model is a tool for users. An unvalidated code-model is useful for code developers to gain understanding, publish research results, attract funding, etc. As nuclear analysis codes have become more sophisticated, so have the measurement and validation methods and the challenges that confront them. A successful yet cost-effective validation effort requires expertise possessed only by a few, resources possessed only by the well-capitalized (or a willing collective), and a clear, well-defined objective (validating a code that is developed to satisfy the need(s) of an actual user).
To that end, the Idaho National Laboratory established the Nuclear Energy Knowledge and Validation Center (NEKVaC, or the 'Center') to address the challenges of modern code validation and to manage the knowledge from past, current, and future experimental campaigns. By pulling together the best minds involved in code development, experiment design, and validation to establish and disseminate best practices and new techniques, the Center will be a resource for industry, DOE programs, and academic validation efforts.

  17. Establishing a reliable multiple reaction monitoring-based method for the quantification of obesity-associated comorbidities in serum and adipose tissue requires intensive clinical validation.

    PubMed

    Oberbach, Andreas; Schlichting, Nadine; Neuhaus, Jochen; Kullnick, Yvonne; Lehmann, Stefanie; Heinrich, Marco; Dietrich, Arne; Mohr, Friedrich Wilhelm; von Bergen, Martin; Baumann, Sven

    2014-12-05

    Multiple reaction monitoring (MRM)-based mass spectrometric quantification of peptides and their corresponding proteins has been successfully applied for biomarker validation in serum. The option of multiplexing offers the chance to analyze various proteins in parallel, which is especially important in obesity research, where biomarkers that reflect multiple comorbidities and allow monitoring of therapy outcomes are required. Besides the suitability of established MRM assays for serum protein quantification, analysis of the tissues secreting the markers of interest is also feasible. Surprisingly, studies comparing MRM data sets with established methods are rare, and therefore the biological and clinical value of most analytes remains questionable. An MRM method using nano-UPLC-MS/MS was established for the quantification of obesity-related surrogate markers for several comorbidities in serum, plasma, and visceral and subcutaneous adipose tissue. Proteotypic peptides for complement C3, adiponectin, angiotensinogen, and plasma retinol binding protein (RBP4) were quantified using isotope dilution analysis and compared to the standard ELISA method. MRM method variabilities were mainly below 10%. The comparison with other MS-based approaches showed a good correlation. However, large differences in absolute quantification were obtained for complement C3 and adiponectin compared with ELISA, while less marked differences were observed for angiotensinogen and RBP4. The verification of MRM in obesity was performed, first, to discriminate the lean and obese phenotypes and, second, to monitor excessive weight loss after gastric bypass surgery in a seven-month follow-up. The presented MRM assay was able to discriminate the obese phenotype from the lean and to monitor weight-loss-related changes of the surrogate markers. However, inclusion of additional biomarkers was necessary to interpret the MRM data on obesity phenotype properly.
In summary, the development of disease-related MRMs should include a step of matching the MRM data with clinically approved standard methods and defining reference values in well-sized representative age, gender, and disease-matched cohorts.

  18. Quantitative Analysis of Staphylococcal Enterotoxins A and B in Food Matrices Using Ultra High-Performance Liquid Chromatography Tandem Mass Spectrometry (UPLC-MS/MS).

    PubMed

    Muratovic, Aida Zuberovic; Hagström, Thomas; Rosén, Johan; Granelli, Kristina; Hellenäs, Karl-Erik

    2015-09-11

    A method that uses mass spectrometry (MS) for identification and quantification of the protein toxins staphylococcal enterotoxins A and B (SEA and SEB) in milk and shrimp is described. The analysis was performed using a tryptic peptide from each of the toxins as the target analyte, together with the corresponding (13)C-labeled synthetic internal standard peptide. The performance of the method was evaluated by analyzing spiked samples in the quantification range 2.5-30 ng/g (R² = 0.92-0.99). The limit of quantification (LOQ) in milk and the limit of detection (LOD) in shrimp were 2.5 ng/g for both the SEA and SEB toxins. The in-house reproducibility (RSD) was 8%-30% and 5%-41% at different concentrations for milk and shrimp, respectively. The method was compared to the ELISA method used at the EU-RL (France) for milk samples spiked with SEA at low levels, in the quantification range of 2.5 to 5 ng/g. The comparison showed good agreement between the two methods: 2.9 (MS)/1.8 (ELISA) and 3.6 (MS)/3.8 (ELISA) ng/g. The major advantage of the developed method is that it allows direct confirmation of the molecular identity and quantitative analysis of SEA and SEB at low-nanogram levels using a label- and antibody-free approach. This method is therefore an important step in the development of alternatives to the immunoassay tests currently used for staphylococcal enterotoxin analysis.
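
    Quantification against a labeled internal standard, as used above, reduces to simple arithmetic on the analyte/IS peak-area ratio. A hedged sketch; the area values, spiked IS level, and response-factor correction are invented for illustration:

```python
# Hedged sketch of quantification against a (13)C-labelled internal standard
# peptide: the analyte concentration follows from the analyte/IS peak-area
# ratio and the known spiked IS level, corrected by a response factor from
# matrix-matched calibration. All numbers below are invented.

def quantify_ng_g(area_analyte, area_is, spiked_is_ng_g, response_factor=1.0):
    """Concentration = (analyte/IS area ratio) * spiked IS level / RF."""
    return (area_analyte / area_is) * spiked_is_ng_g / response_factor

# A sample giving an analyte/IS area ratio of 0.9 against 5 ng/g spiked IS:
print(quantify_ng_g(area_analyte=9000.0, area_is=10000.0, spiked_is_ng_g=5.0))  # -> 4.5
```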

  19. Collagen Quantification in Tissue Specimens.

    PubMed

    Coentro, João Quintas; Capella-Monsonís, Héctor; Graceffa, Valeria; Wu, Zhuning; Mullen, Anne Maria; Raghunath, Michael; Zeugolis, Dimitrios I

    2017-01-01

    Collagen is the major extracellular protein in mammals. Accurate quantification of collagen is essential in the biomaterials (e.g., reproducible collagen scaffold fabrication), drug discovery (e.g., assessment of collagen in pathophysiologies, such as fibrosis), and tissue engineering (e.g., quantification of cell-synthesized collagen) fields. Although measuring hydroxyproline content is the most widely used method to quantify collagen in biological specimens, the process is very laborious. To this end, the Sircol™ Collagen Assay is widely used due to its inherent simplicity and convenience. However, this method leads to overestimation of collagen content due to the interaction of Sirius red with basic amino acids of non-collagenous proteins. Herein, we describe the addition of an ultrafiltration purification step in the process to accurately determine collagen content in tissues.

  20. A quantitative witness for Greenberger-Horne-Zeilinger entanglement.

    PubMed

    Eltschka, Christopher; Siewert, Jens

    2012-01-01

    Along with the vast progress in experimental quantum technologies there is an increasing demand for the quantification of entanglement between three or more quantum systems. Theory still does not provide adequate tools for this purpose. The objective is, besides the quest for exact results, to develop operational methods that allow for efficient entanglement quantification. Here we put forward an analytical approach that serves both these goals. We provide a simple procedure to quantify Greenberger-Horne-Zeilinger-type multipartite entanglement in arbitrary three-qubit states. For two qubits this method is equivalent to Wootters' seminal result for the concurrence. It establishes a close link between entanglement quantification and entanglement detection by witnesses, and can be generalised both to higher dimensions and to more than three parties.

  1. A quantitative witness for Greenberger-Horne-Zeilinger entanglement

    PubMed Central

    Eltschka, Christopher; Siewert, Jens

    2012-01-01

    Along with the vast progress in experimental quantum technologies there is an increasing demand for the quantification of entanglement between three or more quantum systems. Theory still does not provide adequate tools for this purpose. The objective is, besides the quest for exact results, to develop operational methods that allow for efficient entanglement quantification. Here we put forward an analytical approach that serves both these goals. We provide a simple procedure to quantify Greenberger-Horne-Zeilinger–type multipartite entanglement in arbitrary three-qubit states. For two qubits this method is equivalent to Wootters' seminal result for the concurrence. It establishes a close link between entanglement quantification and entanglement detection by witnesses, and can be generalised both to higher dimensions and to more than three parties. PMID:23267431

  2. Accurate quantification of fluorescent targets within turbid media based on a decoupled fluorescence Monte Carlo model.

    PubMed

    Deng, Yong; Luo, Zhaoyang; Jiang, Xu; Xie, Wenhao; Luo, Qingming

    2015-07-01

    We propose a method based on a decoupled fluorescence Monte Carlo model for constructing fluorescence Jacobians to enable accurate quantification of fluorescence targets within turbid media. The effectiveness of the proposed method is validated using two cylindrical phantoms enclosing fluorescent targets within homogeneous and heterogeneous background media. The results demonstrate that our method can recover relative concentrations of the fluorescent targets with higher accuracy than the perturbation fluorescence Monte Carlo method. This suggests that our method is suitable for quantitative fluorescence diffuse optical tomography, especially for in vivo imaging of fluorophore targets for diagnosis of different diseases and abnormalities.

  3. An UPLC-MS/MS method for separation and accurate quantification of tamoxifen and its metabolites isomers.

    PubMed

    Arellano, Cécile; Allal, Ben; Goubaa, Anwar; Roché, Henri; Chatelut, Etienne

    2014-11-01

    A selective and accurate analytical method is needed to quantify tamoxifen and its phase I metabolites in a prospective clinical protocol, for evaluation of the pharmacokinetic parameters of tamoxifen and its metabolites in adjuvant treatment of breast cancer. The selectivity of the analytical method is a fundamental criterion, allowing the main active metabolite (Z)-isomers to be quantified separately from the (Z)'-isomers. A UPLC-MS/MS method was developed and validated for the quantification of (Z)-tamoxifen, (Z)-endoxifen, (E)-endoxifen, Z'-endoxifen, (Z)'-endoxifen, (Z)-4-hydroxytamoxifen, (Z)-4'-hydroxytamoxifen, N-desmethyl tamoxifen, and tamoxifen-N-oxide. The validation range was set between 0.5 ng/mL and 125 ng/mL for the 4-hydroxytamoxifen and endoxifen isomers, and between 12.5 ng/mL and 300 ng/mL for tamoxifen, N-desmethyl tamoxifen and tamoxifen-N-oxide. The method was applied to patient plasma samples. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. Multiple products monitoring as a robust approach for peptide quantification.

    PubMed

    Baek, Je-Hyun; Kim, Hokeun; Shin, Byunghee; Yu, Myeong-Hee

    2009-07-01

    Quantification of target peptides and proteins is crucial for biomarker discovery. Approaches such as selected reaction monitoring (SRM) and multiple reaction monitoring (MRM) rely on liquid chromatography and mass spectrometric analysis of defined peptide product ions. These methods are not very widespread because determining quantifiable product ions using either SRM or MRM is a very time-consuming process. We developed a novel approach for quantifying target peptides without such an arduous process of ion selection. This method is based on monitoring multiple product ions (multiple products monitoring: MpM) from full-range MS2 spectra of a target precursor. The MpM method uses a scoring system that considers both the absolute intensities of product ions and the similarity between the query MS2 spectrum and the reference MS2 spectrum of the target peptide. Compared with conventional approaches, MpM greatly improves the sensitivity and selectivity of peptide quantification using an ion-trap mass spectrometer.
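
    The two ingredients of the MpM score named above, absolute product-ion intensities and query-versus-reference spectral similarity, can be illustrated as follows. Cosine similarity is used here only as a stand-in for the paper's similarity measure, and the toy spectra are invented:

```python
# Illustrative sketch: score a query MS2 spectrum against a reference MS2
# spectrum using cosine (normalized dot-product) similarity, then combine it
# with the summed product-ion intensity. Spectra are toy m/z -> intensity maps.
from math import sqrt

def cosine_similarity(query, reference):
    """Normalized dot product over the union of fragment m/z keys."""
    keys = set(query) | set(reference)
    dot = sum(query.get(k, 0.0) * reference.get(k, 0.0) for k in keys)
    nq = sqrt(sum(v * v for v in query.values()))
    nr = sqrt(sum(v * v for v in reference.values()))
    return dot / (nq * nr) if nq and nr else 0.0

reference = {204.1: 100.0, 347.2: 60.0, 476.3: 35.0}  # library MS2 spectrum (toy)
query     = {204.1:  95.0, 347.2: 55.0, 476.3: 40.0}  # observed MS2 spectrum (toy)

similarity = cosine_similarity(query, reference)
# One plausible composite score: summed intensity weighted by similarity.
mpm_score = similarity * sum(query.values())
print(round(similarity, 3))  # -> 0.998
```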

  5. Evaluation of the potential use of hybrid LC-MS/MS for active drug quantification applying the 'free analyte QC concept'.

    PubMed

    Jordan, Gregor; Onami, Ichio; Heinrich, Julia; Staack, Roland F

    2017-11-01

    Assessment of active drug exposure of biologics may be crucial for drug development. Typically, ligand-binding assay methods are used to provide free/active drug concentrations. To what extent hybrid LC-MS/MS procedures enable correct 'active' drug quantification is currently under consideration. Experimental & results: The relevance of appropriate extraction conditions was evaluated by a hybrid target-capture immuno-affinity LC-MS/MS method using total and free/active quality controls (QCs). The rapid extraction (10 min) provided correct results, whereas overnight incubation resulted in significant overestimation of the free/active drug (monoclonal antibody) concentration. Conventional total QCs were inappropriate for determining optimal method conditions, in contrast to free/active QCs. The 'free/active analyte QC concept' enables development of appropriate extraction conditions for correct active drug quantification by hybrid LC-MS/MS.

  6. A Facile and Sensitive Method for Quantification of Cyclic Nucleotide Monophosphates in Mammalian Organs: Basal Levels of Eight cNMPs and Identification of 2',3'-cIMP

    PubMed Central

    Jia, Xin; Fontaine, Benjamin M.; Strobel, Fred; Weinert, Emily E.

    2014-01-01

    A sensitive, versatile and economical method to extract and quantify cyclic nucleotide monophosphates (cNMPs) using LC-MS/MS, including both 3',5'-cNMPs and 2',3'-cNMPs, in mammalian tissues and cellular systems has been developed. Problems, such as matrix effects from complex biological samples, are addressed and have been optimized. This protocol allows for comparison of multiple cNMPs in the same system and was used to examine the relationship between tissue levels of cNMPs in a panel of rat organs. In addition, the study reports the first identification and quantification of 2',3'-cIMP. The developed method will allow for quantification of cNMPs levels in cells and tissues with varying disease states, which will provide insight into the role(s) and interplay of cNMP signalling pathways. PMID:25513747

  7. Quantification of cardiolipin by liquid chromatography-electrospray ionization mass spectrometry.

    PubMed

    Garrett, Teresa A; Kordestani, Reza; Raetz, Christian R H

    2007-01-01

    Cardiolipin (CL), a tetra-acylated glycerophospholipid composed of two phosphatidyl moieties linked by a bridging glycerol, plays an important role in mitochondrial function in eukaryotic cells. Alterations to the content and acylation state of CL cause mitochondrial dysfunction and may be associated with pathologies such as ischemia, hypothyroidism, aging, and heart failure. The structure of CL is very complex because of microheterogeneity among its four acyl chains. Here we have developed a method for the quantification of CL molecular species by liquid chromatography-electrospray ionization mass spectrometry. We quantify the [M-2H](2-) ion of a CL of a given molecular formula and identify CLs by the total number of carbons and unsaturations in their acyl chains. This method, developed using mouse macrophage RAW 264.7 tumor cells, is broadly applicable to other cell lines, tissues, bacteria and yeast. Furthermore, it could be used for the quantification of lyso-CLs and bis-lyso-CLs.

  8. A facile and sensitive method for quantification of cyclic nucleotide monophosphates in mammalian organs: basal levels of eight cNMPs and identification of 2',3'-cIMP.

    PubMed

    Jia, Xin; Fontaine, Benjamin M; Strobel, Fred; Weinert, Emily E

    2014-12-12

    A sensitive, versatile and economical method to extract and quantify cyclic nucleotide monophosphates (cNMPs) using LC-MS/MS, including both 3',5'-cNMPs and 2',3'-cNMPs, in mammalian tissues and cellular systems has been developed. Problems, such as matrix effects from complex biological samples, are addressed and have been optimized. This protocol allows for comparison of multiple cNMPs in the same system and was used to examine the relationship between tissue levels of cNMPs in a panel of rat organs. In addition, the study reports the first identification and quantification of 2',3'-cIMP. The developed method will allow for quantification of cNMPs levels in cells and tissues with varying disease states, which will provide insight into the role(s) and interplay of cNMP signalling pathways.

  9. Rapid capillary electrophoresis approach for the quantification of ewe milk adulteration with cow milk.

    PubMed

    Trimboli, Francesca; Morittu, Valeria Maria; Cicino, Caterina; Palmieri, Camillo; Britti, Domenico

    2017-10-13

    The substitution of ewe milk with more economical cow milk is a common fraud. Here we present a capillary electrophoresis (CE) method for the quantification of ewe milk in ovine/bovine milk mixtures, which allows rapid and inexpensive recognition of ewe milk adulteration with cow milk. We utilized a routine CE method for human blood and urine protein analysis, which achieved the separation of skimmed milk proteins in alkaline buffer. Under this condition, ovine and bovine milk exhibited recognizable and distinct CE protein profiles, with a specific ewe peak showing a reproducible migration zone in ovine/bovine mixtures. Based on this ewe-specific CE peak, we developed a method for ewe milk quantification in ovine/bovine skimmed milk mixtures, which showed good linearity, precision and accuracy, and a minimum detectable amount of fraudulent cow milk equal to 5%. Copyright © 2017 Elsevier B.V. All rights reserved.
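
    Quantification from a calibration curve, as implied above, amounts to regressing the ewe-specific peak area against the known ewe-milk fraction of prepared mixtures and inverting the fitted line for an unknown sample. A minimal sketch; the calibration points are invented for illustration:

```python
# Sketch of calibration-curve quantification: fit peak area vs. known ewe-milk
# fraction by ordinary least squares, then invert the line for an unknown.
# The calibration points below are hypothetical, not data from the study.

def fit_line(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

ewe_fraction = [0.0, 25.0, 50.0, 75.0, 100.0]  # % ewe milk in calibration mixtures
peak_area    = [0.5, 13.0, 25.5, 38.0, 50.5]   # hypothetical ewe-peak areas

slope, intercept = fit_line(ewe_fraction, peak_area)
ewe_percent = (20.0 - intercept) / slope  # invert the line for an unknown area of 20.0
print(round(ewe_percent, 1))              # -> 39.0 (% ewe milk; cow milk = 100 - this)
```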

  10. Multicenter evaluation of stress-first myocardial perfusion image triage by nuclear technologists and automated quantification.

    PubMed

    Chaudhry, Waseem; Hussain, Nasir; Ahlberg, Alan W; Croft, Lori B; Fernandez, Antonio B; Parker, Mathew W; Swales, Heather H; Slomka, Piotr J; Henzlova, Milena J; Duvall, W Lane

    2017-06-01

    A stress-first myocardial perfusion imaging (MPI) protocol saves time, is cost effective, and decreases radiation exposure. A limitation of this protocol is the requirement for physician review of the stress images to determine the need for rest images. This hurdle could be eliminated if an experienced technologist and/or automated computer quantification could make this determination. Images from consecutive patients who were undergoing a stress-first MPI with attenuation correction at two tertiary care medical centers were prospectively reviewed independently by a technologist and cardiologist blinded to clinical and stress test data. Their decision on the need for rest imaging along with automated computer quantification of perfusion results was compared with the clinical reference standard of an assessment of perfusion images by a board-certified nuclear cardiologist that included clinical and stress test data. A total of 250 patients (mean age 61 years and 55% female) who underwent a stress-first MPI were studied. According to the clinical reference standard, 42 (16.8%) and 208 (83.2%) stress-first images were interpreted as "needing" and "not needing" rest images, respectively. The technologists correctly classified 229 (91.6%) stress-first images as either "needing" (n = 28) or "not needing" (n = 201) rest images. Their sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) were 66.7%, 96.6%, 80.0%, and 93.5%, respectively. An automated stress TPD score ≥1.2 was associated with optimal sensitivity and specificity and correctly classified 179 (71.6%) stress-first images as either "needing" (n = 31) or "not needing" (n = 148) rest images. Its sensitivity, specificity, PPV, and NPV were 73.8%, 71.2%, 34.1%, and 93.1%, respectively. In a model whereby the computer or technologist could correct for the other's incorrect classification, 242 (96.8%) stress-first images were correctly classified. 
The composite sensitivity, specificity, PPV, and NPV were 83.3%, 99.5%, 97.2%, and 96.7%, respectively. Technologists and automated quantification software had a high degree of agreement with the clinical reference standard for determining the need for rest images in a stress-first imaging protocol. Utilizing an experienced technologist and automated systems to screen stress-first images could expand the use of stress-first MPI to sites where the cardiologist is not immediately available for interpretation.
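
    The technologists' diagnostic statistics quoted above follow directly from their 2x2 confusion matrix against the clinical reference standard: of 42 studies needing rest images, 28 were correctly flagged, and of 208 not needing them, 201 were correctly passed. Recomputing reproduces the reported values:

```python
# Sensitivity, specificity, PPV and NPV from the 2x2 confusion matrix implied
# by the record above (tp = correctly flagged "needing", tn = correctly
# passed "not needing"; fn and fp are the remainders of each group).

def diagnostics(tp, fn, tn, fp):
    """Standard diagnostic test statistics, in percent."""
    return {
        "sensitivity": 100.0 * tp / (tp + fn),
        "specificity": 100.0 * tn / (tn + fp),
        "ppv":         100.0 * tp / (tp + fp),
        "npv":         100.0 * tn / (tn + fn),
    }

stats = diagnostics(tp=28, fn=42 - 28, tn=201, fp=208 - 201)
print({k: round(v, 1) for k, v in stats.items()})
# -> {'sensitivity': 66.7, 'specificity': 96.6, 'ppv': 80.0, 'npv': 93.5}
```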

  11. An automated multidimensional preparative gas chromatographic system for isolation and enrichment of trace amounts of xenon from ambient air.

    PubMed

    Larson, Tuula; Östman, Conny; Colmsjö, Anders

    2011-04-01

    The monitoring of radioactive xenon isotopes is one of the principal methods for the detection of nuclear explosions in order to identify clandestine nuclear testing. In this work, a miniaturized, multiple-oven, six-column, preparative gas chromatograph was constructed to isolate trace quantities of radioactive xenon isotopes from ambient air, utilizing nitrogen as the carrier gas. The multidimensional chromatograph comprised preparative stainless steel columns packed with molecular sieves, activated carbon, and synthetic carbon adsorbents (e.g., Anasorb®-747 and Carbosphere®). A combination of purification techniques (ambient adsorption, thermal desorption, back-flushing, thermal focusing, and heart cutting) was selectively optimized to produce a well-defined xenon peak that facilitated reproducible heart cutting and accurate quantification. The chromatographic purification of a sample requires approximately 4 h and provides complete separation of xenon from potentially interfering components (such as water vapor, methane, carbon dioxide, and radon) with recovery and accuracy close to 100%. The preparative enrichment process isolates and concentrates a highly purified xenon gas fraction that is suitable for subsequent ultra-low-level γ- or β/γ-spectroscopic or high-resolution mass spectrometric measurement (e.g., to monitor the gaseous fission products of nuclear explosions at remote locations). The Xenon Processing Unit is a free-standing, relatively lightweight, and transportable system that can be interfaced to a variety of sampling and detection systems. It has a relatively inexpensive, rugged, and compact modular (19-inch rack) design that provides easy access to all parts for maintenance and has a low power requirement.

  12. Development and validation of stability indicating HPLC methods for related substances and assay analyses of amoxicillin and potassium clavulanate mixtures.

    PubMed

    Bellur Atici, Esen; Yazar, Yücel; Ağtaş, Çağan; Ridvanoğlu, Nurten; Karlığa, Bekir

    2017-03-20

    Antibacterial combinations consisting of the semisynthetic antibiotic amoxicillin (amox) and the β-lactamase inhibitor potassium clavulanate (clav) are commonly used, and several chromatographic methods have been reported for their quantification in mixtures. In the present work, a single HPLC method for related-substances analyses of amoxicillin and potassium clavulanate mixtures was developed and validated according to International Conference on Harmonisation (ICH) guidelines. Eighteen amoxicillin and six potassium clavulanate impurities were successfully separated from each other by triple gradient elution on a Thermo Hypersil Zorbax BDS C18 (250 mm × 4.6 mm, 3 μm) column with 50 μL injection volumes at a wavelength of 215 nm. Commercially unavailable impurities were formed by degradation of amoxicillin and potassium clavulanate, identified by LC-MS studies, and used during analytical method development and validation studies. In addition, the process-related amoxicillin impurity-P was synthesized and characterized by nuclear magnetic resonance (NMR) and mass spectrometry (MS) for the first time. To complement this work, an assay method for amoxicillin and potassium clavulanate mixtures was developed and validated; stress-testing and stability studies of amox/clav mixtures were carried out under specified conditions according to ICH and analyzed using the validated stability-indicating assay and related-substances methods. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Separation, identification and quantification of carotenoids and chlorophylls in dietary supplements containing Chlorella vulgaris and Spirulina platensis using High Performance Thin Layer Chromatography.

    PubMed

    Hynstova, Veronika; Sterbova, Dagmar; Klejdus, Borivoj; Hedbavny, Josef; Huska, Dalibor; Adam, Vojtech

    2018-01-30

    In this study, 14 commercial products (dietary supplements) containing the alga Chlorella vulgaris and the cyanobacterium Spirulina platensis, originating from China and Japan, were analysed. A UV-vis spectrophotometric method was applied for rapid determination of chlorophylls, carotenoids and pheophytins (degradation products of chlorophylls). High-Performance Thin-Layer Chromatography (HPTLC) was used for effective separation of these compounds, and Atomic Absorption Spectrometry for determination of heavy metals as indicators of environmental pollution. Based on the results of the UV-vis spectrophotometric determination of photosynthetic pigments (chlorophylls and carotenoids), it was confirmed that Chlorella vulgaris contains more of all these pigments than the cyanobacterium Spirulina platensis. The compound with the fastest mobility identified in Chlorella vulgaris and Spirulina platensis using the HPTLC method was β-carotene. Spectral analysis and the standard calibration curve method were used for identification and quantification of the substances separated on the thin-layer chromatographic plate. Quantification of copper (Cu2+, at 324.7 nm) and zinc (Zn2+, at 213.9 nm) was performed by Flame Atomic Absorption Spectrometry with air-acetylene flame atomization; quantification of cadmium (Cd2+, at 228.8 nm), nickel (Ni2+, at 232.0 nm) and lead (Pb2+, at 283.3 nm) by Electrothermal Graphite Furnace Atomic Absorption Spectrometry; and quantification of mercury (Hg2+, at 254 nm) by Cold Vapour Atomic Absorption Spectrometry. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Simultaneous quantification and semi-quantification of ginkgolic acids and their metabolites in rat plasma by UHPLC-LTQ-Orbitrap-MS and its application to pharmacokinetics study.

    PubMed

    Qian, Yiyun; Zhu, Zhenhua; Duan, Jin-Ao; Guo, Sheng; Shang, Erxin; Tao, Jinhua; Su, Shulan; Guo, Jianming

    2017-01-15

    A highly sensitive method using ultra-high-pressure liquid chromatography coupled with linear ion trap-Orbitrap tandem mass spectrometry (UHPLC-LTQ-Orbitrap-MS) has been developed and validated for the simultaneous identification and quantification of ginkgolic acids and semi-quantification of their metabolites in rat plasma. For the five selected ginkgolic acids, the method showed good linearity (r>0.9991), good intra- and inter-day precision (RSD<15%), and good accuracy (RE from -10.33% to 4.92%). Extraction recoveries, matrix effects and stabilities for rat plasma samples were within the required limits. The validated method was successfully applied to investigate the pharmacokinetics of the five ginkgolic acids in rat plasma after oral administration to 3 dosage groups (900 mg/kg, 300 mg/kg and 100 mg/kg). Meanwhile, six metabolites of GA (15:1) and GA (17:1) were identified by comparison of MS data with reported values. Validation results in terms of linear ranges, precision and stability were established for the semi-quantification of metabolites. Curves of the relative changes of these metabolites during the metabolic process were constructed by plotting the peak area ratios of the metabolites to salicylic acid (internal standard, IS). Double peaks were observed in all 3 dose groups. Both the type of metabolite and the dosage led to different T(max) values. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. Development and validation of a liquid chromatography isotope dilution mass spectrometry method for the reliable quantification of alkylphenols in environmental water samples by isotope pattern deconvolution.

    PubMed

    Fabregat-Cabello, Neus; Sancho, Juan V; Vidal, Andreu; González, Florenci V; Roig-Navarro, Antoni Francesc

    2014-02-07

    We present a new measurement method for the rapid extraction and accurate quantification of technical nonylphenol (NP) and 4-t-octylphenol (OP) in complex-matrix water samples by UHPLC-ESI-MS/MS. The extraction of both compounds is achieved in 30 min by hollow fiber liquid phase microextraction (HF-LPME) using 1-octanol as the acceptor phase, which provides an enrichment (preconcentration) factor of 800. In parallel, we developed a quantification method based on isotope dilution mass spectrometry (IDMS) and singly (13)C1-labeled compounds. To this end the minimally labeled (13)C1-4-(3,6-dimethyl-3-heptyl)-phenol and (13)C1-t-octylphenol isomers were synthesized; they coelute with the natural compounds, which compensates for matrix effects. Quantification was carried out using isotope pattern deconvolution (IPD), which yields the concentration of both compounds without the need to build a calibration graph, reducing the total analysis time. The combination of these extraction and determination techniques has allowed, for the first time, validation of an HF-LPME methodology at the levels required by legislation, achieving limits of quantification of 0.1 ng mL(-1) and recoveries within 97-109%. Given the low cost of HF-LPME and the short total analysis time, this methodology is ready for implementation in routine analytical laboratories. Copyright © 2013 Elsevier B.V. All rights reserved.
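
A minimal sketch of the isotope pattern deconvolution idea for one analyte spiked with a singly 13C-labelled analogue: the measured intensities at two monitored masses are a linear mixture of the natural and labelled isotope patterns, so a 2x2 linear system recovers the molar contribution of each without a calibration graph. All patterns and intensities below are invented for illustration, not values from the paper.

```python
# Illustrative isotope pattern deconvolution (IPD): solve
#   measured = x_nat * pattern_nat + x_lab * pattern_lab
# for the natural (x_nat) and labelled (x_lab) molar contributions.

def ipd_two_isotopologues(measured, pattern_nat, pattern_lab):
    """measured, pattern_*: intensities/abundances at masses M and M+1.
    Returns (x_nat, x_lab) via Cramer's rule on the 2x2 system."""
    a, b = pattern_nat      # natural abundances at M, M+1
    c, d = pattern_lab      # labelled-standard abundances at M, M+1
    m0, m1 = measured
    det = a * d - b * c
    x_nat = (m0 * d - c * m1) / det
    x_lab = (a * m1 - b * m0) / det
    return x_nat, x_lab

# Hypothetical normalized patterns: natural compound mostly M,
# 13C1-labelled standard mostly M+1.
pattern_nat = (0.90, 0.10)
pattern_lab = (0.01, 0.99)

# Simulated measurement: 2 units natural analyte + 5 units spiked standard.
measured = (2 * 0.90 + 5 * 0.01, 2 * 0.10 + 5 * 0.99)

x_nat, x_lab = ipd_two_isotopologues(measured, pattern_nat, pattern_lab)
# With a known spike amount, analyte amount = (x_nat / x_lab) * spike amount.
print(round(x_nat, 6), round(x_lab, 6))  # → 2.0 5.0
```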

  16. Tutorial examples for uncertainty quantification methods.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Bord, Sarah

    2015-08-01

    This report details the work accomplished during my 2015 SULI summer internship at Sandia National Laboratories in Livermore, CA. During this internship, I worked on multiple tasks with the common goal of making uncertainty quantification (UQ) methods more accessible to the general scientific community. As part of my work, I created a comprehensive numerical integration example to incorporate into the user manual of a UQ software package. Further, I developed examples involving heat transfer through a window to incorporate into tutorial lectures that serve as an introduction to UQ methods.
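
A heat-transfer-through-a-window example of the kind mentioned above can be sketched as simple Monte Carlo uncertainty propagation. All parameter values are illustrative assumptions, not taken from the report or its tutorial materials.

```python
# Monte Carlo UQ sketch: propagate uncertainty in the window's overall
# heat-transfer coefficient U into the steady-state heat loss Q = U*A*dT.
import random
import statistics

random.seed(0)

def heat_loss(U, area_m2=1.5, dT_K=20.0):
    """Steady-state heat loss through the window, Q = U * A * dT, in watts."""
    return U * area_m2 * dT_K

# Assumed uncertain input: U ~ Normal(2.8, 0.2) W/(m^2 K).
samples = [heat_loss(random.gauss(2.8, 0.2)) for _ in range(50_000)]

mean_Q = statistics.fmean(samples)
std_Q = statistics.stdev(samples)
# For this linear model the analytic answer is mean = 2.8*1.5*20 = 84 W
# and std = 0.2*1.5*20 = 6 W, so the sampling estimate should be close.
print(f"Q = {mean_Q:.1f} +/- {std_Q:.1f} W")
```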

  17. Multi-fidelity methods for uncertainty quantification in transport problems

    NASA Astrophysics Data System (ADS)

    Tartakovsky, G.; Yang, X.; Tartakovsky, A. M.; Barajas-Solano, D. A.; Scheibe, T. D.; Dai, H.; Chen, X.

    2016-12-01

    We compare several multi-fidelity approaches for uncertainty quantification in flow and transport simulations that have a lower computational cost than the standard Monte Carlo method. The cost reduction is achieved by combining a small number of high-resolution (high-fidelity) simulations with a large number of low-resolution (low-fidelity) simulations. We propose a new method, the re-scaled Multi Level Monte Carlo (rMLMC) method, based on the idea that the statistics of quantities of interest depend on scale/resolution. We compare rMLMC with existing multi-fidelity methods such as Multi Level Monte Carlo (MLMC) and reduced basis methods and discuss the advantages of each approach.
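
The cost-reduction mechanism can be illustrated with a toy two-level MLMC estimator: E[P_fine] = E[P_coarse] + E[P_fine - P_coarse], where the cheap coarse level absorbs most of the variance and only a few fine-level samples are needed for the correction. This sketches standard MLMC on an invented problem, not the rescaled rMLMC variant the abstract proposes.

```python
# Toy two-level MLMC: quantity of interest is a midpoint-rule integral of
# sin(a*x) on [0,1], where a is a random parameter and the grid resolution
# plays the role of model fidelity.
import math
import random

random.seed(1)

def qoi(a, n):
    """Midpoint-rule approximation of integral_0^1 sin(a*x) dx on n cells."""
    h = 1.0 / n
    return h * sum(math.sin(a * (i + 0.5) * h) for i in range(n))

def sample_a():
    return random.uniform(0.5, 1.5)   # random model input

# Level 0: many cheap low-fidelity samples.
coarse = [qoi(sample_a(), 4) for _ in range(20_000)]

# Level 1: few samples of the fine-minus-coarse *difference*, evaluated on
# the same random input so the difference has small variance.
corr = []
for _ in range(500):
    a = sample_a()
    corr.append(qoi(a, 64) - qoi(a, 4))

estimate = sum(coarse) / len(coarse) + sum(corr) / len(corr)
# Exact reference: E_a[(1 - cos a)/a] over a ~ U(0.5, 1.5) is about 0.45.
print(f"two-level MLMC estimate: {estimate:.4f}")
```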

  18. A new dimethyl labeling-based SID-MRM-MS method and its application to three proteases involved in insulin maturation.

    PubMed

    Cheng, Dongwan; Zheng, Li; Hou, Junjie; Wang, Jifeng; Xue, Peng; Yang, Fuquan; Xu, Tao

    2015-01-01

    The absolute quantification of target proteins in proteomics relies on stable isotope dilution coupled with multiple reaction monitoring mass spectrometry (SID-MRM-MS). The successful preparation of stable isotope-labeled internal standard peptides is an important prerequisite for SID-MRM absolute quantification methods. Dimethyl labeling has been widely used in relative quantitative proteomics; it is fast, simple, reliable, cost-effective, and applicable to any protein sample, making it an ideal candidate for the preparation of stable isotope-labeled internal standards. MRM mass spectrometry offers high sensitivity, specificity, and throughput, and can quantify multiple proteins simultaneously, including low-abundance proteins in precious samples such as pancreatic islets. In this study, a new method for the absolute quantification of three proteases involved in insulin maturation, namely PC1/3, PC2 and CPE, was developed by coupling a stable isotope dimethyl labeling strategy for internal standard peptide preparation with SID-MRM-MS quantitative technology. This method offers a new and effective approach for a deeper understanding of the functional status of pancreatic β cells and of pathogenesis in diabetes.

  19. Quantification of alginate by aggregation induced by calcium ions and fluorescent polycations.

    PubMed

    Zheng, Hewen; Korendovych, Ivan V; Luk, Yan-Yeung

    2016-01-01

    For quantification of polysaccharides, including heparins and alginates, the commonly used carbazole assay involves hydrolysis of the polysaccharide to form a mixture of UV-active dye conjugate products. Here, we describe two efficient detection and quantification methods that make use of the negative charges of the alginate polymer and do not involve degradation of the targeted polysaccharide. The first method utilizes calcium ions to induce formation of hydrogel-like aggregates with alginate polymer; the aggregates can be quantified readily by staining with a crystal violet dye. This method does not require purification of alginate from the culture medium and can measure the large amount of alginate that is produced by a mucoid Pseudomonas aeruginosa culture. The second method employs polycations tethering a fluorescent dye to form suspension aggregates with the alginate polyanion. Encasing the fluorescent dye in the aggregates provides an increased scattering intensity with a sensitivity comparable to that of the conventional carbazole assay. Both approaches provide efficient methods for monitoring alginate production by mucoid P. aeruginosa. Copyright © 2015 Elsevier Inc. All rights reserved.

  20. A new method for the quantification of monosaccharides, uronic acids and oligosaccharides in partially hydrolyzed xylans by HPAEC-UV/VIS.

    PubMed

    Lorenz, Dominic; Erasmy, Nicole; Akil, Youssef; Saake, Bodo

    2016-04-20

    A new method for the chemical characterization of xylans is presented to overcome the difficulties in quantification of 4-O-methyl-α-D-glucuronic acid (meGlcA). To this end, the hydrolysis behavior of xylans from beech and birch wood was investigated to obtain the optimum conditions for hydrolysis with sulfuric acid. Because of varying linkage strengths and degradation, no general method for complete hydrolysis can be designed. Therefore, partial hydrolysis was applied, yielding monosaccharides and small meGlcA-containing oligosaccharides. For the new HPAEC-UV/VIS method, these samples were reductively aminated with 2-aminobenzoic acid. By quantifying monosaccharides and oligosaccharides, and by comparison with borate-HPAEC and (13)C NMR spectroscopy, we revealed that meGlcA concentrations are significantly underestimated by conventional methods. The detected concentrations are 85.4% (beech) and 76.3% (birch) higher with the new procedure. Furthermore, the quantified concentrations of xylose were 9.3% (beech) and 6.5% (birch) higher when the unhydrolyzed oligosaccharides were taken into account as well. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Simultaneous Quantification of Syringic Acid and Kaempferol in Extracts of Bergenia Species Using Validated High-Performance Thin-Layer Chromatographic-Densitometric Method.

    PubMed

    Srivastava, Nishi; Srivastava, Amit; Srivastava, Sharad; Rawat, Ajay Kumar Singh; Khan, Abdul Rahman

    2016-03-01

    A rapid, sensitive, selective and robust quantitative densitometric high-performance thin-layer chromatographic method was developed and validated for the separation and quantification of syringic acid (SYA) and kaempferol (KML) in the hydrolyzed extracts of Bergenia ciliata and Bergenia stracheyi. The separation was performed on silica gel 60F254 high-performance thin-layer chromatography plates using toluene : ethyl acetate : formic acid (5 : 4 : 1, v/v/v) as the mobile phase. The quantification of SYA and KML was carried out in densitometric reflection/absorption mode at 290 nm. Dense spots of SYA and KML appeared on the developed plate at retention factor values of 0.61 ± 0.02 and 0.70 ± 0.01, respectively. Precise and accurate quantification was performed using linear regression analysis by plotting peak area vs. concentration over 100-600 ng/band (correlation coefficient r = 0.997, regression coefficient R(2) = 0.996) for SYA and 100-600 ng/band (r = 0.995, R(2) = 0.991) for KML. The developed method was validated in terms of accuracy, recovery and inter- and intraday precision as per International Conference on Harmonisation guidelines. The limits of detection of SYA and KML were 91.63 and 142.26 ng, and the corresponding limits of quantification were 277.67 and 431.09 ng, respectively. Statistical data analysis showed that the method is reproducible and selective for the estimation of SYA and KML in extracts of B. ciliata and B. stracheyi. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
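
LOD/LOQ figures like those above typically follow the ICH Q2 calibration-based formulas LOD = 3.3·σ/S and LOQ = 10·σ/S, where S is the calibration slope and σ the residual standard deviation of the regression (note LOQ/LOD is then always ≈ 3.03, matching the ratios reported). A sketch with invented calibration data, not the paper's measurements:

```python
# ICH-style LOD/LOQ estimation from a linear calibration curve.

def fit_line(x, y):
    """Ordinary least-squares fit y = slope*x + intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx
    return slope, my - slope * mx

conc = [100, 200, 300, 400, 500, 600]        # ng/band (hypothetical)
area = [1010, 1985, 3030, 3970, 5050, 5980]  # peak areas (hypothetical)

slope, intercept = fit_line(conc, area)
resid = [yi - (slope * xi + intercept) for xi, yi in zip(conc, area)]
# Residual standard deviation with n-2 degrees of freedom.
sigma = (sum(r * r for r in resid) / (len(conc) - 2)) ** 0.5

lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(f"LOD = {lod:.1f} ng, LOQ = {loq:.1f} ng")  # LOQ/LOD = 10/3.3 ~ 3.03
```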

  2. Comparison of viable plate count, turbidity measurement and real-time PCR for quantification of Porphyromonas gingivalis.

    PubMed

    Clais, S; Boulet, G; Van Kerckhoven, M; Lanckacker, E; Delputte, P; Maes, L; Cos, P

    2015-01-01

    The viable plate count (VPC) is considered as the reference method for bacterial enumeration in periodontal microbiology but shows some important limitations for anaerobic bacteria. As anaerobes such as Porphyromonas gingivalis are difficult to culture, VPC becomes time-consuming and less sensitive. Hence, efficient normalization of experimental data to bacterial cell count requires alternative rapid and reliable quantification methods. This study compared the performance of VPC with that of turbidity measurement and real-time PCR (qPCR) in an experimental context using highly concentrated bacterial suspensions. Our TaqMan-based qPCR assay for P. gingivalis 16S rRNA proved to be sensitive and specific. Turbidity measurements offer a fast method to assess P. gingivalis growth, but suffer from high variability and a limited dynamic range. VPC was very time-consuming and less repeatable than qPCR. Our study concludes that qPCR provides the most rapid and precise approach for P. gingivalis quantification. Although our data were gathered in a specific research context, we believe that our conclusions on the inferior performance of VPC and turbidity measurements in comparison to qPCR can be extended to other research and clinical settings and even to other difficult-to-culture micro-organisms. Various clinical and research settings require fast and reliable quantification of bacterial suspensions. The viable plate count method (VPC) is generally seen as 'the gold standard' for bacterial enumeration. However, VPC-based quantification of anaerobes such as Porphyromonas gingivalis is time-consuming due to their stringent growth requirements and shows poor repeatability. Comparison of VPC, turbidity measurement and TaqMan-based qPCR demonstrated that qPCR possesses important advantages regarding speed, accuracy and repeatability. © 2014 The Society for Applied Microbiology.

  3. Development of an analytical method for the simultaneous analysis of MCPD esters and glycidyl esters in oil-based foodstuffs.

    PubMed

    Ermacora, Alessia; Hrnčiřík, Karel

    2014-01-01

    Substantial progress has been recently made in the development and optimisation of analytical methods for the quantification of 2-MCPD, 3-MCPD and glycidyl esters in oils and fats, and there are a few methods currently available that allow a reliable quantification of these contaminants in bulk oils and fats. On the other hand, no standard method for the analysis of foodstuffs has yet been established. The aim of this study was the development and validation of a new method for the simultaneous quantification of 2-MCPD, 3-MCPD and glycidyl esters in oil-based food products. The developed protocol includes a first step of liquid-liquid extraction and purification of the lipophilic substances of the sample, followed by the application of a previously developed procedure based on acid transesterification, for the indirect quantification of these contaminants in oils and fats. The method validation was carried out on food products (fat-based spreads, creams, margarine, mayonnaise) manufactured in-house, in order to control the manufacturing process and account for any food matrix-analyte interactions (the sample spiking was carried out on the single components used for the formulations rather than the final products). The method showed good accuracy (the recoveries ranged from 97% to 106% for bound 3-MCPD and 2-MCPD and from 88% to 115% for bound glycidol) and sensitivity (the LOD was 0.04 and 0.05 mg kg(-1) for bound MCPD and glycidol, respectively). Repeatability and reproducibility were satisfactory (RSD below 2% and 5%, respectively) for all analytes. The levels of salts and surface-active compounds in the formulation were found to have no impact on the accuracy and the other parameters of the method.

  4. Hyperplex-MRM: a hybrid multiple reaction monitoring method using mTRAQ/iTRAQ labeling for multiplex absolute quantification of human colorectal cancer biomarker.

    PubMed

    Yin, Hong-Rui; Zhang, Lei; Xie, Li-Qi; Huang, Li-Yong; Xu, Ye; Cai, San-Jun; Yang, Peng-Yuan; Lu, Hao-Jie

    2013-09-06

    Novel biomarker verification assays are urgently required to improve the efficiency of biomarker development. Benefiting from lower development costs, multiple reaction monitoring (MRM) has been used for biomarker verification as an alternative to immunoassay. However, in conventional MRM analysis only one sample can be quantified per experiment, which restricts its application. Here, a Hyperplex-MRM quantification approach, combining mTRAQ for absolute quantification and iTRAQ for relative quantification, was developed to increase the throughput of biomarker verification. In this strategy, equal amounts of internal standard peptides were labeled with mTRAQ reagents Δ0 and Δ8 as double references, while 4-plex iTRAQ reagents were used to label four different samples as an alternative to mTRAQ Δ4. From the MRM trace and MS/MS spectrum, the total amounts and relative ratios of target proteins/peptides in the four samples could be acquired simultaneously. Accordingly, the absolute amounts of target proteins/peptides in four different samples could be obtained in a single run. In addition, the double references were used to increase the reliability of the quantification results. Using this approach, three biomarker candidates, adenosylhomocysteinase (AHCY), cathepsin D (CTSD), and lysozyme C (LYZ), were successfully quantified in colorectal cancer (CRC) tissue specimens of different stages with high accuracy, sensitivity, and reproducibility. In summary, we demonstrated a promising quantification method for high-throughput verification of biomarker candidates.
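
The arithmetic behind this two-stage readout can be sketched as follows: the MRM ratio to the mTRAQ internal standards gives the absolute *total* peptide amount across the four pooled samples, and the iTRAQ reporter-ion ratios split that total per sample. All intensities and amounts below are hypothetical, and the function name is ours, not the authors'.

```python
# Back-of-the-envelope Hyperplex-MRM sketch (illustrative values only).

def hyperplex_amounts(total_ratio, std_amount_fmol, reporter_intensities):
    """total_ratio: pooled-sample / internal-standard MRM peak-area ratio.
    std_amount_fmol: spiked amount of the mTRAQ-labelled standard.
    reporter_intensities: iTRAQ 114/115/116/117 reporter-ion intensities.
    Returns the absolute amount attributed to each of the four samples."""
    total = total_ratio * std_amount_fmol          # absolute total, fmol
    s = sum(reporter_intensities)
    return [total * i / s for i in reporter_intensities]

amounts = hyperplex_amounts(
    total_ratio=2.0,                     # pooled samples = 2x the standard
    std_amount_fmol=50.0,                # 50 fmol standard spiked in
    reporter_intensities=[10.0, 20.0, 30.0, 40.0],
)
print([round(a, 1) for a in amounts])    # → [10.0, 20.0, 30.0, 40.0] fmol
```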

  5. Translational value of liquid chromatography coupled with tandem mass spectrometry-based quantitative proteomics for in vitro-in vivo extrapolation of drug metabolism and transport and considerations in selecting appropriate techniques.

    PubMed

    Al Feteisi, Hajar; Achour, Brahim; Rostami-Hodjegan, Amin; Barber, Jill

    2015-01-01

    Drug-metabolizing enzymes and transporters play an important role in drug absorption, distribution, metabolism and excretion and, consequently, they influence drug efficacy and toxicity. Quantification of drug-metabolizing enzymes and transporters in various tissues is therefore essential for comprehensive elucidation of drug absorption, distribution, metabolism and excretion. Recent advances in liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS) have improved the quantification of pharmacologically relevant proteins. This report presents an overview of mass spectrometry-based methods currently used for the quantification of drug-metabolizing enzymes and drug transporters, mainly focusing on applications and cost associated with various quantitative strategies based on stable isotope-labeled standards (absolute quantification peptide standards, quantification concatemers, protein standards for absolute quantification) and label-free analysis. In mass spectrometry, there is no simple relationship between signal intensity and analyte concentration. Proteomic strategies are therefore complex and several factors need to be considered when selecting the most appropriate method for an intended application, including the number of proteins and samples. Quantitative strategies require appropriate mass spectrometry platforms, yet choice is often limited by the availability of appropriate instrumentation. Quantitative proteomics research requires specialist practical skills and there is a pressing need to dedicate more effort and investment to training personnel in this area. Large-scale multicenter collaborations are also needed to standardize quantitative strategies in order to improve physiologically based pharmacokinetic models.

  6. Rapid quantification of soilborne pathogen communities in wheat-based long-term field experiments

    USDA-ARS?s Scientific Manuscript database

    Traditional isolation and quantification of inoculum density is difficult for most soilborne pathogens. Quantitative PCR methods have been developed to rapidly identify and quantify many of these pathogens using a single DNA extract from soil. Rainfed experiments operated continuously for up to 84 y...

  7. IDENTIFICATION AND QUANTIFICATION OF AEROSOL POLAR OXYGENATED COMPOUNDS BEARING CARBOXYLIC AND/OR HYDROXYL GROUPS. 1. METHOD DEVELOPMENT

    EPA Science Inventory

    In this study, a new analytical technique was developed for the identification and quantification of multi-functional compounds containing simultaneously at least one hydroxyl or one carboxylic group, or both. This technique is based on derivatizing first the carboxylic group(s) ...

  8. Protein Quantification by Elemental Mass Spectrometry: An Experiment for Graduate Students

    ERIC Educational Resources Information Center

    Schwarz, Gunnar; Ickert, Stefanie; Wegner, Nina; Nehring, Andreas; Beck, Sebastian; Tiemann, Ruediger; Linscheid, Michael W.

    2014-01-01

    A multiday laboratory experiment was designed to integrate inductively coupled plasma-mass spectrometry (ICP-MS) in the context of protein quantification into an advanced practical course in analytical and environmental chemistry. Graduate students were familiar with the analytical methods employed, whereas the combination of bioanalytical assays…

  9. Identification and Quantification Soil Redoximorphic Features by Digital Image Processing

    USDA-ARS?s Scientific Manuscript database

    Soil redoximorphic features (SRFs) have provided scientists and land managers with insight into relative soil moisture for approximately 60 years. The overall objective of this study was to develop a new method of SRF identification and quantification from soil cores using a digital camera and imag...

  10. Non-destructive detection and quantification of blueberry bruising using near-infrared (NIR) hyperspectral reflectance imaging

    USDA-ARS?s Scientific Manuscript database

    Currently, blueberry bruising is evaluated by either human visual/tactile inspection or firmness measurement instruments. These methods are destructive and time-consuming. The goal of this paper was to develop a non-destructive approach for blueberry bruising detection and quantification. The spe...

  11. Quantification of micro stickies

    Treesearch

    Mahendra Doshi; Jeffrey Dyer; Salman Aziz; Kristine Jackson; Said M. Abubakr

    1997-01-01

    The objective of this project was to compare the different methods for the quantification of micro stickies. The hydrophobic materials investigated in this project for the collection of micro stickies were Microfoam* (polypropylene packing material), low density polyethylene film (LDPE), high density polyethylene (HDPE; a flat piece from a square plastic bottle), paper...

  12. Novel primers and PCR protocols for the specific detection and quantification of Sphingobium suberifaciens in situ

    USDA-ARS?s Scientific Manuscript database

    The pathogen causing corky root on lettuce, Sphingobium suberifaciens, is recalcitrant to standard epidemiological methods. Primers were selected from 16S rDNA sequences useful for the specific detection and quantification of S. suberifaciens. Conventional (PCR) and quantitative (qPCR) PCR protocols...

  13. Hampton Roads climate impact quantification initiative : baseline assessment of the transportation assets & overview of economic analyses useful in quantifying impacts

    DOT National Transportation Integrated Search

    2016-09-13

    The Hampton Roads Climate Impact Quantification Initiative (HRCIQI) is a multi-part study sponsored by the U.S. Department of Transportation (DOT) Climate Change Center with the goals that include developing a cost tool that provides methods for volu...

  14. Simultaneous quantification of cholesterol sulfate, androgen sulfates, and progestagen sulfates in human serum by LC-MS/MS.

    PubMed

    Sánchez-Guijo, Alberto; Oji, Vinzenz; Hartmann, Michaela F; Traupe, Heiko; Wudy, Stefan A

    2015-09-01

    Steroids are primarily present in human fluids in their sulfated forms. Profiling of these compounds is important from both diagnostic and physiological points of view. Here, we present a novel method for the quantification of 11 intact steroid sulfates in human serum by LC-MS/MS. The compounds analyzed in our method, some of which are quantified for the first time in blood, include cholesterol sulfate, pregnenolone sulfate, 17-hydroxy-pregnenolone sulfate, 16-α-hydroxy-dehydroepiandrosterone sulfate, dehydroepiandrosterone sulfate, androstenediol sulfate, androsterone sulfate, epiandrosterone sulfate, testosterone sulfate, epitestosterone sulfate, and dihydrotestosterone sulfate. The assay was conceived to quantify sulfated steroids over a broad range of concentrations, requiring only 300 μl of serum. The method has been validated and its performance was studied at three quality controls, selected for each compound according to its physiological concentration. The assay showed good linearity (R(2) > 0.99) and recovery for all the compounds, with limits of quantification ranging between 1 and 80 ng/ml. Averaged intra-day and between-day precisions (coefficient of variation) and accuracies (relative errors) were below 10%. The method has been successfully applied to study the sulfated steroidome in diseases such as steroid sulfatase deficiency, proving its diagnostic value. This is, to the best of our knowledge, the most comprehensive method available for the quantification of sulfated steroids in human blood. Copyright © 2015 by the American Society for Biochemistry and Molecular Biology, Inc.

  15. Analytical Validation of Quantitative Real-Time PCR Methods for Quantification of Trypanosoma cruzi DNA in Blood Samples from Chagas Disease Patients

    PubMed Central

    Ramírez, Juan Carlos; Cura, Carolina Inés; Moreira, Otacilio da Cruz; Lages-Silva, Eliane; Juiz, Natalia; Velázquez, Elsa; Ramírez, Juan David; Alberti, Anahí; Pavia, Paula; Flores-Chávez, María Delmans; Muñoz-Calderón, Arturo; Pérez-Morales, Deyanira; Santalla, José; Guedes, Paulo Marcos da Matta; Peneau, Julie; Marcet, Paula; Padilla, Carlos; Cruz-Robles, David; Valencia, Edward; Crisante, Gladys Elena; Greif, Gonzalo; Zulantay, Inés; Costales, Jaime Alfredo; Alvarez-Martínez, Miriam; Martínez, Norma Edith; Villarroel, Rodrigo; Villarroel, Sandro; Sánchez, Zunilda; Bisio, Margarita; Parrado, Rudy; Galvão, Lúcia Maria da Cunha; da Câmara, Antonia Cláudia Jácome; Espinoza, Bertha; de Noya, Belkisyole Alarcón; Puerta, Concepción; Riarte, Adelina; Diosque, Patricio; Sosa-Estani, Sergio; Guhl, Felipe; Ribeiro, Isabela; Aznar, Christine; Britto, Constança; Yadón, Zaida Estela; Schijman, Alejandro G.

    2015-01-01

    An international study was performed by 26 experienced PCR laboratories from 14 countries to assess the performance of duplex quantitative real-time PCR (qPCR) strategies on the basis of TaqMan probes for detection and quantification of parasitic loads in peripheral blood samples from Chagas disease patients. Two methods were studied: Satellite DNA (SatDNA) qPCR and kinetoplastid DNA (kDNA) qPCR. Both methods included an internal amplification control. Reportable range, analytical sensitivity, limits of detection and quantification, and precision were estimated according to international guidelines. In addition, inclusivity and exclusivity were estimated with DNA from stocks representing the different Trypanosoma cruzi discrete typing units and Trypanosoma rangeli and Leishmania spp. Both methods were challenged against 156 blood samples provided by the participant laboratories, including samples from acute and chronic patients with varied clinical findings, infected by oral route or vectorial transmission. kDNA qPCR showed better analytical sensitivity than SatDNA qPCR with limits of detection of 0.23 and 0.70 parasite equivalents/mL, respectively. Analyses of clinical samples revealed a high concordance in terms of sensitivity and parasitic loads determined by both SatDNA and kDNA qPCRs. This effort is a major step toward international validation of qPCR methods for the quantification of T. cruzi DNA in human blood samples, aiming to provide an accurate surrogate biomarker for diagnosis and treatment monitoring for patients with Chagas disease. PMID:26320872

  16. Nondestructive Analysis of Tumor-Associated Membrane Protein Integrating Imaging and Amplified Detection in situ Based on Dual-Labeled DNAzyme.

    PubMed

    Chen, Xiaoxia; Zhao, Jing; Chen, Tianshu; Gao, Tao; Zhu, Xiaoli; Li, Genxi

    2018-01-01

    Comprehensive analysis of the expression level and location of tumor-associated membrane proteins (TMPs) is of vital importance for the profiling of tumor cells. Currently, two independent techniques, i.e. ex situ detection and in situ imaging, are usually required for the quantification and localization of TMPs, respectively, resulting in some inevitable problems. Methods: Herein, based on a well-designed and fluorophore-labeled DNAzyme, we develop an integrated and facile method in which imaging and quantification of TMPs in situ are achieved simultaneously in a single system. The labeled DNAzyme not only produces localized fluorescence for the visualization of TMPs but also catalyzes the cleavage of a substrate to produce quantitative fluorescent signals that can be collected from solution for the sensitive detection of TMPs. Results: Results from the DNAzyme-based in situ imaging and quantification of TMPs match well with traditional immunofluorescence and western blotting. In addition to its two-in-one character, the DNAzyme-based method is highly sensitive, allowing the detection of TMPs in only 100 cells. Moreover, the method is nondestructive: after analysis, cells retained their physiological activity and could be cultured for other applications. Conclusion: The integrated system provides solid results for both imaging and quantification of TMPs, making it a competitive method over some traditional techniques for the analysis of TMPs, with potential application as a toolbox in the future.

  17. Development of a targeted method for twenty-three metabolites related to polyphenol gut microbial metabolism in biological samples, using SPE and UHPLC-ESI-MS/MS.

    PubMed

    Gasperotti, Mattia; Masuero, Domenico; Guella, Graziano; Mattivi, Fulvio; Vrhovsek, Urska

    2014-10-01

    An increasing number of studies have concerned the profiling of polyphenol microbial metabolites, especially in urine or plasma, but only a few have addressed their accurate quantification. This study reports a new ultra-performance liquid chromatography tandem mass spectrometry method with electrospray ionisation (UHPLC-ESI-MS/MS) using a simple clean-up step with solid phase extraction (SPE), validated on different biological matrices. The method was tested with spiked samples of liver, heart, kidneys, brain, blood and urine. The purification procedure, selected after evaluation of three different cartridges, yields cleaner samples and better quantification of putative trace metabolites, especially relevant to dietary studies, at concentrations below ng/g in tissues and down to ng/ml in urine and blood. Limits of detection and linear range were also assessed using mixed polyphenol metabolite standards. Short chromatographic separation was carried out for 23 target compounds related to polyphenol microbial metabolism, coupled with a triple quadrupole mass spectrometer for their accurate quantification. By analysing different spiked biological samples we were able to test metabolite detection in the matrix and validate the overall recovery of the method, from purification to quantification. The method developed can be successfully applied and is suitable for high-throughput targeted metabolomics analysis related to nutritional intervention, or for studying the metabolic response to a polyphenol-rich diet. Copyright © 2014 Elsevier B.V. All rights reserved.

  18. Quantification Bias Caused by Plasmid DNA Conformation in Quantitative Real-Time PCR Assay

    PubMed Central

    Lin, Chih-Hui; Chen, Yu-Chieh; Pan, Tzu-Ming

    2011-01-01

    Quantitative real-time PCR (qPCR) is the gold standard for the quantification of specific nucleic acid sequences. However, a serious concern has been revealed in a recent report: supercoiled plasmid standards cause significant over-estimation in qPCR quantification. In this study, we investigated the effect of plasmid DNA conformation on the quantification of DNA and the efficiency of qPCR. Our results suggest that plasmid DNA conformation has significant impact on the accuracy of absolute quantification by qPCR. DNA standard curves shifted significantly among plasmid standards with different DNA conformations. Moreover, the choice of DNA measurement method and plasmid DNA conformation may also contribute to the measurement error of DNA standard curves. Due to the multiple effects of plasmid DNA conformation on the accuracy of qPCR, efforts should be made to assure the highest consistency of plasmid standards for qPCR. Thus, we suggest that the conformation, preparation, quantification, purification, handling, and storage of standard plasmid DNA should be described and defined in the Minimum Information for Publication of Quantitative Real-Time PCR Experiments (MIQE) to assure the reproducibility and accuracy of qPCR absolute quantification. PMID:22194997
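    The conformation bias described above acts through the plasmid standard curve, since absolute qPCR quantification fits Cq against log10(copies) and back-calculates unknowns from that line. A minimal sketch with idealized, illustrative Cq values (a conformation shift would surface here as a changed intercept, and hence a shifted copy-number estimate):

```python
import math

# Absolute qPCR quantification from a plasmid standard curve:
# least-squares fit of Cq vs log10(copies), amplification efficiency
# from the slope, and back-calculation of copies for an unknown Cq.
# The standard Cq values are idealized toy data.

def fit_standard_curve(copies, cq):
    xs = [math.log10(c) for c in copies]
    n = len(xs)
    mx, my = sum(xs) / n, sum(cq) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, cq)) / \
            sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    efficiency = 10 ** (-1.0 / slope) - 1.0  # 1.0 corresponds to 100 %
    return slope, intercept, efficiency

def copies_from_cq(cq, slope, intercept):
    return 10 ** ((cq - intercept) / slope)

standards = [1e7, 1e6, 1e5, 1e4, 1e3]          # plasmid copies per reaction
cq_values = [15.1, 18.4, 21.7, 25.0, 28.3]     # idealized, slope = -3.3
slope, intercept, eff = fit_standard_curve(standards, cq_values)
print(f"slope = {slope:.2f}, efficiency = {eff:.1%}")
```

    Because the back-calculation is exponential in (Cq - intercept), even a sub-cycle shift between supercoiled and linearized standards translates into a several-fold error in the reported copy number.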

  19. Evidence Theory Based Uncertainty Quantification in Radiological Risk due to Accidental Release of Radioactivity from a Nuclear Power Plant

    NASA Astrophysics Data System (ADS)

    Ingale, S. V.; Datta, D.

    2010-10-01

    The consequence of an accidental release of radioactivity from a nuclear power plant is assessed in terms of the exposure or dose to members of the public, and risk assessment proceeds from this dose computation. Dose computation basically depends on the dose assessment model and the exposure pathways, one of which is the ingestion of contaminated food. The aim of the present paper is to compute the uncertainty associated with the risk to members of the public due to the ingestion of contaminated food. Since the governing parameters of the ingestion dose assessment model are imprecise, we have adopted evidence theory to compute bounds on the risk, expressing the uncertainty through the belief and plausibility measures.
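    In evidence theory, imprecise parameters are encoded as a basic probability assignment (BPA) over focal intervals, and the belief and plausibility of an event such as "risk does not exceed a threshold" bound its probability from below and above. A hedged sketch with invented intervals and masses (not the paper's data):

```python
# Belief/plausibility bounds for the event "risk <= threshold" given a
# BPA whose focal elements are intervals of the risk value. Intervals
# and masses below are illustrative only.

def belief_plausibility(bpa, threshold):
    """bpa: list of ((lo, hi), mass) focal intervals.

    Bel sums masses of intervals wholly inside the event;
    Pl  sums masses of intervals that merely intersect it.
    """
    bel = sum(m for (lo, hi), m in bpa if hi <= threshold)
    pl = sum(m for (lo, hi), m in bpa if lo <= threshold)
    return bel, pl

# Invented focal intervals for annual ingestion risk, masses sum to 1
bpa = [((1e-6, 5e-6), 0.5), ((3e-6, 1e-5), 0.3), ((8e-6, 2e-5), 0.2)]
bel, pl = belief_plausibility(bpa, threshold=1e-5)
print(f"risk bound: [{bel:.2f}, {pl:.2f}]")  # [Bel, Pl]
```

    The interval [Bel, Pl] is the bound on the risk probability that the abstract refers to; it collapses to a single probability only when every focal element is a point value.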

  20. Supercritical fluid chromatography coupled with tandem mass spectrometry: A high-efficiency detection technique to quantify Taxane drugs in whole-blood samples.

    PubMed

    Jin, Chan; Guan, Jibin; Zhang, Dong; Li, Bing; Liu, Hongzhuo; He, Zhonggui

    2017-10-01

    We present a technique for the rapid determination of taxanes in blood samples by supercritical fluid chromatography coupled with mass spectrometry. The aim of this study was to develop a supercritical fluid chromatography with mass spectrometry method for the analysis of paclitaxel, cabazitaxel, and docetaxel in whole-blood samples of rats. Liquid-dry matrix spot extraction was selected for the sample preparation procedure. Supercritical fluid chromatography separation of paclitaxel, cabazitaxel, docetaxel, and glyburide (internal standard) was accomplished within 3 min using a gradient mobile phase consisting of methanol as the compensation solvent and carbon dioxide at a flow rate of 1.0 mL/min. The method was validated with regard to specificity, the lower limit of quantification, repeatability and reproducibility of quantification, extraction recovery, and matrix effects. The lower limit of quantification was found to be 10 ng/mL, at which the method exhibited acceptable precision and accuracy. All interday accuracies and precisions were within the accepted criteria of ±15% of the nominal value, and within ±20% at the lower limit of quantification, implying that the method was reliable and reproducible. In conclusion, this method is a promising tool to support and improve preclinical or clinical pharmacokinetic studies of taxane anticancer drugs. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
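    The ±15 % / ±20 % acceptance rule quoted above is a mechanical check on each validation level: relative bias of the mean and the coefficient of variation must both fall within the limit, which widens at the LLOQ. A sketch with illustrative replicate measurements (not study data):

```python
import statistics

# Accuracy/precision acceptance check for a bioanalytical validation run:
# |bias| <= 15 % and CV <= 15 % of nominal, relaxed to 20 % at the LLOQ.
# Replicate values below are invented for illustration.

def passes_acceptance(measured, nominal, at_lloq=False):
    limit = 20.0 if at_lloq else 15.0
    mean = statistics.mean(measured)
    bias = 100.0 * (mean - nominal) / nominal      # accuracy, %
    cv = 100.0 * statistics.stdev(measured) / mean  # precision, %
    return abs(bias) <= limit and cv <= limit

# LLOQ level (10 ng/mL) and a mid-range QC level (500 ng/mL)
print(passes_acceptance([9.1, 10.8, 11.5, 8.9, 10.2], 10.0, at_lloq=True))
print(passes_acceptance([512.0, 488.0, 530.0, 470.0, 505.0], 500.0))
```

    A run failing either criterion at any level would be rejected and the level re-assayed, which is what "reliable and reproducible" certifies across the interday batches.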

Top