Sample records for quantification tool applied

  1. A tool for selective inline quantification of co-eluting proteins in chromatography using spectral analysis and partial least squares regression.

    PubMed

    Brestrich, Nina; Briskot, Till; Osberghaus, Anna; Hubbuch, Jürgen

    2014-07-01

    Selective quantification of co-eluting proteins in chromatography is usually performed by offline analytics. This is time-consuming and can lead to late detection of irregularities in chromatography processes. To overcome this analytical bottleneck, a methodology for selective protein quantification in multicomponent mixtures by means of spectral data and partial least squares regression was presented in two previous studies. In this paper, a powerful integration of software and chromatography hardware is introduced that enables the applicability of this methodology for selective inline quantification of co-eluting proteins in chromatography. A specific setup consisting of a conventional liquid chromatography system, a diode array detector, and a software interface to Matlab® was developed. The established tool for selective inline quantification was successfully applied for peak deconvolution of a co-eluting ternary protein mixture consisting of lysozyme, ribonuclease A, and cytochrome c on SP Sepharose FF. Compared to common offline analytics based on collected fractions, no loss of information regarding the retention volumes and peak flanks was observed. A comparison between the mass balances of both analytical methods showed that the inline quantification tool can be applied for a rapid determination of pool yields. Finally, the achieved inline peak deconvolution was successfully applied to make product-purity-based real-time pooling decisions. This makes the established tool for selective inline quantification a valuable approach for inline monitoring and control of chromatographic purification steps and just-in-time reaction to process irregularities. © 2014 Wiley Periodicals, Inc.
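
    As a rough illustration of the underlying chemometric step, the following sketch calibrates a partial least squares model on synthetic multi-wavelength absorbance data and predicts component concentrations. The dimensions, noise level, and scikit-learn usage are assumptions for illustration, not the authors' Matlab implementation.

    ```python
    # Minimal sketch: quantify co-eluting proteins from absorbance spectra via
    # partial least squares (PLS) regression. Synthetic data stand in for
    # diode-array-detector spectra; this is not the authors' Matlab tool.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)
    n_samples, n_wavelengths, n_proteins = 60, 120, 3

    # Hypothetical pure-component spectra mixed linearly (Beer-Lambert) plus noise.
    pure_spectra = rng.random((n_proteins, n_wavelengths))
    concentrations = rng.random((n_samples, n_proteins))  # e.g., g/L
    spectra = (concentrations @ pure_spectra
               + 0.01 * rng.standard_normal((n_samples, n_wavelengths)))

    # Calibrate on the first 40 samples, predict the remaining 20.
    pls = PLSRegression(n_components=5)
    pls.fit(spectra[:40], concentrations[:40])
    predicted = pls.predict(spectra[40:])
    rmse = np.sqrt(np.mean((predicted - concentrations[40:]) ** 2))
    print(f"RMSE of predicted concentrations: {rmse:.4f}")
    ```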

  2. Whole farm quantification of GHG emissions within smallholder farms in developing countries

    NASA Astrophysics Data System (ADS)

    Seebauer, Matthias

    2014-03-01

    The IPCC has compiled the best available scientific methods into published guidelines for estimating greenhouse gas (GHG) emissions and removals from the land-use sector. To evaluate whether existing GHG quantification tools can comprehensively quantify GHG emissions and removals under smallholder conditions, farm-scale quantification was tested with farm data from Western Kenya. After conducting a cluster analysis to identify different farm typologies, GHG quantification was exercised using the VCS SALM methodology, complemented with IPCC livestock emission factors, and the Cool Farm Tool. The emission profiles of four farm clusters representing the baseline conditions in the year 2009 are compared with 2011, when farmers had adopted sustainable land management practices (SALM). The results demonstrate the variation both in the magnitude of the estimated GHG emissions per hectare between different smallholder farm typologies and in the emissions estimated by applying two different accounting tools. The farm-scale quantification further shows that the adoption of SALM has a significant impact on emission reductions and removals; the mitigation benefits range between 4 and 6.5 tCO₂ ha⁻¹ yr⁻¹, with significantly different mitigation benefits depending on the typologies of the crop-livestock systems, their different agricultural practices, and the adoption rates of improved practices. However, the inherent uncertainty in the emission factors applied by accounting tools has substantial implications for reported agricultural emissions. With regard to uncertainty in activity data, the assessment confirms the high variability within different farm types as well as between the different parameters surveyed to comprehensively quantify GHG emissions within smallholder farms.

  3. A novel tool for specific detection and quantification of chicken/turkey parvoviruses to trace poultry fecal contamination in the environment.

    PubMed

    Carratalà, Anna; Rusinol, Marta; Hundesa, Ayalkibet; Biarnes, Mar; Rodriguez-Manzano, Jesus; Vantarakis, Apostolos; Kern, Anita; Suñen, Ester; Girones, Rosina; Bofill-Mas, Sílvia

    2012-10-01

    Poultry farming may introduce pathogens into the environment and food chains. High concentrations of chicken/turkey parvoviruses were detected in chicken stools and slaughterhouse and downstream urban wastewaters by applying new PCR-based specific detection and quantification techniques. Our results confirm that chicken/turkey parvoviruses may be useful viral indicators of poultry fecal contamination.
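
    This record relies on quantitative PCR. As general background, here is a hedged sketch of how a qPCR standard curve converts Cq values into copy numbers; every number is invented for illustration and none comes from this study.

    ```python
    # Minimal sketch of qPCR quantification via a standard curve: Cq values from
    # a serial dilution of a known standard define a line
    #   Cq = slope * log10(copies) + intercept,
    # which is inverted to estimate copy numbers in unknown samples.
    import numpy as np

    log10_copies = np.array([7, 6, 5, 4, 3], dtype=float)    # standard dilutions
    cq_standards = np.array([14.2, 17.6, 21.1, 24.5, 27.9])  # measured Cq values

    slope, intercept = np.polyfit(log10_copies, cq_standards, 1)
    efficiency = 10 ** (-1.0 / slope) - 1.0  # ~1.0 corresponds to 100% efficiency

    def copies_from_cq(cq):
        """Invert the standard curve for an unknown sample."""
        return 10 ** ((cq - intercept) / slope)

    print(f"slope = {slope:.2f}, efficiency = {efficiency:.1%}")
    print(f"unknown at Cq = 22.0 -> {copies_from_cq(22.0):.2e} copies")
    ```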

  4. A Novel Tool for Specific Detection and Quantification of Chicken/Turkey Parvoviruses To Trace Poultry Fecal Contamination in the Environment

    PubMed Central

    Carratalà, Anna; Rusinol, Marta; Hundesa, Ayalkibet; Biarnes, Mar; Rodriguez-Manzano, Jesus; Vantarakis, Apostolos; Kern, Anita; Suñen, Ester; Bofill-Mas, Sílvia

    2012-01-01

    Poultry farming may introduce pathogens into the environment and food chains. High concentrations of chicken/turkey parvoviruses were detected in chicken stools and slaughterhouse and downstream urban wastewaters by applying new PCR-based specific detection and quantification techniques. Our results confirm that chicken/turkey parvoviruses may be useful viral indicators of poultry fecal contamination. PMID:22904047

  5. Quantification of protein expression in cells and cellular subcompartments on immunohistochemical sections using a computer supported image analysis system.

    PubMed

    Braun, Martin; Kirsten, Robert; Rupp, Niels J; Moch, Holger; Fend, Falko; Wernert, Nicolas; Kristiansen, Glen; Perner, Sven

    2013-05-01

    Quantification of protein expression based on immunohistochemistry (IHC) is an important step for translational research and clinical routine. Several manual ('eyeballing') scoring systems are used in order to semi-quantify protein expression based on chromogenic intensities and distribution patterns. However, manual scoring systems are time-consuming and subject to significant intra- and interobserver variability. The aim of our study was to explore whether new image analysis software proves to be sufficient as an alternative tool to quantify protein expression. For IHC experiments, one nucleus-specific marker (i.e., ERG antibody), one cytoplasm-specific marker (i.e., SLC45A3 antibody), and one marker expressed in both compartments (i.e., TMPRSS2 antibody) were chosen. Stainings were applied to TMAs containing tumor material of 630 prostate cancer patients. A pathologist visually quantified all IHC stainings in a blinded manner, applying a four-step scoring system. For digital quantification, image analysis software (Tissue Studio v.2.1, Definiens AG, Munich, Germany) was applied to obtain a continuous spectrum of average staining intensity. For each of the three antibodies we found a strong correlation of the manual protein expression score and the score of the image analysis software. Spearman's rank correlation coefficient was 0.94, 0.92, and 0.90 for ERG, SLC45A3, and TMPRSS2, respectively (p < 0.01). Our data suggest that the image analysis software Tissue Studio is a powerful tool for quantification of protein expression in IHC stainings. Further, since the digital analysis is precise and reproducible, computer-supported protein quantification might help to overcome intra- and interobserver variability and increase objectivity of IHC-based protein assessment.
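
    To make the reported statistic concrete, here is a minimal sketch of computing Spearman's rank correlation between a manual four-step score and a continuous digital intensity on synthetic data; the binning thresholds and noise model are assumptions, not the study's data.

    ```python
    # Minimal sketch: agreement between a manual four-step IHC score and a
    # continuous digital staining intensity, summarized by Spearman's rank
    # correlation. All values are synthetic.
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(1)
    digital_intensity = rng.uniform(0, 255, size=200)  # e.g., software output

    # Hypothetical pathologist score 0-3: bin the intensity, plus rater noise.
    manual_score = np.digitize(digital_intensity + rng.normal(0, 15, 200),
                               bins=[64, 128, 192])

    rho, p_value = spearmanr(manual_score, digital_intensity)
    print(f"Spearman rho = {rho:.2f} (p = {p_value:.2g})")
    ```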

  6. Estimating phosphorus loss in runoff from manure and fertilizer for a phosphorus loss quantification tool.

    PubMed

    Vadas, P A; Good, L W; Moore, P A; Widman, N

    2009-01-01

    Nonpoint-source pollution of fresh waters by P is a concern because it contributes to accelerated eutrophication. Given the state of the science concerning agricultural P transport, a simple tool to quantify annual, field-scale P loss is a realistic goal. We developed new methods to predict annual dissolved P loss in runoff from surface-applied manures and fertilizers and validated the methods with data from 21 published field studies. We incorporated these manure and fertilizer P runoff loss methods into an annual, field-scale P loss quantification tool that estimates dissolved and particulate P loss in runoff from soil, manure, fertilizer, and eroded sediment. We validated the P loss tool using independent data from 28 studies that monitored P loss in runoff from a variety of agricultural land uses for at least 1 yr. Results demonstrated (i) that our new methods to estimate P loss from surface manure and fertilizer are an improvement over methods used in existing Indexes, and (ii) that it was possible to reliably quantify annual dissolved, sediment, and total P loss in runoff using relatively simple methods and readily available inputs. Thus, a P loss quantification tool that does not require greater degrees of complexity or input data than existing P Indexes could accurately predict P loss across a variety of management and fertilization practices, soil types, climates, and geographic locations. However, estimates of runoff and erosion are still needed that are accurate to a level appropriate for the intended use of the quantification tool.

  7. Nonlinear microscopy as diagnostic tool for the discrimination of activated T cells

    NASA Astrophysics Data System (ADS)

    Gavgiotaki, E.; Filippidis, G.; Zerva, I.; Agelaki, S.; Georgoulias, V.; Athanassakis, I.

    2017-07-01

    Third Harmonic Generation (THG) imaging was applied to mouse resting and activated T-cells. Quantification of the THG signal, which corresponded to lipid droplets, could distinguish activated T-cells, allowing follow-up of immune response development.

  8. Usefulness of real-time PCR as a complementary tool to the monitoring of Legionella spp. and Legionella pneumophila by culture in industrial cooling systems.

    PubMed

    Touron-Bodilis, A; Pougnard, C; Frenkiel-Lebossé, H; Hallier-Soulier, S

    2011-08-01

    This study was designed to evaluate the usefulness of quantification by real-time PCR as a management tool to monitor concentrations of Legionella spp. and Legionella pneumophila in industrial cooling systems, and its ability to anticipate culture trends obtained by the French standard method (AFNOR T90-431). Quantifications of Legionella bacteria were achieved by both methods on samples from nine cooling systems with different water qualities. The proportion of positive samples for L. pneumophila quantified by PCR was clearly lower in deionized or river waters submitted to a biocide treatment than in raw river waters, while positive samples for Legionella spp. were quantified for almost all the samples. For some samples containing PCR inhibitors, high quantification limits (up to 4.80 × 10⁵ GU l⁻¹) did not allow L. pneumophila to be quantified by PCR even when it was quantified by culture. Finally, the monitoring of concentrations of L. pneumophila by both methods showed similar trends for 57-100% of the samples. These results suggest that, if methodological steps designed to reduce inhibition problems and thus decrease the quantification limits could be developed for quantifying Legionella in complex waters, real-time PCR could be a valuable complementary tool for monitoring the evolution of L. pneumophila concentrations. This study shows the possibility of using real-time PCR to monitor L. pneumophila proliferation in cooling systems and the importance of adapting nucleic acid extraction and purification protocols to raw waters. Journal of Applied Microbiology © 2011 The Society for Applied Microbiology. No claim to French Government works.

  9. Evaluating Kinase ATP Uptake and Tyrosine Phosphorylation using Multiplexed Quantification of Chemically Labeled and Post-Translationally Modified Peptides

    PubMed Central

    Fang, Bin; Hoffman, Melissa A.; Mirza, Abu-Sayeef; Mishall, Katie M.; Li, Jiannong; Peterman, Scott M.; Smalley, Keiran S. M.; Shain, Kenneth H.; Weinberger, Paul M.; Wu, Jie; Rix, Uwe; Haura, Eric B.; Koomen, John M.

    2015-01-01

    Cancer biologists and other healthcare researchers face an increasing challenge in addressing the molecular complexity of disease. Biomarker measurement tools and techniques now contribute to both basic science and translational research. In particular, liquid chromatography-multiple reaction monitoring mass spectrometry (LC-MRM) for multiplexed measurements of protein biomarkers has emerged as a versatile tool for systems biology. Assays can be developed for specific peptides that report on protein expression, mutation, or post-translational modification; discovery proteomics data can be rapidly translated into multiplexed quantitative approaches. Complementary advances in affinity purification enrich classes of enzymes or peptides representing post-translationally modified or chemically labeled substrates. Here, we illustrate the process for the relative quantification of hundreds of peptides in a single LC-MRM experiment. Desthiobiotinylated peptides produced by activity-based protein profiling (ABPP) using ATP probes and tyrosine-phosphorylated peptides are used as examples. These targeted quantification panels can be applied to further understand the biology of human disease. PMID:25782629
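
    As a hedged sketch of the general LC-MRM quantification step (not this protocol's actual software), summed transition peak areas per peptide can be compared across conditions; the peptide sequences and areas below are placeholders.

    ```python
    # Minimal sketch of LC-MRM relative quantification: for each peptide, sum
    # its transition peak areas and form a condition ratio. Made-up data.
    import numpy as np

    # Peak areas: rows = transitions of one peptide, columns = [control, treated].
    areas = {
        "ADGYLPK":   np.array([[1.2e6, 2.6e6], [0.8e6, 1.7e6], [0.5e6, 1.1e6]]),
        "VLTPELYAK": np.array([[3.1e6, 3.0e6], [2.2e6, 2.1e6], [1.4e6, 1.5e6]]),
    }

    for peptide, a in areas.items():
        control, treated = a.sum(axis=0)   # summed transition areas per condition
        print(f"{peptide}: treated/control = {treated / control:.2f}")
    ```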

  10. A comparative analysis of modified binders : original asphalts and materials extracted from existing pavements : technical summary.

    DOT National Transportation Integrated Search

    2010-01-01

    The initial objective of this research was to develop procedures and standards for applying GPC as an analytical tool to define the percentage amounts of polymer modifiers in polymer modified asphalt cements soluble in eluting GPC solvents. Quantific...

  11. A STATISTICAL MODELING METHODOLOGY FOR THE DETECTION, QUANTIFICATION, AND PREDICTION OF ECOLOGICAL THRESHOLDS

    EPA Science Inventory

    This study will provide a general methodology for integrating threshold information from multiple species ecological metrics, allow for prediction of changes of alternative stable states, and provide a risk assessment tool that can be applied to adaptive management. The integr...

  12. Method and apparatus for characterizing and enhancing the functional performance of machine tools

    DOEpatents

    Barkman, William E; Babelay, Jr., Edwin F; Smith, Kevin Scott; Assaid, Thomas S; McFarland, Justin T; Tursky, David A; Woody, Bethany; Adams, David

    2013-04-30

    Disclosed are various systems and methods for assessing and improving the capability of a machine tool. The disclosure applies to machine tools having at least one slide configured to move along a motion axis. Various patterns of dynamic excitation commands are employed to drive the one or more slides, typically involving repetitive short distance displacements. A quantification of a measurable merit of machine tool response to the one or more patterns of dynamic excitation commands is typically derived for the machine tool. Examples of measurable merits of machine tool performance include workpiece surface finish, and the ability to generate chips of the desired length.

  13. Method and apparatus for characterizing and enhancing the dynamic performance of machine tools

    DOEpatents

    Barkman, William E; Babelay, Jr., Edwin F

    2013-12-17

    Disclosed are various systems and methods for assessing and improving the capability of a machine tool. The disclosure applies to machine tools having at least one slide configured to move along a motion axis. Various patterns of dynamic excitation commands are employed to drive the one or more slides, typically involving repetitive short distance displacements. A quantification of a measurable merit of machine tool response to the one or more patterns of dynamic excitation commands is typically derived for the machine tool. Examples of measurable merits of machine tool performance include dynamic one axis positional accuracy of the machine tool, dynamic cross-axis stability of the machine tool, and dynamic multi-axis positional accuracy of the machine tool.
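
    The following hedged sketch illustrates the kind of quantification both patents describe: drive an axis with a repetitive short-displacement command pattern and derive a merit figure from commanded versus measured motion. The square-wave pattern, first-order lag response, and RMS-error merit are assumptions for illustration only, not the patented method.

    ```python
    # Minimal sketch: excite a slide with repetitive short displacements and
    # quantify a measurable merit (here, RMS following error). The "machine
    # response" is a simulated first-order lag plus noise, purely illustrative.
    import numpy as np

    t = np.linspace(0, 2.0, 2000)                          # seconds
    command = 0.05 * np.sign(np.sin(2 * np.pi * 10 * t))   # +/-50 um steps, in mm

    rng = np.random.default_rng(10)
    response = np.zeros_like(command)
    alpha = 0.02                                           # hypothetical lag factor
    for i in range(1, len(t)):
        response[i] = response[i - 1] + alpha * (command[i] - response[i - 1])
    response += 0.002 * rng.standard_normal(len(t))        # measurement noise

    rms_error_um = np.sqrt(np.mean((response - command) ** 2)) * 1000
    print(f"dynamic one-axis positional accuracy (RMS error): {rms_error_um:.1f} um")
    ```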

  14. A new analytical method for quantification of olive and palm oil in blends with other vegetable edible oils based on the chromatographic fingerprints from the methyl-transesterified fraction.

    PubMed

    Jiménez-Carvelo, Ana M; González-Casado, Antonio; Cuadros-Rodríguez, Luis

    2017-03-01

    A new analytical method for the quantification of olive oil and palm oil in blends with other vegetable edible oils (canola, safflower, corn, peanut, seeds, grapeseed, linseed, sesame and soybean) using normal-phase liquid chromatography and applying chemometric tools was developed. The procedure for obtaining the chromatographic fingerprint of the methyl-transesterified fraction of each blend is described. The multivariate quantification methods used were Partial Least Squares Regression (PLS-R) and Support Vector Regression (SVR). The quantification results were evaluated by several parameters, such as the Root Mean Square Error of Validation (RMSEV), Mean Absolute Error of Validation (MAEV) and Median Absolute Error of Validation (MdAEV). It has to be highlighted that, with the newly proposed analytical method, the chromatographic analysis takes only eight minutes; the results obtained showed the potential of this method and allowed quantification of mixtures of olive oil and palm oil with other vegetable oils. Copyright © 2016 Elsevier B.V. All rights reserved.
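
    For concreteness, the three validation metrics named above reduce to the following computation; the predicted and true values are placeholders, not the paper's data.

    ```python
    # Minimal sketch of the validation metrics RMSEV, MAEV and MdAEV,
    # computed from predicted vs. true oil contents of a validation set.
    import numpy as np

    y_true = np.array([10.0, 25.0, 40.0, 55.0, 70.0])  # % olive oil in blend
    y_pred = np.array([11.2, 24.1, 41.5, 53.8, 70.9])  # model predictions

    errors = y_pred - y_true
    rmsev = np.sqrt(np.mean(errors ** 2))  # Root Mean Square Error of Validation
    maev = np.mean(np.abs(errors))         # Mean Absolute Error of Validation
    mdaev = np.median(np.abs(errors))      # Median Absolute Error of Validation
    print(f"RMSEV = {rmsev:.2f}, MAEV = {maev:.2f}, MdAEV = {mdaev:.2f}")
    ```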

  15. Colour thresholding and objective quantification in bioimaging

    NASA Technical Reports Server (NTRS)

    Fermin, C. D.; Gerber, M. A.; Torre-Bueno, J. R.

    1992-01-01

    Computer imaging is rapidly becoming an indispensable tool for the quantification of variables in research and medicine. Whilst its use in medicine has largely been limited to qualitative observations, imaging in applied basic sciences, medical research and biotechnology demands objective quantification of the variables in question. In black and white densitometry (0-256 levels of intensity) the separation of subtle differences between closely related hues from stains is sometimes very difficult. True-colour and real-time video microscopy analysis offer choices not previously available with monochrome systems. In this paper we demonstrate the usefulness of colour thresholding, which has so far proven indispensable for proper objective quantification of the products of histochemical reactions and/or subtle differences in tissue and cells. In addition, we provide interested, but untrained readers with basic information that may assist decisions regarding the most suitable set-up for a project under consideration. Data from projects in progress at Tulane are shown to illustrate the advantage of colour thresholding over monochrome densitometry and for objective quantification of subtle colour differences between experimental and control samples.
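
    A minimal sketch of the colour-thresholding idea, assuming a hue/saturation band in HSV space stands in for a chromogen's colour; the synthetic image and threshold values are illustrative only.

    ```python
    # Minimal sketch of colour thresholding: isolate pixels whose hue falls in
    # a chosen band and quantify the stained fraction and mean intensity.
    import numpy as np
    from skimage.color import rgb2hsv

    rng = np.random.default_rng(2)
    image = rng.random((256, 256, 3))  # stand-in for an RGB micrograph

    hsv = rgb2hsv(image)
    hue, saturation = hsv[..., 0], hsv[..., 1]

    # Keep "reddish-brown" pixels (e.g., DAB-like staining): hue near 0, saturated.
    mask = ((hue < 0.08) | (hue > 0.95)) & (saturation > 0.3)

    stained_fraction = mask.mean()
    mean_value = hsv[..., 2][mask].mean() if mask.any() else 0.0
    print(f"stained area fraction: {stained_fraction:.3f}, mean value: {mean_value:.3f}")
    ```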

  16. PaCeQuant: A Tool for High-Throughput Quantification of Pavement Cell Shape Characteristics

    PubMed Central

    Poeschl, Yvonne; Plötner, Romina

    2017-01-01

    Pavement cells (PCs) are the most frequently occurring cell type in the leaf epidermis and play important roles in leaf growth and function. In many plant species, PCs form highly complex jigsaw-puzzle-shaped cells with interlocking lobes. Understanding of their development is of high interest for plant science research because of their importance for leaf growth and hence for plant fitness and crop yield. Studies of PC development, however, are limited, because robust methods are lacking that enable automatic segmentation and quantification of PC shape parameters suitable to reflect their cellular complexity. Here, we present our new ImageJ-based tool, PaCeQuant, which provides a fully automatic image analysis workflow for PC shape quantification. PaCeQuant automatically detects cell boundaries of PCs from confocal input images and enables manual correction of automatic segmentation results or direct import of manually segmented cells. PaCeQuant simultaneously extracts 27 shape features that include global, contour-based, skeleton-based, and PC-specific object descriptors. In addition, we included a method for classification and analysis of lobes at two-cell junctions and three-cell junctions, respectively. We provide an R script for graphical visualization and statistical analysis. We validated PaCeQuant by extensive comparative analysis to manual segmentation and existing quantification tools and demonstrated its usability to analyze PC shape characteristics during development and between different genotypes. PaCeQuant thus provides a platform for robust, efficient, and reproducible quantitative analysis of PC shape characteristics that can easily be applied to study PC development in large data sets. PMID:28931626
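
    PaCeQuant itself is an ImageJ plugin; purely to illustrate the kind of contour-based descriptors it extracts, the sketch below computes circularity and solidity for a toy binary cell mask using scikit-image.

    ```python
    # Minimal sketch of two contour-based shape features of the kind extracted
    # by PC shape tools (not PaCeQuant's own code).
    import numpy as np
    from skimage.measure import label, regionprops

    # Toy binary mask standing in for one segmented pavement cell.
    mask = np.zeros((100, 100), dtype=int)
    mask[20:80, 30:70] = 1
    mask[40:60, 10:30] = 1  # a "lobe" protruding from the cell body

    for region in regionprops(label(mask)):
        circularity = 4 * np.pi * region.area / region.perimeter ** 2
        solidity = region.solidity  # area / convex-hull area; lower = more lobed
        print(f"area={region.area}, circularity={circularity:.2f}, "
              f"solidity={solidity:.2f}")
    ```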

  17. In-depth evaluation of software tools for data-independent acquisition based label-free quantification.

    PubMed

    Kuharev, Jörg; Navarro, Pedro; Distler, Ute; Jahn, Olaf; Tenzer, Stefan

    2015-09-01

    Label-free quantification (LFQ) based on data-independent acquisition workflows currently experiences increasing popularity. Several software tools have been recently published or are commercially available. The present study focuses on the evaluation of three different software packages (Progenesis, synapter, and ISOQuant) supporting ion mobility enhanced data-independent acquisition data. In order to benchmark the LFQ performance of the different tools, we generated two hybrid proteome samples of defined quantitative composition containing tryptically digested proteomes of three different species (mouse, yeast, Escherichia coli). This model dataset simulates complex biological samples containing large numbers of both unregulated (background) proteins as well as up- and downregulated proteins with exactly known ratios between samples. We determined the number and dynamic range of quantifiable proteins and analyzed the influence of applied algorithms (retention time alignment, clustering, normalization, etc.) on quantification results. Analysis of technical reproducibility revealed median coefficients of variation of reported protein abundances below 5% for MS(E) data for Progenesis and ISOQuant. Regarding accuracy of LFQ, evaluation with synapter and ISOQuant yielded superior results compared to Progenesis. In addition, we discuss reporting formats and user friendliness of the software packages. The data generated in this study have been deposited to the ProteomeXchange Consortium with identifier PXD001240 (http://proteomecentral.proteomexchange.org/dataset/PXD001240). © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
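
    As a concrete reading of the reproducibility metric above, this sketch computes per-protein coefficients of variation across technical replicates and reports the median; the abundances are synthetic, not the study's data.

    ```python
    # Minimal sketch of the coefficient of variation (CV) of each protein's
    # abundance across technical replicates, summarized by its median.
    import numpy as np

    rng = np.random.default_rng(3)
    n_proteins, n_replicates = 1000, 3
    true_abundance = rng.lognormal(mean=10, sigma=1.5, size=(n_proteins, 1))
    measured = true_abundance * rng.normal(1.0, 0.04, size=(n_proteins, n_replicates))

    cv = measured.std(axis=1, ddof=1) / measured.mean(axis=1)
    print(f"median CV across proteins: {np.median(cv):.1%}")
    ```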

  18. Ultrahigh-performance liquid chromatography/electrospray ionization linear ion trap Orbitrap mass spectrometry of antioxidants (amines and phenols) applied in lubricant engineering.

    PubMed

    Kassler, Alexander; Pittenauer, Ernst; Doerr, Nicole; Allmaier, Guenter

    2014-01-15

    For the qualification and quantification of antioxidants (aromatic amines and sterically hindered phenols), most of them applied as lubricant additives, two ultrahigh-performance liquid chromatography (UHPLC) electrospray ionization mass spectrometric methods, applying the positive and negative ion modes, have been developed for lubricant design and engineering, thus allowing, e.g., the study of the degradation of lubricants. Based on the different chemical properties of the two groups of antioxidants, two methods offering a fast separation (10 min) without prior derivatization were developed. To meet these requirements, UHPLC was coupled with an LTQ Orbitrap hybrid tandem mass spectrometer with positive and negative ion electrospray ionization for simultaneous detection of spectra from UHPLC-high-resolution (HR)-MS (full scan mode) and UHPLC-low-resolution linear ion trap MS(2) (LITMS(2)), which we term UHPLC/HRMS-LITMS(2). All 20 analytes investigated could be qualified by a UHPLC/HRMS-LITMS(2) approach consisting of simultaneous UHPLC/HRMS (elemental composition) and UHPLC/LITMS(2) (diagnostic product ions) according to EC guidelines. Quantification was based on a UHPLC/LITMS(2) approach due to its increased sensitivity and selectivity compared to UHPLC/HRMS. Absolute quantification was only feasible for seven analytes with well-specified purity of reference standards, whereas relative quantification was obtainable for another nine antioxidants. All of them showed good standard deviations and repeatability. The combined methods allow qualitative and quantitative determination of a wide variety of antioxidants, including aminic/phenolic compounds, applied in lubricant engineering. These data show that the developed methods will be versatile tools for further research on the identification and characterization of the thermo-oxidative degradation products of antioxidants in lubricants. Copyright © 2013 John Wiley & Sons, Ltd.

  19. PaCeQuant: A Tool for High-Throughput Quantification of Pavement Cell Shape Characteristics.

    PubMed

    Möller, Birgit; Poeschl, Yvonne; Plötner, Romina; Bürstenbinder, Katharina

    2017-11-01

    Pavement cells (PCs) are the most frequently occurring cell type in the leaf epidermis and play important roles in leaf growth and function. In many plant species, PCs form highly complex jigsaw-puzzle-shaped cells with interlocking lobes. Understanding of their development is of high interest for plant science research because of their importance for leaf growth and hence for plant fitness and crop yield. Studies of PC development, however, are limited, because robust methods are lacking that enable automatic segmentation and quantification of PC shape parameters suitable to reflect their cellular complexity. Here, we present our new ImageJ-based tool, PaCeQuant, which provides a fully automatic image analysis workflow for PC shape quantification. PaCeQuant automatically detects cell boundaries of PCs from confocal input images and enables manual correction of automatic segmentation results or direct import of manually segmented cells. PaCeQuant simultaneously extracts 27 shape features that include global, contour-based, skeleton-based, and PC-specific object descriptors. In addition, we included a method for classification and analysis of lobes at two-cell junctions and three-cell junctions, respectively. We provide an R script for graphical visualization and statistical analysis. We validated PaCeQuant by extensive comparative analysis to manual segmentation and existing quantification tools and demonstrated its usability to analyze PC shape characteristics during development and between different genotypes. PaCeQuant thus provides a platform for robust, efficient, and reproducible quantitative analysis of PC shape characteristics that can easily be applied to study PC development in large data sets. © 2017 American Society of Plant Biologists. All Rights Reserved.

  20. UV-Vis as quantification tool for solubilized lignin following a single-shot steam process.

    PubMed

    Lee, Roland A; Bédard, Charles; Berberi, Véronique; Beauchet, Romain; Lavoie, Jean-Michel

    2013-09-01

    In this short communication, UV-Vis spectroscopy was used as an analytical tool for the quantification of lignin concentrations in aqueous media. A significant correlation was determined between absorbance and the concentration of lignin in solution. For this study, lignin was produced from different types of biomass (willow, aspen, softwood, canary grass and hemp) using steam processes. Quantification was performed at 212, 225, 237, 270, 280 and 287 nm. UV-Vis quantification of lignin was found suitable for different types of biomass, making this a time-saving analytical system that could be used as a Process Analytical Tool (PAT) in biorefineries utilizing steam processes or comparable approaches. Copyright © 2013 Elsevier Ltd. All rights reserved.
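
    A minimal sketch of single-wavelength UV-Vis quantification, assuming a linear Beer-Lambert calibration fitted from standards; the concentrations, wavelength and absorbances are invented for illustration, not the paper's data.

    ```python
    # Minimal sketch: fit a linear calibration (Beer-Lambert: A = k*c + b) from
    # standards, then invert it to estimate an unknown lignin concentration.
    import numpy as np

    conc_standards = np.array([0.0, 0.05, 0.10, 0.20, 0.40])  # g/L lignin
    abs_280nm = np.array([0.01, 0.11, 0.22, 0.43, 0.86])      # absorbance, 280 nm

    k, b = np.polyfit(conc_standards, abs_280nm, 1)
    r2 = np.corrcoef(conc_standards, abs_280nm)[0, 1] ** 2

    unknown_abs = 0.30
    estimated_conc = (unknown_abs - b) / k
    print(f"calibration R^2 = {r2:.4f}; estimated lignin = {estimated_conc:.3f} g/L")
    ```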

  21. Development and validation of an open source quantification tool for DSC-MRI studies.

    PubMed

    Gordaliza, P M; Mateos-Pérez, J M; Montesinos, P; Guzmán-de-Villoria, J A; Desco, M; Vaquero, J J

    2015-03-01

    This work presents the development of an open source tool for the quantification of dynamic susceptibility-weighted contrast-enhanced (DSC) perfusion studies. The development of this tool is motivated by the lack of open source tools implemented on open platforms that allow external developers to implement their own quantification methods easily and without the need to pay for a development license. This quantification tool was developed as a plugin for the ImageJ image analysis platform using the Java programming language. A modular approach was used in the implementation of the components, in such a way that the addition of new methods can be done without breaking any of the existing functionality. For the validation process, images from seven patients with brain tumors were acquired and quantified with the presented tool and with a widely used clinical software package. The resulting perfusion parameters were then compared. Perfusion parameters and the corresponding parametric images were obtained. When no gamma-fitting is used, an excellent agreement with the tool used as a gold standard was obtained (R² > 0.8, and values are within the 95% CI limits in Bland-Altman plots). An open source tool that performs quantification of perfusion studies using magnetic resonance imaging has been developed and validated using a clinical software package. It works as an ImageJ plugin and the source code has been published with an open source license. Copyright © 2015 Elsevier Ltd. All rights reserved.
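
    For concreteness, the Bland-Altman agreement check mentioned above amounts to the following computation; the two sets of perfusion values here are synthetic stand-ins for the plugin and the clinical package.

    ```python
    # Minimal sketch of a Bland-Altman analysis: mean difference (bias) and
    # 95% limits of agreement between two tools' estimates. Synthetic values.
    import numpy as np

    rng = np.random.default_rng(4)
    gold = rng.uniform(0.5, 5.0, size=50)            # e.g., clinical package
    plugin = gold * rng.normal(1.0, 0.05, size=50)   # e.g., ImageJ plugin

    diff = plugin - gold
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)                    # 95% limits of agreement
    inside = np.mean(np.abs(diff - bias) <= loa)
    print(f"bias = {bias:.3f}, limits = [{bias - loa:.3f}, {bias + loa:.3f}]")
    print(f"fraction of points within limits: {inside:.0%}")
    ```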

  22. Characterising non-linear dynamics in nocturnal breathing patterns of healthy infants using recurrence quantification analysis.

    PubMed

    Terrill, Philip I; Wilson, Stephen J; Suresh, Sadasivam; Cooper, David M; Dakin, Carolyn

    2013-05-01

    Breathing dynamics vary between infant sleep states and are likely to exhibit non-linear behaviour. This study applied the non-linear analytical tool recurrence quantification analysis (RQA) to 400-breath-interval periods of REM and N-REM sleep, and then across the night using an overlapping moving window. The RQA variables differed between sleep states, with REM radius 150% greater than N-REM radius, and REM laminarity 79% greater than N-REM laminarity. RQA allowed the observation of temporal variations in non-linear breathing dynamics across a night's sleep at 30 s resolution, and provides a basis for quantifying changes in complex breathing dynamics with physiology and pathology. Copyright © 2013 Elsevier Ltd. All rights reserved.
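
    A minimal sketch of RQA on an inter-breath-interval series, assuming a time-delay embedding and a fixed radius; it computes recurrence rate and determinism on a toy noisy-periodic signal and is not the study's implementation.

    ```python
    # Minimal sketch of recurrence quantification analysis (RQA): build a
    # recurrence matrix from an embedded 1-D series, then compute recurrence
    # rate and determinism. Toy data; radius choice is illustrative.
    import numpy as np

    rng = np.random.default_rng(5)
    ibi = 1.0 + 0.1 * np.sin(0.3 * np.arange(400)) + 0.02 * rng.standard_normal(400)

    # Time-delay embedding (dimension 3, delay 1).
    m, tau = 3, 1
    n = len(ibi) - (m - 1) * tau
    emb = np.column_stack([ibi[i * tau:i * tau + n] for i in range(m)])

    # Recurrence matrix: pairs of embedded points closer than the radius recur.
    dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    radius = 0.1 * dist.std()
    rec = dist < radius

    n_recurrent = rec.sum() - n              # exclude self-matches (main diagonal)
    recurrence_rate = n_recurrent / (n * n - n)

    # Determinism: fraction of recurrent points on diagonal lines of length >= 2.
    diag_points = 0
    for k in range(1, n):                    # symmetric matrix: count both halves
        for line in (np.diagonal(rec, k), np.diagonal(rec, -k)):
            padded = np.concatenate(([0], line.astype(int), [0]))
            starts = np.flatnonzero(np.diff(padded) == 1)
            ends = np.flatnonzero(np.diff(padded) == -1)
            diag_points += sum(e - s for s, e in zip(starts, ends) if e - s >= 2)

    determinism = diag_points / n_recurrent
    print(f"recurrence rate = {recurrence_rate:.3f}, determinism = {determinism:.3f}")
    ```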

  23. The use of self-quantification systems for personal health information: big data management activities and prospects.

    PubMed

    Almalki, Manal; Gray, Kathleen; Sanchez, Fernando Martin

    2015-01-01

    Self-quantification is seen as an emerging paradigm for health care self-management. Self-quantification systems (SQS) can be used for tracking, monitoring, and quantifying health aspects including mental, emotional, physical, and social aspects in order to gain self-knowledge. However, there has been a lack of a systematic approach for conceptualising and mapping the essential activities that are undertaken by individuals who are using SQS in order to improve health outcomes. In this paper, we propose a new model of personal health information self-quantification systems (PHI-SQS). The PHI-SQS model describes two types of activities that individuals go through during their journey of self-managed health practice, which are 'self-quantification' and 'self-activation'. In this paper, we aimed to examine thoroughly the first type of activity in PHI-SQS, which is 'self-quantification'. Our objectives were to review the data management processes currently supported in a representative set of self-quantification tools and ancillary applications, and to provide a systematic approach for conceptualising and mapping these processes with the individuals' activities. We reviewed and compared eleven self-quantification tools and applications (Zeo Sleep Manager, Fitbit, Actipressure, MoodPanda, iBGStar, Sensaris Senspod, 23andMe, uBiome, Digifit, BodyTrack, and Wikilife) that collect three key health data types (environmental exposure, physiological patterns, genetic traits). We investigated the interaction taking place at different data flow stages between the individual user and the self-quantification technology used. We found that these eleven self-quantification tools and applications represent two major tool types (primary and secondary self-quantification systems). In each type, the individuals experience different processes and activities which are substantially influenced by the technologies' data management capabilities. Self-quantification in personal health maintenance appears promising and exciting. However, more studies are needed to support its use in this field. The proposed model will in the future lead to developing a measure for assessing the effectiveness of interventions to support using SQS for health self-management (e.g., assessing the complexity of self-quantification activities, and activation of the individuals).

  24. The use of self-quantification systems for personal health information: big data management activities and prospects

    PubMed Central

    2015-01-01

    Background Self-quantification is seen as an emerging paradigm for health care self-management. Self-quantification systems (SQS) can be used for tracking, monitoring, and quantifying health aspects including mental, emotional, physical, and social aspects in order to gain self-knowledge. However, there has been a lack of a systematic approach for conceptualising and mapping the essential activities that are undertaken by individuals who are using SQS in order to improve health outcomes. In this paper, we propose a new model of personal health information self-quantification systems (PHI-SQS). The PHI-SQS model describes two types of activities that individuals go through during their journey of self-managed health practice, which are 'self-quantification' and 'self-activation'. Objectives In this paper, we aimed to examine thoroughly the first type of activity in PHI-SQS, which is 'self-quantification'. Our objectives were to review the data management processes currently supported in a representative set of self-quantification tools and ancillary applications, and to provide a systematic approach for conceptualising and mapping these processes with the individuals' activities. Method We reviewed and compared eleven self-quantification tools and applications (Zeo Sleep Manager, Fitbit, Actipressure, MoodPanda, iBGStar, Sensaris Senspod, 23andMe, uBiome, Digifit, BodyTrack, and Wikilife) that collect three key health data types (environmental exposure, physiological patterns, genetic traits). We investigated the interaction taking place at different data flow stages between the individual user and the self-quantification technology used. Findings We found that these eleven self-quantification tools and applications represent two major tool types (primary and secondary self-quantification systems). In each type, the individuals experience different processes and activities which are substantially influenced by the technologies' data management capabilities. Conclusions Self-quantification in personal health maintenance appears promising and exciting. However, more studies are needed to support its use in this field. The proposed model will in the future lead to developing a measure for assessing the effectiveness of interventions to support using SQS for health self-management (e.g., assessing the complexity of self-quantification activities, and activation of the individuals). PMID:26019809

  25. Simultaneous determination of benznidazole and itraconazole using spectrophotometry applied to the analysis of mixture: A tool for quality control in the development of formulations

    NASA Astrophysics Data System (ADS)

    Pinho, Ludmila A. G.; Sá-Barreto, Lívia C. L.; Infante, Carlos M. C.; Cunha-Filho, Marcílio S. S.

    2016-04-01

    The aim of this work was the development of an analytical procedure using spectrophotometry for the simultaneous determination of benznidazole (BNZ) and itraconazole (ITZ) in a medicine used for the treatment of Chagas disease. To achieve this goal, the analysis of mixtures was performed by applying the Lambert-Beer law through the absorbances of BNZ and ITZ at the wavelengths 259 and 321 nm, respectively. Diverse tests were carried out for the development and validation of the method, which proved to be selective, robust, linear, and precise. The low limits of detection and quantification demonstrate its sensitivity to quantify small amounts of analytes, enabling its application for various analytical purposes, such as dissolution tests and routine assays. In short, the quantification of BNZ and ITZ by analysis of mixtures was shown to be an efficient and cost-effective alternative for the determination of these drugs in a pharmaceutical dosage form.
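
    To make the two-wavelength approach concrete: under the additive Lambert-Beer law, the mixture absorbances at 259 and 321 nm form a 2×2 linear system in the two concentrations. The coefficient values in this sketch are hypothetical, not the paper's calibration.

    ```python
    # Minimal sketch of two-component spectrophotometric analysis: solve
    #   A(259) = k11*c_BNZ + k12*c_ITZ
    #   A(321) = k21*c_BNZ + k22*c_ITZ
    # for the two concentrations. All coefficient values are illustrative.
    import numpy as np

    # Rows: wavelengths (259, 321 nm); columns: components (BNZ, ITZ).
    # Entries are hypothetical absorptivity * path-length coefficients (L/mg).
    K = np.array([[0.045, 0.012],
                  [0.008, 0.038]])
    absorbance = np.array([0.520, 0.410])  # measured mixture absorbances

    conc = np.linalg.solve(K, absorbance)  # mg/L of BNZ and ITZ
    print(f"BNZ = {conc[0]:.1f} mg/L, ITZ = {conc[1]:.1f} mg/L")
    ```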

  26. Application of Photoshop and Scion Image analysis to quantification of signals in histochemistry, immunocytochemistry and hybridocytochemistry.

    PubMed

    Tolivia, Jorge; Navarro, Ana; del Valle, Eva; Perez, Cristina; Ordoñez, Cristina; Martínez, Eva

    2006-02-01

    To describe a simple method to achieve the differential selection and subsequent quantification of the signal strength using only one section. Several methods for performing quantitative histochemistry, immunocytochemistry or hybridocytochemistry, without the use of specific commercial image analysis systems, rely on pixel-counting algorithms, which do not provide information on the amount of chromogen present in the section. Other techniques use complex algorithms to calculate the cumulative signal strength using two consecutive sections. To separate the chromogen signal we used the "Color range" option of the Adobe Photoshop program, which provides a specific file for a particular chromogen selection that can be applied to similar sections. The measurement of the chromogen signal strength of the specific staining is achieved with the Scion Image software program. The method described in this paper can also be applied to the simultaneous detection of different signals on the same section, or to different parameters (area of particles, number of particles, etc.) when the "Analyze particles" tool of the Scion program is used.

  27. Simultaneous determination of benznidazole and itraconazole using spectrophotometry applied to the analysis of mixture: A tool for quality control in the development of formulations.

    PubMed

    Pinho, Ludmila A G; Sá-Barreto, Lívia C L; Infante, Carlos M C; Cunha-Filho, Marcílio S S

    2016-04-15

    The aim of this work was the development of an analytical procedure using spectrophotometry for the simultaneous determination of benznidazole (BNZ) and itraconazole (ITZ) in a medicine used for the treatment of Chagas disease. To achieve this goal, the analysis of mixtures was performed by applying the Lambert-Beer law through the absorbances of BNZ and ITZ at the wavelengths 259 and 321 nm, respectively. Diverse tests were carried out for the development and validation of the method, which proved to be selective, robust, linear, and precise. The low limits of detection and quantification demonstrate its sensitivity to quantify small amounts of analytes, enabling its application for various analytical purposes, such as dissolution tests and routine assays. In short, the quantification of BNZ and ITZ by analysis of mixtures was shown to be an efficient and cost-effective alternative for the determination of these drugs in a pharmaceutical dosage form. Copyright © 2016. Published by Elsevier B.V.

  28. EEG-based "serious" games and monitoring tools for pain management.

    PubMed

    Sourina, Olga; Wang, Qiang; Nguyen, Minh Khoa

    2011-01-01

    EEG-based "serious games" for medical applications attracted recently more attention from the research community and industry as wireless EEG reading devices became easily available on the market. EEG-based technology has been applied in anesthesiology, psychology, etc. In this paper, we proposed and developed EEG-based "serious" games and doctor's monitoring tools that could be used for pain management. As EEG signal is considered to have a fractal nature, we proposed and develop a novel spatio-temporal fractal based algorithm for brain state quantification. The algorithm is implemented with blobby visualization tools for patient monitoring and in EEG-based "serious" games. Such games could be used by patient even at home convenience for pain management as an alternative to traditional drug treatment.

  29. A multi-center study benchmarks software tools for label-free proteome quantification

    PubMed Central

    Gillet, Ludovic C; Bernhardt, Oliver M.; MacLean, Brendan; Röst, Hannes L.; Tate, Stephen A.; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I.; Aebersold, Ruedi; Tenzer, Stefan

    2016-01-01

    The consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from SWATH-MS (sequential window acquisition of all theoretical fragment ion spectra), a method that uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test datasets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation windows setups. For consistent evaluation we developed LFQbench, an R-package to calculate metrics of precision and accuracy in label-free quantitative MS, and report the identification performance, robustness and specificity of each software tool. Our reference datasets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics. PMID:27701404

  30. A multicenter study benchmarks software tools for label-free proteome quantification.

    PubMed

    Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan

    2016-11-01

    Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.

  31. Speckle tracking analysis: a new tool for left atrial function analysis in systemic hypertension: an overview.

    PubMed

    Cameli, Matteo; Ciccone, Marco M; Maiello, Maria; Modesti, Pietro A; Muiesan, Maria L; Scicchitano, Pietro; Novo, Salvatore; Palmiero, Pasquale; Saba, Pier S; Pedrinelli, Roberto

    2016-05-01

    Speckle tracking echocardiography (STE) is an imaging technique applied to the analysis of left atrial function. STE provides a non-Doppler, angle-independent and objective quantification of left atrial myocardial deformation. Data regarding feasibility, accuracy and clinical applications of left atrial strain are rapidly gathering. This review describes the fundamental concepts of left atrial STE, illustrates its pathophysiological background and discusses its emerging role in systemic arterial hypertension.

  32. multiUQ: An intrusive uncertainty quantification tool for gas-liquid multiphase flows

    NASA Astrophysics Data System (ADS)

    Turnquist, Brian; Owkes, Mark

    2017-11-01

    Uncertainty quantification (UQ) can improve our understanding of the sensitivity of gas-liquid multiphase flows to variability about inflow conditions and fluid properties, creating a valuable tool for engineers. While non-intrusive UQ methods (e.g., Monte Carlo) are simple and robust, the cost associated with these techniques can render them unrealistic. In contrast, intrusive UQ techniques modify the governing equations by replacing deterministic variables with stochastic variables, adding complexity, but making UQ cost effective. Our numerical framework, called multiUQ, introduces an intrusive UQ approach for gas-liquid flows, leveraging a polynomial chaos expansion of the stochastic variables: density, momentum, pressure, viscosity, and surface tension. The gas-liquid interface is captured using a conservative level set approach, including a modified reinitialization equation which is robust and quadrature free. A least-squares method is leveraged to compute the stochastic interface normal and curvature needed in the continuum surface force method for surface tension. The solver is tested by applying uncertainty to one or two variables and verifying results against the Monte Carlo approach. NSF Grant #1511325.
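
    As generic background on the polynomial chaos idea (a 1-D non-intrusive illustration, not the multiUQ solver), the sketch below expands a toy response in probabilists' Hermite polynomials of a Gaussian input and compares the resulting mean and variance with Monte Carlo.

    ```python
    # Minimal sketch of polynomial chaos: project a quantity of interest onto
    # Hermite polynomials of a standard normal input via Gauss-Hermite
    # quadrature, then read mean/variance from the coefficients.
    import numpy as np
    from numpy.polynomial import hermite_e as He
    from math import factorial, sqrt, pi

    def f(xi):
        """Toy model response to an uncertain (standard normal) input."""
        return np.exp(0.3 * xi) + 0.1 * xi ** 2

    order, nquad = 6, 40
    nodes, weights = He.hermegauss(nquad)  # quadrature for weight exp(-x^2/2)
    weights = weights / sqrt(2 * pi)       # normalize to the standard normal pdf

    # c_k = E[f(xi) He_k(xi)] / k!, since E[He_j He_k] = k! * delta_jk.
    coeffs = np.array([
        np.sum(weights * f(nodes) * He.hermeval(nodes, np.eye(order + 1)[k]))
        / factorial(k)
        for k in range(order + 1)
    ])

    pce_mean = coeffs[0]
    pce_var = sum(coeffs[k] ** 2 * factorial(k) for k in range(1, order + 1))

    samples = f(np.random.default_rng(7).standard_normal(200_000))  # MC reference
    print(f"PCE: mean = {pce_mean:.5f}, var = {pce_var:.5f}")
    print(f"MC:  mean = {samples.mean():.5f}, var = {samples.var():.5f}")
    ```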

  33. Qualitative and Quantitative Control of Carbonated Cola Beverages Using 1H NMR Spectroscopy

    PubMed Central

    2012-01-01

    1H Nuclear magnetic resonance (NMR) spectroscopy (400 MHz) was used in the context of food surveillance to develop a reliable analytical tool to differentiate brands of cola beverages and to quantify selected constituents of the soft drinks. The preparation of the samples required only degassing and addition of 0.1% of TSP in D2O for locking and referencing followed by adjustment of pH to 4.5. The NMR spectra obtained can be considered as “fingerprints” and were analyzed by principal component analysis (PCA). Clusters from colas of the same brand were observed, and significant differences between premium and discount brands were found. The quantification of caffeine, acesulfame-K, aspartame, cyclamate, benzoate, hydroxymethylfurfural (HMF), sulfite ammonia caramel (E 150D), and vanillin was simultaneously possible using external calibration curves and applying TSP as internal standard. Limits of detection for caffeine, aspartame, acesulfame-K, and benzoate were 1.7, 3.5, 0.8, and 1.0 mg/L, respectively. Hence, NMR spectroscopy combined with chemometrics is an efficient tool for simultaneous identification of soft drinks and quantification of selected constituents. PMID:22356160

  34. Qualitative and quantitative control of carbonated cola beverages using ¹H NMR spectroscopy.

    PubMed

    Maes, Pauline; Monakhova, Yulia B; Kuballa, Thomas; Reusch, Helmut; Lachenmeier, Dirk W

    2012-03-21

    ¹H Nuclear magnetic resonance (NMR) spectroscopy (400 MHz) was used in the context of food surveillance to develop a reliable analytical tool to differentiate brands of cola beverages and to quantify selected constituents of the soft drinks. The preparation of the samples required only degassing and addition of 0.1% of TSP in D₂O for locking and referencing followed by adjustment of pH to 4.5. The NMR spectra obtained can be considered as "fingerprints" and were analyzed by principal component analysis (PCA). Clusters from colas of the same brand were observed, and significant differences between premium and discount brands were found. The quantification of caffeine, acesulfame-K, aspartame, cyclamate, benzoate, hydroxymethylfurfural (HMF), sulfite ammonia caramel (E 150D), and vanillin was simultaneously possible using external calibration curves and applying TSP as internal standard. Limits of detection for caffeine, aspartame, acesulfame-K, and benzoate were 1.7, 3.5, 0.8, and 1.0 mg/L, respectively. Hence, NMR spectroscopy combined with chemometrics is an efficient tool for simultaneous identification of soft drinks and quantification of selected constituents.

  35. Application of recurrence quantification analysis to automatically estimate infant sleep states using a single channel of respiratory data.

    PubMed

    Terrill, Philip I; Wilson, Stephen J; Suresh, Sadasivam; Cooper, David M; Dakin, Carolyn

    2012-08-01

    Previous work has identified that non-linear variables calculated from respiratory data vary between sleep states, and that variables derived from the non-linear analytical tool recurrence quantification analysis (RQA) are accurate infant sleep state discriminators. This study aims to apply these discriminators to automatically classify 30 s epochs of infant sleep as REM, non-REM and wake. Polysomnograms were obtained from 25 healthy infants at 2 weeks, 3, 6 and 12 months of age, and manually sleep staged as wake, REM and non-REM. Inter-breath interval data were extracted from the respiratory inductive plethysmograph, and RQA was applied to calculate radius, determinism and laminarity. Time-series statistical and spectral analysis variables were also calculated. A nested cross-validation method was used to identify the optimal feature subset, and to train and evaluate a linear discriminant analysis-based classifier. The RQA features radius and laminarity were reliably selected. Mean agreement was 79.7, 84.9, 84.0 and 79.2% at 2 weeks, 3, 6 and 12 months, and the classifier performed better than a comparison classifier not including RQA variables. The performance of this sleep-staging tool compares favourably with inter-human agreement rates, and improves upon previous systems using only respiratory data. Applications include diagnostic screening and population-based sleep research.
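
    A minimal sketch of the classification step, assuming synthetic per-epoch features in place of the real RQA and spectral variables; it fits scikit-learn's linear discriminant analysis and reports cross-validated accuracy (the paper's nested feature-selection procedure is omitted).

    ```python
    # Minimal sketch: linear discriminant analysis on per-epoch features with
    # cross-validated accuracy. The features and class means are placeholders.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(8)
    n_epochs = 600
    states = rng.integers(0, 3, n_epochs)  # 0 = wake, 1 = REM, 2 = non-REM

    # Hypothetical feature matrix: class-dependent means plus noise.
    class_means = np.array([[2.0, 0.5, 1.0],
                            [1.2, 0.9, 0.4],
                            [0.6, 1.4, 0.2]])
    features = class_means[states] + 0.5 * rng.standard_normal((n_epochs, 3))

    clf = LinearDiscriminantAnalysis()
    scores = cross_val_score(clf, features, states, cv=5)
    print(f"cross-validated accuracy: {scores.mean():.1%} +/- {scores.std():.1%}")
    ```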

  36. The effect of applied transducer force on acoustic radiation force impulse quantification within the left lobe of the liver.

    PubMed

    Porra, Luke; Swan, Hans; Ho, Chien

    2015-08-01

    Introduction: Acoustic Radiation Force Impulse (ARFI) Quantification measures shear wave velocities (SWVs) within the liver. It is a reliable method for predicting the severity of liver fibrosis and has the potential to assess fibrosis in any part of the liver, but previous research has found ARFI quantification in the right lobe more accurate than in the left lobe. A lack of standardised applied transducer force when performing ARFI quantification in the left lobe of the liver may account for some of this inaccuracy. The research hypothesis of this present study predicted that an increase in applied transducer force would result in an increase in SWVs measured. Methods: ARFI quantification within the left lobe of the liver was performed within a group of healthy volunteers (n = 28). During each examination, each participant was subjected to ARFI quantification at six different levels of transducer force applied to the epigastric abdominal wall. Results: A repeated measures ANOVA test showed that ARFI quantification was significantly affected by applied transducer force (p = 0.002). Significant pairwise comparisons using Bonferroni correction for multiple comparisons showed that with an increase in applied transducer force, there was a decrease in SWVs. Conclusion: Applied transducer force has a significant effect on SWVs within the left lobe of the liver and it may explain some of the less accurate and less reliable results in previous studies where transducer force was not taken into consideration. Future studies in the left lobe of the liver should take this into account and control for applied transducer force.

  37. MATLAB tools for improved characterization and quantification of volcanic incandescence in Webcam imagery; applications at Kilauea Volcano, Hawai'i

    USGS Publications Warehouse

    Patrick, Matthew R.; Kauahikaua, James P.; Antolik, Loren

    2010-01-01

    Webcams are now standard tools for volcano monitoring and are used at observatories in Alaska, the Cascades, Kamchatka, Hawai'i, Italy, and Japan, among other locations. Webcam images allow invaluable documentation of activity and provide a powerful comparative tool for interpreting other monitoring datastreams, such as seismicity and deformation. Automated image processing can improve the time efficiency and rigor of Webcam image interpretation, and potentially extract more information on eruptive activity. For instance, Lovick and others (2008) provided a suite of processing tools that performed such tasks as noise reduction, eliminating uninteresting images from an image collection, and detecting incandescence, with an application to dome activity at Mount St. Helens during 2007. In this paper, we present two very simple automated approaches for improved characterization and quantification of volcanic incandescence in Webcam images at Kilauea Volcano, Hawai'i. The techniques are implemented in MATLAB (version 2009b, Copyright: The Mathworks, Inc.) to take advantage of the ease of matrix operations. Incandescence is a useful indicator of the location and extent of active lava flows and also a potentially powerful proxy for activity levels at open vents. We apply our techniques to a period covering both summit and east rift zone activity at Kilauea during 2008-2009 and compare the results to complementary datasets (seismicity, tilt) to demonstrate their integrative potential. A great strength of this study is the demonstrated success of these tools in an operational setting at the Hawaiian Volcano Observatory (HVO) over the course of more than a year. Although applied only to Webcam images here, the techniques could be applied to any type of sequential images, such as time-lapse photography. We expect that these tools are applicable to many other volcano monitoring scenarios, and the two MATLAB scripts, as they are implemented at HVO, are included in the appendixes. These scripts would require minor to moderate modifications for use elsewhere, primarily to customize directory navigation. If the user has some familiarity with MATLAB, or programming in general, these modifications should be easy. Although we originally anticipated needing the Image Processing Toolbox, the scripts in the appendixes do not require it. Thus, only the base installation of MATLAB is needed. Because fairly basic MATLAB functions are used, we expect that the scripts can be run successfully by versions earlier than 2009b.

  38. A Simple Graphical Method for Quantification of Disaster Management Surge Capacity Using Computer Simulation and Process-control Tools.

    PubMed

    Franc, Jeffrey Michael; Ingrassia, Pier Luigi; Verde, Manuela; Colombo, Davide; Della Corte, Francesco

    2015-02-01

    Surge capacity, or the ability to manage an extraordinary volume of patients, is fundamental for hospital management of mass-casualty incidents. However, quantification of surge capacity is difficult and no universal standard for its measurement has emerged, nor has a standardized statistical method been advocated. As mass-casualty incidents are rare, simulation may represent a viable alternative to measure surge capacity. Hypothesis/Problem: The objective of the current study was to develop a statistical method for the quantification of surge capacity using a combination of computer simulation and simple process-control statistical tools. Length-of-stay (LOS) and patient volume (PV) were used as metrics. The use of this method was then demonstrated on a subsequent computer simulation of an emergency department (ED) response to a mass-casualty incident. In the derivation phase, 357 participants in five countries performed 62 computer simulations of an ED response to a mass-casualty incident. Benchmarks for ED response were derived from these simulations, including LOS and PV metrics for triage, bed assignment, physician assessment, and disposition. In the application phase, 13 students of the European Master in Disaster Medicine (EMDM) program completed the same simulation scenario, and the results were compared to the standards obtained in the derivation phase. Patient-volume metrics included the number of patients to be triaged, assigned to rooms, assessed by a physician, and disposed. Length-of-stay metrics included median time to triage, room assignment, physician assessment, and disposition. Simple graphical methods were used to compare the application phase group to the derived benchmarks using process-control statistical tools. The group in the application phase failed to meet the indicated standard for LOS from admission to disposition decision. This study demonstrates how simulation software can be used to derive values for objective benchmarks of ED surge capacity using PV and LOS metrics. These objective metrics can then be applied to other simulation groups using simple graphical process-control tools to provide a numeric measure of surge capacity. Repeated use in simulations of actual EDs may represent a potential means of objectively quantifying disaster management surge capacity. It is hoped that the described statistical method, which is simple and reusable, will be useful for investigators in this field to apply to their own research.
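
    As a hedged sketch of the process-control idea: benchmark runs define control limits for a length-of-stay metric, and a new group's value is checked against them. The numbers below are synthetic, not the study's benchmarks.

    ```python
    # Minimal sketch: derive benchmark control limits for a length-of-stay
    # (LOS) metric from many simulation runs, then check a new group.
    # All times are synthetic, in minutes.
    import numpy as np

    rng = np.random.default_rng(9)
    benchmark_runs = rng.lognormal(mean=4.0, sigma=0.2, size=62)  # median LOS per run
    center = np.median(benchmark_runs)
    lcl, ucl = np.percentile(benchmark_runs, [2.5, 97.5])         # control limits

    new_group_los = 95.0
    status = "within" if lcl <= new_group_los <= ucl else "outside"
    print(f"benchmark median = {center:.0f} min, limits = [{lcl:.0f}, {ucl:.0f}] min")
    print(f"new group median LOS {new_group_los:.0f} min is {status} the limits")
    ```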

  19. Testing the robustness of optimal access vessel fleet selection for operation and maintenance of offshore wind farms

    DOE PAGES

    Sperstad, Iver Bakken; Stålhane, Magnus; Dinwoodie, Iain; ...

    2017-09-23

    Optimising the operation and maintenance (O&M) and logistics strategy of offshore wind farms implies the decision problem of selecting the vessel fleet for O&M. Different strategic decision support tools can be applied to this problem, but much uncertainty remains regarding both input data and modelling assumptions. Our paper aims to investigate and ultimately reduce this uncertainty by comparing four simulation tools, one mathematical optimisation tool and one analytic spreadsheet-based tool applied to select the O&M access vessel fleet that minimizes the total O&M cost of a reference wind farm. The comparison shows that the tools generally agree on the optimal vessel fleet, but only partially agree on the relative ranking of the different vessel fleets in terms of total O&M cost. The robustness of the vessel fleet selection to various input data assumptions was tested, and the ranking was found to be particularly sensitive to the vessels' limiting significant wave height for turbine access. This parameter also shows the greatest discrepancy between the tools, which implies that its accurate quantification and modelling is crucial. The ranking is moderately sensitive to turbine failure rates and vessel day rates but less sensitive to electricity price and vessel transit speed.

  20. Testing the robustness of optimal access vessel fleet selection for operation and maintenance of offshore wind farms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sperstad, Iver Bakken; Stålhane, Magnus; Dinwoodie, Iain

    Optimising the operation and maintenance (O&M) and logistics strategy of offshore wind farms implies the decision problem of selecting the vessel fleet for O&M. Different strategic decision support tools can be applied to this problem, but much uncertainty remains regarding both input data and modelling assumptions. Our paper aims to investigate and ultimately reduce this uncertainty by comparing four simulation tools, one mathematical optimisation tool and one analytic spreadsheet-based tool applied to select the O&M access vessel fleet that minimizes the total O&M cost of a reference wind farm. The comparison shows that the tools generally agree on the optimal vessel fleet, but only partially agree on the relative ranking of the different vessel fleets in terms of total O&M cost. The robustness of the vessel fleet selection to various input data assumptions was tested, and the ranking was found to be particularly sensitive to the vessels' limiting significant wave height for turbine access. This parameter also shows the greatest discrepancy between the tools, which implies that its accurate quantification and modelling is crucial. The ranking is moderately sensitive to turbine failure rates and vessel day rates but less sensitive to electricity price and vessel transit speed.

  1. Final Report: Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marzouk, Youssef; Conrad, Patrick; Bigoni, Daniele

    QUEST (www.quest-scidac.org) is a SciDAC Institute that is focused on uncertainty quantification (UQ) in large-scale scientific computations. Our goals are to (1) advance the state of the art in UQ mathematics, algorithms, and software; and (2) provide modeling, algorithmic, and general UQ expertise, together with software tools, to other SciDAC projects, thereby enabling and guiding a broad range of UQ activities in their respective contexts. QUEST is a collaboration among six institutions (Sandia National Laboratories, Los Alamos National Laboratory, the University of Southern California, Massachusetts Institute of Technology, the University of Texas at Austin, and Duke University) with a history of joint UQ research. Our vision encompasses all aspects of UQ in leadership-class computing. This includes the well-founded setup of UQ problems; characterization of the input space given available data/information; local and global sensitivity analysis; adaptive dimensionality and order reduction; forward and inverse propagation of uncertainty; handling of application code failures, missing data, and hardware/software fault tolerance; and model inadequacy, comparison, validation, selection, and averaging. The nature of the UQ problem requires the seamless combination of data, models, and information across this landscape in a manner that provides a self-consistent quantification of requisite uncertainties in predictions from computational models. Accordingly, our UQ methods and tools span an interdisciplinary space across applied math, information theory, and statistics. The MIT QUEST effort centers on statistical inference and methods for surrogate or reduced-order modeling. MIT personnel have been responsible for the development of adaptive sampling methods, methods for approximating computationally intensive models, and software for both forward uncertainty propagation and statistical inverse problems. A key software product of the MIT QUEST effort is the MIT Uncertainty Quantification library, called MUQ (muq.mit.edu).

  2. Pore network quantification of sandstones under experimental CO2 injection using image analysis

    NASA Astrophysics Data System (ADS)

    Berrezueta, Edgar; González-Menéndez, Luís; Ordóñez-Casado, Berta; Olaya, Peter

    2015-04-01

    Automated image identification and quantification of minerals, pores, and textures, together with petrographic analysis, can be applied to improve pore system characterization in sedimentary rocks. Our case study focuses on the application of these techniques to study the evolution of the rock pore network when subjected to supercritical CO2 injection. We have proposed a Digital Image Analysis (DIA) protocol that guarantees measurement reproducibility and reliability. It can be summarized in the following stages: (i) detailed description of mineralogy and texture (before and after CO2 injection) by optical and scanning electron microscopy (SEM) techniques using thin sections; (ii) adjustment and calibration of DIA tools; (iii) a data acquisition protocol based on image capture under different polarization conditions (synchronized movement of polarizers); and (iv) study and quantification by DIA that allow (a) identification and isolation of pixels that belong to the same category (minerals vs. pores in each sample) and (b) measurement of changes in the pore network after the samples have been exposed to new conditions (in our case, SC-CO2 injection). Finally, the petrography and the measured data were interpreted using an automated approach. In our applied study, the DIA results highlight the changes observed by SEM and optical microscopy, which consisted of a porosity increase after CO2 treatment. Other changes were minor: variations in the roughness and roundness of pore edges and in pore aspect ratio, seen in the larger pore population. Additionally, statistical tests of the measured pore parameters were applied to verify that the differences observed between samples before and after CO2 injection were significant.
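
    As a minimal sketch of the pixel-classification step (stage iv-a), assuming pores appear dark in calibrated grayscale images, the snippet below thresholds two thin-section images and reports the porosity change; the filenames and threshold are placeholders.

```python
# Toy DIA step: classify pixels as pore vs. mineral by grayscale threshold
# and compare porosity before and after CO2 treatment. The real protocol
# uses calibrated multi-polarization image stacks; this is illustrative only.
import numpy as np
from PIL import Image

PORE_THRESHOLD = 60  # pores assumed dark in the hypothetical images

def porosity(path):
    gray = np.asarray(Image.open(path).convert("L"))
    return float((gray < PORE_THRESHOLD).mean())  # pore area fraction

before = porosity("thin_section_before_co2.png")  # placeholder filename
after = porosity("thin_section_after_co2.png")    # placeholder filename
print(f"porosity: {before:.3f} -> {after:.3f} (change {after - before:+.3f})")
```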

  3. Quantification of scaling exponent with Crossover type phenomena for different types of forcing in DC glow discharge plasma

    NASA Astrophysics Data System (ADS)

    Saha, Debajyoti; Shaw, Pankaj Kumar; Ghosh, Sabuj; Janaki, M. S.; Sekar Iyengar, A. N.

    2018-01-01

    We have carried out a detailed study of scaling regions using a detrended fractal analysis test, applying different types of forcing (noise, sinusoidal, and square) to the floating potential fluctuations acquired under different pressures in a DC glow discharge plasma. The transition in the dynamics is observed through recurrence plot techniques, an efficient method for observing critical regime transitions in dynamics. The complexity of the nonlinear fluctuations has been revealed with the help of recurrence quantification analysis, a suitable tool for investigating recurrence, a ubiquitous feature that provides deep insight into the dynamics of real dynamical systems. An informal test for stationarity, which checks the compatibility of nonlinear approximations to the dynamics made in different segments of a time series, has been proposed. For sinusoidal, noise, and square forcing applied to fluctuations acquired at P = 0.12 mbar, only one dominant scaling region is observed, whereas for forcing applied to fluctuations acquired at P = 0.04 mbar, two prominent scaling regions emerge reliably for different forcing amplitudes, indicating the signature of crossover phenomena. Furthermore, persistent long-range behavior has been observed in one of these scaling regions. A comprehensive study of the quantification of scaling exponents has been carried out with increasing amplitude and frequency of the sinusoidal and square forcings. The scaling exponent can be envisaged as the roughness of the time series; the method provides a single quantitative measure of the scaling exponent to quantify the correlation properties of a signal.
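
    Assuming the "detrended fractal analysis" above refers to standard detrended fluctuation analysis (DFA), the sketch below shows how a scaling exponent is obtained: integrate the series, detrend it in windows of varying size, and fit the slope of log F(n) against log n. A crossover would appear as two distinct slopes over different window-size ranges. The test signal is white noise, not plasma data.

```python
# Compact DFA: the scaling exponent alpha is the slope of log F(n) vs log n.
import numpy as np

def dfa_fluctuations(x, window_sizes):
    y = np.cumsum(x - np.mean(x))  # integrated profile
    out = []
    for n in window_sizes:
        n_seg = len(y) // n
        segments = y[: n_seg * n].reshape(n_seg, n)
        t = np.arange(n)
        resid = [seg - np.polyval(np.polyfit(t, seg, 1), t) for seg in segments]
        out.append(np.sqrt(np.mean(np.square(resid))))  # RMS after detrending
    return np.asarray(out)

x = np.random.default_rng(0).standard_normal(4096)  # stand-in signal
sizes = np.unique(np.logspace(1.0, 2.8, 15).astype(int))
F = dfa_fluctuations(x, sizes)
alpha = np.polyfit(np.log(sizes), np.log(F), 1)[0]
print(f"scaling exponent alpha = {alpha:.2f}")  # ~0.5 for white noise
```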

  4. Mass spectrometry in systems biology: an introduction.

    PubMed

    Dunn, Warwick B

    2011-01-01

    The qualitative detection, quantification, and structural characterization of analytes in biological systems are important requirements for objectives to be fulfilled in systems biology research. One analytical tool applied to a multitude of systems biology studies is mass spectrometry, particularly for the study of proteins and metabolites. Here, the role of mass spectrometry in systems biology will be assessed, the advantages and disadvantages discussed, and the instrument configurations available described. Finally, general applications will be briefly reviewed. Copyright © 2011 Elsevier Inc. All rights reserved.

  5. Titan Science Return Quantification

    NASA Technical Reports Server (NTRS)

    Weisbin, Charles R.; Lincoln, William

    2014-01-01

    Each proposal for a NASA mission concept includes a Science Traceability Matrix (STM), intended to show that what is being proposed would contribute to satisfying one or more of the agency's top-level science goals. But the information traditionally provided cannot be used directly to quantitatively compare anticipated science return. We added numerical elements to NASA's STM and developed a software tool to process the data. We then applied this methodology to evaluate a group of competing concepts for a proposed mission to Saturn's moon, Titan.
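
    The abstract does not spell out the numerical elements used; one plausible minimal form is a weighted sum in which each concept's scored contribution to each science goal is multiplied by the goal's weight, as sketched below with invented goals, weights, and scores.

```python
# Hypothetical weighted-sum scoring of mission concepts against science
# goals; every name and number here is invented for illustration.
goal_weights = {"surface_chemistry": 0.5, "atmosphere": 0.3, "interior": 0.2}

concept_scores = {
    "concept_A": {"surface_chemistry": 0.9, "atmosphere": 0.4, "interior": 0.2},
    "concept_B": {"surface_chemistry": 0.5, "atmosphere": 0.8, "interior": 0.6},
}

for name, scores in concept_scores.items():
    total = sum(goal_weights[g] * scores[g] for g in goal_weights)
    print(f"{name}: weighted science return = {total:.2f}")
```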

  6. Fingerprinting and quantification of GMOs in the agro-food sector.

    PubMed

    Taverniers, I; Van Bockstaele, E; De Loose, M

    2003-01-01

    Most strategies for analyzing GMOs in plants and derived food and feed products are based on the polymerase chain reaction (PCR) technique. In conventional PCR methods, a 'known' sequence between two specific primers is amplified. By contrast, the 'anchor PCR' technique amplifies unknown sequences adjacent to a known sequence. Because T-DNA/plant border sequences are amplified, anchor PCR is well suited to the unique identification of transgenes, including non-authorized GMOs. In this work, anchor PCR was applied to characterize the 'transgene locus' and to clarify the complete molecular structure of at least six different commercial transgenic plants. Based on sequences of T-DNA/plant border junctions obtained by anchor PCR, event-specific primers were developed. The junction fragments, together with endogenous reference gene targets, were cloned in plasmids. The latter were then used as event-specific calibrators in real-time PCR, a new technique for the accurate relative quantification of GMOs. We demonstrate here the importance of anchor PCR for identification and the usefulness of plasmid DNA calibrators in quantification strategies for GMOs throughout the agro-food sector.

  7. Quantification of pericardial effusions by echocardiography and computed tomography.

    PubMed

    Leibowitz, David; Perlman, Gidon; Planer, David; Gilon, Dan; Berman, Philip; Bogot, Naama

    2011-01-15

    Echocardiography is a well-accepted tool for the diagnosis and quantification of pericardial effusion (PEff). Given the increasing use of computed tomographic (CT) scanning, more PEffs are being initially diagnosed by computed tomography. No study has compared quantification of PEff by computed tomography and echocardiography. The objective of this study was to assess the accuracy of quantification of PEff by 2-dimensional echocardiography and computed tomography compared to the amount of pericardial fluid drained at pericardiocentesis. We retrospectively reviewed an institutional database to identify patients who underwent chest computed tomography and echocardiography before percutaneous pericardiocentesis with documentation of the amount of fluid withdrawn. Digital 2-dimensional echocardiographic and CT images were retrieved, and PEff volume was quantified by applying the formula for the volume of a prolate ellipsoid, 4/3 × π × maximal long-axis dimension/2 × maximal transverse dimension/2 × maximal anteroposterior dimension/2, to the pericardial sac and to the heart. Nineteen patients meeting the study criteria were enrolled. The amount of PEff drained was 200 to 1,700 ml (mean 674 ± 340). Echocardiographically calculated effusion volume correlated relatively well with the drained PEff volume (r = 0.73, p <0.001, mean difference -41 ± 225 ml). There was only moderate correlation between CT volume quantification and the actual volume drained (r = 0.4, p = 0.004, mean difference 158 ± 379 ml). In conclusion, echocardiography appears to be a more accurate imaging technique than computed tomography for quantitative assessment of nonloculated PEffs and should remain the primary imaging modality in these patients. Copyright © 2011 Elsevier Inc. All rights reserved.
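
    The quoted formula is simply the volume of an ellipsoid computed from its three maximal dimensions; applied to the pericardial sac and to the heart, the effusion estimate is their difference. The sketch below uses illustrative dimensions in centimetres (1 cm^3 = 1 ml).

```python
# Prolate-ellipsoid volume estimate of a pericardial effusion; the
# dimensions below are invented for illustration.
from math import pi

def ellipsoid_volume(long_axis, transverse, anteroposterior):
    """V = 4/3 * pi * (L/2) * (T/2) * (AP/2) = pi/6 * L * T * AP."""
    return (4.0 / 3.0) * pi * (long_axis / 2) * (transverse / 2) * (anteroposterior / 2)

sac = ellipsoid_volume(14.0, 12.0, 10.0)   # pericardial sac (assumed, cm)
heart = ellipsoid_volume(11.0, 9.0, 7.5)   # heart (assumed, cm)
print(f"estimated effusion volume: {sac - heart:.0f} ml")
```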

  8. freeQuant: A Mass Spectrometry Label-Free Quantification Software Tool for Complex Proteome Analysis.

    PubMed

    Deng, Ning; Li, Zhenye; Pan, Chao; Duan, Huilong

    2015-01-01

    The study of complex proteomes places higher demands on mass spectrometry-based quantification methods. In this paper, we present a mass spectrometry label-free quantification tool for complex proteomes, called freeQuant, which effectively integrates quantification with functional analysis. freeQuant consists of two well-integrated modules: label-free quantification, and functional analysis with biomedical knowledge. freeQuant supports label-free quantitative analysis that makes full use of tandem mass spectrometry (MS/MS) spectral counts, protein sequence length, shared peptides, and ion intensity. It adopts spectral counting for quantitative analysis and introduces a new method for shared peptides to accurately evaluate the abundance of isoforms. For proteins with low abundance, MS/MS total ion count coupled with spectral count is included to ensure accurate protein quantification. Furthermore, freeQuant supports large-scale functional annotation for complex proteomes. Mitochondrial proteomes from the mouse heart, the mouse liver, and the human heart were used to evaluate the usability and performance of freeQuant. The evaluation showed that the quantitative algorithms implemented in freeQuant can improve the accuracy of quantification with better dynamic range.
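
    freeQuant's exact algorithms are not given in the abstract; as a generic illustration of length-normalized spectral counting, the sketch below computes normalized spectral abundance factors (NSAF), one common way to compare abundances across proteins of different sizes. Counts and lengths are invented.

```python
# Generic NSAF calculation (not freeQuant's specific method): divide each
# protein's spectral count by its length, then renormalize across proteins.
spectral_counts = {"protA": 120, "protB": 45, "protC": 9}  # MS/MS counts (invented)
lengths = {"protA": 450, "protB": 230, "protC": 610}       # residues (invented)

saf = {p: spectral_counts[p] / lengths[p] for p in spectral_counts}
total = sum(saf.values())
nsaf = {p: v / total for p, v in saf.items()}

for protein, value in sorted(nsaf.items(), key=lambda kv: -kv[1]):
    print(f"{protein}: NSAF = {value:.3f}")
```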

  9. HPC Analytics Support. Requirements for Uncertainty Quantification Benchmarks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paulson, Patrick R.; Purohit, Sumit; Rodriguez, Luke R.

    2015-05-01

    This report outlines techniques for extending benchmark generation products so they support uncertainty quantification by benchmarked systems. We describe how uncertainty quantification requirements can be presented to candidate analytical tools supporting SPARQL. We describe benchmark data sets for evaluating uncertainty quantification, as well as an approach for using our benchmark generator to produce such data sets.

  10. Epicocconone, a sensitive and specific fluorescent dye for in situ quantification of extracellular proteins within bacterial biofilms.

    PubMed

    Randrianjatovo, I; Girbal-Neuhauser, E; Marcato-Romain, C-E

    2015-06-01

    Biofilms are ecosystems of closely associated bacteria encapsulated in an extracellular matrix mainly composed of polysaccharides and proteins. A novel approach was developed for in situ quantification of extracellular proteins (ePN) in various bacterial biofilms using epicocconone, a natural fluorescent compound that binds amine residues of proteins. Six commercial proteins were tested for their reaction with epicocconone, and bovine serum albumin (BSA) was selected for assay optimization. The optimized protocol, performed as a microassay, allowed protein amounts as low as 0.7 μg and as high as 50 μg per well to be detected. Addition of monosaccharides or polysaccharides (glucose, dextran or alginate) to the standard BSA solutions (0 to 250 μg ml(-1)) showed little or no sugar interference up to 2000 μg ml(-1), thus providing an assessment of the specificity of epicocconone for proteins. The optimized protocol was then applied to three different biofilms, and in situ quantification of ePN showed contrasting protein amounts of 22.1 ± 3.1, 38.3 ± 7.1 and 0.3 ± 0.1 μg equivalent BSA of proteins for 48-h biofilms of Pseudomonas aeruginosa, Bacillus licheniformis and Weissella confusa, respectively. Possible interference from bulk matrix compounds with the in situ quantification of proteins was also investigated by applying the standard addition method (SAM). Low error percentages were obtained, indicating correct quantification of both the ePN and the added proteins. For the first time, a specific and sensitive assay has been developed for in situ determination of ePN produced by bacterial cells. This advance should lead to an accurate, rapid tool for further protein labelling and microscopic observation of the extracellular matrix of biofilms.

  11. An Optimized Informatics Pipeline for Mass Spectrometry-Based Peptidomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Chaochao; Monroe, Matthew E.; Xu, Zhe

    2015-12-26

    Comprehensive MS analysis of the peptidome, the intracellular and intercellular products of protein degradation, has the potential to provide novel insights into endogenous proteolytic processing and its utility in disease diagnosis and prognosis. Along with advances in MS instrumentation, a plethora of proteomics data analysis tools have been applied directly to peptidomics; however, an evaluation of the currently available informatics pipelines for peptidomics data analysis has yet to be reported. In this study, we began by evaluating the results of several popular MS/MS database search engines, including MS-GF+, SEQUEST, and MS-Align+, for peptidomics data analysis, followed by identification and label-free quantification using the well-established accurate mass and time (AMT) tag approach and the newly developed informed quantification (IQ) approach, both based on direct LC-MS analysis. Our results demonstrated that MS-GF+ outperformed both SEQUEST and MS-Align+ in identifying peptidome peptides. Using a database established from the MS-GF+ peptide identifications, both the AMT tag and IQ approaches provided significantly deeper peptidome coverage and fewer missing values for each individual data set than the MS/MS methods, while achieving robust label-free quantification. Besides having an excellent correlation with the AMT tag quantification results, IQ also provided slightly higher peptidome coverage than AMT. Taken together, we propose an optimal informatics pipeline combining MS-GF+ for initial database searching with IQ (or AMT) for identification and label-free quantification for high-throughput, comprehensive, and quantitative peptidomics analysis.

  12. Meeting Report: Tissue-based Image Analysis.

    PubMed

    Saravanan, Chandra; Schumacher, Vanessa; Brown, Danielle; Dunstan, Robert; Galarneau, Jean-Rene; Odin, Marielle; Mishra, Sasmita

    2017-10-01

    Quantitative image analysis (IA) is a rapidly evolving area of digital pathology. Although not a new concept, the quantification of histological features on photomicrographs used to be cumbersome, resource-intensive, and limited to specialists and specialized laboratories. Recent technological advances like highly efficient automated whole slide digitizer (scanner) systems, innovative IA platforms, and the emergence of pathologist-friendly image annotation and analysis systems mean that quantification of features on histological digital images will become increasingly prominent in pathologists' daily professional lives. The added value of quantitative IA in pathology includes confirmation of equivocal findings noted by a pathologist, increasing the sensitivity of feature detection, quantification of signal intensity, and improving efficiency. There is no denying that quantitative IA is part of the future of pathology; however, there are also several potential pitfalls when trying to estimate volumetric features from limited 2-dimensional sections. This continuing education session on quantitative IA offered a broad overview of the field; a hands-on toxicologic pathologist experience with IA principles, tools, and workflows; a discussion on how to apply basic stereology principles in order to minimize bias in IA; and finally, a reflection on the future of IA in the toxicologic pathology field.

  13. Uncertainty Quantification of CFD Data Generated for a Model Scramjet Isolator Flowfield

    NASA Technical Reports Server (NTRS)

    Baurle, R. A.; Axdahl, E. L.

    2017-01-01

    Computational fluid dynamics is now considered to be an indispensable tool for the design and development of scramjet engine components. Unfortunately, the quantification of uncertainties is rarely addressed with anything other than sensitivity studies, so the degree of confidence associated with the numerical results remains exclusively with the subject matter expert that generated them. This practice must be replaced with a formal uncertainty quantification process for computational fluid dynamics to play an expanded role in the system design, development, and flight certification process. Given the limitations of current hypersonic ground test facilities, this expanded role is believed to be a requirement by some in the hypersonics community if scramjet engines are to be given serious consideration as a viable propulsion system. The present effort describes a simple, relatively low cost, nonintrusive approach to uncertainty quantification that includes the basic ingredients required to handle both aleatoric (random) and epistemic (lack of knowledge) sources of uncertainty. The nonintrusive nature of the approach allows the computational fluid dynamicist to perform the uncertainty quantification with the flow solver treated as a "black box". Moreover, a large fraction of the process can be automated, allowing the uncertainty assessment to be readily adapted into the engineering design and development workflow. In the present work, the approach is applied to a model scramjet isolator problem where the desire is to validate turbulence closure models in the presence of uncertainty. In this context, the relevant uncertainty sources are determined and accounted for to allow the analyst to delineate turbulence model-form errors from other sources of uncertainty associated with the simulation of the facility flow.

  14. In line NIR quantification of film thickness on pharmaceutical pellets during a fluid bed coating process.

    PubMed

    Lee, Min-Jeong; Seo, Da-Young; Lee, Hea-Eun; Wang, In-Chun; Kim, Woo-Sik; Jeong, Myung-Yung; Choi, Guang J

    2011-01-17

    Along with the risk-based approach, process analytical technology (PAT) has emerged as one of the key elements to fully implement QbD (quality-by-design). Near-infrared (NIR) spectroscopy has been extensively applied as an in-line/on-line analytical tool in the biomedical and chemical industries. In this study, in-line NIR spectroscopy was evaluated for quantifying the film thickness on pharmaceutical pellets during a fluid-bed coating process. Precise monitoring of coating thickness and its prediction with a suitable control strategy are crucial to the quality assurance of solid dosage forms, including their dissolution characteristics. Pellets of a test formulation were manufactured and coated in a fluid bed by spraying a hydroxypropyl methylcellulose (HPMC) coating solution. NIR spectra were acquired via a fiber-optic probe during the coating process, followed by multivariate analysis using partial least squares (PLS) calibration models. The actual coating thickness of the pellets was measured by two separate methods, confocal laser scanning microscopy (CLSM) and laser diffraction particle size analysis (LD-PSA). Both characterization methods gave excellent correlation results, and all determination coefficient (R(2)) values exceeded 0.995. In addition, a 70-min prediction coating experiment demonstrated that the end-point can be accurately designated via NIR in-line monitoring with appropriate calibration models. In conclusion, our approach combining in-line NIR monitoring with CLSM and LD-PSA can be applied as an effective PAT tool for fluid-bed pellet coating processes. Copyright © 2010 Elsevier B.V. All rights reserved.
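
    The chemometric core of such a setup is a PLS model mapping each NIR spectrum to a coating thickness. The sketch below fits one on synthetic stand-in data (a real calibration would use measured spectra with CLSM or LD-PSA reference thicknesses); the number of latent variables is an assumption.

```python
# Hedged PLS-calibration sketch with synthetic spectra standing in for
# NIR measurements; requires scikit-learn.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
n_samples, n_wavelengths = 40, 200
thickness = rng.uniform(5, 50, n_samples)  # reference thicknesses, um (synthetic)
signature = rng.standard_normal(n_wavelengths)
spectra = np.outer(thickness, signature) + 0.5 * rng.standard_normal((n_samples, n_wavelengths))

pls = PLSRegression(n_components=3)  # number of latent variables (assumed)
pls.fit(spectra, thickness)
predicted = np.asarray(pls.predict(spectra)).ravel()
ss_res = np.sum((thickness - predicted) ** 2)
ss_tot = np.sum((thickness - thickness.mean()) ** 2)
print(f"calibration R^2 = {1 - ss_res / ss_tot:.3f}")
```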

  15. The use of SWOT analysis to explore and prioritize conservation and development strategies for local cattle breeds.

    PubMed

    Martín-Collado, D; Díaz, C; Mäki-Tanila, A; Colinet, F; Duclos, D; Hiemstra, S J; Gandini, G

    2013-06-01

    SWOT (Strengths, Weaknesses, Opportunities and Threats) analysis is a tool widely used to aid decision making in complex systems. It is well suited to exploring the issues and measures related to the conservation and development of local breeds, as it allows the integration of the many driving factors influencing breed dynamics. We developed a quantified SWOT method as a decision-making tool for the identification and ranking of conservation and development strategies for local breeds, and applied it to a set of 13 cattle breeds from six European countries. The method has four steps: definition of the system, identification and grouping of the driving factors, quantification of the importance of the driving factors, and identification and prioritization of the strategies. The factors were determined following a multi-stakeholder approach and grouped in a three-level structure. Animal genetic resources expert groups ranked the factors, and a quantification process was implemented to identify and prioritize strategies. The proposed SWOT methodology allows the dynamics of local cattle breeds to be analyzed in a structured and systematic way. It is a flexible tool developed to assist different stakeholders in defining strategies and actions. The quantification process allows comparison of the driving factors and prioritization of the strategies for the conservation and development of local cattle breeds. We identified 99 factors across the breeds. Although the situation is very heterogeneous, the future of these breeds may be promising. The most important strengths and weaknesses were related to production systems and farmers. The most important opportunities were found in marketing new products, whereas the most relevant threats were found in selling the current products. The utility of across-breed strategies decreased as they gained specificity. Therefore, strategies at the European level should focus on general aspects and be flexible enough to be adapted to country and breed specificities.

  16. Connected component analysis of review-SEM images for sub-10nm node process verification

    NASA Astrophysics Data System (ADS)

    Halder, Sandip; Leray, Philippe; Sah, Kaushik; Cross, Andrew; Parisi, Paolo

    2017-03-01

    Analysis of hotspots is becoming more and more critical as we scale from node to node. To define true process windows at sub-14 nm technology nodes, defect inspections are often included to weed out design weak spots (often referred to as hotspots). Defect inspection at sub-28 nm nodes is a two-pass process: defect locations identified by optical inspection tools need to be reviewed by review-SEMs to understand exactly which feature is failing in the region flagged by the optical tool. The images grabbed by the review-SEM tool are used for classification but rarely for quantification. The goal of this paper is to see whether the thousands of existing review-SEM images can be used for quantification and further analysis. More specifically, we address the SEM quantification problem with connected component analysis.
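
    Connected component analysis itself is a standard labeling step: binarize the image, label connected foreground regions, and measure per-region properties such as area. The sketch below uses SciPy with an assumed binarization threshold and a placeholder filename.

```python
# Connected component labeling and area measurement on a binarized
# review-SEM image; filename and threshold are placeholders.
import numpy as np
from PIL import Image
from scipy import ndimage

gray = np.asarray(Image.open("review_sem.png").convert("L"))  # placeholder file
binary = gray > 128  # foreground/background split (assumed threshold)

labels, n_components = ndimage.label(binary)  # default 4-connectivity
areas = ndimage.sum(binary, labels, index=np.arange(1, n_components + 1))

print(f"{n_components} components; five largest areas (px):",
      np.sort(areas)[::-1][:5])
```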

  17. Quantification of collagen contraction in three-dimensional cell culture.

    PubMed

    Kopanska, Katarzyna S; Bussonnier, Matthias; Geraldo, Sara; Simon, Anthony; Vignjevic, Danijela; Betz, Timo

    2015-01-01

    Many different cell types, including fibroblasts, smooth muscle cells, endothelial cells, and cancer cells, exert traction forces on the fibrous components of the extracellular matrix. This can be observed as matrix contraction, both macro- and microscopically, in three-dimensional (3D) tissue models such as collagen type I gels. The quantification of local contraction at the micron scale, including its directionality and speed, in correlation with other parameters such as cell invasion and local protein or gene expression, can provide useful information for studying wound healing, organism development, and cancer metastasis. In this article, we present a set of tools to quantify the flow dynamics of collagen contraction induced by cells migrating out of a multicellular cancer spheroid into a 3D collagen matrix. We adapted a pseudo-speckle technique that can be applied to bright-field and fluorescent microscopy time series. The image analysis presented here is based on in-house software developed in the Matlab (Mathworks) programming environment. The analysis program is freely available from GitHub following the link: http://dx.doi.org/10.5281/zenodo.10116. This tool provides an automated technique to measure collagen contraction that can be utilized in different 3D cellular systems. Copyright © 2015 Elsevier Inc. All rights reserved.

  18. Quantification of video-taped images in microcirculation research using inexpensive imaging software (Adobe Photoshop).

    PubMed

    Brunner, J; Krummenauer, F; Lehr, H A

    2000-04-01

    Study end-points in microcirculation research are usually video-taped images rather than numeric computer print-outs. Analysis of these video-taped images for the quantification of microcirculatory parameters usually requires computer-based image analysis systems. Most software programs for image analysis are custom-made, expensive, and limited in their applicability to selected parameters and study end-points. We demonstrate herein that inexpensive, commercially available software (Adobe Photoshop), run on a Macintosh G3 computer with an inbuilt graphic capture board, provides versatile, easy-to-use tools for the quantification of digitized video images. Using images obtained by intravital fluorescence microscopy from the pre- and postischemic muscle microcirculation in the skinfold chamber model in hamsters, Photoshop allows simple and rapid quantification of (i) microvessel diameters, (ii) functional capillary density, and (iii) postischemic leakage of FITC-labeled high molecular weight dextran from postcapillary venules. We present evidence of the technical accuracy of the software tools and of a high degree of interobserver reliability. Inexpensive, commercially available imaging programs (i.e., Adobe Photoshop) provide versatile tools for image analysis with a wide range of potential applications in microcirculation research.

  19. Automated three-dimensional quantification of myocardial perfusion and brain SPECT.

    PubMed

    Slomka, P J; Radau, P; Hurwitz, G A; Dey, D

    2001-01-01

    To allow automated and objective reading of nuclear medicine tomography, we have developed a set of tools for clinical analysis of myocardial perfusion tomography (PERFIT) and Brain SPECT/PET (BRASS). We exploit algorithms for image registration and use three-dimensional (3D) "normal models" for individual patient comparisons to composite datasets on a "voxel-by-voxel basis" in order to automatically determine the statistically significant abnormalities. A multistage, 3D iterative inter-subject registration of patient images to normal templates is applied, including automated masking of the external activity before final fit. In separate projects, the software has been applied to the analysis of myocardial perfusion SPECT, as well as brain SPECT and PET data. Automatic reading was consistent with visual analysis; it can be applied to the whole spectrum of clinical images, and aid physicians in the daily interpretation of tomographic nuclear medicine images.

  20. Four human Plasmodium species quantification using droplet digital PCR.

    PubMed

    Srisutham, Suttipat; Saralamba, Naowarat; Malleret, Benoit; Rénia, Laurent; Dondorp, Arjen M; Imwong, Mallika

    2017-01-01

    Droplet digital polymerase chain reaction (ddPCR) is a partition-based PCR method built on water-oil emulsion droplet technology. It is a highly sensitive method for detecting and delineating minor alleles from complex backgrounds and provides absolute quantification of DNA targets. The ddPCR technology has been applied for the detection of many pathogens. Here, a sensitive ddPCR assay for the detection and quantification of Plasmodium species was investigated. The assay was developed for two levels of detection: genus-specific for all Plasmodium species, and detection of specific Plasmodium species. The ddPCR assay was developed based on primers and probes specific to the Plasmodium genus 18S rRNA gene. Using ddPCR for ultra-sensitive P. falciparum assessment, the lower limit of detection from concentrated DNA obtained from a high-volume (1 mL) blood sample was 11 parasites/mL. For species identification, in particular for samples with mixed infections, a duplex reaction was developed for the detection and quantification of P. falciparum/P. vivax and P. malariae/P. ovale. Amplification of each Plasmodium species in the duplex reaction showed sensitivity equal to singleplex single-species detection. The duplex ddPCR assay had higher sensitivity for identifying minor species in 32 subpatent parasitaemia samples from Cambodia, and performed better than real-time PCR. The ddPCR assay shows high sensitivity for assessing very low parasitaemia of all human Plasmodium species. This provides a useful research tool for studying the role of the asymptomatic parasite reservoir in transmission in regions aiming for malaria elimination.
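
    Absolute quantification in any digital PCR format rests on Poisson statistics: if k of n partitions are positive, the mean copy number per partition is lambda = -ln(1 - k/n), and concentration follows from the partition volume. The counts and droplet volume below are illustrative.

```python
# Poisson-based absolute quantification for digital PCR; the droplet
# volume and counts are assumed values for illustration.
from math import log

def dpcr_concentration(positive, total, partition_volume_ul=0.00085):
    lam = -log(1.0 - positive / total)  # mean copies per partition
    return lam / partition_volume_ul    # copies per microlitre of reaction

print(f"{dpcr_concentration(positive=1500, total=20000):.0f} copies/uL")
```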

  1. Non-intrusive uncertainty quantification of computational fluid dynamics simulations: notes on the accuracy and efficiency

    NASA Astrophysics Data System (ADS)

    Zimoń, Małgorzata; Sawko, Robert; Emerson, David; Thompson, Christopher

    2017-11-01

    Uncertainty quantification (UQ) is increasingly becoming an indispensable tool for assessing the reliability of computational modelling. Efficient handling of stochastic inputs, such as boundary conditions, physical properties or geometry, increases the utility of model results significantly. We discuss the application of non-intrusive generalised polynomial chaos techniques in the context of fluid engineering simulations. Deterministic and Monte Carlo integration rules are applied to a set of problems, including ordinary differential equations and the computation of aerodynamic parameters subject to random perturbations. In particular, we analyse acoustic wave propagation in a heterogeneous medium to study the effects of mesh resolution, transients, number and variability of stochastic inputs. We consider variants of multi-level Monte Carlo and perform a novel comparison of the methods with respect to numerical and parametric errors, as well as computational cost. The results provide a comprehensive view of the necessary steps in UQ analysis and demonstrate some key features of stochastic fluid flow systems.
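
    The non-intrusive idea is that the solver is only ever evaluated at chosen input samples. The toy below compares a deterministic Gauss-Hermite rule with Monte Carlo for the mean of a model output under a standard normal input; the one-line "model" stands in for an expensive simulation.

```python
# Deterministic quadrature vs. Monte Carlo for a black-box model mean;
# the model function is a stand-in, not a CFD solver.
import numpy as np

def model(xi):
    return np.exp(0.3 * xi)  # placeholder for an expensive simulation

# Probabilists' Gauss-Hermite rule: E[f(X)] ~ sum(w_i * f(x_i)) / sqrt(2*pi)
nodes, weights = np.polynomial.hermite_e.hermegauss(8)
quad_mean = np.sum(weights * model(nodes)) / np.sqrt(2 * np.pi)

mc_mean = model(np.random.default_rng(2).standard_normal(10_000)).mean()

exact = np.exp(0.3 ** 2 / 2)  # closed form for this toy model
print(f"quadrature {quad_mean:.5f}, Monte Carlo {mc_mean:.5f}, exact {exact:.5f}")
```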

  2. Quantitative bioimaging of p-boronophenylalanine in thin liver tissue sections as a tool for treatment planning in boron neutron capture therapy.

    PubMed

    Reifschneider, Olga; Schütz, Christian L; Brochhausen, Christoph; Hampel, Gabriele; Ross, Tobias; Sperling, Michael; Karst, Uwe

    2015-03-01

    An analytical method using laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS) was developed and applied to assess enrichment of 10B-containing p-boronophenylalanine-fructose (BPA-f) and its pharmacokinetic distribution in human tissues after application for boron neutron capture therapy (BNCT). High spatial resolution (50 μm) and limits of detection in the low parts-per-billion range were achieved using a Nd:YAG laser of 213 nm wavelength. External calibration by means of 10B-enriched standards based on whole blood proved to yield precise quantification results. Using this calibration method, quantification of 10B in cancerous and healthy tissue was carried out. Additionally, the distribution of 11B was investigated, providing 10B enrichment in the investigated tissues. Quantitative imaging of 10B by means of LA-ICP-MS was demonstrated as a new option to characterise the efficacy of boron compounds for BNCT.

  3. Fluorophores, environments, and quantification techniques in the analysis of transmembrane helix interaction using FRET.

    PubMed

    Khadria, Ambalika S; Senes, Alessandro

    2015-07-01

    Förster resonance energy transfer (FRET) has been widely used as a spectroscopic tool in vitro to study the interactions between transmembrane (TM) helices in detergent and lipid environments. This technique has been instrumental to many studies that have greatly contributed to quantitative understanding of the physical principles that govern helix-helix interactions in the membrane. These studies have also improved our understanding of the biological role of oligomerization in membrane proteins. In this review, we focus on the combinations of fluorophores used, the membrane mimetic environments, and measurement techniques that have been applied to study model systems as well as biological oligomeric complexes in vitro. We highlight the different formalisms used to calculate FRET efficiency and the challenges associated with accurate quantification. The goal is to provide the reader with a comparative summary of the relevant literature for planning and designing FRET experiments aimed at measuring TM helix-helix associations. © 2015 Wiley Periodicals, Inc.
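
    Two of the standard formalisms mentioned above are compact enough to state directly: efficiency from donor quenching, E = 1 - F_DA/F_D, and the Förster distance dependence E = 1/(1 + (r/R0)^6), which inverts to an estimate of the donor-acceptor separation. The intensities and R0 below are illustrative.

```python
# Standard FRET relations; the fluorescence intensities and Forster
# radius below are invented for illustration.
def fret_efficiency(f_donor_acceptor, f_donor_only):
    """E = 1 - F_DA / F_D (donor quenching)."""
    return 1.0 - f_donor_acceptor / f_donor_only

def separation(e, r0_angstrom):
    """Invert E = 1 / (1 + (r/R0)^6) for the donor-acceptor distance r."""
    return r0_angstrom * (1.0 / e - 1.0) ** (1.0 / 6.0)

E = fret_efficiency(f_donor_acceptor=420.0, f_donor_only=1000.0)
print(f"E = {E:.2f}, r = {separation(E, r0_angstrom=50.0):.1f} Angstrom")
```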

  4. Fast quantification of bovine milk proteins employing external cavity-quantum cascade laser spectroscopy.

    PubMed

    Schwaighofer, Andreas; Kuligowski, Julia; Quintás, Guillermo; Mayer, Helmut K; Lendl, Bernhard

    2018-06-30

    Analysis of proteins in bovine milk is usually tackled by time-consuming analytical approaches involving wet-chemical, multi-step sample clean-up procedures. The use of external cavity-quantum cascade laser (EC-QCL) based IR spectroscopy was evaluated as an alternative screening tool for direct and simultaneous quantification of individual proteins (i.e. casein and β-lactoglobulin) and total protein content in commercial bovine milk samples. Mid-IR spectra of protein standard mixtures were used for building partial least squares (PLS) regression models. A sample set comprising different milk types (pasteurized; differently processed extended shelf life, ESL; ultra-high temperature, UHT) was analysed and results were compared to reference methods. Concentration values of the QCL-IR spectroscopy approach obtained within several minutes are in good agreement with reference methods involving multiple sample preparation steps. The potential application as a fast screening method for estimating the heat load applied to liquid milk is demonstrated. Copyright © 2018 Elsevier Ltd. All rights reserved.

  5. Π4U: A high performance computing framework for Bayesian uncertainty quantification of complex models

    NASA Astrophysics Data System (ADS)

    Hadjidoukas, P. E.; Angelikopoulos, P.; Papadimitriou, C.; Koumoutsakos, P.

    2015-03-01

    We present Π4U, an extensible framework for non-intrusive Bayesian Uncertainty Quantification and Propagation (UQ+P) of complex and computationally demanding physical models that can exploit massively parallel computer architectures. The framework incorporates Laplace asymptotic approximations as well as stochastic algorithms, along with distributed numerical differentiation and task-based parallelism for heterogeneous clusters. Sampling is based on the Transitional Markov Chain Monte Carlo (TMCMC) algorithm and its variants. The optimization tasks associated with the asymptotic approximations are treated via the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). A modified subset simulation method is used for posterior reliability measurements of rare events. The framework accommodates scheduling of multiple physical model evaluations based on an adaptive load balancing library and shows excellent scalability. In addition to the software framework, we provide guidelines as to the applicability and efficiency of Bayesian tools when applied to computationally demanding physical models. Theoretical and computational developments are demonstrated with applications drawn from molecular dynamics, structural dynamics and granular flow.

  6. 39 CFR 3050.1 - Definitions applicable to this part.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... was applied by the Commission in its most recent Annual Compliance Determination unless a different analytical principle subsequently was accepted by the Commission in a final rule. (b) Accepted quantification technique refers to a quantification technique that was applied in the most recent iteration of the periodic...

  7. 39 CFR 3050.1 - Definitions applicable to this part.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... was applied by the Commission in its most recent Annual Compliance Determination unless a different analytical principle subsequently was accepted by the Commission in a final rule. (b) Accepted quantification technique refers to a quantification technique that was applied in the most recent iteration of the periodic...

  8. Reliable estimates of predictive uncertainty for an Alpine catchment using a non-parametric methodology

    NASA Astrophysics Data System (ADS)

    Matos, José P.; Schaefli, Bettina; Schleiss, Anton J.

    2017-04-01

    Uncertainty affects hydrological modelling efforts from the very measurements (or forecasts) that serve as inputs to the more or less inaccurate predictions that are produced. Uncertainty is truly inescapable in hydrology and yet, due to the theoretical and technical hurdles associated with its quantification, it is at times still neglected or estimated only qualitatively. In recent years, the scientific community has made a significant effort towards quantifying this hydrologic prediction uncertainty. Despite this, most of the developed methodologies are computationally demanding, are complex from a theoretical point of view, require substantial expertise to be employed, and are constrained by a number of assumptions about the model error distribution. These assumptions limit the reliability of many methods in the case of errors exhibiting non-normality, heteroscedasticity, or autocorrelation. The present contribution builds on a non-parametric data-driven approach that was developed for uncertainty quantification in operational (real-time) forecasting settings. The approach is based on the concept of Pareto optimality and can be used as a standalone forecasting tool or as a postprocessor. By virtue of its non-parametric nature and general operating principle, it can be applied directly and with ease to predictions of streamflow, water stage, or even accumulated runoff. It is also capable of coping with high heteroscedasticity and seasonal hydrological regimes (e.g. snowmelt- and rainfall-driven events in the same catchment). Finally, the training and operation of the model are very fast, making it a tool particularly suited to operational use. To illustrate its practical use, the uncertainty quantification method is coupled with a process-based hydrological model to produce statistically reliable forecasts for an Alpine catchment located in Switzerland. Results are presented and discussed in terms of their reliability and resolution.
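
    The Pareto-optimality machinery itself is beyond an abstract, but the simplest non-parametric post-processor conveys the flavour: wrap a point forecast in empirical quantiles of past model errors, with no distributional assumption. The sketch below is a generic illustration, not the authors' method, and all data are synthetic.

```python
# Generic non-parametric predictive interval from empirical error
# quantiles; synthetic stand-in data, not the authors' Pareto approach.
import numpy as np

rng = np.random.default_rng(3)
observed = 10 + rng.gamma(2.0, 2.0, 500)        # past streamflow (synthetic)
simulated = observed + rng.normal(0, 2.0, 500)  # past model output (synthetic)

errors = observed - simulated
lo, hi = np.quantile(errors, [0.05, 0.95])      # 90% empirical error band

forecast = 15.0  # today's point forecast (synthetic)
print(f"90% predictive interval: [{forecast + lo:.1f}, {forecast + hi:.1f}]")
```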

  9. Real-time PCR for detection and quantification of the biocontrol agent Trichoderma atroviride strain SC1 in soil.

    PubMed

    Savazzini, Federica; Longa, Claudia Maria Oliveira; Pertot, Ilaria; Gessler, Cesare

    2008-05-01

    Trichoderma (Hypocreales, Ascomycota) is a widespread genus in nature, and several Trichoderma species are used in industrial processes and as biocontrol agents against crop diseases. It is very important that the persistence and spread of microorganisms deliberately released into the environment are accurately monitored. Real-time PCR methods for genus/species/strain identification of microorganisms are currently being developed to overcome the difficulties of classical microbiological and enzymatic methods for monitoring these populations. The aim of the present study was to develop and validate a specific real-time PCR-based method for detecting Trichoderma atroviride SC1 in soil. We developed a primer and TaqMan probe set built on single-base mutations in an endochitinase gene. This tool is highly specific for the detection and quantification of the SC1 strain. The limits of detection and quantification, calculated from the relative standard deviation, were 6,000 and 20,000 haploid genome copies per gram of soil, respectively. Together with the short turnaround time of the procedure, which allows many soil samples to be evaluated within a short time period, these results suggest that this method could be successfully used to trace the fate of T. atroviride SC1 applied as an open-field biocontrol agent.

  10. Quantitative real-time PCR approaches for microbial community studies in wastewater treatment systems: applications and considerations.

    PubMed

    Kim, Jaai; Lim, Juntaek; Lee, Changsoo

    2013-12-01

    Quantitative real-time PCR (qPCR) has been widely used in recent environmental microbial ecology studies as a tool for detecting and quantifying microorganisms of interest, which aids in better understandings of the complexity of wastewater microbial communities. Although qPCR can be used to provide more specific and accurate quantification than other molecular techniques, it does have limitations that must be considered when applying it in practice. This article reviews the principle of qPCR quantification and its applications to microbial ecology studies in various wastewater treatment environments. Here we also address several limitations of qPCR-based approaches that can affect the validity of quantification data: template nucleic acid quality, nucleic acid extraction efficiency, specificity of group-specific primers and probes, amplification of nonviable DNA, gene copy number variation, and limited number of sequences in the database. Even with such limitations, qPCR is reportedly among the best methods for quantitatively investigating environmental microbial communities. The application of qPCR is and will continue to be increasingly common in studies of wastewater treatment systems. To obtain reliable analyses, however, the limitations that have often been overlooked must be carefully considered when interpreting the results. Copyright © 2013 Elsevier Inc. All rights reserved.

  11. Digital Quantification of Goldmann Visual Fields (GVF) as a Means for Genotype-Phenotype Comparisons and Detection of Progression in Retinal Degenerations

    PubMed Central

    Zahid, Sarwar; Peeler, Crandall; Khan, Naheed; Davis, Joy; Mahmood, Mahdi; Heckenlively, John; Jayasundera, Thiran

    2015-01-01

    Purpose: To develop a reliable and efficient digital method to quantify planimetric Goldmann visual field (GVF) data to monitor disease course and treatment responses in retinal degenerative diseases. Methods: A novel method to digitally quantify GVFs using Adobe Photoshop CS3 was developed for comparison to traditional digital planimetry (Placom 45C digital planimeter; EngineerSupply, Lynchburg, Virginia, USA). GVFs from 20 eyes of 10 patients with Stargardt disease were quantified to assess the difference between the two methods (a total of 230 measurements per method). This quantification approach was also applied to 13 patients with X-linked retinitis pigmentosa (XLRP) with mutations in RPGR. Results: Overall, measurements using Adobe Photoshop were performed more rapidly than those using conventional planimetry. Photoshop measurements also exhibited less inter- and intra-observer variability. GVF areas for the I4e isopter in patients with the same mutation in RPGR who were close in age had similar qualitative and quantitative areas. Conclusions: Quantification of GVFs using Adobe Photoshop is quicker, more reliable, and less user-dependent than conventional digital planimetry. It will be a useful tool for both retrospective and prospective studies of disease course as well as for monitoring treatment response in clinical trials for retinal degenerative diseases. PMID:24664690

  12. Spectral Analysis of Dynamic PET Studies: A Review of 20 Years of Method Developments and Applications.

    PubMed

    Veronese, Mattia; Rizzo, Gaia; Bertoldo, Alessandra; Turkheimer, Federico E

    2016-01-01

    In Positron Emission Tomography (PET), spectral analysis (SA) allows the quantification of dynamic data by relating the radioactivity measured by the scanner over time to the underlying physiological processes of the system under investigation. Among the different approaches to the quantification of PET data, SA is based on the linear solution of the Laplace transform inversion, whereby the measured arterial and tissue time-activity curves of a radiotracer are used to calculate the input response function of the tissue. In recent years, SA has been used with a large number of PET tracers in brain and non-brain applications, demonstrating that it is a very flexible and robust method for PET data analysis. Unlike the most common PET quantification approaches, which adopt standard nonlinear estimation of compartmental models or some linear simplifications, SA can be applied without defining any specific model configuration and has demonstrated very good sensitivity to the underlying kinetics. This characteristic makes it useful as an investigative tool, especially for the analysis of novel PET tracers. The purpose of this work is to offer an overview of SA, to discuss the advantages and limitations of the methodology, and to describe its applications in the PET field.
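
    The SA model expresses the tissue curve as a non-negative sum of the plasma input convolved with exponential basis functions, fitted by non-negative least squares over a fixed grid of decay rates. The sketch below runs that fit on simulated curves; the grid range, time base, and amplitudes are illustrative.

```python
# Spectral-analysis-style fit: tissue = sum_j alpha_j * (Cp conv exp(-beta_j t)),
# with alpha_j >= 0 solved by NNLS; all curves here are simulated.
import numpy as np
from scipy.optimize import nnls

dt = 1.0
t = np.arange(0, 60, dt)                    # minutes, uniform grid
cp = t * np.exp(-t / 4.0)                   # synthetic plasma input curve

betas = np.logspace(-3, 1, 30)              # grid of decay rates, 1/min (assumed)
basis = np.column_stack(
    [np.convolve(cp, np.exp(-b * t))[: len(t)] * dt for b in betas]
)

tissue = 0.7 * basis[:, 10] + 0.2 * basis[:, 22]  # simulated "measured" curve
alpha, _ = nnls(basis, tissue)              # non-negative spectral coefficients

print("recovered components at beta =", betas[alpha > 1e-6])
```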

  13. Digital quantification of Goldmann visual fields (GVFs) as a means for genotype-phenotype comparisons and detection of progression in retinal degenerations.

    PubMed

    Zahid, Sarwar; Peeler, Crandall; Khan, Naheed; Davis, Joy; Mahmood, Mahdi; Heckenlively, John R; Jayasundera, Thiran

    2014-01-01

    To develop a reliable and efficient digital method to quantify planimetric Goldmann visual field (GVF) data to monitor disease course and treatment responses in retinal degenerative diseases. A novel method to digitally quantify GVFs using Adobe Photoshop CS3 was developed for comparison to traditional digital planimetry (Placom 45C digital planimeter; Engineer Supply, Lynchburg, Virginia, USA). GVFs from 20 eyes from 10 patients with Stargardt disease were quantified to assess the difference between the two methods (a total of 230 measurements per method). This quantification approach was also applied to 13 patients with X-linked retinitis pigmentosa (XLRP) with mutations in RPGR. Overall, measurements using Adobe Photoshop were more rapidly performed than those using conventional planimetry. Photoshop measurements also exhibited less inter- and intraobserver variability. GVF areas for the I4e isopter in patients with the same mutation in RPGR who were nearby in age had similar qualitative and quantitative areas. Quantification of GVFs using Adobe Photoshop is quicker, more reliable, and less user dependent than conventional digital planimetry. It will be a useful tool for both retrospective and prospective studies of disease course as well as for monitoring treatment response in clinical trials for retinal degenerative diseases.

  14. Improved Method for the Detection and Quantification of Naegleria fowleri in Water and Sediment Using Immunomagnetic Separation and Real-Time PCR

    PubMed Central

    Mull, Bonnie J.; Narayanan, Jothikumar; Hill, Vincent R.

    2013-01-01

    Primary amebic meningoencephalitis (PAM) is a rare and typically fatal infection caused by the thermophilic free-living ameba, Naegleria fowleri. In 2010, the first confirmed case of PAM acquired in Minnesota highlighted the need for improved detection and quantification methods in order to study the changing ecology of N. fowleri and to evaluate potential risk factors for increased exposure. An immunomagnetic separation (IMS) procedure and real-time PCR TaqMan assay were developed to recover and quantify N. fowleri in water and sediment samples. When one liter of lake water was seeded with N. fowleri strain CDC:V212, the method had an average recovery of 46% and detection limit of 14 amebas per liter of water. The method was then applied to sediment and water samples with unknown N. fowleri concentrations, resulting in positive direct detections by real-time PCR in 3 out of 16 samples and confirmation of N. fowleri culture in 6 of 16 samples. This study has resulted in a new method for detection and quantification of N. fowleri in water and sediment that should be a useful tool to facilitate studies of the physical, chemical, and biological factors associated with the presence and dynamics of N. fowleri in environmental systems. PMID:24228172

  15. Novel method to detect microRNAs using chip-based QuantStudio 3D digital PCR.

    PubMed

    Conte, Davide; Verri, Carla; Borzi, Cristina; Suatoni, Paola; Pastorino, Ugo; Sozzi, Gabriella; Fortunato, Orazio

    2015-10-23

    Research efforts for the management of cancer, in particular lung cancer, are directed at identifying new strategies for its early detection. MicroRNAs (miRNAs) are a promising new class of circulating biomarkers for cancer detection, but lack of consensus on data normalization methods has limited the diagnostic potential of circulating miRNAs. There is growing interest in techniques that allow absolute quantification of miRNAs, which could be useful for early diagnosis. Recently, digital PCR, mainly based on droplet generation, has emerged as an affordable technology for precise and absolute quantification of nucleic acids. In this work, we describe a new approach for profiling circulating miRNAs in plasma samples using a chip-based platform, the QuantStudio 3D digital PCR. The proposed method was validated using synthetic oligonucleotides at serial dilutions in plasma samples of lung cancer patients and in lung tissues and cell lines. Given its reproducibility and reliability, our approach could potentially be applied to the identification and quantification of miRNAs in other biological samples such as circulating exosomes or protein complexes. As chip-based digital PCR becomes more established, it should prove a robust tool for quantitative assessment of miRNA copy number for the diagnosis of lung cancer and other diseases.

  16. Literacy and Language Education: The Quantification of Learning

    ERIC Educational Resources Information Center

    Gibb, Tara

    2015-01-01

    This chapter describes international policy contexts of adult literacy and language assessment and the shift toward standardization through measurement tools. It considers the implications the quantification of learning outcomes has for pedagogy and practice and for the social inclusion of transnational migrants.

  17. Recurrence quantification analysis applied to spatiotemporal pattern analysis in high-density mapping of human atrial fibrillation.

    PubMed

    Zeemering, Stef; Bonizzi, Pietro; Maesen, Bart; Peeters, Ralf; Schotten, Ulrich

    2015-01-01

    Spatiotemporal complexity of atrial fibrillation (AF) patterns is often quantified by annotated intracardiac contact mapping. We introduce a new approach that applies recurrence plot (RP) construction followed by recurrence quantification analysis (RQA) to epicardial atrial electrograms, recorded with a high-density grid of electrodes. In 32 patients with no history of AF (aAF, n=11), paroxysmal AF (PAF, n=12) and persistent AF (persAF, n=9), RPs were constructed using a phase space electrogram embedding dimension equal to the estimated AF cycle length. Spatial information was incorporated by 1) averaging the recurrence over all electrodes, and 2) by applying principal component analysis (PCA) to the matrix of embedded electrograms and selecting the first principal component as a representation of spatial diversity. Standard RQA parameters were computed on the constructed RPs and correlated to the number of fibrillation waves per AF cycle (NW). Averaged RP RQA parameters showed no correlation with NW. Correlations improved when applying PCA, with maximum correlation achieved between RP threshold and NW (RR1%, r=0.68, p < 0.001) and RP determinism (DET, r=-0.64, p < 0.001). All studied RQA parameters based on the PCA RP were able to discriminate between persAF and aAF/PAF (DET persAF 0.40 ± 0.11 vs. 0.59 ± 0.14/0.62 ± 0.16, p < 0.01). RP construction and RQA combined with PCA provide a quick and reliable tool to visualize dynamical behaviour and to assess the complexity of contact mapping patterns in AF.
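
    For readers who want to see the RQA definitions concretely, the sketch below builds a thresholded recurrence matrix from a time-delay-embedded signal and computes the recurrence rate and determinism (DET). It uses the textbook definitions on a synthetic signal with arbitrary embedding parameters; the paper's AF-specific steps (cycle-length-based embedding, electrode averaging, PCA) are omitted.

      import numpy as np

      def recurrence_matrix(x, dim, delay, eps):
          """Recurrence plot of a scalar series after time-delay embedding."""
          n = len(x) - (dim - 1) * delay
          emb = np.column_stack([x[i * delay:i * delay + n] for i in range(dim)])
          dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=2)
          return (dist <= eps).astype(int)

      def determinism(rp, lmin=2):
          """DET: share of recurrent points on diagonal lines of length >= lmin
          (the trivial main diagonal is excluded)."""
          n = rp.shape[0]
          line_points = 0
          for k in range(-(n - 1), n):
              if k == 0:
                  continue
              run = 0
              for v in list(np.diagonal(rp, k)) + [0]:   # sentinel flushes last run
                  if v:
                      run += 1
                  else:
                      if run >= lmin:
                          line_points += run
                      run = 0
          return line_points / max(rp.sum() - n, 1)

      rng = np.random.default_rng(0)
      x = np.sin(np.linspace(0, 40, 600)) + 0.1 * rng.standard_normal(600)
      rp = recurrence_matrix(x, dim=3, delay=5, eps=0.5)
      print("RR =", rp.mean(), "DET =", determinism(rp))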

  18. Determination of Zn/Cu ratio and oligoelements in serum samples by total reflection X-ray fluorescence spectrometry for cancer diagnosis

    NASA Astrophysics Data System (ADS)

    Marcó P., L. M.; Jiménez, E.; Hernández C., E. A.; Rojas, A.; Greaves, E. D.

    2001-11-01

    The method of quantification using the Compton peak as an internal standard, developed in a previous work, was applied to the routine determination of Fe, Cu, Zn and Se in serum samples from normal individuals and cancer patients by total reflection X-ray fluorescence spectrometry. Samples were classified according to age and sex of the donor, in order to determine reference values for normal individuals. Results indicate that the Zn/Cu ratio and the Cu concentration could prove to be useful tools for cancer diagnosis. Significant differences in these parameters between the normal and cancer group were found for all age ranges. The multielemental character of the technique, coupled with the small amounts of sample required and the short analysis time make it a valuable tool in clinical analysis.

  19. Subvisible (2-100 μm) Particle Analysis During Biotherapeutic Drug Product Development: Part 1, Considerations and Strategy.

    PubMed

    Narhi, Linda O; Corvari, Vincent; Ripple, Dean C; Afonina, Nataliya; Cecchini, Irene; Defelippis, Michael R; Garidel, Patrick; Herre, Andrea; Koulov, Atanas V; Lubiniecki, Tony; Mahler, Hanns-Christian; Mangiagalli, Paolo; Nesta, Douglas; Perez-Ramirez, Bernardo; Polozova, Alla; Rossi, Mara; Schmidt, Roland; Simler, Robert; Singh, Satish; Spitznagel, Thomas M; Weiskopf, Andrew; Wuchner, Klaus

    2015-06-01

    Measurement and characterization of subvisible particles (defined here as those ranging in size from 2 to 100 μm), including proteinaceous and nonproteinaceous particles, is an important part of every stage of protein therapeutic development. The tools used and the ways in which the information generated is applied depend on the particular product development stage, the amount of material, and the time available for the analysis. In order to compare results across laboratories and products, it is important to harmonize nomenclature, experimental protocols, data analysis, and interpretation. In this manuscript on perspectives on subvisible particles in protein therapeutic drug products, we focus on the tools available for detection, characterization, and quantification of these species and the strategy around their application. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.

  20. Technical advances in proteomics: new developments in data-independent acquisition.

    PubMed

    Hu, Alex; Noble, William S; Wolf-Yadlin, Alejandro

    2016-01-01

    The ultimate aim of proteomics is to fully identify and quantify the entire complement of proteins and post-translational modifications in biological samples of interest. For the last 15 years, liquid chromatography-tandem mass spectrometry (LC-MS/MS) in data-dependent acquisition (DDA) mode has been the standard for proteomics when sampling breadth and discovery were the main objectives; multiple reaction monitoring (MRM) LC-MS/MS has been the standard for targeted proteomics when precise quantification, reproducibility, and validation were the main objectives. Recently, improvements in mass spectrometer design and bioinformatics algorithms have resulted in the rediscovery and development of another sampling method: data-independent acquisition (DIA). DIA comprehensively and repeatedly samples every peptide in a protein digest, producing a complex set of mass spectra that is difficult to interpret without external spectral libraries. Currently, DIA approaches the identification breadth of DDA while achieving the reproducible quantification characteristic of MRM or its newest version, parallel reaction monitoring (PRM). In comparative de novo identification and quantification studies in human cell lysates, DIA identified up to 89% of the proteins detected in a comparable DDA experiment while providing reproducible quantification of over 85% of them. DIA analysis aided by spectral libraries derived from prior DIA experiments or auxiliary DDA data produces identification and quantification as reproducible and precise as that achieved by MRM/PRM, except on low‑abundance peptides that are obscured by stronger signals. DIA is still a work in progress toward the goal of sensitive, reproducible, and precise quantification without external spectral libraries. New software tools applied to DIA analysis have to deal with deconvolution of complex spectra as well as proper filtering of false positives and false negatives. However, the future outlook is positive, and various researchers are working on novel bioinformatics techniques to address these issues and increase the reproducibility, fidelity, and identification breadth of DIA.

  1. An open tool for input function estimation and quantification of dynamic PET FDG brain scans.

    PubMed

    Bertrán, Martín; Martínez, Natalia; Carbajal, Guillermo; Fernández, Alicia; Gómez, Álvaro

    2016-08-01

    Positron emission tomography (PET) analysis of clinical studies is mostly restricted to qualitative evaluation. Quantitative analysis of PET studies is highly desirable to be able to compute an objective measurement of the process of interest in order to evaluate treatment response and/or compare patient data. But implementation of quantitative analysis generally requires the determination of the input function: the arterial blood or plasma activity which indicates how much tracer is available for uptake in the brain. The purpose of our work was to share with the community an open software tool that can assist in the estimation of this input function, and the derivation of a quantitative map from the dynamic PET study. Arterial blood sampling during the PET study is the gold standard method to get the input function, but is uncomfortable and risky for the patient, so it is rarely used in routine studies. To overcome the lack of a direct input function, different alternatives have been devised and are available in the literature. These alternatives derive the input function from the PET image itself (image-derived input function) or from data gathered from previous similar studies (population-based input function). In this article, we present ongoing work that includes the development of a software tool that integrates several methods with novel strategies for the segmentation of blood pools and parameter estimation. The tool is available as an extension to the 3D Slicer software. Tests on phantoms were conducted in order to validate the implemented methods. We evaluated the segmentation algorithms over a range of acquisition conditions and vasculature size. Input function estimation algorithms were evaluated against ground truth of the phantoms, as well as on their impact over the final quantification map. End-to-end use of the tool yields quantification maps with low relative error in the estimated influx versus ground truth on phantoms. The main contribution of this article is the development of an open-source, free-to-use tool that encapsulates several well-known methods for the estimation of the input function and the quantification of dynamic PET FDG studies. Some alternative strategies are also proposed and implemented in the tool for the segmentation of blood pools and parameter estimation. The tool was tested on phantoms with encouraging results that suggest that even bloodless estimators could provide a viable alternative to blood sampling for quantification using graphical analysis. The open tool is a promising opportunity for collaboration among investigators and further validation on real studies.
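
    The abstract mentions quantification using graphical analysis; for FDG the standard graphical method is the Patlak plot, in which the influx constant Ki is the late-time slope of transformed tissue and plasma curves. A minimal sketch with synthetic curves and an assumed equilibration time t* = 20 min (not the 3D Slicer extension's code):

      import numpy as np
      from scipy.integrate import cumulative_trapezoid

      def patlak_ki(t, c_plasma, c_tissue, t_star=20.0):
          # Patlak transform: y = Ct/Cp versus x = int_0^t Cp dtau / Cp;
          # after equilibration (t >= t*) the points are linear with slope Ki.
          x = cumulative_trapezoid(c_plasma, t, initial=0.0) / c_plasma
          y = c_tissue / c_plasma
          late = t >= t_star
          ki, _intercept = np.polyfit(x[late], y[late], 1)
          return ki

      # Synthetic check with a known influx constant Ki = 0.03 /min:
      t = np.linspace(0.5, 60, 40)                    # minutes
      cp = 10.0 * np.exp(-0.1 * t) + 1.0              # assumed plasma input function
      ct = 0.03 * cumulative_trapezoid(cp, t, initial=0.0) + 0.4 * cp
      print(patlak_ki(t, cp, ct))                     # ~0.03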

  2. Optimized methods for total nucleic acid extraction and quantification of the bat white-nose syndrome fungus, Pseudogymnoascus destructans, from swab and environmental samples.

    PubMed

    Verant, Michelle L; Bohuski, Elizabeth A; Lorch, Jeffery M; Blehert, David S

    2016-03-01

    The continued spread of white-nose syndrome and its impacts on hibernating bat populations across North America has prompted nationwide surveillance efforts and the need for high-throughput, noninvasive diagnostic tools. Quantitative real-time polymerase chain reaction (qPCR) analysis has been increasingly used for detection of the causative fungus, Pseudogymnoascus destructans, in both bat- and environment-associated samples and provides a tool for quantification of fungal DNA useful for research and monitoring purposes. However, precise quantification of nucleic acid from P. destructans is dependent on effective and standardized methods for extracting nucleic acid from various relevant sample types. We describe optimized methodologies for extracting fungal nucleic acids from sediment, guano, and swab-based samples using commercial kits together with a combination of chemical, enzymatic, and mechanical modifications. Additionally, we define modifications to a previously published intergenic spacer-based qPCR test for P. destructans to refine quantification capabilities of this assay. © 2016 The Author(s).
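
    The quantification step in an assay like this follows the generic qPCR standard-curve arithmetic: fit Cq against log10 of the standard quantities, check the amplification efficiency implied by the slope, and back-calculate unknowns. A minimal sketch with invented numbers (not values from this assay):

      import numpy as np

      def fit_standard_curve(log10_quantity, cq):
          """Fit Cq = slope * log10(quantity) + intercept.

          Amplification efficiency E = 10**(-1/slope) - 1; a perfect doubling
          per cycle gives slope ~ -3.32 and E ~ 1.0.
          """
          slope, intercept = np.polyfit(log10_quantity, cq, 1)
          return slope, intercept, 10 ** (-1.0 / slope) - 1.0

      def quantify(cq, slope, intercept):
          """Back-calculate the quantity of an unknown from its Cq."""
          return 10 ** ((cq - intercept) / slope)

      # Illustrative standards: 10^2..10^6 template copies.
      logq = np.array([2.0, 3.0, 4.0, 5.0, 6.0])
      cq = np.array([33.1, 29.8, 26.4, 23.1, 19.8])
      slope, intercept, eff = fit_standard_curve(logq, cq)
      print(f"E = {eff:.2f}, unknown ~ {quantify(27.0, slope, intercept):.0f} copies")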

  3. Optimized methods for total nucleic acid extraction and quantification of the bat white-nose syndrome fungus, Pseudogymnoascus destructans, from swab and environmental samples

    USGS Publications Warehouse

    Verant, Michelle; Bohuski, Elizabeth A.; Lorch, Jeffrey M.; Blehert, David

    2016-01-01

    The continued spread of white-nose syndrome and its impacts on hibernating bat populations across North America has prompted nationwide surveillance efforts and the need for high-throughput, noninvasive diagnostic tools. Quantitative real-time polymerase chain reaction (qPCR) analysis has been increasingly used for detection of the causative fungus, Pseudogymnoascus destructans, in both bat- and environment-associated samples and provides a tool for quantification of fungal DNA useful for research and monitoring purposes. However, precise quantification of nucleic acid from P. destructans is dependent on effective and standardized methods for extracting nucleic acid from various relevant sample types. We describe optimized methodologies for extracting fungal nucleic acids from sediment, guano, and swab-based samples using commercial kits together with a combination of chemical, enzymatic, and mechanical modifications. Additionally, we define modifications to a previously published intergenic spacer–based qPCR test for P. destructans to refine quantification capabilities of this assay.

  4. Experimental design for TBT quantification by isotope dilution SPE-GC-ICP-MS under the European water framework directive.

    PubMed

    Alasonati, Enrica; Fabbri, Barbara; Fettig, Ina; Yardin, Catherine; Del Castillo Busto, Maria Estela; Richter, Janine; Philipp, Rosemarie; Fisicaro, Paola

    2015-03-01

    In Europe the maximum allowable concentration for tributyltin (TBT) compounds in surface water has been regulated by the water framework directive (WFD) and daughter directive, which impose a limit of 0.2 ng L(-1) in whole water (as tributyltin cation). Despite the large number of different methodologies for the quantification of organotin species developed in the last two decades, standardised analytical methods at the required concentration level do not exist. TBT quantification at picogram level requires efficient and accurate sample preparation and preconcentration, and maximum care to avoid blank contamination. To meet the WFD requirement, a method for the quantification of TBT in mineral water at environmental quality standard (EQS) level, based on solid phase extraction (SPE), was developed and optimised. The quantification was done using species-specific isotope dilution (SSID) followed by gas chromatography (GC) coupled to inductively coupled plasma mass spectrometry (ICP-MS). The analytical process was optimised using a design of experiments (DOE) based on a fractional factorial plan. The DOE allowed the evaluation of 3 qualitative factors (type of stationary phase and eluent, phase mass and eluent volume, pH and analyte ethylation procedure) for a total of 13 levels studied, and a sample volume in the range of 250-1000 mL. Four different models fitting the results were defined and evaluated with statistical tools: one of them was selected and optimised to find the best procedural conditions. C18 was found to be the best stationary phase for the SPE experiments. The 4 solvents tested with C18, the pH and ethylation conditions, the mass of the phases, the volume of the eluents and the sample volume can all be optimal, depending on their respective combination. For that reason, the equation of the model conceived in this work is a useful decision tool for planning experiments, because it can be applied to predict the TBT mass fraction recovery for any given combination of experimental conditions. This work shows that SPE is a convenient technique for TBT preconcentration at pico-trace levels and a robust approach: (i) a number of different experimental conditions led to satisfactory results, and (ii) the participation of two institutes in the experimental work did not affect the developed model. Copyright © 2014 Elsevier B.V. All rights reserved.
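
    The heart of species-specific isotope dilution is one equation: the measured isotope-amount ratio in the spiked blend is solved for the analyte amount. A minimal sketch of that single-IDMS relation follows; the natural tin abundances are rounded literature values, the spike enrichment and amounts are invented, and real SSID work adds mass-bias correction and a full uncertainty budget.

      def idms_amount(n_spike, ratio_blend, ab_sample, ab_spike):
          """Solve the blend ratio R = (nx*Ax + nz*Az) / (nx*Bx + nz*Bz)
          for the analyte amount nx, where (A, B) are the abundances of the
          reference and spike isotopes and nz is the spike amount."""
          A_x, B_x = ab_sample
          A_z, B_z = ab_spike
          return n_spike * (A_z - ratio_blend * B_z) / (ratio_blend * B_x - A_x)

      # Illustrative: 120Sn/119Sn ratio, spike enriched in 119Sn (assumed values).
      n_tbt = idms_amount(n_spike=1.0e-12, ratio_blend=1.5,
                          ab_sample=(0.3258, 0.0859),   # natural 120Sn, 119Sn
                          ab_spike=(0.02, 0.90))        # assumed enriched spike
      print(n_tbt, "mol of TBT (as Sn)")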

  5. CEQer: a graphical tool for copy number and allelic imbalance detection from whole-exome sequencing data.

    PubMed

    Piazza, Rocco; Magistroni, Vera; Pirola, Alessandra; Redaelli, Sara; Spinelli, Roberta; Redaelli, Serena; Galbiati, Marta; Valletta, Simona; Giudici, Giovanni; Cazzaniga, Giovanni; Gambacorti-Passerini, Carlo

    2013-01-01

    Copy number alterations (CNA) are common events occurring in leukaemias and solid tumors. Comparative Genome Hybridization (CGH) is currently the gold standard technique to analyze CNAs; however, CGH analysis requires dedicated instruments and is able to perform only low resolution Loss of Heterozygosity (LOH) analyses. Here we present CEQer (Comparative Exome Quantification analyzer), a new graphical, event-driven tool for CNA/allelic-imbalance (AI) coupled analysis of exome sequencing data. By using case-control matched exome data, CEQer performs a comparative digital exonic quantification to generate CNA data and couples this information with exome-wide LOH and allelic imbalance detection. These data are used to build mixed statistical/heuristic models allowing the identification of CNA/AI events. To test our tool, we initially used in silico generated data, then we performed whole-exome sequencing from 20 leukemic specimens and corresponding matched controls and we analyzed the results using CEQer. Taken globally, these analyses showed that the combined use of comparative digital exon quantification and LOH/AI allows generating very accurate CNA data. Therefore, we propose CEQer as an efficient, robust and user-friendly graphical tool for the identification of CNA/AI in the context of whole-exome sequencing data.
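
    The comparative step at the core of a tool like CEQer can be pictured as library-size-normalized per-exon read counts compared between case and control; gains and losses then show up as shifted log2 ratios. A deliberately simplified sketch of that idea (not CEQer's actual statistical/heuristic model):

      import numpy as np

      def exon_log2_ratios(case_counts, control_counts, pseudo=0.5):
          """Depth-normalized case/control log2 ratio per exon: ~0 means
          copy-neutral, ~+0.58 a one-copy gain and ~-1 a one-copy loss in a
          diploid, fully tumor-pure sample."""
          case = (case_counts + pseudo) / case_counts.sum()
          ctrl = (control_counts + pseudo) / control_counts.sum()
          return np.log2(case / ctrl)

      case = np.array([120, 95, 230, 60, 180])
      ctrl = np.array([110, 100, 115, 58, 175])
      print(np.round(exon_log2_ratios(case, ctrl), 2))  # third exon stands out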

  6. Functional assessment of human enhancer activities using whole-genome STARR-sequencing.

    PubMed

    Liu, Yuwen; Yu, Shan; Dhiman, Vineet K; Brunetti, Tonya; Eckart, Heather; White, Kevin P

    2017-11-20

    Genome-wide quantification of enhancer activity in the human genome has proven to be a challenging problem. Recent efforts have led to the development of powerful tools for enhancer quantification. However, because of genome size and complexity, these tools have yet to be applied to the whole human genome. In the current study, we use a human prostate cancer cell line, LNCaP, as a model to perform whole human genome STARR-seq (WHG-STARR-seq) to reliably obtain an assessment of enhancer activity. This approach builds upon previously developed STARR-seq techniques in the fly genome and CapSTARR-seq in targeted human genomic regions. With an improved library preparation strategy, our approach greatly increases the library complexity per unit of starting material, which makes it feasible and cost-effective to explore the landscape of regulatory activity in the much larger human genome. In addition to our ability to identify active, accessible enhancers located in open chromatin regions, we can also detect sequences with the potential for enhancer activity that are located in inaccessible, closed chromatin regions. When treated with the histone deacetylase inhibitor, Trichostatin A, genes near this latter class of enhancers are up-regulated, demonstrating the potential for endogenous functionality of these regulatory elements. WHG-STARR-seq provides an improved approach to current pipelines for analysis of high complexity genomes to gain a better understanding of the intricacies of transcriptional regulation.
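
    Conceptually, STARR-seq scores each candidate fragment by how strongly it drives its own transcription, i.e. reporter-RNA reads relative to input-DNA reads per region. A toy sketch of that per-region ratio (a simplified view of such pipelines, not the study's actual processing):

      import numpy as np

      def starr_activity(rna_counts, dna_counts, pseudo=1.0):
          """Per-region enhancer activity as the depth-normalized
          log2(reporter RNA / input DNA) ratio."""
          rna = (rna_counts + pseudo) / rna_counts.sum()
          dna = (dna_counts + pseudo) / dna_counts.sum()
          return np.log2(rna / dna)

      rna = np.array([850, 40, 300, 15])
      dna = np.array([200, 180, 210, 190])
      print(np.round(starr_activity(rna, dna), 2))  # region 1 scores as a strong enhancer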

  7. Ultrasensitive liquid chromatography-tandem mass spectrometric methodologies for quantification of five HIV-1 integrase inhibitors in plasma for a microdose clinical trial.

    PubMed

    Sun, Li; Li, Hankun; Willson, Kenneth; Breidinger, Sheila; Rizk, Matthew L; Wenning, Larissa; Woolf, Eric J

    2012-10-16

    HIV-1 integrase strand transfer inhibitors are an important class of compounds targeted for the treatment of HIV-1 infection. Microdosing has emerged as an attractive tool to assist in drug candidate screening for clinical development, but necessitates extremely sensitive bioanalytical assays, typically in the pg/mL concentration range. Currently, accelerator mass spectrometry is the predominant tool for microdosing support, which requires a specialized facility and synthesis of radiolabeled compounds. Few studies have attempted to comprehensively assess a liquid chromatography-tandem mass spectrometry (LC-MS/MS) approach in the context of microdosing applications. Herein, we describe the development of automated LC-MS/MS methods to quantify five integrase inhibitors in plasma with limits of quantification at 1 pg/mL for raltegravir and 2 pg/mL for four proprietary compounds. The assays involved double extractions followed by UPLC coupled with negative ion electrospray MS/MS analysis. All methods were fully validated to the rigor of regulated bioanalysis requirements, with intraday precision between 1.20 and 14.1% and accuracy between 93.8 and 107% over the standard curve concentration range. These methods were successfully applied to a human microdose study and demonstrated to be accurate, reproducible, and cost-effective. Results of the study indicate that raltegravir displayed linear pharmacokinetics between a microdose and a pharmacologically active dose.

  8. Uncertainty quantification for nuclear density functional theory and information content of new measurements.

    PubMed

    McDonnell, J D; Schunck, N; Higdon, D; Sarich, J; Wild, S M; Nazarewicz, W

    2015-03-27

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. The example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.
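
    To make the chain described above concrete, here is a one-dimensional toy version: fit a Gaussian process emulator to a handful of "expensive" model evaluations, sample the Bayesian posterior through the emulator with a random-walk Metropolis step, and propagate the parameter samples to a prediction. It stands in for, and vastly simplifies, the paper's Skyrme-functional analysis.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF

      rng = np.random.default_rng(1)

      # Toy "expensive model": observable y(theta); pretend each call is costly.
      model = lambda theta: np.sin(theta) + 0.5 * theta
      theta_train = np.linspace(-2.0, 2.0, 12)[:, None]
      gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=1e-6)
      gp.fit(theta_train, model(theta_train.ravel()))

      # Posterior ~ exp(-chi2/2) against one measurement, via the emulator.
      y_obs, sigma = 0.8, 0.1
      log_post = lambda th: -0.5 * ((gp.predict(np.atleast_2d(th))[0] - y_obs) / sigma) ** 2

      # Random-walk Metropolis over the model parameter theta.
      samples, theta = [], 0.0
      lp = log_post(theta)
      for _ in range(4000):
          prop = theta + 0.3 * rng.standard_normal()
          lp_prop = log_post(prop)
          if np.log(rng.random()) < lp_prop - lp:
              theta, lp = prop, lp_prop
          samples.append(theta)
      samples = np.array(samples[1000:])

      # Propagate parameter uncertainty to a derived prediction.
      pred = gp.predict(samples[:, None])
      print(f"theta = {samples.mean():.2f} +/- {samples.std():.2f}; "
            f"prediction = {pred.mean():.2f} +/- {pred.std():.2f}")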

  9. Uncertainty Quantification and Statistical Engineering for Hypersonic Entry Applications

    NASA Technical Reports Server (NTRS)

    Cozmuta, Ioana

    2011-01-01

    NASA has invested significant resources in developing and validating a mathematical construct for TPS margin management that is tailorable both for low/high reliability missions and for ablative/reusable TPS. Uncertainty quantification (UQ) and statistical engineering (SE) are valuable tools that have not yet been exploited enough, and strategies combining theoretical tools and experimental methods still need to be defined. The main purpose of this lecture is to give a flavor of where UQ and SE could contribute, in the hope that the broader community will work with us to improve in these areas.

  10. Hampton Roads climate impact quantification initiative : baseline assessment of the transportation assets & overview of economic analyses useful in quantifying impacts

    DOT National Transportation Integrated Search

    2016-09-13

    The Hampton Roads Climate Impact Quantification Initiative (HRCIQI) is a multi-part study sponsored by the U.S. Department of Transportation (DOT) Climate Change Center with the goals that include developing a cost tool that provides methods for volu...

  11. Gearbox damage identification and quantification using stochastic resonance

    NASA Astrophysics Data System (ADS)

    Mba, Clement U.; Marchesiello, Stefano; Fasana, Alessandro; Garibaldi, Luigi

    2018-03-01

    Amongst the many new tools used for vibration-based mechanical fault diagnosis in rotating machinery, stochastic resonance (SR) has been shown to be able to identify as well as quantify gearbox damage via numerical simulations. To validate the numerical simulation results that were obtained in a previous work by the authors, SR is applied in the present study to data from an experimental gearbox that is representative of an industrial gearbox. Both spur and helical gears are used in the gearbox setup. While the results of the direct application of SR to experimental data do not exactly corroborate the numerical simulation results, applying SR to experimental data in pre-processed form is shown to be quite effective. In addition, it is demonstrated that traditional statistical techniques used for gearbox diagnosis can be used as a reference to check how well SR performs.
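
    The SR mechanism itself is easy to reproduce numerically: a weak periodic forcing of an overdamped bistable system is amplified best at an intermediate, nonzero noise level. The Euler-Maruyama toy below (illustrative parameters; not the authors' gearbox processing) reports the output spectral amplitude at the drive frequency for several noise levels.

      import numpy as np

      def drive_response(noise_sd, a=1.0, b=1.0, amp=0.3, f=0.01, dt=0.05,
                         n=100_000, seed=0):
          """Simulate dx/dt = a*x - b*x**3 + amp*sin(2*pi*f*t) + noise and
          return the output spectral amplitude at the drive frequency."""
          rng = np.random.default_rng(seed)
          t = np.arange(n) * dt
          drive = amp * np.sin(2.0 * np.pi * f * t)
          x = np.empty(n)
          x[0] = -1.0
          for i in range(n - 1):
              x[i + 1] = (x[i] + dt * (a * x[i] - b * x[i] ** 3 + drive[i])
                          + noise_sd * np.sqrt(dt) * rng.standard_normal())
          k = int(round(f * n * dt))          # FFT bin of the drive frequency
          return np.abs(np.fft.rfft(x))[k] / n

      for sd in (0.2, 0.5, 0.8, 1.2, 2.0):    # response peaks at moderate noise
          print(sd, round(drive_response(sd), 3))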

  12. Qualitative and quantitative analysis of monomers in polyesters for food contact materials.

    PubMed

    Brenz, Fabrian; Linke, Susanne; Simat, Thomas

    2017-02-01

    Polyesters (PESs) are gaining more importance on the food contact material (FCM) market, and the variety of their properties and applications is expected to be wide. In order to acquire the desired properties, manufacturers can combine several FCM-approved polyvalent carboxylic acids (PCAs) and polyols as monomers. However, information about the qualitative and quantitative composition of FCM articles is often limited. The method presented here describes the analysis of PESs with the identification and quantification of 25 PES monomers (10 PCAs, 15 polyols) by HPLC with diode array detection (HPLC-DAD) and GC-MS after alkaline hydrolysis. Accurate identification and quantification were demonstrated by the analysis of seven different FCM articles made of PESs. The results explained between 97.2% and 103.4% w/w of the polymer composition whilst showing equal molar amounts of PCAs and polyols. Quantification proved to be precise and sensitive, with coefficients of variation (CVs) below 6.0% for PES samples with monomer concentrations typically ranging from 0.02% to 75% w/w. The analysis of 15 PES samples for the FCM market revealed the presence of five different PCAs and 11 different polyols (main monomers, co-monomers, non-intentionally added substances (NIAS)), showing the wide variety of monomers in modern PESs. The presented method provides a useful tool for commercial, state and research laboratories as well as for producers and distributors facing the task of FCM risk assessment. It can be applied for the identification and quantification of migrating monomers and for the prediction of oligomer compositions from the identified monomers.

  13. Non-destructive evaluation of porosity and its effect on mechanical properties of carbon fiber reinforced polymer composite materials

    NASA Astrophysics Data System (ADS)

    Bhat, M. R.; Binoy, M. P.; Surya, N. M.; Murthy, C. R. L.; Engelbart, R. W.

    2012-05-01

    In this work, an attempt is made to induce porosity of varied levels in carbon fiber reinforced epoxy-based polymer composite laminates fabricated using prepregs by varying fabrication parameters such as applied vacuum, autoclave pressure and curing temperature. Different NDE tools have been utilized to evaluate the porosity content and correlate it with measurable parameters of different NDE techniques. Primarily, ultrasonic imaging and real-time digital X-ray imaging have been tried to obtain a measurable parameter which can represent or reflect the amount of porosity contained in the composite laminate. Also, the effect of varied porosity content on mechanical properties of the CFRP composite materials is investigated through a series of experimental investigations. The outcome of the experimental approach has yielded interesting and encouraging trends as a first step towards developing an NDE tool for quantification of the effect of varied porosity in polymer composite materials.

  14. Quantification and spatial characterization of moisture and NaCl content of Iberian dry-cured ham slices using NIR hyperspectral imaging

    USDA-ARS?s Scientific Manuscript database

    Hyperspectral imaging technology is increasingly regarded as a powerful tool for the classification and spatial quantification of a wide range of agrofood product properties. Taking into account the difficulties involved in validating hyperspectral calibrations, the models constructed here proved mo...

  15. LipidMiner: A Software for Automated Identification and Quantification of Lipids from Multiple Liquid Chromatography-Mass Spectrometry Data Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meng, Da; Zhang, Qibin; Gao, Xiaoli

    2014-04-30

    We have developed a tool for automated, high-throughput analysis of LC-MS/MS data files, which greatly simplifies LC-MS based lipidomics analysis. Our results showed that LipidMiner is accurate and comprehensive in identification and quantification of lipid molecular species. In addition, the workflow implemented in LipidMiner is not limited to identification and quantification of lipids. If a suitable metabolite library is implemented in the library matching module, LipidMiner could be reconfigured as a tool for general metabolomics data analysis. It is of note that LipidMiner currently is limited to singly charged ions, although it is adequate for the purpose of lipidomics since lipids are rarely multiply charged,[14] even for the polyphosphoinositides. LipidMiner also only processes file formats generated from mass spectrometers from Thermo, i.e. the .RAW format. In the future, we are planning to accommodate file formats generated by mass spectrometers from other predominant instrument vendors to make this tool more universal.

  16. Health impact assessment – A survey on quantifying tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fehr, Rainer, E-mail: rainer.fehr@uni-bielefeld.de; Mekel, Odile C.L., E-mail: odile.mekel@lzg.nrw.de; Fintan Hurley, J., E-mail: fintan.hurley@iom-world.org

    Integrating human health into prospective impact assessments is known to be challenging. This is true for both approaches: dedicated health impact assessments (HIA) as well as inclusion of health into more general impact assessments. Acknowledging the full range of participatory, qualitative, and quantitative approaches, this study focuses on the latter, especially on computational tools for quantitative health modelling. We conducted a survey among tool developers concerning the status quo of development and availability of such tools; experiences made with model usage in real-life situations; and priorities for further development. Responding toolmaker groups described 17 such tools, most of them being maintained and reported as ready for use and covering a wide range of topics, including risk & protective factors, exposures, policies, and health outcomes. In recent years, existing models have been improved and were applied in new ways, and completely new models emerged. There was high agreement among respondents on the need to further develop methods for assessment of inequalities and uncertainty. The contribution of quantitative modeling to health foresight would benefit from building joint strategies of further tool development, improving the visibility of quantitative tools and methods, and engaging continuously with actual and potential users. - Highlights: • A survey investigated computational tools for health impact quantification. • Formal evaluation of such tools has been rare. • Handling inequalities and uncertainties are priority areas for further development. • Health foresight would benefit from tool developers and users forming a community. • Joint development strategies across computational tools are needed.

  17. A highly efficient, high-throughput lipidomics platform for the quantitative detection of eicosanoids in human whole blood.

    PubMed

    Song, Jiao; Liu, Xuejun; Wu, Jiejun; Meehan, Michael J; Blevitt, Jonathan M; Dorrestein, Pieter C; Milla, Marcos E

    2013-02-15

    We have developed an ultra-performance liquid chromatography-multiple reaction monitoring/mass spectrometry (UPLC-MRM/MS)-based, high-content, high-throughput platform that enables simultaneous profiling of multiple lipids produced ex vivo in human whole blood (HWB) on treatment with calcium ionophore and its modulation with pharmacological agents. HWB samples were processed in a 96-well plate format compatible with high-throughput sample processing instrumentation. We employed a scheduled MRM (sMRM) method, with a triple-quadrupole mass spectrometer coupled to a UPLC system, to measure absolute amounts of 122 distinct eicosanoids using deuterated internal standards. In a 6.5-min run, we resolved and detected with high sensitivity (lower limit of quantification in the range of 0.4-460 pg) all targeted analytes from a very small HWB sample (2.5 μl). Approximately 90% of the analytes exhibited a dynamic range exceeding 1000. We also developed a tailored software package that dramatically sped up the overall data quantification and analysis process with superior consistency and accuracy. Matrix effects from HWB and precision of the calibration curve were evaluated using this newly developed automation tool. This platform was successfully applied to the global quantification of changes on all 122 eicosanoids in HWB samples from healthy donors in response to calcium ionophore stimulation. Copyright © 2012 Elsevier Inc. All rights reserved.

  18. miR-MaGiC improves quantification accuracy for small RNA-seq.

    PubMed

    Russell, Pamela H; Vestal, Brian; Shi, Wen; Rudra, Pratyaydipta D; Dowell, Robin; Radcliffe, Richard; Saba, Laura; Kechris, Katerina

    2018-05-15

    Many tools have been developed to profile microRNA (miRNA) expression from small RNA-seq data. These tools must contend with several issues: the small size of miRNAs, the small number of unique miRNAs, the fact that similar miRNAs can be transcribed from multiple loci, and the presence of miRNA isoforms known as isomiRs. Methods failing to address these issues can return misleading information. We propose a novel quantification method designed to address these concerns. We present miR-MaGiC, a novel miRNA quantification method, implemented as a cross-platform tool in Java. miR-MaGiC performs stringent mapping to a core region of each miRNA and defines a meaningful set of target miRNA sequences by collapsing the miRNA space to "functional groups". We hypothesize that these two features, mapping stringency and collapsing, provide more accurate quantification at a more meaningful unit (i.e., the miRNA family). We test miR-MaGiC and several published methods on 210 small RNA-seq libraries, evaluating each method's ability to accurately reflect global miRNA expression profiles. We define accuracy as total counts close to the total number of input reads originating from miRNAs. We find that miR-MaGiC, which incorporates both stringency and collapsing, provides the most accurate counts.
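
    The "collapsing" idea is easy to illustrate with naming rules: strip the species prefix, the -5p/-3p arm, and locus/paralog suffixes so that counts accumulate per family. The rules below are invented for illustration only; miR-MaGiC ships its own functional-group definitions.

      import re
      from collections import defaultdict

      def functional_group(name):
          """Collapse a miRNA name to an (assumed) family label, e.g. both
          hsa-miR-125b-1-3p and hsa-miR-125a-5p map to 'miR-125'."""
          name = re.sub(r"^[a-z]{3}-", "", name)   # species prefix
          name = re.sub(r"-(5p|3p)$", "", name)    # arm suffix
          name = re.sub(r"-\d+$", "", name)        # locus number
          return re.sub(r"[a-z]+$", "", name)      # paralog letters

      counts = {"hsa-miR-125b-1-3p": 40, "hsa-miR-125b-2-3p": 25,
                "hsa-miR-125a-5p": 10}
      groups = defaultdict(int)
      for mirna, c in counts.items():
          groups[functional_group(mirna)] += c
      print(dict(groups))                          # {'miR-125': 75}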

  19. The sweet and sour of serological glycoprotein tumor biomarker quantification

    PubMed Central

    2013-01-01

    Aberrant and dysregulated protein glycosylation is a well-established event in the process of oncogenesis and cancer progression. Years of study on the glycobiology of cancer have been focused on the development of clinically viable diagnostic applications of this knowledge. However, for a number of reasons, there has been only sparse and varied success. The causes of this range from technical to biological issues that arise when studying protein glycosylation and attempting to apply it to practical applications. This review focuses on the pitfalls, advances, and future directions to be taken in the development of clinically applicable quantitative assays using glycan moieties from serum-based proteins as analytes. Topics covered include the development and progress of applications of lectins, mass spectrometry, and other technologies towards this purpose. Slowly but surely, novel applications of established and development of new technologies will eventually provide us with the tools to reach the ultimate goal of quantification of the full scope of heterogeneity associated with the glycosylation of biomarker candidate glycoproteins in a clinically applicable fashion. PMID:23390961

  20. Rapid quantification of multi-components in alcohol precipitation liquid of Codonopsis Radix using near infrared spectroscopy (NIRS).

    PubMed

    Luo, Yu; Li, Wen-Long; Huang, Wen-Hua; Liu, Xue-Hua; Song, Yan-Gang; Qu, Hai-Bin

    2017-05-01

    A near infrared spectroscopy (NIRS) approach was established for quality control of the alcohol precipitation liquid in the manufacture of Codonopsis Radix. By applying NIRS with multivariate analysis, it was possible to build variation into the calibration sample set: the Plackett-Burman design, Box-Behnken design, and a concentrating-diluting method were used to obtain a sample set covering sufficient fluctuation of process parameters and extended concentration information. NIR data were calibrated to predict the four quality indicators using partial least squares regression (PLSR). In the four calibration models, the root mean square errors of prediction (RMSEPs) were 1.22 μg/ml, 10.5 μg/ml, 1.43 μg/ml, and 0.433% for lobetyolin, total flavonoids, pigments, and total solid contents, respectively. The results indicated that multi-component quantification of the alcohol precipitation liquid of Codonopsis Radix could be achieved with an NIRS-based method, which offers a useful tool for real-time release testing (RTRT) of intermediates in the manufacture of Codonopsis Radix.
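
    The calibration itself is ordinary PLSR from spectra to reference values, judged by the RMSEP on held-out samples. A minimal sketch with simulated stand-in spectra (real work would use the measured NIR spectra and reference assay values):

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)

      # Stand-in data: 120 "spectra" of 500 wavelengths whose absorbance mixes
      # linearly with the analyte level, plus noise.
      conc = rng.uniform(5.0, 50.0, 120)            # e.g. lobetyolin, ug/ml
      X = np.outer(conc, rng.standard_normal(500)) \
          + rng.standard_normal((120, 500)) * 2.0
      X_tr, X_te, y_tr, y_te = train_test_split(X, conc, random_state=1)

      pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
      rmsep = np.sqrt(np.mean((y_te - pls.predict(X_te).ravel()) ** 2))
      print(f"RMSEP = {rmsep:.2f} ug/ml")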

  1. Simultaneous accelerated solvent extraction and hydrolysis of 11-nor-Δ9-tetrahydrocannabinol-9-carboxylic acid glucuronide in meconium samples for gas chromatography-mass spectrometry analysis.

    PubMed

    Mantovani, Cinthia de Carvalho; Silva, Jefferson Pereira E; Forster, Guilherme; Almeida, Rafael Menck de; Diniz, Edna Maria de Albuquerque; Yonamine, Mauricio

    2018-02-01

    Cannabis misuse during pregnancy is associated with severe impacts on maternal and infant health, such as newborn low birth weight, growth restriction, pre-term birth, and neurobehavioral and developmental deficits. In most cases, drug abuse is omitted or denied by the mothers. Thus, toxicological analyses using maternal-fetal matrices serve as a suitable tool to assess drug use. Herein, meconium was the chosen matrix to evaluate cannabis exposure through identification and quantification of 11-nor-Δ9-tetrahydrocannabinol-9-carboxylic acid (THCCOOH). Accelerated solvent extraction (ASE) was applied as the sample preparation technique to simultaneously extract and hydrolyze conjugated THCCOOH from meconium, followed by a solid-phase extraction (SPE) procedure. The method was developed and validated for gas chromatography-mass spectrometry (GC-MS), reaching a hydrolysis efficiency of 98%. Limits of detection (LOD) and quantification (LOQ) were, respectively, 5 and 10 ng/g. The range of linearity was LOQ to 500 ng/g. Inter- and intra-batch coefficients of variation were <8.4% for all concentration levels. Accuracy was in the 101.7-108.9% range. Recovery was on average 60.3%. Carryover effect was not observed. The procedure was applied to six meconium samples from babies whose mothers were drug users and showed satisfactory performance to confirm fetal cannabis exposure. Copyright © 2018 Elsevier B.V. All rights reserved.

  2. Nuclear magnetic resonance and high-performance liquid chromatography techniques for the characterization of bioactive compounds from Humulus lupulus L. (hop).

    PubMed

    Bertelli, Davide; Brighenti, Virginia; Marchetti, Lucia; Reik, Anna; Pellati, Federica

    2018-06-01

    Humulus lupulus L. (hop) represents one of the most cultivated crops, it being a key ingredient in the brewing process. Many health-related properties have been described for hop extracts, making this plant gain more interest in the field of pharmaceutical and nutraceutical research. Among the analytical tools available for the phytochemical characterization of plant extracts, quantitative nuclear magnetic resonance (qNMR) represents a new and powerful technique. In this context, the present study was aimed at the development of a new, simple, and efficient qNMR method for the metabolite fingerprinting of bioactive compounds in hop cones, taking advantage of the novel ERETIC 2 tool. To the best of our knowledge, this is the first attempt to apply this method to complex matrices of natural origin, such as hop extracts. The qNMR method set up in this study was applied to the quantification of both prenylflavonoids and bitter acids in eight hop cultivars. The performance of this analytical method was compared with that of HPLC-UV/DAD, which represents the most frequently used technique in the field of natural product analysis. The quantitative data obtained for hop samples by means of the two aforementioned techniques highlighted that the amount of bioactive compounds was slightly higher when qNMR was applied, although the order of magnitude of the values was the same. The accuracy of qNMR was comparable to that of the chromatographic method, thus proving to be a reliable tool for the analysis of these secondary metabolites in hop extracts. Graphical abstract: extraction and analytical methods applied in this work for the analysis of bioactive compounds in Humulus lupulus L. (hop) cones.
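
    The quantitative basis of qNMR is that signal area scales with concentration times the number of contributing nuclei, so any calibrated reference signal, such as the synthetic one that ERETIC 2 provides, converts integrals into concentrations. A sketch of that relation with invented numbers:

      def qnmr_concentration(i_analyte, n_analyte, c_ref, i_ref, n_ref=1):
          """Basic qNMR relation against a calibrated reference signal:
          c_a = c_ref * (I_a / I_ref) * (N_ref / N_a)."""
          return c_ref * (i_analyte / i_ref) * (n_ref / n_analyte)

      # Illustrative: a 2-proton signal integrating to 3.4 against a
      # 1-proton-equivalent reference of known 2.0 mM concentration.
      print(qnmr_concentration(i_analyte=3.4, n_analyte=2,
                               c_ref=2.0, i_ref=1.0))   # 3.4 mM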

  3. [Detection of recombinant-DNA in foods from stacked genetically modified plants].

    PubMed

    Sorokina, E Iu; Chernyshova, O N

    2012-01-01

    A quantitative real-time multiplex polymerase chain reaction method was applied to the detection and quantification of MON863 and MON810 in the stacked genetically modified maize MON810 × MON863. The limit of detection was approximately 0.1%. The accuracy of the quantification, measured as bias from the accepted value, and the relative repeatability standard deviation, which measures the intra-laboratory variability, were within 25% at each GM level. Method verification demonstrated that the MON863 and MON810 methods can be equally applied to quantification of the respective events in the stacked MON810 × MON863 maize.

  4. Nucleic acids-based tools for ballast water surveillance, monitoring, and research

    NASA Astrophysics Data System (ADS)

    Darling, John A.; Frederick, Raymond M.

    2018-03-01

    Understanding the risks of biological invasion posed by ballast water, whether in the context of compliance testing, routine monitoring, or basic research, is fundamentally an exercise in biodiversity assessment, and as such should take advantage of the best tools available for tackling that problem. The past several decades have seen growing application of genetic methods for the study of biodiversity, driven in large part by dramatic technological advances in nucleic acids analysis. Monitoring approaches based on such methods have the potential to increase dramatically sampling throughput for biodiversity assessments, and to improve on the sensitivity, specificity, and taxonomic accuracy of traditional approaches. The application of targeted detection tools (largely focused on PCR but increasingly incorporating novel probe-based methodologies) has led to a paradigm shift in rare species monitoring, and such tools have already been applied for early detection in the context of ballast water surveillance. Rapid improvements in community profiling approaches based on high throughput sequencing (HTS) could similarly impact broader efforts to catalogue biodiversity present in ballast tanks, and could provide novel opportunities to better understand the risks of biotic exchange posed by ballast water transport, and the effectiveness of attempts to mitigate those risks. These various approaches still face considerable challenges to effective implementation, depending on particular management or research needs. Compliance testing, for instance, remains dependent on accurate quantification of viable target organisms; while tools based on RNA detection show promise in this context, the demands of such testing require considerable additional investment in methods development. In general surveillance and research contexts, both targeted and community-based approaches are still limited by various factors: quantification remains a challenge (especially for taxa in larger size classes), gaps in nucleic acids reference databases are still considerable, uncertainties in taxonomic assignment methods persist, and many applications have not yet matured sufficiently to offer standardized methods capable of meeting rigorous quality assurance standards. Nevertheless, the potential value of these tools, their growing utilization in biodiversity monitoring, and the rapid methodological advances over the past decade all suggest that they should be seriously considered for inclusion in the ballast water surveillance toolkit.

  5. CEQer: A Graphical Tool for Copy Number and Allelic Imbalance Detection from Whole-Exome Sequencing Data

    PubMed Central

    Piazza, Rocco; Magistroni, Vera; Pirola, Alessandra; Redaelli, Sara; Spinelli, Roberta; Redaelli, Serena; Galbiati, Marta; Valletta, Simona; Giudici, Giovanni; Cazzaniga, Giovanni; Gambacorti-Passerini, Carlo

    2013-01-01

    Copy number alterations (CNA) are common events occurring in leukaemias and solid tumors. Comparative Genome Hybridization (CGH) is currently the gold standard technique to analyze CNAs; however, CGH analysis requires dedicated instruments and is able to perform only low resolution Loss of Heterozygosity (LOH) analyses. Here we present CEQer (Comparative Exome Quantification analyzer), a new graphical, event-driven tool for CNA/allelic-imbalance (AI) coupled analysis of exome sequencing data. By using case-control matched exome data, CEQer performs a comparative digital exonic quantification to generate CNA data and couples this information with exome-wide LOH and allelic imbalance detection. These data are used to build mixed statistical/heuristic models allowing the identification of CNA/AI events. To test our tool, we initially used in silico generated data, then we performed whole-exome sequencing from 20 leukemic specimens and corresponding matched controls and we analyzed the results using CEQer. Taken globally, these analyses showed that the combined use of comparative digital exon quantification and LOH/AI allows generating very accurate CNA data. Therefore, we propose CEQer as an efficient, robust and user-friendly graphical tool for the identification of CNA/AI in the context of whole-exome sequencing data. PMID:24124457

  6. quantGenius: implementation of a decision support system for qPCR-based gene quantification.

    PubMed

    Baebler, Špela; Svalina, Miha; Petek, Marko; Stare, Katja; Rotter, Ana; Pompe-Novak, Maruša; Gruden, Kristina

    2017-05-25

    Quantitative molecular biology remains a challenge for researchers due to inconsistent approaches for control of errors in the final results. Due to several factors that can influence the final result, quantitative analysis and interpretation of qPCR data are still not trivial. Together with the development of high-throughput qPCR platforms, there is a need for a tool allowing for robust, reliable and fast nucleic acid quantification. We have developed "quantGenius" ( http://quantgenius.nib.si ), an open-access web application for reliable qPCR-based quantification of nucleic acids. The quantGenius workflow interactively guides the user through data import, quality control (QC) and calculation steps. The input is machine- and chemistry-independent. Quantification is performed using the standard curve approach, with normalization to one or several reference genes. The special feature of the application is the implementation of a user-guided, QC-based decision support system, based on qPCR standards, that takes into account pipetting errors, assay amplification efficiencies, limits of detection and quantification of the assays, as well as the control of PCR inhibition in individual samples. The intermediate calculations and final results are exportable in a data matrix suitable for further statistical analysis or visualization. We additionally compare the most important features of quantGenius with similar advanced software tools and illustrate the importance of a proper QC system in the analysis of qPCR data in two use cases. To our knowledge, quantGenius is the only qPCR data analysis tool that integrates QC-based decision support and will help scientists to obtain reliable results, which are the basis for biologically meaningful data interpretation.

  7. Added clinical value of applying myocardial deformation imaging to assess right ventricular function.

    PubMed

    Sokalskis, Vladislavs; Peluso, Diletta; Jagodzinski, Annika; Sinning, Christoph

    2017-06-01

    Right heart dysfunction has been found to be a strong prognostic factor predicting adverse outcome in various cardiopulmonary diseases. Conventional echocardiographic measurements can be limited by geometrical assumptions and impaired reproducibility. Speckle tracking-derived strain provides a robust quantification of right ventricular function. It explicitly evaluates myocardial deformation, as opposed to tissue Doppler-derived strain, which is computed from tissue velocity gradients. Right ventricular longitudinal strain provides a sensitive tool for detecting right ventricular dysfunction, even at subclinical levels. Moreover, the longitudinal strain can be applied for prognostic stratification of patients with pulmonary hypertension, pulmonary embolism, and congestive heart failure. Speckle tracking-derived right atrial strain, right ventricular longitudinal strain-derived mechanical dyssynchrony, and three-dimensional echocardiography-derived strain are emerging imaging parameters and methods. Their application in research is paving the way for their clinical use. © 2017, Wiley Periodicals, Inc.

  8. Automated quantification of the synchrogram by recurrence plot analysis.

    PubMed

    Nguyen, Chinh Duc; Wilson, Stephen James; Crozier, Stuart

    2012-04-01

    Recently, the concept of phase synchronization of two weakly coupled oscillators has attracted great research interest and has been applied to characterize synchronization phenomena in physiological data. Phase synchronization of cardiorespiratory coupling is often studied by synchrogram analysis, a graphical tool investigating the relationship between the instantaneous phases of two signals. Although several techniques have been proposed to automatically quantify the synchrogram, most of them require a preselection of a phase-locking ratio by trial and error. One technique does not require this information; however, it is based on the power spectrum of the phase distribution in the synchrogram, which is vulnerable to noise. This study aims to introduce a new technique to automatically quantify the synchrogram by studying its dynamic structure. Our technique exploits recurrence plot analysis, which is a well-established tool for characterizing recurring patterns and nonstationarities in experiments. We applied our technique to detect synchronization in simulated and measured infants' cardiorespiratory data. Our results suggest that the proposed technique is able to systematically detect synchronization in noisy and chaotic data without preselecting the phase-locking ratio. By embedding phase information of the synchrogram into phase space, the phase-locking ratio is automatically unveiled as the number of attractors.
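
    The synchrogram itself is simple to construct: take the instantaneous respiratory phase, for example via the Hilbert transform, and sample it at each heartbeat; n:m phase locking then appears as n horizontal stripes. A sketch on synthetic data (this is the graphical tool being quantified, not the recurrence-plot technique the paper proposes):

      import numpy as np
      from scipy.signal import hilbert

      def synchrogram(resp, fs, beat_times, m=1):
          """Respiratory phase (wrapped over m breathing cycles) at each beat."""
          phase = np.unwrap(np.angle(hilbert(resp - resp.mean())))
          idx = np.clip((np.asarray(beat_times) * fs).astype(int), 0, len(resp) - 1)
          return np.mod(phase[idx], 2.0 * np.pi * m)

      # Synthetic check: 0.25 Hz breathing with heartbeats locked 3 : 1.
      fs, dur = 100, 120
      t = np.arange(0, dur, 1.0 / fs)
      resp = np.sin(2.0 * np.pi * 0.25 * t)
      beats = np.arange(0.0, dur, 1.0 / 0.75)       # 0.75 Hz heart rate
      print(np.round(synchrogram(resp, fs, beats)[:6], 2))  # cycles through 3 phases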

  9. Overview of the SAMSI year-long program on Statistical, Mathematical and Computational Methods for Astronomy

    NASA Astrophysics Data System (ADS)

    Jogesh Babu, G.

    2017-01-01

    A year-long research program (Aug 2016 - May 2017) on 'Statistical, Mathematical and Computational Methods for Astronomy (ASTRO)' is well under way at the Statistical and Applied Mathematical Sciences Institute (SAMSI), a National Science Foundation research institute in Research Triangle Park, NC. This program has brought together astronomers, computer scientists, applied mathematicians and statisticians. The main aims of this program are: to foster cross-disciplinary activities; to accelerate the adoption of modern statistical and mathematical tools into modern astronomy; and to develop new tools needed for important astronomical research problems. The program provides multiple avenues for cross-disciplinary interactions, including several workshops, long-term visitors, and regular teleconferences, so participants can continue collaborations even if they can only spend limited time in residence at SAMSI. The main program is organized around five working groups: i) Uncertainty Quantification and Astrophysical Emulation; ii) Synoptic Time Domain Surveys; iii) Multivariate and Irregularly Sampled Time Series; iv) Astrophysical Populations; and v) Statistics, Computation, and Modeling in Cosmology. A brief description of the work under way in each of these groups will be given, and overlaps among the working groups will be highlighted. How the wider astronomy community can both participate in and benefit from the activities will also be briefly mentioned.

  10. Recurrence quantification analysis of electrically evoked surface EMG signal.

    PubMed

    Liu, Chunling; Wang, Xu

    2005-01-01

    The recurrence plot is a quite useful tool in time-series analysis, in particular for measuring unstable periodic orbits embedded in a chaotic dynamical system. This paper introduces the structure of the recurrence plot and the way the plot is constructed, and then defines how the recurrence plot is quantified. One possible application of the recurrence quantification analysis (RQA) strategy is presented: the analysis of surface EMG evoked by electrical stimulation. The results show that the percent determinism increases with stimulation intensity.

  11. Non-parametric correlative uncertainty quantification and sensitivity analysis: Application to a Langmuir bimolecular adsorption model

    NASA Astrophysics Data System (ADS)

    Feng, Jinchao; Lansford, Joshua; Mironenko, Alexander; Pourkargar, Davood Babaei; Vlachos, Dionisios G.; Katsoulakis, Markos A.

    2018-03-01

    We propose non-parametric methods for both local and global sensitivity analysis of chemical reaction models with correlated parameter dependencies. The developed mathematical and statistical tools are applied to a benchmark Langmuir competitive adsorption model on a close packed platinum surface, whose parameters, estimated from quantum-scale computations, are correlated and are limited in size (small data). The proposed mathematical methodology employs gradient-based methods to compute sensitivity indices. We observe that ranking influential parameters depends critically on whether or not correlations between parameters are taken into account. The impact of uncertainty in the correlation and the necessity of the proposed non-parametric perspective are demonstrated.

  12. Matthew Reynolds | NREL

    Science.gov Websites

    Matthew's research at NREL is focused on applying uncertainty quantification techniques. Research interests: uncertainty quantification; computational multilinear algebra; approximation theory.

  13. Uncertainty quantification for nuclear density functional theory and information content of new measurements

    DOE PAGES

    McDonnell, J. D.; Schunck, N.; Higdon, D.; ...

    2015-03-24

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. In addition, the example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.

  14. Accuracy Quantification of the Loci-CHEM Code for Chamber Wall Heat Transfer in a GO2/GH2 Single Element Injector Model Problem

    NASA Technical Reports Server (NTRS)

    West, Jeff; Westra, Doug; Lin, Jeff; Tucker, Kevin

    2006-01-01

    A robust rocket engine combustor design and development process must include tools which can accurately predict the multi-dimensional thermal environments imposed on solid surfaces by the hot combustion products. Currently, empirical methods used in the design process are typically one dimensional and do not adequately account for the heat flux rise rate in the near-injector region of the chamber. Computational Fluid Dynamics holds promise to meet the design tool requirement, but requires accuracy quantification, or validation, before it can be confidently applied in the design process. This effort presents the beginning of such a validation process for the Loci-CHEM CFD code. The model problem examined here is a gaseous oxygen (GO2)/gaseous hydrogen (GH2) shear coaxial single element injector operating at a chamber pressure of 5.42 MPa. The GO2/GH2 propellant combination in this geometry represents one of the simplest rocket model problems and is thus foundational to subsequent validation efforts for more complex injectors. Multiple steady state solutions have been produced with Loci-CHEM employing different hybrid grids and two-equation turbulence models. Iterative convergence for each solution is demonstrated via mass conservation, flow variable monitoring at discrete flow field locations as a function of solution iteration, and overall residual performance. A baseline hybrid grid was used and then locally refined to demonstrate grid convergence. Solutions were obtained with three variations of the k-omega turbulence model.

  15. Uncertainty quantification for nuclear density functional theory and information content of new measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDonnell, J. D.; Schunck, N.; Higdon, D.

    2015-03-24

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. As a result, the example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.

  16. Accuracy Quantification of the Loci-CHEM Code for Chamber Wall Heat Fluxes in a GO2/GH2 Single Element Injector Model Problem

    NASA Technical Reports Server (NTRS)

    West, Jeff; Westra, Doug; Lin, Jeff; Tucker, Kevin

    2006-01-01

    A robust rocket engine combustor design and development process must include tools which can accurately predict the multi-dimensional thermal environments imposed on solid surfaces by the hot combustion products. Currently, empirical methods used in the design process are typically one dimensional and do not adequately account for the heat flux rise rate in the near-injector region of the chamber. Computational Fluid Dynamics holds promise to meet the design tool requirement, but requires accuracy quantification, or validation, before it can be confidently applied in the design process. This effort presents the beginning of such a validation process for the Loci-CHEM CFD code. The model problem examined here is a gaseous oxygen (GO2)/gaseous hydrogen (GH2) shear coaxial single element injector operating at a chamber pressure of 5.42 MPa. The GO2/GH2 propellant combination in this geometry represents one of the simplest rocket model problems and is thus foundational to subsequent validation efforts for more complex injectors. Multiple steady state solutions have been produced with Loci-CHEM employing different hybrid grids and two-equation turbulence models. Iterative convergence for each solution is demonstrated via mass conservation, flow variable monitoring at discrete flow field locations as a function of solution iteration, and overall residual performance. A baseline hybrid grid was used and then locally refined to demonstrate grid convergence. Solutions were also obtained with three variations of the k-omega turbulence model.

  17. Computer Model Inversion and Uncertainty Quantification in the Geosciences

    NASA Astrophysics Data System (ADS)

    White, Jeremy T.

    The subject of this dissertation is use of computer models as data analysis tools in several different geoscience settings, including integrated surface water/groundwater modeling, tephra fallout modeling, geophysical inversion, and hydrothermal groundwater modeling. The dissertation is organized into three chapters, which correspond to three individual publication manuscripts. In the first chapter, a linear framework is developed to identify and estimate the potential predictive consequences of using a simple computer model as a data analysis tool. The framework is applied to a complex integrated surface-water/groundwater numerical model with thousands of parameters. Several types of predictions are evaluated, including particle travel time and surface-water/groundwater exchange volume. The analysis suggests that model simplifications have the potential to corrupt many types of predictions. The implementation of the inversion, including how the objective function is formulated, what minimum of the objective function value is acceptable, and how expert knowledge is enforced on parameters, can greatly influence the manifestation of model simplification. Depending on the prediction, failure to specifically address each of these important issues during inversion is shown to degrade the reliability of some predictions. In some instances, inversion is shown to increase, rather than decrease, the uncertainty of a prediction, which defeats the purpose of using a model as a data analysis tool. In the second chapter, an efficient inversion and uncertainty quantification approach is applied to a computer model of volcanic tephra transport and deposition. The computer model simulates many physical processes related to tephra transport and fallout. The utility of the approach is demonstrated for two eruption events. In both cases, the importance of uncertainty quantification is highlighted by exposing the variability in the conditioning provided by the observations used for inversion. The worth of different types of tephra data to reduce parameter uncertainty is evaluated, as is the importance of different observation error models. The analyses reveal the importance of using tephra granulometry data for inversion, which results in reduced uncertainty for most eruption parameters. In the third chapter, geophysical inversion is combined with hydrothermal modeling to evaluate the enthalpy of an undeveloped geothermal resource in a pull-apart basin located in southeastern Armenia. A high-dimensional gravity inversion is used to define the depth to the contact between the lower-density valley fill sediments and the higher-density surrounding host rock. The inverted basin depth distribution was used to define the hydrostratigraphy for the coupled groundwater-flow and heat-transport model that simulates the circulation of hydrothermal fluids in the system. Evaluation of several different geothermal system configurations indicates that the most likely system configuration is a low-enthalpy, liquid-dominated geothermal system.

  18. Radio-frequency energy quantification in magnetic resonance imaging

    NASA Astrophysics Data System (ADS)

    Alon, Leeor

    Mapping of radio frequency (RF) energy deposition has been challenging for 50+ years, especially when scanning patients in the magnetic resonance imaging (MRI) environment. As a result, electromagnetic simulation software is often used for estimating the specific absorption rate (SAR), the rate of RF energy deposition in tissue. The thesis work presents challenges associated with aligning information provided by electromagnetic simulation and MRI experiments. As a result of the limitations of simulations, experimental methods for the quantification of SAR were established. A system for quantification of the total RF energy deposition was developed for parallel transmit MRI (a system that uses multiple antennas to excite and image the body). The system is capable of monitoring and predicting channel-by-channel RF energy deposition and whole-body SAR, and of tracking potential hardware failures that occur in the transmit chain and may cause the deposition of excessive energy into patients. Similarly, we demonstrated that local RF power deposition can be mapped and predicted for parallel transmit systems based on a series of MRI temperature mapping acquisitions. Resulting from this work, we developed tools for optimal reconstruction of temperature maps from MRI acquisitions. The tools developed for temperature mapping paved the way for utilizing MRI as a diagnostic tool for evaluating the safety of RF/microwave-emitting devices. Quantification of the RF energy was demonstrated for both MRI-compatible and non-MRI-compatible devices (such as cell phones), while having the advantage of being noninvasive and of providing millimeter resolution and high accuracy.

  19. Decision peptide-driven: a free software tool for accurate protein quantification using gel electrophoresis and matrix assisted laser desorption ionization time of flight mass spectrometry.

    PubMed

    Santos, Hugo M; Reboiro-Jato, Miguel; Glez-Peña, Daniel; Nunes-Miranda, J D; Fdez-Riverola, Florentino; Carvallo, R; Capelo, J L

    2010-09-15

    The decision peptide-driven tool implements a software application for assisting the user in a protocol for accurate protein quantification based on the following steps: (1) protein separation through gel electrophoresis; (2) in-gel protein digestion; (3) direct and inverse (18)O-labeling and (4) matrix assisted laser desorption ionization time of flight mass spectrometry, MALDI analysis. The DPD software compares the MALDI results of the direct and inverse (18)O-labeling experiments and quickly identifies those peptides with parallel losses across the different sets of a typical proteomic workflow. Those peptides are used for subsequent accurate protein quantification. The interpretation of the MALDI data from direct and inverse labeling experiments is time-consuming, requiring a significant amount of time to do all comparisons manually. The DPD software shortens and simplifies the search for the peptides that must be used for quantification from a week to just a few minutes. To do so, it takes as input several MALDI spectra and aids the researcher in an automatic mode (i) to compare data from direct and inverse (18)O-labeling experiments, calculating the corresponding ratios to determine those peptides with parallel losses throughout different sets of experiments; and (ii) to use those peptides as internal standards for subsequent accurate protein quantification using (18)O-labeling. In this work the DPD software is presented and explained with the quantification of the protein carbonic anhydrase. Copyright (c) 2010 Elsevier B.V. All rights reserved.
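
    The selection rule that DPD automates can be pictured with a toy check: a peptide is trusted for quantification only when its light/heavy ratio from the direct experiment is, within tolerance, the reciprocal of the ratio from the inverse experiment, i.e., its losses parallel across both sets. The sequences, m/z values, ratios, and tolerance below are all invented, and this is one plausible reading of the criterion, not the tool's exact algorithm.

```python
# direct = light/heavy ratio with sample A unlabeled; inverse = labels swapped
peptides = {
    "PEP1": (838.4, 1.05, 0.93),   # product of ratios close to 1 -> consistent
    "PEP2": (1202.6, 0.98, 1.04),  # consistent
    "PEP3": (990.5, 2.40, 1.10),   # differential loss -> discard
}

TOL = 0.2
for seq, (mz, r_direct, r_inverse) in peptides.items():
    ok = abs(r_direct * r_inverse - 1.0) < TOL
    print(f"{seq} (m/z {mz:.1f}): {'use for quantification' if ok else 'discard'}")
```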

  20. Direct quantification of lipopeptide biosurfactants in biological samples via HPLC and UPLC-MS requires sample modification with an organic solvent.

    PubMed

    Biniarz, Piotr; Łukaszewicz, Marcin

    2017-06-01

    The rapid and accurate quantification of biosurfactants in biological samples is challenging. In contrast to the orcinol method for rhamnolipids, no simple biochemical method is available for the rapid quantification of lipopeptides. Various liquid chromatography (LC) methods are promising tools for relatively fast and exact quantification of lipopeptides. Here, we report strategies for the quantification of the lipopeptides pseudofactin and surfactin in bacterial cultures using different high- (HPLC) and ultra-performance liquid chromatography (UPLC) systems. We tested three strategies for sample pretreatment prior to LC analysis. In direct analysis (DA), bacterial cultures were injected directly and analyzed via LC. As a modification, we diluted the samples with methanol and detected an increase in lipopeptide recovery in the presence of methanol. Therefore, we suggest this simple modification as a tool for increasing the accuracy of LC methods. We also tested freeze-drying followed by solvent extraction (FDSE) as an alternative for the analysis of "heavy" samples. In FDSE, the bacterial cultures were freeze-dried, and the resulting powder was extracted with different solvents. Then, the organic extracts were analyzed via LC. Here, we determined the influence of the extracting solvent on lipopeptide recovery. HPLC methods allowed us to quantify pseudofactin and surfactin with run times of 15 and 20 min per sample, respectively, whereas UPLC quantification was as fast as 4 and 5.5 min per sample, respectively. Our methods provide highly accurate measurements and high recovery levels for lipopeptides. At the same time, UPLC-MS provides the possibility to identify lipopeptides and their structural isoforms.

  1. pyQms enables universal and accurate quantification of mass spectrometry data.

    PubMed

    Leufken, Johannes; Niehues, Anna; Sarin, L Peter; Wessel, Florian; Hippler, Michael; Leidel, Sebastian A; Fufezan, Christian

    2017-10-01

    Quantitative mass spectrometry (MS) is a key technique in many research areas (1), including proteomics, metabolomics, glycomics, and lipidomics. Because all of the corresponding molecules can be described by chemical formulas, universal quantification tools are highly desirable. Here, we present pyQms, an open-source software for accurate quantification of all types of molecules measurable by MS. pyQms uses isotope pattern matching that offers an accurate quality assessment of all quantifications and the ability to directly incorporate mass spectrometer accuracy. pyQms is, due to its universal design, applicable to every research field, labeling strategy, and acquisition technique. This opens ultimate flexibility for researchers to design experiments employing innovative and hitherto unexplored labeling strategies. Importantly, pyQms performs very well to accurately quantify partially labeled proteomes in large scale and high throughput, the most challenging task for a quantification algorithm. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.

  2. Quantification of viable bacterial starter cultures of Virgibacillus sp. and Tetragenococcus halophilus in fish sauce fermentation by real-time quantitative PCR.

    PubMed

    Udomsil, Natteewan; Chen, Shu; Rodtong, Sureelak; Yongsawatdigul, Jirawat

    2016-08-01

    Real-time quantitative polymerase chain reaction (qPCR) methods were developed for the quantification of Virgibacillus sp. SK37 and Tetragenococcus halophilus MS33, which were added as starter cultures in fish sauce fermentation. The PCR assays were coupled with propidium monoazide (PMA) treatment of samples to selectively quantify viable cells and integrated with exogenous recombinant Escherichia coli cells to control variabilities in analysis procedures. The qPCR methods showed species-specificity for both Virgibacillus halodenitrificans and T. halophilus as evaluated using 6 reference strains and 28 strains of bacteria isolated from fish sauce fermentation. The qPCR efficiencies were 101.1% for V. halodenitrificans and 90.2% for T. halophilus. The quantification limits of the assays were 10(3) CFU/mL and 10(2) CFU/mL in fish sauce samples with linear correlations over 4 Logs for V. halodenitrificans and T. halophilus, respectively. The matrix effect was not observed when evaluated using fish sauce samples fermented for 1-6 months. The developed PMA-qPCR methods were successfully applied to monitor changes of Virgibacillus sp. SK37 and T. halophilus MS33 in a mackerel fish sauce fermentation model where culture-dependent techniques failed to quantify the starter cultures. The results demonstrated the usability of the methods as practical tools for monitoring the starter cultures in fish sauce fermentation. Copyright © 2016 Elsevier Ltd. All rights reserved.
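
    A short sketch of the quantification arithmetic behind such qPCR assays: fit a standard curve of Cq against log10 starting quantity, derive the amplification efficiency from the slope, and invert the curve for an unknown. All Cq values below are invented; note that the 90-101% efficiencies reported above correspond to slopes near -3.3 to -3.6.

```python
import numpy as np

log10_cfu = np.array([7.0, 6.0, 5.0, 4.0, 3.0, 2.0])   # standard dilution series
cq = np.array([14.1, 17.5, 20.9, 24.3, 27.8, 31.2])    # invented Cq values

slope, intercept = np.polyfit(log10_cfu, cq, 1)        # linear standard curve
efficiency = 10.0 ** (-1.0 / slope) - 1.0              # E = 10^(-1/slope) - 1
print(f"slope = {slope:.2f}, efficiency = {100 * efficiency:.1f}%")

cq_unknown = 22.6
log10_unknown = (cq_unknown - intercept) / slope       # invert the curve
print(f"estimated load: {10 ** log10_unknown:.2e} CFU/mL")
```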

  3. Uncertainty Reduction using Bayesian Inference and Sensitivity Analysis: A Sequential Approach to the NASA Langley Uncertainty Quantification Challenge

    NASA Technical Reports Server (NTRS)

    Sankararaman, Shankar

    2016-01-01

    This paper presents a computational framework for uncertainty characterization and propagation, and sensitivity analysis under the presence of aleatory and epistemic uncertainty, and develops a rigorous methodology for efficient refinement of epistemic uncertainty by identifying important epistemic variables that significantly affect the overall performance of an engineering system. The proposed methodology is illustrated using the NASA Langley Uncertainty Quantification Challenge (NASA-LUQC) problem that deals with uncertainty analysis of a generic transport model (GTM). First, Bayesian inference is used to infer subsystem-level epistemic quantities using the subsystem-level model and corresponding data. Second, tools of variance-based global sensitivity analysis are used to identify four important epistemic variables (this limitation specified in the NASA-LUQC is reflective of practical engineering situations where not all epistemic variables can be refined due to time/budget constraints) that significantly affect system-level performance. The most significant contribution of this paper is the development of the sequential refinement methodology, where epistemic variables for refinement are not identified all-at-once. Instead, only one variable is first identified, and then Bayesian inference and global sensitivity calculations are repeated to identify the next important variable. This procedure is continued until all four variables are identified and the refinement in the system-level performance is computed. The advantages of the proposed sequential refinement methodology over the all-at-once uncertainty refinement approach are explained, and the methodology is then applied to the NASA Langley Uncertainty Quantification Challenge problem.
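
    For readers unfamiliar with variance-based global sensitivity analysis, the sketch below estimates first-order Sobol indices S_i = Var(E[Y|X_i])/Var(Y) for a standard nonlinear test function using a simple binning estimator. The test function stands in for the GTM subsystem models; nothing here is taken from the NASA challenge itself.

```python
import numpy as np

rng = np.random.default_rng(0)

def model(x):
    """Ishigami-like nonlinear test function of three inputs."""
    return (np.sin(x[:, 0]) + 7.0 * np.sin(x[:, 1]) ** 2
            + 0.1 * x[:, 2] ** 4 * np.sin(x[:, 0]))

N, d = 100_000, 3
X = rng.uniform(-np.pi, np.pi, size=(N, d))
Y = model(X)

def first_order(X, Y, i, bins=50):
    """S_i = Var(E[Y|X_i]) / Var(Y), estimated by binning X_i."""
    edges = np.quantile(X[:, i], np.linspace(0.0, 1.0, bins + 1))
    idx = np.clip(np.searchsorted(edges, X[:, i]) - 1, 0, bins - 1)
    cond_means = np.array([Y[idx == b].mean() for b in range(bins)])
    return cond_means.var() / Y.var()

for i in range(d):
    print(f"S_{i + 1} = {first_order(X, Y, i):.2f}")
```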

  4. Integrated analyses of proteins and their glycans in a magnetic bead-based multiplex assay format.

    PubMed

    Li, Danni; Chiu, Hanching; Chen, Jing; Zhang, Hui; Chan, Daniel W

    2013-01-01

    Well-annotated clinical samples are valuable resources for biomarker discovery and validation. Multiplex and integrated methods that simultaneously measure multiple analytes and generate integrated information about these analytes from a single measurement are desirable because these methods help conserve precious samples. We developed a magnetic bead-based system for multiplex and integrated glycoprotein quantification by immunoassays and glycan detection by lectin immunosorbent assays (LISAs). Magnetic beads coupled with antibodies were used for capturing proteins of interest. Biotinylated antibodies in combination with streptavidin-labeled phycoerythrin were used for protein quantification. In the LISAs, biotinylated detection antibodies were replaced by biotinylated lectins for glycan detection. Using tissue inhibitor of metallopeptidase 1 (TIMP-1), tissue plasminogen activator, membrane metallo-endopeptidase, and dipeptidyl peptidase-IV (DPP-4) as models, we found that the multiplex integrated system was comparable to single immunoassays in protein quantification and LISAs in glycan detection. The merits of this system were demonstrated when applied to well-annotated prostate cancer tissues for validation of biomarkers in aggressive prostate cancer. Because of the system's multiplex ability, we used only 300 ng of tissue protein for the integrated detection of glycans in these proteins. Fucosylated TIMP-1 and DPP-4 offered improved performance over the proteins in distinguishing aggressive and nonaggressive prostate cancer. The multiplex and integrated system conserves samples and is a useful tool for validation of glycoproteins and their glycoforms as biomarkers. © 2012 American Association for Clinical Chemistry

  5. Dynamical characteristics of surface EMG signals of hand grasps via recurrence plot.

    PubMed

    Ouyang, Gaoxiang; Zhu, Xiangyang; Ju, Zhaojie; Liu, Honghai

    2014-01-01

    Recognizing human hand grasp movements through surface electromyogram (sEMG) is a challenging task. In this paper, we investigated nonlinear measures based on the recurrence plot as a tool to evaluate the hidden dynamical characteristics of sEMG during four different hand movements. A series of experimental tests in this study show that the dynamical characteristics of sEMG data obtained with recurrence quantification analysis (RQA) can distinguish different hand grasp movements. Meanwhile, an adaptive neuro-fuzzy inference system (ANFIS) is applied to evaluate the performance of the aforementioned measures in identifying the grasp movements. The experimental results show that the recognition rate (99.1%) based on the combination of linear and nonlinear measures is much higher than those with only linear measures (93.4%) or nonlinear measures (88.1%). These results suggest that the RQA measures might be a potential tool to reveal the hidden sEMG characteristics of hand grasp movements and an effective supplement to traditional linear grasp recognition methods.

  6. A novel qPCR protocol for the specific detection and quantification of the fuel-deteriorating fungus Hormoconis resinae.

    PubMed

    Martin-Sanchez, Pedro M; Gorbushina, Anna A; Kunte, Hans-Jörg; Toepel, Jörg

    2016-07-01

    A wide variety of fungi and bacteria are known to contaminate fuels and fuel systems. These microbial contaminants have been linked to fuel system fouling and corrosion. The fungus Hormoconis resinae, a common jet fuel contaminant, is used in this study as a model for developing innovative risk assessment methods. A novel qPCR protocol to detect and quantify H. resinae in, and together with, total fungal contamination of fuel systems is reported. Two primer sets, targeting the markers RPB2 and ITS, were selected for their remarkable specificity and sensitivity. These primers were successfully applied on fungal cultures and diesel samples demonstrating the validity and reliability of the established qPCR protocol. This novel tool allows clarification of the current role of H. resinae in fuel contamination cases, as well as providing a technique to detect fungal outbreaks in fuel systems. This tool can be expanded to other well-known fuel-deteriorating microorganisms.

  7. Uncertainty Quantification applied to flow simulations in thoracic aortic aneurysms

    NASA Astrophysics Data System (ADS)

    Boccadifuoco, Alessandro; Mariotti, Alessandro; Celi, Simona; Martini, Nicola; Salvetti, Maria Vittoria

    2015-11-01

    The thoracic aortic aneurysm is a progressive dilatation of the thoracic aorta causing a weakness in the aortic wall, which may eventually cause life-threatening events. Clinical decisions on treatment strategies are currently based on empiric criteria, like the aortic diameter value or its growth rate. Numerical simulations can give the quantification of important indexes which are impossible to obtain through in-vivo measurements and can provide supplementary information. Hemodynamic simulations are carried out by using the open-source tool SimVascular and considering patient-specific geometries. One of the main issues in these simulations is the choice of suitable boundary conditions, modeling the organs and vessels not included in the computational domain. The current practice is to use outflow conditions based on resistance and capacitance, whose values are tuned to obtain a physiological behavior of the patient's pressure. However, it is not known a priori how this choice affects the results of the simulation. The impact of the uncertainties in these outflow parameters is investigated here by using the generalized Polynomial Chaos approach. This analysis also permits calibration of the outflow-boundary parameters when patient-specific in-vivo data are available.
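
    The gPC machinery can be sketched compactly for a single uncertain outflow parameter: sample the standard-normal germ, run the model, regress the outputs on Hermite polynomials, and read the output moments off the coefficients. The toy pressure model and the lognormal resistance below are assumptions standing in for a full patient-specific CFD run.

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as He   # probabilists' Hermite basis

rng = np.random.default_rng(1)

def pressure(R):
    """Stand-in for one hemodynamic simulation with outflow resistance R."""
    Q = 90.0                       # assumed mean flow
    return Q * R                   # toy model; a full CFD solve in practice

deg, n_train = 3, 40
xi = rng.standard_normal(n_train)              # germ of the expansion
R = 1.2 * np.exp(0.2 * xi)                     # lognormal outflow resistance
y = pressure(R)

V = He.hermevander(xi, deg)                    # columns He_0 .. He_3
coef, *_ = np.linalg.lstsq(V, y, rcond=None)   # non-intrusive regression gPC

# Orthogonality E[He_j He_k] = k! delta_jk gives the moments directly.
mean = coef[0]
var = sum(coef[k] ** 2 * math.factorial(k) for k in range(1, deg + 1))
print(f"mean pressure = {mean:.1f}, std = {var ** 0.5:.1f}")
```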

  8. Development of a matrix-assisted laser desorption ionization mass spectrometric method for rapid process-monitoring of phthalocyanine compounds.

    PubMed

    Chen, Yi-Ting; Wang, Fu-Shing; Li, Zhendong; Li, Liang; Ling, Yong-Chien

    2012-07-29

    Phthalocyanines (PCs), an important class of chemicals widely used in many industrial sectors, are macrocyclic compounds possessing a heteroaromatic π-electron system with optical properties influenced by chemical structures and impurities or by-products introduced during the synthesis process. Analytical tools allowing for rapid monitoring of the synthesis processes are of significance for the development of new PCs with improved performance in many application areas. In this work, we report a matrix-assisted laser desorption/ionization (MALDI) time-of-flight mass spectrometry (TOFMS) method for rapid and convenient monitoring of PC synthesis reactions. For this class of compounds, intact molecular ions could be detected by MALDI using retinoic acid as matrix. It was shown that relative quantification results of two PC compounds could be generated by MALDI MS. This method was applied to monitor the bromination reactions of nickel- and copper-containing PCs. It was demonstrated that, compared to the traditional UV-visible method, the MALDI MS method offers the advantage of higher sensitivity while providing chemical species and relative quantification information on the reactants and products, which are crucial to process monitoring. Copyright © 2012 Elsevier B.V. All rights reserved.

  9. Extended generalized recurrence plot quantification of complex circular patterns

    NASA Astrophysics Data System (ADS)

    Riedl, Maik; Marwan, Norbert; Kurths, Jürgen

    2017-03-01

    The generalized recurrence plot is a modern tool for the quantification of complex spatial patterns. Its application spans the analysis of trabecular bone structures, Turing patterns, turbulent spatial plankton patterns, and fractals. Determinism is a central measure in this framework, quantifying the level of regularity of spatial structures. We show by basic examples of fully regular patterns of different symmetries that this measure underestimates the orderliness of circular patterns resulting from rotational symmetries. We overcome this crucial problem by checking additional structural elements of the generalized recurrence plot, which is demonstrated with the examples. Furthermore, we show the potential of the extended determinism measure by applying it to more irregular circular patterns, which are generated by the complex Ginzburg-Landau equation and can often be observed in real spatially extended dynamical systems. In this way, we are able to reconstruct the main separations of the system's parameter space by analyzing single snapshots of the real part only, in contrast to the use of the original quantity. This ability of the proposed method also promises an improved description of other systems with complicated spatio-temporal dynamics typically occurring in fluid dynamics, climatology, biology, ecology, social sciences, etc.

  10. Comparative and Quantitative Global Proteomics Approaches: An Overview

    PubMed Central

    Deracinois, Barbara; Flahaut, Christophe; Duban-Deweer, Sophie; Karamanos, Yannis

    2013-01-01

    Proteomics became a key tool for the study of biological systems. The comparison between two different physiological states allows unravelling the cellular and molecular mechanisms involved in a biological process. Proteomics can confirm the presence of proteins suggested by their mRNA content and provides a direct measure of the quantity present in a cell. Global and targeted proteomics strategies can be applied. Targeted proteomics strategies limit the number of features that will be monitored and then optimise the methods to obtain the highest sensitivity and throughput for a huge amount of samples. The advantage of global proteomics strategies is that no hypothesis is required, other than a measurable difference in one or more protein species between the samples. Global proteomics methods attempt to separate, quantify, and identify all the proteins from a given sample. This review highlights only the different techniques of separation and quantification of proteins and peptides, in view of a comparative and quantitative global proteomics analysis. The in-gel and off-gel quantification of proteins will be discussed as well as the corresponding mass spectrometry technology. The overview is focused on the widespread techniques, while keeping in mind that each approach is modular and often overlaps with the others. PMID:28250403

  11. A fully automated microfluidic-based electrochemical sensor for real-time bacteria detection.

    PubMed

    Altintas, Zeynep; Akgun, Mete; Kokturk, Guzin; Uludag, Yildiz

    2018-02-15

    A fully automated microfluidic-based electrochemical biosensor was designed and manufactured for pathogen detection. The quantification of Escherichia coli was investigated with standard and nanomaterial-amplified immunoassays in the concentration ranges of 0.99 × 10(4)-3.98 × 10(9) cfu mL(-1) and 10-3.97 × 10(7) cfu mL(-1), which resulted in detection limits of 1.99 × 10(4) cfu mL(-1) and 50 cfu mL(-1), respectively. The developed methodology was then applied for E. coli quantification in water samples using the nanomaterial-modified assay. The same detection limit for E. coli was achieved for real sample analysis, with a slight decrease in the sensor signal. Cross-reactivity studies were conducted by testing Shigella, Salmonella spp., Salmonella typhimurium and Staphylococcus aureus on the E. coli-specific antibody surface, which confirmed the high specificity of the developed immunoassays. The sensor surface could be regenerated multiple times, which significantly reduces the cost of the system. Our custom-designed biosensor is capable of detecting bacteria with high sensitivity and specificity, and can serve as a promising tool for pathogen detection. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. User Guidelines and Best Practices for CASL VUQ Analysis Using Dakota

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Coleman, Kayla; Hooper, Russell W.

    2016-10-04

    In general, Dakota is the Consortium for Advanced Simulation of Light Water Reactors (CASL) delivery vehicle for verification, validation, and uncertainty quantification (VUQ) algorithms. It permits ready application of the VUQ methods described above to simulation codes by CASL researchers, code developers, and application engineers. More specifically, the CASL VUQ Strategy [33] prescribes the use of Predictive Capability Maturity Model (PCMM) assessments [37]. PCMM is an expert elicitation tool designed to characterize and communicate completeness of the approaches used for computational model definition, verification, validation, and uncertainty quantification associated with an intended application. Exercising a computational model with the methods in Dakota will yield, in part, evidence for a predictive capability maturity model (PCMM) assessment. Table 1.1 summarizes some key predictive maturity related activities (see details in [33]), with examples of how Dakota fits in. This manual offers CASL partners a guide to conducting Dakota-based VUQ studies for CASL problems. It motivates various classes of Dakota methods and includes examples of their use on representative application problems. On reading, a CASL analyst should understand why and how to apply Dakota to a simulation problem.

  13. Tube-Forming Assays.

    PubMed

    Brown, Ryan M; Meah, Christopher J; Heath, Victoria L; Styles, Iain B; Bicknell, Roy

    2016-01-01

    Angiogenesis involves the generation of new blood vessels from the existing vasculature and is dependent on many growth factors and signaling events. In vivo angiogenesis is dynamic and complex, meaning assays are commonly utilized to explore specific targets for research into this area. Tube-forming assays offer an excellent overview of the molecular processes in angiogenesis. The Matrigel tube forming assay is a simple-to-implement but powerful tool for identifying biomolecules involved in angiogenesis. A detailed experimental protocol on the implementation of the assay is described in conjunction with an in-depth review of methods that can be applied to the analysis of the tube formation. In addition, an ImageJ plug-in is presented which allows automatic quantification of tube images reducing analysis times while removing user bias and subjectivity.

  14. Reduction of a linear complex model for respiratory system during Airflow Interruption.

    PubMed

    Jablonski, Ireneusz; Mroczka, Janusz

    2010-01-01

    The paper presents a methodology for reducing a complex model to a simpler, identifiable inverse model. Its main tool is a numerical procedure of sensitivity analysis (structural and parametric) applied to the forward linear equivalent designed for the conditions of the interrupter experiment. The final result, a reduced analog for the interrupter technique, is especially noteworthy, as it fills a major gap in occlusional measurements, which typically use simple one- or two-element physical representations. The proposed reduced electrical circuit, a structural combination of resistive, inertial, and elastic properties, can be regarded as a candidate for reliable reconstruction and quantification (in the time and frequency domains) of the dynamic behavior of the respiratory system in response to a quasi-step excitation by valve closure.

  15. High-Throughput Thermodynamic Modeling and Uncertainty Quantification for ICME

    NASA Astrophysics Data System (ADS)

    Otis, Richard A.; Liu, Zi-Kui

    2017-05-01

    One foundational component of the integrated computational materials engineering (ICME) and Materials Genome Initiative is computational thermodynamics based on the calculation of phase diagrams (CALPHAD) method. The CALPHAD method pioneered by Kaufman has enabled the development of thermodynamic, atomic mobility, and molar volume databases of individual phases in the full space of temperature, composition, and sometimes pressure for technologically important multicomponent engineering materials, along with sophisticated computational tools for using the databases. In this article, we present our recent efforts to develop new computational tools for high-throughput modeling and uncertainty quantification based on high-throughput first-principles calculations and the CALPHAD method, along with their potential propagation to downstream ICME modeling and simulations.
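
    At the core of any CALPHAD assessment is a parametric Gibbs energy model per phase. For a binary substitutional solution, the textbook Redlich-Kister form (shown here as general background, not necessarily the exact model used in the article) is:

```latex
G_m^{\varphi} = \sum_i x_i \, {}^{0}G_i^{\varphi}
              + RT \sum_i x_i \ln x_i
              + x_A x_B \sum_{k} {}^{k}L_{A,B}^{\varphi} \, (x_A - x_B)^k
```

    The interaction parameters {}^{k}L are the quantities fitted to experimental and first-principles data, and in an uncertainty-quantification setting they are the parameters assigned posterior distributions that propagate to downstream predictions.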

  16. Aviation Environmental Design Tool (AEDT) : Uncertainty Quantification Supplemental Report : Version 2a Service Pack 2 (SP2)

    DOT National Transportation Integrated Search

    2014-05-01

    The Federal Aviation Administration, Office of Environment and Energy (FAA-AEE) has developed the Aviation Environmental Design Tool (AEDT) version 2a software system with the support of the following development team: FAA, National Aeronautics and S...

  17. ShapeRotator: An R tool for standardized rigid rotations of articulated three-dimensional structures with application for geometric morphometrics.

    PubMed

    Vidal-García, Marta; Bandara, Lashi; Keogh, J Scott

    2018-05-01

    The quantification of complex morphological patterns typically involves comprehensive shape and size analyses, usually obtained by gathering morphological data from all the structures that capture the phenotypic diversity of an organism or object. Articulated structures are a critical component of overall phenotypic diversity, but data gathered from these structures are difficult to incorporate into modern analyses because of the complexities associated with jointly quantifying 3D shape in multiple structures. While there are existing methods for analyzing shape variation in articulated structures in two-dimensional (2D) space, these methods do not work in 3D, a rapidly growing area of capability and research. Here, we describe a simple geometric rigid rotation approach that removes the effect of random translation and rotation, enabling the morphological analysis of 3D articulated structures. Our method is based on Cartesian coordinates in 3D space, so it can be applied to any morphometric problem that also uses 3D coordinates (e.g., spherical harmonics). We demonstrate the method by applying it to a landmark-based dataset for analyzing shape variation using geometric morphometrics. We have developed an R tool (ShapeRotator) so that the method can be easily implemented in the commonly used R package geomorph and MorphoJ software. This method will be a valuable tool for 3D morphological analyses in articulated structures by allowing an exhaustive examination of shape and size diversity.
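
    The removal of arbitrary translation and rotation that such a tool performs reduces, at its core, to a rigid Procrustes/Kabsch alignment. The numpy sketch below is a generic illustration of that operation, not ShapeRotator's actual code, which is written in R.

```python
import numpy as np

def kabsch_align(P, Q):
    """Rigidly align landmark set P (n x 3) onto Q; returns the aligned P."""
    Pc, Qc = P - P.mean(0), Q - Q.mean(0)        # remove translation
    H = Pc.T @ Qc                                # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    Rmat = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T   # optimal rotation
    return Pc @ Rmat.T + Q.mean(0)

rng = np.random.default_rng(2)
Q = rng.normal(size=(10, 3))                     # reference landmarks
theta = np.pi / 5                                # arbitrary known rotation
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
P = Q @ Rz.T + np.array([3.0, -1.0, 0.5])        # rotated + translated copy
print(np.allclose(kabsch_align(P, Q), Q))        # True: alignment recovered
```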

  18. Identification of spectral regions for the quantification of red wine tannins with fourier transform mid-infrared spectroscopy.

    PubMed

    Jensen, Jacob S; Egebo, Max; Meyer, Anne S

    2008-05-28

    Fast tannin measurement is receiving increased interest, as tannins are important for the mouthfeel and color properties of red wines. Fourier transform mid-infrared spectroscopy allows fast measurement of different wine components, but quantification of tannins is difficult due to interferences from the spectral responses of other wine components. Four different variable selection tools were investigated for the identification of the most important spectral regions, which would allow quantification of tannins from the spectra using partial least-squares regression. The study included the development of a new variable selection tool, iterative backward elimination of changeable size intervals PLS. The spectral regions identified by the different variable selection methods were not identical, but all included two regions (1485-1425 and 1060-995 cm(-1)), which were therefore concluded to be particularly important for tannin quantification. The spectral regions identified from the variable selection methods were used to develop calibration models. All four variable selection methods identified regions that allowed an improved quantitative prediction of tannins (RMSEP = 69-79 mg of CE/L; r = 0.93-0.94) as compared to a calibration model developed using all variables (RMSEP = 115 mg of CE/L; r = 0.87). Only minor differences in the performance of the variable selection methods were observed.
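
    The workflow the paper evaluates, restricting the spectra to selected wavenumber windows and then calibrating with partial least-squares regression, can be sketched as follows. The spectra are synthetic, and only the two window boundaries come from the study; everything else is an illustrative assumption.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
wavenumbers = np.arange(3000, 900, -2.0)          # illustrative FT-MIR grid (cm-1)
n = 120
tannin = rng.uniform(200, 1500, n)                # reference values, mg CE/L

# Synthetic spectra: tannin-linked absorbance inside the two key regions,
# noise everywhere else.
X = rng.normal(0, 0.01, (n, wavenumbers.size))
for lo, hi in [(1425, 1485), (995, 1060)]:
    band = (wavenumbers >= lo) & (wavenumbers <= hi)
    X[:, band] += 0.05 * np.outer(tannin / 1500.0, np.ones(band.sum()))

mask = ((wavenumbers >= 1425) & (wavenumbers <= 1485)) | \
       ((wavenumbers >= 995) & (wavenumbers <= 1060))
Xtr, Xte, ytr, yte = train_test_split(X[:, mask], tannin, random_state=0)

pls = PLSRegression(n_components=5).fit(Xtr, ytr)
rmsep = np.sqrt(mean_squared_error(yte, pls.predict(Xte).ravel()))
print(f"RMSEP = {rmsep:.0f} mg CE/L on held-out samples")
```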

  19. MaxReport: An Enhanced Proteomic Result Reporting Tool for MaxQuant.

    PubMed

    Zhou, Tao; Li, Chuyu; Zhao, Wene; Wang, Xinru; Wang, Fuqiang; Sha, Jiahao

    2016-01-01

    MaxQuant is a proteomics software package widely used for large-scale tandem mass spectrometry data. We have designed and developed an enhanced result reporting tool for MaxQuant, named MaxReport. This tool can optimize the results of MaxQuant and provide additional functions for result interpretation. MaxReport can generate report tables for protein N-terminal modifications. It also supports isobaric labelling based relative quantification at the protein, peptide or site level. To provide an overview of the results, MaxReport performs general descriptive statistical analyses for both identification and quantification results. The output of MaxReport is well organized and therefore helps proteomics users to better understand and share their data. The MaxReport script, which is freely available at http://websdoor.net/bioinfo/maxreport/, is written in Python and is compatible with multiple systems, including Windows and Linux.

  20. Cell Deformation by Single-beam Acoustic Trapping: A Promising Tool for Measurements of Cell Mechanics

    PubMed Central

    Hwang, Jae Youn; Kim, Jihun; Park, Jin Man; Lee, Changyang; Jung, Hayong; Lee, Jungwoo; Shung, K. Kirk

    2016-01-01

    We demonstrate a noncontact single-beam acoustic trapping method for the label-free quantification of the mechanical properties of a single suspended cell. Experimental results show that the single-beam acoustic trapping force results in morphological deformation of a trapped cell. While a cancer cell was trapped at the acoustic beam focus, the morphological changes of the immobilized cell were monitored using bright-field imaging. The cell deformability was then compared with that of a trapped polystyrene microbead as a function of the applied acoustic pressure for a better understanding of the relationship between the pressure and the degree of cell deformation. Cell deformation was found to become more pronounced as higher pressure levels were applied. Furthermore, to determine whether this acoustic trapping method can be exploited to quantify cell mechanics in suspension in a non-contact manner, the deformability levels of breast cancer cells with different degrees of invasiveness under acoustic trapping were compared. It was found that highly-invasive breast cancer cells exhibited greater deformability than weakly-invasive breast cancer cells. These results clearly demonstrate that the single-beam acoustic trapping technique is a promising tool for non-contact, label-free quantitative assessment of the mechanical properties of single cells in suspension. PMID:27273365

  1. Temporal Processing of Dynamic Positron Emission Tomography via Principal Component Analysis in the Sinogram Domain

    NASA Astrophysics Data System (ADS)

    Chen, Zhe; Parker, B. J.; Feng, D. D.; Fulton, R.

    2004-10-01

    In this paper, we compare various temporal analysis schemes applied to dynamic PET for improved quantification, image quality and temporal compression purposes. We compare an optimal sampling schedule (OSS) design, principal component analysis (PCA) applied in the image domain, and principal component analysis applied in the sinogram domain; for region-of-interest quantification, sinogram-domain PCA is combined with the Huesman algorithm to quantify from the sinograms directly without requiring reconstruction of all PCA channels. Using a simulated phantom FDG brain study and three clinical studies, we evaluate the fidelity of the compressed data for estimation of local cerebral metabolic rate of glucose by a four-compartment model. Our results show that using a noise-normalized PCA in the sinogram domain gives similar compression ratio and quantitative accuracy to OSS, but with substantially better precision. These results indicate that sinogram-domain PCA for dynamic PET can be a useful preprocessing stage for PET compression and quantification applications.
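
    The sinogram-domain processing can be pictured with a toy computation: arrange the dynamic data as a (frames x bins) matrix, noise-normalize (for Poisson data, dividing each bin by the square root of its mean count), and keep the leading temporal principal components. The shapes and data below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
T, B = 28, 5000                                    # time frames x sinogram bins
tac = np.exp(-np.linspace(0.0, 3.0, T))           # shared kinetic time course
truth = np.outer(tac, rng.uniform(5.0, 50.0, B))   # per-bin scaled kinetics
sino = rng.poisson(truth).astype(float)            # Poisson counting noise

scale = np.sqrt(sino.mean(0).clip(min=1e-6))       # per-bin noise normalization
Z = sino / scale
Zc = Z - Z.mean(0)                                 # mean-center over time

U, s, Vt = np.linalg.svd(Zc, full_matrices=False)  # temporal PCA via SVD
k = 3                                              # channels kept
recon = (U[:, :k] * s[:k]) @ Vt[:k] + Z.mean(0)    # k-channel reconstruction
err = np.linalg.norm(recon * scale - sino) / np.linalg.norm(sino)
print(f"compression {T} -> {k} channels, relative error = {err:.3f}")
```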

  2. MR/PET quantification tools: Registration, segmentation, classification, and MR-based attenuation correction

    PubMed Central

    Fei, Baowei; Yang, Xiaofeng; Nye, Jonathon A.; Aarsvold, John N.; Raghunath, Nivedita; Cervo, Morgan; Stark, Rebecca; Meltzer, Carolyn C.; Votaw, John R.

    2012-01-01

    Purpose: Combined MR/PET is a relatively new, hybrid imaging modality. A human MR/PET prototype system consisting of a Siemens 3T Trio MR and brain PET insert was installed and tested at our institution. Its present design does not offer measured attenuation correction (AC) using traditional transmission imaging. This study describes the development of quantification tools, including MR-based AC, for combined MR/PET brain imaging. Methods: The developed quantification tools include image registration, segmentation, classification, and MR-based AC. These components were integrated into a single scheme for processing MR/PET data. The segmentation method is multiscale and based on the Radon transform of brain MR images. It was developed to segment the skull on T1-weighted MR images. A modified fuzzy C-means classification scheme was developed to classify brain tissue into gray matter, white matter, and cerebrospinal fluid. Classified tissue is assigned an attenuation coefficient so that AC factors can be generated. PET emission data are then reconstructed using a three-dimensional ordered sets expectation maximization method with the MR-based AC map. Ten subjects had separate MR and PET scans. PET data with [11C]PIB were acquired on a high-resolution research tomograph (HRRT). MR-based AC was compared with transmission (TX)-based AC on the HRRT. Seventeen volumes of interest were drawn manually on each subject image to compare the PET activities between the MR-based and TX-based AC methods. Results: For skull segmentation, the overlap ratio between our segmented results and the ground truth is 85.2 ± 2.6%. Attenuation correction results from the ten subjects show that the difference between the MR and TX-based methods was <6.5%. Conclusions: MR-based AC compared favorably with conventional transmission-based AC. Quantitative tools including registration, segmentation, classification, and MR-based AC have been developed for use in combined MR/PET. PMID:23039679
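
    The classification step can be illustrated with a plain fuzzy C-means loop on voxel intensities; three clusters stand in for CSF, gray matter, and white matter, whose centers would then be mapped to attenuation coefficients. The update rules below are the standard FCM ones, but the intensities are synthetic and nothing here is the paper's modified scheme.

```python
import numpy as np

def fuzzy_cmeans(x, c=3, m=2.0, iters=100):
    """Standard FCM on 1-D intensities; returns memberships (n x c), centers."""
    rng = np.random.default_rng(0)
    u = rng.dirichlet(np.ones(c), size=len(x))        # random initial memberships
    p = 2.0 / (m - 1.0)
    for _ in range(iters):
        w = u ** m
        centers = (w * x[:, None]).sum(0) / w.sum(0)  # membership-weighted means
        d = np.abs(x[:, None] - centers) + 1e-12      # voxel-to-center distances
        u = 1.0 / (d ** p * (d ** -p).sum(1, keepdims=True))
    return u, centers

rng = np.random.default_rng(5)
intensities = np.concatenate([rng.normal(mu, 8.0, 400) for mu in (30, 110, 180)])
u, centers = fuzzy_cmeans(intensities)
labels = u.argmax(1)               # hard labels; each class then gets a mu value
print("class centers:", np.sort(centers).round(1))
print("voxels per class:", np.bincount(labels))
```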

  3. Uncertainty quantification in nanomechanical measurements using the atomic force microscope

    Treesearch

    Ryan Wagner; Robert Moon; Jon Pratt; Gordon Shaw; Arvind Raman

    2011-01-01

    Quantifying uncertainty in measured properties of nanomaterials is a prerequisite for the manufacture of reliable nanoengineered materials and products. Yet, rigorous uncertainty quantification (UQ) is rarely applied for material property measurements with the atomic force microscope (AFM), a widely used instrument that can measure properties at nanometer scale...

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Part, Florian; Zecha, Gudrun; Causon, Tim

    Highlights: • First review on detection of nanomaterials in complex waste samples. • Focus on nanoparticles in solid, liquid and gaseous waste samples. • Summary of current applicable methods for nanowaste detection and characterisation. • Limitations and challenges of characterisation of nanoparticles in waste. - Abstract: Engineered nanomaterials (ENMs) are already extensively used in diverse consumer products. Along the life cycle of a nano-enabled product, ENMs can be released and subsequently accumulate in the environment. Material flow models also indicate that a variety of ENMs may accumulate in waste streams. Therefore, a new type of waste, so-called nanowaste, is generated when end-of-life ENMs and nano-enabled products are disposed of. In terms of the precautionary principle, environmental monitoring of end-of-life ENMs is crucial to allow assessment of the potential impact of nanowaste on our ecosystem. Trace analysis and quantification of nanoparticulate species is very challenging because of the variety of ENM types that are used in products and the low concentrations of nanowaste expected in complex environmental media. In the framework of this paper, challenges in nanowaste characterisation and appropriate analytical techniques which can be applied to nanowaste analysis are summarised. Recent case studies focussing on the characterisation of ENMs in waste streams are discussed. Most studies aim to investigate the fate of nanowaste during incineration, particularly considering aerosol measurements; whereas detailed studies focusing on the potential release of nanowaste during waste recycling processes are currently not available. In terms of suitable analytical methods, separation techniques coupled to spectrometry-based methods are promising tools to detect nanowaste and determine particle size distribution in liquid waste samples. Standardised leaching protocols can be applied to generate soluble fractions stemming from solid wastes, while micro- and ultrafiltration can be used to enrich nanoparticulate species. Imaging techniques combined with X-ray-based methods are powerful tools for determining particle size and morphology and for screening elemental composition. However, quantification of nanowaste is currently hampered by the difficulty of differentiating engineered from naturally-occurring nanoparticles. A promising approach to face these challenges in nanowaste characterisation might be the application of nanotracers with unique optical properties or elemental or isotopic fingerprints. At present, there is also a need to develop and standardise analytical protocols regarding nanowaste sampling, separation and quantification. In general, more experimental studies are needed to examine the fate and transport of ENMs in waste streams, to deduce transfer coefficients, and to develop reliable material flow models.

  5. A python framework for environmental model uncertainty analysis

    USGS Publications Warehouse

    White, Jeremy; Fienen, Michael N.; Doherty, John E.

    2016-01-01

    We have developed pyEMU, a python framework for Environmental Modeling Uncertainty analyses, an open-source tool that is non-intrusive, easy-to-use, computationally efficient, and scalable to highly-parameterized inverse problems. The framework implements several types of linear (first-order second-moment, or FOSM) and non-linear uncertainty analyses. The FOSM-based analyses can also be completed prior to parameter estimation to help inform important modeling decisions, such as parameterization and objective function formulation. Complete workflows for several types of FOSM-based and non-linear analyses are documented in example notebooks implemented using Jupyter that are available in the online pyEMU repository. Example workflows include basic parameter and forecast analyses, data worth analyses, and error-variance analyses, as well as usage of parameter ensemble generation and management capabilities. These workflows document the necessary steps and provide insight into the results, with the goal of educating users not only in how to apply pyEMU, but also in the underlying theory of applied uncertainty quantification.
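
    The FOSM mathematics that pyEMU automates can be written out in a few lines of numpy: given a Jacobian, a prior parameter covariance, and an observation noise covariance, the posterior parameter covariance follows from the Schur-complement formula, and forecast uncertainty is propagated through a sensitivity vector. The matrix sizes and values below are toy assumptions, and this is not the pyEMU API.

```python
import numpy as np

J = np.array([[1.0, 0.4],           # sensitivities of 3 observations
              [0.2, 1.1],           # to 2 parameters (the Jacobian)
              [0.6, 0.5]])
C_prior = np.diag([1.0, 1.0])       # prior parameter covariance
C_noise = np.diag([0.1, 0.1, 0.1])  # observation noise covariance

# Posterior covariance in the FOSM (linear-Gaussian) approximation:
# C_post = (J^T C_noise^-1 J + C_prior^-1)^-1
C_post = np.linalg.inv(J.T @ np.linalg.inv(C_noise) @ J + np.linalg.inv(C_prior))

y = np.array([0.8, -0.3])           # forecast sensitivity vector (dy/dtheta)
prior_var = y @ C_prior @ y         # forecast variance before calibration
post_var = y @ C_post @ y           # ... and after
print(f"forecast std: prior {prior_var ** 0.5:.3f} -> posterior {post_var ** 0.5:.3f}")
```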

  6. Relative quantification of N(epsilon)-(Carboxymethyl)lysine, imidazolone A, and the Amadori product in glycated lysozyme by MALDI-TOF mass spectrometry.

    PubMed

    Kislinger, Thomas; Humeny, Andreas; Peich, Carlo C; Zhang, Xiaohong; Niwa, Toshimitsu; Pischetsrieder, Monika; Becker, Cord-Michael

    2003-01-01

    The nonenzymatic glycation of proteins by reducing sugars, also known as the Maillard reaction, has received increasing recognition from nutritional science and medical research. In this study, we applied matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) to perform relative and simultaneous quantification of the Amadori product, which is an early glycation product, and of N(epsilon)-(carboxymethyl)lysine and imidazolone A, two important advanced glycation end products. To this end, native lysozyme was incubated with d-glucose for increasing periods of time (1, 4, 8, and 16 weeks) in phosphate-buffered saline pH 7.8 at 50 degrees C. After enzymatic digestion with endoproteinase Glu-C, the N-terminal peptide fragment (m/z 838; amino acid sequence KVFGRCE) and the C-terminal peptide fragment (m/z 1202; amino acid sequence VQAWIRGCRL) were used for relative quantification of the three Maillard products. Amadori product, N(epsilon)-(carboxymethyl)lysine, and imidazolone A were the main glycation products formed under these conditions. Their formation was dependent on glucose concentration and reaction time. The kinetics were similar to those obtained by competitive ELISA, an established method for quantification of N(epsilon)-(carboxymethyl)lysine and imidazolone A. Inhibition experiments showed that coincubation with N(alpha)-acetylarginine suppressed formation of imidazolone A but not of the Amadori product or N(epsilon)-(carboxymethyl)lysine. The presence of N(alpha)-acetyllysine resulted in the inhibition of lysine modifications but in higher concentrations of imidazolone A. o-Phenylenediamine decreased the yield of the Amadori product and completely inhibited the formation of N(epsilon)-(carboxymethyl)lysine and imidazolone A. MALDI-TOF-MS proved to be a new analytical tool for the simultaneous, relative quantification of specific products of the Maillard reaction. For the first time, kinetic data of defined products on specific sites of a glycated protein could be measured. This characterizes MALDI-TOF-MS as a valuable method for monitoring the Maillard reaction in the course of food processing.

  7. Data Independent Acquisition analysis in ProHits 4.0.

    PubMed

    Liu, Guomin; Knight, James D R; Zhang, Jian Ping; Tsou, Chih-Chiang; Wang, Jian; Lambert, Jean-Philippe; Larsen, Brett; Tyers, Mike; Raught, Brian; Bandeira, Nuno; Nesvizhskii, Alexey I; Choi, Hyungwon; Gingras, Anne-Claude

    2016-10-21

    Affinity purification coupled with mass spectrometry (AP-MS) is a powerful technique for the identification and quantification of physical interactions. AP-MS requires careful experimental design, appropriate control selection and quantitative workflows to successfully identify bona fide interactors amongst a large background of contaminants. We previously introduced ProHits, a Laboratory Information Management System for interaction proteomics, which tracks all samples in a mass spectrometry facility, initiates database searches and provides visualization tools for spectral counting-based AP-MS approaches. More recently, we implemented Significance Analysis of INTeractome (SAINT) within ProHits to provide scoring of interactions based on spectral counts. Here, we provide an update to ProHits to support Data Independent Acquisition (DIA) with identification software (DIA-Umpire and MSPLIT-DIA), quantification tools (through DIA-Umpire, or externally via targeted extraction), and assessment of quantitative enrichment (through mapDIA) and scoring of interactions (through SAINT-intensity). With additional improvements, notably support of the iProphet pipeline, facilitated deposition into ProteomeXchange repositories and enhanced export and viewing functions, ProHits 4.0 offers a comprehensive suite of tools to facilitate affinity proteomics studies. It remains challenging to score, annotate and analyze proteomics data in a transparent manner. ProHits was previously introduced as a LIMS to enable storing, tracking and analysis of standard AP-MS data. In this revised version, we expand ProHits to include integration with a number of identification and quantification tools based on Data-Independent Acquisition (DIA). ProHits 4.0 also facilitates data deposition into public repositories, and the transfer of data to new visualization tools. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. A survey of tools for the analysis of quantitative PCR (qPCR) data.

    PubMed

    Pabinger, Stephan; Rödiger, Stefan; Kriegner, Albert; Vierlinger, Klemens; Weinhäusel, Andreas

    2014-09-01

    Real-time quantitative polymerase chain reaction (qPCR) is a standard technique used in most laboratories for various applications in basic research. Analysis of qPCR data is a crucial part of the entire experiment, which has led to the development of a plethora of methods. The released tools either cover specific parts of the workflow or provide complete analysis solutions. Here, we surveyed 27 open-access software packages and tools for the analysis of qPCR data. The survey includes 8 Microsoft Windows, 5 web-based, 9 R-based and 5 tools from other platforms. Reviewed packages and tools support the analysis of different qPCR applications, such as RNA quantification, DNA methylation, genotyping, identification of copy number variations, and digital PCR. We report an overview of the functionality, features and specific requirements of the individual software tools, such as data exchange formats, availability of a graphical user interface, included procedures for graphical data presentation, and offered statistical methods. In addition, we provide an overview of quantification strategies, and report various applications of qPCR. Our comprehensive survey showed that most tools use their own file format and only a fraction of the currently existing tools support the standardized data exchange format RDML. To allow a more streamlined and comparable analysis of qPCR data, more vendors and tools need to adopt the standardized format to encourage the exchange of data between instrument software, analysis tools, and researchers.

  9. Automating pattern quantification: new tools for analysing anisotropy and inhomogeneity of 2d patterns

    NASA Astrophysics Data System (ADS)

    Gerik, A.; Kruhl, J. H.

    2006-12-01

    The quantitative analysis of patterns, understood as geometric arrangements of material domains with specific geometric or crystallographic properties such as shape, size or crystallographic orientation, has been shown to be a valuable tool with a wide field of applications in the geo- and material sciences. Pattern quantification allows an unbiased comparison of experimentally generated or theoretical patterns with patterns of natural origin. In addition, the application of different methods can provide information about different pattern-forming processes. This information includes the distribution of crystals in a matrix (e.g. to analyze the nature and orientation of flow within a melt), the shear strain regime governing at the time the pattern was formed, and the nature of fracture patterns at different scales, all of which are of great interest not only in structural and engineering geology but also in the material sciences. Different approaches to this problem have been discussed over the past fifteen years, yet only a few of the methods were applied successfully, at least to single examples (e.g. Velde et al., 1990; Harris et al., 1991; Peternell et al., 2003; Volland & Kruhl, 2004). One of the reasons for this has been the high expenditure of time necessary to prepare and analyse the samples. To overcome this problem, a first selection of promising methods has been implemented in a growing collection of software tools: (1) the modifications that Harris et al. (1991) suggested for the Cantor's dust method (Velde et al., 1990), which Volland & Kruhl (2004) applied to show the anisotropy in a breccia sample; (2) a map-counting method that uses local box-counting dimensions to map the inhomogeneity of a crystal distribution pattern, used by Peternell et al. (2003) to analyze the distribution of phenocrysts in a porphyric granite; and (3) a modified perimeter method that relates the directional dependence of the perimeter of grain boundaries to the anisotropy of the pattern (Peternell et al., 2003). We have used the resulting new possibilities to analyze numerous patterns of natural, experimental and mathematical origin in order to determine the scope of applicability of the different methods, and we present these results along with an evaluation of their individual sensitivities and limitations. References: Harris, C., Franssen, R. & Loosveld, R. (1991): Fractal analysis of fractures in rocks: the Cantor's Dust method - comment. Tectonophysics 198: 107-111. Peternell, M., Andries, F. & Kruhl, J.H. (2003): Magmatic flow-pattern anisotropies - analyzed on the basis of a new 'map-counting' fractal geometry method. DRT Tectonics conference, St. Malo, Book of Abstracts. Velde, B., Dubois, J., Touchard, G. & Badri, A. (1990): Fractal analysis of fractures in rocks: the Cantor's Dust method. Tectonophysics 179: 345-352. Volland, S. & Kruhl, J.H. (2004): Anisotropy quantification: the application of fractal geometry methods on tectonic fracture patterns of a Hercynian fault zone in NW-Sardinia. Journal of Structural Geology 26: 1499-1510.
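
    The global box-counting dimension at the heart of the map-counting method (2) is straightforward to compute; the local variant simply repeats the calculation in a moving window. A minimal Python sketch, with an illustrative synthetic pattern standing in for real sample data:

      import numpy as np

      def box_counting_dimension(pattern, box_sizes=(2, 4, 8, 16, 32)):
          """Estimate the box-counting (fractal) dimension of a 2D binary
          pattern as the slope of log N(s) versus log(1/s)."""
          counts = []
          h, w = pattern.shape
          for s in box_sizes:
              # Count boxes of side s containing at least one occupied pixel
              n = sum(pattern[i:i + s, j:j + s].any()
                      for i in range(0, h, s)
                      for j in range(0, w, s))
              counts.append(n)
          slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)),
                                np.log(counts), 1)
          return slope

      # Sanity check: a completely filled square has dimension 2
      pattern = np.ones((64, 64), dtype=bool)
      print(box_counting_dimension(pattern))  # ~2.0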

  10. Learning and Information Approaches for Inference in Dynamic Data-Driven Geophysical Applications

    NASA Astrophysics Data System (ADS)

    Ravela, S.

    2015-12-01

    Many geophysical inference problems are characterized by non-linear processes, high-dimensional models and complex uncertainties. A dynamic coupling between models, estimation, and sampling is typically sought to efficiently characterize and reduce uncertainty. This process is, however, fraught with several difficulties, key among them the ability to deal with model errors and the efficacy of uncertainty quantification and data assimilation. In this presentation, we present three key ideas from learning and intelligent systems theory and apply them to two geophysical applications. The first idea is the use of Ensemble Learning to compensate for model error, the second is the development of tractable Information Theoretic Learning to deal with non-Gaussianity in inference, and the third is a Manifold Resampling technique for effective uncertainty quantification. We apply these methods, first, to the development of a cooperative autonomous observing system using sUAS for studying coherent structures and, second, to the problem of quantifying risk from hurricanes and storm surges in a changing climate. Results indicate that learning approaches can enable new effectiveness in cases where standard approaches to model reduction, uncertainty quantification and data assimilation fail.

  11. Uncertainty quantification applied to the radiological characterization of radioactive waste.

    PubMed

    Zaffora, B; Magistris, M; Saporta, G; Chevalier, J-P

    2017-09-01

    This paper describes the process adopted at the European Organization for Nuclear Research (CERN) to quantify uncertainties affecting the characterization of very-low-level radioactive waste. Radioactive waste is a by-product of the operation of high-energy particle accelerators. Radioactive waste must be characterized to ensure its safe disposal in final repositories. Characterizing radioactive waste means establishing the list of radionuclides present together with their activities. The estimated activity levels are compared to the limits given by the national authority responsible for waste disposal. The quantification of the uncertainty affecting the concentration of the radionuclides is therefore essential to estimate the acceptability of the waste in the final repository, but also to control the sorting, volume-reduction and packaging phases of the characterization process. The characterization method consists of estimating the activity of the produced radionuclides by either experimental methods or statistical approaches. The uncertainties are estimated using classical statistical methods and uncertainty propagation. A mixed multivariate random vector is built to generate random input parameters for the activity calculations. The random vector is a robust tool to account for the unknown radiological history of legacy waste. This analytical technique is also particularly useful for generating random chemical compositions of materials when the trace element concentrations are not available or cannot be measured. The methodology was validated using a waste population of legacy copper activated at CERN. The methodology introduced here represents a first approach to the uncertainty quantification (UQ) of the characterization process of waste produced at particle accelerators. Copyright © 2017 Elsevier Ltd. All rights reserved.
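
    The random-vector propagation described above can be illustrated with a toy Monte Carlo calculation; the distributions, parameter values and the simple activity model below are hypothetical stand-ins, not CERN's actual computation.

      import numpy as np

      rng = np.random.default_rng(42)

      # Hypothetical model: activity A = m * k, with a normally distributed
      # measured signal m and a log-normal conversion factor k (standing in
      # for uncertain trace-element composition). Values are illustrative.
      n = 100_000
      m = rng.normal(loc=120.0, scale=8.0, size=n)             # signal
      k = rng.lognormal(mean=np.log(0.05), sigma=0.4, size=n)  # factor

      activity = m * k                                          # Bq/g per draw
      lo, med, hi = np.percentile(activity, [2.5, 50, 97.5])
      print(f"median {med:.2f} Bq/g, 95% interval [{lo:.2f}, {hi:.2f}]")

    The resulting percentile interval is the kind of uncertainty statement that can then be compared against the repository acceptance limits.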

  12. Liquid Chromatography-Mass Spectrometry-based Quantitative Proteomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, Fang; Liu, Tao; Qian, Weijun

    2011-07-22

    Liquid chromatography-mass spectrometry (LC-MS)-based quantitative proteomics has been increasingly applied to a broad range of biological applications due to its growing capability for broad proteome coverage and good accuracy in quantification. Herein, we review current LC-MS-based quantification methods with respect to their advantages and limitations, and highlight their potential applications.

  13. Quantification and propagation of disciplinary uncertainty via Bayesian statistics

    NASA Astrophysics Data System (ADS)

    Mantis, George Constantine

    2002-08-01

    Several needs exist in the military, commercial, and civil sectors for new hypersonic systems. These needs remain unfulfilled, due in part to the uncertainty encountered in designing these systems. This uncertainty takes a number of forms, including disciplinary uncertainty: that which is inherent in the analytical tools utilized during the design process. Yet few efforts to date empower the designer with the means to account for this uncertainty within the disciplinary analyses. In the current state of the art in design, the effects of this unquantified uncertainty significantly increase the risks associated with new design efforts. Typically, the risk proves too great to allow a given design to proceed beyond the conceptual stage. To that end, the research encompasses the formulation and validation of a new design method: a systematic process for probabilistically assessing the impact of disciplinary uncertainty. The method implements Bayesian statistics to quantify this source of uncertainty and propagate its effects to the vehicle system level. Comparison of analytical and physical data for existing systems, modeled a priori in the given analysis tools, leads to quantification of the uncertainty in those tools' calculation of discipline-level metrics. Then, after exploration of the new vehicle's design space, the quantified uncertainty is propagated probabilistically through the design space. This ultimately results in an assessment of the impact of disciplinary uncertainty on the confidence in the design solution: the final shape and variability of the probability functions defining the vehicle's system-level metrics. Although motivated by the hypersonic regime, the proposed treatment of uncertainty applies to any class of aerospace vehicle, just as the problem itself affects the design process of any vehicle. A number of computer programs comprise the environment constructed for the implementation of this work. Application to a single-stage-to-orbit (SSTO) reusable launch vehicle concept, developed by the NASA Langley Research Center under the Space Launch Initiative, provides the validation case for this work, with the focus placed on economics, aerothermodynamics, propulsion, and structures metrics. (Abstract shortened by UMI.)
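
    The core idea, comparing tool predictions with physical data to quantify a tool's bias and then propagating that uncertainty onto a new prediction, can be sketched with a minimal conjugate-normal Bayesian update; all numbers, the discrepancy data and the assumed known noise level are hypothetical.

      import numpy as np

      rng = np.random.default_rng(0)

      # Hypothetical discrepancies (tool prediction minus measurement) for
      # an existing vehicle, used to learn the tool's bias mu.
      d = np.array([0.8, 1.1, 0.6, 1.4, 0.9])   # e.g. % error in a metric
      sigma = 0.5                                # assumed measurement noise

      # Conjugate normal prior on the bias: mu ~ N(0, tau0^2)
      tau0 = 2.0
      prec_post = 1 / tau0**2 + len(d) / sigma**2
      mu_post = (d.sum() / sigma**2) / prec_post
      sd_post = prec_post ** -0.5

      # Propagate: correct a new tool prediction, carrying its uncertainty
      pred = 12.0                                # new discipline-level value
      samples = pred - rng.normal(mu_post, sd_post, 10_000)
      print(mu_post, sd_post, samples.mean(), samples.std())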

  14. From gross anatomy to the nanomorphome: stereological tools provide a paradigm for advancing research in quantitative morphomics

    PubMed Central

    Mayhew, Terry M; Lucocq, John M

    2015-01-01

    The terms morphome and morphomics are not new but, recently, a group of morphologists and cell biologists has given them clear definitions and emphasised their integral importance in systems biology. By analogy to other ‘-omes’, the morphome refers to the distribution of matter within 3-dimensional (3D) space. It equates to the totality of morphological features within a biological system (virus, single cell, multicellular organism or populations thereof) and morphomics is the systematic study of those structures. Morphomics research has the potential to generate ‘big data’ because it includes all imaging techniques at all levels of achievable resolution and all structural scales from gross anatomy and medical imaging, via optical and electron microscopy, to molecular characterisation. As with other ‘-omics’, quantification is an important part of morphomics and, because biological systems exist and operate in 3D space, precise descriptions of form, content and spatial relationships require the quantification of structure in 3D. Revealing and quantifying structural detail inside the specimen is achieved currently in two main ways: (i) by some form of reconstruction from serial physical or tomographic slices or (ii) by using randomly-sampled sections and simple test probes (points, lines, areas, volumes) to derive stereological estimates of global and/or individual quantities. The latter include volumes, surfaces, lengths and numbers of interesting features and spatial relationships between them. This article emphasises the value of stereological design, sampling principles and estimation tools as a template for combining with alternative imaging techniques to tackle the ‘big data’ issue and advance knowledge and understanding of the morphome. The combination of stereology, TEM and immunogold cytochemistry provides a practical illustration of how this has been achieved in the sub-field of nanomorphomics. Applying these quantitative tools/techniques in a carefully managed study design offers us a deeper appreciation of the spatiotemporal relationships between the genome, metabolome and morphome which are integral to systems biology. PMID:25753334
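
    As an illustration of the 'simple test probes' mentioned above, the following sketch estimates a volume fraction from a 2D section by point counting (the Delesse principle, V_V = P_P); a synthetic section stands in for a real micrograph, and random rather than systematic uniform sampling is used for brevity.

      import numpy as np

      rng = np.random.default_rng(1)

      def volume_fraction(section, n_points=300):
          """Estimate the volume fraction V_V of a phase from a 2D section
          with a random point probe: V_V = P_P (fraction of point hits)."""
          h, w = section.shape
          ys = rng.integers(0, h, n_points)
          xs = rng.integers(0, w, n_points)
          p = section[ys, xs].sum() / n_points
          se = np.sqrt(p * (1 - p) / n_points)   # binomial standard error
          return p, se

      # Synthetic section in which the "phase" occupies 25% of the area
      section = np.zeros((200, 200), dtype=bool)
      section[:100, :100] = True
      print(volume_fraction(section))            # ~(0.25, ~0.025)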

  15. Quantification of differential gene expression by multiplexed targeted resequencing of cDNA

    PubMed Central

    Arts, Peer; van der Raadt, Jori; van Gestel, Sebastianus H.C.; Steehouwer, Marloes; Shendure, Jay; Hoischen, Alexander; Albers, Cornelis A.

    2017-01-01

    Whole-transcriptome or RNA sequencing (RNA-Seq) is a powerful and versatile tool for functional analysis of different types of RNA molecules, but sample, reagent and sequencing costs can be prohibitive for hypothesis-driven studies whose aim is to quantify differential expression of a limited number of genes. Here we present an approach for quantification of differential mRNA expression by targeted resequencing of complementary DNA using single-molecule molecular inversion probes (cDNA-smMIPs) that enable highly multiplexed resequencing of cDNA target regions of ∼100 nucleotides and counting of individual molecules. We show that accurate estimates of differential expression can be obtained from molecule counts for hundreds of smMIPs per reaction and that smMIPs are also suitable for quantification of relative gene expression and allele-specific expression. Compared with low-coverage RNA-Seq and a hybridization-based targeted RNA-Seq method, cDNA-smMIPs are a cost-effective, high-throughput tool for hypothesis-driven expression analysis of large numbers of genes (10 to 500) and samples (hundreds to thousands). PMID:28474677

  16. TRACI THE TOOL FOR THE REDUCTION AND ASSESSMENT OF CHEMICAL AND OTHER ENVIRONMENTAL IMPACTS - VERSION 2 CHANGES

    EPA Science Inventory

    The Tool for the Reduction and Assessment of Chemical and other environmental Impacts (TRACI) was developed to allow the quantification of environmental impacts for a variety of impact categories which are necessary for a comprehensive impact assessment. See Figure 1. TRACI is c...

  17. MultiMap: A Tool to Automatically Extract and Analyse Spatial Microscopic Data From Large Stacks of Confocal Microscopy Images

    PubMed Central

    Varando, Gherardo; Benavides-Piccione, Ruth; Muñoz, Alberto; Kastanauskaite, Asta; Bielza, Concha; Larrañaga, Pedro; DeFelipe, Javier

    2018-01-01

    The development of 3D visualization and reconstruction methods to analyse microscopic structures at different levels of resolution is of great importance to define brain microorganization and connectivity. MultiMap is a new tool that allows the visualization, 3D segmentation and quantification of fluorescent structures selectively in the neuropil from large stacks of confocal microscopy images. The major contribution of this tool is the possibility to easily navigate and create regions of interest of any shape and size within a large brain area, which will be automatically 3D segmented and quantified to determine the density of puncta in the neuropil. As a proof of concept, we focused on the analysis of glutamatergic and GABAergic presynaptic axon terminals in the mouse hippocampal region to demonstrate its use as a tool to provide putative excitatory and inhibitory synaptic maps. The segmentation and quantification method has been validated over expert-labeled images of the mouse hippocampus and over two benchmark datasets, obtaining results comparable to the expert detections. PMID:29875639

  18. MultiMap: A Tool to Automatically Extract and Analyse Spatial Microscopic Data From Large Stacks of Confocal Microscopy Images.

    PubMed

    Varando, Gherardo; Benavides-Piccione, Ruth; Muñoz, Alberto; Kastanauskaite, Asta; Bielza, Concha; Larrañaga, Pedro; DeFelipe, Javier

    2018-01-01

    The development of 3D visualization and reconstruction methods to analyse microscopic structures at different levels of resolution is of great importance to define brain microorganization and connectivity. MultiMap is a new tool that allows the visualization, 3D segmentation and quantification of fluorescent structures selectively in the neuropil from large stacks of confocal microscopy images. The major contribution of this tool is the possibility to easily navigate and create regions of interest of any shape and size within a large brain area, which will be automatically 3D segmented and quantified to determine the density of puncta in the neuropil. As a proof of concept, we focused on the analysis of glutamatergic and GABAergic presynaptic axon terminals in the mouse hippocampal region to demonstrate its use as a tool to provide putative excitatory and inhibitory synaptic maps. The segmentation and quantification method has been validated over expert-labeled images of the mouse hippocampus and over two benchmark datasets, obtaining results comparable to the expert detections.

  19. Quantification of lung tumor rotation with automated landmark extraction using orthogonal cine MRI images

    NASA Astrophysics Data System (ADS)

    Paganelli, Chiara; Lee, Danny; Greer, Peter B.; Baroni, Guido; Riboldi, Marco; Keall, Paul

    2015-09-01

    The quantification of tumor motion in sites affected by respiratory motion is of primary importance to improve treatment accuracy. To account for motion, most studies analyzed the translational component only; the rotational component has been quantified in only a few studies, on the prostate with implanted markers. The aim of our study was to propose a tool able to quantify lung tumor rotation without the use of internal markers, thus providing accurate motion detection close to critical structures such as the heart or liver. Specifically, we propose the use of an automatic feature extraction method in combination with the acquisition of fast orthogonal cine MRI images of nine lung cancer patients. As a preliminary test, we evaluated the performance of the feature extraction method by applying it to regions of interest around (i) the diaphragm and (ii) the tumor, and comparing the estimated motion with that obtained by (i) the extraction of the diaphragm profile and (ii) the segmentation of the tumor, respectively. The results confirmed the capability of the proposed method in quantifying tumor motion. Then, a point-based rigid registration was applied to the extracted tumor features between all frames to account for rotation. The median lung tumor rotation values were -0.6 ± 2.3° and -1.5 ± 2.7° in the sagittal and coronal planes, respectively, confirming the need to account for tumor rotation along with translation to improve radiotherapy treatment.
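
    The point-based rigid registration step can be sketched with the standard Kabsch/Procrustes solution for matched landmarks, applied per imaging plane; the example below uses synthetic 2D feature coordinates, not patient data.

      import numpy as np

      def rigid_rotation_angle(p, q):
          """Least-squares (Kabsch) rotation between two matched 2D point
          sets; returns the in-plane rotation angle in degrees."""
          p0 = p - p.mean(axis=0)
          q0 = q - q.mean(axis=0)
          u, _, vt = np.linalg.svd(p0.T @ q0)
          d = np.sign(np.linalg.det(vt.T @ u.T))   # guard vs. reflections
          r = vt.T @ np.diag([1.0, d]) @ u.T
          return np.degrees(np.arctan2(r[1, 0], r[0, 0]))

      # Features rotated by a known 5 degrees should be recovered
      theta = np.radians(5.0)
      rot = np.array([[np.cos(theta), -np.sin(theta)],
                      [np.sin(theta),  np.cos(theta)]])
      p = np.random.default_rng(2).random((20, 2))
      q = p @ rot.T
      print(rigid_rotation_angle(p, q))  # ~5.0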

  20. Photoacoustic Spectroscopy as a Non-destructive Tool for Quantification of Pesticide Residue in Apple Cuticle

    NASA Astrophysics Data System (ADS)

    Liu, Lixian; Wang, Yafei; Gao, Chunming; Huan, Huiting; Zhao, Binxing; Yan, Laijun

    2015-06-01

    Photoacoustic spectroscopy (PAS) is described as a non-destructive method for detecting residues of the pesticide dimethyl-dichloro-vinyl-phosphate (DDVP) in apple cuticle. After constructing the PA experimental setup and identifying three characteristic peaks of DDVP in the near-ultraviolet region, PA spectra of apple cuticle contaminated with DDVP were collected. An artificial neural network method was then applied to analyze the data quantitatively. The results show a correlation coefficient exceeding 0.99 and a detection limit of 0.2 ppm, which is within the national food safety standard for maximum pesticide residue limits in food (GB 2763-2012). This fact and the non-destructive character of PAS make the approach promising for the detection of pesticide residues in fruit.

  1. An empirical model for dissolution profile and its application to floating dosage forms.

    PubMed

    Weiss, Michael; Kriangkrai, Worawut; Sungthongjeen, Srisagul

    2014-06-02

    A sum of two inverse Gaussian functions is proposed as a highly flexible empirical model for fitting of in vitro dissolution profiles. The model was applied to quantitatively describe theophylline release from effervescent multi-layer coated floating tablets containing different amounts of the anti-tacking agents talc or glyceryl monostearate. Model parameters were estimated by nonlinear regression (mixed-effects modeling). The estimated parameters were used to determine the mean dissolution time, as well as to reconstruct the time course of release rate for each formulation, whereby the fractional release rate can serve as a diagnostic tool for classification of dissolution processes. The approach allows quantification of dissolution behavior and could provide additional insights into the underlying processes. Copyright © 2014 Elsevier B.V. All rights reserved.
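
    The model's functional form can be sketched as a weighted sum of two inverse Gaussian distribution functions; the authors estimated parameters by nonlinear mixed-effects modeling, whereas this illustration fits a single synthetic profile by ordinary least squares, and all parameter values are hypothetical.

      import numpy as np
      from scipy.optimize import curve_fit
      from scipy.stats import invgauss

      def dissolution(t, w, mu1, lam1, mu2, lam2):
          """Cumulative fraction released: weighted sum of two inverse
          Gaussian CDFs with means mu_i and shape parameters lam_i."""
          f1 = invgauss.cdf(t, mu1 / lam1, scale=lam1)
          f2 = invgauss.cdf(t, mu2 / lam2, scale=lam2)
          return w * f1 + (1 - w) * f2
          # mean dissolution time under this model: w*mu1 + (1-w)*mu2

      # Synthetic "observed" profile (time in hours, fraction released)
      t = np.linspace(0.25, 12, 24)
      y_obs = dissolution(t, 0.4, 1.0, 2.0, 5.0, 8.0) \
              + np.random.default_rng(3).normal(0, 0.01, t.size)

      popt, _ = curve_fit(dissolution, t, y_obs, p0=[0.5, 1.5, 1.5, 4.0, 4.0],
                          bounds=([0, 0.1, 0.1, 0.1, 0.1],
                                  [1, 20, 50, 20, 50]))
      print(popt)  # recovers the weight and the two (mean, shape) pairs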

  2. Time resolved analysis of quetiapine and 7-OH-quetiapine in hair using LC/MS-MS.

    PubMed

    Binz, Tina M; Yegles, Michel; Schneider, Serge; Neels, Hugo; Crunelle, Cleo L

    2014-09-01

    Hair analysis is a powerful tool for retrospective drug analysis and has a wide application window. This article describes the simultaneous determination and quantification of the short-acting atypical antipsychotic drug quetiapine and its main metabolite, 7-OH-quetiapine, in hair. A sensitive and accurate method for the determination of these two compounds was developed using high-performance liquid chromatography coupled to tandem mass spectrometry detection (LC-MS/MS). The method was applied to 10 real case samples. For five patients, a time-resolved hair analysis was performed. Results varied from 0.35 ng/mg to 10.21 ng/mg hair for quetiapine and from 0.02 ng/mg to 3.19 ng/mg hair for 7-OH-quetiapine. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  3. Quantification of Lignin and Its Structural Features in Plant Biomass Using 13C Lignin as Internal Standard for Pyrolysis-GC-SIM-MS.

    PubMed

    van Erven, Gijs; de Visser, Ries; Merkx, Donny W H; Strolenberg, Willem; de Gijsel, Peter; Gruppen, Harry; Kabel, Mirjam A

    2017-10-17

    Understanding the mechanisms underlying plant biomass recalcitrance at the molecular level can only be achieved by accurate analyses of both the content and the structural features of the molecules involved. Current quantification of lignin is, however, largely based on unspecific gravimetric analysis after sulfuric acid hydrolysis. Hence, our research aimed at specific lignin quantification with concurrent characterization of its structural features. Hereto, for the first time, a polymeric 13C lignin was used as internal standard (IS) for lignin quantification via analytical pyrolysis coupled to gas chromatography with mass-spectrometric detection in selected ion monitoring mode (py-GC-SIM-MS). In addition, relative response factors (RRFs) for the various pyrolysis products obtained were determined and applied. First, 12C and 13C lignin were isolated from nonlabeled and uniformly 13C-labeled wheat straw, respectively, and characterized by heteronuclear single quantum coherence (HSQC) nuclear magnetic resonance (NMR) and py-GC/MS. The two lignin isolates were found to have identical structures. Second, 13C-IS based lignin quantification by py-GC-SIM-MS was validated in reconstituted biomass model systems with known contents of the 12C lignin analogue and was shown to be extremely accurate (>99.9%, R2 > 0.999) and precise (RSD < 1.5%). Third, 13C-IS based lignin quantification was applied to four common poaceous biomass sources (wheat straw, barley straw, corn stover, and sugar cane bagasse), and lignin contents were in good agreement with the total gravimetrically determined lignin contents. Our robust method proves to be a promising alternative for the high-throughput quantification of lignin in milled biomass samples and simultaneously provides direct insight into the structural features of lignin.

  4. Quantification of Lignin and Its Structural Features in Plant Biomass Using 13C Lignin as Internal Standard for Pyrolysis-GC-SIM-MS

    PubMed Central

    2017-01-01

    Understanding the mechanisms underlying plant biomass recalcitrance at the molecular level can only be achieved by accurate analyses of both the content and the structural features of the molecules involved. Current quantification of lignin is, however, largely based on unspecific gravimetric analysis after sulfuric acid hydrolysis. Hence, our research aimed at specific lignin quantification with concurrent characterization of its structural features. Hereto, for the first time, a polymeric 13C lignin was used as internal standard (IS) for lignin quantification via analytical pyrolysis coupled to gas chromatography with mass-spectrometric detection in selected ion monitoring mode (py-GC-SIM-MS). In addition, relative response factors (RRFs) for the various pyrolysis products obtained were determined and applied. First, 12C and 13C lignin were isolated from nonlabeled and uniformly 13C-labeled wheat straw, respectively, and characterized by heteronuclear single quantum coherence (HSQC) nuclear magnetic resonance (NMR) and py-GC/MS. The two lignin isolates were found to have identical structures. Second, 13C-IS based lignin quantification by py-GC-SIM-MS was validated in reconstituted biomass model systems with known contents of the 12C lignin analogue and was shown to be extremely accurate (>99.9%, R2 > 0.999) and precise (RSD < 1.5%). Third, 13C-IS based lignin quantification was applied to four common poaceous biomass sources (wheat straw, barley straw, corn stover, and sugar cane bagasse), and lignin contents were in good agreement with the total gravimetrically determined lignin contents. Our robust method proves to be a promising alternative for the high-throughput quantification of lignin in milled biomass samples and simultaneously provides direct insight into the structural features of lignin. PMID:28926698
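
    The arithmetic of internal-standard quantification with relative response factors can be sketched as follows; the peak areas, RRFs, and the simple averaging across pyrolysis products are illustrative stand-ins for the published procedure, not a reproduction of it.

      # Hypothetical sketch of the 13C-IS principle: lignin content follows
      # from RRF-corrected 12C/13C peak-area ratios and the known mass of
      # 13C lignin spiked into the sample. All values are illustrative.
      def lignin_content_ug(areas_12c, areas_13c, rrfs, mass_13c_is_ug):
          """Average the RRF-corrected 12C/13C area ratio over pyrolysis
          products, then scale by the internal-standard mass."""
          ratios = [a12 / (a13 * rrf)
                    for a12, a13, rrf in zip(areas_12c, areas_13c, rrfs)]
          return (sum(ratios) / len(ratios)) * mass_13c_is_ug

      print(lignin_content_ug([1.2e6, 8.0e5], [1.0e6, 7.5e5],
                              [0.9, 1.1], 50.0))   # ~57.6 ug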

  5. Quantification of anthropogenic impact on groundwater dependent terrestrial ecosystem using geochemical and isotope tools combined with 3-D flow and transport modeling

    NASA Astrophysics Data System (ADS)

    Zurek, A. J.; Witczak, S.; Dulinski, M.; Wachniew, P.; Rozanski, K.; Kania, J.; Postawa, A.; Karczewski, J.; Moscicki, W. J.

    2014-08-01

    A dedicated study was launched in 2010 with the main aim of better understanding the functioning of a groundwater dependent terrestrial ecosystem (GDTE) located in southern Poland. The GDTE consists of a valuable forest stand (Niepolomice Forest) and an associated wetland (Wielkie Bloto fen). A wide range of tools (environmental tracers, geochemistry, geophysics, 3-D flow and transport modeling) was used. The research was conducted along three major directions: (i) quantification of the dynamics of groundwater flow in various parts of the aquifer associated with the GDTE, (ii) quantification of the degree of interaction between the GDTE and the aquifer, and (iii) 3-D modeling of groundwater flow in the vicinity of the studied GDTE and quantification of the possible impact of enhanced exploitation of the aquifer on the status of the GDTE. Environmental tracer data (tritium, stable isotopes of water) strongly suggest that upward leakage from the aquifer contributes significantly to the present water balance of the studied wetland and the associated forest. Physico-chemical parameters of the water (pH, conductivity, Na/Cl ratio) confirm this notion. Model runs indicate that prolonged groundwater abstraction through the newly established network of water supply wells, conducted at the maximum permitted capacity (ca. 10 000 m3 d-1), may trigger drastic changes in the ecosystem functioning, eventually leading to its degradation.

  6. Quantification of brain lipids by FTIR spectroscopy and partial least squares regression

    NASA Astrophysics Data System (ADS)

    Dreissig, Isabell; Machill, Susanne; Salzer, Reiner; Krafft, Christoph

    2009-01-01

    Brain tissue is characterized by high lipid content. This content decreases, and the lipid composition changes, during the transformation from normal brain tissue to tumors. Therefore, the analysis of brain lipids might complement existing diagnostic tools to determine the tumor type and tumor grade. The objective of this work is to extract lipids from the gray matter and white matter of porcine brain tissue, record infrared (IR) spectra of these extracts and develop a quantification model for the main lipids based on partial least squares (PLS) regression. IR spectra of the pure lipids cholesterol, cholesterol ester, phosphatidic acid, phosphatidylcholine, phosphatidylethanolamine, phosphatidylserine, phosphatidylinositol, sphingomyelin, galactocerebroside and sulfatide were used as references. Two lipid mixtures were prepared for training and validation of the quantification model. The composition of lipid extracts predicted by PLS regression of the IR spectra was compared with lipid quantification by thin-layer chromatography.
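
    The PLS quantification step can be sketched with synthetic spectra in place of the measured FTIR data; the matrix sizes, noise level and the use of scikit-learn are illustrative assumptions rather than the study's actual workflow.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import cross_val_predict

      rng = np.random.default_rng(4)

      # Synthetic stand-in: rows of X are IR spectra of lipid mixtures
      # (absorbance at 600 wavenumbers); columns of Y are concentrations
      # of a few lipid classes. Entirely illustrative.
      n_mix, n_wn, n_lipids = 40, 600, 4
      conc = rng.random((n_mix, n_lipids))
      pure = rng.random((n_lipids, n_wn))        # "pure component" spectra
      spectra = conc @ pure + rng.normal(0, 0.01, (n_mix, n_wn))

      pls = PLSRegression(n_components=4)
      pred = cross_val_predict(pls, spectra, conc, cv=5)
      rmsecv = np.sqrt(((pred - conc) ** 2).mean(axis=0))
      print(rmsecv)  # cross-validated error per lipid class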

  7. Absolute quantification by droplet digital PCR versus analog real-time PCR

    PubMed Central

    Hindson, Christopher M; Chevillet, John R; Briggs, Hilary A; Gallichotte, Emily N; Ruf, Ingrid K; Hindson, Benjamin J; Vessella, Robert L; Tewari, Muneesh

    2014-01-01

    Nanoliter-sized droplet technology paired with digital PCR (ddPCR) holds promise for highly precise, absolute nucleic acid quantification. Our comparison of microRNA quantification by ddPCR and real-time PCR revealed greater precision (coefficients of variation decreased by 37–86%) and improved day-to-day reproducibility (by a factor of seven) of ddPCR but with comparable sensitivity. When we applied ddPCR to serum microRNA biomarker analysis, this translated to superior diagnostic performance for identifying individuals with cancer. PMID:23995387
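
    Absolute quantification in ddPCR rests on Poisson statistics over the droplet partition counts; a minimal sketch follows, where the nominal droplet volume of 0.85 nL is a typical value and the counts are invented.

      import numpy as np

      def ddpcr_concentration(n_positive, n_total, droplet_volume_nl=0.85):
          """Absolute target concentration (copies/uL) from droplet counts
          via Poisson statistics: lambda = -ln(1 - p)."""
          p = n_positive / n_total
          lam = -np.log1p(-p)                  # mean copies per droplet
          return lam / (droplet_volume_nl * 1e-3)  # copies per microliter

      # Example: 4,000 positives out of 15,000 accepted droplets
      print(f"{ddpcr_concentration(4000, 15000):.0f} copies/uL")  # ~365

    Using log1p(-p) rather than log(1 - p) keeps the estimate numerically stable when the positive fraction is very small.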

  8. A study of a self diagnostic platform for the detection of A2 biomarker for Leishmania donovani

    NASA Astrophysics Data System (ADS)

    Roche, Philip J. R.; Cheung, Maurice C.; Najih, Mohamed; McCall, Laura-Isobel; Fakih, Ibrahim; Chodavarapu, Vamsy P.; Ward, Brian; Ndao, Momar; Kirk, Andrew G.

    2012-03-01

    Visceral leishmaniasis (L. donovani) is a protozoan infection that attacks mononuclear phagocytes and causes liver and spleen damage that can be fatal. The investigation presented is a proof-of-concept development applying a plasmonic diagnostic platform with simple microfluidic sample delivery and optical readout. An immunoassay method is applied to the quantification of A2 protein, a highly immunogenic biomarker for the pathogen. Quantification of A2 was performed in the ng/ml range; analysis by ELISA suggested that a limit of 0.1 ng/ml of A2 corresponds to approximately 1 pathogen per ml, and the sensing system shows the potential to deliver a similar level of quantification. Assay complexity is significantly reduced, as no further enzyme-linked enhancement is required when applying a plasmonic methodology to an immunoassay. The basic instrumentation required for a portable device is described, along with a potential dual optical readout in which both the plasmonic and photoluminescent responses are assessed; the discussion includes application of the device to testing settings where non-literate communication of results is required, and issues of performance are addressed.

  9. An ultra-high pressure liquid chromatography-tandem mass spectrometry method for the quantification of teicoplanin in plasma of neonates.

    PubMed

    Begou, O; Kontou, A; Raikos, N; Sarafidis, K; Roilides, E; Papadoyannis, I N; Gika, H G

    2017-03-15

    The development and validation of an ultra-high pressure liquid chromatography (UHPLC) tandem mass spectrometry (MS/MS) method is described, with the aim of quantifying plasma teicoplanin concentrations in neonates. Pharmacokinetic data on teicoplanin in the neonatal population are very limited; a sensitive and reliable method for the determination of all isoforms of teicoplanin in a low sample volume is therefore of real importance. The main teicoplanin components were extracted by a simple acetonitrile precipitation step and analysed on a C18 chromatographic column by a triple-quadrupole MS with electrospray ionization. The method provides quantitative data over a linear range of 25-6400 ng/mL, with an LOD of 8.5 ng/mL and an LOQ of 25 ng/mL for total teicoplanin. The method was applied to plasma samples from neonates to support pharmacokinetic studies and proved to be a reliable and fast method for the quantification of teicoplanin concentration levels in plasma of infants during therapy in the intensive care unit. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. Assessment of cardiac fibrosis: a morphometric method comparison for collagen quantification.

    PubMed

    Schipke, Julia; Brandenberger, Christina; Rajces, Alexandra; Manninger, Martin; Alogna, Alessio; Post, Heiner; Mühlfeld, Christian

    2017-04-01

    Fibrotic remodeling of the heart is a frequent condition linked to various diseases and cardiac dysfunction. Collagen quantification is an important objective in cardiac fibrosis research; however, a variety of different histological methods are currently used that may differ in accuracy. Here, frequently applied collagen quantification techniques were compared. A porcine model of early-stage heart failure with preserved ejection fraction was used as an example. Semiautomated threshold analyses were imprecise, mainly due to inclusion of noncollagen structures or failure to detect certain collagen deposits. In contrast, collagen assessment by automated image analysis and light microscopy (LM)-stereology was more sensitive. Depending on the quantification method, the amount of estimated collagen varied and influenced intergroup comparisons. Picrosirius Red, Masson's trichrome, and Azan staining protocols yielded similar results, whereas the measured collagen area increased with increasing section thickness. Whereas none of the LM-based methods showed significant differences between the groups, electron microscopy (EM)-stereology revealed a significant collagen increase between cardiomyocytes in the experimental group, but not at other localizations. In conclusion, in contrast to the staining protocol, section thickness and the quantification method being used directly influence the estimated collagen content and thus, possibly, intergroup comparisons. EM in combination with stereology is a precise and sensitive method for collagen quantification if certain prerequisites are considered. For subtle fibrotic alterations, consideration of collagen localization may be necessary. Among LM methods, LM-stereology and automated image analysis are appropriate for quantifying fibrotic changes, the latter depending on careful control of the algorithm and comparable section staining. NEW & NOTEWORTHY Direct comparison of frequently applied histological fibrosis assessment techniques revealed that the measured collagen content depends directly on the quantification method used and on section thickness. Besides electron microscopy-stereology, which was precise and sensitive, light microscopy-stereology and automated image analysis proved appropriate for collagen quantification. Moreover, consideration of collagen localization might be important in revealing minor fibrotic changes. Copyright © 2017 the American Physiological Society.

  11. Semi-supervised Machine Learning for Analysis of Hydrogeochemical Data and Models

    NASA Astrophysics Data System (ADS)

    Vesselinov, Velimir; O'Malley, Daniel; Alexandrov, Boian; Moore, Bryan

    2017-04-01

    Data- and model-based analyses such as uncertainty quantification, sensitivity analysis, and decision support using complex physics models with numerous model parameters typically require a huge number of model evaluations (on the order of 10^6). Furthermore, simulations of complex physics may require substantial computational time. For example, accounting for simultaneously occurring physical processes such as fluid flow and biogeochemical reactions in a heterogeneous porous medium may require several hours of wall-clock time. To address these issues, we have developed a novel methodology for semi-supervised machine learning based on Non-negative Matrix Factorization (NMF) coupled with customized k-means clustering. The algorithm allows for automated, robust Blind Source Separation (BSS) of groundwater types (contamination sources) based on model-free analyses of observed hydrogeochemical data. We have also developed reduced-order modeling tools coupling support vector regression (SVR), genetic algorithms (GA), and artificial and convolutional neural networks (ANN/CNN). SVR is applied to predict the model behavior within the prior uncertainty ranges associated with the model parameters. ANN and CNN procedures are applied to upscale heterogeneity of the porous medium. In the upscaling process, fine-scale high-resolution models of heterogeneity inform coarse-resolution models that have improved computational efficiency while capturing the impact of fine-scale effects at the coarse scale of interest. These techniques are tested independently on a series of synthetic problems. We also present a decision analysis related to contaminant remediation in which the developed reduced-order models are applied to reproduce groundwater flow and contaminant transport in a synthetic heterogeneous aquifer. The tools are coded in Julia and are part of the MADS high-performance computational framework (https://github.com/madsjulia/Mads.jl).
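
    The NMF-plus-clustering idea can be sketched on synthetic data; the number of sources, the analyte matrix and the use of scikit-learn are illustrative assumptions here, not the MADS implementation (which is written in Julia).

      import numpy as np
      from sklearn.decomposition import NMF
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(5)

      # Synthetic stand-in: 60 wells x 10 geochemical analytes, generated
      # as a non-negative mixture of 3 hypothetical end-member "sources".
      sources = rng.random((3, 10))
      mixing = rng.dirichlet(np.ones(3), size=60)
      data = mixing @ sources + rng.normal(0, 0.005, (60, 10)).clip(min=0)

      # Blind source separation: factor data ~= W @ H with non-negativity
      nmf = NMF(n_components=3, init="nndsvda", max_iter=2000,
                random_state=0)
      w = nmf.fit_transform(data)   # per-well source contributions
      h = nmf.components_           # recovered source signatures

      # Group wells by dominant source via k-means on the mixing weights
      labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(w)
      print(labels)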

  12. Uncertainty Quantification in Alchemical Free Energy Methods.

    PubMed

    Bhati, Agastya P; Wan, Shunzhou; Hu, Yuan; Sherborne, Brad; Coveney, Peter V

    2018-06-12

    Alchemical free energy methods have gained much importance recently from several reports of improved ligand-protein binding affinity predictions based on their implementation using molecular dynamics simulations. A large number of variants of such methods implementing different accelerated sampling techniques and free energy estimators are available, each claimed to be better than the others in its own way. However, the key features of reproducibility and quantification of associated uncertainties in such methods have barely been discussed. Here, we apply a systematic protocol for uncertainty quantification to a number of popular alchemical free energy methods, covering both absolute and relative free energy predictions. We show that a reliable measure of error estimation is provided by ensemble simulation, that is, an ensemble of independent MD simulations, which applies irrespective of the free energy method. The need to use ensemble methods is fundamental and holds regardless of the duration of the molecular dynamics simulations performed.
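
    In the spirit of the ensemble approach described above, a bootstrap over per-replica estimates gives one simple error measure; the free energy values below are invented for illustration and this is not the authors' exact protocol.

      import numpy as np

      rng = np.random.default_rng(6)

      # Hypothetical free energy estimates (kcal/mol) from an ensemble of
      # independent MD replicas of the same alchemical calculation.
      dg = np.array([-7.9, -8.4, -8.1, -7.6, -8.8, -8.2, -7.8, -8.5])

      # Bootstrap the ensemble mean to get a confidence interval
      boot = rng.choice(dg, size=(10_000, dg.size), replace=True).mean(axis=1)
      lo, hi = np.percentile(boot, [2.5, 97.5])
      print(f"dG = {dg.mean():.2f} kcal/mol, 95% CI [{lo:.2f}, {hi:.2f}]")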

  13. Use of measurement theory for operationalization and quantification of psychological constructs in systems dynamics modelling

    NASA Astrophysics Data System (ADS)

    Fitkov-Norris, Elena; Yeghiazarian, Ara

    2016-11-01

    The analytical tools available to social scientists have traditionally been adapted from tools originally designed for the analysis of natural science phenomena. This article discusses the applicability of systems dynamics, a qualitatively based modelling approach, as a possible analysis and simulation tool that bridges the gap between the social and natural sciences. After a brief overview of the systems dynamics modelling methodology, the advantages as well as the limiting factors of systems dynamics for potential applications in the field of social sciences and human interactions are discussed. Issues arise with regard to the operationalization and quantification of latent constructs at the simulation-building stage of the systems dynamics methodology, and measurement theory is proposed as a ready and waiting solution to the problem of dynamic model calibration, with a view to improving simulation model reliability and validity and encouraging the development of standardised, modular system dynamics models that can be used in social science research.

  14. Microbes a Tool for the Remediation of Organotin Pollution Determined by Static Headspace Gas Chromatography-Mass Spectrometry.

    PubMed

    Finnegan, Christopher; Ryan, David; Enright, Anne-Marie; Garcia-Cabellos, Guiomar

    2018-03-10

    Tributyltin (TBT) is one of the most toxic anthropogenic compounds introduced into the marine environment. Despite its global ban in 2008, TBT remains a problem of great concern due to its high affinity for particulate matter, which provides a direct and potentially persistent route of entry into benthic sediments. Bioremediation strategies may constitute an alternative to conventional physicochemical methods, benefiting from the potential of microorganisms to metabolize anthropogenic compounds. In this work, a simple, precise and accurate static headspace gas chromatography method was developed to investigate the ability of TBT-degrading microbes in sedimentary microcosms over a period of 120 days. The proposed method was validated for linearity, repeatability, accuracy, specificity, limit of detection and limit of quantification. The method was subsequently applied successfully, employing the principles of green chemistry, to the detection and quantification of TBT and its degradation compounds in sediment samples on days 0, 30, 60, 90 and 120 of the experiment. On day 120, the concentration of TBT remaining in the microcosms ranged from 91.91 ng/g wet wt for the least effective microbial inoculant down to 52.73 ng/g wet wt for the most effective microbial inoculant, from a starting concentration of 100 ng/g wet wt.

  15. The Effects of Statistical Multiplicity of Infection on Virus Quantification and Infectivity Assays.

    PubMed

    Mistry, Bhaven A; D'Orsogna, Maria R; Chou, Tom

    2018-06-19

    Many biological assays are employed in virology to quantify parameters of interest. Two such classes of assays, virus quantification assays (VQAs) and infectivity assays (IAs), aim to estimate the number of viruses present in a solution and the ability of a viral strain to successfully infect a host cell, respectively. VQAs operate at extremely dilute concentrations, and results can be subject to stochastic variability in virus-cell interactions. At the other extreme, high viral-particle concentrations are used in IAs, resulting in large numbers of viruses infecting each cell, enough for measurable change in total transcription activity. Furthermore, host cells can be infected at any concentration regime by multiple particles, resulting in a statistical multiplicity of infection and yielding potentially significant variability in the assay signal and parameter estimates. We develop probabilistic models for statistical multiplicity of infection at low and high viral-particle-concentration limits and apply them to the plaque (VQA), endpoint dilution (VQA), and luciferase reporter (IA) assays. A web-based tool implementing our models and analysis is also developed and presented. We test our proposed new methods for inferring experimental parameters from data using numerical simulations and show improvement on existing procedures in all limits. Copyright © 2018 Biophysical Society. Published by Elsevier Inc. All rights reserved.
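
    The Poisson model underlying statistical multiplicity of infection can be sketched in a few lines; the helper names are hypothetical and the example numbers follow directly from the model, not from the authors' web-based tool.

      import numpy as np

      def infected_fraction(moi):
          """Fraction of cells infected by at least one virion when the
          virions per cell are Poisson-distributed with mean `moi`."""
          return 1.0 - np.exp(-moi)

      def moi_from_infected_fraction(f):
          """Invert the Poisson model to estimate the true multiplicity
          of infection from an observed infected-cell fraction."""
          return -np.log(1.0 - f)

      # At a nominal MOI of 1, only ~63% of cells are infected, yet ~42%
      # of the infected cells carry more than one virion.
      moi = 1.0
      p_inf = infected_fraction(moi)
      p_multi = 1.0 - np.exp(-moi) - moi * np.exp(-moi)   # P(k >= 2)
      print(p_inf, p_multi / p_inf)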

  16. DBS-platform for biomonitoring and toxicokinetics of toxicants: proof of concept using LC-MS/MS analysis of fipronil and its metabolites in blood

    NASA Astrophysics Data System (ADS)

    Raju, Kanumuri Siva Rama; Taneja, Isha; Rashid, Mamunur; Sonkar, Ashish Kumar; Wahajuddin, Muhammad; Singh, Sheelendra Pratap

    2016-03-01

    A simple, sensitive and high-throughput LC-MS/MS method was developed and validated for quantification of fipronil, fipronil sulfone and fipronil desulfinyl in rat and human dried blood spots (DBS). DBS samples were prepared by spiking 10 μl of blood on DMPK-C cards, followed by drying at room temperature. The whole blood spots were then punched from the card and extracted using acetonitrile. The total chromatographic run time of the method was only 2 min. The lower limit of quantification was 0.1 ng/ml for all the analytes. The method was successfully applied to determine fipronil desulfinyl in DBS samples obtained from its toxicokinetic study in rats following an intravenous dose (1 mg/kg). In conclusion, the proposed DBS methodology has significant potential in toxicokinetic and biomonitoring studies of environmental toxicants. This microvolume DBS technique will be an ideal tool for biomonitoring studies, particularly in the paediatric population. Small volume requirements, a minimally invasive blood sampling method, and easier storage and shipping procedures make DBS a suitable technique for such studies. Further, the DBS technique contributes to the principles of the 3Rs, resulting in a significant reduction in the number of rodents used and refinement of sample collection for toxicokinetic studies.

  17. Engendering drug problems: Materialising gender in the DUDIT and other screening and diagnostic 'apparatuses'.

    PubMed

    Dwyer, Robyn; Fraser, Suzanne

    2017-06-01

    It is widely accepted that alcohol and other drug consumption is profoundly gendered. Just where this gendering is occurring, however, remains the subject of debate. We contend that one important and overlooked site where the gendering of substance consumption and addiction is taking place is AOD research itself: in particular, the addiction screening and diagnostic tools designed to measure and track substance consumption and problems within populations. These tools establish key criteria and set numerical threshold scores for the identification of problems. In many of these tools, separate threshold scores for women and men are established or recommended. Drawing on Karen Barad's concept of post-humanist performativity, in this article we examine the ways in which gender itself is being materialised by these apparatuses of measurement. We focus primarily on the Drug Use Disorders Identification Test (DUDIT) as an exemplar of gendering processes that operate across addiction tools more broadly. We consider gendering processes operating through the tools' questions themselves, and we also examine the quantification and legitimation processes used in establishing gender difference and the implications these have for women. We find that tools rely on and reproduce narrow and marginalising assumptions about women as essentially fragile and vulnerable, and simultaneously reinforce normative expectations that women sacrifice pleasure. The seemingly objective and neutral quantification processes operating in tools naturalise gender as they enact it. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Quantification of the rates of resynchronization of heart rate with body temperature rhythms in man following a photoperiod shift

    NASA Technical Reports Server (NTRS)

    Hetherington, N. W.; Rosenblatt, L. S.; Higgins, E. A.; Winget, C. M.

    1973-01-01

    A mathematical model previously presented by Rosenblatt et al. (1973) for estimating the rates of resynchronization of individual biorhythms following transmeridian flights or photoperiod shifts is extended to the estimation of the rates at which two biorhythms resynchronize with respect to each other. Such quantification of the rate of restoration of the initial phase relationship of the two biorhythms is pointed out as a valuable tool in the study of internal desynchronosis.

  19. Positron emission tomography in cardiovascular disease.

    PubMed

    Beanlands, R

    1996-10-01

    Positron emission tomography (PET) represents an advanced form of nuclear imaging technology. The use of positron-emitting isotopes, such as C-11, O-15, N-13, and F-18, permits radiolabelling of naturally occurring compounds in the body or close analogues. This, combined with the technical advantages of PET imaging, allows quantification of physiological processes in humans. PET has become established as the most accurate noninvasive means for the diagnosis of coronary artery disease using myocardial perfusion radiotracers, which include rubidium-82, N-13-ammonia, and O-15-water. These approaches have also been applied for long-term evaluation of the effects of therapy and for the quantification of myocardial blood flow. Radiolabelling of metabolic substrates, including C-11 palmitate, C-11 acetate and F-18 fluorodeoxyglucose (FDG), has permitted evaluation of myocardial metabolism. F-18 FDG PET imaging has been established as the best means for defining viable myocardium in patients with reduced ventricular function being considered for revascularization. FDG PET can also identify patients being considered for cardiac transplant who may be candidates for revascularization. In this review, other applications for metabolic, autonomic nervous system and receptor imaging are also discussed. The availability of cardiac PET in Canada is currently limited. However, with the reducing costs of capital and more cost-effectiveness data, PET may become more widely available. Cardiac PET imaging is established as a tremendous diagnostic tool for defining viable myocardium, assessing perfusion and providing long-term evaluation of therapy without invasive procedures. PET is also a vital research tool capable of evaluating flow, metabolism, myocardial receptors, the autonomic nervous system and, potentially, radiolabelled drugs. Cardiac PET imaging will continue to provide important insight, expanding our understanding and treatment of patients with cardiovascular disease.

  20. Three-dimensional anthropometric techniques applied to the fabrication of burn masks and the quantification of wound healing

    NASA Astrophysics Data System (ADS)

    Whitestone, Jennifer J.; Geisen, Glen R.; McQuiston, Barbara K.

    1997-03-01

    Anthropometric surveys conducted by the military provide comprehensive human body measurement data that define human interface requirements for successful mission performance of weapon systems, including cockpits, protective equipment, and clothing. The application of human body dimensions to model humans and human-machine performance begins with engineering anthropometry. There are two critical elements to engineering anthropometry: data acquisition and data analysis. First, the human body is captured dimensionally with either traditional anthropometric tools, such as calipers and tape measures, or with advanced image acquisition systems, such as a laser scanner. Next, numerous statistical analysis tools, such as multivariate modeling and feature envelopes, are used to effectively transition these data for the design and evaluation of equipment and work environments. Recently, Air Force technology transfer allowed researchers at the Computerized Anthropometric Research and Design (CARD) Laboratory at Wright-Patterson Air Force Base to work with the Dayton, Ohio area medical community in assessing the rate of wound healing and improving the fit of total contact burn masks. This paper describes the successful application of CARD Lab engineering anthropometry to two medically oriented human interface problems.

  1. Inductively Coupled Plasma Mass Spectrometry (ICP-MS) Applications in Quantitative Proteomics.

    PubMed

    Chahrour, Osama; Malone, John

    2017-01-01

    Recent advances in inductively coupled plasma mass spectrometry (ICP-MS) hyphenated to different separation techniques have promoted it as a valuable tool in protein/peptide quantification. These emerging ICP-MS applications allow absolute quantification by measuring specific elemental responses. One approach quantifies elements already present in the structure of the target peptide (e.g. phosphorus and sulphur) as natural tags. Quantification of these natural tags allows the elucidation of the degree of protein phosphorylation in addition to absolute protein quantification. A separate approach is based on utilising bi-functional labelling substances (those containing ICP-MS detectable elements) that form a covalent chemical bond with the protein, thus creating analogues which are detectable by ICP-MS. Based on the previously established stoichiometries of the labelling reagents, quantification can be achieved. This technique is very useful for the design of precise multiplexed quantitation schemes to address the challenges of biomarker screening and discovery. This review discusses the capabilities and different strategies to implement ICP-MS in the field of quantitative proteomics. Copyright © Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  2. Quantification of free fatty acids in human stratum corneum using tandem mass spectrometry and surrogate analyte approach.

    PubMed

    Dapic, Irena; Kobetic, Renata; Brkljacic, Lidija; Kezic, Sanja; Jakasa, Ivone

    2018-02-01

    The free fatty acids (FFAs) are one of the major components of the lipids in the stratum corneum (SC), the uppermost layer of the skin. Relative composition of FFAs has been proposed as a biomarker of the skin barrier status in patients with atopic dermatitis (AD). Here, we developed an LC-ESI-MS/MS method for simultaneous quantification of a range of FFAs with long and very long chain length in the SC collected by adhesive tape (D-Squame). The method, based on derivatization with 2-bromo-1-methylpyridinium iodide and 3-carbinol-1-methylpyridinium iodide, allowed highly sensitive detection and quantification of FFAs using multiple reaction monitoring. For the quantification, we applied a surrogate analyte approach and internal standardization using isotope labeled derivatives of FFAs. Adhesive tapes showed the presence of several FFAs, which are also present in the SC, a problem encountered in previous studies. Therefore, the levels of FFAs in the SC were corrected using C12:0, which was present on the adhesive tape, but not detected in the SC. The method was applied to SC samples from patients with atopic dermatitis and healthy subjects. Quantification using multiple reaction monitoring allowed sufficient sensitivity to analyze FFAs of chain lengths C16-C28 in the SC collected on only one tape strip. Copyright © 2017 John Wiley & Sons, Ltd.

  3. Multiplex quantification of 12 European Union authorized genetically modified maize lines with droplet digital polymerase chain reaction.

    PubMed

    Dobnik, David; Spilsberg, Bjørn; Bogožalec Košir, Alexandra; Holst-Jensen, Arne; Žel, Jana

    2015-08-18

    Presence of genetically modified organisms (GMOs) in food and feed products is regulated in many countries. The European Union (EU) has implemented a threshold for labeling of products containing more than 0.9% of authorized GMOs per ingredient. As the number of GMOs has increased over time, standard-curve-based simplex quantitative polymerase chain reaction (qPCR) analyses are no longer sufficiently cost-effective, despite the widespread use of initial PCR-based screenings. Newly developed GMO detection methods, including multiplex methods, are mostly focused on screening and detection but not quantification. On the basis of droplet digital PCR (ddPCR) technology, multiplex assays for quantification of all 12 EU-authorized GM maize lines (as of 1 April 2015) were developed. Because of the high sequence similarity of some of the 12 GM targets, two separate multiplex assays were needed. In both assays (4-plex and 10-plex), the transgenes were labeled with one fluorescence reporter and the endogene with another (GMO concentration = transgene/endogene ratio). It was shown that both multiplex assays produce specific results and that performance parameters such as limit of quantification, repeatability, and trueness comply with international recommendations for GMO quantification methods. Moreover, for samples containing GMOs, the throughput and cost-effectiveness are significantly improved compared to qPCR. Thus, it was concluded that the multiplex ddPCR assays could be applied for routine quantification of the 12 EU-authorized GM maize lines. In the case of new authorizations, the events can easily be added to the existing multiplex assays. The presented principle of quantitative multiplexing can be applied to any other domain.

  4. Fluorescent quantification of melanin.

    PubMed

    Fernandes, Bruno; Matamá, Teresa; Guimarães, Diana; Gomes, Andreia; Cavaco-Paulo, Artur

    2016-11-01

    Melanin quantification is reportedly performed by absorption spectroscopy, commonly at 405 nm. Here, we propose the implementation of fluorescence spectroscopy for melanin assessment. In a typical in vitro assay to assess melanin production in response to an external stimulus, absorption spectroscopy clearly overvalues melanin content. This method is also incapable of distinguishing non-melanotic/amelanotic control cells from those that are actually capable of performing melanogenesis. Therefore, fluorescence spectroscopy is the best method for melanin quantification as it proved to be highly specific and accurate, detecting even small variations in the synthesis of melanin. This method can also be applied to the quantification of melanin in more complex biological matrices like zebrafish embryos and human hair. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  5. Rapid and Easy Protocol for Quantification of Next-Generation Sequencing Libraries.

    PubMed

    Hawkins, Steve F C; Guest, Paul C

    2018-01-01

    The emergence of next-generation sequencing (NGS) over the last 10 years has increased the efficiency of DNA sequencing in terms of speed, ease, and price. However, exact quantification of an NGS library is crucial in order to obtain good data on sequencing platforms developed by the current market leader Illumina. Different approaches for DNA quantification are currently available, and the most commonly used are based on analysis of the physical properties of the DNA through spectrophotometric or fluorometric methods. Although these methods are technically simple, they do not allow quantification as exact as that achieved with a real-time quantitative PCR (qPCR) approach. A qPCR protocol for DNA quantification with applications in NGS library preparation studies is presented here. It can be applied in various fields of study, such as medical disorders resulting from nutritional programming disturbances.
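
    Standard-curve qPCR quantification of this kind reduces to a linear fit of Cq against log10 concentration; a minimal sketch with illustrative numbers rather than the protocol's actual values:

    ```python
    import numpy as np

    # Standard curve from serial dilutions of a standard of known concentration.
    known_conc = np.array([10.0, 1.0, 0.1, 0.01, 0.001])  # pM (illustrative)
    cq = np.array([12.1, 15.4, 18.8, 22.2, 25.5])         # measured Cq values

    slope, intercept = np.polyfit(np.log10(known_conc), cq, 1)
    efficiency = 10 ** (-1.0 / slope) - 1                 # ~1.0 means 100 %

    def quantify(cq_unknown, dilution_factor=1.0):
        """Interpolate an unknown library concentration from its Cq."""
        return dilution_factor * 10 ** ((cq_unknown - intercept) / slope)

    print(f"PCR efficiency: {efficiency:.1%}")
    print(f"Library stock: {quantify(17.0, dilution_factor=1000):.0f} pM")
    ```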

  6. Sequential Design of Experiments to Maximize Learning from Carbon Capture Pilot Plant Testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soepyan, Frits B.; Morgan, Joshua C.; Omell, Benjamin P.

    Pilot plant test campaigns can be expensive and time-consuming. Therefore, it is of interest to maximize the amount of learning and the efficiency of the test campaign given the limited number of experiments that can be conducted. This work investigates the use of sequential design of experiments (SDOE) to overcome these challenges by demonstrating its usefulness for a recent solvent-based CO2 capture plant test campaign. Unlike traditional design of experiments methods, SDOE regularly uses information from ongoing experiments to determine the optimum locations in the design space for subsequent runs within the same experiment. However, there are challenges that need to be addressed, including reducing the high computational burden to efficiently update the model, and the need to incorporate the methodology into a computational tool. We address these challenges by applying SDOE in combination with a software tool, the Framework for Optimization, Quantification of Uncertainty and Surrogates (FOQUS) (Miller et al., 2014a, 2016, 2017). The results of applying SDOE to a pilot plant test campaign for CO2 capture suggest that relative to traditional design of experiments methods, SDOE can more effectively reduce the uncertainty of the model, thus decreasing technical risk. Future work includes integrating SDOE into FOQUS and using SDOE to support additional large-scale pilot plant test campaigns.
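
    In the spirit of SDOE, though not the FOQUS implementation itself, sequential design can be sketched as placing each new run where a surrogate model of the pilot plant is least certain; everything below (the toy plant response, values, parameters) is illustrative:

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(0)

    def pilot_plant(x):                      # stand-in for a real test run
        return np.sin(3 * x) + 0.1 * rng.normal(size=np.shape(x))

    X = rng.uniform(0, 2, size=(4, 1))       # initial design points
    y = pilot_plant(X[:, 0])
    candidates = np.linspace(0, 2, 200).reshape(-1, 1)

    for run in range(6):                     # sequential stages
        gp = GaussianProcessRegressor(kernel=RBF(), normalize_y=True).fit(X, y)
        _, sd = gp.predict(candidates, return_std=True)
        x_next = candidates[np.argmax(sd)]   # most uncertain condition
        X = np.vstack([X, [x_next]])
        y = np.append(y, pilot_plant(x_next))
    ```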

  7. A quantitative witness for Greenberger-Horne-Zeilinger entanglement.

    PubMed

    Eltschka, Christopher; Siewert, Jens

    2012-01-01

    Along with the vast progress in experimental quantum technologies there is an increasing demand for the quantification of entanglement between three or more quantum systems. Theory still does not provide adequate tools for this purpose. The objective is, besides the quest for exact results, to develop operational methods that allow for efficient entanglement quantification. Here we put forward an analytical approach that serves both these goals. We provide a simple procedure to quantify Greenberger-Horne-Zeilinger-type multipartite entanglement in arbitrary three-qubit states. For two qubits this method is equivalent to Wootters' seminal result for the concurrence. It establishes a close link between entanglement quantification and entanglement detection by witnesses, and can be generalised both to higher dimensions and to more than three parties.
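
    For the two-qubit case the abstract refers to, Wootters' concurrence can be computed directly from the density matrix; a minimal sketch:

    ```python
    import numpy as np

    def concurrence(rho):
        """Wootters' concurrence of a two-qubit density matrix."""
        sy = np.array([[0, -1j], [1j, 0]])
        flip = np.kron(sy, sy)
        rho_tilde = flip @ rho.conj() @ flip
        # Eigenvalues of rho @ rho_tilde are real and non-negative.
        lam = np.sort(np.sqrt(np.abs(np.linalg.eigvals(rho @ rho_tilde))))[::-1]
        return max(0.0, lam[0] - lam[1] - lam[2] - lam[3])

    # Bell state (|00> + |11>)/sqrt(2): concurrence 1.
    bell = np.zeros((4, 4), dtype=complex)
    bell[0, 0] = bell[0, 3] = bell[3, 0] = bell[3, 3] = 0.5
    print(concurrence(bell))   # ~1.0 up to numerical noise
    ```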

  8. A quantitative witness for Greenberger-Horne-Zeilinger entanglement

    PubMed Central

    Eltschka, Christopher; Siewert, Jens

    2012-01-01

    Along with the vast progress in experimental quantum technologies there is an increasing demand for the quantification of entanglement between three or more quantum systems. Theory still does not provide adequate tools for this purpose. The objective is, besides the quest for exact results, to develop operational methods that allow for efficient entanglement quantification. Here we put forward an analytical approach that serves both these goals. We provide a simple procedure to quantify Greenberger-Horne-Zeilinger–type multipartite entanglement in arbitrary three-qubit states. For two qubits this method is equivalent to Wootters' seminal result for the concurrence. It establishes a close link between entanglement quantification and entanglement detection by witnesses, and can be generalised both to higher dimensions and to more than three parties. PMID:23267431

  9. Application of Targeted Mass Spectrometry for the Quantification of Sirtuins in the Central Nervous System

    NASA Astrophysics Data System (ADS)

    Jayasena, T.; Poljak, A.; Braidy, N.; Zhong, L.; Rowlands, B.; Muenchhoff, J.; Grant, R.; Smythe, G.; Teo, C.; Raftery, M.; Sachdev, P.

    2016-10-01

    Sirtuin proteins have a variety of intracellular targets, thereby regulating multiple biological pathways including neurodegeneration. However, relatively little is currently known about the role or expression of the 7 mammalian sirtuins in the central nervous system. Western blotting, PCR and ELISA are the main techniques currently used to measure sirtuin levels. To achieve sufficient sensitivity and selectivity in a multiplex-format, a targeted mass spectrometric assay was developed and validated for the quantification of all seven mammalian sirtuins (SIRT1-7). Quantification of all peptides was by multiple reaction monitoring (MRM) using three mass transitions per protein-specific peptide, two specific peptides for each sirtuin and a stable isotope labelled internal standard. The assay was applied to a variety of samples including cultured brain cells, mammalian brain tissue, CSF and plasma. All sirtuin peptides were detected in the human brain, with SIRT2 being the most abundant. Sirtuins were also detected in human CSF and plasma, and guinea pig and mouse tissues. In conclusion, we have successfully applied MRM mass spectrometry for the detection and quantification of sirtuin proteins in the central nervous system, paving the way for more quantitative and functional studies.
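
    A minimal sketch of single-point internal-standard quantification as used in MRM assays of this kind (the study's calibration may be more elaborate; names and numbers are illustrative):

    ```python
    def mrm_quantify(area_light, area_heavy, spike_fmol, response_factor=1.0):
        """Endogenous peptide amount from the light/heavy MRM peak-area ratio.

        area_light: peak area of the endogenous ("light") peptide
        area_heavy: peak area of the stable-isotope-labelled internal standard
        spike_fmol: amount of internal standard spiked into the sample
        """
        return response_factor * spike_fmol * area_light / area_heavy

    # e.g. a SIRT2 peptide against its heavy counterpart spiked at 50 fmol
    print(mrm_quantify(area_light=3.2e5, area_heavy=8.0e5, spike_fmol=50.0))
    ```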

  10. Quantification of Pulmonary Inflammatory Processes Using Chest Radiography: Tuberculosis as the Motivating Application

    PubMed Central

    Giacomini, Guilherme; Miranda, José R.A.; Pavan, Ana Luiza M.; Duarte, Sérgio B.; Ribeiro, Sérgio M.; Pereira, Paulo C.M.; Alves, Allan F.F.; de Oliveira, Marcela; Pina, Diana R.

    2015-01-01

    The purpose of this work was to develop a quantitative method for evaluating the pulmonary inflammatory process (PIP) through the computational analysis of chest radiography exams in posteroanterior (PA) and lateral views. The quantification procedure was applied to patients with tuberculosis (TB) as the motivating application. A study of high-resolution computed tomography (HRCT) examinations of patients with TB was developed to establish a relation between the inflammatory process and the signal difference-to-noise ratio (SDNR) measured in the PA projection. A phantom study was used to validate this relation, which was implemented using an algorithm that is able to estimate the volume of the inflammatory region based solely on SDNR values in the chest radiographs of patients. The PIP volumes quantified for 30 patients with TB were used for comparisons with direct HRCT analysis of the same patients. The Bland–Altman statistical analyses showed no significant differences between the 2 quantification methods. The linear regression line had a correlation coefficient of R² = 0.97 and P < 0.001, showing a strong association between the volume determined by our evaluation method and the results obtained by direct HRCT scan analysis. Since the diagnosis and follow-up of patients with TB is commonly performed using X-ray exams, the method developed herein can be considered an adequate tool for quantifying the PIP with a lower patient radiation dose and lower institutional cost. Although we used patients with TB for the application of the method, it may be used for other pulmonary diseases characterized by a PIP. PMID:26131814
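
    A minimal sketch of the Bland-Altman comparison used here, with illustrative volumes in place of the study's data:

    ```python
    import numpy as np

    def bland_altman(a, b):
        """Bias and 95 % limits of agreement between two methods."""
        diff = np.asarray(a, float) - np.asarray(b, float)
        bias = diff.mean()
        half_width = 1.96 * diff.std(ddof=1)
        return bias, (bias - half_width, bias + half_width)

    vol_radiography = [102.0, 88.0, 130.0, 75.0, 95.0]   # PIP volumes (cm^3)
    vol_hrct = [100.0, 90.0, 128.0, 78.0, 93.0]
    print(bland_altman(vol_radiography, vol_hrct))
    ```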

  11. Morphomics: An integral part of systems biology of the human placenta.

    PubMed

    Mayhew, T M

    2015-04-01

    The placenta is a transient organ whose functioning has health consequences far beyond the embryo/fetus. Understanding the biology of any system (organ, organism, single cell, etc.) requires a comprehensive and inclusive approach which embraces all the biomedical disciplines and 'omic' technologies and then integrates the information obtained from all of them. Among the latest 'omics' is morphomics. The terms morphome and morphomics have been applied incoherently in biology and biomedicine but, recently, they have been given clear and widescale definitions. Morphomics is placed in the context of other 'omics', and its pertinent technologies and tools for sampling and quantitation are reviewed. Emphasis is accorded to the importance of random sampling principles in systems biology and the value of combining 3D quantification with alternative imaging techniques to advance knowledge and understanding of the human placental morphome. By analogy to other 'omes', the morphome is the totality of morphological features within a system and morphomics is the systematic study of those structures. Information about structure is required at multiple levels of resolution in order to better understand the processes by which a given system alters with time, experimental treatment or environmental insult. Therefore, morphomics research includes all imaging techniques at all levels of achievable resolution, from gross anatomy and medical imaging, via optical and electron microscopy, to molecular characterisation. Quantification is an important element of all 'omics' studies and, because biological systems exist and operate in 3-dimensional (3D) space, precise descriptions of form, content and spatial relationships require the quantification of structure in 3D. These considerations are relevant to future study contributions to the Human Placenta Project. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. Quantification and characterization of glyphosate use and loss in a residential area.

    PubMed

    Tang, Ting; Boënne, Wesley; Desmet, Nele; Seuntjens, Piet; Bronders, Jan; van Griensven, Ann

    2015-06-01

    Urban runoff can be a significant source of pesticides in urban streams. However, quantification of this source has been difficult because pesticide use by urban residents (e.g., on pavements or in gardens) is often unknown, particularly at the scale of a residential catchment. Proper quantification and characterization of pesticide loss via urban runoff require sound information on the use and occurrence of pesticides at hydrologically-relevant spatial scales, involving various hydrological conditions. We conducted a monitoring study in a residential area (9.5 ha, Flanders, Belgium) to investigate the use and loss of a widely-used herbicide (glyphosate) and its major degradation product (aminomethylphosphonic acid, AMPA). The study covered 13 rainfall events over 67 days. Overall, less than 0.5% of glyphosate applied was recovered from the storm drain outflow in the catchment. Maximum detected concentrations were 6.1 μg/L and 5.8 μg/L for glyphosate and AMPA, respectively, both of which are below the predicted no-effect concentration for surface water proposed by the Flemish environmental agency (10 μg/L), but are above the EU drinking water standard (0.1 μg/L). The measured concentrations and percentage loss rates can be attributed partially to the strong sorption capacity of glyphosate and low runoff potential in the study area. However, glyphosate loss varied considerably among rainfall events and event load of glyphosate mass was mainly controlled by rainfall amount, according to further statistical analyses. To obtain urban pesticide management insights, robust tools are required to investigate the loss and occurrence of pesticides influenced by various factors, particularly the hydrological and spatial factors. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. Quantitative Detection of Streptococcus pneumoniae in Nasopharyngeal Secretions by Real-Time PCR

    PubMed Central

    Greiner, Oliver; Day, Philip J. R.; Bosshard, Philipp P.; Imeri, Fatime; Altwegg, Martin; Nadal, David

    2001-01-01

    Streptococcus pneumoniae is an important cause of community-acquired pneumonia. However, in this setting the diagnostic sensitivity of blood cultures is below 30%. Since during such infections changes in the amounts of S. pneumoniae may also occur in the upper respiratory tract, quantification of these bacteria in nasopharyngeal secretions (NPSs) may offer a suitable diagnostic approach. Real-time PCR offers a sensitive, efficient, and routinely reproducible approach to quantification. Using primers and a fluorescent probe specific for the pneumolysin gene, we were able to detect DNA from serial dilutions of S. pneumoniae cells in which the quantities of DNA ranged from the amounts extracted from 1 to 10⁶ cells. No difference was noted when the same DNA was mixed with DNA extracted from NPSs shown by culture to be free of S. pneumoniae, suggesting that this bacterium can be detected and accurately quantitated in clinical samples. DNAs from Haemophilus influenzae, Moraxella catarrhalis, or alpha-hemolytic streptococci other than S. pneumoniae were not amplified or were only weakly amplified when there were ≥10⁶ cells per reaction mixture. When the assay was applied to NPSs from patients with respiratory tract infections, the assay performed with a sensitivity of 100% and a specificity of up to 96% compared to the culture results. The numbers of S. pneumoniae organisms detected by real-time PCR correlated with the numbers detected by semiquantitative cultures. A real-time PCR that targeted the pneumolysin gene provided a sensitive and reliable means for routine rapid detection and quantification of S. pneumoniae present in NPSs. This assay may serve as a tool to study changes in the amounts of S. pneumoniae during lower respiratory tract infections. PMID:11526140

  14. Virtual touch tissue quantification using acoustic radiation force impulse technology: initial clinical experience with solid breast masses.

    PubMed

    Bai, Min; Du, Lianfang; Gu, Jiying; Li, Fan; Jia, Xiao

    2012-02-01

    The purpose of this study was to investigate the clinical usage of Virtual Touch tissue quantification (VTQ; Siemens Medical Solutions, Mountain View, CA) implementing sonographic acoustic radiation force impulse technology for differentiation between benign and malignant solid breast masses. A total of 143 solid breast masses were examined with VTQ, and their shear wave velocities (SWVs) were measured. From all of the masses, 30 were examined by two independent operators to evaluate the reproducibility of the results of VTQ measurement. All masses were later surgically resected, and the histologic results were correlated with the SWV results. A receiver operating characteristic curve was calculated to assess the diagnostic performance of VTQ. A total of 102 benign lesions and 41 carcinomas were diagnosed on the basis of histologic examination. The VTQ measurements performed by the two independent operators yielded a correlation coefficient of 0.885. Applying a cutoff point of 3.065 m/s, a significant difference (P < .001) was found between the SWVs of the benign (mean ± SD, 2.25 ± 0.59 m/s) and malignant (5.96 ± 2.96 m/s) masses. The sensitivity, specificity, and area under the receiver operating characteristic curve for the differentiation were 75.6%, 95.1%, and 85.6%, respectively. When the repeated non-numeric result X.XX of the SWV measurements was designated as an indicator of malignancy, the sensitivity, specificity, and accuracy were 63.4%, 100%, and 89.5%. Virtual Touch tissue quantification can yield reproducible and quantitative diagnostic information on solid breast masses and serve as an effective diagnostic tool for differentiation between benign and malignant solid masses.
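
    A minimal sketch of evaluating an SWV cutoff such as the 3.065 m/s reported here, with illustrative values in place of the study's data:

    ```python
    import numpy as np

    def cutoff_metrics(swv, malignant, cutoff=3.065):
        """Sensitivity and specificity of classifying SWV >= cutoff as malignant."""
        swv = np.asarray(swv, float)
        malignant = np.asarray(malignant, bool)
        predicted = swv >= cutoff
        sensitivity = (predicted & malignant).sum() / malignant.sum()
        specificity = (~predicted & ~malignant).sum() / (~malignant).sum()
        return sensitivity, specificity

    swv = [2.1, 2.4, 3.3, 5.8, 6.2, 2.0, 2.9, 4.1]   # m/s, illustrative
    malignant = [0, 0, 0, 1, 1, 0, 0, 1]
    print(cutoff_metrics(swv, malignant))            # (1.0, 0.8)
    ```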

  15. cFinder: definition and quantification of multiple haplotypes in a mixed sample.

    PubMed

    Niklas, Norbert; Hafenscher, Julia; Barna, Agnes; Wiesinger, Karin; Pröll, Johannes; Dreiseitl, Stephan; Preuner-Stix, Sandra; Valent, Peter; Lion, Thomas; Gabriel, Christian

    2015-09-07

    Next-generation sequencing allows for determining the genetic composition of a mixed sample. For instance, when performing resistance testing for BCR-ABL1 it is necessary to identify clones and define compound mutations; together with an exact quantification, this may complement diagnosis and therapy decisions with additional information. Moreover, that applies not only to oncological issues but also to the determination of viral, bacterial or fungal infections. Retrieving multiple haplotypes (more than two) and their proportions from such data with conventional software is difficult and cumbersome, and demands multiple manual steps. Therefore, we developed a tool called cFinder that is capable of automatic detection of haplotypes and their accurate quantification within one sample. BCR-ABL1 samples containing multiple clones were used for testing, and our cFinder could identify all previously found clones together with their abundance and even refine some results. Additionally, reads with multiple haplotypes were simulated using GemSIM; the detection was very close to linear (R² = 0.96). Our aim is not to deduce haploblocks via statistics, but to characterize the composition of a single sample precisely. As a result, cFinder reports the connections of variants (haplotypes) with their read count and relative occurrence (percentage). Download is available at http://sourceforge.net/projects/cfinder/. Our cFinder is implemented as an efficient algorithm that can be run on a low-performance desktop computer. Furthermore, it considers paired-end information (if available) and is generally open to any current next-generation sequencing technology and alignment strategy. To our knowledge, this is the first software that enables researchers without extensive bioinformatic support to determine multiple haplotypes and how they contribute to a sample's composition.

  16. Quantification of the biocontrol agent Trichoderma harzianum with real-time TaqMan PCR and its potential extrapolation to the hyphal biomass.

    PubMed

    López-Mondéjar, Rubén; Antón, Anabel; Raidl, Stefan; Ros, Margarita; Pascual, José Antonio

    2010-04-01

    The species of the genus Trichoderma are used successfully as biocontrol agents against a wide range of phytopathogenic fungi. Among them, Trichoderma harzianum is especially effective. However, to develop more effective fungal biocontrol strategies in organic substrates and soil, tools for monitoring the control agents are required. Real-time PCR is potentially an effective tool for the quantification of fungi in environmental samples. The aim of this study was to develop and apply a real-time PCR-based method for the quantification of T. harzianum, and to extrapolate these data to fungal biomass values. A set of primers and a TaqMan probe for the ITS region of the fungal genome were designed and tested, and amplification was correlated with biomass measurements obtained by optical microscopy and image analysis of the hyphal length of the colony mycelium. A correlation of 0.76 between ITS copies and biomass was obtained. The extrapolation of the quantity of ITS copies, calculated from real-time PCR data, into quantities of fungal biomass potentially provides a more accurate value of the quantity of soil fungi. Copyright 2009 Elsevier Ltd. All rights reserved.

  17. Quantification of complex modular architecture in plants.

    PubMed

    Reeb, Catherine; Kaandorp, Jaap; Jansson, Fredrik; Puillandre, Nicolas; Dubuisson, Jean-Yves; Cornette, Raphaël; Jabbour, Florian; Coudert, Yoan; Patiño, Jairo; Flot, Jean-François; Vanderpoorten, Alain

    2018-04-01

    Morphometrics, the assignment of quantities to biological shapes, is a powerful tool to address taxonomic, evolutionary, functional and developmental questions. We propose a novel method for shape quantification of complex modular architecture in thalloid plants, whose extremely reduced morphologies, combined with the lack of a formal framework for thallus description, have long rendered taxonomic and evolutionary studies extremely challenging. Using graph theory, thalli are described as hierarchical series of nodes and edges, allowing for accurate, homologous and repeatable measurements of widths, lengths and angles. The computer program MorphoSnake was developed to extract the skeleton and contours of a thallus and automatically acquire, at each level of organization, width, length, angle and sinuosity measurements. Through the quantification of leaf architecture in Hymenophyllum ferns (Polypodiopsida) and a fully worked example of integrative taxonomy in the taxonomically challenging thalloid liverwort genus Riccardia, we show that MorphoSnake is applicable to all ramified plants. This new possibility of acquiring large numbers of quantitative traits in plants with complex modular architectures opens new perspectives of applications, from the development of rapid species identification tools to evolutionary analyses of adaptive plasticity. © 2018 The Authors. New Phytologist © 2018 New Phytologist Trust.

  18. Immunohistochemistry as an Important Tool in Biomarkers Detection and Clinical Practice

    PubMed Central

    de Matos, Leandro Luongo; Trufelli, Damila Cristina; de Matos, Maria Graciela Luongo; da Silva Pinhal, Maria Aparecida

    2010-01-01

    The immunohistochemistry technique is used in the search for cell or tissue antigens that range from amino acids and proteins to infectious agents and specific cellular populations. The technique comprises two phases: (1) slide preparation and the stages involved in the reaction; (2) interpretation and quantification of the obtained expression. Immunohistochemistry is an important tool for scientific research and also a complementary technique for the elucidation of differential diagnoses which are not determinable by conventional analysis with hematoxylin and eosin. In the last couple of decades there has been an exponential increase in publications on immunohistochemistry and immunocytochemistry techniques. This review covers the immunohistochemistry technique: its history, applications, importance, limitations, difficulties, problems and some aspects related to the interpretation and quantification of results. Future developments in the immunohistochemistry technique and the quantification of its expression should not be disseminated in two separate languages, that of the pathologist and that of the clinician or surgeon. The scientific, diagnostic and prognostic applications of this methodology must be explored in a bid to benefit the patient. In order to achieve this goal, a collaboration and pooling of knowledge from both of these valuable medical areas is vital. PMID:20212918

  19. Mathematical and Computational Foundations of Recurrence Quantifications

    NASA Astrophysics Data System (ADS)

    Marwan, Norbert; Webber, Charles L.

    Real-world systems possess deterministic trajectories, phase singularities and noise. Dynamic trajectories have been studied in the temporal and frequency domains, but these are linear approaches. Basic to the field of nonlinear dynamics is the representation of trajectories in phase space. A variety of nonlinear tools such as the Lyapunov exponent, Kolmogorov-Sinai entropy, correlation dimension, etc. have successfully characterized trajectories in phase space, provided the systems studied were stationary in time. Ubiquitous in nature, however, are systems that are nonlinear and nonstationary and exist in noisy environments, all of which break the assumptions of otherwise powerful linear tools. What has been unfolding over the last quarter of a century, however, is the timely discovery and practical demonstration that the recurrences of system trajectories in phase space can provide important clues to the system designs from which they derive. In this chapter we introduce the basics of recurrence plots (RP) and their quantification analysis (RQA). We begin by summarizing the concept of phase space reconstruction. Then we provide the mathematical underpinnings of recurrence plots, followed by the details of recurrence quantifications. Finally, we discuss computational approaches that have been implemented to make recurrence strategies feasible and useful. As computers become faster and computer languages advance, younger generations of researchers will be stimulated and encouraged to capture nonlinear recurrence patterns and quantifications in even better formats. This particular branch of nonlinear dynamics remains wide open for the definition of new recurrence variables and new applications untouched to date.
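
    A minimal sketch of the basic RP/RQA computation described in this chapter: time-delay embedding, a thresholded distance matrix, and the recurrence rate as the simplest quantifier (parameters are illustrative):

    ```python
    import numpy as np

    def recurrence_plot(x, dim=3, delay=1, eps=0.2):
        """Binary recurrence matrix of a scalar series after embedding."""
        n = len(x) - (dim - 1) * delay
        emb = np.column_stack([x[i * delay: i * delay + n] for i in range(dim)])
        dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
        return (dist <= eps).astype(int)

    x = np.sin(np.linspace(0, 8 * np.pi, 400))
    rp = recurrence_plot(x)
    recurrence_rate = rp.mean()    # density of recurrence points
    print(f"RR = {recurrence_rate:.3f}")
    ```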

  20. Use of a medication quantification scale for comparison of pain medication usage in patients with complex regional pain syndrome (CRPS).

    PubMed

    Gallizzi, Michael A; Khazai, Ravand S; Gagnon, Christine M; Bruehl, Stephen; Harden, R Norman

    2015-03-01

    To correlate the amount and types of pain medications prescribed to CRPS patients, as measured by the Medication Quantification Scale, with patients' subjective pain levels. An international, multisite, retrospective review. University medical centers in the United States, Israel, Germany, and the Netherlands. A total of 89 subjects were enrolled from four different countries: 27 from the United States, 20 from Germany, 18 from the Netherlands, and 24 from Israel. The main outcome measures used were the Medication Quantification Scale III and the numerical analog pain scale. There was no statistically significant correlation noted between the Medication Quantification Scale and the visual analog scale for any site except for a moderate positive correlation at the German sites. The Medication Quantification Scale mean differences between the United States and Germany, the Netherlands, and Israel were 9.793 (P < 0.002), 10.389 (P < 0.001), and 4.984 (P = 0.303), respectively. There appears to be only a weak correlation between the amount of pain medication prescribed and patients' reported subjective pain intensity within this limited patient population. The Medication Quantification Scale is a viable tool for the analysis of pharmaceutical treatment of CRPS patients and would be useful in further prospective studies of pain medication prescription practices in the CRPS population worldwide. Wiley Periodicals, Inc.

  1. Influence of Co-57 and CT Transmission Measurements on the Quantification Accuracy and Partial Volume Effect of a Small Animal PET Scanner.

    PubMed

    Mannheim, Julia G; Schmid, Andreas M; Pichler, Bernd J

    2017-12-01

    Non-invasive in vivo positron emission tomography (PET) provides high detection sensitivity in the nano- to picomolar range and, in addition to other advantages, the possibility to absolutely quantify the acquired data. The present study focuses on the comparison of transmission data acquired with an X-ray computed tomography (CT) scanner or a Co-57 source for the Inveon small animal PET scanner (Siemens Healthcare, Knoxville, TN, USA), and determines their influence on quantification accuracy and the partial volume effect (PVE). A special focus was the impact of the applied calibration on quantification accuracy. Phantom measurements were carried out to determine the quantification accuracy, the influence of the object size on the quantification, and the PVE for different sphere sizes, along the field of view and for different contrast ratios. An influence of the emission activity on the Co-57 transmission measurements was discovered (deviations of measured from true activity of up to 24.06%), whereas no influence of the emission activity on the CT attenuation correction was identified (deviations of measured from true activity <3%). The quantification accuracy was substantially influenced by the applied calibration factor and by the object size. The PVE depended on the sphere size, the position within the field of view, the reconstruction and correction algorithms, and the count statistics. Depending on the reconstruction algorithm, only ∼30-40% of the true activity within a small sphere could be resolved. The iterative 3D reconstruction algorithms yielded substantially higher recovery values than the analytical and 2D iterative reconstruction algorithms (up to 70.46% and 80.82% recovery for the smallest and largest spheres, respectively). The transmission measurement (CT or Co-57 source) used to correct for attenuation did not severely influence the PVE. The analysis of the quantification accuracy and the PVE revealed an influence of the object size, the reconstruction algorithm and the applied corrections. In particular, the influence of the emission activity during transmission measurements performed with a Co-57 source must be considered. To obtain comparable results, including across different scanner configurations, standardization of the acquisition (imaging parameters as well as applied reconstruction and correction protocols) is necessary.

  2. Application of Stochastic Labeling with Random-Sequence Barcodes for Simultaneous Quantification and Sequencing of Environmental 16S rRNA Genes.

    PubMed

    Hoshino, Tatsuhiko; Inagaki, Fumio

    2017-01-01

    Next-generation sequencing (NGS) is a powerful tool for analyzing environmental DNA and provides a comprehensive molecular view of microbial communities. To obtain the copy number of particular sequences in an NGS library, however, additional quantitative analysis such as quantitative PCR (qPCR) or digital PCR (dPCR) is required. Furthermore, the number of sequences in a sequence library does not always reflect the original copy number of a target gene because of biases introduced by PCR amplification, making it difficult to convert the proportion of particular sequences in the NGS library into copy numbers using the mass of input DNA. To address this issue, we applied a stochastic labeling approach with random-tag sequences and developed an NGS-based quantification protocol, which enables simultaneous sequencing and quantification of the targeted DNA. This quantitative sequencing (qSeq) is initiated from single-primer extension (SPE) using a primer with a random tag adjacent to the 5' end of the target-specific sequence. During SPE, each DNA molecule is stochastically labeled with the random tag. Subsequently, first-round PCR is conducted, specifically targeting the SPE product, followed by second-round PCR for NGS indexing. The number of random tags is determined only during the SPE step and is therefore not affected by the two rounds of PCR, which may introduce amplification biases. In the case of 16S rRNA genes, after NGS sequencing and taxonomic classification, the absolute 16S rRNA gene copy number of target phylotypes can be estimated by Poisson statistics from the random tags counted at the end of each sequence. To test the feasibility of this approach, the 16S rRNA gene of Sulfolobus tokodaii was subjected to qSeq, which resulted in accurate quantification of 5.0 × 10³ to 5.0 × 10⁴ copies of the 16S rRNA gene. Furthermore, qSeq was applied to mock microbial communities and environmental samples, and the results were comparable to those obtained using digital PCR and to relative abundances based on a standard sequence library. We demonstrated that the qSeq protocol proposed here is advantageous in providing less-biased absolute copy numbers of each target DNA in a single NGS run. With this new experimental scheme, microbial community compositions can be explored in a more quantitative manner, expanding our knowledge of microbial ecosystems in natural environments.
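
    Counting the distinct random tags observed leads to a standard Poisson estimator of the number of input molecules; a minimal sketch assuming equally likely tags (the paper's statistics may differ in detail):

    ```python
    import math

    def molecules_from_tags(unique_tags_observed, tag_space_size):
        """Invert E[distinct tags] = T * (1 - exp(-n / T)) for n molecules."""
        k, T = unique_tags_observed, tag_space_size
        return -T * math.log(1.0 - k / T)

    # e.g. an 8-base random tag (T = 4**8) with 12,000 distinct tags observed
    print(f"{molecules_from_tags(12000, 4 ** 8):.0f} molecules")
    ```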

  3. Automatic 3D segmentation of multiphoton images: a key step for the quantification of human skin.

    PubMed

    Decencière, Etienne; Tancrède-Bohin, Emmanuelle; Dokládal, Petr; Koudoro, Serge; Pena, Ana-Maria; Baldeweck, Thérèse

    2013-05-01

    Multiphoton microscopy has emerged in the past decade as a useful noninvasive imaging technique for in vivo human skin characterization. However, it has not been used until now in evaluation clinical trials, mainly because of the lack of specific image processing tools that would allow the investigator to extract pertinent quantitative three-dimensional (3D) information from the different skin components. We propose a 3D automatic segmentation method of multiphoton images which is a key step for epidermis and dermis quantification. This method, based on the morphological watershed and graph cuts algorithms, takes into account the real shape of the skin surface and of the dermal-epidermal junction, and allows separating in 3D the epidermis and the superficial dermis. The automatic segmentation method and the associated quantitative measurements have been developed and validated on a clinical database designed for aging characterization. The segmentation achieves its goals for epidermis-dermis separation and allows quantitative measurements inside the different skin compartments with sufficient relevance. This study shows that multiphoton microscopy associated with specific image processing tools provides access to new quantitative measurements on the various skin components. The proposed 3D automatic segmentation method will contribute to build a powerful tool for characterizing human skin condition. To our knowledge, this is the first 3D approach to the segmentation and quantification of these original images. © 2013 John Wiley & Sons A/S. Published by Blackwell Publishing Ltd.
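
    A minimal 2D sketch of marker-based morphological watershed with scikit-image (the paper works in 3D and combines watershed with graph cuts; all parameters are illustrative):

    ```python
    import numpy as np
    from scipy import ndimage as ndi
    from skimage.feature import peak_local_max
    from skimage.filters import threshold_otsu
    from skimage.segmentation import watershed

    def watershed_segment(image):
        """Label bright regions of a 2D grayscale image."""
        binary = image > threshold_otsu(image)
        distance = ndi.distance_transform_edt(binary)
        peaks = peak_local_max(distance, labels=binary.astype(int), min_distance=5)
        markers = np.zeros(image.shape, dtype=int)
        markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
        return watershed(-distance, markers, mask=binary)
    ```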

  4. Computed tomography-based volumetric tool for standardized measurement of the maxillary sinus

    PubMed Central

    Giacomini, Guilherme; Pavan, Ana Luiza Menegatti; Altemani, João Mauricio Carrasco; Duarte, Sergio Barbosa; Fortaleza, Carlos Magno Castelo Branco; Miranda, José Ricardo de Arruda

    2018-01-01

    Volume measurements of maxillary sinus may be useful to identify diseases affecting paranasal sinuses. However, literature shows a lack of consensus in studies measuring the volume. This may be attributable to different computed tomography data acquisition techniques, segmentation methods, focuses of investigation, among other reasons. Furthermore, methods for volumetrically quantifying the maxillary sinus are commonly manual or semiautomated, which require substantial user expertise and are time-consuming. The purpose of the present study was to develop an automated tool for quantifying the total and air-free volume of the maxillary sinus based on computed tomography images. The quantification tool seeks to standardize maxillary sinus volume measurements, thus allowing better comparisons and determinations of factors that influence maxillary sinus size. The automated tool utilized image processing techniques (watershed, threshold, and morphological operators). The maxillary sinus volume was quantified in 30 patients. To evaluate the accuracy of the automated tool, the results were compared with manual segmentation that was performed by an experienced radiologist using a standard procedure. The mean percent differences between the automated and manual methods were 7.19% ± 5.83% and 6.93% ± 4.29% for total and air-free maxillary sinus volume, respectively. Linear regression and Bland-Altman statistics showed good agreement and low dispersion between both methods. The present automated tool for maxillary sinus volume assessment was rapid, reliable, robust, accurate, and reproducible and may be applied in clinical practice. The tool may be used to standardize measurements of maxillary volume. Such standardization is extremely important for allowing comparisons between studies, providing a better understanding of the role of the maxillary sinus, and determining the factors that influence maxillary sinus size under normal and pathological conditions. PMID:29304130
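
    Once a binary segmentation exists, volume quantification is a voxel count times voxel size; a minimal sketch, with an assumed voxel spacing and an assumed air threshold noted in the comments:

    ```python
    import numpy as np

    def region_volume_ml(mask, spacing_mm=(0.5, 0.5, 1.0)):
        """Volume of a 3D binary segmentation in millilitres.

        mask: boolean array of segmented voxels (e.g. the sinus region);
        spacing_mm: CT voxel size, an assumed value here.
        """
        voxel_mm3 = float(np.prod(spacing_mm))
        return mask.sum() * voxel_mm3 / 1000.0   # 1 mL = 1000 mm^3

    # Air-free volume = total sinus volume minus air voxels inside it;
    # thresholding air at about -500 HU is a common (assumed) choice.
    ```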

  5. Characterizing stroke lesions using digital templates and lesion quantification tools in a web-based imaging informatics system for a large-scale stroke rehabilitation clinical trial

    NASA Astrophysics Data System (ADS)

    Wang, Ximing; Edwardson, Matthew; Dromerick, Alexander; Winstein, Carolee; Wang, Jing; Liu, Brent

    2015-03-01

    Previously, we presented an Interdisciplinary Comprehensive Arm Rehabilitation Evaluation (ICARE) imaging informatics system that supports a large-scale phase III stroke rehabilitation trial. The ePR system is capable of displaying anonymized patient imaging studies and reports, and the system is accessible to multiple clinical trial sites and users across the United States via the web. However, prior multicenter stroke rehabilitation trials lacked any significant neuroimaging analysis infrastructure. In stroke-related clinical trials, identification of stroke lesion characteristics can be meaningful, as recent research shows that lesion characteristics are related to stroke scale and functional recovery after stroke. To facilitate stroke clinical trials, we hope to gain insight into specific lesion characteristics, such as vascular territory, for patients enrolled in large stroke rehabilitation trials. To enhance the system's capability for data analysis and data reporting, we have integrated new features into the system: a digital brain template display, a lesion quantification tool and a digital case report form. The digital brain templates are compiled from published vascular territory templates at each of 5 angles of incidence. These templates were updated to include territories in the brainstem using a vascular territory atlas and the Medical Image Processing, Analysis and Visualization (MIPAV) tool. The digital templates are displayed for side-by-side comparisons and transparent template overlay onto patients' images in the image viewer. The lesion quantification tool quantifies planimetric lesion area from a user-defined contour. The digital case report form stores user input in a database and then displays the contents in the interface to allow reviewing, editing, and new entries. In sum, the newly integrated system features provide the user with readily accessible web-based tools to identify the vascular territory involved, estimate lesion area, and store these results in a web-based digital format.
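
    Planimetric area from a user-defined contour is typically computed with the shoelace formula over the contour vertices; a minimal sketch (the system's actual implementation is not described at this level of detail):

    ```python
    import numpy as np

    def planimetric_area(contour):
        """Shoelace area of a closed contour given as (N, 2) (x, y) vertices.
        Multiply by the image's mm-per-pixel scale squared to get mm^2."""
        x, y = np.asarray(contour, float).T
        return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

    square = [(0, 0), (10, 0), (10, 10), (0, 10)]
    print(planimetric_area(square))   # 100.0
    ```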

  6. Clarifying uncertainty in biogeochemical response to land management

    NASA Astrophysics Data System (ADS)

    Tonitto, C.; Gurwick, N. P.; Woodbury, P. B.

    2013-12-01

    We examined the ability of contemporary simulation and empirical modeling tools to describe net greenhouse gas (GHG) emissions resulting from agricultural and forest ecosystem land management, and we looked at how key policy institutions use these tools. We focused on quantification of nitrous oxide (N2O) emissions from agricultural systems, as agriculture is the dominant source of anthropogenic N2O emissions. Quantifying the impact of agricultural management on N2O emissions is especially challenging because the controls on N2O emissions (soil aerobic status, inorganic N availability, and C substrate availability) vary as a function of site soil type, climate, and cropping system; available measurements do not cover all relevant combinations of these controlling system features. Furthermore, N2O emissions are highly non-linear, and threshold values of controlling soil environmental conditions are not defined across most agricultural site properties. We also examined the multi-faceted challenges regarding the quantification of increased soil organic carbon (SOC) storage as a result of land management in both agricultural and forest systems. Quantifying changes in SOC resulting from land management is difficult because mechanisms of SOC stabilization are not fully understood, SOC measurements have been concentrated in the upper 30 cm of soil, erosion is often ignored when estimating SOC, and few long-term studies exist to track system response to diverse management practices. Furthermore, the permanence of SOC-accumulating management practices is not easily established. For instance, under the Regional Greenhouse Gas Initiative (RGGI), forest land managed for SOC accumulation must remain under permanent conservation easement to ensure that SOC accumulation is not reversed due to changes in land cover. For agricultural protocols, given that many farmers rent land and that agriculture is driven by an annual management time scale, the difficulty of ensuring that SOC-accumulating land management would be maintained indefinitely has delayed the implementation of SOC-accumulating practices for compliance with the California Global Warming Solutions Act (AB 32). GHG accounting tools are increasingly applied to implement GHG reduction policies. In this policy context, data limitations have impacted the implementation of GHG accounting strategies. For example, protocol design in support of AB 32 initially sought to apply simulation models to determine N2O emissions across all major U.S. agricultural landscapes. After discussions with ecosystem scientists, the lack of observations and model validation in most U.S. arable landscapes led to a protocol definition based on simple empirical models and limited to corn management in 12 states. The distribution of protocol participants is also a potential source of inaccuracy in GHG accounting. Land management protocols are often structured on the assumption that, in the aggregate, policy achieves an average improvement by promoting specific management practices. However, it is unclear whether current policy incentives promote participation from a truly random distribution of landscapes. Participation in policy development to support improved land management challenges ecosystem scientists to make recommendations based on the best available information while acknowledging that uncertainty limits accurate quantification of impacts via analysis using either observations or simulation modeling.

  7. Neutron-Encoded Protein Quantification by Peptide Carbamylation

    NASA Astrophysics Data System (ADS)

    Ulbrich, Arne; Merrill, Anna E.; Hebert, Alexander S.; Westphall, Michael S.; Keller, Mark P.; Attie, Alan D.; Coon, Joshua J.

    2014-01-01

    We describe a chemical tag for duplex proteome quantification using neutron encoding (NeuCode). The method utilizes the straightforward, efficient, and inexpensive carbamylation reaction. We demonstrate the utility of NeuCode carbamylation by accurately measuring quantitative ratios from tagged yeast lysates mixed in known ratios and by applying this method to quantify differential protein expression in mice fed either a control or a high-fat diet.

  8. Quantitative detection of Moraxella catarrhalis in nasopharyngeal secretions by real-time PCR.

    PubMed

    Greiner, Oliver; Day, Philip J R; Altwegg, Martin; Nadal, David

    2003-04-01

    The recognition of Moraxella catarrhalis as an important cause of respiratory tract infections has been protracted, mainly because it is a frequent commensal organism of the upper respiratory tract and the diagnostic sensitivity of blood or pleural fluid culture is low. Given that the amount of M. catarrhalis bacteria in the upper respiratory tract may change during infection, quantification of these bacteria in nasopharyngeal secretions (NPSs) by real-time PCR may offer a suitable diagnostic approach. Using primers and a fluorescent probe specific for the copB outer membrane protein gene, we detected DNA from serial dilutions of M. catarrhalis cells corresponding to 1 to 10(6) cells. Importantly, there was no difference in the amplification efficiency when the same DNA was mixed with DNA from NPSs devoid of M. catarrhalis. The specificity of the reaction was further confirmed by the lack of amplification of DNAs from other Moraxella species, nontypeable Haemophilus influenzae, H. influenzae type b, Streptococcus pneumoniae, Streptococcus oralis, Streptococcus pyogenes, Bordetella pertussis, Corynebacterium diphtheriae, and various Neisseria species. The assay applied to NPSs from 184 patients with respiratory tract infections performed with a sensitivity of 100% and a specificity of up to 98% compared to the culture results. The numbers of M. catarrhalis organisms detected by real-time PCR correlated with the numbers detected by semiquantitative culture. This real-time PCR assay targeting the copB outer membrane protein gene provided a sensitive and reliable means for the rapid detection and quantification of M. catarrhalis in NPSs; may serve as a tool to study changes in the amounts of M. catarrhalis during lower respiratory tract infections or following vaccination against S. pneumoniae, H. influenzae, or N. meningitidis; and may be applied to other clinical samples.

  9. Quantitative Detection of Moraxella catarrhalis in Nasopharyngeal Secretions by Real-Time PCR

    PubMed Central

    Greiner, Oliver; Day, Philip J. R.; Altwegg, Martin; Nadal, David

    2003-01-01

    The recognition of Moraxella catarrhalis as an important cause of respiratory tract infections has been protracted, mainly because it is a frequent commensal organism of the upper respiratory tract and the diagnostic sensitivity of blood or pleural fluid culture is low. Given that the amount of M. catarrhalis bacteria in the upper respiratory tract may change during infection, quantification of these bacteria in nasopharyngeal secretions (NPSs) by real-time PCR may offer a suitable diagnostic approach. Using primers and a fluorescent probe specific for the copB outer membrane protein gene, we detected DNA from serial dilutions of M. catarrhalis cells corresponding to 1 to 10⁶ cells. Importantly, there was no difference in the amplification efficiency when the same DNA was mixed with DNA from NPSs devoid of M. catarrhalis. The specificity of the reaction was further confirmed by the lack of amplification of DNAs from other Moraxella species, nontypeable Haemophilus influenzae, H. influenzae type b, Streptococcus pneumoniae, Streptococcus oralis, Streptococcus pyogenes, Bordetella pertussis, Corynebacterium diphtheriae, and various Neisseria species. The assay applied to NPSs from 184 patients with respiratory tract infections performed with a sensitivity of 100% and a specificity of up to 98% compared to the culture results. The numbers of M. catarrhalis organisms detected by real-time PCR correlated with the numbers detected by semiquantitative culture. This real-time PCR assay targeting the copB outer membrane protein gene provided a sensitive and reliable means for the rapid detection and quantification of M. catarrhalis in NPSs; may serve as a tool to study changes in the amounts of M. catarrhalis during lower respiratory tract infections or following vaccination against S. pneumoniae, H. influenzae, or N. meningitidis; and may be applied to other clinical samples. PMID:12682118

  10. Novel quantitative real-time LCR for the sensitive detection of SNP frequencies in pooled DNA: method development, evaluation and application.

    PubMed

    Psifidi, Androniki; Dovas, Chrysostomos; Banos, Georgios

    2011-01-19

    Single nucleotide polymorphisms (SNP) have proven to be powerful genetic markers for genetic applications in medicine, life science and agriculture. A variety of methods exist for SNP detection but few can quantify SNP frequencies when the mutated DNA molecules correspond to a small fraction of the wild-type DNA. Furthermore, there is no generally accepted gold standard for SNP quantification, and, in general, currently applied methods give inconsistent results in selected cohorts. In the present study we sought to develop a novel method for accurate detection and quantification of SNP in DNA pooled samples. The development and evaluation of a novel Ligase Chain Reaction (LCR) protocol that uses a DNA-specific fluorescent dye to allow quantitative real-time analysis is described. Different reaction components and thermocycling parameters affecting the efficiency and specificity of LCR were examined. Several protocols, including gap-LCR modifications, were evaluated using plasmid standard and genomic DNA pools. A protocol of choice was identified and applied for the quantification of a polymorphism at codon 136 of the ovine PRNP gene that is associated with susceptibility to a transmissible spongiform encephalopathy in sheep. The real-time LCR protocol developed in the present study showed high sensitivity, accuracy, reproducibility and a wide dynamic range of SNP quantification in different DNA pools. The limits of detection and quantification of SNP frequencies were 0.085% and 0.35%, respectively. The proposed real-time LCR protocol is applicable when sensitive detection and accurate quantification of low copy number mutations in DNA pools is needed. Examples include oncogenes and tumour suppressor genes, infectious diseases, pathogenic bacteria, fungal species, viral mutants, drug resistance resulting from point mutations, and genetically modified organisms in food.

  11. Novel Quantitative Real-Time LCR for the Sensitive Detection of SNP Frequencies in Pooled DNA: Method Development, Evaluation and Application

    PubMed Central

    Psifidi, Androniki; Dovas, Chrysostomos; Banos, Georgios

    2011-01-01

    Background: Single nucleotide polymorphisms (SNP) have proven to be powerful genetic markers for genetic applications in medicine, life science and agriculture. A variety of methods exist for SNP detection but few can quantify SNP frequencies when the mutated DNA molecules correspond to a small fraction of the wild-type DNA. Furthermore, there is no generally accepted gold standard for SNP quantification, and, in general, currently applied methods give inconsistent results in selected cohorts. In the present study we sought to develop a novel method for accurate detection and quantification of SNP in DNA pooled samples. Methods: The development and evaluation of a novel Ligase Chain Reaction (LCR) protocol that uses a DNA-specific fluorescent dye to allow quantitative real-time analysis is described. Different reaction components and thermocycling parameters affecting the efficiency and specificity of LCR were examined. Several protocols, including gap-LCR modifications, were evaluated using plasmid standard and genomic DNA pools. A protocol of choice was identified and applied for the quantification of a polymorphism at codon 136 of the ovine PRNP gene that is associated with susceptibility to a transmissible spongiform encephalopathy in sheep. Conclusions: The real-time LCR protocol developed in the present study showed high sensitivity, accuracy, reproducibility and a wide dynamic range of SNP quantification in different DNA pools. The limits of detection and quantification of SNP frequencies were 0.085% and 0.35%, respectively. Significance: The proposed real-time LCR protocol is applicable when sensitive detection and accurate quantification of low copy number mutations in DNA pools is needed. Examples include oncogenes and tumour suppressor genes, infectious diseases, pathogenic bacteria, fungal species, viral mutants, drug resistance resulting from point mutations, and genetically modified organisms in food. PMID:21283808

  12. Quantification of Kryptofix 2.2.2 in [18F]fluorine-labelled radiopharmaceuticals by rapid-resolution liquid chromatography.

    PubMed

    Lao, Yexing; Yang, Cuiping; Zou, Wei; Gan, Manquan; Chen, Ping; Su, Weiwei

    2012-05-01

    The cryptand Kryptofix 2.2.2 is used extensively as a phase-transfer reagent in the preparation of [18F]fluoride-labelled radiopharmaceuticals. However, it has considerable acute toxicity. The aim of this study was to develop and validate a method for rapid (within 1 min), specific and sensitive quantification of Kryptofix 2.2.2 at trace levels. Chromatographic separations were carried out by rapid-resolution liquid chromatography (Agilent ZORBAX SB-C18 rapid-resolution column, 2.1 × 30 mm, 3.5 μm). Tandem mass spectra were acquired using a triple quadrupole mass spectrometer equipped with an electrospray ionization interface. Quantitative mass spectrometric analysis was conducted in positive ion mode and multiple reaction monitoring mode for the m/z 377.3 → 114.1 transition for Kryptofix 2.2.2. The external standard method was used for quantification. The method met the precision and efficiency requirements for PET radiopharmaceuticals, providing satisfactory results for specificity, matrix effect, stability, linearity (0.5-100 ng/ml, r(2)=0.9975), precision (coefficient of variation < 5%), accuracy (relative error < ± 3%), sensitivity (lower limit of quantification=0.5 ng) and detection time (<1 min). Fluorodeoxyglucose (n=6) was analysed, and the Kryptofix 2.2.2 content was found to be well below the maximum permissible levels approved by the US Food and Drug Administration. The developed method has a short analysis time (<1 min) and high sensitivity (lower limit of quantification=0.5 ng/ml) and can be successfully applied to rapid quantification of Kryptofix 2.2.2 at trace levels in fluorodeoxyglucose. This method could also be applied to other [18F]fluorine-labelled radiopharmaceuticals that use Kryptofix 2.2.2 as a phase-transfer reagent.

  13. Quantification of polyhydroxyalkanoates in mixed and pure cultures biomass by Fourier transform infrared spectroscopy: comparison of different approaches.

    PubMed

    Isak, I; Patel, M; Riddell, M; West, M; Bowers, T; Wijeyekoon, S; Lloyd, J

    2016-08-01

    Fourier transform infrared (FTIR) spectroscopy was used in this study for the rapid quantification of polyhydroxyalkanoates (PHA) in mixed and pure culture bacterial biomass. Three different statistical analysis methods (regression, partial least squares (PLS) and nonlinear) were applied to the FTIR data and the results were plotted against the PHA values measured with the reference gas chromatography technique. All methods predicted PHA content in mixed culture biomass with comparable efficiency, indicated by similar residual values. The PHA in these cultures ranged from low to medium concentration (0-44 wt% of dried biomass content). However, for the analysis of the combined mixed and pure culture biomass, with PHA concentrations ranging from low to high (0-93% of dried biomass content), the PLS method was the most efficient. This paper reports, for the first time, the use of a single calibration model constructed with a combination of mixed and pure cultures covering a wide PHA range for predicting PHA content in biomass. Currently, no single universal method exists for processing FTIR data for polyhydroxyalkanoate (PHA) quantification. This study compares three different methods of analysing FTIR data for quantification of PHAs in biomass. A new data-processing approach was proposed and the results were compared against existing literature methods. Most publications report PHA quantification over a medium concentration range in pure cultures. In our study, however, we encompassed both mixed and pure culture biomass containing a broader range of PHA in the calibration curve. The resulting prediction model is useful for rapid quantification of a wider range of PHA content in biomass. © 2016 The Society for Applied Microbiology.
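
    A minimal sketch of the PLS route with scikit-learn, using random placeholder arrays in place of real FTIR spectra and GC reference values; the number of components would be chosen by cross-validation in practice:

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(1)
    X = rng.normal(size=(60, 400))     # spectra: samples x wavenumbers (placeholder)
    y = rng.uniform(0, 93, size=60)    # PHA wt% from gas chromatography (placeholder)

    pls = PLSRegression(n_components=8)
    y_cv = cross_val_predict(pls, X, y, cv=10).ravel()
    rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))
    print(f"RMSECV = {rmsecv:.1f} wt%")
    ```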

  14. Advances and Challenges In Uncertainty Quantification with Application to Climate Prediction, ICF design and Science Stockpile Stewardship

    NASA Astrophysics Data System (ADS)

    Klein, R.; Woodward, C. S.; Johannesson, G.; Domyancic, D.; Covey, C. C.; Lucas, D. D.

    2012-12-01

    Uncertainty Quantification (UQ) is a critical field within 21st century simulation science that resides at the very center of the web of emerging predictive capabilities. The science of UQ holds the promise of giving much greater meaning to the results of complex large-scale simulations, allowing for quantifying and bounding uncertainties. This powerful capability will yield new insights into scientific predictions (e.g. Climate) of great impact in both national and international arenas, allow informed decisions on the design of critical experiments (e.g. ICF capsule design, MFE, NE) in many scientific fields, and assign confidence bounds to scientifically predictable outcomes (e.g. nuclear weapons design). In this talk I will discuss a major new strategic initiative (SI) we have developed at Lawrence Livermore National Laboratory to advance the science of Uncertainty Quantification at LLNL, focusing in particular on (a) the research and development of new algorithms and methodologies of UQ as applied to multi-physics multi-scale codes, (b) the incorporation of these advancements into a global UQ Pipeline (i.e. a computational superstructure) that will simplify user access to sophisticated tools for UQ studies as well as act as a self-guided, self-adapting UQ engine for UQ studies on extreme computing platforms, and (c) the use of laboratory applications as a test bed for new algorithms and methodologies. The initial SI focus has been on applications for the quantification of uncertainty associated with Climate prediction, but the validated UQ methodologies we have developed are now being fed back into Science Based Stockpile Stewardship (SSS) and ICF UQ efforts. To make advancements in several of these UQ grand challenges, I will focus in this talk on the following three research areas of our Strategic Initiative: error estimation in multi-physics and multi-scale codes; tackling the "curse of high dimensionality"; and development of an advanced UQ Computational Pipeline to enable complete UQ workflow and analysis for ensemble runs at the extreme scale (e.g. exascale) with self-guiding adaptation in the UQ Pipeline engine. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and was funded by the Uncertainty Quantification Strategic Initiative Laboratory Directed Research and Development Project at LLNL under project tracking code 10-SI-013 (UCRL LLNL-ABS-569112).

  15. Optical quantification of forces at play during stem cell differentiation

    NASA Astrophysics Data System (ADS)

    Ritter, Christine M.; Brickman, Joshua M.; Oddershede, Lene B.

    2016-03-01

    A cell is in constant interaction with its environment: it responds to external mechanical, chemical and biological signals. The response to these signals can take various forms, for instance intra-cellular mechanical re-arrangements, cell-cell interactions, or cellular reinforcements. Optical methods are attractive for investigating mechanics inside living cells; optical traps, for example, are among the only nanotools that can reach inside a living cell to manipulate structures and measure forces. In recent years it has become increasingly evident that not only biochemical and biomolecular cues but also mechanical ones play an important role in stem cell differentiation. The first evidence for the importance of mechanical cues emerged from studies showing that substrate stiffness had an impact on stem cell differentiation. Recently, techniques such as optical tweezers and stretchers have been applied to stem cells, producing new insights into the role of mechanics in regulating renewal and differentiation. Here, we describe how optical tweezers and optical stretchers can be applied as tools to investigate stem cell mechanics, and we present some of the recent results to come out of this work.

  16. A Pragmatic Smoothing Method for Improving the Quality of the Results in Atomic Spectroscopy

    NASA Astrophysics Data System (ADS)

    Bennun, Leonardo

    2017-07-01

    A new smoothing method for improving the identification and quantification of spectral functions, based on prior knowledge of the signals that are expected to be quantified, is presented. These expected signals are used as weighting coefficients in the smoothing algorithm. The method was conceived for atomic and nuclear spectroscopies, particularly techniques where net counts are proportional to acquisition time, such as particle-induced X-ray emission (PIXE) and other X-ray fluorescence spectroscopic methods. When properly applied, the algorithm distorts neither the shape nor the intensity of the signal, so it is well suited for all kinds of spectroscopic techniques. The method is extremely effective at reducing high-frequency noise in the signal, much more so than a single rectangular smooth of the same width. As with all smoothing techniques, the proposed method improves the precision of the results, but in this case we also found a systematic improvement in their accuracy. The improvement in the quality of the results when the method is applied to real experimental data remains to be evaluated; we expect better characterization of the net peak areas and smaller detection and quantification limits. We have applied the method to signals that obey Poisson statistics, but with the same ideas and criteria it could be applied to time series. In the general case, when the algorithm is applied to experimental results, the sought characteristic functions required for the weighted smoothing should be obtained from a system with strong stability. If the sought signals are not perfectly clean, the method should be applied with care.
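
    The sketch below illustrates one plausible reading of the weighting idea: smoothing a Poisson-noise spectrum with a kernel shaped like the expected peak, compared against a single rectangular smooth of the same width. The Gaussian kernel and all parameters are assumptions standing in for the paper's characteristic functions.

```python
# Signal-shape-weighted smoothing versus a plain rectangular
# (moving-average) smooth, on PIXE-like Poisson counting data.
import numpy as np

rng = np.random.default_rng(1)
x = np.arange(512)

# Noisy Poisson spectrum with one peak on a flat background.
true = 50 * np.exp(-((x - 256) ** 2) / (2 * 8 ** 2)) + 5
counts = rng.poisson(true).astype(float)

def smooth(signal, kernel):
    kernel = np.asarray(kernel, float)
    kernel = kernel / kernel.sum()        # unit area: preserves net counts
    return np.convolve(signal, kernel, mode="same")

width = 9
rect = np.ones(width)                     # single rectangular smooth
k = np.arange(width) - width // 2
weighted = np.exp(-(k ** 2) / (2 * 2.0 ** 2))  # expected-peak-shaped weights

for name, kern in [("rectangular", rect), ("weighted", weighted)]:
    res = smooth(counts, kern) - true
    print(name, "RMS residual:", np.sqrt(np.mean(res ** 2)))
```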

  17. A Constrained Genetic Algorithm with Adaptively Defined Fitness Function in MRS Quantification

    NASA Astrophysics Data System (ADS)

    Papakostas, G. A.; Karras, D. A.; Mertzios, B. G.; Graveron-Demilly, D.; van Ormondt, D.

    MRS signal quantification is a rather involved procedure and has attracted the interest of the medical engineering community regarding the development of computationally efficient methodologies. Significant contributions based on computational intelligence tools, such as neural networks (NNs), have demonstrated good performance, but not without drawbacks already discussed by the authors. On the other hand, a preliminary application of genetic algorithms (GAs) has already been reported in the literature by the authors regarding the peak detection problem encountered in MRS quantification using the Voigt line shape model. This paper investigates a novel constrained genetic algorithm involving a generic, adaptively defined fitness function, which extends the simple genetic algorithm methodology to the case of noisy signals. The applicability of this new algorithm is scrutinized through experiments on artificial MRS signals corrupted with noise, with regard to its signal fitting capabilities. Although extensive experiments with real-world MRS signals are still necessary, the performance shown here illustrates the method's potential to be established as a generic MRS metabolite quantification procedure.

  18. UQTk Version 3.0.3 User Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sargsyan, Khachik; Safta, Cosmin; Chowdhary, Kamaljit Singh

    2017-05-01

    The UQ Toolkit (UQTk) is a collection of libraries and tools for the quantification of uncertainty in numerical model predictions. Version 3.0.3 offers intrusive and non-intrusive methods for propagating input uncertainties through computational models, tools for sensitivity analysis, methods for sparse surrogate construction, and Bayesian inference tools for inferring parameters from experimental data. This manual discusses the download and installation process for UQTk, provides pointers to the UQ methods used in the toolkit, and describes some of the examples provided with the toolkit.

  19. Simple tool for the rapid, automated quantification of glacier advance/retreat observations using multiple methods

    NASA Astrophysics Data System (ADS)

    Lea, J.

    2017-12-01

    The quantification of glacier change is a key variable within glacier monitoring, with the method used potentially being crucial to ensuring that data can be appropriately compared with environmental data. The topic and timescales of study (e.g. land/marine terminating environments; sub-annual/decadal/centennial/millennial timescales) often mean that different methods are more suitable for different problems. However, depending on the GIS/coding expertise of the user, some methods can potentially be time consuming to undertake, making large-scale studies problematic. In addition, examples exist where different users have nominally applied the same methods in different studies, though with minor methodological inconsistencies in their approach. In turn, this will have implications for data homogeneity where regional/global datasets may be constructed. Here, I present a simple toolbox scripted in a Matlab® environment that requires only glacier margin and glacier centreline data to quantify glacier length, glacier change between observations, rate of change, in addition to other metrics. The toolbox includes the option to apply the established centreline or curvilinear box methods, or a new method: the variable box method - designed for tidewater margins where box width is defined as the total width of the individual terminus observation. The toolbox is extremely flexible, and has the option to be applied as either Matlab® functions within user scripts, or via a graphical user interface (GUI) for those unfamiliar with a coding environment. In both instances, there is potential to apply the methods quickly to large datasets (100s-1000s of glaciers, with potentially similar numbers of observations each), thus ensuring large scale methodological consistency (and therefore data homogeneity) and allowing regional/global scale analyses to be achievable for those with limited GIS/coding experience. The toolbox has been evaluated against idealised scenarios demonstrating its accuracy, while feedback from undergraduate students who have trialled the toolbox is that it is intuitive and simple to use. When released, the toolbox will be free and open source allowing users to potentially modify, improve and expand upon the current version.

  20. Development and validation of an event-specific quantitative PCR method for genetically modified maize MIR162.

    PubMed

    Takabatake, Reona; Masubuchi, Tomoko; Futo, Satoshi; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Teshima, Reiko; Kurashima, Takeyo; Mano, Junichi; Kitta, Kazumi

    2014-01-01

    A novel real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) maize event, MIR162. We first prepared a standard plasmid for MIR162 quantification. The conversion factor (Cf) required to calculate the genetically modified organism (GMO) amount was empirically determined for two real-time PCR instruments, the Applied Biosystems 7900HT (ABI7900) and the Applied Biosystems 7500 (ABI7500), for which the determined Cf values were 0.697 and 0.635, respectively. To validate the developed method, a blind test was carried out in an interlaboratory study. The trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSDr), respectively. The determined biases were less than 25% and the RSDr values were less than 20% at all evaluated concentrations. These results suggested that the limit of quantitation of the method was 0.5%, and that the developed method would thus be suitable for practical analyses for the detection and quantification of MIR162.
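
    For context, a conversion factor of this kind is typically applied as follows: the measured ratio of event-specific to endogenous-reference copy numbers is divided by the Cf determined on pure GM material to give the GM content. The sketch below illustrates this arithmetic with invented copy numbers; only the ABI7900 Cf value is taken from the abstract.

```python
# Typical use of a conversion factor (Cf) in event-specific qPCR GMO
# quantification. Copy numbers below are illustrative, not study data.

def gmo_percent(event_copies: float, reference_copies: float, cf: float) -> float:
    """GM content (%) from event and endogenous-gene copy numbers."""
    return (event_copies / reference_copies) / cf * 100.0

# Cf for the ABI7900 instrument as reported above.
print(gmo_percent(event_copies=3.5e3, reference_copies=1.0e6, cf=0.697))
```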

  1. Strawberry: Fast and accurate genome-guided transcript reconstruction and quantification from RNA-Seq.

    PubMed

    Liu, Ruolin; Dickerson, Julie

    2017-11-01

    We propose a novel method and software tool, Strawberry, for transcript reconstruction and quantification from RNA-Seq data under the guidance of genome alignment and independent of gene annotation. Strawberry consists of two modules: assembly and quantification. The novelty of Strawberry is that the two modules use different optimization frameworks but share the same data graph structure, which allows a highly efficient, expandable and accurate algorithm for dealing with large data. The assembly module parses aligned reads into splicing graphs and uses network flow algorithms to select the most likely transcripts. The quantification module uses a latent class model to assign read counts from the nodes of splicing graphs to transcripts. Strawberry simultaneously estimates the transcript abundances and corrects for sequencing bias through an EM algorithm. Based on simulations, Strawberry outperforms Cufflinks and StringTie in terms of both assembly and quantification accuracy. In an evaluation on a real data set, the transcript expression estimated by Strawberry has the highest correlation with NanoString probe counts, an independent experimental measure of transcript expression. Strawberry is written in C++14 and is available as open source software at https://github.com/ruolin/strawberry under the MIT license.
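
    The quantification step belongs to the familiar family of read-assignment EM algorithms. The sketch below is a textbook version of that idea, not Strawberry's actual latent class model: reads compatible with several transcripts are fractionally assigned in the E-step, and abundances are re-estimated in the M-step.

```python
# Generic EM for assigning ambiguous reads to transcripts (a textbook
# sketch of the algorithm family used for quantification above).
import numpy as np

# compat[r, t] = 1 if read r is compatible with transcript t.
compat = np.array([[1, 1, 0],
                   [1, 0, 1],
                   [0, 1, 1],
                   [1, 1, 1]], dtype=float)
n_reads, n_tx = compat.shape

theta = np.full(n_tx, 1.0 / n_tx)            # initial abundances
for _ in range(200):
    # E-step: fractional assignment of each read to compatible transcripts.
    weights = compat * theta
    weights /= weights.sum(axis=1, keepdims=True)
    # M-step: abundances proportional to expected assigned read counts.
    theta = weights.sum(axis=0) / n_reads

print("estimated abundances:", np.round(theta, 3))
```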

  2. Method development towards qualitative and semi-quantitative analysis of multiple pesticides from food surfaces and extracts by desorption electrospray ionization mass spectrometry as a preselective tool for food control.

    PubMed

    Gerbig, Stefanie; Stern, Gerold; Brunn, Hubertus E; Düring, Rolf-Alexander; Spengler, Bernhard; Schulz, Sabine

    2017-03-01

    Direct analysis of fruit and vegetable surfaces is an important tool for in situ detection of food contaminants such as pesticides. We tested three different ways to prepare samples for the qualitative desorption electrospray ionization mass spectrometry (DESI-MS) analysis of 32 pesticides found on nine authentic fruits collected from food control. The best recovery rates for topically applied pesticides (88%) were found by analyzing the surface of a glass slide that had been rubbed against the surface of the food. The pesticide concentration in all samples was at or below the maximum residue level allowed. In addition to the high sensitivity of the method for qualitative analysis, quantitative or at least semi-quantitative information is needed in food control. We developed a DESI-MS method for the simultaneous determination of linear calibration curves of multiple pesticides of the same chemical class using normalization to one internal standard (ISTD). The method was first optimized for food extracts and subsequently evaluated for the quantification of pesticides in three authentic food extracts. Next, pesticides and the ISTD were applied directly onto food surfaces, and the corresponding calibration curves were obtained. The determination of linear calibration curves was still feasible, as demonstrated for three different food surfaces. This proof-of-principle method was used to simultaneously quantify two pesticides on an authentic sample, showing that the method developed could serve as a fast and simple preselective tool for the disclosure of pesticide regulation violations. Graphical abstract: multiple pesticide residues were detected and quantified in situ from an authentic set of food items and extracts in a proof-of-principle study.
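
    The ISTD-normalized calibration described above can be illustrated as follows: the analyte/ISTD intensity ratio is regressed against the spiked concentration and the fitted line is inverted for an unknown. All intensities and levels below are invented for illustration.

```python
# ISTD-normalized linear calibration, then inversion for an unknown.
import numpy as np

conc = np.array([0.1, 0.5, 1.0, 2.0, 5.0])          # spiked levels (mg/kg)
analyte = np.array([120, 610, 1180, 2420, 5900.])   # analyte intensities
istd = np.array([1000, 980, 1020, 995, 1010.])      # internal standard

ratio = analyte / istd                              # normalization to ISTD
slope, intercept = np.polyfit(conc, ratio, 1)       # linear calibration

unknown_ratio = 2300 / 1005
print("estimated concentration:", (unknown_ratio - intercept) / slope)
```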

  3. Final Technical Report: Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knio, Omar M.

    QUEST is a SciDAC Institute comprising Sandia National Laboratories, Los Alamos National Laboratory, University of Southern California, Massachusetts Institute of Technology, University of Texas at Austin, and Duke University. The mission of QUEST is to: (1) develop a broad class of uncertainty quantification (UQ) methods/tools, and (2) provide UQ expertise and software to other SciDAC projects, thereby enabling/guiding their UQ activities. The Duke effort focused on the development of algorithms and utility software for non-intrusive sparse UQ representations, and on participation in the organization of annual workshops and tutorials to disseminate UQ tools to the community and to gather input in order to adapt approaches to the needs of SciDAC customers. In particular, fundamental developments were made in (a) multiscale stochastic preconditioners, (b) gradient-based approaches to inverse problems, (c) adaptive pseudo-spectral approximations, (d) stochastic limit cycles, and (e) sensitivity analysis tools for noisy systems. In addition, large-scale demonstrations were performed, namely in the context of ocean general circulation models.

  4. AVQS: attack route-based vulnerability quantification scheme for smart grid.

    PubMed

    Ko, Jongbin; Lim, Hyunwoo; Lee, Seokjun; Shon, Taeshik

    2014-01-01

    A smart grid is a large, consolidated electrical grid system that includes heterogeneous networks and systems. Because of this network connectivity, a smart grid system is exposed to potential security threats. To address this problem, we develop and apply a novel scheme to measure vulnerability in a smart grid domain. Vulnerability quantification can be the first step in security analysis because it helps prioritize security problems. However, existing vulnerability quantification schemes are not suitable for smart grids because they do not consider network vulnerabilities. We propose a novel attack route-based vulnerability quantification scheme that uses a network vulnerability score and an end-to-end security score, depending on the specific smart grid network environment, to calculate the vulnerability score for a particular attack route. To evaluate the proposed approach, we derive several attack scenarios from the advanced metering infrastructure domain. The experimental results of the proposed approach and the existing common vulnerability scoring system clearly show that network connectivity must be considered for better-optimized vulnerability quantification.
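
    As a toy illustration of route-based scoring (not the paper's AVQS formula), the sketch below aggregates per-node vulnerability scores along candidate attack routes and ranks the routes; the node scores and the aggregation rule are assumptions made for illustration only.

```python
# Toy route-based vulnerability ranking. The aggregation rule here is
# an invented stand-in; AVQS combines a network vulnerability score
# with an end-to-end security score in its own specific way.
node_score = {"meter": 4.2, "collector": 6.1, "headend": 7.8, "wan": 5.0}

routes = [
    ["meter", "collector", "headend"],
    ["meter", "wan", "headend"],
]

def route_score(route):
    # Example rule: mean node score, weighted up by route length so
    # longer reachable chains rank as more exposed.
    scores = [node_score[n] for n in route]
    return sum(scores) / len(scores) * (1 + 0.1 * (len(route) - 1))

for r in sorted(routes, key=route_score, reverse=True):
    print(" -> ".join(r), round(route_score(r), 2))
```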

  5. Practical quantification of necrosis in histological whole-slide images.

    PubMed

    Homeyer, André; Schenk, Andrea; Arlt, Janine; Dahmen, Uta; Dirsch, Olaf; Hahn, Horst K

    2013-06-01

    Since the histological quantification of necrosis is a common task in medical research and practice, we evaluate different image analysis methods for quantifying necrosis in whole-slide images. In a practical usage scenario, we assess the impact of different classification algorithms and feature sets on both accuracy and computation time. We show how a well-chosen combination of multiresolution features and an efficient postprocessing step enables the accurate quantification of necrosis in gigapixel images in less than a minute. The results are general enough to be applied to other areas of histological image analysis as well. Copyright © 2013 Elsevier Ltd. All rights reserved.

  6. Source separation on hyperspectral cube applied to dermatology

    NASA Astrophysics Data System (ADS)

    Mitra, J.; Jolivot, R.; Vabres, P.; Marzani, F. S.

    2010-03-01

    This paper proposes a method for quantifying the components underlying human skin that are thought to be responsible for the effective reflectance spectrum of the skin over the visible wavelength range. The method is based on independent component analysis, assuming that the epidermal melanin and the dermal haemoglobin absorbance spectra are independent of each other. The method extracts the source spectra that correspond to the ideal absorbance spectra of melanin and haemoglobin. The noisy melanin spectrum is corrected using a polynomial fit, and the quantifications associated with it are re-estimated. The results produce feasible quantifications of each source component in the examined skin patch.
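
    A minimal sketch of the ICA unmixing idea, using scikit-learn's FastICA on synthetic stand-ins for melanin and haemoglobin absorbance spectra; the spectral shapes and mixing weights are invented.

```python
# Recover two "independent" absorbance sources from their mixtures,
# as done above for melanin and haemoglobin. Spectra are synthetic.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(2)
wl = np.linspace(450, 700, 200)                    # visible wavelengths (nm)

melanin = np.exp(-wl / 150)                        # smooth decaying absorbance
haemoglobin = np.exp(-((wl - 560) ** 2) / 300)     # band around 560 nm
sources = np.vstack([melanin, haemoglobin])

mixing = rng.uniform(0.2, 1.0, size=(10, 2))       # 10 observed skin spectra
observed = mixing @ sources + rng.normal(0, 0.01, (10, 200))

ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(observed.T).T        # unmixed source spectra
print("recovered source shape:", recovered.shape)
```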

  7. Quantification of trans-1,4-polyisoprene in Eucommia ulmoides by fourier transform infrared spectroscopy and pyrolysis-gas chromatography/mass spectrometry.

    PubMed

    Takeno, Shinya; Bamba, Takeshi; Nakazawa, Yoshihisa; Fukusaki, Eiichiro; Okazawa, Atsushi; Kobayashi, Akio

    2008-04-01

    Commercial development of trans-1,4-polyisoprene from Eucommia ulmoides Oliver (EU-rubber) requires specific knowledge on selection of high-rubber-content lines and establishment of agronomic cultivation methods for achieving maximum EU-rubber yield. The development can be facilitated by high-throughput and highly sensitive analytical techniques for EU-rubber extraction and quantification. In this paper, we described an efficient EU-rubber extraction method, and validated that the accuracy was equivalent to that of the conventional Soxhlet extraction method. We also described a highly sensitive quantification method for EU-rubber by Fourier transform infrared spectroscopy (FT-IR) and pyrolysis-gas chromatography/mass spectrometry (PyGC/MS). We successfully applied the extraction/quantification method for study of seasonal changes in EU-rubber content and molecular weight distribution.

  8. Comprehensive Design Reliability Activities for Aerospace Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Christenson, R. L.; Whitley, M. R.; Knight, K. C.

    2000-01-01

    This technical publication describes the methodology, model, software tool, input data, and analysis results that support aerospace design reliability studies. The focus of these activities is on the mechanical design reliability of propulsion systems. The goal of these activities is to support design from a reliability perspective. Paralleling performance analyses in schedule and method, this requires the proper use of metrics in a validated reliability model useful for design, sensitivity, and trade studies. Design reliability analysis in this view is one of several critical design functions. A design reliability method is detailed and two example analyses are provided: one qualitative and the other quantitative. The use of aerospace and commercial data sources for quantification is discussed and sources are listed. A tool that was developed to support both types of analyses is presented. Finally, special topics discussed include the development of design criteria, issues of reliability quantification, quality control, and reliability verification.

  9. A reversible fluorescent probe for real-time live-cell imaging and quantification of endogenous hydropolysulfides.

    PubMed

    Umezawa, Keitaro; Kamiya, Mako; Urano, Yasuteru

    2018-05-23

    The chemical biology of reactive sulfur species, including hydropolysulfides, has been a subject undergoing intense study in recent years, but further understanding of their 'intact' function in living cells has been limited due to a lack of appropriate analytical tools. In order to overcome this limitation, we developed a new type of fluorescent probe which reversibly and selectively reacts to hydropolysulfides. The probe enables live-cell visualization and quantification of endogenous hydropolysulfides without interference from intrinsic thiol species such as glutathione. Additionally, real-time reversible monitoring of oxidative-stress-induced fluctuation of intrinsic hydropolysulfides has been achieved with a temporal resolution in the order of seconds, a result which has not yet been realized using conventional methods. These results reveal the probe's versatility as a new fluorescence imaging tool to understand the function of intracellular hydropolysulfides. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Estimation of construction and demolition waste volume generation in new residential buildings in Spain.

    PubMed

    Villoria Sáez, Paola; del Río Merino, Mercedes; Porras-Amores, César

    2012-02-01

    The management planning of construction and demolition (C&D) waste relies on a single indicator, which does not provide sufficiently detailed information. More precise and innovative indicators therefore need to be determined and implemented. The aim of this research work is to improve existing C&D waste quantification tools for the construction of new residential buildings in Spain. For this purpose, several housing projects were studied to estimate the C&D waste generated during their construction process. This paper determines the values of three indicators for estimating the generation of C&D waste in new residential buildings in Spain, itemized by type of waste and construction stage. The inclusion of two more accurate indicators, in addition to the global one commonly in use, provides a significant improvement in C&D waste quantification tools and management planning.
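
    An indicator-based estimate of this kind reduces to multiplying a per-stage, per-waste-type coefficient by the built floor area. The sketch below illustrates the arithmetic; the coefficients are invented placeholders, not the paper's indicator values.

```python
# Illustrative use of stage- and type-itemized waste indicators:
# generated volume = indicator (m^3 per m^2 built) x built floor area.
# All coefficient values below are invented placeholders.

indicators = {                   # m^3 of waste per m^2 built, by stage
    "structure": {"concrete": 0.010, "packaging": 0.002},
    "masonry":   {"ceramic": 0.015, "packaging": 0.004},
    "finishes":  {"mixed": 0.008, "packaging": 0.005},
}

built_area_m2 = 4500.0           # hypothetical new residential building

for stage, by_type in indicators.items():
    for waste_type, coef in by_type.items():
        print(f"{stage:10s} {waste_type:10s} {coef * built_area_m2:7.1f} m^3")
```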

  11. Skeletal Muscle Ultrasound in Critical Care: A Tool in Need of Translation.

    PubMed

    Mourtzakis, Marina; Parry, Selina; Connolly, Bronwen; Puthucheary, Zudin

    2017-10-01

    With the emerging interest in documenting and understanding muscle atrophy and function in critically ill patients and survivors, ultrasonography has transformational potential for measurement of muscle quantity and quality. We discuss the importance of quantifying skeletal muscle in the intensive care unit setting. We also identify the merits and limitations of various modalities that are capable of accurately and precisely measuring muscularity. Ultrasound is emerging as a potentially powerful tool for skeletal muscle quantification; however, there are key challenges that need to be addressed in future work to ensure useful interpretation and comparability of results across diverse observational and interventional studies. Ultrasound presents several methodological challenges, and ultimately muscle quantification combined with metabolic, nutritional, and functional markers will allow optimal patient assessment and prognosis. Moving forward, we recommend that publications include greater detail on landmarking, repeated measures, identification of muscle that was not assessable, and reproducible protocols to more effectively compare results across different studies.

  12. Use of multiple competitors for quantification of human immunodeficiency virus type 1 RNA in plasma.

    PubMed

    Vener, T; Nygren, M; Andersson, A; Uhlén, M; Albert, J; Lundeberg, J

    1998-07-01

    Quantification of human immunodeficiency virus type 1 (HIV-1) RNA in plasma has rapidly become an important tool in basic HIV research and in the clinical care of infected individuals. Here, a quantitative HIV assay based on competitive reverse transcription-PCR with multiple competitors was developed. Four RNA competitors containing the same PCR primer binding sequences as the viral HIV-1 RNA target were constructed. One of the PCR primers was fluorescently labeled, which facilitated discrimination between the viral RNA and competitor amplicons by fragment analysis with conventional automated sequencers. The coamplification of known amounts of the RNA competitors provided the means to establish internal calibration curves for the individual reactions, eliminating tube-to-tube variation. Calibration curves were created from the peak areas, which were proportional to the starting amount of each competitor. The fluorescence detection format was expanded to provide a dynamic range of more than 5 log units. This quantitative assay allowed for reproducible analysis of samples containing as few as 40 viral copies of HIV-1 RNA per reaction. The within- and between-run coefficients of variation were <24% (range, 10 to 24) and <36% (range, 27 to 36), respectively. The high reproducibility (standard deviation, <0.13 log) of the overall procedure for quantification of HIV-1 RNA in plasma, including sample preparation, amplification, and detection variations, allowed reliable detection of a 0.5-log change in RNA viral load. The assay could be a useful tool for monitoring HIV-1 disease progression and antiviral treatment and can easily be adapted to the quantification of other pathogens.
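
    The per-reaction internal calibration can be sketched as follows: the known competitor copy numbers and their measured peak areas define a log-log calibration line from which the viral copy number is read off. All areas and copy numbers below are invented.

```python
# Per-reaction internal calibration from co-amplified competitors.
import numpy as np

competitor_copies = np.array([1e2, 1e3, 1e4, 1e5])
competitor_areas = np.array([2.1e3, 1.9e4, 2.2e5, 2.0e6])  # peak areas

slope, intercept = np.polyfit(np.log10(competitor_copies),
                              np.log10(competitor_areas), 1)

viral_area = 5.5e4                      # peak area of the viral amplicon
viral_copies = 10 ** ((np.log10(viral_area) - intercept) / slope)
print(f"estimated viral load: {viral_copies:.0f} copies/reaction")
```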

  13. Quantification of fossil fuel CO2 emissions on the building/street scale for a large U.S. city.

    PubMed

    Gurney, Kevin R; Razlivanov, Igor; Song, Yang; Zhou, Yuyu; Benes, Bedrich; Abdul-Massih, Michel

    2012-11-06

    In order to advance the scientific understanding of carbon exchange with the land surface, build an effective carbon monitoring system, and contribute to quantitatively based U.S. climate change policy interests, fine spatial and temporal quantification of fossil fuel CO2 emissions, the primary greenhouse gas, is essential. Called the "Hestia Project", this research effort is the first to use bottom-up methods to quantify all fossil fuel CO2 emissions down to the scale of individual buildings, road segments, and industrial/electricity production facilities on an hourly basis for an entire urban landscape. Here, we describe the methods used to quantify the on-site fossil fuel CO2 emissions across the city of Indianapolis, IN. This effort combines a series of data sets and simulation tools such as a building energy simulation model, traffic data, power production reporting, and local air pollution reporting. The system is general enough to be applied to any large U.S. city and holds tremendous potential as a key component of a carbon-monitoring system in addition to enabling efficient greenhouse gas mitigation and planning. We compare the natural gas component of our fossil fuel CO2 emissions estimate to consumption data provided by the local gas utility. At the zip code level, we achieve a bias-adjusted Pearson r correlation value of 0.92 (p < 0.001).

  14. New approaches for the standardization and validation of a real-time qPCR assay using TaqMan probes for quantification of yellow fever virus on clinical samples with high quality parameters

    PubMed Central

    Fernandes-Monteiro, Alice G; Trindade, Gisela F; Yamamura, Anna MY; Moreira, Otacilio C; de Paula, Vanessa S; Duarte, Ana Cláudia M; Britto, Constança; Lima, Sheila Maria B

    2015-01-01

    The development and production of viral vaccines, in general, involve several steps that require the monitoring of viral load throughout the entire process. Applying a 2-step quantitative reverse transcription real-time PCR (RT-qPCR) assay, viral load can be measured and monitored in a few hours. In this context, the development, standardization and validation of an RT-qPCR test to quickly and efficiently quantify yellow fever virus (YFV) at all stages of vaccine production are extremely important. To serve this purpose, we used a plasmid construction containing the NS5 region from 17DD YFV to generate the standard curve and to evaluate parameters such as linearity, precision and specificity against other flaviviruses. Furthermore, we defined the limit of detection as 25 copies/reaction and the limit of quantification as 100 copies/reaction for the test. To ensure the quality of the method, reference controls were established in order to avoid false negative results. The TaqMan probe-based RT-qPCR technique standardized herein proved to be effective for determining yellow fever viral load both in vivo and in vitro, thus becoming a very important tool to assure quality control for vaccine production and evaluation of viremia after vaccination or YF disease. PMID:26011746

  15. A droplet microfluidics platform for rapid microalgal growth and oil production analysis.

    PubMed

    Kim, Hyun Soo; Guzman, Adrian R; Thapa, Hem R; Devarenne, Timothy P; Han, Arum

    2016-08-01

    Microalgae have emerged as a promising source for producing future renewable biofuels. Developing better microalgal strains with faster growth and higher oil production rates is one of the major routes towards economically viable microalgal biofuel production. In this work, we present a droplet microfluidics-based microalgae analysis platform capable of measuring growth and oil content of various microalgal strains with single-cell resolution in a high-throughput manner. The platform allows for encapsulating a single microalgal cell into a water-in-oil emulsion droplet and tracking the growth and division of the encapsulated cell over time, followed by on-chip oil quantification. The key feature of the developed platform is its capability to fluorescently stain microalgae within microdroplets for oil content quantification. The performance of the developed platform was characterized using the unicellular microalga Chlamydomonas reinhardtii and the colonial microalga Botryococcus braunii. The application of the platform in quantifying growth and oil accumulation was successfully confirmed using C. reinhardtii under different culture conditions, namely nitrogen-replete and nitrogen-limited conditions. These results demonstrate the capability of this platform as a rapid screening tool that can be applied to a wide range of microalgal strains for analyzing growth and oil accumulation characteristics relevant to biofuel strain selection and development. Biotechnol. Bioeng. 2016;113: 1691-1701. © 2016 Wiley Periodicals, Inc.

  16. Visualization and Non-Destructive Quantification of Inkjet-Printed Pharmaceuticals on Different Substrates Using Raman Spectroscopy and Raman Chemical Imaging.

    PubMed

    Edinger, Magnus; Bar-Shalom, Daniel; Rantanen, Jukka; Genina, Natalja

    2017-05-01

    The purpose of this study was to investigate the applicability of Raman spectroscopy for the visualization and quantification of inkjet-printed pharmaceuticals. Haloperidol was used as a model active pharmaceutical ingredient (API), and a printable ink base containing lactic acid and ethanol was developed. Inkjet printing technology was used to apply the haloperidol ink onto three different substrates. Custom-made inorganic compacts and dry foam, as well as marketed paracetamol tablets, were used as the substrates. Therapeutic personalized doses were printed using one to ten printing rounds on the substrates. The haloperidol content in the finished dosage forms was determined by high-performance liquid chromatography (HPLC). The distribution of the haloperidol on the dosage forms was visualized using Raman chemical imaging combined with principal component analysis (PCA). Raman spectroscopy combined with modeling by partial least squares (PLS) regression was used to establish a quantitative model of the haloperidol content in the printed dosage forms. A good prediction of the haloperidol content was achieved for the inorganic compacts, while a slightly poorer prediction was observed for the paracetamol tablets. It was not possible to quantify haloperidol on the dry foam due to the low and varying density of the substrate. Raman spectroscopy is a useful tool for the visualization and quality control of inkjet-printed personalized medicine.

  18. Biofilm development of an opportunistic model bacterium analysed at high spatiotemporal resolution in the framework of a precise flow cell

    PubMed Central

    Lim, Chun Ping; Mai, Phuong Nguyen Quoc; Roizman Sade, Dan; Lam, Yee Cheong; Cohen, Yehuda

    2016-01-01

    The life of bacteria is governed by the physical constraints of the microscale, which is dominated by fast diffusion and flow at low Reynolds numbers. Microbial biofilms are structurally and functionally heterogeneous, and their development is suggested to be interactively related to their microenvironments. In this study, we were guided by the challenging requirements of precise tools and engineered procedures for achieving reproducible experiments at high spatial and temporal resolution. Here, we developed a robust precision-engineering approach allowing for the quantification of real-time, high-content imaging of biofilm behaviour under well-controlled flow conditions. Through the merging of engineering and microbial ecology, we present a rigorous methodology to quantify biofilm development at resolutions of a single micrometre and a single minute, using a newly developed flow cell. We designed and fabricated a high-precision flow cell to create defined and reproducible flow conditions. We applied high-content confocal laser scanning microscopy and developed image quantification using a model biofilm of a defined opportunistic strain, Pseudomonas putida OUS82. We observed complex patterns in the early events of biofilm formation, which were followed by total dispersal. These patterns were closely related to the flow conditions. These biofilm behavioural phenomena were found to be highly reproducible, despite the heterogeneous nature of biofilms. PMID:28721252

  19. Application of miniaturized near-infrared spectroscopy for quality control of extemporaneous orodispersible films.

    PubMed

    Foo, Wen Chin; Widjaja, Effendi; Khong, Yuet Mei; Gokhale, Rajeev; Chan, Sui Yung

    2018-02-20

    Extemporaneous oral preparations are routinely compounded in the pharmacy due to a lack of suitable formulations for special populations. Such small-scale pharmacy preparations also present an avenue for individualized pharmacotherapy. Orodispersible films (ODF) have increasingly been evaluated as a suitable dosage form for extemporaneous oral preparations. Nevertheless, as with all other extemporaneous preparations, safety and quality remain a concern. Although the United States Pharmacopeia (USP) recommends analytical testing of compounded preparations for quality assurance, pharmaceutical assays are typically not routinely performed for such non-sterile pharmacy preparations, due to the complexity and high cost of conventional assay methods such as high-performance liquid chromatography (HPLC). Spectroscopic methods including Raman, infrared and near-infrared spectroscopy have been successfully applied as quality control tools in industry. The state-of-the-art benchtop spectrometers used in those studies have the advantage of superior resolution and performance, but are not suitable for use in a small-scale pharmacy setting. In this study, we investigated the application of a miniaturized near-infrared (NIR) spectrometer as a quality control tool for the identification and quantification of drug content in extemporaneous ODFs. Miniaturized NIR spectroscopy is suitable for small-scale pharmacy applications in view of its small size, portability, simple user interface, rapid measurement and real-time prediction results. Nevertheless, the challenge with miniaturized NIR spectroscopy is its lower resolution compared to state-of-the-art benchtop equipment. We successfully developed NIR spectroscopy calibration models for the identification of ODFs containing five different drugs, and for the quantification of drug content in ODFs containing 2-10 mg ondansetron (OND). The qualitative model for drug identification produced 100% prediction accuracy. The quantitative model to predict OND drug content in ODFs was divided into two calibrations for improved accuracy: Calibrations I and II covered the 2-4 mg and 4-10 mg ranges, respectively. Validation was performed for method accuracy, linearity and precision. In conclusion, this study demonstrates the feasibility of miniaturized NIR spectroscopy as a quality control tool for small-scale pharmacy preparations. Due to its non-destructive nature, every dosage unit can be tested, thus having a positive impact on patient safety. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Big Data Geo-Analytical Tool Development for Spatial Analysis Uncertainty Visualization and Quantification Needs

    NASA Astrophysics Data System (ADS)

    Rose, K.; Bauer, J. R.; Baker, D. V.

    2015-12-01

    As big data computing capabilities are increasingly paired with spatial analytical tools and approaches, there is a need to ensure that the uncertainty associated with the datasets used in these analyses is adequately incorporated and portrayed in results. Often the products of spatial analyses, big data and otherwise, are developed using discontinuous, sparse, and often point-driven data to represent continuous phenomena. Results from these analyses are generally presented without clear explanations of the uncertainty associated with the interpolated values. The Variable Grid Method (VGM) offers users a flexible approach designed for application to a variety of analyses where there is a need to study, evaluate, and analyze spatial trends and patterns while maintaining connection to, and communicating, the uncertainty in the underlying spatial datasets. The VGM outputs a simultaneous visualization of the spatial data analyses and a quantification of the underlying uncertainties, which can be calculated from sample density, sample variance, interpolation error, or uncertainty derived from multiple simulations. In this presentation we will show how we are utilizing Hadoop to store data and perform spatial analysis through the development of custom Spark and MapReduce applications that incorporate ESRI Hadoop libraries. The team will present custom "big data" geospatial applications that run on the Hadoop cluster and integrate the team's probabilistic VGM approach with ESRI ArcMap. The VGM-Hadoop tool has been built as a multi-step MapReduce application running on the Hadoop cluster for the purpose of data reduction. This reduction is accomplished by generating multi-resolution, non-overlapping, attributed topology that is then further processed using ESRI's geostatistical analyst to convey a probabilistic model of a chosen study region. Finally, we will share our approach for implementing data reduction and topology generation via custom multi-step Hadoop applications, performance benchmarking comparisons, and Hadoop-centric opportunities for greater parallelization of geospatial operations. The presentation includes examples of the approach being applied to a range of subsurface geospatial studies (e.g., induced seismicity risk).

  1. Development of a quantitative loop-mediated isothermal amplification assay for the field detection of Erysiphe necator.

    PubMed

    Thiessen, Lindsey D; Neill, Tara M; Mahaffee, Walter F

    2018-01-01

    Plant pathogen detection systems have been useful tools for monitoring inoculum presence and initiating management schedules. Recently, a loop-mediated isothermal amplification (LAMP) assay was successfully designed for field use in the grape powdery mildew pathosystem; however, false negatives and false positives were prevalent in grower-conducted assays due to the difficulty of perceiving the magnesium pyrophosphate precipitate at low DNA concentrations. A quantitative LAMP (qLAMP) assay using a fluorescence resonance energy transfer-based probe was assessed by grape growers in the Willamette Valley of Oregon. Custom impaction spore samplers were placed at a research vineyard and six commercial vineyard locations, and were tested bi-weekly by the lab and by growers. Grower-conducted qLAMP assays used a beta version of the Smart-DART handheld LAMP reaction devices (Diagenetix, Inc., Honolulu, HI, USA), connected to Android 4.4-enabled, Bluetooth-capable Nexus 7 tablets for output. Quantification by a quantitative PCR assay was taken as the reference for comparing the lab and grower qLAMP quantification. Growers were able to conduct and interpret qLAMP results; however, Erysiphe necator inoculum quantification was unreliable using the beta Smart-DART devices. The qLAMP assay was sensitive to one spore in early testing, but sensitivity decreased to >20 spores by the end of the trial. The qLAMP assay is not likely to be a suitable management tool for grape powdery mildew, given its loss of sensitivity and the decreasing cost and increasing portability of other, more reliable molecular tools.

  2. EpiProfile Quantifies Histone Peptides With Modifications by Extracting Retention Time and Intensity in High-resolution Mass Spectra*

    PubMed Central

    Yuan, Zuo-Fei; Lin, Shu; Molden, Rosalynn C.; Cao, Xing-Jun; Bhanu, Natarajan V.; Wang, Xiaoshi; Sidoli, Simone; Liu, Shichong; Garcia, Benjamin A.

    2015-01-01

    Histone post-translational modifications contribute to chromatin function through their chemical properties, which influence chromatin structure, and through their ability to recruit chromatin-interacting proteins. Nanoflow liquid chromatography coupled with high-resolution tandem mass spectrometry (nanoLC-MS/MS) has emerged as the most suitable technology for global histone modification analysis because of the high sensitivity and high mass accuracy of this approach, which provides confident identification. However, analysis of histones with this method is especially challenging because of the large number and variety of isobaric histone peptides and the high dynamic range of histone peptide abundances. Here, we introduce EpiProfile, a software tool that discriminates isobaric histone peptides using the distinguishing fragment ions in their tandem mass spectra and extracts the chromatographic area under the curve using prior knowledge about peptide retention time. The accuracy of EpiProfile was evaluated by analysis of mixtures containing different ratios of synthetic histone peptides. In addition to label-free quantification of histone peptides, EpiProfile is flexible and can quantify different types of isotopically labeled histone peptides. EpiProfile is unique in generating layouts (i.e., relative retention times) of histone peptides when compared with manual quantification of the data and other programs (such as Skyline), filling the need for an automatic and freely available tool to quantify labeled and non-labeled modified histone peptides. In summary, EpiProfile is a valuable nanoLC-MS/MS-based quantification tool for histone peptides, which can also be adapted to analyze non-histone protein samples. PMID:25805797
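
    The core quantification step, retention-time-guided extraction of a chromatographic peak area, can be sketched as below: the signal is integrated only inside a window around the peptide's known retention time, leaving a nearby isobaric peak outside the window. The data are synthetic.

```python
# Retention-time-guided extraction of a chromatographic peak area.
import numpy as np

rt = np.linspace(0, 60, 6000)                          # minutes
signal = (1e5 * np.exp(-((rt - 31.4) ** 2) / 0.02)     # target peptide
          + 4e4 * np.exp(-((rt - 33.0) ** 2) / 0.02))  # nearby isobaric peak

known_rt, half_window = 31.4, 0.5                      # prior knowledge (min)
mask = np.abs(rt - known_rt) < half_window
area = np.sum(signal[mask]) * (rt[1] - rt[0])          # area under the curve
print(f"extracted area: {area:.3e}")
```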

  3. Quantification of groundwater recharge in urban environments.

    PubMed

    Tubau, Isabel; Vázquez-Suñé, Enric; Carrera, Jesús; Valhondo, Cristina; Criollo, Rotman

    2017-08-15

    Groundwater management in urban areas requires detailed knowledge of the hydrogeological system as well as adequate tools for predicting the amount of groundwater and the evolution of water quality. In that context, a key difference between urban and natural areas lies in recharge evaluation. A large number of studies published since the 1990s evaluate recharge in urban areas, with no specific methodology. Most of these methods show that rates of recharge are generally higher in urban settings than in natural settings. Methods such as mixing ratios or groundwater modeling can be used to better estimate the relative importance of different sources of recharge and may prove to be good tools for total recharge evaluation. However, accurate evaluation of this input is difficult. The objective here is to present a methodology to help overcome those difficulties and to quantify the variability in space and time of recharge into aquifers in urban areas. Recharge calculations were initially performed by defining and applying analytical equations, and validation was assessed based on groundwater flow and solute transport modeling. The methodology is applicable to complex systems by considering the temporal variability of all water sources. This allows managers of urban groundwater to evaluate the relative contribution of different recharge sources at a city scale by considering quantity and quality factors. The methodology is applied to the assessment of recharge sources in the Barcelona city aquifers. Copyright © 2017 Elsevier B.V. All rights reserved.
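
    A minimal sketch of the mixing-ratio idea: solve for the fractions of candidate recharge sources (e.g., water mains leakage, sewer leakage, rainfall) that reproduce the tracer concentrations measured in groundwater, with the fractions constrained to sum to one. Concentrations below are illustrative, not Barcelona data.

```python
# End-member mixing for recharge source apportionment.
import numpy as np

# Rows: tracers (e.g. Cl-, SO4--); columns: end members
# (water mains leakage, sewer leakage, rainfall infiltration).
end_members = np.array([[30.0, 120.0, 10.0],    # Cl-  (mg/L)
                        [40.0,  90.0,  5.0]])   # SO4-- (mg/L)
sample = np.array([60.0, 45.0])                 # measured in groundwater

# Append the mass-balance constraint: sum(fractions) = 1.
A = np.vstack([end_members, np.ones(3)])
b = np.append(sample, 1.0)

fractions, *_ = np.linalg.lstsq(A, b, rcond=None)
print("mixing ratios (mains, sewer, rain):", np.round(fractions, 2))
```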

  4. Predicting cell viability within tissue scaffolds under equiaxial strain: multi-scale finite element model of collagen-cardiomyocytes constructs.

    PubMed

    Elsaadany, Mostafa; Yan, Karen Chang; Yildirim-Ayan, Eda

    2017-06-01

    Successful tissue engineering and regenerative therapy necessitate extensive knowledge of the mechanical milieu in engineered tissues and the resident cells. In this study, we have merged two powerful analysis tools, finite element analysis and stochastic analysis, to understand the mechanical strain within the tissue scaffold and residing cells and to predict cell viability upon applying mechanical strains. A continuum-based multi-length-scale finite element model (FEM) was created to simulate physiologically relevant equiaxial strain exposure on a cell-embedded tissue scaffold and to calculate the strain transferred to the tissue scaffold (macro-scale) and residing cells (micro-scale) under various equiaxial strains. The data from the FEM were used to predict cell viability under various equiaxial strain magnitudes using a stochastic damage criterion analysis. Model validation was conducted by mechanically straining cardiomyocyte-encapsulated collagen constructs using a custom-built mechanical loading platform (EQUicycler). The FEM quantified the strain gradients over the radial and longitudinal directions of the scaffolds and the cells residing in different areas of interest. Using the experimental viability data, the stochastic damage criterion, and the average cellular strains obtained from the multi-length-scale models, cellular viability was predicted and successfully validated. This methodology can provide a great tool for characterizing the mechanical stimulation of bioreactors used in tissue engineering applications, by quantifying mechanical strain and predicting variations in cellular viability due to the applied strain.

  5. NDE and SHM Simulation for CFRP Composites

    NASA Technical Reports Server (NTRS)

    Leckey, Cara A. C.; Parker, F. Raymond

    2014-01-01

    Ultrasound-based nondestructive evaluation (NDE) is a common technique for damage detection in composite materials. There is a need for advanced NDE that goes beyond damage detection to damage quantification and characterization in order to enable data-driven prognostics. The damage types that exist in carbon fiber-reinforced polymer (CFRP) composites include microcracking and delaminations, and can be initiated and grown via impact forces (due to ground vehicles, tool drops, bird strikes, etc.), fatigue, and extreme environmental changes. X-ray microfocus computed tomography data, among other methods, have shown that these damage types often result in voids/discontinuities of a complex volumetric shape. The specific damage geometry and location within ply layers affect damage growth. Realistic three-dimensional NDE and structural health monitoring (SHM) simulations can aid in the development and optimization of damage quantification and characterization techniques. This paper is an overview of ongoing work towards realistic NDE and SHM simulation tools for composites, and also discusses NASA's need for such simulation tools in aeronautics and spaceflight. The paper describes the development and implementation of a custom ultrasound simulation tool that is used to model ultrasonic wave interaction with realistic three-dimensional damage in CFRP composites. The custom code uses the elastodynamic finite integration technique and is parallelized to run efficiently on computing clusters or multicore machines.

  6. Meningococcal X polysaccharide quantification by high-performance anion-exchange chromatography using synthetic N-acetylglucosamine-4-phosphate as standard.

    PubMed

    Micoli, F; Adamo, R; Proietti, D; Gavini, M; Romano, M R; MacLennan, C A; Costantino, P; Berti, F

    2013-11-15

    A method for meningococcal X (MenX) polysaccharide quantification by high-performance anion-exchange chromatography with pulsed amperometric detection (HPAEC-PAD) is described. The polysaccharide is hydrolyzed by strong acidic treatment, and the peak of glucosamine-4-phosphate (4P-GlcN) is detected and measured after chromatography. Under the selected hydrolysis conditions, 4P-GlcN is the prevalent species formed, with GlcN accounting for less than 5% on a molar basis. The monomeric unit of the MenX polysaccharide, N-acetylglucosamine-4-phosphate (4P-GlcNAc), was used as the standard for the analysis. This method for MenX quantification is highly selective and sensitive, and it constitutes an important analytical tool for the development of a conjugate vaccine against MenX. Copyright © 2013 Elsevier Inc. All rights reserved.

  7. Development and validation of a fast and simple multi-analyte procedure for quantification of 40 drugs relevant to emergency toxicology using GC-MS and one-point calibration.

    PubMed

    Meyer, Golo M J; Weber, Armin A; Maurer, Hans H

    2014-05-01

    Diagnosis and prognosis of poisonings should be confirmed by comprehensive screening and reliable quantification of xenobiotics, for example by gas chromatography-mass spectrometry (GC-MS) or liquid chromatography-mass spectrometry (LC-MS). The turnaround time should be short enough to have an impact on clinical decisions. In emergency toxicology, quantification using full-scan acquisition is preferable because it allows screening and quantification of expected and unexpected drugs in one run. Therefore, a multi-analyte full-scan GC-MS approach with liquid-liquid extraction and one-point calibration was developed and validated for the quantification of 40 drugs relevant to emergency toxicology. Validation showed that 36 drugs could be determined quickly, accurately, and reliably in the range of upper therapeutic to toxic concentrations. Daily one-point calibration with calibrators stored for up to four weeks reduced the workload and the turnaround time to less than 1 h. In summary, the multi-analyte approach with simple liquid-liquid extraction, GC-MS identification, and quantification via fast one-point calibration was successfully applied to proficiency tests and real case samples. Copyright © 2013 John Wiley & Sons, Ltd.
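
    One-point calibration reduces to a single ratio, assuming a linear response through the origin: the analyte's concentration follows from the ratio of its peak area to that of the daily calibrator of known concentration. The numbers below are invented.

```python
# One-point calibration: linear response through the origin.

def one_point_conc(sample_area: float, cal_area: float, cal_conc: float) -> float:
    """Concentration from a single calibrator of known concentration."""
    return sample_area / cal_area * cal_conc

# e.g. a drug whose daily calibrator (2.0 mg/L) gave peak area 8.0e5:
print(one_point_conc(sample_area=1.3e6, cal_area=8.0e5, cal_conc=2.0), "mg/L")
```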

  8. Applied Imagistics of Ischaemic Heart a Survey. From the Epidemiology of Stable Angina In Order to Better Prevent Sudden Cardiac Death

    NASA Astrophysics Data System (ADS)

    Petruse, Radu Emanuil; Batâr, Sergiu; Cojan, Adela; Maniţiu, Ioan

    2014-11-01

    Coronary computed tomography angiography (CCTA) allows coronary artery visualization and the detection of coronary stenoses. In addition, it has been suggested as a novel, noninvasive modality for coronary atherosclerotic plaque detection, characterization, and quantification. Accurate identification of coronary plaques is challenging, especially for noncalcified plaques, due to many factors such as the small size of coronary arteries, reconstruction artifacts caused by irregular heartbeats, beam hardening, and partial volume averaging. The development of 16-, 32-, 64- and, most recently, 320-row multidetector CT not only increases the spatial and temporal resolution significantly, but also substantially increases the number of images to be interpreted by radiologists. In clinical practice, radiologists have to visually examine each coronary artery for suspicious stenoses using visualization tools such as multiplanar reformatting (MPR) and curved planar reformatting (CPR) provided by the review workstation.

  9. Uncertainty Quantification in Simulations of Epidemics Using Polynomial Chaos

    PubMed Central

    Santonja, F.; Chen-Charpentier, B.

    2012-01-01

    Mathematical models based on ordinary differential equations are a useful tool for studying the processes involved in epidemiology. Many models consider the parameters to be deterministic variables. In practice, however, the transmission parameters present large variability and cannot be determined exactly, so it is necessary to introduce randomness. In this paper, we present an application of the polynomial chaos approach to epidemiological mathematical models based on ordinary differential equations with random coefficients. Taking into account the variability of the transmission parameters of the model, this approach allows us to obtain an auxiliary system of differential equations, which is then integrated numerically to obtain the first- and second-order moments of the output stochastic processes. A sensitivity analysis based on the polynomial chaos approach is also performed to determine which parameters have the greatest influence on the results. As an example, we apply the approach to an obesity epidemic model. PMID:22927889
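
    A simple Monte Carlo stand-in for this uncertainty propagation (the paper does it more efficiently with polynomial chaos expansions): draw the transmission parameter from its distribution, integrate the ODE for each draw, and compute the first two moments of an output. The SIR-type model and all numbers are illustrative.

```python
# Monte Carlo propagation of a random transmission rate through an
# epidemic ODE; reports the first two moments of the attack fraction.
import numpy as np

rng = np.random.default_rng(3)

def sir_final_infected(beta, gamma=0.1, days=160, dt=0.1):
    s, i = 0.99, 0.01                       # susceptible, infected fractions
    for _ in range(int(days / dt)):         # forward Euler integration
        ds = -beta * s * i
        di = beta * s * i - gamma * i
        s, i = s + dt * ds, i + dt * di
    return 1.0 - s                          # cumulative attack fraction

betas = rng.normal(0.3, 0.05, size=500)     # uncertain transmission rate
outputs = np.array([sir_final_infected(b) for b in betas])
print("mean:", outputs.mean(), "variance:", outputs.var())
```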

  10. The promise and challenge of high-throughput sequencing of the antibody repertoire

    PubMed Central

    Georgiou, George; Ippolito, Gregory C; Beausang, John; Busse, Christian E; Wardemann, Hedda; Quake, Stephen R

    2014-01-01

    Efforts to determine the antibody repertoire encoded by B cells in the blood or lymphoid organs using high-throughput DNA sequencing technologies have been advancing at an extremely rapid pace and are transforming our understanding of humoral immune responses. Information gained from high-throughput DNA sequencing of immunoglobulin genes (Ig-seq) can be applied to detect B-cell malignancies with high sensitivity, to discover antibodies specific for antigens of interest, to guide vaccine development and to understand autoimmunity. Rapid progress in the development of experimental protocols and informatics analysis tools is helping to reduce sequencing artifacts, to achieve more precise quantification of clonal diversity and to extract the most pertinent biological information. That said, broader application of Ig-seq, especially in clinical settings, will require the development of a standardized experimental design framework that will enable the sharing and meta-analysis of sequencing data generated by different laboratories. PMID:24441474

  11. A transfer matrix approach to vibration localization in mistuned blade assemblies

    NASA Technical Reports Server (NTRS)

    Ottarson, Gisli; Pierre, Christophe

    1993-01-01

    A study of mode localization in mistuned bladed disks is performed using transfer matrices. The transfer matrix approach yields the free response of a general, mono-coupled, perfectly cyclic assembly in closed form. A mistuned structure is represented by random transfer matrices, and the expansion of these matrices in terms of the small mistuning parameter leads to the definition of a measure of sensitivity to mistuning. An approximation of the localization factor, the spatially averaged rate of exponential attenuation per blade-disk sector, is obtained through perturbation techniques in the limits of high and low sensitivity. The methodology is applied to a common model of a bladed disk and the results verified by Monte Carlo simulations. The easily calculated sensitivity measure may prove to be a valuable design tool due to its system-independent quantification of mistuning effects such as mode localization.
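
    A minimal Monte Carlo check of the localization factor, assuming a generic mono-coupled chain recursion x_{n+1} = a_n x_n - x_{n-1} rather than the paper's specific blade-disk model: the average log-growth of a product of random 2x2 transfer matrices estimates the exponential attenuation rate per sector.

      import numpy as np

      rng = np.random.default_rng(0)
      n_blades = 100_000
      sigma = 0.05          # assumed mistuning standard deviation
      a0 = 2 * np.cos(0.3)  # tuned pass-band parameter (illustrative)

      v = np.array([1.0, 0.0])
      log_growth = 0.0
      for _ in range(n_blades):
          a = a0 + sigma * rng.standard_normal()   # random mistuned sector
          T = np.array([[a, -1.0], [1.0, 0.0]])    # mono-coupled transfer matrix
          v = T @ v
          norm = np.linalg.norm(v)
          log_growth += np.log(norm)               # accumulate exponential growth
          v /= norm                                # renormalize to avoid overflow

      gamma = log_growth / n_blades
      print(f"localization factor ~ {gamma:.4f} (attenuation per sector)")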

  12. Intravital assessment of myelin molecular order with polarimetric multiphoton microscopy

    NASA Astrophysics Data System (ADS)

    Turcotte, Raphaël; Rutledge, Danette J.; Bélanger, Erik; Dill, Dorothy; Macklin, Wendy B.; Côté, Daniel C.

    2016-08-01

    Myelin plays an essential role in the nervous system and its disruption in diseases such as multiple sclerosis may lead to neuronal death, thus causing irreversible functional impairments. Understanding myelin biology is therefore of fundamental and clinical importance, but no tools currently exist to describe the fine spatial organization of myelin sheaths in vivo. Here we demonstrate intravital quantification of the myelin molecular structure using a microscopy method based on polarization-resolved coherent Raman scattering. Developmental myelination was imaged noninvasively in live zebrafish. Longitudinal imaging of individual axons revealed changes in myelin organization beyond the diffraction limit. Applied to promyelination drug screening, the method uniquely enabled the identification of focal myelin regions with differential architectures. These observations indicate that the study of myelin biology and the identification of therapeutic compounds will largely benefit from a method to quantify the myelin molecular organization in vivo.

  13. Nanoscale monitoring of drug actions on cell membrane using atomic force microscopy

    PubMed Central

    Li, Mi; Liu, Lian-qing; Xi, Ning; Wang, Yue-chao

    2015-01-01

    Knowledge of the nanoscale changes that take place in individual cells in response to a drug is useful for understanding the drug action. However, due to the lack of adequate techniques, such knowledge was scarce until the advent of atomic force microscopy (AFM), which is a multifunctional tool for investigating cellular behavior with nanometer resolution under near-physiological conditions. In the past decade, researchers have applied AFM to monitor the morphological and mechanical dynamics of individual cells following drug stimulation, yielding considerable novel insight into how the drug molecules affect an individual cell at the nanoscale. In this article we summarize the representative applications of AFM in characterization of drug actions on cell membrane, including topographic imaging, elasticity measurements, molecular interaction quantification, native membrane protein imaging and manipulation, etc. The challenges that are hampering the further development of AFM for studies of cellular activities are also discussed. PMID:26027658

  14. Edinburgh Working Papers in Applied Linguistics, 1998.

    ERIC Educational Resources Information Center

    Parkinson, Brian, Ed.

    1998-01-01

    Papers on applied linguistics and language pedagogy include: "Non-Exact Quantification in Slide Presentations of Medical Research" (Ron Howard); "Modality and Point of View: A Contrastive Analysis of Japanese Wartime and Peacetime Newspaper Discourse" (Noriko Iwamoto); "Classroom Transcripts and 'Noticing' in Teacher Education" (Tony Lynch);…

  15. Quantification of applied dose in irradiated citrus fruits by DNA Comet Assay together with image analysis.

    PubMed

    Cetinkaya, Nurcan; Ercin, Demet; Özvatan, Sümer; Erel, Yakup

    2016-02-01

    The experiments were conducted to quantify the applied dose for quarantine control in irradiated citrus fruits. Citrus fruits were exposed to doses of 0.1 to 1.5 kGy and analyzed by the DNA Comet Assay. Observed comets were evaluated by image analysis. The tail length, tail moment, and tail DNA% of the comets were used for the interpretation of comets. Irradiated citrus fruits showed tails separated from the head of the comet with increasing applied doses from 0.1 to 1.5 kGy. The mean tail length and mean tail moment of irradiated citrus fruits at all doses are significantly different (p < 0.01) from the control, even for the lowest dose of 0.1 kGy. Thus, the DNA Comet Assay may be a practical quarantine control method for irradiated citrus fruits, since it has been possible to estimate applied doses as low as 0.1 kGy when the assay is combined with image analysis. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. Stormbow: A Cloud-Based Tool for Reads Mapping and Expression Quantification in Large-Scale RNA-Seq Studies

    PubMed Central

    Zhao, Shanrong; Prenger, Kurt; Smith, Lance

    2013-01-01

    RNA-Seq is becoming a promising replacement for microarrays in transcriptome profiling and differential gene expression studies. Technical improvements have decreased sequencing costs and, as a result, the size and number of RNA-Seq datasets have increased rapidly. However, the increasing volume of data from large-scale RNA-Seq studies poses a practical challenge for data analysis in a local environment. To meet this challenge, we developed Stormbow, a cloud-based software package, to process large volumes of RNA-Seq data in parallel. The performance of Stormbow has been tested by practically applying it to analyse 178 RNA-Seq samples in the cloud. In our test, it took 6 to 8 hours to process an RNA-Seq sample with 100 million reads, and the average cost was $3.50 per sample. Utilizing Amazon Web Services as the infrastructure for Stormbow allows us to easily scale up to handle large datasets with on-demand computational resources. Stormbow is a scalable, cost-effective, open-source tool for large-scale RNA-Seq data analysis. Stormbow can be freely downloaded and used out of the box to process Illumina RNA-Seq datasets. PMID:25937948

  17. Sustainable Urban Forestry Potential Based Quantitative And Qualitative Measurement Using Geospatial Technique

    NASA Astrophysics Data System (ADS)

    Rosli, A. Z.; Reba, M. N. M.; Roslan, N.; Room, M. H. M.

    2014-02-01

    In order to maintain the stability of natural ecosystems around urban areas, urban forestry is a key initiative for maintaining and controlling green space in our country. Integration between remote sensing (RS) and geospatial information systems (GIS) serves as an effective tool for monitoring environmental changes and for planning, managing, and developing sustainable urbanization. This paper aims to assess the capability of integrated RS and GIS to identify potential urban forestry sites, based on qualitative and quantitative criteria, using priority parameter ranking in the new township of Nusajaya. A SPOT image was used to provide high spatial accuracy, while maps of topography, land use, soil groups, hydrology, a Digital Elevation Model (DEM), and soil series data were applied to enhance the satellite image in detecting and locating present attributes and features on the ground. The Multi-Criteria Decision-Making (MCDM) technique provides structured, pairwise quantification and comparison of elements and criteria for priority ranking for urban forestry purposes. Slope, soil texture, drainage, spatial area, availability of natural resources, and vicinity to urban areas are the criteria considered in this study. This study highlighted that priority-ranking MCDM is a cost-effective tool for decision-making in urban forestry planning and landscaping.
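
    As a sketch of the final ranking step only (the paper builds its priorities from pairwise comparisons; the weights and scores below are invented, not the study's data), a weighted-sum MCDM score over the six criteria can be computed as:

      import numpy as np

      criteria = ["slope", "soil_texture", "drainage", "area",
                  "natural_resources", "urban_vicinity"]
      weights = np.array([0.20, 0.15, 0.15, 0.15, 0.20, 0.15])  # must sum to 1

      # Normalized suitability scores (0-1) per candidate site, one row per site
      scores = np.array([
          [0.8, 0.6, 0.7, 0.9, 0.5, 0.8],   # site A
          [0.4, 0.9, 0.8, 0.6, 0.7, 0.5],   # site B
          [0.9, 0.5, 0.6, 0.7, 0.9, 0.4],   # site C
      ])

      priority = scores @ weights
      for name, p in zip("ABC", priority):
          print(f"site {name}: priority score = {p:.3f}")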

  18. Markov vs. Hurst-Kolmogorov behaviour identification in hydroclimatic processes

    NASA Astrophysics Data System (ADS)

    Dimitriadis, Panayiotis; Gournari, Naya; Koutsoyiannis, Demetris

    2016-04-01

    Hydroclimatic processes are usually modelled either by exponential decay of the autocovariance function, i.e., Markovian behaviour, or by power-type decay, i.e., long-term persistence (also known as Hurst-Kolmogorov behaviour). For the identification and quantification of such behaviours, several graphical stochastic tools can be used, such as the climacogram (i.e., a plot of the variance of the averaged process vs. scale), the autocovariance, the variogram, and the power spectrum, with the climacogram usually exhibiting smaller statistical uncertainty than the others. However, most methodologies involving these tools are based on the expected value of the process. In this analysis, we explore a methodology that combines both the practical use of a graphical representation of the internal structure of the process and the statistical robustness of maximum-likelihood estimation. For validation and illustration purposes, we apply this methodology to fundamental stochastic processes, such as Markov and Hurst-Kolmogorov type processes. Acknowledgement: This research is conducted within the frame of the undergraduate course "Stochastic Methods in Water Resources" of the National Technical University of Athens (NTUA). The School of Civil Engineering of NTUA provided moral support for the participation of the students in the Assembly.
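
    A minimal climacogram computation under its usual definition (variance of the block-averaged series versus averaging scale), contrasting synthetic white noise with an AR(1) (Markov) series; for white noise the variance drops roughly as 1/k, while Hurst-Kolmogorov behaviour would decay more slowly.

      import numpy as np

      def climacogram(x, scales):
          out = []
          for k in scales:
              n = len(x) // k
              means = x[:n * k].reshape(n, k).mean(axis=1)  # block averages at scale k
              out.append(means.var(ddof=1))
          return np.array(out)

      rng = np.random.default_rng(1)
      N = 2**16
      white = rng.standard_normal(N)
      ar1 = np.empty(N); ar1[0] = 0.0
      rho = 0.8
      for i in range(1, N):                 # AR(1): exponential autocovariance decay
          ar1[i] = rho * ar1[i-1] + rng.standard_normal()

      scales = [1, 2, 4, 8, 16, 32, 64, 128]
      print("scale  white     AR(1)")
      for k, vw, va in zip(scales, climacogram(white, scales), climacogram(ar1, scales)):
          print(f"{k:5d}  {vw:.4f}  {va:.4f}")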

  19. Stormbow: A Cloud-Based Tool for Reads Mapping and Expression Quantification in Large-Scale RNA-Seq Studies.

    PubMed

    Zhao, Shanrong; Prenger, Kurt; Smith, Lance

    2013-01-01

    RNA-Seq is becoming a promising replacement for microarrays in transcriptome profiling and differential gene expression studies. Technical improvements have decreased sequencing costs and, as a result, the size and number of RNA-Seq datasets have increased rapidly. However, the increasing volume of data from large-scale RNA-Seq studies poses a practical challenge for data analysis in a local environment. To meet this challenge, we developed Stormbow, a cloud-based software package, to process large volumes of RNA-Seq data in parallel. The performance of Stormbow has been tested by practically applying it to analyse 178 RNA-Seq samples in the cloud. In our test, it took 6 to 8 hours to process an RNA-Seq sample with 100 million reads, and the average cost was $3.50 per sample. Utilizing Amazon Web Services as the infrastructure for Stormbow allows us to easily scale up to handle large datasets with on-demand computational resources. Stormbow is a scalable, cost-effective, open-source tool for large-scale RNA-Seq data analysis. Stormbow can be freely downloaded and used out of the box to process Illumina RNA-Seq datasets.

  20. Improving Soil Seed Bank Management.

    PubMed

    Haring, Steven C; Flessner, Michael L

    2018-05-08

    Problems associated with simplified weed management motivate efforts for diversification. Integrated weed management uses fundamentals of weed biology and applied ecology to provide a framework for diversified weed management programs; the soil seed bank comprises a necessary part of this framework. By targeting seeds, growers can inhibit the propagule pressure on which annual weeds depend for agricultural invasion. Some current management practices affect weed seed banks, such as crop rotation and tillage, but these tools are often used without specific intention to manage weed seeds. Difficulties quantifying the weed seed bank, understanding seed bank phenology, and linking seed banks to emerged weed communities challenge existing soil seed bank management practices. Improved seed bank quantification methods could include DNA profiling of the soil seed bank, mark and recapture, or 3D LIDAR mapping. Successful and sustainable soil seed bank management must constrain functionally diverse and changing weed communities. Harvest weed seed controls represent a step forward, but over-reliance on this singular technique could make it short-lived. Researchers must explore tools inspired by other pest management disciplines, such as gene drives or habitat modification for predatory organisms. Future weed seed bank management will combine multiple complementary practices that enhance diverse agroecosystems. This article is protected by copyright. All rights reserved.

  1. Droplet Digital™ PCR Next-Generation Sequencing Library QC Assay.

    PubMed

    Heredia, Nicholas J

    2018-01-01

    Digital PCR is a valuable tool for quantifying next-generation sequencing (NGS) libraries precisely and accurately. Accurate quantification of NGS libraries enables accurate loading of the libraries onto the sequencer and thus improves sequencing performance by reducing under- and overloading errors. Accurate quantification also benefits users by enabling uniform loading of indexed/barcoded libraries, which in turn greatly improves sequencing uniformity of the indexed/barcoded samples. The advantages gained by employing the Droplet Digital PCR (ddPCR™) library QC assay include precise and accurate quantification in addition to size quality assessment, enabling users to QC their sequencing libraries with confidence.
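
    The absolute quantification underlying ddPCR rests on a Poisson correction: with a fraction p of positive droplets, the mean number of copies per droplet is lambda = -ln(1 - p). A minimal sketch (the ~0.85 nL droplet volume is a nominal assumption, not a value from this record):

      import math

      def ddpcr_concentration(n_positive, n_total, droplet_volume_ul=0.00085):
          """Return target copies per microliter from droplet counts."""
          p = n_positive / n_total
          lam = -math.log(1.0 - p)      # mean copies per droplet (Poisson)
          return lam / droplet_volume_ul

      # Hypothetical run: 4,500 positive droplets out of 18,000 accepted
      print(f"{ddpcr_concentration(4500, 18000):.1f} copies/uL")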

  2. A novel strategy using MASCOT Distiller for analysis of cleavable isotope-coded affinity tag data to quantify protein changes in plasma.

    PubMed

    Leung, Kit-Yi; Lescuyer, Pierre; Campbell, James; Byers, Helen L; Allard, Laure; Sanchez, Jean-Charles; Ward, Malcolm A

    2005-08-01

    A novel strategy consisting of cleavable Isotope-Coded Affinity Tags (cICAT) combined with MASCOT Distiller was evaluated as a tool for the quantification of proteins in "abnormal" patient plasma, prepared by pooling samples from patients with acute stroke. Quantification of all light and heavy cICAT-labelled peptide ion pairs was obtained using MASCOT Distiller combined with proprietary software. Peptides displaying differences were selected for identification by MS. These preliminary results show the promise of our approach for identifying potential biomarkers.

  3. Simultaneous digital quantification and fluorescence-based size characterization of massively parallel sequencing libraries.

    PubMed

    Laurie, Matthew T; Bertout, Jessica A; Taylor, Sean D; Burton, Joshua N; Shendure, Jay A; Bielas, Jason H

    2013-08-01

    Due to the high cost of failed runs and suboptimal data yields, quantification and determination of fragment size range are crucial steps in the library preparation process for massively parallel sequencing (or next-generation sequencing). Current library quality control methods commonly involve quantification using real-time quantitative PCR and size determination using gel or capillary electrophoresis. These methods are laborious and subject to a number of significant limitations that can make library calibration unreliable. Herein, we propose and test an alternative method for quality control of sequencing libraries using droplet digital PCR (ddPCR). By exploiting a correlation we have discovered between droplet fluorescence and amplicon size, we achieve the joint quantification and size determination of target DNA with a single ddPCR assay. We demonstrate the accuracy and precision of applying this method to the preparation of sequencing libraries.

  4. Good quantification practices of flavours and fragrances by mass spectrometry.

    PubMed

    Begnaud, Frédéric; Chaintreau, Alain

    2016-10-28

    Over the past 15 years, chromatographic techniques with mass spectrometric detection have been increasingly used to monitor the rapidly expanding list of regulated flavour and fragrance ingredients. This trend entails a need for good quantification practices suitable for complex media, especially for multi-analyte methods. In this article, we present the experimental precautions needed to perform the analyses and ways to process the data according to the most recent approaches. This notably includes the identification of analytes during their quantification and method validation, when applied to real matrices, based on accuracy profiles. A brief survey of application studies based on such practices is given. This article is part of the themed issue 'Quantitative mass spectrometry'. © 2016 The Authors.

  5. LC-MS/MS quantification of next-generation biotherapeutics: a case study for an IgE binding Nanobody in cynomolgus monkey plasma.

    PubMed

    Sandra, Koen; Mortier, Kjell; Jorge, Lucie; Perez, Luis C; Sandra, Pat; Priem, Sofie; Poelmans, Sofie; Bouche, Marie-Paule

    2014-05-01

    Nanobodies® are therapeutic proteins derived from the smallest functional fragments of heavy chain-only antibodies. The development and validation of an LC-MS/MS-based method for the quantification of an IgE binding Nanobody in cynomolgus monkey plasma is presented. Nanobody quantification was performed making use of a proteotypic tryptic peptide chromatographically enriched prior to LC-MS/MS analysis. The validated LLOQ at 36 ng/ml was measured with an intra- and inter-assay precision and accuracy <20%. The required sensitivity could be obtained based on the selectivity of 2D LC combined with MS/MS. No analyte specific tools for affinity purification were used. Plasma samples originating from a PK/PD study were analyzed and compared with the results obtained with a traditional ligand-binding assay. Excellent correlations between the two techniques were obtained, and similar PK parameters were estimated. A 2D LC-MS/MS method was successfully developed and validated for the quantification of a next generation biotherapeutic.

  6. Lamb Wave Damage Quantification Using GA-Based LS-SVM.

    PubMed

    Sun, Fuqiang; Wang, Ning; He, Jingjing; Guan, Xuefei; Yang, Jinsong

    2017-06-12

    Lamb waves have been reported to be an efficient tool for non-destructive evaluations (NDE) in various application scenarios. However, accurate and reliable damage quantification using the Lamb wave method is still a practical challenge, due to the complex underlying mechanism of Lamb wave propagation and damage detection. This paper presents a Lamb wave damage quantification method using a least square support vector machine (LS-SVM) and a genetic algorithm (GA). Three damage sensitive features, namely, normalized amplitude, phase change, and correlation coefficient, were proposed to describe changes of Lamb wave characteristics caused by damage. In view of commonly used data-driven methods, the GA-based LS-SVM model using the proposed three damage sensitive features was implemented to evaluate the crack size. The GA method was adopted to optimize the model parameters. The results of GA-based LS-SVM were validated using coupon test data and lap joint component test data with naturally developed fatigue cracks. Cases of different loading conditions and manufacturers were also included to further verify the robustness of the proposed method for crack quantification.
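
    A compact, hedged sketch of the two ingredients: an LS-SVM regressor (Suykens-style bordered linear system with an RBF kernel) whose hyperparameters are tuned by a toy mutation-only genetic loop. The three input features stand in for the paper's damage-sensitive features; the data and GA settings are invented.

      import numpy as np

      def rbf(A, B, sigma):
          d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
          return np.exp(-d2 / (2 * sigma**2))

      def lssvm_fit(X, y, gamma, sigma):
          n = len(y)
          M = np.zeros((n + 1, n + 1))
          M[0, 1:] = 1.0
          M[1:, 0] = 1.0
          M[1:, 1:] = rbf(X, X, sigma) + np.eye(n) / gamma  # regularized kernel block
          sol = np.linalg.solve(M, np.concatenate(([0.0], y)))
          return sol[0], sol[1:]                            # bias b, dual coefficients

      def lssvm_predict(X, Xtr, b, alpha, sigma):
          return rbf(X, Xtr, sigma) @ alpha + b

      rng = np.random.default_rng(2)
      X = rng.uniform(0, 1, (60, 3))                        # 3 stand-in damage features
      y = 2.0 * X[:, 0] + np.sin(3 * X[:, 1]) + 0.1 * rng.standard_normal(60)
      Xtr, ytr, Xva, yva = X[:40], y[:40], X[40:], y[40:]

      # Toy GA: evolve log10(gamma), log10(sigma) to minimize validation RMSE
      pop = rng.uniform([-1, -1], [3, 1], (20, 2))
      for _ in range(30):
          fit = []
          for g, s in pop:
              b, a = lssvm_fit(Xtr, ytr, 10**g, 10**s)
              pred = lssvm_predict(Xva, Xtr, b, a, 10**s)
              fit.append(np.sqrt(((pred - yva) ** 2).mean()))
          parents = pop[np.argsort(fit)[:10]]               # elitist selection
          children = parents + 0.1 * rng.standard_normal(parents.shape)  # mutation
          pop = np.vstack([parents, children])
      print("best validation RMSE:", min(fit))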

  7. Quantification of free circulating tumor DNA as a diagnostic marker for breast cancer.

    PubMed

    Catarino, Raquel; Ferreira, Maria M; Rodrigues, Helena; Coelho, Ana; Nogal, Ana; Sousa, Abreu; Medeiros, Rui

    2008-08-01

    To determine whether the amounts of circulating DNA could discriminate between breast cancer patients and healthy individuals using real-time PCR quantification methodology. Our standard protocol for quantification of cell-free plasma DNA involved 175 consecutive patients with breast cancer and 80 healthy controls. We found increased levels of circulating DNA in breast cancer patients compared to control individuals (105.2 vs. 77.06 ng/mL, p < 0.001). We also found statistically significant differences in circulating DNA amounts in patients before and after breast surgery (105.2 vs. 59.0 ng/mL, p = 0.001). Increased plasma cell-free DNA concentration was a strong risk factor for breast cancer, conferring an increased risk for the presence of this disease (OR, 12.32; 95% CI, 2.09-52.28; p < 0.001). Quantification of circulating DNA by real-time PCR may be a good and simple tool for detection of breast cancer, with potential for clinical applicability together with other current methods used for monitoring the disease.

  8. Lamb Wave Damage Quantification Using GA-Based LS-SVM

    PubMed Central

    Sun, Fuqiang; Wang, Ning; He, Jingjing; Guan, Xuefei; Yang, Jinsong

    2017-01-01

    Lamb waves have been reported to be an efficient tool for non-destructive evaluations (NDE) in various application scenarios. However, accurate and reliable damage quantification using the Lamb wave method is still a practical challenge, due to the complex underlying mechanism of Lamb wave propagation and damage detection. This paper presents a Lamb wave damage quantification method using a least square support vector machine (LS-SVM) and a genetic algorithm (GA). Three damage sensitive features, namely, normalized amplitude, phase change, and correlation coefficient, were proposed to describe changes of Lamb wave characteristics caused by damage. In view of commonly used data-driven methods, the GA-based LS-SVM model using the proposed three damage sensitive features was implemented to evaluate the crack size. The GA method was adopted to optimize the model parameters. The results of GA-based LS-SVM were validated using coupon test data and lap joint component test data with naturally developed fatigue cracks. Cases of different loading conditions and manufacturers were also included to further verify the robustness of the proposed method for crack quantification. PMID:28773003

  9. Multilevel Model Prediction

    ERIC Educational Resources Information Center

    Frees, Edward W.; Kim, Jee-Seon

    2006-01-01

    Multilevel models are proven tools in social research for modeling complex, hierarchical systems. In multilevel modeling, statistical inference is based largely on quantification of random variables. This paper distinguishes among three types of random variables in multilevel modeling--model disturbances, random coefficients, and future response…

  10. Evaluation of digital PCR for absolute RNA quantification.

    PubMed

    Sanders, Rebecca; Mason, Deborah J; Foy, Carole A; Huggett, Jim F

    2013-01-01

    Gene expression measurements detailing mRNA quantities are widely employed in molecular biology and are increasingly important in diagnostic fields. Reverse transcription (RT), necessary for generating complementary DNA, can be both inefficient and imprecise, but remains a quintessential step in RNA analysis by qPCR. This study developed a Transcriptomic Calibration Material and assessed the RT reaction using digital (d)PCR for RNA measurement. While many studies characterise dPCR capabilities for DNA quantification, less work has been performed investigating similar parameters using RT-dPCR for RNA analysis. RT-dPCR measurement using three one-step RT-qPCR kits was evaluated in single and multiplex formats when measuring endogenous and synthetic RNAs. The best performing kit was compared to UV quantification, and sensitivity and technical reproducibility were investigated. Our results demonstrate that RT-dPCR measurements were assay- and kit-dependent and differed significantly from UV quantification. Different values were reported by different kits for each target, despite evaluation of identical samples on the same instrument. RT-dPCR did not display the strong inter-assay agreement previously described when analysing DNA. This study demonstrates that, as with DNA measurement, RT-dPCR is capable of accurate quantification of low-copy RNA targets, but the results are both kit and target dependent, supporting the need for calibration controls.

  11. Digital PCR as a tool to measure HIV persistence.

    PubMed

    Rutsaert, Sofie; Bosman, Kobus; Trypsteen, Wim; Nijhuis, Monique; Vandekerckhove, Linos

    2018-01-30

    Although antiretroviral therapy is able to suppress HIV replication in infected patients, the virus persists and rebounds when treatment is stopped. In order to find a cure that can eradicate the latent reservoir, one must be able to quantify the persisting virus. Traditionally, HIV persistence studies have used real-time PCR (qPCR) to measure the viral reservoir represented by HIV DNA and RNA. More recently, digital PCR has been gaining popularity as a novel approach to nucleic acid quantification, as it allows absolute target quantification. Various commercial digital PCR platforms that implement the principle of digital PCR are nowadays available, of which Bio-Rad's QX200 ddPCR is currently the most used platform in HIV research. Quantification of HIV by digital PCR is proving to be a valuable improvement over qPCR, as it is argued to be more robust to mismatches between the primer-probe set and heterogeneous HIV, and forfeits the need for a standard curve, both of which are known to complicate reliable quantification. However, currently available digital PCR platforms occasionally struggle with unexplained false-positive partitions, and reliable segregation between positive and negative droplets remains disputed. Future developments and advancements of digital PCR technology promise to aid in the accurate quantification and characterization of the persistent HIV reservoir.

  12. Recommendations for Improving Identification and Quantification in Non-Targeted, GC-MS-Based Metabolomic Profiling of Human Plasma

    PubMed Central

    Wang, Hanghang; Muehlbauer, Michael J.; O’Neal, Sara K.; Newgard, Christopher B.; Hauser, Elizabeth R.; Shah, Svati H.

    2017-01-01

    The field of metabolomics as applied to human disease and health is rapidly expanding. In recent efforts of metabolomics research, greater emphasis has been placed on quality control and method validation. In this study, we report an experience with quality control and a practical application of method validation. Specifically, we sought to identify and modify steps in gas chromatography-mass spectrometry (GC-MS)-based, non-targeted metabolomic profiling of human plasma that could influence metabolite identification and quantification. Our experimental design included two studies: (1) a limiting-dilution study, which investigated the effects of dilution on analyte identification and quantification; and (2) a concentration-specific study, which compared the optimal plasma extract volume established in the first study with the volume used in the current institutional protocol. We confirmed that contaminants, concentration, repeatability and intermediate precision are major factors influencing metabolite identification and quantification. In addition, we established methods for improved metabolite identification and quantification, which were summarized to provide recommendations for experimental design of GC-MS-based non-targeted profiling of human plasma. PMID:28841195

  13. AVQS: Attack Route-Based Vulnerability Quantification Scheme for Smart Grid

    PubMed Central

    Lim, Hyunwoo; Lee, Seokjun; Shon, Taeshik

    2014-01-01

    A smart grid is a large, consolidated electrical grid system that includes heterogeneous networks and systems. Because of this structure, a smart grid system faces potential security threats through its network connectivity. To address this problem, we develop and apply a novel scheme to measure vulnerability in a smart grid domain. Vulnerability quantification can be the first step in security analysis because it helps prioritize security problems. However, existing vulnerability quantification schemes are not suitable for smart grids because they do not consider network vulnerabilities. We propose a novel attack route-based vulnerability quantification scheme using a network vulnerability score and an end-to-end security score, depending on the specific smart grid network environment, to calculate the vulnerability score for a particular attack route. To evaluate the proposed approach, we derive several attack scenarios from the advanced metering infrastructure domain. The experimental results of the proposed approach and the existing common vulnerability scoring system clearly show that network connectivity must be considered for more optimized vulnerability quantification. PMID:25152923

  14. Development and Validation of a Taxonomy for Characterizing Measurements in Health Self-Quantification.

    PubMed

    Almalki, Manal; Gray, Kathleen; Martin-Sanchez, Fernando

    2017-11-03

    The use of wearable tools for health self-quantification (SQ) introduces new ways of thinking about one's body and about how to achieve desired health outcomes. Measurements from individuals, such as heart rate, respiratory volume, skin temperature, sleep, mood, blood pressure, food consumed, and quality of surrounding air can be acquired, quantified, and aggregated in a holistic way that has never been possible before. However, health SQ still lacks a formal common language or taxonomy for describing these kinds of measurements. Establishing such a taxonomy is important because it would enable the systematic investigations needed to advance the use of wearable tools in health self-care. For a start, a taxonomy would help to improve the accuracy of database searching when doing systematic reviews and meta-analyses in this field. Overall, more systematic research would contribute to building evidence of sufficient quality to determine whether and how health SQ is a worthwhile health care paradigm. The aim of this study was to investigate a sample of SQ tools and services to build and test a taxonomy of measurements in health SQ, titled the classification of data and activity in self-quantification systems (CDA-SQS). Eight health SQ tools and services were selected to be examined: Zeo Sleep Manager, Fitbit Ultra, Fitlinxx Actipressure, MoodPanda, iBGStar, Sensaris Senspod, 23andMe, and uBiome. An open coding analytical approach was used to find all the themes related to the research aim. This study distinguished three types of measurements in health SQ: body structures and functions, body actions and activities, and around the body. The CDA-SQS classification should be applicable to align health SQ measurement data from people with many different health objectives, health states, and health conditions. CDA-SQS is a critical contribution to a much more consistent way of studying health SQ. ©Manal Almalki, Kathleen Gray, Fernando Martin-Sanchez. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 03.11.2017.

  15. Development and Validation of a Taxonomy for Characterizing Measurements in Health Self-Quantification

    PubMed Central

    2017-01-01

    Background The use of wearable tools for health self-quantification (SQ) introduces new ways of thinking about one's body and about how to achieve desired health outcomes. Measurements from individuals, such as heart rate, respiratory volume, skin temperature, sleep, mood, blood pressure, food consumed, and quality of surrounding air can be acquired, quantified, and aggregated in a holistic way that has never been possible before. However, health SQ still lacks a formal common language or taxonomy for describing these kinds of measurements. Establishing such a taxonomy is important because it would enable the systematic investigations needed to advance the use of wearable tools in health self-care. For a start, a taxonomy would help to improve the accuracy of database searching when doing systematic reviews and meta-analyses in this field. Overall, more systematic research would contribute to building evidence of sufficient quality to determine whether and how health SQ is a worthwhile health care paradigm. Objective The aim of this study was to investigate a sample of SQ tools and services to build and test a taxonomy of measurements in health SQ, titled the classification of data and activity in self-quantification systems (CDA-SQS). Methods Eight health SQ tools and services were selected to be examined: Zeo Sleep Manager, Fitbit Ultra, Fitlinxx Actipressure, MoodPanda, iBGStar, Sensaris Senspod, 23andMe, and uBiome. An open coding analytical approach was used to find all the themes related to the research aim. Results This study distinguished three types of measurements in health SQ: body structures and functions, body actions and activities, and around the body. Conclusions The CDA-SQS classification should be applicable to align health SQ measurement data from people with many different health objectives, health states, and health conditions. CDA-SQS is a critical contribution to a much more consistent way of studying health SQ. PMID:29101092

  16. Quantification of 2D elemental distribution maps of intermediate-thick biological sections by low energy synchrotron μ-X-ray fluorescence spectrometry

    NASA Astrophysics Data System (ADS)

    Kump, P.; Vogel-Mikuš, K.

    2018-05-01

    Two fundamental-parameter (FP) based models for the quantification of 2D elemental distribution maps of intermediate-thick biological samples by synchrotron low energy μ-X-ray fluorescence spectrometry (SR-μ-XRF) are presented and applied to elemental analysis in experiments with monochromatic focused photon beam excitation at two low energy X-ray fluorescence beamlines: TwinMic (Elettra Sincrotrone Trieste, Italy) and ID21 (ESRF, Grenoble, France). The models assume intermediate-thick biological samples composed of the measured elements, which are the sources of the measurable spectral lines, and a residual matrix, which affects the measured intensities through absorption. In the first model a fixed residual matrix of the sample is assumed, while in the second model the residual matrix is obtained by iterative refinement of the elemental concentrations and an adjusted residual matrix. The absorption of the incident focused beam in the biological sample at each scanned pixel position, determined from the output of a photodiode or a CCD camera, is applied as a control in the iterative quantification procedure.

  17. Quantification of transuranic elements by time interval correlation spectroscopy of the detected neutrons

    PubMed

    Baeten; Bruggeman; Paepen; Carchon

    2000-03-01

    The non-destructive quantification of transuranic elements in nuclear waste management or in safeguards verifications is commonly performed by passive neutron assay techniques. To minimise the number of unknown sample-dependent parameters, Neutron Multiplicity Counting (NMC) is applied. We developed a new NMC-technique, called Time Interval Correlation Spectroscopy (TICS), which is based on the measurement of Rossi-alpha time interval distributions. Compared to other NMC-techniques, TICS offers several advantages.

  18. The principles of quantification applied to in vivo proton MR spectroscopy.

    PubMed

    Helms, Gunther

    2008-08-01

    Following the identification of metabolite signals in the in vivo MR spectrum, quantification is the procedure to estimate numerical values of their concentrations. The two essential steps are discussed in detail: analysis by fitting a model of prior knowledge, that is, the decomposition of the spectrum into the signals of singular metabolites; then, normalization of these signals to yield concentration estimates. Special attention is given to using the in vivo water signal as internal reference.
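
    Internal water referencing reduces to a ratio of fitted signal amplitudes scaled by proton counts and an assumed tissue water concentration. A minimal sketch with illustrative numbers only; the water concentration is a nominal assumption and the relaxation corrections are lumped into a single factor set to 1 here:

      def metabolite_concentration(S_met, S_water, n_protons_met,
                                   water_conc_mM=35880.0, f_relax=1.0):
          """water_conc_mM ~ pure-water concentration scaled by assumed tissue
          water content; f_relax lumps T1/T2 relaxation corrections."""
          n_protons_water = 2
          return (S_met / S_water) * (n_protons_water / n_protons_met) \
                 * water_conc_mM * f_relax

      # Example: a 3-proton metabolite signal at 0.042% of the water signal
      print(f"{metabolite_concentration(0.00042, 1.0, 3):.1f} mM (illustrative)")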

  19. Universal Quantification in a Constraint-Based Planner

    NASA Technical Reports Server (NTRS)

    Golden, Keith; Frank, Jeremy; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Constraints and universal quantification are both useful in planning, but handling universally quantified constraints presents some novel challenges. We present a general approach to proving the validity of universally quantified constraints. The approach essentially consists of checking that the constraint is not violated for all members of the universe. We show that this approach can sometimes be applied even when variable domains are infinite, and we present some useful special cases where this can be done efficiently.
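
    The validity check described above reduces, for a finite universe, to verifying that no member violates the constraint; a toy sketch (the names are illustrative, not the planner's actual API):

      def holds_forall(universe, constraint):
          """Return True if constraint(x) holds for every x in universe."""
          return all(constraint(x) for x in universe)

      files = [{"name": "a.txt", "size": 10}, {"name": "b.txt", "size": 0}]
      # "All files in the directory are non-empty"
      print(holds_forall(files, lambda f: f["size"] > 0))   # False: b.txt violates it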

  20. CASL Dakota Capabilities Summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Simmons, Chris; Williams, Brian J.

    2017-10-10

    The Dakota software project serves the mission of Sandia National Laboratories and supports a worldwide user community by delivering state-of-the-art research and robust, usable software for optimization and uncertainty quantification. These capabilities enable advanced exploration and risk-informed prediction with a wide range of computational science and engineering models. Dakota is the verification and validation (V&V) / uncertainty quantification (UQ) software delivery vehicle for CASL, allowing analysts across focus areas to apply these capabilities to myriad nuclear engineering analyses.

  1. Targeted quantification of low ng/mL level proteins in human serum without immunoaffinity depletion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shi, Tujin; Sun, Xuefei; Gao, Yuqian

    2013-07-05

    We recently reported an antibody-free targeted protein quantification strategy, termed high-pressure, high-resolution separations with intelligent selection and multiplexing (PRISM), for achieving significantly enhanced sensitivity using selected reaction monitoring (SRM) mass spectrometry. Integrating PRISM with front-end IgY14 immunoaffinity depletion, sensitive detection of targeted proteins at 50-100 pg/mL levels in human blood plasma/serum was demonstrated. However, immunoaffinity depletion is often associated with undesired losses of target proteins of interest. Herein we report further evaluation of PRISM-SRM quantification of low-abundance serum proteins without immunoaffinity depletion and the multiplexing potential of this technique. Limits of quantification (LOQs) at low ng/mL levels with a median CV of ~12% were achieved for proteins spiked into human female serum using as little as 2 µL serum. PRISM-SRM provided up to ~1000-fold improvement in the LOQ when compared to conventional SRM measurements. Multiplexing capability of PRISM-SRM was also evaluated by two sets of serum samples with 6 and 21 target peptides spiked at the low attomole/µL levels. The results from SRM measurements for pooled or post-concatenated samples were comparable to those obtained from individual peptide fractions in terms of signal-to-noise ratios and SRM peak area ratios of light to heavy peptides. PRISM-SRM was applied to measure several ng/mL-level endogenous plasma proteins, including prostate-specific antigen, in clinical patient sera where correlation coefficients > 0.99 were observed between the results from PRISM-SRM and ELISA assays. Our results demonstrate that PRISM-SRM can be successfully used for quantification of low-abundance endogenous proteins in highly complex samples. Moderate throughput (50 samples/week) can be achieved by applying the post-concatenation or fraction multiplexing strategies. We anticipate broad applications for targeted PRISM-SRM quantification of low-abundance cellular proteins in systems biology studies as well as candidate biomarkers in biofluids.

  2. Accuracy of Rhenium-188 SPECT/CT activity quantification for applications in radionuclide therapy using clinical reconstruction methods.

    PubMed

    Esquinas, Pedro L; Uribe, Carlos F; Gonzalez, M; Rodríguez-Rodríguez, Cristina; Häfeli, Urs O; Celler, Anna

    2017-07-20

    The main applications of 188Re in radionuclide therapies include trans-arterial liver radioembolization and palliation of painful bone metastases. In order to optimize 188Re therapies, the accurate determination of the radiation dose delivered to tumors and organs at risk is required. Single photon emission computed tomography (SPECT) can be used to perform such dosimetry calculations. However, the accuracy of dosimetry estimates strongly depends on the accuracy of activity quantification in 188Re images. In this study, we performed a series of phantom experiments aiming to investigate the accuracy of activity quantification for 188Re SPECT using high-energy and medium-energy collimators. Objects of different shapes and sizes were scanned in air, non-radioactive water (cold water) and water with activity (hot water). The ordered subset expectation maximization algorithm with clinically available corrections (CT-based attenuation, triple-energy window (TEW) scatter, and resolution recovery) was used. For high activities, dead-time corrections were applied. The accuracy of activity quantification was evaluated using the ratio of the reconstructed activity in each object to this object's true activity. Each object's activity was determined with three segmentation methods: a 1% fixed threshold (for cold background), a 40% fixed threshold and a CT-based segmentation. Additionally, the activity recovered in the entire phantom, as well as the average activity concentration of the phantom background, were compared to their true values. Finally, Monte-Carlo simulations of a commercial γ-camera were performed to investigate the accuracy of the TEW method. Good quantification accuracy (errors <10%) was achieved for the entire phantom, the hot-background activity concentration and for objects in cold background segmented with a 1% threshold. However, the accuracy of activity quantification for objects segmented with 40% threshold or CT-based methods decreased (errors >15%), mostly due to partial-volume effects. The Monte-Carlo simulations confirmed that TEW scatter correction applied to 188Re, although practical, yields only approximate estimates of the true scatter.

  3. Contrast-enhanced ultrasonography for the detection of joint vascularity in arthritis--subjective grading versus computer-aided objective quantification.

    PubMed

    Klauser, A S; Franz, M; Bellmann Weiler, R; Gruber, J; Hartig, F; Mur, E; Wick, M C; Jaschke, W

    2011-12-01

    To compare joint inflammation assessment using subjective grading of power Doppler ultrasonography (PDUS) and contrast-enhanced ultrasonography (CEUS) versus computer-aided objective CEUS quantification. 37 joints of 28 patients with arthritis of different etiologies underwent B-mode ultrasonography, PDUS, and CEUS using a second-generation contrast agent. Synovial thickness, extent of vascularized pannus and intensity of vascularization were included in a 4-point PDUS and CEUS grading system. Subjective CEUS and PDUS scores were compared to computer-aided objective CEUS quantification using Qontrast® software for the calculation of the signal intensity (SI) and the ratio of SI for contrast enhancement. The interobserver agreement for subjective scoring was good to excellent (κ = 0.8 - 1.0; P < 0.0001). Computer-aided objective CEUS quantification correlated statistically significantly with subjective CEUS (P < 0.001) and PDUS grading (P < 0.05). The Qontrast® SI ratio correlated with subjective CEUS (P < 0.02) and PDUS grading (P < 0.03). Clinical activity did not correlate with vascularity or synovial thickening (P = n.s.), and no correlation between synovial thickening and vascularity extent could be found, neither using PDUS nor CEUS (P = n.s.). Both subjective CEUS grading and objective CEUS quantification are valuable for assessing joint vascularity in arthritis and computer-aided CEUS quantification may be a suitable objective tool for therapy follow-up in arthritis. © Georg Thieme Verlag KG Stuttgart · New York.

  4. MetaQuant: a tool for the automatic quantification of GC/MS-based metabolome data.

    PubMed

    Bunk, Boyke; Kucklick, Martin; Jonas, Rochus; Münch, Richard; Schobert, Max; Jahn, Dieter; Hiller, Karsten

    2006-12-01

    MetaQuant is a Java-based program for the automatic and accurate quantification of GC/MS-based metabolome data. In contrast to other programs, MetaQuant is able to quantify hundreds of substances simultaneously with minimal manual intervention. The integration of a self-acting calibration function allows fast, parallel calibration for several metabolites. Finally, MetaQuant is able to import GC/MS data in the common NetCDF format and to export the results of the quantification into Systems Biology Markup Language (SBML), Comma Separated Values (CSV) or Microsoft Excel (XLS) format. MetaQuant is written in Java and is available under an open source license. Precompiled packages for installation on Windows or Linux operating systems are freely available for download. The source code as well as the installation packages are available at http://bioinformatics.org/metaquant

  5. Experimental validation of 2D uncertainty quantification for DIC.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reu, Phillip L.

    Because digital image correlation (DIC) has become such an important and standard tool in the toolbox of experimental mechanicists, a complete uncertainty quantification of the method is needed. It should be remembered that each DIC setup and series of images will have a unique uncertainty based on the calibration quality and the image and speckle quality of the analyzed images. Any pretest work done with a calibrated DIC stereo-rig to quantify the errors using known shapes and translations, while useful, does not necessarily reveal the uncertainty of a later test. This is particularly true with high-speed applications, where actual test images are often less than ideal. Work on the mathematical underpinnings of DIC uncertainty quantification has previously been completed and published; this paper presents the corresponding experimental work used to check the validity of the uncertainty equations.

  6. Experimental validation of 2D uncertainty quantification for digital image correlation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reu, Phillip L.

    Because digital image correlation (DIC) has become such an important and standard tool in the toolbox of experimental mechanicists, a complete uncertainty quantification of the method is needed. It should be remembered that each DIC setup and series of images will have a unique uncertainty based on the calibration quality and the image and speckle quality of the analyzed images. Any pretest work done with a calibrated DIC stereo-rig to quantify the errors using known shapes and translations, while useful, does not necessarily reveal the uncertainty of a later test. This is particularly true with high-speed applications, where actual test images are often less than ideal. Work on the mathematical underpinnings of DIC uncertainty quantification has previously been completed and published; this paper presents the corresponding experimental work used to check the validity of the uncertainty equations.

  7. Simultaneous enantioselective quantification of fluoxetine and norfluoxetine in human milk by direct sample injection using 2-dimensional liquid chromatography-tandem mass spectrometry.

    PubMed

    Alvim, Joel; Lopes, Bianca Rebelo; Cass, Quezia Bezerra

    2016-06-17

    A two-dimensional liquid chromatography system coupled to a triple quadrupole tandem mass spectrometer (2D LC-MS/MS) was employed for the simultaneous quantification of fluoxetine (FLX) and norfluoxetine (NFLX) enantiomers in human milk by direct injection of samples. A restricted access media bovine serum albumin octadecyl column (RAM-BSA C18) was used in the first dimension for milk protein depletion, while an antibiotic-based chiral column was used in the second dimension. The results described herein show good selectivity, extraction efficiency, accuracy, and precision, with limits of quantification on the order of 7.5 ng/mL for the FLX enantiomers and 10.0 ng/mL for the NFLX enantiomers. Furthermore, the method represents a practical tool in terms of sustainability for the sample preparation of such a difficult matrix. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. Investigation of complexity dynamics in a DC glow discharge magnetized plasma using recurrence quantification analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitra, Vramori; Sarma, Bornali; Sarma, Arun

    Recurrence is a ubiquitous feature which provides deep insight into the dynamics of real dynamical systems. A suitable tool for investigating recurrences is recurrence quantification analysis (RQA). It allows, e.g., the detection of regime transitions with respect to varying control parameters. We investigate the complexity of different coexisting nonlinear dynamical regimes of the plasma floating potential fluctuations at different magnetic fields and discharge voltages by using recurrence quantification variables, in particular DET, Lmax, and Entropy. The recurrence analysis reveals that the predictability of the system strongly depends on the discharge voltage. Furthermore, the persistent behaviour of the plasma time series is characterized by the detrended fluctuation analysis technique to explore the complexity in terms of long-range correlation. The enhancement of the discharge voltage at constant magnetic field increases the nonlinear correlations; hence, the complexity of the system decreases, which corroborates the RQA analysis.
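
    A bare-bones RQA sketch assuming standard definitions: embed the series, threshold pairwise distances into a recurrence matrix, and compute DET as the fraction of recurrence points lying on diagonal lines of length >= lmin (the line of identity is included here for simplicity). The data are synthetic, not the paper's plasma signals.

      import numpy as np

      def embed(x, dim=3, tau=2):
          n = len(x) - (dim - 1) * tau
          return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

      def det_measure(x, eps=0.5, lmin=2):
          V = embed(x)
          D = np.linalg.norm(V[:, None, :] - V[None, :, :], axis=-1)
          R = (D <= eps).astype(int)        # recurrence matrix
          total = R.sum()
          diag_pts, n = 0, len(R)
          for k in range(-(n - 1), n):      # scan every diagonal for line segments
              run = 0
              for v in list(np.diagonal(R, offset=k)) + [0]:  # trailing 0 flushes run
                  if v:
                      run += 1
                  else:
                      if run >= lmin:
                          diag_pts += run
                      run = 0
          return diag_pts / total if total else 0.0

      rng = np.random.default_rng(3)
      t = np.linspace(0, 20 * np.pi, 600)
      print("sine  DET:", round(det_measure(np.sin(t)), 3))   # near 1: deterministic
      print("noise DET:", round(det_measure(rng.standard_normal(600)), 3))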

  9. Detection and quantification of serum or plasma HCV RNA: mini review of commercially available assays.

    PubMed

    Le Guillou-Guillemette, Helene; Lunel-Fabiani, Francoise

    2009-01-01

    The treatment schedule (combination of compounds, doses, and duration) and the virological follow-up for management of antiviral treatment in patients chronically infected by HCV are now well standardized, but to ensure good monitoring of treated patients, physicians need rapid, reproducible, and sensitive molecular virological tools with a wide range of detection and quantification of HCV RNA in blood samples. Several assays for detection and/or quantification of HCV RNA are currently commercially available. Here, all these assays are detailed, and a brief description of each step of the assay is provided. They are divided into two categories by method: those based on signal amplification and those based on target amplification. These two categories are then divided into qualitative detection, quantitative detection, and quantification assays. The real-time reverse-transcription polymerase chain reaction (RT-PCR)-based assays are the most promising strategy in the HCV virological area.

  10. A fault tree model to assess probability of contaminant discharge from shipwrecks.

    PubMed

    Landquist, H; Rosén, L; Lindhe, A; Norberg, T; Hassellöv, I-M; Lindgren, J F; Dahllöf, I

    2014-11-15

    Shipwrecks on the sea floor around the world may contain hazardous substances that can cause harm to the marine environment. Today there are no comprehensive methods for environmental risk assessment of shipwrecks, and thus there is poor support for decision-making on prioritization of mitigation measures. The purpose of this study was to develop a tool for quantitative risk estimation of potentially polluting shipwrecks, and in particular an estimation of the annual probability of hazardous substance discharge. The assessment of the probability of discharge is performed using fault tree analysis, facilitating quantification of the probability with respect to a set of identified hazardous events. This approach enables a structured assessment providing transparent uncertainty and sensitivity analyses. The model facilitates quantification of risk, quantification of the uncertainties in the risk calculation and identification of parameters to be investigated further in order to obtain a more reliable risk calculation. Copyright © 2014 Elsevier Ltd. All rights reserved.
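
    Under the usual independence assumptions, fault-tree gates reduce to simple probability algebra: an AND gate multiplies probabilities and an OR gate is one minus the product of complements. A minimal sketch with invented basic events, not the paper's model:

      from functools import reduce

      def and_gate(*p):
          return reduce(lambda a, b: a * b, p)

      def or_gate(*p):
          return 1.0 - reduce(lambda a, b: a * (1.0 - b), p, 1.0)

      p_corrosion_breach = 0.02     # hypothetical annual probabilities
      p_physical_impact = 0.005
      p_substance_present = 0.6

      # Top event: discharge = (corrosion breach OR impact) AND substance on board
      p_discharge = and_gate(or_gate(p_corrosion_breach, p_physical_impact),
                             p_substance_present)
      print(f"annual discharge probability ~ {p_discharge:.4f}")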

  11. A framework for optimization and quantification of uncertainty and sensitivity for developing carbon capture systems

    DOE PAGES

    Eslick, John C.; Ng, Brenda; Gao, Qianwen; ...

    2014-12-31

    Under the auspices of the U.S. Department of Energy’s Carbon Capture Simulation Initiative (CCSI), a Framework for Optimization and Quantification of Uncertainty and Sensitivity (FOQUS) has been developed. This tool enables carbon capture systems to be rapidly synthesized and rigorously optimized, in an environment that accounts for and propagates uncertainties in parameters and models. FOQUS currently enables (1) the development of surrogate algebraic models utilizing the ALAMO algorithm, which can be used for superstructure optimization to identify optimal process configurations, (2) simulation-based optimization utilizing derivative free optimization (DFO) algorithms with detailed black-box process models, and (3) rigorous uncertainty quantification through PSUADE. FOQUS utilizes another CCSI technology, the Turbine Science Gateway, to manage the thousands of simulated runs necessary for optimization and UQ. Thus, this computational framework has been demonstrated for the design and analysis of a solid sorbent based carbon capture system.

  12. PyQuant: A Versatile Framework for Analysis of Quantitative Mass Spectrometry Data*

    PubMed Central

    Mitchell, Christopher J.; Kim, Min-Sik; Na, Chan Hyun; Pandey, Akhilesh

    2016-01-01

    Quantitative mass spectrometry data necessitates an analytical pipeline that captures the accuracy and comprehensiveness of the experiments. Currently, data analysis is often coupled to specific software packages, which restricts the analysis to a given workflow and precludes a more thorough characterization of the data by other complementary tools. To address this, we have developed PyQuant, a cross-platform mass spectrometry data quantification application that is compatible with existing frameworks and can be used as a stand-alone quantification tool. PyQuant supports most types of quantitative mass spectrometry data including SILAC, NeuCode, 15N, 13C, or 18O and chemical methods such as iTRAQ or TMT and provides the option of adding custom labeling strategies. In addition, PyQuant can perform specialized analyses such as quantifying isotopically labeled samples where the label has been metabolized into other amino acids and targeted quantification of selected ions independent of spectral assignment. PyQuant is capable of quantifying search results from popular proteomic frameworks such as MaxQuant, Proteome Discoverer, and the Trans-Proteomic Pipeline in addition to several standalone search engines. We have found that PyQuant routinely quantifies a greater proportion of spectral assignments, with increases ranging from 25–45% in this study. Finally, PyQuant is capable of complementing spectral assignments between replicates to quantify ions missed because of lack of MS/MS fragmentation or that were omitted because of issues such as spectra quality or false discovery rates. This results in an increase of biologically useful data available for interpretation. In summary, PyQuant is a flexible mass spectrometry data quantification platform that is capable of interfacing with a variety of existing formats and is highly customizable, which permits easy configuration for custom analysis. PMID:27231314

  13. Absolute quantification of Dehalococcoides proteins: enzyme bioindicators of chlorinated ethene dehalorespiration.

    PubMed

    Werner, Jeffrey J; Ptak, A Celeste; Rahm, Brian G; Zhang, Sheng; Richardson, Ruth E

    2009-10-01

    The quantification of trace proteins in complex environmental samples and mixed microbial communities would be a valuable monitoring tool in countless applications, including the bioremediation of groundwater contaminated with chlorinated solvents. Measuring the concentrations of specific proteins provides unique information about the activity and physiological state of organisms in a sample. We developed sensitive (< 5 fmol), selective bioindicator assays for the absolute quantification of select proteins used by Dehalococcoides spp. when reducing carbon atoms in the common pollutants trichloroethene (TCE) and tetrachloroethene (PCE). From complex whole-sample digests of two different dechlorinating mixed communities, we monitored the chromatographic peaks of selected tryptic peptides chosen to represent 19 specific Dehalococcoides proteins. This was accomplished with multiple-reaction monitoring (MRM) assays using nano-liquid chromatography-tandem mass spectrometry (nLC-MS/MS), which provided the selectivity, sensitivity and reproducibility required to quantify Dehalococcoides proteins in complex samples. We observed reproducible peak areas (average CV = 0.14 over 4 days, n = 3) and linear responses in standard curves (n = 5, R² > 0.98) using synthetic peptide standards spiked into a background matrix of sediment peptides. We detected and quantified TCE reductive dehalogenase (TceA) at 7.6 ± 1.7 × 10³ proteins cell⁻¹ in the KB1 bioaugmentation culture, previously thought to be lacking TceA. Fragmentation data from MS/MS shotgun proteomics experiments were helpful in developing the MRM targets. Similar shotgun proteomics data are emerging in labs around the world for many environmentally relevant microbial proteins, and these data are a valuable resource for the future development of MRM assays. We expect targeted peptide quantification in environmental samples to be a useful tool in environmental monitoring.

  14. Use of Multiple Competitors for Quantification of Human Immunodeficiency Virus Type 1 RNA in Plasma

    PubMed Central

    Vener, Tanya; Nygren, Malin; Andersson, AnnaLena; Uhlén, Mathias; Albert, Jan; Lundeberg, Joakim

    1998-01-01

    Quantification of human immunodeficiency virus type 1 (HIV-1) RNA in plasma has rapidly become an important tool in basic HIV research and in the clinical care of infected individuals. Here, a quantitative HIV assay based on competitive reverse transcription-PCR with multiple competitors was developed. Four RNA competitors containing the same PCR primer binding sequences as the viral HIV-1 RNA target were constructed. One of the PCR primers was fluorescently labeled, which facilitated discrimination between the viral RNA and competitor amplicons by fragment analysis on conventional automated sequencers. The coamplification of known amounts of the RNA competitors provided the means to establish internal calibration curves for the individual reactions, excluding tube-to-tube variation. Calibration curves were created from the peak areas, which were proportional to the starting amount of each competitor. The fluorescence detection format was expanded to provide a dynamic range of more than 5 log units. This quantitative assay allowed reproducible analysis of samples containing as few as 40 viral copies of HIV-1 RNA per reaction. The within- and between-run coefficients of variation were <24% (range, 10 to 24) and <36% (range, 27 to 36), respectively. The high reproducibility (standard deviation, <0.13 log) of the overall procedure for quantification of HIV-1 RNA in plasma, including sample preparation, amplification, and detection variations, allowed reliable detection of a 0.5-log change in RNA viral load. The assay could be a useful tool for monitoring HIV-1 disease progression and antiviral treatment, and can easily be adapted to the quantification of other pathogens. PMID:9650926
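
    Conceptually, the internal calibration works as follows: the four co-amplified competitors define a per-reaction log-log curve relating input copies to peak area, from which the viral load is interpolated. A minimal sketch with invented values:

```python
# Per-reaction internal calibration with co-amplified competitors; values
# are invented for illustration.
import numpy as np

competitor_copies = np.array([1e2, 1e3, 1e4, 1e5])     # known spike-ins
competitor_areas = np.array([3.1e3, 2.9e4, 3.3e5, 2.8e6])

slope, intercept = np.polyfit(np.log10(competitor_copies),
                              np.log10(competitor_areas), 1)

viral_peak_area = 1.2e5                                # viral amplicon signal
log_copies = (np.log10(viral_peak_area) - intercept) / slope
print(f"estimated viral load: {10 ** log_copies:.0f} copies/reaction")
```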

  15. Quantification of ligand density and stoichiometry on the surface of liposomes using single-molecule fluorescence imaging.

    PubMed

    Belfiore, Lisa; Spenkelink, Lisanne M; Ranson, Marie; van Oijen, Antoine M; Vine, Kara L

    2018-05-28

    Despite the longstanding existence of liposome technology in drug delivery applications, there have been no ligand-directed liposome formulations approved for clinical use to date. This lack of translation is due to several factors, one of which is the absence of molecular tools for the robust quantification of ligand density on the surface of liposomes. We report here for the first time the quantification of proteins attached to the surface of small unilamellar liposomes using single-molecule fluorescence imaging. Liposomes were surface-functionalized with fluorescently labeled human proteins previously validated to target cancer cell surface biomarkers: plasminogen activator inhibitor-2 (PAI-2) and trastuzumab (TZ, Herceptin®). These protein-conjugated liposomes were visualized using a custom-built wide-field fluorescence microscope with single-molecule sensitivity. By counting the photobleaching steps of the fluorescently labeled proteins, we calculated the number of attached proteins per liposome, which was 11 ± 4 proteins for single-ligand liposomes. Imaging of dual-ligand liposomes revealed stoichiometries of the two attached proteins in accordance with the molar ratios of protein added during preparation. Preparation of PAI-2/TZ dual-ligand liposomes via two different methods revealed that the post-insertion method generated liposomes with a more equal representation of the two differently sized proteins, demonstrating the ability of this preparation method to enable better control of liposome protein densities. We conclude that the single-molecule imaging method presented here is an accurate and reliable quantification tool for determining ligand density and stoichiometry on the surface of liposomes. This method has the potential to allow for comprehensive characterization of novel ligand-directed liposomes that should facilitate the translation of these nanotherapies through to the clinic. Copyright © 2018 Elsevier B.V. All rights reserved.
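
    The core counting idea is that each labeled protein bleaches in one discrete step, so the number of downward jumps in a liposome's intensity trace approximates the number of attached proteins. The toy sketch below uses a simple threshold on intensity differences; real analyses typically use dedicated step-fitting algorithms, and all numbers are invented:

```python
# Toy photobleaching-step counter: one discrete intensity drop per
# fluorophore; a threshold on frame-to-frame differences counts the steps.
import numpy as np

def count_bleach_steps(trace, step_size, noise_sigma):
    drops = np.diff(trace)
    return int(np.sum(drops < -(step_size - 3 * noise_sigma)))

rng = np.random.default_rng(0)
n_proteins, step, sigma = 11, 100.0, 5.0
trace = np.full(500, n_proteins * step)
for frame in rng.choice(np.arange(10, 490), n_proteins, replace=False):
    trace[frame:] -= step                       # one fluorophore bleaches
trace += rng.normal(0.0, sigma, trace.size)

print(count_bleach_steps(trace, step, sigma))   # ~11 proteins per liposome
```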

  16. Quantifying motivational deficits and apathy: a review of the literature.

    PubMed

    Weiser, Mark; Garibaldi, George

    2015-08-01

    Varying definitions of apathy in the published literature and the lack of consensus diagnostic criteria make the identification and quantification of apathy difficult in both clinical trials and clinical practice. The Apathy Evaluation Scale was developed specifically to assess apathy, but variations in the threshold values defined for clinically significant apathy diminish its use as a screening tool in clinical trials, although it has demonstrated sensitivity to changes in treatment in a number of studies. The Neuropsychiatric Inventory contains an Apathy subscale, which has been used to identify clinical trial populations (with a consistent threshold value) and to measure changes following treatment. Few of the other assessment tools currently used in patients with neuropsychiatric disorders are specific for apathy or explore it in any depth; most have not been validated in the general population, lack cut-off points representing clinically significant apathy, and do not capture its changes over time or in response to treatment. Further research is required to address these issues in order to facilitate the quantification of apathy and its natural history. Such research should be conducted with the aim of developing new, specific tools for use across neuropsychiatric disorders. Copyright © 2015. Published by Elsevier B.V.

  17. MRMPlus: an open source quality control and assessment tool for SRM/MRM assay development.

    PubMed

    Aiyetan, Paul; Thomas, Stefani N; Zhang, Zhen; Zhang, Hui

    2015-12-12

    Selected and multiple reaction monitoring involves monitoring a multiplexed assay of proteotypic peptides and associated transitions in mass spectrometry runs. To establish peptides and associated transitions as stable, quantifiable, and reproducible representatives of proteins of interest, experimental and analytical validation is required. However, inadequate and disparate analytical tools and validation methods predispose assay performance measures to errors and inconsistencies. Implemented as a freely available, open-source tool in the platform-independent Java programming language, MRMPlus computes analytical measures as recently recommended by the Clinical Proteomics Tumor Analysis Consortium Assay Development Working Group for "Tier 2" assays - that is, non-clinical assays sufficient to measure changes due to both biological and experimental perturbations. Computed measures include limit of detection, lower limit of quantification, linearity, carry-over, partial validation of specificity, and upper limit of quantification. MRMPlus streamlines the assay development analytical workflow and therefore minimizes error predisposition. MRMPlus may also be used to estimate performance for targeted assays not described by the Assay Development Working Group. MRMPlus' source code and compiled binaries can be freely downloaded from https://bitbucket.org/paiyetan/mrmplusgui and https://bitbucket.org/paiyetan/mrmplusgui/downloads, respectively.
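
    For orientation, one common way to estimate detection and quantification limits from a calibration curve uses the residual standard deviation sigma and the slope S (LOD ~ 3.3*sigma/S, LLOQ ~ 10*sigma/S). The sketch below illustrates that idea with invented data; MRMPlus itself follows the CPTAC working-group procedures, which may differ in detail:

```python
# LOD/LLOQ from calibration residuals (one common convention); data invented.
import numpy as np

conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])   # spiked concentrations
resp = np.array([0.9, 2.1, 4.2, 10.3, 19.8, 40.5])  # peak-area ratios

slope, intercept = np.polyfit(conc, resp, 1)
residuals = resp - (slope * conc + intercept)
sigma = residuals.std(ddof=2)                       # 2 fitted parameters

print(f"LOD  ~ {3.3 * sigma / slope:.2f}")
print(f"LLOQ ~ {10.0 * sigma / slope:.2f}")
```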

  18. Extension of least squares spectral resolution algorithm to high-resolution lipidomics data.

    PubMed

    Zeng, Ying-Xu; Mjøs, Svein Are; David, Fabrice P A; Schmid, Adrien W

    2016-03-31

    Lipidomics, which focuses on the global study of molecular lipids in biological systems, has been driven tremendously by technical advances in mass spectrometry (MS) instrumentation, particularly high-resolution MS. This requires powerful computational tools that can handle high-throughput lipidomics data analysis. To address this issue, a novel computational tool has been developed for the analysis of high-resolution MS data, including data pretreatment, visualization, automated identification, deconvolution and quantification of lipid species. The algorithm features the customized generation of a lipid compound library and mass spectral library, which covers the major lipid classes such as glycerolipids, glycerophospholipids and sphingolipids. Next, the algorithm performs least squares resolution of spectra and chromatograms based on the theoretical isotope distribution of molecular ions, which enables automated identification and quantification of molecular lipid species. Currently, this methodology supports analysis of both high- and low-resolution MS data as well as liquid chromatography-MS (LC-MS) lipidomics data. The flexibility of the methodology allows it to be expanded to support more lipid classes and more data interpretation functions, making it a promising tool in lipidomic data analysis. Copyright © 2016 Elsevier B.V. All rights reserved.
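
    The deconvolution step can be pictured as non-negative least squares against theoretical isotope patterns: each observed spectrum is modeled as a non-negative combination of templates, and the fitted coefficients estimate species abundances. A toy sketch with invented patterns, not real isotope distributions:

```python
# Non-negative least squares against isotope-pattern templates; the fitted
# coefficients are the abundances of two co-eluting species. Toy patterns.
import numpy as np
from scipy.optimize import nnls

A = np.array([[1.00, 0.00],      # columns = theoretical isotope patterns
              [0.55, 1.00],
              [0.18, 0.60],
              [0.04, 0.20]])
true = np.array([3.0, 1.5])
observed = A @ true + np.random.default_rng(1).normal(0.0, 0.01, 4)

abundances, _ = nnls(A, observed)
print(abundances)                # ~[3.0, 1.5]
```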

  19. Ecosystem evapotranspiration: Challenges in measurements, estimates, and modeling

    USDA-ARS?s Scientific Manuscript database

    Evapotranspiration (ET) processes at the leaf-to-landscape scales in multiple land uses have important controls and feedbacks for the local, regional and global climate and water resource systems. Innovative methods, tools, and technologies for improved understanding and quantification of ET and cro...

  20. Quantification of Soil Redoximorphic Features by Standardized Color Identification

    USDA-ARS?s Scientific Manuscript database

    Photography has been a welcome tool in assisting to document and convey qualitative soil information. Greater availability of digital cameras with increased information storage capabilities has promoted novel uses of this technology in investigations of water movement patterns, organic matter conte...

  1. Molecular methods for pathogen detection and quantification

    USDA-ARS?s Scientific Manuscript database

    Ongoing interest in convenient, inexpensive, fast, sensitive and accurate techniques for detecting and/or quantifying the presence of soybean pathogens has resulted in increased usage of molecular tools. The method of extracting a molecular target (usually DNA or RNA) for detection depends wholly up...

  2. Epsilon-Q: An Automated Analyzer Interface for Mass Spectral Library Search and Label-Free Protein Quantification.

    PubMed

    Cho, Jin-Young; Lee, Hyoung-Joo; Jeong, Seul-Ki; Paik, Young-Ki

    2017-12-01

    Mass spectrometry (MS) is a widely used proteome analysis tool for biomedical science. In MS-based bottom-up proteomic approaches to protein identification, sequence database (DB) searching has been routinely used because of its simplicity and convenience. However, searching a sequence DB with multiple variable modification options can increase processing time and false-positive errors in large and complicated MS data sets. Spectral library searching is an alternative solution that avoids the limitations of sequence DB searching and allows the detection of more peptides with high sensitivity. Unfortunately, this technique has lower proteome coverage, limiting the detection of novel peptide sequences in biological samples. To solve these problems, we previously developed the "Combo-Spec Search" method, which searches multiple manually curated reference and simulated spectral libraries to analyze whole proteomes in a biological sample. In this study, we have developed a new analytical interface tool called "Epsilon-Q" to enhance the functions of both the Combo-Spec Search method and label-free protein quantification. Epsilon-Q automatically performs multiple spectral library searches, class-specific false-discovery-rate control, and result integration. It has a user-friendly graphical interface and demonstrates good performance in identifying and quantifying proteins by supporting standard MS data formats and spectrum-to-spectrum matching powered by SpectraST. Furthermore, when the Epsilon-Q interface is combined with the Combo-Spec Search method, together called the Epsilon-Q system, it shows a synergistic effect, outperforming other sequence DB search engines in identifying and quantifying low-abundance proteins in biological samples. The Epsilon-Q system can be a versatile tool for comparative proteome analysis based on multiple spectral libraries and label-free quantification.
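
    At the heart of spectral library searching is a spectrum-to-spectrum similarity score. The sketch below shows the basic normalized dot product on binned spectra, with invented peaks; SpectraST's actual scoring is more elaborate:

```python
# Spectrum-to-spectrum matching via a normalized dot product on binned
# spectra; peaks are invented and the scoring is simplified.
import numpy as np

def binned(mz, intensity, bin_width=1.0, max_mz=2000.0):
    vec = np.zeros(int(max_mz / bin_width))
    np.add.at(vec, (np.asarray(mz) / bin_width).astype(int), intensity)
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

query = binned([175.1, 304.2, 589.3], [10.0, 55.0, 100.0])
library = binned([175.1, 304.2, 589.3, 712.4], [12.0, 50.0, 100.0, 5.0])
print(f"dot product: {float(query @ library):.3f}")   # ~1.0 means match
```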

  3. DYNAMO-HIA--a Dynamic Modeling tool for generic Health Impact Assessments.

    PubMed

    Lhachimi, Stefan K; Nusselder, Wilma J; Smit, Henriette A; van Baal, Pieter; Baili, Paolo; Bennett, Kathleen; Fernández, Esteve; Kulik, Margarete C; Lobstein, Tim; Pomerleau, Joceline; Mackenbach, Johan P; Boshuizen, Hendriek C

    2012-01-01

    Currently, no standard tool is publicly available that allows researchers or policy-makers to quantify the impact of policies using epidemiological evidence within the causal framework of Health Impact Assessment (HIA). A standard tool should comply with three technical criteria (real-life population, dynamic projection, explicit risk-factor states) and three usability criteria (modest data requirements, rich model output, generally accessible) to be useful in the applied setting of HIA. With DYNAMO-HIA (Dynamic Modeling for Health Impact Assessment), we introduce such a generic software tool specifically designed to facilitate quantification in the assessment of the health impacts of policies. DYNAMO-HIA quantifies the impact of user-specified risk-factor changes on multiple diseases and, in turn, on overall population health, comparing one reference scenario with one or more intervention scenarios. The Markov-based modeling approach allows for explicit risk-factor states and simulation of a real-life population. A built-in parameter estimation module ensures that only standard population-level epidemiological evidence is required, i.e., data on incidence, prevalence, relative risks, and mortality. DYNAMO-HIA provides a rich output of summary measures (e.g., life expectancy and disease-free life expectancy) and detailed data (e.g., prevalences and mortality/survival rates) by age, sex, and risk-factor status over time. DYNAMO-HIA is controlled via a graphical user interface and is publicly available from the internet, ensuring general accessibility. We illustrate the use of DYNAMO-HIA with two example applications: a policy causing an overall increase in alcohol consumption, and quantification of the disease burden of smoking. By combining modest data needs with general accessibility and user friendliness within the causal framework of HIA, DYNAMO-HIA is a potential standard tool for health impact assessment based on epidemiologic evidence.
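
    A deliberately tiny sketch of the underlying Markov idea, with invented rates and a single binary risk factor; DYNAMO-HIA itself handles multiple diseases, age/sex strata, and richer state structures:

```python
# Tiny two-state Markov projection: smokers/non-smokers with relative-risk
# mortality; an intervention raises the quit rate. All rates invented.
def project(smokers, nonsmokers, years, quit_rate, rr=2.0, base_mort=0.01):
    for _ in range(years):
        quits = smokers * quit_rate
        smokers = (smokers - quits) * (1 - base_mort * rr)
        nonsmokers = (nonsmokers + quits) * (1 - base_mort)
    return smokers + nonsmokers

reference = project(300_000, 700_000, 20, quit_rate=0.02)
intervention = project(300_000, 700_000, 20, quit_rate=0.05)
print(f"survivors gained by the intervention: {intervention - reference:,.0f}")
```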

  4. Sulfur-based absolute quantification of proteins using isotope dilution inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Lee, Hyun-Seok; Heun Kim, Sook; Jeong, Ji-Seon; Lee, Yong-Moon; Yim, Yong-Hyeon

    2015-10-01

    An element-based reductive approach provides an effective means of realizing International System of Units (SI) traceability for high-purity biological standards. Here, we develop for the first time an absolute protein quantification method using double isotope dilution (ID) inductively coupled plasma mass spectrometry (ICP-MS) combined with microwave-assisted acid digestion. We validated the method and applied it to certify the candidate protein certified reference material (CRM) of human growth hormone (hGH). The concentration of hGH was determined by analysing the total amount of sulfur in hGH. Next, size-exclusion chromatography was used with ICP-MS to characterize and quantify sulfur-containing impurities. By subtracting the contribution of sulfur-containing impurities from the total sulfur content in the hGH CRM, we obtained an SI-traceable certification value. The quantification result obtained with the present method based on sulfur analysis was in excellent agreement with the result determined via a well-established protein quantification method based on amino acid analysis using conventional acid hydrolysis combined with ID liquid chromatography-tandem mass spectrometry. The element-based protein quantification method developed here can be generally used for SI-traceable absolute quantification of proteins, especially pure-protein standards.
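
    The final step reduces to simple stoichiometry: subtract the impurity sulfur from the total sulfur and convert the remainder to protein via the known number of sulfur atoms per molecule. A back-of-the-envelope sketch with illustrative values (not the certified hGH results; the sulfur count assumes hGH's 4 Cys + 3 Met residues):

```python
# Back-of-the-envelope sulfur-to-protein conversion; values illustrative,
# not the certified CRM results. Assumes 7 S atoms per hGH (4 Cys + 3 Met).
M_S = 32.06                # g/mol, sulfur
n_s_per_protein = 7        # sulfur atoms per hGH molecule (assumed)
M_protein = 22_125.0       # g/mol, approximate hGH molar mass

total_s = 2.50             # mg/kg sulfur from double ID-ICP-MS (invented)
impurity_s = 0.12          # mg/kg sulfur in sulfur-containing impurities

protein = (total_s - impurity_s) / (n_s_per_protein * M_S) * M_protein
print(f"protein mass fraction: {protein:.0f} mg/kg")
```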

  5. Quantification of real thermal, catalytic, and hydrodeoxygenated bio-oils via comprehensive two-dimensional gas chromatography with mass spectrometry.

    PubMed

    Silva, Raquel V S; Tessarolo, Nathalia S; Pereira, Vinícius B; Ximenes, Vitor L; Mendes, Fábio L; de Almeida, Marlon B B; Azevedo, Débora A

    2017-03-01

    The elucidation of bio-oil composition is important to evaluate the processes of biomass conversion and its upgrading, and to suggest the proper use for each sample. Comprehensive two-dimensional gas chromatography with time-of-flight mass spectrometry (GC×GC-TOFMS) is a widely applied analytical approach for bio-oil investigation due to the high separation and resolution capacity of this technique. This work addresses the issue of analytical performance to assess the comprehensive characterization of real bio-oil samples via GC×GC-TOFMS. The approach was applied to the individual quantification of compounds in real thermal (PWT), catalytic-process (CPO), and hydrodeoxygenation-process (HDO) bio-oils. Quantification was performed with reliability using the analytical curves of oxygenated and hydrocarbon standards as well as deuterated internal standards. The limit of quantification was set at 1 ng µL⁻¹ for major standards, except for hexanoic acid, for which it was set at 5 ng µL⁻¹. The GC×GC-TOFMS method provided good precision (<10%) and excellent accuracy (recovery range of 70-130%) for the quantification of individual hydrocarbons and oxygenated compounds in real bio-oil samples. Sugars, furans, and alcohols appear as the major constituents of the PWT, CPO, and HDO samples, respectively. In order to obtain bio-oils with better quality, the catalytic pyrolysis process may be a better option than hydrogenation due to the effective reduction of oxygenated compound concentrations and the lower cost of the process, since hydrogen is not required to promote deoxygenation in the catalytic pyrolysis process. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. Validation and implementation of liquid chromatographic-mass spectrometric (LC-MS) methods for the quantification of tenofovir prodrugs.

    PubMed

    Hummert, Pamela; Parsons, Teresa L; Ensign, Laura M; Hoang, Thuy; Marzinke, Mark A

    2018-04-15

    The nucleotide reverse transcriptase inhibitor tenofovir (TFV) is widely administered in a disoproxil prodrug form (tenofovir disoproxil fumarate, TDF) for HIV management and prevention. Recently, the novel prodrugs tenofovir alafenamide fumarate (TAF) and hexadecyloxypropyl tenofovir (CMX157) have been pursued for HIV treatment while minimizing the adverse effects associated with systemic TFV exposure. Dynamic and sensitive bioanalytical tools are required to characterize the pharmacokinetics of these prodrugs in systemic circulation. Two parallel methods have been developed: one to combinatorially quantify TAF and TFV, and a second method for CMX157 quantification, in plasma. K2EDTA plasma was spiked with TAF and TFV, or CMX157. Following the addition of isotopically labeled internal standards and sample extraction via solid phase extraction (TAF and TFV) or protein precipitation (CMX157), samples were subjected to liquid chromatographic-tandem mass spectrometric (LC-MS/MS) analysis. For TAF and TFV, separation occurred using a Zorbax Eclipse Plus C18 Narrow Bore RR, 2.1 × 50 mm, 3.5 μm column and analytes were detected on an API5000 mass analyzer; CMX157 was separated using a Kinetex C8, 2.1 × 50 mm, 2.6 μm column and quantified using an API4500 mass spectrometer. Methods were validated according to FDA Bioanalytical Method Validation guidelines. Analytical methods were optimized for the multiplexed monitoring of TAF and TFV, and of CMX157, in plasma. The lower limits of quantification (LLOQs) for TAF, TFV, and CMX157 were 0.03, 1.0, and 0.25 ng/mL, respectively. Calibration curves were generated via weighted linear regression of standards. Intra- and inter-assay precision and accuracy studies demonstrated %CVs ≤ 14.4% and %DEVs ≤ ± 7.95%, respectively. Stability and matrix effects studies were also performed. All results were acceptable and in accordance with the recommended guidelines for bioanalytical methods. The assays were also applied to quantify in vivo concentrations of the prodrugs and TFV in a preclinical study following rectal administration. Sensitive, specific, and dynamic LC-MS/MS assays have been developed and validated for the multiplexed quantification of TAF and TFV, as well as an independent assay for CMX157 quantification, in plasma. The described methods meet sufficient throughput criteria to support large research trials. Copyright © 2018 Elsevier B.V. All rights reserved.
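
    Calibration via weighted linear regression of standards can be sketched as explicit weighted least squares; a 1/x² weighting, common in bioanalytical assays, is assumed here, and the data are invented:

```python
# Weighted (1/x^2) linear calibration as explicit weighted least squares;
# data invented, not the validation results above.
import numpy as np

x = np.array([0.03, 0.1, 0.3, 1.0, 3.0, 10.0])          # ng/mL standards
y = np.array([0.011, 0.035, 0.102, 0.340, 1.05, 3.45])  # response ratios

W = np.diag(1.0 / x**2)                                  # 1/x^2 weights
X = np.column_stack([x, np.ones_like(x)])
slope, intercept = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

measured = 0.52                                          # unknown sample
print(f"concentration ~ {(measured - intercept) / slope:.2f} ng/mL")
```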

  7. Attomolar quantitation of Mycobacterium tuberculosis by asymmetric helicase-dependent isothermal DNA-amplification and electrochemical detection.

    PubMed

    Barreda-García, Susana; González-Álvarez, María José; de-Los-Santos-Álvarez, Noemí; Palacios-Gutiérrez, Juan José; Miranda-Ordieres, Arturo J; Lobo-Castañón, María Jesús

    2015-06-15

    A highly sensitive and robust method for the quantification of specific DNA sequences based on coupling asymmetric helicase-dependent DNA amplification to electrochemical detection is described. This method relies on the entrapment of the amplified ssDNA sequences on magnetic beads, followed by a post-amplification hybridization assay to provide an added degree of specificity. As a proof of concept, an 84-base sequence specific to Mycobacterium tuberculosis is amplified at 65°C, providing 3×10⁶-fold amplification after 90 min. Using this system, 0.5 aM, corresponding to 15 copies of the target gene in 50 µL of sample, can be successfully detected and reliably quantified under isothermal conditions in less than 4 h. The assay has been applied to the detection of M. tuberculosis in sputum, pleural fluid and urine samples. Beyond this application, the proposed assay is a powerful and general tool for molecular diagnostics that can be applied to the detection of other specific DNA sequences, taking full advantage of the plethora of genomic information now available. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. Particle size distribution as a useful tool for microbial detection.

    PubMed

    Chavez, A; Jimenez, B; Maya, C

    2004-01-01

    Worldwide, raw or treated wastewater is used for irrigation. However, this practice implies that the microbial content must be controlled. Unfortunately, detection techniques for microorganisms are costly, time consuming, and require highly trained personnel. For these reasons, this study used particle size distribution to measure the microbial quality of wastewater through correlations between the number or volume of particles and the concentration of fecal coliforms, Salmonella spp. and helminth ova. Such correlations were obtained for both raw and chemically treated wastewater. The best fit was the one for helminth ova, which applies for both the influent and effluent and also for all the coagulants involved. This technique allows the on-line quantification of helminth ova at a cost of US$3 in only 5 minutes, instead of the US$70 and 5 days required by the standard technique. With respect to the coagulants applied, their behavior differs only for particles smaller than 8 μm, and thus this value is considered the critical size for this particular treatment. The best coagulant was aluminium polychloride. In addition, this work establishes the distribution of COD, TSS, nitrogen, and phosphorous for particles smaller and larger than 20 μm.

  9. Speech rhythm alterations in Spanish-speaking individuals with Alzheimer's disease.

    PubMed

    Martínez-Sánchez, Francisco; Meilán, Juan J G; Vera-Ferrandiz, Juan Antonio; Carro, Juan; Pujante-Valverde, Isabel M; Ivanova, Olga; Carcavilla, Nuria

    2017-07-01

    Rhythm is the speech property related to the temporal organization of sounds. Considerable evidence is now available to suggest that dementia of the Alzheimer's type is associated with impairments in speech rhythm. The aim of this study was to assess the use of an automatic computerized system for measuring speech rhythm characteristics in an oral reading task performed by 45 patients with Alzheimer's disease (AD), compared with those same characteristics among 82 healthy older adults without a diagnosis of dementia, matched by age, sex and cultural background. A range of rhythmic-metric and clinical measures was applied. The results show rhythmic differences between the groups, with higher variability of syllabic intervals in AD patients. Signal processing algorithms applied to oral reading recordings prove capable of differentiating between AD patients and older adults without dementia with an accuracy of 87% (specificity 81.7%, sensitivity 82.2%), based on the standard deviation of the duration of syllabic intervals. Experimental results show that syllabic variability measurements extracted from the speech signal can be used to distinguish between older adults without a diagnosis of dementia and those with AD, and may be useful as a tool for the objective study and quantification of speech deficits in AD.
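
    A toy version of the reported discriminating feature, the standard deviation of syllabic interval durations, thresholded to flag AD-like rhythm; the threshold and onset times are invented:

```python
# Syllabic-interval variability as a classifier feature; threshold and
# onset times are invented for illustration.
import numpy as np

def syllabic_sd(onsets_s):
    return np.diff(onsets_s).std(ddof=1)

rng = np.random.default_rng(3)
control = np.cumsum(np.full(30, 0.22))                  # regular rhythm
patient = np.cumsum(0.22 + rng.normal(0.0, 0.08, 30))   # variable rhythm

threshold = 0.05                                        # seconds (assumed)
for name, onsets in [("control", control), ("patient", patient)]:
    sd = syllabic_sd(onsets)
    label = "AD-like" if sd > threshold else "typical"
    print(f"{name}: SD = {sd:.3f} s -> {label}")
```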

  10. Fast Detection of Copper Content in Rice by Laser-Induced Breakdown Spectroscopy with Uni- and Multivariate Analysis.

    PubMed

    Liu, Fei; Ye, Lanhan; Peng, Jiyu; Song, Kunlin; Shen, Tingting; Zhang, Chu; He, Yong

    2018-02-27

    Fast detection of heavy metals is very important for ensuring the quality and safety of crops. Laser-induced breakdown spectroscopy (LIBS), coupled with uni- and multivariate analysis, was applied for quantitative analysis of copper in three kinds of rice (Jiangsu rice, regular rice, and Simiao rice). For univariate analysis, three pre-processing methods were applied to reduce fluctuations, including background normalization, the internal standard method, and the standard normal variate (SNV). Linear regression models showed a strong correlation between spectral intensity and Cu content, with an R² greater than 0.97. The limit of detection (LOD) was around 5 ppm, lower than the tolerance limit of copper in foods. For multivariate analysis, partial least squares regression (PLSR) showed its advantage in extracting effective information for prediction, and its sensitivity reached 1.95 ppm, while support vector machine regression (SVMR) performed better in both calibration and prediction sets, where Rc² and Rp² reached 0.9979 and 0.9879, respectively. This study showed that LIBS could be considered as a constructive tool for the quantification of copper contamination in rice.
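
    A minimal sklearn sketch of the PLSR calibration idea, regressing spectra against reference Cu concentrations; the array shapes, the embedded emission line, and the component count are all invented:

```python
# PLSR calibration of synthetic "spectra" against Cu content; one emission
# line is embedded at channel 100 so the model has something to find.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
spectra = rng.random((60, 500))              # 60 samples x 500 channels
cu_ppm = 5.0 + 95.0 * rng.random(60)         # reference Cu concentrations
spectra[:, 100] += 0.05 * cu_ppm             # synthetic Cu emission line

X_tr, X_te, y_tr, y_te = train_test_split(spectra, cu_ppm, random_state=0)
pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
print(f"prediction-set R^2: {pls.score(X_te, y_te):.3f}")
```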

  12. Comparison of non-invasive assessment of liver fibrosis in patients with alpha1-antitrypsin deficiency using magnetic resonance elastography (MRE), acoustic radiation force impulse (ARFI) Quantification, and 2D-shear wave elastography (2D-SWE).

    PubMed

    Reiter, Rolf; Wetzel, Martin; Hamesch, Karim; Strnad, Pavel; Asbach, Patrick; Haas, Matthias; Siegmund, Britta; Trautwein, Christian; Hamm, Bernd; Klatt, Dieter; Braun, Jürgen; Sack, Ingolf; Tzschätzsch, Heiko

    2018-01-01

    Although it has been known for decades that patients with alpha1-antitrypsin deficiency (AATD) have an increased risk of cirrhosis and hepatocellular carcinoma, limited data exist on non-invasive imaging-based methods for assessing liver fibrosis, such as magnetic resonance elastography (MRE) and acoustic radiation force impulse (ARFI) quantification, and no data exist on 2D-shear wave elastography (2D-SWE). Therefore, the purpose of this study was to evaluate and compare the applicability of different elastography methods for the assessment of AATD-related liver fibrosis. Fifteen clinically asymptomatic AATD patients (11 homozygous PiZZ, 4 heterozygous PiMZ) and 16 matched healthy volunteers were examined using MRE and ARFI quantification. Additionally, patients were examined with 2D-SWE. A high correlation is evident for the shear wave speed (SWS) determined with different elastography methods in AATD patients: 2D-SWE/MRE, ARFI quantification/2D-SWE, and ARFI quantification/MRE (R = 0.8587, 0.7425, and 0.6914, respectively; P ≤ 0.0089). Four AATD patients with pathologically increased SWS were consistently identified with all three methods: MRE, ARFI quantification, and 2D-SWE. The high correlation and consistent identification of patients with pathologically increased SWS using MRE, ARFI quantification, and 2D-SWE suggest that elastography has the potential to become a suitable imaging tool for the assessment of AATD-related liver fibrosis. These promising results provide motivation for further investigation of non-invasive assessment of AATD-related liver fibrosis using elastography.

  13. Surface Enhanced Raman Spectroscopy (SERS) methods for endpoint and real-time quantification of miRNA assays

    NASA Astrophysics Data System (ADS)

    Restaino, Stephen M.; White, Ian M.

    2017-03-01

    Surface Enhanced Raman spectroscopy (SERS) provides significant improvements over conventional methods for single- and multianalyte quantification. Specifically, the spectroscopic fingerprint provided by Raman scattering allows for a direct multiplexing potential far beyond that of fluorescence and colorimetry. Additionally, SERS has a low financial and spatial footprint compared with common fluorescence-based systems. Despite these advantages, SERS has remained largely an academic pursuit. In the field of biosensing, techniques to apply SERS to molecular diagnostics are constantly under development but, most often, assay protocols are redesigned around the use of SERS as a quantification method, ultimately complicating existing protocols. Our group has sought to rethink common SERS methodologies in order to produce translational technologies capable of allowing SERS to compete in the evolving, yet often inflexible, biosensing field. This work will discuss the development of two techniques for quantification of microRNA, a promising biomarker for homeostatic and disease conditions ranging from cancer to HIV. First, an inkjet-printed paper SERS sensor has been developed to allow on-demand production of a customizable and multiplexable single-step lateral flow assay for miRNA quantification. Second, because miRNAs commonly occur at relatively low concentrations, amplification methods (e.g., PCR) are required to facilitate quantification. This work presents a novel miRNA assay alongside a novel technique for quantification of nuclease-driven nucleic acid amplification strategies that will allow SERS to be used directly with common amplification strategies for quantification of miRNA and other nucleic acid biomarkers.

  14. Community Project for Accelerator Science and Simulation (ComPASS) Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cary, John R.; Cowan, Benjamin M.; Veitzer, S. A.

    2016-03-04

    Tech-X participated across the full range of ComPASS activities, with efforts in the Energy Frontier primarily through modeling of laser plasma accelerators and dielectric laser acceleration, in the Intensity Frontier primarily through electron cloud modeling, and in Uncertainty Quantification being applied to dielectric laser acceleration. In the following we present the progress and status of our activities for the entire period of the ComPASS project for the different areas of Energy Frontier, Intensity Frontier and Uncertainty Quantification.

  15. The Infeasibility of Experimental Quantification of Life-Critical Software Reliability

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Finelli, George B.

    1991-01-01

    This paper affirms that quantification of life-critical software reliability is infeasible using statistical methods, whether applied to standard software or fault-tolerant software. The key assumption of software fault tolerance, that separately programmed versions fail independently, is shown to be problematic. This assumption cannot be justified by experimentation in the ultra-reliability region, and subjective arguments in its favor are not sufficiently strong to justify it as an axiom. Also, the implications of the recent multi-version software experiments support this affirmation.
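
    The quantitative core of the argument can be reproduced in a few lines: observing zero failures in T hours of testing bounds the failure rate at roughly ln(1/alpha)/T with confidence 1 - alpha, so demonstrating ultra-reliability by testing alone requires billions of hours:

```python
# Zero-failure testing time needed to bound a failure rate at 1e-9/hour
# with 90% confidence: T ~ ln(1/alpha) / lambda.
import math

target_rate = 1e-9    # failures per hour (ultra-reliability region)
alpha = 0.10          # 1 - confidence

hours = math.log(1.0 / alpha) / target_rate
print(f"{hours:.2e} failure-free test hours "
      f"(~{hours / 8766:.0f} years on a single system)")
```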

  16. In-line UV spectroscopy for the quantification of low-dose active ingredients during the manufacturing of pharmaceutical semi-solid and liquid formulations.

    PubMed

    Bostijn, N; Hellings, M; Van Der Veen, M; Vervaet, C; De Beer, T

    2018-07-12

    Ultraviolet (UV) spectroscopy was evaluated as an innovative Process Analytical Technology (PAT) tool for the in-line and real-time quantitative determination of low-dosed active pharmaceutical ingredients (APIs) in a semi-solid (gel) and a liquid (suspension) pharmaceutical formulation during their batch production processes. The performance of this new PAT tool (i.e., UV spectroscopy) was compared with an already more established PAT method based on Raman spectroscopy. In-line UV measurements were carried out with an immersion probe, while for the Raman measurements a non-contact PhAT probe was used. For both studied formulations, an in-line API quantification model was developed and validated per spectroscopic technique. The known API concentrations (Y) were correlated with the corresponding in-line collected preprocessed spectra (X) through partial least squares (PLS) regression. Each developed quantification method was validated by calculating the accuracy profile on the basis of the validation experiments. Furthermore, the measurement uncertainty was determined based on the data generated for the determination of the accuracy profiles. From the accuracy profiles of the UV- and Raman-based quantification methods for the gel, it was concluded that at the target API concentration of 2% (w/w), 95 out of 100 future routine measurements given by the Raman method will not deviate more than 10% (relative error) from the true API concentration, whereas for the UV method the acceptance limits of 10% were exceeded. For the liquid formulation, the Raman method was not able to quantify the API in the low-dosed suspension (0.09% (w/w) API). In contrast, the in-line UV method was able to adequately quantify the API in the suspension. This study demonstrated that UV spectroscopy can be adopted as a novel in-line PAT technique for low-dose quantification purposes in pharmaceutical processes. Importantly, neither of the two spectroscopic techniques was superior to the other for both formulations: the Raman method was more accurate in quantifying the API in the gel (2% (w/w) API), while the UV method performed better for API quantification in the suspension (0.09% (w/w) API). Copyright © 2018 Elsevier B.V. All rights reserved.

  17. In situ Biofilm Quantification in Bioelectrochemical Systems by using Optical Coherence Tomography.

    PubMed

    Molenaar, Sam D; Sleutels, Tom; Pereira, Joao; Iorio, Matteo; Borsje, Casper; Zamudio, Julian A; Fabregat-Santiago, Francisco; Buisman, Cees J N; Ter Heijne, Annemiek

    2018-04-25

    Detailed studies of microbial growth in bioelectrochemical systems (BESs) are required for their suitable design and operation. Here, we report the use of optical coherence tomography (OCT) as a tool for in situ and noninvasive quantification of biofilm growth on electrodes (bioanodes). An experimental platform is designed and described in which transparent electrodes are used to allow real-time, 3D biofilm imaging. The accuracy and precision of the developed method are assessed by relating the OCT results to well-established standards for biofilm quantification (chemical oxygen demand (COD) and total N content), and show high correspondence to these standards. Biofilm thickness observed by OCT ranged between 3 and 90 μm for experimental durations ranging from 1 to 24 days. This translated to growth yields between 38 and 42 mg COD(biomass) per g COD(acetate) at an anode potential of -0.35 V versus Ag/AgCl. Time-lapse observations of an experimental run performed in duplicate show high reproducibility in the microbial growth yield obtained by the developed method. As such, we identify OCT as a powerful tool for conducting in-depth characterizations of microbial growth dynamics in BESs. Additionally, the presented platform allows concomitant application of this method with various optical and electrochemical techniques. © 2018 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  18. MAMA User Guide v2.0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gaschen, Brian Keith; Bloch, Jeffrey Joseph; Porter, Reid

    Morphological signatures of bulk SNM materials have significant promise, but these potential signatures are not fully utilized. This document describes software tools, collectively called the MAMA (Morphological Analysis for Material Attribution) software, that can help provide robust and accurate quantification of morphological features in bulk material microscopy images (optical, SEM). Although many of the specific tools are not unique to MAMA, the software package has been designed specifically for nuclear material morphological analysis, and is at a point where it can be easily adapted (by Los Alamos or by collaborators) in response to new, different, or changing forensics needs. The current release of the MAMA software only includes the image quantification, description, and annotation functionality. Only limited information on a sample, its pedigree, and its chemistry is recorded inside this part of the software. This decision was based on initial feedback and the fact that there are several analytical chemistry databases being developed within the community. Currently, MAMA is a standalone program that can export quantification results in a basic text format that can be imported into other programs such as Excel and Access. There is also a basic report-generating feature that produces HTML-formatted pages of the same information. We will be working with collaborators to provide better integration of MAMA into their particular systems, databases and workflows.

  19. LFQProfiler and RNP(xl): Open-Source Tools for Label-Free Quantification and Protein-RNA Cross-Linking Integrated into Proteome Discoverer.

    PubMed

    Veit, Johannes; Sachsenberg, Timo; Chernev, Aleksandar; Aicheler, Fabian; Urlaub, Henning; Kohlbacher, Oliver

    2016-09-02

    Modern mass spectrometry setups used in today's proteomics studies generate vast amounts of raw data, calling for highly efficient data processing and analysis tools. Software for analyzing these data is either monolithic (easy to use, but sometimes too rigid) or workflow-driven (easy to customize, but sometimes complex). Thermo Proteome Discoverer (PD) is powerful software for workflow-driven data analysis in proteomics which, in our eyes, achieves a good trade-off between flexibility and usability. Here, we present two open-source plugins for PD providing additional functionality: LFQProfiler for label-free quantification of peptides and proteins, and RNP(xl) for UV-induced peptide-RNA cross-linking data analysis. LFQProfiler interacts with existing PD nodes for peptide identification and validation and takes care of the entire quantitative part of the workflow. We show that it performs at least on par with other state-of-the-art software solutions for label-free quantification in a recently published benchmark (Ramus, C.; J. Proteomics 2016, 132, 51-62). The second workflow, RNP(xl), represents the first software solution to date for identification of peptide-RNA cross-links, including automatic localization of the cross-links at amino acid resolution and localization scoring. It comes with a customized integrated cross-link fragment spectrum viewer for convenient manual inspection and validation of the results.

  20. Quantitative health impact assessment: taking stock and moving forward.

    PubMed

    Fehr, Rainer; Hurley, Fintan; Mekel, Odile Cecile; Mackenbach, Johan P

    2012-12-01

    Over the past years, application of health impact assessment has increased substantially, and there has been a strong growth of tools that allow quantification of health impacts for a range of health relevant policies. We review these developments, and conclude that further tool development is no longer a main priority, although several aspects need to be further developed, such as methods to assess impacts on health inequalities and to assess uncertainties. The main new challenges are, first, to conduct a comparative evaluation of different tools, and, second, to ensure the maintenance and continued availability of the toolkits including their data contents.

  1. Investigation, quantification, and recommendations : performance of alternatively fueled buses.

    DOT National Transportation Integrated Search

    2014-08-01

    The goal of this project was to continue consistent collection and reporting of data on the performance and costs of alternatively fueled public transit vehicles in the U.S. transit fleet in order to keep the Bus Fuels Fleet Evaluation Tool (BuFFeT; ...

  2. Introduction to Sustainable Urban Engineering - National Perspective - Measuring the Magnitude of the Problem

    EPA Science Inventory

    This seminar will present previous work on the Tool for the Reduction and Assessment of Chemical and other environmental Impacts (TRACI) along with interim research on the quantification of land use modifications for comprehensive impact assessment. Various research options ...

  3. Fruit morphological descriptors as a tool for discrimination of Daucus L. germplasm

    USDA-ARS?s Scientific Manuscript database

    Morphological diversity of a Daucus L. germplasm collection maintained at the National Gene Bank of Tunisia was assessed using fourteen morphological descriptors related to mature fruits. Quantification of variability for each character was investigated using the standardized Shannon-Weaver Diversit...

  4. Numerical Uncertainty Quantification for Radiation Analysis Tools

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha

    2007-01-01

    Recently a new emphasis has been placed on engineering applications of space radiation analyses, and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used for radiation analysis for vehicle design and mission planning has begun. There are two sources of uncertainty in geometric discretization addressed in this paper that need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. One source of uncertainty is in ray tracing: as the number of rays increases, the associated uncertainty decreases, but the computational expense increases. Thus, a cost-benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose-versus-depth curves that is needed to determine the radiation exposure. The question, then, is how many shield thicknesses are needed to get an accurate result, so convergence testing is performed to quantify the uncertainty associated with interpolating over different shield-thickness spatial grids.
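
    The ray-count trade-off follows the usual Monte Carlo pattern: the standard error of the mean shrinks as 1/sqrt(N) while cost grows as N. A sketch with an invented thickness distribution:

```python
# Monte Carlo ray tracing trade-off: the standard error of the mean falls
# as 1/sqrt(N) while cost grows as N. Thickness distribution is invented.
import numpy as np

rng = np.random.default_rng(7)

for n_rays in (100, 400, 1600, 6400):
    thickness = 5.0 + rng.lognormal(0.0, 0.8, n_rays)   # g/cm^2 per ray
    sem = thickness.std(ddof=1) / np.sqrt(n_rays)       # standard error
    print(f"N={n_rays:5d}  mean={thickness.mean():6.3f}  +/- {sem:.3f}")
```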

  5. Imaging tools to measure treatment response in gout.

    PubMed

    Dalbeth, Nicola; Doyle, Anthony J

    2018-01-01

    Imaging tests are in clinical use for diagnosis, for assessment of disease severity, and as markers of treatment response in people with gout. Various imaging tests have differing properties for assessing the three key disease domains in gout: urate deposition (including tophus burden), joint inflammation and structural joint damage. Dual-energy CT allows measurement of urate deposition and bone damage, and ultrasonography allows assessment of all three domains. Scoring systems have been described that allow radiological quantification of disease severity, and these scoring systems may play a role in assessing the response to treatment in gout. This article reviews the properties of imaging tests, describes the available scoring systems for quantification of disease severity and discusses the challenges and controversies regarding the use of imaging tools to measure treatment response in gout. © The Author 2018. Published by Oxford University Press on behalf of the British Society for Rheumatology. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  6. Current trends in quantitative proteomics - an update.

    PubMed

    Li, H; Han, J; Pan, J; Liu, T; Parker, C E; Borchers, C H

    2017-05-01

    Proteins can provide insights into biological processes at the functional level, so they are very promising biomarker candidates. The quantification of proteins in biological samples has been routinely used for the diagnosis of diseases and monitoring the treatment. Although large-scale protein quantification in complex samples is still a challenging task, a great amount of effort has been made to advance the technologies that enable quantitative proteomics. Seven years ago, in 2009, we wrote an article about the current trends in quantitative proteomics. In writing this current paper, we realized that, today, we have an even wider selection of potential tools for quantitative proteomics. These tools include new derivatization reagents, novel sampling formats, new types of analyzers and scanning techniques, and recently developed software to assist in assay development and data analysis. In this review article, we will discuss these innovative methods, and their current and potential applications in proteomics. Copyright © 2017 John Wiley & Sons, Ltd.

  7. TAPAS: tools to assist the targeted protein quantification of human alternative splice variants.

    PubMed

    Yang, Jae-Seong; Sabidó, Eduard; Serrano, Luis; Kiel, Christina

    2014-10-15

    In proteomes of higher eukaryotes, many alternative splice variants can only be detected by their shared peptides. This makes it highly challenging to use peptide-centric mass spectrometry to distinguish and to quantify protein isoforms resulting from alternative splicing events. We have developed two complementary algorithms based on linear mathematical models to efficiently compute a minimal set of shared and unique peptides needed to quantify a set of isoforms and splice variants. Further, we developed a statistical method to estimate the splice variant abundances based on stable isotope labeled peptide quantities. The algorithms and databases are integrated in a web-based tool, and we have experimentally tested the limits of our quantification method using spiked proteins and cell extracts. The TAPAS server is available at http://davinci.crg.es/tapas/. Contact: luis.serrano@crg.eu or christina.kiel@crg.eu. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
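
    The linear-model idea can be sketched in a few lines: peptide quantities are modeled as an incidence matrix times isoform abundances and solved by least squares. This mirrors the approach described above, not TAPAS's actual code, and the numbers are invented:

```python
# Isoform abundances from shared and unique peptide quantities via least
# squares on a peptide-isoform incidence matrix. Numbers invented.
import numpy as np

A = np.array([[1, 1],    # peptide shared by both isoforms
              [1, 0],    # unique to isoform 1
              [0, 1]])   # unique to isoform 2
peptide_amounts = np.array([13.0, 10.2, 2.9])   # from labeled standards

abundances, *_ = np.linalg.lstsq(A, peptide_amounts, rcond=None)
print(abundances)        # ~[10, 3]
```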

  8. UMI-tools: modeling sequencing errors in Unique Molecular Identifiers to improve quantification accuracy

    PubMed Central

    2017-01-01

    Unique Molecular Identifiers (UMIs) are random oligonucleotide barcodes that are increasingly used in high-throughput sequencing experiments. Through a UMI, identical copies arising from distinct molecules can be distinguished from those arising through PCR amplification of the same molecule. However, bioinformatic methods to leverage the information from UMIs have yet to be formalized. In particular, sequencing errors in the UMI sequence are often ignored or else resolved in an ad hoc manner. We show that errors in the UMI sequence are common and introduce network-based methods to account for these errors when identifying PCR duplicates. Using these methods, we demonstrate improved quantification accuracy both under simulated conditions and real iCLIP and single-cell RNA-seq data sets. Reproducibility between iCLIP replicates and single-cell RNA-seq clustering are both improved using our proposed network-based method, demonstrating the value of properly accounting for errors in UMIs. These methods are implemented in the open source UMI-tools software package. PMID:28100584
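
    A minimal sketch of the network-based idea, modeled on UMI-tools' "directional" criterion (UMI a absorbs UMI b when they differ by one base and count(a) >= 2*count(b) - 1); the real implementation also handles transitive error chains and per-position grouping:

```python
# "Directional" UMI collapsing: a low-count UMI one mismatch away from a
# high-count UMI is treated as a PCR/sequencing error of the latter.
from itertools import combinations

def hamming1(a, b):
    return sum(x != y for x, y in zip(a, b)) == 1

counts = {"ATTG": 456, "ATTA": 3, "CTTG": 2, "GGCC": 120}

parent = {u: u for u in counts}
for a, b in combinations(counts, 2):
    hi, lo = (a, b) if counts[a] >= counts[b] else (b, a)
    if hamming1(hi, lo) and counts[hi] >= 2 * counts[lo] - 1:
        parent[lo] = hi                      # absorb the likely error

print(len({parent[u] for u in counts}), "unique molecules")   # 2
```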

  9. Chemotaxis of cancer cells in three-dimensional environment monitored label-free by quantitative phase digital holographic microscopy

    NASA Astrophysics Data System (ADS)

    Kemper, Björn; Schnekenburger, Jürgen; Ketelhut, Steffi

    2017-02-01

    We investigated the capabilities of digital holographic microscopy (DHM) for label-free quantification of the response of living single cells to chemical stimuli in 3D assays. Fibrosarcoma cells were observed in a collagen matrix inside 3D chemotaxis chambers with a Mach-Zehnder interferometer-based DHM setup. From the obtained series of quantitative phase images, the migration trajectories of single cells were retrieved by automated cell tracking and subsequently analyzed for maximum migration distance and motility. Our results demonstrate that DHM is a highly reliable and efficient tool for label-free quantification of chemotaxis in 2D and 3D environments.

  10. Adjoint-Based Uncertainty Quantification with MCNP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seifried, Jeffrey E.

    2011-09-01

    This work serves to quantify the instantaneous uncertainties in neutron transport simulations arising from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties with respect to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.
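
    For orientation, first-order uncertainty propagation of this kind combines sensitivities with a nuclear-data covariance matrix via the sandwich rule, sigma² = SᵀCS. A toy numeric example with invented values:

```python
# Sandwich-rule propagation: relative response variance S^T C S from
# sensitivities S and a relative covariance matrix C. Toy numbers.
import numpy as np

S = np.array([0.8, -0.3, 0.1])       # relative sensitivities
C = np.array([[4.0, 1.0, 0.0],       # relative covariances, %^2
              [1.0, 9.0, 0.5],
              [0.0, 0.5, 1.0]])

print(f"response uncertainty: {np.sqrt(S @ C @ S):.2f}%")   # ~1.7%
```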

  11. ddpcr: an R package and web application for analysis of droplet digital PCR data.

    PubMed

    Attali, Dean; Bidshahri, Roza; Haynes, Charles; Bryan, Jennifer

    2016-01-01

    Droplet digital polymerase chain reaction (ddPCR) is a novel platform for exact quantification of DNA, which holds great promise in clinical diagnostics. It is increasingly popular due to its digital nature, which provides more accurate quantification and higher sensitivity than traditional real-time PCR. However, clinical adoption has been slowed in part by the lack of software tools available for analyzing ddPCR data. Here, we present ddpcr, a new R package for ddPCR visualization and analysis. In addition, ddpcr includes a web application (powered by the Shiny R package) that allows users to analyze ddPCR data using an interactive graphical interface.
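
    Independent of any particular package, the core ddPCR calculation is Poisson statistics on the fraction of positive droplets, lambda = -ln(1 - p) copies per droplet. A minimal sketch with an assumed droplet volume:

```python
# Poisson quantification from the positive-droplet fraction; droplet
# volume is an assumed typical value.
import math

n_droplets, n_positive = 15_000, 4_200
droplet_volume_ul = 0.00085            # ~0.85 nL per droplet (assumed)

lam = -math.log(1.0 - n_positive / n_droplets)   # copies per droplet
print(f"{lam / droplet_volume_ul:,.0f} copies/uL")
```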

  12. Development and validation of a specific and sensitive HPLC-ESI-MS method for quantification of lysophosphatidylinositols and evaluation of their levels in mice tissues.

    PubMed

    Masquelier, Julien; Muccioli, Giulio G

    2016-07-15

    Increasing evidence suggests that lysophosphatidylinositols (LPIs), a subspecies of lysophospholipids, are important endogenous mediators. Although LPIs long remained among the less studied lysophospholipids, the identification of GPR55 as their molecular target sparked a renewed interest in the study of these bioactive lipids. Furthermore, increasing evidence points towards a role for LPIs in cancer development. However, a better understanding of the role and functions of LPIs in physiology and disease requires methods that allow for the quantification of LPI levels in cells and tissues. Because dedicated, efficient methods for quantifying LPIs were missing, we developed and validated an HPLC-ESI-MS method for the quantification of LPI species from tissues. LPIs are extracted from tissues by liquid/liquid extraction, pre-purified by solid-phase extraction, and finally analyzed by HPLC-ESI-MS. We determined the method's specificity and selectivity, established calibration curves, and determined the carry-over (< 2%), LOD and LLOQ (between 0.116-7.82 and 4.62-92.5 pmol on column, respectively), linearity (R² > 0.988), and intermediate precision (CV < 20%), as well as the recovery from tissues (> 80%). We then applied the method to determine the relative abundance of the LPI species in 15 different mouse tissues. Finally, we quantified the absolute LPI levels in six different mouse tissues. We found that while 18:0 LPI represents more than 60% of all the LPI species in the periphery (e.g. liver, gastrointestinal tract, lungs, spleen), it is much less abundant in the central nervous system, where the levels of 20:4 LPI are significantly higher. Thus, this validated HPLC-ESI-MS method for quantifying LPIs represents a powerful tool that will facilitate understanding of the pathophysiological roles of LPIs. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. New validated LC-MS/MS method for the determination of three alkylated adenines in human urine and its application to the monitoring of alkylating agents in cigarette smoke.

    PubMed

    Tian, Yongfeng; Hou, Hongwei; Zhang, Xiaotao; Wang, An; Liu, Yong; Hu, Qingyuan

    2014-09-01

    A highly specific liquid chromatography-tandem mass spectrometry (LC-MS/MS) method was developed for simultaneous determination of urinary N(3)-methyladenine (N(3)-MeA), N(3)-ethyladenine (N(3)-EtA), and N(3)-(2-hydroxyethyl)adenine (N(3)-HOEtA). Chromatographic separation was achieved on a hydrophilic interaction liquid chromatography column, with a mobile phase gradient prepared from aqueous 10 mM ammonium formate-acetonitrile (5:95 v/v, pH 4.0). Quantification of the analytes was done by multiple reaction monitoring using a triple-quadrupole mass spectrometer in positive-ionization mode. The limits of quantification were 0.13, 0.02, and 0.03 ng/mL for N(3)-MeA, N(3)-EtA, and N(3)-HOEtA, respectively. Intraday and interday variations (relative standard deviations) ranged from 0.6 to 1.3 % and from 3.7 to 7.5 %. The recovery ranges of N(3)-MeA, N(3)-EtA, and N(3)-HOEtA in urine were 80.1-97.3 %, 83.3-90.0 %, and 100.0-110.0 %, respectively. The proposed method was successfully applied to urine samples from 251 volunteers including 193 regular smokers and 58 nonsmokers. The results showed that the levels of urinary N(3)-MeA, N(3)-EtA, and N(3)-HOEtA in smokers were significantly higher than those in nonsmokers. Furthermore, the level of urinary N(3)-MeA in smokers was found to be positively correlated with the level of 4-(methylnitrosamino)-1-(3-pyridyl)-1-butanol (r = 0.48, P < 0.001, N = 192). This method is appropriate for routine analysis and accurate quantification of N(3)-MeA, N(3)-EtA, and N(3)-HOEtA. It is also a useful tool for the surveillance of alkylating agent exposure.

  14. Development and validation of a selective, sensitive and stability indicating UPLC-MS/MS method for rapid, simultaneous determination of six process related impurities in darunavir drug substance.

    PubMed

    A, Vijaya Bhaskar Reddy; Yusop, Zulkifli; Jaafar, Jafariah; Aris, Azmi B; Majid, Zaiton A; Umar, Khalid; Talib, Juhaizah

    2016-09-05

    In this study, a sensitive and selective gradient reverse-phase UPLC-MS/MS method was developed for the simultaneous determination of six process-related impurities, viz. Imp-I, Imp-II, Imp-III, Imp-IV, Imp-V, and Imp-VI, in darunavir. The chromatographic separation was performed on an Acquity UPLC BEH C18 (50 mm × 2.1 mm, 1.7 μm) column using gradient elution of acetonitrile-methanol (80:20, v/v) and 5.0 mM ammonium acetate containing 0.01% formic acid at a flow rate of 0.4 mL/min. Both negative and positive electrospray ionization (ESI) modes were operated simultaneously using multiple reaction monitoring (MRM) for the quantification of all six impurities in darunavir. The developed method was fully validated following ICH guidelines with respect to specificity, linearity, limit of detection (LOD), limit of quantification (LOQ), accuracy, precision, robustness, and sample solution stability. The method was able to quantitate Imp-I, Imp-IV, and Imp-V at 0.3 ppm, and Imp-II, Imp-III, and Imp-VI at 0.2 ppm, with respect to 5.0 mg/mL of darunavir. The calibration curves showed good linearity over the concentration range of LOQ to 250% for all six impurities, with correlation coefficients >0.9989 in all cases. The accuracy of the method lay between 89.90% and 104.60% for all six impurities. Finally, the method was successfully applied to three formulation batches of darunavir to determine the above-mentioned impurities; no impurity was found above the LOQ. This method is a good quality control tool for the trace-level quantification of six process-related impurities in darunavir during its synthesis. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. Quantifying hydrologic connectivity with measures from the brain neurosciences - a feasibility study

    NASA Astrophysics Data System (ADS)

    Rinderer, Michael; Ali, Genevieve; Larsen, Laurel

    2017-04-01

    While the concept of connectivity is increasingly applied in hydrology and ecology, little agreement exists on its definition and quantification approaches. In contrast, the neurosciences have developed a systematic conceptualization of connectivity and methods to quantify it. In particular, neuroscientists make a clear distinction between: 1) structural connectivity, which is determined by the anatomy of the brain neural network, 2) functional connectivity, which is based on statistical dependencies between neural signals, and 3) effective connectivity, which allows causal relations to be inferred based on the assumption that "true" interactions occur with a certain time delay. In a similar vein, in hydrology, structural connectivity can be defined as the physical adjacency of landscape elements that is seen as a prerequisite of material transfer, while functional or process connectivity would rather describe interactions or causal relations between spatial adjacency characteristics and temporally varying factors. While hydrologists have suggested methods to derive structural connectivity (SC), the quantification of functional (FC) or effective connectivity (EC) has remained elusive. The goal of the current study was therefore to apply time-series analysis methods from brain neuroscience to quantify EC and FC among groundwater (n = 34) and stream discharge (n = 1) monitoring sites in a 20-ha Swiss catchment where topography is assumed to be a major driver of connectivity. SC was assessed through influence maps that quantify the percentage of flow from an upslope site to a downslope site by applying a multiple flow direction algorithm. FC was assessed by cross-correlation and by total and partial mutual information, while EC was quantified via total and partial entropy, Granger causality, and a phase slope index. Our results showed that many structural connections were also expressed as functional or effective connections, which is reasonable in a catchment with shallow perched groundwater tables. The differentiation between FC and EC measures allowed us to distinguish between hydrological connectivity (i.e., Darcian fluxes of water) and hydraulic connectivity (i.e., pressure wave-driven processes). However, some FC and EC measures also detected the presence of connectivity despite the absence of SC, which highlights the limits of applying brain connectivity measures to hydrology. We therefore conclude that brain neuroscience methods for assessing FC and EC can be powerful tools for assessing hydrological connectivity as long as they are constrained by SC measures.
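
    To make the functional-connectivity idea concrete, below is a minimal sketch of one FC measure named above, lagged cross-correlation between two monitoring time series. It is an illustration on invented data, not the study's implementation; the lag window and all variable names are assumptions.

    ```python
    # Minimal sketch: functional connectivity (FC) between two hydrologic
    # time series via lagged cross-correlation. All data are synthetic.
    import numpy as np

    def max_lagged_correlation(x, y, max_lag=24):
        """Peak Pearson correlation between x and y over lags
        -max_lag..max_lag, plus the lag at which it occurs."""
        best_r, best_lag = 0.0, 0
        for lag in range(-max_lag, max_lag + 1):
            if lag < 0:
                r = np.corrcoef(x[:lag], y[-lag:])[0, 1]
            elif lag > 0:
                r = np.corrcoef(x[lag:], y[:-lag])[0, 1]
            else:
                r = np.corrcoef(x, y)[0, 1]
            if abs(r) > abs(best_r):
                best_r, best_lag = r, lag
        return best_r, best_lag

    # Toy example: a groundwater-level series and a delayed copy of it.
    rng = np.random.default_rng(0)
    x = rng.standard_normal(500).cumsum()
    y = np.roll(x, 3) + 0.1 * rng.standard_normal(500)
    r, lag = max_lagged_correlation(x, y)
    print(f"peak correlation {r:.2f} at lag {lag}")
    ```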

  16. Rapid and simultaneous detection of Salmonella spp., Escherichia coli O157, and Listeria monocytogenes by magnetic capture hybridization and multiplex real-time PCR.

    PubMed

    Carloni, Elisa; Rotundo, Luca; Brandi, Giorgio; Amagliani, Giulia

    2018-05-25

    The application of rapid, specific, and sensitive methods for pathogen detection and quantification is very advantageous in the diagnosis of human pathogens in several applications, including food analysis. The aim of this study was the evaluation of a method for the multiplexed detection and quantification of three significant foodborne pathogenic species (Escherichia coli O157, Salmonella spp., and Listeria monocytogenes). The assay combines specific DNA extraction by multiplex magnetic capture hybridization (mMCH) with multiplex real-time PCR. The amplification assay showed linearity in the range 10⁶-10 genomic units (GU)/PCR for each co-amplified species. The sensitivity corresponded to 1 GU/PCR for E. coli O157 and L. monocytogenes, and 10 GU/PCR for Salmonella spp. The immobilization process and the hybrid capture of the MCH showed good efficiency and reproducibility for all targets, allowing the different nanoparticle types to be combined in equal amounts in the mMCH. MCH and mMCH efficiencies were similar. The detection limit of the method was 10 CFU in samples with individual pathogens and 10² CFU in samples combining the three pathogens in unequal amounts (differences in amounts of 2 or 3 log). In conclusion, this multiplex molecular platform can be applied to determine the presence of target species in food samples after culture enrichment. In this way, this method could be a time-saving and sensitive tool for routine diagnosis.
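
    For illustration, the sketch below shows the generic standard-curve calculation by which real-time PCR counts such as genomic units (GU)/PCR are typically derived: Cq values of a dilution series are regressed against log10 amounts. This is the textbook approach with invented calibrator values, not the authors' exact pipeline.

    ```python
    # Minimal sketch: real-time PCR standard-curve quantification.
    # All Cq values are invented for illustration.
    import numpy as np

    log10_gu = np.array([6, 5, 4, 3, 2, 1])                 # log10 GU/PCR
    cq = np.array([15.1, 18.4, 21.8, 25.2, 28.6, 32.0])     # measured Cq

    m, b = np.polyfit(log10_gu, cq, 1)      # Cq = m*log10(GU) + b
    efficiency = 10 ** (-1.0 / m) - 1.0     # ~100% when m is about -3.32

    def quantify(cq_unknown):
        """Convert a sample Cq to genomic units/PCR via the curve."""
        return 10 ** ((cq_unknown - b) / m)

    print(f"amplification efficiency: {efficiency:.0%}")
    print(f"estimated load at Cq 23.5: {quantify(23.5):.0f} GU/PCR")
    ```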

  17. Extended quantification of the generalized recurrence plot

    NASA Astrophysics Data System (ADS)

    Riedl, Maik; Marwan, Norbert; Kurths, Jürgen

    2016-04-01

    The generalized recurrence plot is a modern tool for the quantification of complex spatial patterns. Its application spans the analysis of trabecular bone structures, Turing structures, turbulent spatial plankton patterns, and fractals. But it has also been successfully applied to the description of spatio-temporal dynamics and the detection of regime shifts, such as in the complex Ginzburg-Landau equation. The recurrence plot based determinism is a central measure in this framework, quantifying the level of regularity in temporal and spatial structures. We extend this measure for the generalized recurrence plot by considering symmetry operations beyond simple translation. It is tested not only on two-dimensional regular patterns and noise but also on complex spatial patterns reconstructing the parameter space of the complex Ginzburg-Landau equation. The extended version of the determinism yields values which are consistent with the original recurrence plot approach. Furthermore, the proposed method allows a split of the determinism into parts based on laminar and non-laminar regions of the two-dimensional pattern of the complex Ginzburg-Landau equation. A comparison of these parts with a standard method of image classification, the co-occurrence matrix approach, shows differences especially in the description of patterns associated with turbulence. In that case, it seems that the extended version of the determinism allows a distinction between phase turbulence and defect turbulence by means of their spatial patterns. This ability of the proposed method promises new insights into other systems with turbulent dynamics coming from climatology, biology, ecology, and the social sciences, for example.
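
    For readers unfamiliar with the determinism (DET) measure the study extends, the following minimal sketch computes DET for an ordinary time-series recurrence plot; the paper's generalized spatial version with extra symmetry operations goes beyond this. The signal, threshold, and minimum line length are illustrative, and the line of identity is included for simplicity.

    ```python
    # Minimal sketch: determinism (DET) from a recurrence plot.
    import numpy as np

    def recurrence_matrix(x, eps):
        """R[i, j] = 1 if |x_i - x_j| < eps (scalar series, no embedding)."""
        return (np.abs(x[:, None] - x[None, :]) < eps).astype(int)

    def determinism(R, lmin=2):
        """Fraction of recurrent points lying on diagonal lines >= lmin."""
        n = R.shape[0]
        line_points = 0
        for k in range(-(n - 1), n):            # scan every diagonal
            run = 0
            for v in np.append(np.diagonal(R, offset=k), 0):
                if v:
                    run += 1
                else:
                    if run >= lmin:
                        line_points += run
                    run = 0
        return line_points / R.sum()

    x = np.sin(np.linspace(0, 20 * np.pi, 400))  # regular signal
    R = recurrence_matrix(x, eps=0.1)
    print(f"DET = {determinism(R):.2f}")         # close to 1 for a sine
    ```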

  18. Quantification of collagen distributions in rat hyaline and fibro cartilages based on second harmonic generation imaging

    NASA Astrophysics Data System (ADS)

    Zhu, Xiaoqin; Liao, Chenxi; Wang, Zhenyu; Zhuo, Shuangmu; Liu, Wenge; Chen, Jianxin

    2016-10-01

    Hyaline cartilage is a semitransparent tissue composed of proteoglycan and thicker type II collagen fibers, while fibrocartilage contains large bundles of type I collagen besides other territorial matrix and chondrocytes. It is reported that the meniscus (fibrocartilage) has a greater capacity to regenerate and close a wound compared to articular cartilage (hyaline cartilage), and fibrocartilage often replaces the type II collagen-rich hyaline cartilage following trauma, leading to scar tissue that is composed of rigid type I collagen. The visualization and quantification of the collagen fibrillar meshwork is important for understanding the role of fibril reorganization during the healing process and how different types of cartilage contribute to wound closure. In this study, second harmonic generation (SHG) microscopy was applied to image the articular and meniscus cartilage, and a textural analysis was developed to quantify the collagen distribution. High-resolution images were achieved based on the SHG signal from collagen within fresh specimens, and detailed observations of tissue morphology and microstructural distribution were obtained without shrinkage or distortion. Textural analysis of the SHG images confirmed that collagen in fibrocartilage was significantly coarser than collagen in hyaline cartilage (p < 0.01). Our results show that each type of cartilage has different structural features, which may significantly contribute to pathology when damaged. Our findings demonstrate that SHG microscopy holds potential as a clinically relevant diagnostic tool for imaging degenerative tissues or assessing wound repair following cartilage injury.
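
    As a hint of what such a textural analysis can look like in practice, the sketch below computes gray-level co-occurrence matrix (GLCM) features of the kind commonly used to compare texture coarseness. It assumes scikit-image >= 0.19 (graycomatrix/graycoprops) and uses a synthetic image in place of a real SHG micrograph; the abstract does not specify which texture features were used.

    ```python
    # Minimal sketch: GLCM texture features on a placeholder image.
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    rng = np.random.default_rng(1)
    img = (rng.random((128, 128)) * 255).astype(np.uint8)  # stand-in image

    # Co-occurrence of gray levels at distance 1 px in four directions.
    glcm = graycomatrix(img, distances=[1],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=256, symmetric=True, normed=True)

    # Contrast and homogeneity are common proxies for coarseness: coarser
    # fiber bundles tend to raise contrast and lower homogeneity.
    for prop in ("contrast", "homogeneity", "energy"):
        print(prop, graycoprops(glcm, prop).mean())
    ```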

  19. Risk assessment of oil spills along the Mediterranean coast: A sensitivity analysis of the choice of hazard quantification.

    PubMed

    Al Shami, A; Harik, G; Alameddine, I; Bruschi, D; Garcia, D Astiaso; El-Fadel, M

    2017-01-01

    Oil pollution in the Mediterranean represents a serious threat to the coastal environment. Quantifying the risks associated with a potential spill is often based on results generated from oil spill models. In this study, MEDSLIK-II, an EU-funded and endorsed oil spill model, is used to assess potential oil spill scenarios at four pilot areas located along the northern, eastern, and southern Mediterranean shoreline, providing a wide range of spill conditions and coastal geomorphological characteristics. Oil spill risk at the four pilot areas was quantified as a function of three oil pollution metrics: the susceptibility of oiling per beach segment, the average volume of oiling expected in the event of beaching, and the average oil beaching time. The results show that while the three pollution metrics tend to agree in their hazard characterization when the shoreline morphology is simple, considerable differences in the quantification of the associated hazard are possible under complex coastal morphologies. These differences proved to greatly alter the evaluation of environmental risks. An integrative hazard index is proposed that encompasses the three simulated pollution metrics. By integrating it with the unified oil spill risk assessment tool developed by the Regional Marine Pollution Emergency Response Centre for the Mediterranean (REMPEC), the index can be applied universally across the Mediterranean basin. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. Superposition Quantification

    NASA Astrophysics Data System (ADS)

    Chang, Li-Na; Luo, Shun-Long; Sun, Yuan

    2017-11-01

    The principle of superposition is universal and lies at the heart of quantum theory. Although superposition has occupied a central and pivotal place ever since the inception of quantum mechanics a century ago, rigorous and systematic studies of its quantification have attracted significant interest only in recent years, and many related problems remain to be investigated. In this work we introduce a figure of merit which quantifies superposition from an intuitive and direct perspective, investigate its fundamental properties, connect it to some coherence measures, illustrate it through several examples, and apply it to analyze wave-particle duality.

    Supported by Science Challenge Project under Grant No. TZ2016002; Laboratory of Computational Physics, Institute of Applied Physics and Computational Mathematics, Beijing; Key Laboratory of Random Complex Structures and Data Science, Chinese Academy of Sciences, Grant No. 2008DP173182

  1. Recurrence quantification analysis of electroencephalograph signals during standard tasks of Waterloo-Stanford group scale of hypnotic susceptibility.

    PubMed

    Yargholi, Elahe'; Nasrabadi, Ali Motie

    2015-01-01

    The purpose of this study was to apply recurrence quantification analysis (RQA) to hypnotic electroencephalograph (EEG) signals recorded after hypnotic induction while subjects performed standard tasks of the Waterloo-Stanford Group Scale (WSGS) of hypnotic susceptibility. Recurrence quantifiers were then used to analyse the influence of hypnotic depth on the EEGs. By applying this method, the capability of the tasks to distinguish subjects of different hypnotizability levels was determined. In addition, medium-hypnotizable subjects showed the highest disposition to be induced by the hypnotist. Similarities between the brain's governing dynamics during tasks of the same type were also observed. The present study offers two notable innovations: investigating the EEGs of hypnotized subjects while they performed mental tasks of the WSGS, and applying RQA to hypnotic EEGs.

  2. Methodological aspects of multicenter studies with quantitative PET.

    PubMed

    Boellaard, Ronald

    2011-01-01

    Quantification of whole-body FDG PET studies is affected by many physiological and physical factors. Much of the variability in reported standardized uptake value (SUV) data seen in the literature results from the variability in methodology applied among these studies, i.e., due to the use of different scanners, acquisition and reconstruction settings, region of interest strategies, SUV normalization, and/or correction methods. To date, the variability in applied methodology prohibits a proper comparison and exchange of quantitative FDG PET data. Consequently, the promising role of quantitative PET has been demonstrated in several monocentric studies, but these published results cannot be used directly as a guideline for clinical (multicenter) trials performed elsewhere. In this chapter, the main causes affecting whole-body FDG PET quantification and strategies to minimize its inter-institute variability are addressed.

  3. Design of RNA splicing analysis null models for post hoc filtering of Drosophila head RNA-Seq data with the splicing analysis kit (Spanki)

    PubMed Central

    2013-01-01

    Background The production of multiple transcript isoforms from one gene is a major source of transcriptome complexity. RNA-Seq experiments, in which transcripts are converted to cDNA and sequenced, allow the resolution and quantification of alternative transcript isoforms. However, methods to analyze splicing are underdeveloped and errors resulting in incorrect splicing calls occur in every experiment. Results We used RNA-Seq data to develop sequencing and aligner error models. By applying these error models to known input from simulations, we found that errors result from false alignment to minor splice motifs and antisense strands, shifted junction positions, paralog joining, and repeat-induced gaps. By using a series of quantitative and qualitative filters, we eliminated diagnosed errors in the simulation, and applied this to RNA-Seq data from Drosophila melanogaster heads. We used high-confidence junction detections to specifically interrogate local splicing differences between transcripts. This method outperformed commonly used RNA-Seq methods in identifying known alternative splicing events in the Drosophila sex determination pathway. We describe a flexible software package to perform these tasks called Splicing Analysis Kit (Spanki), available at http://www.cbcb.umd.edu/software/spanki. Conclusions Splice-junction centric analysis of RNA-Seq data provides advantages in specificity for detection of alternative splicing. Our software provides tools to better understand error profiles in RNA-Seq data and improve inference from this new technology. The splice-junction centric approach that this software enables will provide more accurate estimates of differentially regulated splicing than current tools. PMID:24209455

  4. Design of RNA splicing analysis null models for post hoc filtering of Drosophila head RNA-Seq data with the splicing analysis kit (Spanki).

    PubMed

    Sturgill, David; Malone, John H; Sun, Xia; Smith, Harold E; Rabinow, Leonard; Samson, Marie-Laure; Oliver, Brian

    2013-11-09

    The production of multiple transcript isoforms from one gene is a major source of transcriptome complexity. RNA-Seq experiments, in which transcripts are converted to cDNA and sequenced, allow the resolution and quantification of alternative transcript isoforms. However, methods to analyze splicing are underdeveloped and errors resulting in incorrect splicing calls occur in every experiment. We used RNA-Seq data to develop sequencing and aligner error models. By applying these error models to known input from simulations, we found that errors result from false alignment to minor splice motifs and antisense strands, shifted junction positions, paralog joining, and repeat-induced gaps. By using a series of quantitative and qualitative filters, we eliminated diagnosed errors in the simulation, and applied this to RNA-Seq data from Drosophila melanogaster heads. We used high-confidence junction detections to specifically interrogate local splicing differences between transcripts. This method outperformed commonly used RNA-Seq methods in identifying known alternative splicing events in the Drosophila sex determination pathway. We describe a flexible software package to perform these tasks called Splicing Analysis Kit (Spanki), available at http://www.cbcb.umd.edu/software/spanki. Splice-junction centric analysis of RNA-Seq data provides advantages in specificity for detection of alternative splicing. Our software provides tools to better understand error profiles in RNA-Seq data and improve inference from this new technology. The splice-junction centric approach that this software enables will provide more accurate estimates of differentially regulated splicing than current tools.
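
    To illustrate the flavor of such post hoc filtering, here is a toy sketch that keeps splice junctions only if they pass simple quantitative and qualitative checks. The junction records, field names, and thresholds are invented; Spanki's actual filters are richer.

    ```python
    # Minimal sketch: post hoc filtering of splice-junction calls.
    junctions = [
        {"id": "chr2L:123-456", "reads": 25, "entropy": 3.1, "annotated_motif": True},
        {"id": "chr3R:789-901", "reads": 2, "entropy": 0.4, "annotated_motif": False},
    ]

    def passes_filters(j, min_reads=5, min_entropy=1.0):
        """Quantitative: read support and alignment-offset entropy.
        Qualitative: junction must use an annotated splice motif."""
        return (j["reads"] >= min_reads
                and j["entropy"] >= min_entropy
                and j["annotated_motif"])

    high_confidence = [j["id"] for j in junctions if passes_filters(j)]
    print(high_confidence)  # only the well-supported junction survives
    ```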

  5. Nitrous oxide emissions from agricultural landscapes: quantification tools, policy development, and opportunities for improved management

    NASA Astrophysics Data System (ADS)

    Tonitto, C.; Gurwick, N. P.

    2012-12-01

    Policy initiatives to reduce greenhouse gas (GHG) emissions have promoted the development of agricultural management protocols to increase SOC storage and reduce GHG emissions. We review approaches for quantifying N2O flux from agricultural landscapes. We summarize the temporal and spatial extent of observations across representative soil classes, climate zones, cropping systems, and management scenarios. We review applications of simulation and empirical modeling approaches and compare validation outcomes across modeling tools. Subsequently, we review current model application in agricultural management protocols. In particular, we compare approaches adapted for compliance with the California Global Warming Solutions Act, the Alberta Climate Change and Emissions Management Act, and by the American Carbon Registry. In the absence of regional data to drive model development, policies that require GHG quantification often use simple empirical models based on highly aggregated data of N2O flux as a function of applied N (Tier 1 models according to IPCC categorization). As participants in the development of protocols that could be used in carbon offset markets, we observed that stakeholders outside of the biogeochemistry community favored outcomes from simulation modeling (Tier 3) rather than empirical modeling (Tier 2). In contrast, scientific advisors were more accepting of outcomes based on statistical approaches that rely on local observations, and their views sometimes swayed policy practitioners over the course of policy development. Both Tier 2 and Tier 3 approaches have been implemented in current policy development, and it is important that the strengths and limitations of both approaches, in the face of available data, be well understood by those drafting and adopting policies and protocols. The reliability of all models is contingent on sufficient observations for model development and validation. Simulation models applied without site calibration generally yield poor validation results, a point that particularly needs to be emphasized during policy development. For cases where sufficient calibration data are available, simulation models have demonstrated the ability to capture seasonal patterns of N2O flux. The reliability of statistical models likewise depends on data availability. Because soil moisture is a significant driver of N2O flux, the best outcomes occur when empirical models are applied to systems with relevant soil classification and climate. The structure of current carbon offset protocols is not well aligned with a budgetary approach to GHG accounting. Current protocols credit field-scale reductions in N2O flux as a result of reduced fertilizer use, but do not award farmers credit for reductions in CO2 emissions resulting from reduced production of synthetic N fertilizer. Achieving the greatest GHG emission reductions through reduced synthetic N production and reduced landscape N saturation requires a re-envisioning of the agricultural landscape to include cropping systems with legume and manure N sources. The current protocol structure thus restricts credits to simple reductions of N applied in conventional systems rather than rewarding cropping systems that promote higher recycling and retention of N.
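
    For context, the "Tier 1" class of models mentioned above reduces to a one-line calculation. The sketch below uses the IPCC default direct emission factor (EF1 = 0.01 kg N2O-N per kg N applied); consult the current IPCC guidelines before using it for anything real.

    ```python
    # Minimal sketch: IPCC Tier 1 style direct soil N2O estimate.
    def tier1_direct_n2o(n_applied_kg_n, ef1=0.01):
        """Direct N2O (kg) from applied fertilizer N (kg N).

        ef1: default direct emission factor; 44/28 converts N2O-N to N2O.
        """
        return n_applied_kg_n * ef1 * 44.0 / 28.0

    print(f"{tier1_direct_n2o(150.0):.2f} kg N2O per 150 kg applied N")
    ```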

  6. Validated reverse transcription droplet digital PCR serves as a higher order method for absolute quantification of Potato virus Y strains.

    PubMed

    Mehle, Nataša; Dobnik, David; Ravnikar, Maja; Pompe Novak, Maruša

    2018-05-03

    RNA viruses have a great potential for high genetic variability and rapid evolution, generated by mutation and recombination under selection pressure. This is also the case for Potato virus Y (PVY), which comprises a high diversity of different recombinant and non-recombinant strains. Consequently, it is hard to develop a reverse transcription real-time quantitative PCR (RT-qPCR) assay with the same amplification efficiency for all PVY strains, which would enable their balanced quantification; this is especially needed in mixed infections and other studies of pathogenesis. To achieve this, we initially transferred the universal PVY RT-qPCR assay to a reverse transcription droplet digital PCR (RT-ddPCR) format. RT-ddPCR is an absolute quantification method in which a calibration curve is not needed, and it is less prone to inhibitors. The RT-ddPCR developed and validated in this study achieved a dynamic range of quantification over five orders of magnitude and, in terms of sensitivity, was comparable to, or even better than, RT-qPCR, while showing lower measurement variability. We have shown that RT-ddPCR can be used as a reference tool for the evaluation of different RT-qPCR assays. In addition, it can be used for the quantification of RNA based on in-house reference materials that can then be used as calibrators in diagnostic laboratories.
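
    The calibration-free character of ddPCR comes from a simple Poisson calculation on droplet counts, sketched below. The nominal droplet volume (~0.85 nL, typical of common droplet generators) and the counts are assumptions for illustration.

    ```python
    # Minimal sketch: Poisson-based absolute quantification in ddPCR.
    import math

    def ddpcr_copies_per_ul(positive, total, droplet_vol_nl=0.85):
        """Target concentration (copies/uL) from droplet counts.

        lambda = -ln(negative fraction) is the mean copies per droplet
        under the Poisson model.
        """
        lam = -math.log((total - positive) / total)
        return lam / (droplet_vol_nl * 1e-3)   # nL -> uL

    print(f"{ddpcr_copies_per_ul(4200, 15000):.0f} copies/uL")
    ```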

  7. Advances in targeted proteomics and applications to biomedical research

    PubMed Central

    Shi, Tujin; Song, Ehwang; Nie, Song; Rodland, Karin D.; Liu, Tao; Qian, Wei-Jun; Smith, Richard D.

    2016-01-01

    Targeted proteomics has emerged as a powerful protein quantification tool in systems biology and biomedical research, and increasingly for clinical applications. The most widely used targeted proteomics approach, selected reaction monitoring (SRM), also known as multiple reaction monitoring (MRM), can be used for quantification of cellular signaling networks and preclinical verification of candidate protein biomarkers. As an extension of our previous review on advances in SRM sensitivity, herein we review recent advances in methods and technology for further enhancing SRM sensitivity (from 2012 to present), highlighting broad biomedical applications in human bodily fluids, tissues, and cell lines. Furthermore, we also review two recently introduced targeted proteomics approaches, parallel reaction monitoring (PRM) and data-independent acquisition (DIA) with targeted data extraction on fast-scanning high-resolution accurate-mass (HR/AM) instruments. Such HR/AM targeted quantification, by monitoring all target product ions, effectively addresses SRM limitations in specificity and multiplexing, whereas compared to SRM, PRM and DIA are still in their infancy with a limited number of applications. Thus, for HR/AM targeted quantification we focus our discussion on method development, data processing and analysis, and its advantages and limitations in targeted proteomics. Finally, general perspectives on the potential of achieving both high sensitivity and high sample throughput for large-scale quantification of hundreds of target proteins are discussed. PMID:27302376

  8. AQuA: An Automated Quantification Algorithm for High-Throughput NMR-Based Metabolomics and Its Application in Human Plasma.

    PubMed

    Röhnisch, Hanna E; Eriksson, Jan; Müllner, Elisabeth; Agback, Peter; Sandström, Corine; Moazzami, Ali A

    2018-02-06

    A key limiting step for high-throughput NMR-based metabolomics is the lack of rapid and accurate tools for absolute quantification of many metabolites. We developed, implemented, and evaluated an algorithm, AQuA (Automated Quantification Algorithm), for targeted metabolite quantification from complex ¹H NMR spectra. AQuA operates on spectral data extracted from a library consisting of one standard calibration spectrum for each metabolite. It uses one preselected NMR signal per metabolite for determining absolute concentrations and does so by effectively accounting for interferences caused by other metabolites. AQuA was implemented and evaluated using experimental NMR spectra from human plasma. The accuracy of AQuA was tested and confirmed in comparison with a manual spectral fitting approach using the ChenomX software, in which 61 out of 67 metabolites quantified in 30 human plasma spectra showed a goodness-of-fit (r²) close to or exceeding 0.9 between the two approaches. In addition, three quality indicators generated by AQuA, namely occurrence, interference, and positional deviation, were studied. These quality indicators permit evaluation of the results each time the algorithm is operated. The efficiency was tested and confirmed by implementing AQuA for quantification of 67 metabolites in a large data set comprising 1342 experimental spectra from human plasma, in which the whole computation took less than 1 s.
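
    The core idea of correcting one preselected signal per metabolite for interference from others can be pictured as a small linear-algebra problem, sketched below. The 3x3 system and all numbers are invented; AQuA's actual algorithm is more involved than this.

    ```python
    # Minimal sketch: interference-corrected quantification as least squares.
    import numpy as np

    # Rows: preselected signal positions; columns: metabolites. Entry (i, j)
    # is metabolite j's response at position i for a unit-concentration
    # standard, taken from single-standard calibration spectra.
    A = np.array([[1.00, 0.12, 0.00],
                  [0.05, 1.00, 0.08],
                  [0.00, 0.15, 1.00]])

    y = np.array([0.62, 1.31, 0.88])          # measured signal heights

    conc, *_ = np.linalg.lstsq(A, y, rcond=None)
    print("estimated concentrations:", np.round(conc, 3))
    ```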

  9. Advances in targeted proteomics and applications to biomedical research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shi, Tujin; Song, Ehwang; Nie, Song

    Targeted proteomics has emerged as a powerful protein quantification tool in systems biology and biomedical research, and increasingly for clinical applications. The most widely used targeted proteomics approach, selected reaction monitoring (SRM), also known as multiple reaction monitoring (MRM), can be used for quantification of cellular signaling networks and preclinical verification of candidate protein biomarkers. As an extension of our previous review on advances in SRM sensitivity (Shi et al., Proteomics, 12, 1074-1092, 2012), herein we review recent advances in methods and technology for further enhancing SRM sensitivity (from 2012 to present), highlighting broad biomedical applications in human bodily fluids, tissues, and cell lines. Furthermore, we also review two recently introduced targeted proteomics approaches, parallel reaction monitoring (PRM) and data-independent acquisition (DIA) with targeted data extraction on fast-scanning high-resolution accurate-mass (HR/AM) instruments. Such HR/AM targeted quantification, by monitoring all target product ions, effectively addresses SRM limitations in specificity and multiplexing, whereas compared to SRM, PRM and DIA are still in their infancy with a limited number of applications. Thus, for HR/AM targeted quantification we focus our discussion on method development, data processing and analysis, and its advantages and limitations in targeted proteomics. Finally, general perspectives on the potential of achieving both high sensitivity and high sample throughput for large-scale quantification of hundreds of target proteins are discussed.

  10. Quaternary ammonium isobaric tag for a relative and absolute quantification of peptides.

    PubMed

    Setner, Bartosz; Stefanowicz, Piotr; Szewczuk, Zbigniew

    2018-02-01

    Isobaric labeling quantification of peptides has become a method of choice for mass spectrometry-based proteomics studies. However, despite the wide variety of commercially available isobaric tags, none of the currently available methods offers a significant improvement in detection sensitivity during the MS experiment. Recently, many strategies have been applied to increase the ionization efficiency of peptides, involving chemical modifications that introduce a fixed quaternary ammonium charge. Here, we present a novel quaternary ammonium-based isobaric tag for relative and absolute quantification of peptides (QAS-iTRAQ 2-plex). Upon collisional activation, a new stable benzylic-type cationic reporter ion is liberated from the tag. Deuterium atoms were used to offset the differential masses of the reporter group. We tested the applicability of the QAS-iTRAQ 2-plex reagent on a series of model peptides as well as a bovine serum albumin tryptic digest. The results obtained suggest the usefulness of this isobaric ionization tag for relative and absolute quantification of peptides. Copyright © 2017 John Wiley & Sons, Ltd.

  11. Evaluation of sketch-level VMT quantification tools : strategic growth council grant programs evaluation support project.

    DOT National Transportation Integrated Search

    2017-08-01

    The State of California has enacted ambitious policies that aim to reduce the state's greenhouse gas (GHG) emissions. Some of these policies focus on reducing the amount of driving throughout the state, measured in vehicle miles traveled (VMT), giv...

  12. Multiplex quantification of protein toxins in human biofluids and food matrices using immunoextraction and high-resolution targeted mass spectrometry.

    PubMed

    Dupré, Mathieu; Gilquin, Benoit; Fenaille, François; Feraudet-Tarisse, Cécile; Dano, Julie; Ferro, Myriam; Simon, Stéphanie; Junot, Christophe; Brun, Virginie; Becher, François

    2015-08-18

    The development of rapid methods for unambiguous identification and precise quantification of protein toxins in various matrices is essential for public health surveillance. Analytical strategies classically rely on sensitive immunological assays, but mass spectrometry constitutes an attractive complementary approach thanks to its direct measurement and protein characterization capabilities. We developed here an innovative multiplex immuno-LC-MS/MS method for the simultaneous and specific quantification of three potential biological warfare agents, ricin, staphylococcal enterotoxin B, and epsilon toxin, in complex human biofluids and food matrices. At least 7 peptides were targeted for each toxin (43 peptides in total) with a quadrupole-Orbitrap high-resolution instrument for exquisite detection specificity. Quantification was performed using stable isotope-labeled toxin standards spiked early into the sample. Lower limits of quantification were determined at or close to 1 ng·mL⁻¹. The whole process was successfully applied to the quantitative analysis of toxins in complex samples such as milk, human urine, and plasma. Finally, we report new data on toxin stability, with no evidence of toxin degradation in milk over a 48 h time frame, allowing relevant quantitative toxin analysis for samples collected in this time range.
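
    The quantification principle behind spiking labeled standards early into the sample is classical isotope dilution, sketched below with invented numbers: the analyte concentration follows from the light/heavy peak-area ratio and the known spiked amount.

    ```python
    # Minimal sketch: isotope-dilution quantification from a peak-area ratio.
    def isotope_dilution(area_light, area_heavy, spiked_ng_per_ml):
        """Analyte concentration (ng/mL) from the light/heavy area ratio."""
        return (area_light / area_heavy) * spiked_ng_per_ml

    # A light/heavy ratio of 0.8 with a 5 ng/mL spike implies 4 ng/mL.
    print(isotope_dilution(8.0e5, 1.0e6, 5.0), "ng/mL")
    ```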

  13. Precision and accuracy of clinical quantification of myocardial blood flow by dynamic PET: A technical perspective.

    PubMed

    Moody, Jonathan B; Lee, Benjamin C; Corbett, James R; Ficaro, Edward P; Murthy, Venkatesh L

    2015-10-01

    A number of exciting advances in PET/CT technology and improvements in methodology have recently converged to enhance the feasibility of routine clinical quantification of myocardial blood flow and flow reserve. Recent promising clinical results are pointing toward an important role for myocardial blood flow in the care of patients. Absolute blood flow quantification can be a powerful clinical tool, but its utility will depend on maintaining precision and accuracy in the face of numerous potential sources of methodological error. Here we review recent data and highlight the impact of PET instrumentation, image reconstruction, and quantification methods, and we emphasize ⁸²Rb cardiac PET, which currently has the widest clinical application. It will be apparent that more data are needed, particularly in relation to newer PET technologies, as well as clinical standardization of PET protocols and methods. We provide recommendations for the methodological factors considered here. At present, myocardial flow reserve appears to be remarkably robust to various methodological errors; however, with greater attention to and more detailed understanding of these sources of error, the clinical benefits of stress-only blood flow measurement may eventually be more fully realized.

  14. Multiplex quantification of four DNA targets in one reaction with Bio-Rad droplet digital PCR system for GMO detection

    NASA Astrophysics Data System (ADS)

    Dobnik, David; Štebih, Dejan; Blejec, Andrej; Morisset, Dany; Žel, Jana

    2016-10-01

    The advantages of digital PCR technology are by now well documented. One way to achieve better cost efficiency of the technique is to use it in a multiplexing strategy. Droplet digital PCR platforms, which include two fluorescence filters, support at least duplex reactions, and with some development and optimization higher multiplexing is possible. The present study not only shows the development of multiplex assays in droplet digital PCR, but also presents a first thorough evaluation of several parameters in such multiplex digital PCR. Two 4-plex assays were developed for quantification of 8 different DNA targets (7 genetically modified maize events and a maize endogene). Per assay, two of the targets were labelled with one fluorophore and two with another. As current analysis software does not support analysis beyond duplex reactions, a new R- and Shiny-based web application analysis tool (http://bit.ly/ddPCRmulti) was developed that automates the analysis of 4-plex results. In conclusion, the two developed multiplex assays are suitable for quantification of GMO maize events, and the same approach can be used in any other field with a need for accurate and reliable quantification of multiple DNA targets.

  15. Multiplex quantification of four DNA targets in one reaction with Bio-Rad droplet digital PCR system for GMO detection.

    PubMed

    Dobnik, David; Štebih, Dejan; Blejec, Andrej; Morisset, Dany; Žel, Jana

    2016-10-14

    The advantages of digital PCR technology are by now well documented. One way to achieve better cost efficiency of the technique is to use it in a multiplexing strategy. Droplet digital PCR platforms, which include two fluorescence filters, support at least duplex reactions, and with some development and optimization higher multiplexing is possible. The present study not only shows the development of multiplex assays in droplet digital PCR, but also presents a first thorough evaluation of several parameters in such multiplex digital PCR. Two 4-plex assays were developed for quantification of 8 different DNA targets (7 genetically modified maize events and a maize endogene). Per assay, two of the targets were labelled with one fluorophore and two with another. As current analysis software does not support analysis beyond duplex reactions, a new R- and Shiny-based web application analysis tool (http://bit.ly/ddPCRmulti) was developed that automates the analysis of 4-plex results. In conclusion, the two developed multiplex assays are suitable for quantification of GMO maize events, and the same approach can be used in any other field with a need for accurate and reliable quantification of multiple DNA targets.

  16. Parsing and Quantification of Raw Orbitrap Mass Spectrometer Data Using RawQuant.

    PubMed

    Kovalchik, Kevin A; Moggridge, Sophie; Chen, David D Y; Morin, Gregg B; Hughes, Christopher S

    2018-06-01

    Effective analysis of protein samples by mass spectrometry (MS) requires careful selection and optimization of a range of experimental parameters. As the output from the primary detection device, the "raw" MS data file can be used to gauge the success of a given sample analysis. However, the closed-source nature of the standard raw MS file can complicate effective parsing of the data contained within. To ease and increase the range of analyses possible, the RawQuant tool was developed to enable parsing of raw MS files derived from Thermo Orbitrap instruments to yield meta and scan data in an openly readable text format. RawQuant can be commanded to export user-friendly files containing MS1, MS2, and MS3 metadata as well as matrices of quantification values based on isobaric tagging approaches. In this study, the utility of RawQuant is demonstrated in several scenarios: (1) reanalysis of shotgun proteomics data for the identification of the human proteome, (2) reanalysis of experiments utilizing isobaric tagging for whole-proteome quantification, and (3) analysis of a novel bacterial proteome and synthetic peptide mixture for assessing quantification accuracy when using isobaric tags. Together, these analyses successfully demonstrate RawQuant for the efficient parsing and quantification of data from raw Thermo Orbitrap MS files acquired in a range of common proteomics experiments. In addition, the individual analyses using RawQuant highlight parametric considerations in the different experimental sets and suggest targetable areas to improve depth of coverage in identification-focused studies and quantification accuracy when using isobaric tags.

  17. Plasma protein absolute quantification by nano-LC Q-TOF UDMSE for clinical biomarker verification

    PubMed Central

    Ilies, Maria; Iuga, Cristina Adela; Loghin, Felicia; Dhople, Vishnu Mukund; Hammer, Elke

    2017-01-01

    Background and aims Proteome-based biomarker studies target proteins that could serve as diagnostic, prognostic, and predictive molecules. In the clinical routine, immunoassays are currently used for the absolute quantification of such biomarkers, with the major limitation that only one molecule can be targeted per assay. The aim of our study was to test a mass spectrometry based absolute quantification method for the verification of plasma protein sets which might serve as reliable biomarker panels for clinical practice. Methods Six EDTA plasma samples were analyzed after tryptic digestion using a high-throughput data-independent acquisition nano-LC Q-TOF UDMSE proteomics approach. Synthetic Escherichia coli standard peptides were spiked into each sample for the absolute quantification. Data analysis was performed using ProgenesisQI v2.0 software (Waters Corporation). Results Our method ensured absolute quantification of 242 non-redundant plasma proteins in a single-run analysis, covering a dynamic range of 10⁵. 86% of the quantified proteins were classical plasma proteins. The overall median coefficient of variation was 0.36, while a set of 63 proteins was found to be highly stable. Absolute protein concentrations strongly correlated with values reviewed in the literature. Conclusions Nano-LC Q-TOF UDMSE proteomic analysis can be used for a simple and rapid determination of absolute amounts of plasma proteins. A large number of plasma proteins could be analyzed, and a wide dynamic range was covered with a low coefficient of variation at the protein level. The method proved to be a reliable tool for the quantification of protein panels for biomarker verification in clinical practice. PMID:29151793

  18. Stereomicroscopic imaging technique for the quantification of cold flow in drug-in-adhesive type of transdermal drug delivery systems.

    PubMed

    Krishnaiah, Yellela S R; Katragadda, Usha; Khan, Mansoor A

    2014-05-01

    Cold flow is a phenomenon occurring in drug-in-adhesive type transdermal drug delivery systems (DIA-TDDS) because of the migration of the DIA coat beyond the edge. Excessive cold flow can affect their therapeutic effectiveness, make removal of the DIA-TDDS from the pouch difficult, and potentially decrease the available dose if any drug remains adhered to the pouch. There are no compendial or noncompendial methods available for quantification of this critical quality attribute. The objective was to develop a method for quantification of cold flow using a stereomicroscopic imaging technique. Cold flow was induced by applying 1 kg force on punched-out samples of a marketed estradiol DIA-TDDS (model product) stored at 25°C, 32°C, and 40°C/60% relative humidity (RH) for 1, 2, or 3 days. At the end of the testing period, the dimensional change in the area of the DIA-TDDS samples was measured using image analysis software and expressed as percent of cold flow. The percent of cold flow significantly decreased (p < 0.001) with increasing size of the punched-out DIA-TDDS samples and increased (p < 0.001) with increasing cold flow induction temperature and time. This first-ever report suggests that the dimensional change in the area of punched-out samples stored at 32°C/60%RH for 2 days under 1 kg force could be used for quantification of cold flow in DIA-TDDS. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.
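
    The metric itself is a simple relative change in area, as the sketch below illustrates with invented numbers; in practice the two areas would come from the image-analysis software applied to the stereomicroscopic images.

    ```python
    # Minimal sketch: percent cold flow as relative area increase.
    def percent_cold_flow(area_initial_mm2, area_final_mm2):
        """Dimensional change in area, expressed as percent of cold flow."""
        return 100.0 * (area_final_mm2 - area_initial_mm2) / area_initial_mm2

    print(f"{percent_cold_flow(314.2, 334.6):.1f} % cold flow")
    ```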

  19. Overall Key Performance Indicator to Optimizing Operation of High-Pressure Homogenizers for a Reliable Quantification of Intracellular Components in Pichia pastoris.

    PubMed

    Garcia-Ortega, Xavier; Reyes, Cecilia; Montesinos, José Luis; Valero, Francisco

    2015-01-01

    The most commonly used cell disruption procedures may lack reproducibility, which introduces significant errors into the quantification of intracellular components. In this work, an approach consisting of the definition of an overall key performance indicator (KPI) was implemented for a lab-scale high-pressure homogenizer (HPH) in order to determine the disruption settings that allow the reliable quantification of a wide range of intracellular components. This KPI was based on the combination of three independent reporting indicators: decrease of absorbance, release of total protein, and release of alkaline phosphatase activity. The yeast Pichia pastoris growing on methanol was selected as the model microorganism because it presents an important widening of the cell wall, requiring more severe disruption methods and operating conditions than Escherichia coli and Saccharomyces cerevisiae. From the outcome of the reporting indicators, the cell disruption efficiency achieved using the HPH was about fourfold higher than with other standard lab-scale cell disruption methodologies, such as bead milling or cell permeabilization. The approach was also applied to a pilot-plant-scale HPH, validating the methodology in a scale-up of the disruption process. This straightforward approach for evaluating the efficacy of a disruption procedure or equipment can easily be applied to optimize the most common disruption processes, in order to achieve not only reliable quantification but also recovery of intracellular components from cell factories of interest.

  20. Overall Key Performance Indicator to Optimizing Operation of High-Pressure Homogenizers for a Reliable Quantification of Intracellular Components in Pichia pastoris

    PubMed Central

    Garcia-Ortega, Xavier; Reyes, Cecilia; Montesinos, José Luis; Valero, Francisco

    2015-01-01

    The most commonly used cell disruption procedures may lack reproducibility, which introduces significant errors into the quantification of intracellular components. In this work, an approach consisting of the definition of an overall key performance indicator (KPI) was implemented for a lab-scale high-pressure homogenizer (HPH) in order to determine the disruption settings that allow the reliable quantification of a wide range of intracellular components. This KPI was based on the combination of three independent reporting indicators: decrease of absorbance, release of total protein, and release of alkaline phosphatase activity. The yeast Pichia pastoris growing on methanol was selected as the model microorganism because it presents an important widening of the cell wall, requiring more severe disruption methods and operating conditions than Escherichia coli and Saccharomyces cerevisiae. From the outcome of the reporting indicators, the cell disruption efficiency achieved using the HPH was about fourfold higher than with other standard lab-scale cell disruption methodologies, such as bead milling or cell permeabilization. The approach was also applied to a pilot-plant-scale HPH, validating the methodology in a scale-up of the disruption process. This straightforward approach for evaluating the efficacy of a disruption procedure or equipment can easily be applied to optimize the most common disruption processes, in order to achieve not only reliable quantification but also recovery of intracellular components from cell factories of interest. PMID:26284241
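
    One simple way to fuse several indicators into a single overall KPI is to rescale each indicator across the tested settings and average, as sketched below. This is an assumption-laden illustration with invented values; the paper defines its own KPI construction.

    ```python
    # Minimal sketch: an overall KPI from min-max-normalized indicators.
    import numpy as np

    # Rows: disruption settings; columns: decrease of absorbance, total
    # protein release, alkaline phosphatase release (arbitrary units).
    indicators = np.array([[0.20, 1.1, 30.0],
                           [0.45, 2.6, 85.0],
                           [0.50, 2.9, 90.0]])

    lo, hi = indicators.min(axis=0), indicators.max(axis=0)
    kpi = ((indicators - lo) / (hi - lo)).mean(axis=1)
    print("overall KPI per setting:", kpi.round(2))  # pick the maximum
    ```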

  1. Enantiomeric analysis of overlapped chromatographic profiles in the presence of interferences. Determination of ibuprofen in a pharmaceutical formulation containing homatropine.

    PubMed

    Padró, J M; Osorio-Grisales, J; Arancibia, J A; Olivieri, A C; Castells, C B

    2016-10-07

    In this work, we studied the combination of chemometric methods with chromatographic separations as a strategy for the analysis of enantiomers when complete enantioseparation is difficult or requires long analysis times and, in addition, the target signals suffer interference from the matrix. We present the determination of ibuprofen enantiomers in pharmaceutical formulations containing homatropine as an interferent by chiral HPLC-DAD in combination with partial least-squares algorithms. The method was applied to samples containing enantiomeric ratios from 95:5 to 99.5:0.5 with coelution of interferents. The results were validated using univariate calibration without homatropine. The relative error of the method was less than 4.0% for both enantiomers. The limits of detection (LOD) and quantification (LOQ) for (S)-(+)-ibuprofen were 4.96×10⁻¹⁰ and 1.50×10⁻⁹ mol, respectively; for (R)-(-)-ibuprofen they were 1.60×10⁻¹¹ and 4.85×10⁻¹¹ mol, respectively. Finally, the chemometric method was applied to the determination of the enantiomeric purity of commercial pharmaceuticals. The ultimate goal of this research was the development of rapid, reliable, and robust methods for assessing enantiomeric purity by conventional diode array detection assisted by chemometric tools. Copyright © 2016 Elsevier B.V. All rights reserved.
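
    The chemometric step pairs multivariate signals with known concentrations, which the sketch below imitates on synthetic chromatographic profiles using scikit-learn's PLS regression. Peak shapes, the interferent, and the number of latent variables are all invented for illustration.

    ```python
    # Minimal sketch: PLS calibration of overlapped profiles with an
    # unmodelled interferent. Synthetic data throughout.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(2)
    n_cal, n_points = 20, 200
    conc = rng.uniform(0.0, 1.0, (n_cal, 2))   # two analyte levels per run

    t = np.linspace(0, 1, n_points)
    p1 = np.exp(-((t - 0.45) / 0.05) ** 2)     # analyte 1 peak
    p2 = np.exp(-((t - 0.50) / 0.05) ** 2)     # analyte 2 peak (overlapped)
    interf = np.exp(-((t - 0.48) / 0.10) ** 2) # broad interferent

    X = (conc @ np.vstack([p1, p2])
         + rng.uniform(0, 0.3, (n_cal, 1)) * interf
         + 0.01 * rng.standard_normal((n_cal, n_points)))

    pls = PLSRegression(n_components=4).fit(X, conc)
    sample = 0.9 * p1 + 0.05 * p2 + 0.2 * interf
    print("predicted levels:", pls.predict(sample[None, :]).round(2))
    ```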

  2. Assessment of the National Combustion Code

    NASA Technical Reports Server (NTRS)

    Liu, Nan-Suey; Iannetti, Anthony; Shih, Tsan-Hsing

    2007-01-01

    The advancements made during the last decade in the areas of combustion modeling, numerical simulation, and computing platforms have greatly facilitated the use of CFD-based tools in the development of combustion technology. Further development of verification, validation, and uncertainty quantification will have a profound impact on the reliability and utility of these CFD-based tools. The objectives of the present effort are to establish a baseline for the National Combustion Code (NCC) against experimental data, as well as to document current capabilities and identify gaps for further improvement.

  3. RNA-Skim: a rapid method for RNA-Seq quantification at transcript level

    PubMed Central

    Zhang, Zhaojun; Wang, Wei

    2014-01-01

    Motivation: The RNA-Seq technique has been demonstrated as a revolutionary means for exploring transcriptomes because it provides deep coverage and base pair-level resolution. RNA-Seq quantification is proven to be an efficient alternative to the Microarray technique in gene expression studies, and it is a critical component in RNA-Seq differential expression analysis. Most existing RNA-Seq quantification tools require the alignment of fragments to either a genome or a transcriptome, entailing a time-consuming and intricate alignment step. To improve the performance of RNA-Seq quantification, an alignment-free method, Sailfish, has been proposed to quantify transcript abundances using all k-mers in the transcriptome, demonstrating the feasibility of designing an efficient alignment-free method for transcriptome quantification. Even though Sailfish is substantially faster than alignment-dependent methods such as Cufflinks, using all k-mers in the transcriptome impedes the scalability of the method. Results: We propose a novel RNA-Seq quantification method, RNA-Skim, which partitions the transcriptome into disjoint transcript clusters based on sequence similarity, and introduces the notion of sig-mers, a special type of k-mer uniquely associated with each cluster. We demonstrate that the sig-mer counts within a cluster are sufficient for estimating transcript abundances with accuracy comparable to any state-of-the-art method. This enables RNA-Skim to perform transcript quantification on each cluster independently, reducing a complex optimization problem into smaller optimization tasks that can be run in parallel. As a result, RNA-Skim uses <4% of the k-mers and <10% of the CPU time required by Sailfish. It is able to finish transcriptome quantification in <10 min per sample using just a single thread on a commodity computer, which represents a >100× speedup over state-of-the-art alignment-based methods, while delivering comparable or higher accuracy. Availability and implementation: The software is available at http://www.csbio.unc.edu/rs. Contact: weiwang@cs.ucla.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24931995
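
    The sig-mer idea can be miniaturized to a few lines: keep only k-mers that occur in exactly one transcript cluster, then count read hits against those. The toy below shows just this cluster-level counting step with invented sequences; RNA-Skim additionally estimates within-cluster transcript abundances.

    ```python
    # Minimal sketch: counting cluster-specific 'sig-mers'. Toy data only.
    from collections import Counter

    def kmers(seq, k):
        return {seq[i:i + k] for i in range(len(seq) - k + 1)}

    def sig_mers(clusters, k=5):
        """Map each k-mer occurring in exactly one cluster to that cluster."""
        seen = {}
        for name, seqs in clusters.items():
            for s in seqs:
                for km in kmers(s, k):
                    seen.setdefault(km, set()).add(name)
        return {km: next(iter(c)) for km, c in seen.items() if len(c) == 1}

    clusters = {"cluster_A": ["ACGTACGTGA"], "cluster_B": ["TTGCACGTAC"]}
    owner = sig_mers(clusters)

    reads = ["GTACG", "TTGCA", "ACGTA"]        # last one is shared, skipped
    print(Counter(owner[r] for r in reads if r in owner))
    ```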

  4. Targeted Quantification of Isoforms of a Thylakoid-Bound Protein: MRM Method Development.

    PubMed

    Bru-Martínez, Roque; Martínez-Márquez, Ascensión; Morante-Carriel, Jaime; Sellés-Marchart, Susana; Martínez-Esteso, María José; Pineda-Lucas, José Luis; Luque, Ignacio

    2018-01-01

    Targeted mass spectrometric methods such as selected/multiple reaction monitoring (SRM/MRM) have found intense application in protein detection and quantification, competing with classical immunoaffinity techniques. They provide a universal procedure for developing fast, highly specific, sensitive, accurate, and cheap methodologies for targeted detection and quantification of proteins based on the direct analysis of their surrogate peptides, typically generated by tryptic digestion. This methodology can be advantageously applied in the field of plant proteomics, and particularly for non-model species, since immunoreagents are scarcely available. Here, we describe the issues to take into consideration in order to develop an MRM method to detect and quantify isoforms of the thylakoid-bound protein polyphenol oxidase from the non-model and database-underrepresented species Eriobotrya japonica Lindl.

  5. Computational solution verification and validation applied to a thermal model of a ruggedized instrumentation package

    DOE PAGES

    Scott, Sarah Nicole; Templeton, Jeremy Alan; Hough, Patricia Diane; ...

    2014-01-01

    This study details a methodology for quantification of errors and uncertainties of a finite element heat transfer model applied to a Ruggedized Instrumentation Package (RIP). The proposed verification and validation (V&V) process includes solution verification to examine errors associated with the code's solution techniques, and model validation to assess the model's predictive capability for quantities of interest. The model was subjected to mesh resolution and numerical parameter sensitivity studies to determine reasonable parameter values and to understand how they change the overall model response and performance criteria. To facilitate quantification of the uncertainty associated with the mesh, automatic meshing and mesh refining/coarsening algorithms were created and implemented on the complex geometry of the RIP. Automated software to vary model inputs was also developed to determine the solution's sensitivity to numerical and physical parameters. The model was compared with an experiment to demonstrate its accuracy and determine the importance of both modelled and unmodelled physics in quantifying the results' uncertainty. An emphasis is placed on automating the V&V process to enable uncertainty quantification within tight development schedules.

  6. Attractor structure discriminates sleep states: recurrence plot analysis applied to infant breathing patterns.

    PubMed

    Terrill, Philip Ian; Wilson, Stephen James; Suresh, Sadasivam; Cooper, David M; Dakin, Carolyn

    2010-05-01

    Breathing patterns are characteristically different between infant active sleep (AS) and quiet sleep (QS), and statistical quantifications of interbreath interval (IBI) data have previously been used to discriminate between infant sleep states. It has also been identified that breathing patterns are governed by a nonlinear controller. This study aims to investigate whether nonlinear quantifications of infant IBI data are characteristically different between AS and QS, and whether they may be used to discriminate between these infant sleep states. Polysomnograms were obtained from 24 healthy infants at six months of age. Periods of AS and QS were identified, and IBI data extracted. Recurrence quantification analysis (RQA) was applied to each period, and recurrence calculated for a fixed radius in the range of 0-8 in steps of 0.02, and embedding dimensions of 4, 6, 8, and 16. When a threshold classifier was trained, the RQA variable recurrence was able to correctly classify 94.3% of periods in a test dataset. It was concluded that RQA of IBI data is able to accurately discriminate between infant sleep states. This is a promising step toward development of a minimal-channel automatic sleep state classification system.

  7. Review of complex networks application in hydroclimatic extremes with an implementation to characterize spatio-temporal drought propagation in continental USA

    NASA Astrophysics Data System (ADS)

    Konapala, Goutam; Mishra, Ashok

    2017-12-01

    The quantification of spatio-temporal hydroclimatic extreme events is key to water resources planning, disaster mitigation, and building a climate-resilient society. However, quantification of these extreme events has always been a great challenge, which is further compounded by climate variability and change. Recently, complex network theory has been applied in the earth science community to investigate spatial connections among hydrologic fluxes (e.g., rainfall and streamflow) in the water cycle. However, there are limited applications of complex network theory to hydroclimatic extreme events. This article provides an overview of complex networks and extreme events, the event synchronization method, the construction of networks, their statistical significance, and the associated network evaluation metrics. For illustration, we apply the complex network approach to study the spatio-temporal evolution of droughts in the continental USA (CONUS). Different drought thresholds define different drought events and carry different socio-economic implications, so it is of interest to explore the role of thresholds in the spatio-temporal evolution of drought through network analysis. In this study, the long-term (1900-2016) Palmer drought severity index (PDSI) was selected for spatio-temporal drought analysis using three network-based metrics (strength, direction, and distance). The results indicate that drought events propagate differently at different thresholds associated with the initiation of drought events. The direction metric indicated that the onsets of mild drought events usually propagate in a more spatially clustered and uniform manner than onsets of moderate droughts. The distance metric shows that drought events propagate over longer distances in the western part of CONUS than in the eastern part. We believe that the network-aided metrics utilized in this study can be an important tool for advancing our knowledge of drought propagation as well as other hydroclimatic extreme events. Although the propagation of droughts is investigated here using the network approach, process-based (physics-based) approaches are essential to further understand the dynamics of hydroclimatic extreme events.
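
    Of the methods listed above, event synchronization is the most compact to illustrate. The sketch below implements a simplified fixed-window variant (the standard method adapts the window to local inter-event times); event times and the tolerance are invented.

    ```python
    # Minimal sketch: fixed-window event synchronization between two sites.
    import numpy as np

    def event_sync(tx, ty, tau=3.0):
        """Synchronization strength from event pairs at most tau apart,
        normalized by the geometric mean of the event counts."""
        c = sum(1 for a in tx for b in ty if abs(a - b) <= tau)
        return c / np.sqrt(len(tx) * len(ty))

    onsets_site1 = np.array([12.0, 55.0, 120.0, 180.0])  # drought onsets
    onsets_site2 = np.array([14.0, 57.0, 150.0, 182.0])
    print(f"Q = {event_sync(onsets_site1, onsets_site2):.2f}")
    ```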

  8. 39 CFR 3050.1 - Definitions applicable to this part.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., mathematical, or statistical theory, precept, or assumption applied by the Postal Service in producing a..., or statistical theory, precept, or assumption. A change in quantification technique should not change...

  9. 39 CFR 3050.1 - Definitions applicable to this part.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., mathematical, or statistical theory, precept, or assumption applied by the Postal Service in producing a..., or statistical theory, precept, or assumption. A change in quantification technique should not change...

  10. Novel isotopic N, N-Dimethyl Leucine (iDiLeu) Reagents Enable Absolute Quantification of Peptides and Proteins Using a Standard Curve Approach

    NASA Astrophysics Data System (ADS)

    Greer, Tyler; Lietz, Christopher B.; Xiang, Feng; Li, Lingjun

    2015-01-01

    Absolute quantification of protein targets using liquid chromatography-mass spectrometry (LC-MS) is a key component of candidate biomarker validation. One popular method combines multiple reaction monitoring (MRM) using a triple quadrupole instrument with stable isotope-labeled standards (SIS) for absolute quantification (AQUA). LC-MRM AQUA assays are sensitive and specific, but they are also expensive because of the cost of synthesizing stable isotope peptide standards. While the chemical modification approach using mass differential tags for relative and absolute quantification (mTRAQ) represents a more economical approach when quantifying large numbers of peptides, these reagents are costly and still suffer from lower throughput because only two concentration values per peptide can be obtained in a single LC-MS run. Here, we have developed and applied a set of five novel mass difference reagents, isotopic N, N-dimethyl leucine (iDiLeu). These labels contain an amine-reactive group (a triazine ester), are cost-effective because of their synthetic simplicity, and increase throughput compared with previous LC-MS quantification methods by allowing construction of a four-point standard curve in a single run. iDiLeu-labeled peptides show remarkably similar retention time shifts, slightly lower energy thresholds for higher-energy collisional dissociation (HCD) fragmentation, and high quantification accuracy for trypsin-digested protein samples (median errors <15%). By spiking an iDiLeu-labeled neuropeptide, allatostatin, into mouse urine matrix, two quantification methods were validated. The first uses one labeled peptide as an internal standard to normalize labeled peptide peak areas across runs (<19% error), whereas the second enables standard curve creation and analyte quantification in one run (<8% error).
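
    Numerically, the one-run, four-point standard curve reduces to a linear fit of channel peak area against spiked amount, inverted for the sample channel. A toy sketch, with invented peak areas and spike levels:

    ```python
    import numpy as np

    # Hypothetical peak areas from one LC-MS run: four iDiLeu channels carry
    # known spike levels of the standard peptide, the fifth carries the sample.
    std_amount  = np.array([1.0, 5.0, 10.0, 50.0])       # fmol spiked per channel
    std_area    = np.array([2.1e5, 9.8e5, 2.0e6, 1.0e7]) # integrated peak areas
    sample_area = 3.4e6

    # Ordinary least-squares fit of area vs amount (the standard curve).
    slope, intercept = np.polyfit(std_amount, std_area, 1)
    sample_amount = (sample_area - intercept) / slope
    print(f"estimated sample amount: {sample_amount:.2f} fmol")
    ```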

  11. 18O-labeled proteome reference as global internal standards for targeted quantification by selected reaction monitoring-mass spectrometry.

    PubMed

    Kim, Jong-Seo; Fillmore, Thomas L; Liu, Tao; Robinson, Errol; Hossain, Mahmud; Champion, Boyd L; Moore, Ronald J; Camp, David G; Smith, Richard D; Qian, Wei-Jun

    2011-12-01

    Selected reaction monitoring (SRM)-MS is an emerging technology for high throughput targeted protein quantification and verification in biomarker discovery studies; however, the cost associated with the application of stable isotope-labeled synthetic peptides as internal standards can be prohibitive for screening a large number of candidate proteins as often required in the preverification phase of discovery studies. Herein we present a proof of concept study using an (18)O-labeled proteome reference as global internal standards (GIS) for SRM-based relative quantification. The (18)O-labeled proteome reference (or GIS) can be readily prepared and contains a heavy isotope ((18)O)-labeled internal standard for every possible tryptic peptide. Our results showed that the percentage of heavy isotope ((18)O) incorporation applying an improved protocol was >99.5% for most peptides investigated. The accuracy, reproducibility, and linear dynamic range of quantification were further assessed based on known ratios of standard proteins spiked into the labeled mouse plasma reference. Reliable quantification was observed with high reproducibility (i.e., coefficient of variation <10%) for analyte concentrations that were set at 100-fold higher or lower than those of the GIS based on the light ((16)O)/heavy ((18)O) peak area ratios. The utility of (18)O-labeled GIS was further illustrated by accurate relative quantification of 45 major human plasma proteins. Moreover, quantification of the concentrations of C-reactive protein and prostate-specific antigen was illustrated by coupling the GIS with standard additions of purified protein standards. Collectively, our results demonstrated that the use of an (18)O-labeled proteome reference as GIS provides a convenient, low cost, and effective strategy for relative quantification of a large number of candidate proteins in biological or clinical samples using SRM.
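
    Numerically, GIS-based relative quantification comes down to light/heavy peak-area ratios and their reproducibility across replicates. A toy sketch with invented areas:

    ```python
    import numpy as np

    # Hypothetical SRM peak areas for one peptide across three replicate runs.
    light = np.array([4.2e5, 4.0e5, 4.5e5])  # analyte (16O) in the sample
    heavy = np.array([2.1e5, 2.0e5, 2.2e5])  # 18O-labeled global internal standard

    ratios = light / heavy                   # relative abundance vs the GIS
    cv = ratios.std(ddof=1) / ratios.mean() * 100
    print(f"mean L/H ratio = {ratios.mean():.2f}, CV = {cv:.1f}%")
    ```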

  12. Localized 2D COSY sequences: Method and experimental evaluation for a whole metabolite quantification approach

    NASA Astrophysics Data System (ADS)

    Martel, Dimitri; Tse Ve Koon, K.; Le Fur, Yann; Ratiney, Hélène

    2015-11-01

    Two-dimensional spectroscopy offers the possibility to unambiguously distinguish metabolites by spreading out the multiplet structure of J-coupled spin systems into a second dimension. Quantification methods that perform parametric fitting of the 2D MRS signal have recently been proposed for J-resolved PRESS (JPRESS) but not explicitly for Localized Correlation Spectroscopy (LCOSY). Here, through a whole metabolite quantification approach, the quantification performance of correlation spectroscopy is studied. The ability to quantify metabolite relaxation time constants is studied for three localized 2D MRS sequences (LCOSY, LCTCOSY, and JPRESS) in vitro on preclinical MR systems. The issues encountered during implementation and the quantification strategies are discussed with the help of the Fisher matrix formalism. The described parameterized models enable computation of the lower bound on error variance - generally known as the Cramér-Rao bounds (CRBs), a standard of precision - for the parameters estimated from these 2D MRS signal fittings. LCOSY has a theoretical net signal loss of a factor of two per unit of acquisition time compared to JPRESS. A rapid analysis could suggest that the relative CRBs of LCOSY compared to JPRESS (expressed as a percentage of the concentration values) should be doubled, but we show that this is not necessarily true. Finally, the LCOSY quantification procedure has been applied to data acquired in vivo on a mouse brain.
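
    For readers unfamiliar with the Fisher-matrix route to CRBs, the following sketch shows the generic computation for a least-squares fit with white Gaussian noise; the mono-exponential toy model here merely stands in for the far richer 2D MRS signal models used in the paper.

    ```python
    import numpy as np

    def cramer_rao_bounds(jacobian, noise_sd):
        """Lower bounds on parameter standard deviations for a model fit.

        jacobian : (n_points, n_params) matrix of derivatives d s / d theta
        noise_sd : standard deviation of the white Gaussian measurement noise
        """
        fisher = jacobian.T @ jacobian / noise_sd**2   # Fisher information
        return np.sqrt(np.diag(np.linalg.inv(fisher))) # sqrt of CRB variances

    # Toy model: s(t) = c * exp(-t / T2), parameters theta = (c, T2).
    t = np.linspace(0.01, 0.5, 64)
    c, T2, sigma = 1.0, 0.12, 0.02
    J = np.column_stack([np.exp(-t / T2),                   # d s / d c
                         c * t / T2**2 * np.exp(-t / T2)])  # d s / d T2
    print(cramer_rao_bounds(J, sigma))  # best-achievable sd for (c, T2)
    ```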

  13. Optimal Design of Multitype Groundwater Monitoring Networks Using Easily Accessible Tools.

    PubMed

    Wöhling, Thomas; Geiges, Andreas; Nowak, Wolfgang

    2016-11-01

    Monitoring networks are expensive to establish and to maintain. In this paper, we extend an existing data-worth estimation method from the suite of PEST utilities with a global optimization method for optimal sensor placement (called optimal design) in groundwater monitoring networks. Design optimization can include multiple simultaneous sensor locations and multiple sensor types; both location and sensor type are treated simultaneously as decision variables. Our method combines linear uncertainty quantification with a modified genetic algorithm for discrete multilocation, multitype search. The efficiency of the global optimization is enhanced by an archive of past samples and parallel computing. We demonstrate our methodology for a groundwater monitoring network at the Steinlach experimental site, south-western Germany, which has been established to monitor river-groundwater exchange processes. The target of optimization is the best possible exploration for minimum variance in predicting the mean travel time of the hyporheic exchange. Our results demonstrate that the information gain of monitoring network designs can be explored efficiently and with easily accessible tools prior to taking new field measurements or installing additional measurement points. The proposed methods proved to be efficient and can be applied for model-based optimal design of any type of monitoring network in approximately linear systems. Our key contributions are (1) the use of easy-to-implement tools for an otherwise complex task and (2) the consideration of data-worth interdependencies in the simultaneous optimization of multiple sensor locations and sensor types. © 2016, National Ground Water Association.
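
    A stripped-down sketch of the discrete multilocation, multitype search: a genetic algorithm over (location, type) tuples that minimizes a predicted-variance objective. The objective below is a deterministic stand-in for the linear data-worth estimate, and the sample archive and parallelization described in the paper are omitted.

    ```python
    import random

    def predicted_variance(design):
        # Stand-in for linear uncertainty propagation (e.g., a PEST-style
        # data-worth estimate): a deterministic pseudo-score per design.
        r = random.Random(hash(tuple(sorted(design))))
        return r.random()

    LOCATIONS = range(50)               # candidate measurement points
    TYPES = ("head", "flux", "temp")    # illustrative sensor types
    N_SENSORS = 4

    def random_design():
        return [(random.choice(LOCATIONS), random.choice(TYPES))
                for _ in range(N_SENSORS)]

    def mutate(design, p=0.3):
        return [((random.choice(LOCATIONS), random.choice(TYPES))
                 if random.random() < p else gene) for gene in design]

    def crossover(a, b):
        cut = random.randrange(1, N_SENSORS)
        return a[:cut] + b[cut:]

    pop = [random_design() for _ in range(40)]
    for gen in range(100):
        pop.sort(key=predicted_variance)   # lower predicted variance is better
        elite = pop[:10]
        pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                       for _ in range(30)]
    print("best design:", min(pop, key=predicted_variance))
    ```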

  14. E-learning for textile enterprises innovation improvement

    NASA Astrophysics Data System (ADS)

    Blaga, M.; Harpa, R.; Radulescu, I. R.; Stepjanovic, Z.

    2017-10-01

    The Erasmus Plus project TEXMatrix, “Matrix of knowledge for innovation and competitiveness in textile enterprises”, financed through the Erasmus+ Programme, Strategic Partnerships - KA2 for Vocational Education and Training, aims at spreading a creative and innovative organizational culture inside textile enterprises by transferring and implementing methodologies, tools, and concepts for improved training. Five European partners form the project consortium: INCDTP - Bucharest, Romania (coordinator), TecMinho - Portugal, Centrocot - Italy, University of Maribor, Slovenia, and the “Gheorghe Asachi” Technical University of Iasi, Romania. These partners will help the textile enterprises involved in the project learn how to apply creative thinking in their organizations and how to develop the capacity for innovation and change. The project aims to bridge the gap between textile enterprises' need for qualified personnel and the young workforce. It develops an innovative knowledge matrix for the tangible and intangible assets of an enterprise and a benchmarking study, based on which a dedicated software tool will be created. This software tool will aid decision-making enterprise staff (managers, HR specialists, professionals) as well as trainees (young employees, students, and scholars) in coping with the new challenges of innovation and competitiveness in the textile field. The purpose of this paper is to present the main objectives and achievements of the project, according to its declared goals, with a focus on the knowledge matrix of innovation, a powerful instrument for the quantification of the intangible assets of textile enterprises.

  15. Incorporation of Gas Chromatography-Mass Spectrometry into the Undergraduate Organic Chemistry Laboratory Curriculum

    ERIC Educational Resources Information Center

    Giarikos, Dimitrios G.; Patel, Sagir; Lister, Andrew; Razeghifard, Reza

    2013-01-01

    Gas chromatography-mass spectrometry (GC-MS) is a powerful analytical tool for detection, identification, and quantification of many volatile organic compounds. However, many colleges and universities have not fully incorporated this technique into undergraduate teaching laboratories despite its wide application and ease of use in organic…

  16. Inter-laboratory Comparison of Real-time PCR Methods for Quantification of General Fecal Indicator Bacteria

    EPA Science Inventory

    The application of quantitative real-time PCR (qPCR) technologies for the rapid identification of fecal bacteria in environmental waters is being considered for use as a national water quality metric in the United States. The transition from research tool to a standardized prot...

  17. Interlaboratory Comparison of Real-time PCR Protocols for Quantification of General Fecal Indicator Bacteria

    EPA Science Inventory

    The application of quantitative real-time PCR (qPCR) technologies for the rapid identification of fecal bacteria in environmental waters is being considered for use as a national water quality metric in the United States. The transition from research tool to a standardized proto...

  18. Teaching Cardiovascular Physiology with Equivalent Electronic Circuits in a Practically Oriented Teaching Module

    ERIC Educational Resources Information Center

    Ribaric, Samo; Kordas, Marjan

    2011-01-01

    Here, we report on a new tool for teaching cardiovascular physiology and pathophysiology that promotes qualitative as well as quantitative thinking about time-dependent physiological phenomena. Quantification of steady and presteady-state (transient) cardiovascular phenomena is traditionally done by differential equations, but this is time…

  19. APPLICATION OF 3D COMPUTER-AIDED TOMOGRAPHY TO THE QUANTIFICATION OF MARINE SEDIMENT COMMUNITIES IN POLLUTION GRADIENTS

    EPA Science Inventory

    Computer-Aided Tomography (CT) has been demonstrated to be a cost efficient tool for the qualitative and quantitative study of estuarine benthic communities along pollution gradients.
    Now we have advanced this technology to successfully visualize and discriminate three dimen...

  20. Using the SWAT model to improve process descriptions and define hydrologic partitioning in South Korea

    NASA Astrophysics Data System (ADS)

    Shope, C. L.; Maharjan, G. R.; Tenhunen, J.; Seo, B.; Kim, K.; Riley, J.; Arnhold, S.; Koellner, T.; Ok, Y. S.; Peiffer, S.; Kim, B.; Park, J.-H.; Huwe, B.

    2014-02-01

    Watershed-scale modeling can be a valuable tool to aid in quantification of water quality and yield; however, several challenges remain. In many watersheds, it is difficult to adequately quantify hydrologic partitioning. Data scarcity is prevalent, the accuracy of spatially distributed meteorology is difficult to quantify, forest encroachment and land use issues are common, and surface water and groundwater abstractions substantially modify watershed-based processes. Our objective is to assess the capability of the Soil and Water Assessment Tool (SWAT) model to capture event-based and long-term monsoonal rainfall-runoff processes in complex mountainous terrain. To accomplish this, we developed a unique quality-control, gap-filling algorithm for interpolation of high-frequency meteorological data. We used a novel multi-location, multi-optimization calibration technique to improve estimations of catchment-wide hydrologic partitioning. The interdisciplinary model was calibrated to a unique combination of statistical, hydrologic, and plant growth metrics. Our results indicate scale-dependent sensitivity of hydrologic partitioning and substantial influence of engineered features. The addition of hydrologic and plant growth objective functions identified the importance of culverts in catchment-wide flow distribution. While this study shows the challenges of applying the SWAT model to complex terrain and extreme environments, it also shows that by incorporating anthropogenic features into modeling scenarios we can enhance our understanding of the hydroecological impact.
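
    The paper's quality-control gap-filling algorithm is not reproduced here; the sketch below shows one plausible shape for such a routine, assuming pandas time series: short gaps are interpolated in time, and longer ones are filled by inverse-distance weighting of neighboring stations.

    ```python
    import numpy as np
    import pandas as pd

    def fill_gaps(series, neighbors, max_gap_steps=6):
        """Gap filling for one station's record (a sketch, not the paper's code).

        series    : pd.Series indexed by time, NaN where records failed QC
        neighbors : list of (pd.Series, distance_km) for surrounding stations
        """
        out = series.interpolate(limit=max_gap_steps)   # short gaps: in time
        for t in out.index[out.isna()]:                 # long gaps: from neighbors
            vals = [(s.get(t, np.nan), 1.0 / d**2) for s, d in neighbors]
            vals = [(v, w) for v, w in vals if not np.isnan(v)]
            if vals:   # inverse-distance weighting of whatever is available
                out.loc[t] = sum(v * w for v, w in vals) / sum(w for _, w in vals)
        return out

    idx = pd.date_range("2010-06-01", periods=8, freq="h")
    a = pd.Series([20, 21, np.nan, np.nan, np.nan, np.nan, np.nan, 18.0], index=idx)
    b = pd.Series(np.linspace(19, 18, 8), index=idx)
    print(fill_gaps(a, [(b, 4.0)], max_gap_steps=2))
    ```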

  1. Sensitive multiresidue method by HS-SPME/GC-MS for 10 volatile organic compounds in urine matrix: a new tool for biomonitoring studies on children.

    PubMed

    Antonucci, Arianna; Vitali, Matteo; Avino, Pasquale; Manigrasso, Maurizio; Protano, Carmela

    2016-08-01

    An HS-SPME method coupled with GC-MS analysis has been developed for simultaneously measuring the concentration of 10 volatile organic compounds (VOCs) (benzene, toluene, ethylbenzene, o-, m-, and p-xylene, methyl tert-butyl ether, ethyl tert-butyl ether, 2-methyl-2-butyl methyl ether, and diisopropyl ether) in urine matrix as a biomonitoring tool for populations with low levels of exposure to such VOCs. These compounds, potentially toxic for human health, are common contaminants of both outdoor and indoor air, as they are released by vehicular traffic; some of them are also present in environmental tobacco smoke (ETS). Thus, the exposure to these pollutants cannot be neglected and should be assessed. The low limits of detection and quantification (LODs and LOQs <6.5 and 7.5 ng L(-1), respectively) and the high reproducibility (CVs <4%) make the developed method well suited for biomonitoring populations exposed at low levels, such as children. Further, the method is cost-effective and fast, and is therefore useful for investigating large populations. It has been applied to children exposed to traffic pollution and/or ETS; the relevant results are reported, and their implications are discussed.

  2. Molecular Tools for the Detection of Nitrogen Cycling Archaea

    PubMed Central

    Rusch, Antje

    2013-01-01

    Archaea are widespread in extreme and temperate environments, and cultured representatives cover a broad spectrum of metabolic capacities, which sets them up for potentially major roles in the biogeochemistry of their ecosystems. The detection, characterization, and quantification of archaeal functions in mixed communities require Archaea-specific primers or probes for the corresponding metabolic genes. Five pairs of degenerate primers were designed to target archaeal genes encoding key enzymes of nitrogen cycling: nitrite reductases NirA and NirB, nitrous oxide reductase (NosZ), nitrogenase reductase (NifH), and nitrate reductases NapA/NarG. Sensitivity towards their archaeal target gene, phylogenetic specificity, and gene specificity were evaluated in silico and in vitro. Owing to their moderate sensitivity/coverage, the novel nirB-targeted primers are suitable for pure culture studies only. The nirA-targeted primers showed sufficient sensitivity and phylogenetic specificity, but poor gene specificity. The primers designed for amplification of archaeal nosZ performed well in all 3 criteria; their discrimination against bacterial homologs appears to be weakened when Archaea are strongly outnumbered by bacteria in a mixed community. The novel nifH-targeted primers showed high sensitivity and gene specificity, but failed to discriminate against bacterial homologs. Despite limitations, 4 of the new primer pairs are suitable tools in several molecular methods applied in archaeal ecology. PMID:23365509

  3. A simple teaching tool for training the pelvic organ prolapse quantification system.

    PubMed

    Geiss, Ingrid M; Riss, Paul A; Hanzal, Engelbert; Dungl, Andrea

    2007-09-01

    The pelvic organ prolapse quantification (POPQ) system is currently the most common and specific system for describing different prolapse stages. Nevertheless, its use is not yet accepted worldwide in routine care. Our aim was to develop a simple teaching tool for the POPQ system capable of simulating different stages of uterovaginal prolapse for use in medical education with hands-on training. We constructed a moveable and flexible tool from an inverted Santa Claus' cap, which simulated the vaginal cuff, with the tassel at the end representing the cervix. A wooden embroidery frame fixed the cap and served as the hymen, the reference point for all measurements. Inside the cap, we sewed buttons to define the anatomic landmark points Aa and Ap, located 3 cm distal from the frame. After explaining the device to the students, we used the three-by-three grid for recording the quantitative description of pelvic organ support. First, each student had to demonstrate a specific prolapse with the cap device. Then, a prolapse was simulated on the cap, and the student had to take the relevant measurements and record them in the POPQ grid. The main training effect for understanding the POPQ system seems to be the possibility for each trainee to simulate a three-dimensional prolapse with this flexible vagina model.

  4. Sensitivity Analysis and Uncertainty Quantification for the LAMMPS Molecular Dynamics Simulation Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Picard, Richard Roy; Bhat, Kabekode Ghanasham

    2017-07-18

    We examine sensitivity analysis and uncertainty quantification for molecular dynamics simulation. Extreme (large or small) output values for the LAMMPS code often occur at the boundaries of input regions, and uncertainties in those boundary values are overlooked by common SA methods. Similarly, input values for which code outputs are consistent with calibration data can also occur near boundaries. Upon applying approaches in the literature for imprecise probabilities (IPs), much more realistic results are obtained than for the complacent application of standard SA and code calibration.

  5. Analysis of ISO NE Balancing Requirements: Uncertainty-based Secure Ranges for ISO New England Dynamic Interchange Adjustments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Etingov, Pavel V.; Makarov, Yuri V.; Wu, Di

    The document describes the detailed uncertainty quantification (UQ) methodology developed by PNNL to estimate secure ranges of potential dynamic intra-hour interchange adjustments in the ISO-NE system and provides a description of the dynamic interchange adjustment (DINA) tool developed under the same contract. The overall system ramping up and down capability, spinning reserve requirements, interchange schedules, load variations, and uncertainties from various sources that are relevant to the ISO-NE system are incorporated into the methodology and the tool. The DINA tool has been tested by PNNL and ISO-NE staff engineers using ISO-NE data.

  6. A three-dimensional inverse finite element analysis of the heel pad.

    PubMed

    Chokhandre, Snehal; Halloran, Jason P; van den Bogert, Antonie J; Erdemir, Ahmet

    2012-03-01

    Quantification of plantar tissue behavior of the heel pad is essential in developing computational models for predictive analysis of preventive treatment options, such as footwear, for patients with diabetes. Simulation-based studies in the past have generally adopted heel pad properties from the literature, in turn using heel-specific geometry with material properties of a different heel. In exceptional cases, patient-specific material characterization was performed with simplified two-dimensional models, without further evaluation of the heel-specific response under different loading conditions. The aim of this study was to conduct an inverse finite element analysis of the heel in order to calculate heel-specific material properties in situ. Multidimensional experimental data available from a previous cadaver study by Erdemir et al. ("An Elaborate Data Set Characterizing the Mechanical Response of the Foot," ASME J. Biomech. Eng., 131(9), pp. 094502) was used for model development, optimization, and evaluation of material properties. A specimen-specific three-dimensional finite element representation was developed. Heel pad material properties were determined using inverse finite element analysis by fitting the model behavior to the experimental data. Compression-dominant loading, applied using a spherical indenter, was used for optimization of the material properties. The optimized material properties were evaluated through simulations representative of a combined loading scenario (compression and anterior-posterior shear) with a spherical indenter and also of compression-dominant loading applied using an elevated platform. Optimized heel pad material coefficients were 0.001084 MPa (μ) and 9.780 (α) (with an effective Poisson's ratio (ν) of 0.475) for a first-order nearly incompressible Ogden material model. The model-predicted structural response of the heel pad was in good agreement for both the optimization (<1.05% maximum tool force, 0.9% maximum tool displacement) and validation cases (6.5% maximum tool force, 15% maximum tool displacement). The inverse analysis successfully predicted the material properties for the given specimen-specific heel pad using the experimental data for the specimen. The modeling framework and results can be used for accurate predictions of the three-dimensional interaction of the heel pad with its surroundings.
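
    The inverse finite element analysis is, at its core, a parameter-fitting loop: adjust (μ, α) until simulated indenter forces match measured ones. The sketch below shows that loop with scipy, but replaces the 3-D FE simulation with a hypothetical analytical surrogate, so the model function and all numbers are placeholders rather than the study's model.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    def model_force(params, strain):
        """Stand-in for the forward model; the study uses a 3-D FE simulation
        of spherical indentation, not this surrogate."""
        mu, alpha = params
        return mu * (np.exp(alpha * strain) - 1.0)

    strain = np.linspace(0.0, 0.8, 20)        # indentation strain (dimensionless)
    measured = model_force((0.0011, 9.8), strain) \
               + np.random.default_rng(1).normal(0, 0.01, strain.size)

    fit = least_squares(lambda p: model_force(p, strain) - measured,
                        x0=(0.01, 5.0), bounds=([1e-6, 1.0], [1.0, 30.0]))
    print("optimized (mu [MPa], alpha):", fit.x)
    ```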

  7. Uncertainty Quantification Techniques of SCALE/TSUNAMI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T; Mueller, Don

    2011-01-01

    The Standardized Computer Analysis for Licensing Evaluation (SCALE) code system developed at Oak Ridge National Laboratory (ORNL) includes Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI). The TSUNAMI code suite can quantify the predicted change in system responses, such as k{sub eff}, reactivity differences, or ratios of fluxes or reaction rates, due to changes in the energy-dependent, nuclide-reaction-specific cross-section data. Where uncertainties in the neutron cross-section data are available, the sensitivity of the system to the cross-section data can be applied to propagate the uncertainties in the cross-section data to an uncertainty in the system response. Uncertainty quantification is useful for identifying potential sources of computational biases and highlighting parameters important to code validation. Traditional validation techniques often examine one or more average physical parameters to characterize a system and identify applicable benchmark experiments. However, with TSUNAMI, correlation coefficients are developed by propagating the uncertainties in neutron cross-section data to uncertainties in the computed responses for experiments and safety applications through sensitivity coefficients. The bias in the experiments, as a function of their correlation coefficient with the intended application, is extrapolated to predict the bias and bias uncertainty in the application through trending analysis or generalized linear least squares techniques, often referred to as 'data adjustment.' Even with advanced tools to identify benchmark experiments, analysts occasionally find that the application models include some feature or material for which adequately similar benchmark experiments do not exist to support validation. For example, a criticality safety analyst may want to take credit for the presence of fission products in spent nuclear fuel. In such cases, analysts sometimes rely on 'expert judgment' to select an additional administrative margin to account for gaps in the validation data or to conclude that the impact on the calculated bias and bias uncertainty is negligible. As a result of advances in computer programs and the evolution of cross-section covariance data, analysts can use the sensitivity and uncertainty analysis tools in the TSUNAMI codes to estimate the potential impact on the application-specific bias and bias uncertainty resulting from nuclides not represented in available benchmark experiments. This paper presents the application of methods described in a companion paper.

  8. Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghanem, Roger

    QUEST was a SciDAC Institute comprising Sandia National Laboratories, Los Alamos National Laboratory, the University of Southern California, the Massachusetts Institute of Technology, the University of Texas at Austin, and Duke University. The mission of QUEST is to: (1) develop a broad class of uncertainty quantification (UQ) methods/tools, and (2) provide UQ expertise and software to other SciDAC projects, thereby enabling/guiding their UQ activities. The USC effort centered on the development of reduced models and efficient algorithms for implementing various components of the UQ pipeline. USC personnel were responsible for the development of adaptive bases, adaptive quadrature, and reduced models to be used in estimation and inference.

  9. A Short Interspersed Nuclear Element (SINE)-Based Real-Time PCR Approach to Detect and Quantify Porcine Component in Meat Products.

    PubMed

    Zhang, Chi; Fang, Xin; Qiu, Haopu; Li, Ning

    2015-01-01

    Real-time PCR amplification of mitochondrial genes cannot be used for DNA quantification, and amplification of single-copy DNA does not provide ideal sensitivity. Moreover, cross-reactions among similar species are commonly observed in published methods amplifying repetitive sequences, which has hindered their further application. The purpose of this study was to establish a short interspersed nuclear element (SINE)-based real-time PCR approach with high specificity for species detection that could also be used for DNA quantification. After massive screening of candidate Sus scrofa SINEs, one optimal combination of primers and probe was selected, which showed no cross-reaction with other common meat species. The LOD of the method was 44 fg DNA/reaction. Further, quantification tests showed this approach was practical for DNA estimation without variance across tissues. Thus, this study provides a new tool for qualitative detection of porcine components, which could be promising in the QC of meat products.
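
    Quantification with such an assay typically proceeds through a Cq-versus-log(amount) standard curve. A generic sketch with invented calibration values (the 10^(-1/slope) - 1 formula for amplification efficiency is standard qPCR practice, not specific to this paper):

    ```python
    import numpy as np

    # Hypothetical calibration: Cq values for a serial dilution of known
    # porcine DNA amounts per reaction.
    log10_dna = np.array([0.0, 1.0, 2.0, 3.0, 4.0])  # log10(pg DNA/reaction)
    cq        = np.array([33.1, 29.8, 26.4, 23.1, 19.7])

    slope, intercept = np.polyfit(log10_dna, cq, 1)
    efficiency = 10 ** (-1.0 / slope) - 1.0          # ~1.0 means 100% efficient
    print(f"amplification efficiency: {efficiency:.2%}")

    unknown_cq = 25.0                                # Cq of an unknown sample
    print("estimated DNA (pg):", 10 ** ((unknown_cq - intercept) / slope))
    ```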

  10. High sensitivity mass spectrometric quantification of serum growth hormone by amphiphilic peptide conjugation

    NASA Astrophysics Data System (ADS)

    Arsene, Cristian G.; Schulze, Dirk; Kratzsch, Jürgen; Henrion, André

    2012-12-01

    Amphiphilic peptide conjugation affords a significant increase in sensitivity with protein quantification by electrospray-ionization mass spectrometry. This has been demonstrated here for human growth hormone in serum using N-(3-iodopropyl)-N,N,N-dimethyloctylammonium iodide (IPDOA-iodide) as derivatizing reagent. The signal enhancement achieved in comparison to the method without derivatization enables extension of the applicable concentration range down to the very low concentrations as encountered with clinical glucose suppression tests for patients with acromegaly. The method has been validated using a set of serum samples spiked with known amounts of recombinant 22 kDa growth hormone in the range of 0.48 to 7.65 μg/L. The coefficient of variation (CV) calculated, based on the deviation of results from the expected concentrations, was 3.5% and the limit of quantification (LoQ) was determined as 0.4 μg/L. The potential of the method as a tool in clinical practice has been demonstrated with patient samples of about 1 μg/L.

  11. TRIC: an automated alignment strategy for reproducible protein quantification in targeted proteomics.

    PubMed

    Röst, Hannes L; Liu, Yansheng; D'Agostino, Giuseppe; Zanella, Matteo; Navarro, Pedro; Rosenberger, George; Collins, Ben C; Gillet, Ludovic; Testa, Giuseppe; Malmström, Lars; Aebersold, Ruedi

    2016-09-01

    Next-generation mass spectrometric (MS) techniques such as SWATH-MS have substantially increased the throughput and reproducibility of proteomic analysis, but ensuring consistent quantification of thousands of peptide analytes across multiple liquid chromatography-tandem MS (LC-MS/MS) runs remains a challenging and laborious manual process. To produce highly consistent and quantitatively accurate proteomics data matrices in an automated fashion, we developed TRIC (http://proteomics.ethz.ch/tric/), a software tool that utilizes fragment-ion data to perform cross-run alignment, consistent peak-picking and quantification for high-throughput targeted proteomics. TRIC reduced the identification error compared to a state-of-the-art SWATH-MS analysis without alignment by more than threefold at constant recall while correcting for highly nonlinear chromatographic effects. On a pulsed-SILAC experiment performed on human induced pluripotent stem cells, TRIC was able to automatically align and quantify thousands of light and heavy isotopic peak groups. Thus, TRIC fills a gap in the pipeline for automated analysis of massively parallel targeted proteomics data sets.

  12. Illite polytype quantification using Wildfire© calculated x-ray diffraction patterns

    USGS Publications Warehouse

    Grathoff, Georg H.; Moore, D.M.

    1996-01-01

    Illite polytype quantification allows the differentiation of diagenetic and detrital illite components. In Paleozoic shales from the Illinois Basin, we observe 3 polytypes: 1Md, 1M and 2M1. 1Md and 1M are of diagenetic origin and 2M1 is of detrital origin. In this paper, we compare experimental X-ray diffraction (XRD) traces with traces calculated using WILDFIRE© and quantify mixtures of all 3 polytypes, adjusting the effects of preferred orientation and overlapping peaks. The broad intensity (“illite hump”) around the illite 003, which is very common in illite from shales, is caused by the presence of 1Md illite and mixing of illite polytypes and is not an artifact of sample preparation or other impurities in the sample. Illite polytype quantification provides a tool to extrapolate the K/Ar age and chemistry of the detrital and diagenetic end-members by analysis of different size fractions containing different proportions of diagenetic and detrital illite polytypes.
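
    One common way to quantify polytype mixtures against calculated patterns is a non-negative least-squares fit of the experimental trace to the endmember traces. A sketch with synthetic patterns standing in for WILDFIRE© output (the paper's actual fitting procedure may differ):

    ```python
    import numpy as np
    from scipy.optimize import nnls

    # Hypothetical: three calculated endmember XRD traces (1Md, 1M, 2M1) and
    # one experimental trace, all sampled on the same two-theta grid.
    calc = np.random.default_rng(2).random((561, 3))  # columns: 1Md, 1M, 2M1
    observed = calc @ np.array([0.5, 0.2, 0.3])       # synthetic "experiment"

    # Non-negative least squares yields the polytype mixing proportions.
    weights, resid = nnls(calc, observed)
    fractions = weights / weights.sum()
    for name, f in zip(("1Md", "1M", "2M1"), fractions):
        print(f"{name}: {f:.2f}")
    ```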

  13. Repeatability of Bolus Kinetics Ultrasound Perfusion Imaging for the Quantification of Cerebral Blood Flow.

    PubMed

    Vinke, Elisabeth J; Eyding, Jens; de Korte, Chris L; Slump, Cornelis H; van der Hoeven, Johannes G; Hoedemaekers, Cornelia W E

    2017-12-01

    Ultrasound perfusion imaging (UPI) can be used for the quantification of cerebral perfusion. In a neuro-intensive care setting, repeated measurements are required to evaluate changes in cerebral perfusion and monitor therapy. The aim of this study was to determine the repeatability of UPI in the quantification of cerebral perfusion. UPI measurement of cerebral perfusion was performed three times in healthy patients. The coefficients of variation (CVs) of the three bolus injections were calculated for both time- and volume-derived perfusion parameters in the macro- and microcirculation. The time-derived UPI parameters had the lowest CVs overall in both the macro- and microcirculation. The volume-related parameters had poorer repeatability, especially in the microcirculation. Both intra-observer and inter-observer variability were low. Although UPI is a promising tool for the bedside measurement of cerebral perfusion, improvement of the technique is required before implementation in routine clinical practice. Copyright © 2017 World Federation for Ultrasound in Medicine and Biology. Published by Elsevier Inc. All rights reserved.

  14. Effects of Frequency Drift on the Quantification of Gamma-Aminobutyric Acid Using MEGA-PRESS

    NASA Astrophysics Data System (ADS)

    Tsai, Shang-Yueh; Fang, Chun-Hao; Wu, Thai-Yu; Lin, Yi-Ru

    2016-04-01

    The MEGA-PRESS method is the most common method used to measure γ-aminobutyric acid (GABA) in the brain at 3T. It has been shown that the underestimation of the GABA signal due to B0 drift up to 1.22 Hz/min can be reduced by post-frequency alignment. In this study, we show that the underestimation of GABA can still occur even with post frequency alignment when the B0 drift is up to 3.93 Hz/min. The underestimation can be reduced by applying a frequency shift threshold. A total of 23 subjects were scanned twice to assess the short-term reproducibility, and 14 of them were scanned again after 2-8 weeks to evaluate the long-term reproducibility. A linear regression analysis of the quantified GABA versus the frequency shift showed a negative correlation (P < 0.01). Underestimation of the GABA signal was found. When a frequency shift threshold of 0.125 ppm (15.5 Hz or 1.79 Hz/min) was applied, the linear regression showed no statistically significant difference (P > 0.05). Therefore, a frequency shift threshold at 0.125 ppm (15.5 Hz) can be used to reduce underestimation during GABA quantification. For data with a B0 drift up to 3.93 Hz/min, the coefficients of variance of short-term and long-term reproducibility for the GABA quantification were less than 10% when the frequency threshold was applied.
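
    One plausible implementation of such a threshold, assuming per-transient frequency shifts are already available from the alignment step, is to reject drifted transients before averaging. Whether the paper applies the threshold per transient or per dataset is not settled by the abstract, so treat this sketch as an assumption:

    ```python
    import numpy as np

    def average_within_threshold(transients, freq_shift_hz, threshold_hz=15.5):
        """Average only the MEGA-PRESS transients whose estimated frequency
        shift stays within the threshold (0.125 ppm is about 15.5 Hz at 3 T).
        How the per-transient shifts are estimated is left to the aligner."""
        keep = np.abs(np.asarray(freq_shift_hz)) <= threshold_hz
        return np.asarray(transients)[keep].mean(axis=0), int(keep.sum())

    fids = np.random.default_rng(4).normal(size=(8, 1024))  # toy transients
    shifts = [2.0, 3.5, 18.0, 1.0, 16.2, 0.5, 4.4, 7.9]     # Hz, toy values
    avg, n_kept = average_within_threshold(fids, shifts)
    print("kept", n_kept, "of", len(shifts), "transients")
    ```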

  15. A Drosophila Toolkit for the Visualization and Quantification of Viral Replication Launched from Transgenic Genomes

    PubMed Central

    Wernet, Mathias F.; Klovstad, Martha; Clandinin, Thomas R.

    2014-01-01

    Arthropod RNA viruses pose a serious threat to human health, yet many aspects of their replication cycle remain incompletely understood. Here we describe a versatile Drosophila toolkit of transgenic, self-replicating genomes (‘replicons’) from Sindbis virus that allow rapid visualization and quantification of viral replication in vivo. We generated replicons expressing Luciferase for the quantification of viral replication, serving as useful new tools for large-scale genetic screens for identifying cellular pathways that influence viral replication. We also present a new binary system in which replication-deficient viral genomes can be activated ‘in trans’, through co-expression of an intact replicon contributing an RNA-dependent RNA polymerase. The utility of this toolkit for studying virus biology is demonstrated by the observation of stochastic exclusion between replicons expressing different fluorescent proteins, when co-expressed under control of the same cellular promoter. This process is analogous to ‘superinfection exclusion’ between virus particles in cell culture, a process that is incompletely understood. We show that viral polymerases strongly prefer to replicate the genome that encoded them, and that almost invariably only a single virus genome is stochastically chosen for replication in each cell. Our in vivo system now makes this process amenable to detailed genetic dissection. Thus, this toolkit allows the cell-type specific, quantitative study of viral replication in a genetic model organism, opening new avenues for molecular, genetic and pharmacological dissection of virus biology and tool development. PMID:25386852

  16. Magnetic resonance cell-tracking studies: spectrophotometry-based method for the quantification of cellular iron content after loading with superparamagnetic iron oxide nanoparticles.

    PubMed

    Böhm, Ingrid

    2011-08-01

    The purpose of this article is to present a user-friendly tool for quantifying the iron content of superparamagnetically labeled cells before cell tracking by magnetic resonance imaging (MRI). Iron quantification was evaluated by using Prussian blue staining and spectrophotometry. White blood cells were labeled with superparamagnetic iron oxide (SPIO) nanoparticles. Labeling was confirmed by light microscopy. Subsequently, the cells were embedded in a phantom and scanned on a 3 T magnetic resonance tomography (MRT) whole-body system. The mean peak wavelength λ(peak) was determined to be 720 nm (range 719-722 nm). Linearity was proven for the measuring range 0.5 to 10 μg Fe/mL (r = .9958; p = 2.2 × 10(-12)). The limit of detection was 0.01 μg Fe/mL (0.1785 mM), and the limit of quantification was 0.04 μg Fe/mL (0.714 mM). Accuracy was demonstrated by comparison with atomic absorption spectrometry. Precision and robustness were also proven. On T(2)-weighted images, signal intensity varied according to the iron concentration of SPIO-labeled cells. Absorption spectrophotometry is both a highly sensitive and user-friendly technique that is feasible for quantifying the iron content of magnetically labeled cells. The presented data suggest that spectrophotometry is a promising tool for promoting the implementation of magnetic resonance-based cell tracking in routine clinical applications (from bench to bedside).

  17. Identification and Affinity-Quantification of β-Amyloid and α-Synuclein Polypeptides Using On-Line SAW-Biosensor-Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Slamnoiu, Stefan; Vlad, Camelia; Stumbaum, Mihaela; Moise, Adrian; Lindner, Kathrin; Engel, Nicole; Vilanova, Mar; Diaz, Mireia; Karreman, Christiaan; Leist, Marcel; Ciossek, Thomas; Hengerer, Bastian; Vilaseca, Marta; Przybylski, Michael

    2014-08-01

    Bioaffinity analysis using a variety of biosensors has become an established tool for detection and quantification of biomolecular interactions. Biosensors, however, are generally limited by the lack of chemical structure information of affinity-bound ligands. On-line bioaffinity-mass spectrometry using a surface-acoustic wave biosensor (SAW-MS) is a new combination providing simultaneous affinity detection, quantification, and mass spectrometric structural characterization of ligands. We describe here an on-line SAW-MS combination for direct identification and affinity determination, using a new interface for MS of the affinity-isolated ligand eluate. The key element of the SAW-MS combination is a microfluidic interface that integrates affinity-isolation on a gold chip, in-situ sample concentration, and desalting with a microcolumn for MS of the ligand eluate from the biosensor. Suitable MS-acquisition software has been developed that provides coupling of the SAW-MS interface to Bruker Daltonics ion trap-MS, FTICR-MS, and Waters Synapt-QTOF-MS systems. Applications are presented for mass spectrometric identifications and affinity (KD) determinations of the neurodegenerative polypeptides β-amyloid (Aβ) and the pathophysiological and physiological synucleins (α- and β-synucleins), two key polypeptide systems for Alzheimer's disease and Parkinson's disease, respectively. Moreover, first in vivo applications to αSyn polypeptides from brain homogenate show the feasibility of on-line affinity-MS for the direct analysis of biological material. These results demonstrate on-line SAW-bioaffinity-MS as a powerful tool for structural and quantitative analysis of biopolymer interactions.

  18. Machine Detection of Enhanced Electromechanical Energy Conversion in PbZr 0.2Ti 0.8O 3 Thin Films

    DOE PAGES

    Agar, Joshua C.; Cao, Ye; Naul, Brett; ...

    2018-05-28

    Many energy conversion, sensing, and microelectronic applications based on ferroic materials are determined by the domain structure evolution under applied stimuli. New hyperspectral, multidimensional spectroscopic techniques now probe dynamic responses at relevant length and time scales to provide an understanding of how these nanoscale domain structures impact macroscopic properties. Such approaches, however, remain limited in use because of the difficulties that exist in extracting and visualizing scientific insights from these complex datasets. Using multidimensional band-excitation scanning probe spectroscopy and adapting tools from both computer vision and machine learning, an automated workflow is developed to featurize, detect, and classify signatures of ferroelectric/ferroelastic switching processes in complex ferroelectric domain structures. This approach enables the identification and nanoscale visualization of varied modes of response and a pathway to statistically meaningful quantification of the differences between those modes. Lastly, among other things, the importance of domain geometry is spatially visualized for enhancing nanoscale electromechanical energy conversion.

  19. Chiral EFT based nuclear forces: achievements and challenges

    NASA Astrophysics Data System (ADS)

    Machleidt, R.; Sammarruca, F.

    2016-08-01

    During the past two decades, chiral effective field theory has become a popular tool to derive nuclear forces from first principles. Two-nucleon interactions have been worked out up to sixth order of chiral perturbation theory and three-nucleon forces up to fifth order. Applications of some of these forces have been conducted in nuclear few- and many-body systems—with a certain degree of success. But in spite of these achievements, we are still faced with great challenges. Among them is the issue of a proper uncertainty quantification of predictions obtained when applying these forces in ab initio calculations of nuclear structure and reactions. A related problem is the order by order convergence of the chiral expansion. We start this review with a pedagogical introduction and then present the current status of the field of chiral nuclear forces. This is followed by a discussion of representative examples for the application of chiral two- and three-body forces in the nuclear many-body system including convergence issues.

  20. Study of vortex-assisted MSPD and LC-MS/MS using alternative solid supports for pharmaceutical extraction from marketed fish.

    PubMed

    Hertzog, Gabriel I; Soares, Karina L; Caldas, Sergiane S; Primel, Ednei G

    2015-06-01

    A procedure based on vortex-assisted matrix solid-phase dispersion (MSPD) for the extraction of 15 pharmaceuticals from fish samples, with determination by liquid chromatography-tandem mass spectrometry (LC-MS/MS), was validated. Florisil, C18, diatomaceous earth, chitin, and chitosan were evaluated as solid supports. Best results were obtained with 0.5 g of diatomaceous earth, 0.5 g of sodium sulfate, and 5 mL of methanol. Analytical recoveries ranged from 58 to 128% with relative standard deviations (RSDs) lower than 15%. Limit of quantification (LOQ) values for the 15 compounds ranged from 5 to 1000 ng g(-1). The method proved to be a simple and fast extraction tool that requires minimal instrumentation and little reagent, resulting in low cost. In addition, chitin and chitosan, applied here to the dispersion step for the first time, were found to be interesting alternative solid supports.

  1. Causal Relationships Among Time Series of the Lange Bramke Catchment (Harz Mountains, Germany)

    NASA Astrophysics Data System (ADS)

    Aufgebauer, Britta; Hauhs, Michael; Bogner, Christina; Meesenburg, Henning; Lange, Holger

    2016-04-01

    Convergent Cross Mapping (CCM) has recently been introduced by Sugihara et al. for the identification and quantification of causal relationships among ecosystem variables. In particular, the method allows one to decide on the direction of causality; in some cases, the causality may be bidirectional, indicating a network structure. We extend this approach by introducing a surrogate-data method to obtain confidence intervals for CCM results. We then apply this method to time series from stream water chemistry. Specifically, we analyze a set of eight dissolved major ions from three different catchments belonging to the hydrological monitoring system in the Bramke valley in the Harz Mountains, Germany. Our results demonstrate the potential and limits of CCM as a monitoring instrument in forestry and hydrology and as a tool to identify processes in ecosystem research. While some networks of causally linked ions can be associated with simple physical and chemical processes, other results illustrate peculiarities of the three studied catchments, which are explained in the context of their special history.
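
    For orientation, here is a compact sketch of cross-map skill plus a phase-randomized surrogate threshold in the spirit of the approach described above. The simplex-weight formula follows the standard CCM recipe; the surrogate scheme is one of several possible choices and is not necessarily the authors'.

    ```python
    import numpy as np
    from sklearn.neighbors import NearestNeighbors

    def ccm_skill(x, y, dim=3, delay=1):
        """Cross-map skill: how well the attractor built from x predicts y.
        A high correlation is read, in CCM, as 'y causally influences x'."""
        x, y = np.asarray(x), np.asarray(y)
        n = len(x) - (dim - 1) * delay
        shadow = np.column_stack([x[i*delay : i*delay + n] for i in range(dim)])
        target = y[(dim - 1) * delay:]
        dist, idx = NearestNeighbors(n_neighbors=dim + 2).fit(shadow) \
                                                         .kneighbors(shadow)
        dist, idx = dist[:, 1:], idx[:, 1:]          # drop the self-neighbor
        w = np.exp(-dist / np.maximum(dist[:, :1], 1e-12))
        w /= w.sum(axis=1, keepdims=True)            # simplex weights
        y_hat = (w * target[idx]).sum(axis=1)
        return np.corrcoef(y_hat, target)[0, 1]

    def surrogate_threshold(x, y, n_surr=50, q=0.95, seed=0):
        """Null quantile of skill from phase-randomized surrogates of y."""
        rng, fy, skills = np.random.default_rng(seed), np.fft.rfft(y), []
        for _ in range(n_surr):
            phases = rng.uniform(0, 2 * np.pi, fy.size)
            phases[0] = 0.0
            surr = np.fft.irfft(np.abs(fy) * np.exp(1j * phases), n=len(y))
            skills.append(ccm_skill(x, surr))
        return np.quantile(skills, q)

    # Demo: coupled logistic maps where y drives x (unidirectional coupling).
    x, y = np.zeros(400), np.zeros(400)
    x[0], y[0] = 0.4, 0.2
    for t in range(399):
        y[t+1] = y[t] * (3.8 - 3.8 * y[t])
        x[t+1] = x[t] * (3.7 - 3.7 * x[t] - 0.3 * y[t])
    print("skill:", ccm_skill(x, y), "95% null:", surrogate_threshold(x, y))
    ```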

  2. Precursor ion scan driven fast untargeted screening and semi-determination of caffeoylquinic acid derivatives in Cynara scolymus L.

    PubMed

    Shen, Qing; Lu, Yanbin; Dai, Zhiyuan; Cheung, Hon-Yeung

    2015-01-01

    A precursor ion scan (PIS) technique-based strategy was developed for rapid screening and semi-determination of caffeoylquinic acid derivatives (CADs) in artichoke (Cynara scolymus L.) using ultra-performance liquid chromatography (UPLC) coupled with tandem mass spectrometry. 1,5-Dicaffeoylquinic acid and 5-caffeoylquinic acid were used for studying the fragmentation behaviour of two classes of CADs, with m/z 191 set as the diagnostic fragment. When the method was applied to an artichoke sample, ten CADs were detected and elucidated in a single PIS run. Furthermore, the method was validated for specificity (no interference), linearity (≥0.9993), limit of detection (LOD<0.12 ng mL(-1)) and limit of quantification (LOQ<0.25 ng mL(-1)), precision (RSD≤3.6), recovery (91.4-95.9%), and stability (at least 12 h). This approach proved to be a powerful, selective, and sensitive tool for rapid screening and semi-determination of untargeted components in natural products. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Machine Detection of Enhanced Electromechanical Energy Conversion in PbZr 0.2Ti 0.8O 3 Thin Films

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agar, Joshua C.; Cao, Ye; Naul, Brett

    Many energy conversion, sensing, and microelectronic applications based on ferroic materials are determined by the domain structure evolution under applied stimuli. New hyperspectral, multidimensional spectroscopic techniques now probe dynamic responses at relevant length and time scales to provide an understanding of how these nanoscale domain structures impact macroscopic properties. Such approaches, however, remain limited in use because of the difficulties that exist in extracting and visualizing scientific insights from these complex datasets. Using multidimensional band-excitation scanning probe spectroscopy and adapting tools from both computer vision and machine learning, an automated workflow is developed to featurize, detect, and classify signatures of ferroelectric/ferroelastic switching processes in complex ferroelectric domain structures. This approach enables the identification and nanoscale visualization of varied modes of response and a pathway to statistically meaningful quantification of the differences between those modes. Lastly, among other things, the importance of domain geometry is spatially visualized for enhancing nanoscale electromechanical energy conversion.

  4. JAva GUi for Applied Research (JAGUAR) v 3.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    JAGUAR is a Java software tool for automatically rendering a graphical user interface (GUI) from a structured input specification. It is designed as a plug-in to the Eclipse workbench to enable users to create, edit, and externally execute analysis application input decks and then view the results. JAGUAR serves as a GUI for Sandia's DAKOTA software toolkit for optimization and uncertainty quantification. It will include problem (input deck) set-up, option specification, analysis execution, and results visualization. Through the use of wizards, templates, and views, JAGUAR helps users navigate the complexity of DAKOTA's complete input specification. JAGUAR is implemented in Java, leveraging Eclipse extension points and the Eclipse user interface. JAGUAR parses a DAKOTA NIDR input specification and presents the user with linked graphical and plain-text representations of problem set-up and option specification for DAKOTA studies. After the data has been input by the user, JAGUAR generates one or more input files for DAKOTA, executes DAKOTA, and captures and interprets the results.

  5. Rapid identification and quantification of Campylobacter coli and Campylobacter jejuni by real-time PCR in pure cultures and in complex samples

    PubMed Central

    2011-01-01

    Background Campylobacter spp., especially Campylobacter jejuni (C. jejuni) and Campylobacter coli (C. coli), are recognized as the leading human foodborne pathogens in developed countries. Livestock animals carrying Campylobacter pose an important risk for human contamination. Pigs are known to be frequently colonized with Campylobacter, especially C. coli, and to excrete high numbers of this pathogen in their faeces. Molecular tools, notably real-time PCR, provide an effective, rapid, and sensitive alternative to culture-based methods for the detection of C. coli and C. jejuni in various substrates. In order to serve as a diagnostic tool supporting Campylobacter epidemiology, we developed a quantitative real-time PCR method for species-specific detection and quantification of C. coli and C. jejuni directly in faecal, feed, and environmental samples. Results With a sensitivity of 10 genome copies and a linear range of seven to eight orders of magnitude, the C. coli and C. jejuni real-time PCR assays allowed a precise quantification of purified DNA from C. coli and C. jejuni. The assays were highly specific and showed a 6-log-linear dynamic range of quantification with a quantitative detection limit of approximately 2.5 × 102 CFU/g of faeces, 1.3 × 102 CFU/g of feed, and 1.0 × 103 CFU/m2 for the environmental samples. Compared to the results obtained by culture, both C. coli and C. jejuni real-time PCR assays exhibited a specificity of 96.2% with a kappa of 0.94 and 0.89 respectively. For faecal samples of experimentally infected pigs, the coefficients of correlation between the C. coli or C. jejuni real-time PCR assay and culture enumeration were R2 = 0.90 and R2 = 0.93 respectively. Conclusion The C. coli and C. jejuni real-time quantitative PCR assays developed in this study provide a method capable of directly detecting and quantifying C. coli and C. jejuni in faeces, feed, and environmental samples. These assays represent a new diagnostic tool for studying the epidemiology of Campylobacter by, for instance, investigating the carriage and excretion of C. coli and C. jejuni by pigs from conventional herds. PMID:21600037

  6. Computer-aided Assessment of Regional Abdominal Fat with Food Residue Removal in CT

    PubMed Central

    Makrogiannis, Sokratis; Caturegli, Giorgio; Davatzikos, Christos; Ferrucci, Luigi

    2014-01-01

    Rationale and Objectives: Separate quantification of abdominal subcutaneous and visceral fat regions is essential to understand the role of regional adiposity as risk factor in epidemiological studies. Fat quantification is often based on computed tomography (CT) because fat density is distinct from other tissue densities in the abdomen. However, the presence of intestinal food residues with densities similar to fat may reduce fat quantification accuracy. We introduce an abdominal fat quantification method in CT with interest in food residue removal. Materials and Methods: Total fat was identified in the feature space of Hounsfield units and divided into subcutaneous and visceral components using model-based segmentation. Regions of food residues were identified and removed from visceral fat using a machine learning method integrating intensity, texture, and spatial information. Cost-weighting and bagging techniques were investigated to address class imbalance. Results: We validated our automated food residue removal technique against semimanual quantifications. Our feature selection experiments indicated that joint intensity and texture features produce the highest classification accuracy at 95%. We explored generalization capability using k-fold cross-validation and receiver operating characteristic (ROC) analysis with variable k. Losses in accuracy and area under ROC curve between maximum and minimum k were limited to 0.1% and 0.3%. We validated tissue segmentation against reference semimanual delineations. The Dice similarity scores were as high as 93.1 for subcutaneous fat and 85.6 for visceral fat. Conclusions: Computer-aided regional abdominal fat quantification is a reliable computational tool for large-scale epidemiological studies. Our proposed intestinal food residue reduction scheme is an original contribution of this work. Validation experiments indicate very good accuracy and generalization capability. PMID:24119354
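
    The Hounsfield-unit thresholding that underlies CT fat quantification is simple to state in code. Below is a sketch assuming an abdominal-wall mask is already available (the paper's model-based segmentation and food-residue classifier are not reproduced here); the -190 to -30 HU window is a commonly used adipose range.

    ```python
    import numpy as np

    FAT_HU = (-190, -30)   # typical adipose window in Hounsfield units

    def fat_areas(ct_slice, inside_wall_mask, pixel_area_mm2):
        """Split thresholded fat into subcutaneous vs visceral compartments
        using a boolean mask of the region inside the abdominal wall."""
        fat = (ct_slice >= FAT_HU[0]) & (ct_slice <= FAT_HU[1])
        visceral = fat & inside_wall_mask
        subcutaneous = fat & ~inside_wall_mask
        return subcutaneous.sum() * pixel_area_mm2, visceral.sum() * pixel_area_mm2

    ct = np.random.default_rng(3).integers(-300, 200, size=(64, 64))  # toy HU
    wall = np.zeros((64, 64), bool)
    wall[16:48, 16:48] = True                      # toy abdominal-wall interior
    sat_mm2, vat_mm2 = fat_areas(ct, wall, pixel_area_mm2=0.6)
    print(f"SAT = {sat_mm2:.0f} mm^2, VAT = {vat_mm2:.0f} mm^2")
    ```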

  7. Computer-aided assessment of regional abdominal fat with food residue removal in CT.

    PubMed

    Makrogiannis, Sokratis; Caturegli, Giorgio; Davatzikos, Christos; Ferrucci, Luigi

    2013-11-01

    Separate quantification of abdominal subcutaneous and visceral fat regions is essential to understand the role of regional adiposity as risk factor in epidemiological studies. Fat quantification is often based on computed tomography (CT) because fat density is distinct from other tissue densities in the abdomen. However, the presence of intestinal food residues with densities similar to fat may reduce fat quantification accuracy. We introduce an abdominal fat quantification method in CT with interest in food residue removal. Total fat was identified in the feature space of Hounsfield units and divided into subcutaneous and visceral components using model-based segmentation. Regions of food residues were identified and removed from visceral fat using a machine learning method integrating intensity, texture, and spatial information. Cost-weighting and bagging techniques were investigated to address class imbalance. We validated our automated food residue removal technique against semimanual quantifications. Our feature selection experiments indicated that joint intensity and texture features produce the highest classification accuracy at 95%. We explored generalization capability using k-fold cross-validation and receiver operating characteristic (ROC) analysis with variable k. Losses in accuracy and area under ROC curve between maximum and minimum k were limited to 0.1% and 0.3%. We validated tissue segmentation against reference semimanual delineations. The Dice similarity scores were as high as 93.1 for subcutaneous fat and 85.6 for visceral fat. Computer-aided regional abdominal fat quantification is a reliable computational tool for large-scale epidemiological studies. Our proposed intestinal food residue reduction scheme is an original contribution of this work. Validation experiments indicate very good accuracy and generalization capability. Published by Elsevier Inc.

  8. Quantification of fossil organic matter in contaminated sediments from an industrial watershed: validation of the quantitative multimolecular approach by radiocarbon analysis.

    PubMed

    Jeanneau, Laurent; Faure, Pierre

    2010-09-01

    The quantitative multimolecular approach (QMA), based on an exhaustive identification and quantification of molecules from the extractable organic matter (EOM), has recently been developed in order to investigate organic contamination in sediments by a more complete method than the restrictive quantification of target contaminants. Such an approach allows (i) the comparison between natural and anthropogenic inputs, (ii) the comparison between modern and fossil organic matter, and (iii) the differentiation between several anthropogenic sources. However, QMA is based on the quantification of molecules recovered by organic solvent and then analyzed by gas chromatography-mass spectrometry, which represent only a small fraction of sedimentary organic matter (SOM). In order to extend the conclusions of QMA to SOM, radiocarbon analyses were performed on organic extracts and decarbonated sediments. This analysis allows (i) the differentiation between modern biomass (contemporary (14)C) and fossil organic matter ((14)C-free) and (ii) the calculation of the percentage of modern carbon (PMC). At the confluence of the Fensch and Moselle Rivers, a catchment highly contaminated by both industrial activities and urbanization, PMC values in decarbonated sediments are well correlated with the percentage of natural molecular markers determined by QMA. This highlights that, for this type of contamination by fossil organic matter inputs, the conclusions of QMA can be scaled up to SOM. QMA is an efficient environmental diagnostic tool that leads to a more realistic quantification of fossil organic matter in sediments. Copyright 2010 Elsevier B.V. All rights reserved.
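
    Because fossil organic matter is (14)C-free while modern biomass carries contemporary (14)C, the percentage of modern carbon behaves as a two-end-member mixing ratio. A minimal sketch of that arithmetic, assuming a simplified modern end-member of 100 PMC (contemporary biomass actually sits slightly above 100 PMC owing to bomb-derived (14)C):

    ```python
    def fossil_fraction(pmc_sample, pmc_modern=100.0):
        """Two-end-member mixing: fossil OM contributes 0 PMC, modern
        biomass contributes pmc_modern. Returns the fossil carbon fraction."""
        return 1.0 - pmc_sample / pmc_modern

    # A sediment measuring 35 PMC is ~65% fossil carbon under this model.
    print(f"{fossil_fraction(35.0):.0%} fossil")
    ```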

  9. Quantification of pelvic floor muscle strength in female urinary incontinence: A systematic review and comparison of contemporary methodologies.

    PubMed

    Deegan, Emily G; Stothers, Lynn; Kavanagh, Alex; Macnab, Andrew J

    2018-01-01

    There remains no gold standard for quantification of voluntary pelvic floor muscle (PFM) strength, despite international guidelines that recommend PFM assessment in females with urinary incontinence (UI). Methods currently reported for quantification of skeletal muscle strength across disciplines are systematically reviewed here, and their relevance for clinical and academic use related to the pelvic floor is described. A systematic review was conducted via Medline, PubMed, CINAHL, and the Cochrane database; key terms for pelvic floor anatomy and function were cross-referenced with skeletal muscle strength quantification from 1946 to 2016. Full-text peer-reviewed articles in English having female subjects with incontinence were identified. Each study was analyzed for use of controls, type of methodology as direct or indirect measures, and benefits and limitations of the technique. A total of 1586 articles were identified, of which 50 met the inclusion criteria. Nine methodologies of determining PFM strength were described, including: digital palpation, perineometry, dynamometry, EMG, vaginal cones, ultrasonography, magnetic resonance imaging, the urine stream interruption test, and the Colpexin pull test. Thirty-two percent lacked a control group. Technical refinements in both direct and indirect instrumentation for PFM strength measurement are improving sensitivity. However, the most common methods of quantification remain digital palpation and perineometry; techniques that pose limitations and yield subjective or indirect measures of muscular strength. Dynamometry has potential as an accurate and sensitive tool, but is limited by an inability to assess PFM strength during dynamic movements. © 2017 Wiley Periodicals, Inc.

  10. Quantification of integrated HIV DNA by repetitive-sampling Alu-HIV PCR on the basis of Poisson statistics.

    PubMed

    De Spiegelaere, Ward; Malatinkova, Eva; Lynch, Lindsay; Van Nieuwerburgh, Filip; Messiaen, Peter; O'Doherty, Una; Vandekerckhove, Linos

    2014-06-01

    Quantification of integrated proviral HIV DNA by repetitive-sampling Alu-HIV PCR is a candidate virological tool to monitor the HIV reservoir in patients. However, the experimental procedures and data analysis of the assay are complex and hinder its widespread use. Here, we provide an improved and simplified data analysis method by adopting binomial and Poisson statistics. A modified analysis method based on Poisson statistics was used to analyze the binomial data of positive and negative reactions from a 42-replicate Alu-HIV PCR, using dilutions of an integration standard and samples from 57 HIV-infected patients. Results were compared with the quantitative output of the previously described Alu-HIV PCR method. Poisson-based quantification of the Alu-HIV PCR was linearly correlated with the standard dilution series, indicating that absolute quantification with the Poisson method is a valid alternative for data analysis of repetitive-sampling Alu-HIV PCR data. Quantitative outputs of patient samples assessed by the Poisson method correlated with the previously described Alu-HIV PCR analysis, indicating that this method is a valid alternative for quantifying integrated HIV DNA. Poisson-based analysis of the Alu-HIV PCR data enables absolute quantification without the need for a standard dilution curve. Implementation of confidence interval (CI) estimation permits improved qualitative analysis of the data and provides a statistical basis for the required minimal number of technical replicates. © 2014 The American Association for Clinical Chemistry.
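
    The core of the Poisson-based analysis is the standard limiting-dilution estimator: if a fraction p of replicate reactions is negative, the mean number of templates per reaction is λ = -ln(p). The sketch below applies this to a 42-replicate layout and derives an approximate confidence interval from the binomial standard error of p via the delta method; it illustrates the statistical idea only and is not the authors' published analysis code.

    ```python
    import math

    def poisson_quantify(n_negative, n_total=42, z=1.96):
        """Estimate templates per reaction from all-or-none PCR replicates.

        lambda = -ln(p_neg); an approximate CI propagates the binomial
        standard error of p_neg through the log transform (delta method).
        """
        p = n_negative / n_total
        if p in (0.0, 1.0):
            raise ValueError("all-positive or all-negative plates are off-scale")
        lam = -math.log(p)
        se_p = math.sqrt(p * (1 - p) / n_total)
        se_lam = se_p / p  # d(-ln p)/dp = -1/p
        return lam, (lam - z * se_lam, lam + z * se_lam)

    lam, ci = poisson_quantify(n_negative=17)
    print(f"mean copies/reaction: {lam:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")
    ```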

  11. A new approach for the quantification of synchrony of multivariate non-stationary psychophysiological variables during emotion eliciting stimuli

    PubMed Central

    Kelava, Augustin; Muma, Michael; Deja, Marlene; Dagdagan, Jack Y.; Zoubir, Abdelhak M.

    2015-01-01

    Emotion eliciting situations are accompanied by changes in multiple variables associated with subjective, physiological, and behavioral responses. The quantification of the overall simultaneous synchrony of psychophysiological reactions plays a major role in emotion theories and has received increased attention in recent years. From a psychometric perspective, the reactions represent multivariate non-stationary intra-individual time series. In this paper, a new time-frequency based latent variable approach for the quantification of the synchrony of the responses is presented. The approach is applied to empirical data collected during an emotion eliciting situation, and the results are compared with the complementary inter-individual approach of Hsieh et al. (2011). Finally, the proposed approach is discussed in the context of emotion theories, and possible future applications and limitations are outlined. PMID:25653624

  12. Bayesian forecasting and uncertainty quantification of stream flows using the Metropolis-Hastings Markov chain Monte Carlo algorithm

    NASA Astrophysics Data System (ADS)

    Wang, Hongrui; Wang, Cheng; Wang, Ying; Gao, Xiong; Yu, Chen

    2017-06-01

    This paper presents a Bayesian approach using the Metropolis-Hastings Markov chain Monte Carlo algorithm and applies it to daily river flow rate forecasting and uncertainty quantification for the Zhujiachuan River, using data collected from Qiaotoubao Gage Station and 13 other gage stations in the Zhujiachuan watershed in China. The proposed method is also compared with conventional maximum likelihood estimation (MLE) for parameter estimation and quantification of associated uncertainties. While the Bayesian method performs similarly in estimating the mean value of daily flow rate, it outperforms the conventional MLE method in uncertainty quantification, providing a narrower credible interval than the MLE confidence interval and thus a more precise estimate by using the related information from regional gage stations. The Bayesian MCMC method may therefore be more favorable for uncertainty analysis and risk management.
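
    The sampler at the heart of such an approach fits in a few lines. The sketch below draws from a generic unnormalized log-posterior with a Gaussian random-walk proposal; the target density, step size, and burn-in are illustrative placeholders, not the study's actual flow-rate model.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def log_post(theta):
        # Placeholder unnormalized log-posterior; a real application would
        # combine a flow-model likelihood with priors on its parameters.
        return -0.5 * ((theta - 2.0) / 0.5) ** 2

    def metropolis_hastings(log_post, theta0, n_iter=5000, step=0.3):
        chain = np.empty(n_iter)
        theta, lp = theta0, log_post(theta0)
        for i in range(n_iter):
            prop = theta + step * rng.standard_normal()
            lp_prop = log_post(prop)
            # Accept with probability min(1, posterior ratio); proposal is symmetric.
            if np.log(rng.uniform()) < lp_prop - lp:
                theta, lp = prop, lp_prop
            chain[i] = theta
        return chain

    chain = metropolis_hastings(log_post, theta0=0.0)
    burned = chain[1000:]  # discard burn-in
    print("posterior mean ~", burned.mean().round(2))
    print("95% credible interval:", np.percentile(burned, [2.5, 97.5]).round(2))
    ```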

  13. A simple and efficient method for poly-3-hydroxybutyrate quantification in diazotrophic bacteria within 5 minutes using flow cytometry.

    PubMed

    Alves, L P S; Almeida, A T; Cruz, L M; Pedrosa, F O; de Souza, E M; Chubatsu, L S; Müller-Santos, M; Valdameri, G

    2017-01-16

    The conventional method for quantification of polyhydroxyalkanoates based on whole-cell methanolysis and gas chromatography (GC) is laborious and time-consuming. In this work, a method based on flow cytometry of Nile red stained bacterial cells was established to quantify poly-3-hydroxybutyrate (PHB) production by the diazotrophic and plant-associated bacteria, Herbaspirillum seropedicae and Azospirillum brasilense. The method consists of three steps: i) cell permeabilization, ii) Nile red staining, and iii) analysis by flow cytometry. The method was optimized step-by-step and can be carried out in less than 5 min. The final results indicated a high correlation coefficient (R2=0.99) compared to a standard method based on methanolysis and GC. This method was successfully applied to the quantification of PHB in epiphytic bacteria isolated from rice roots.
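
    Validating a rapid fluorescence readout against the reference GC method amounts to fitting and checking a linear calibration. A minimal sketch with made-up paired measurements follows; numpy's polyfit/corrcoef stand in for whatever statistics package the authors actually used.

    ```python
    import numpy as np

    # Hypothetical paired measurements: Nile red median fluorescence (a.u.)
    # vs. PHB content by methanolysis/GC (% of cell dry weight).
    fluorescence = np.array([120, 340, 610, 980, 1490, 2010])
    phb_gc = np.array([2.1, 8.0, 15.2, 24.9, 37.8, 51.0])

    slope, intercept = np.polyfit(fluorescence, phb_gc, 1)
    r2 = np.corrcoef(fluorescence, phb_gc)[0, 1] ** 2
    print(f"PHB% ~ {slope:.4f} * MFI + {intercept:.2f}, R^2 = {r2:.3f}")

    # Once calibrated, new cytometry readings convert directly to PHB content.
    print("predicted PHB% at MFI 800:", round(slope * 800 + intercept, 1))
    ```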

  14. Sustaining an Online, Shared Community Resource for Models, Robust Open Source Software Tools and Data for Volcanology - the Vhub Experience

    NASA Astrophysics Data System (ADS)

    Patra, A. K.; Valentine, G. A.; Bursik, M. I.; Connor, C.; Connor, L.; Jones, M.; Simakov, N.; Aghakhani, H.; Jones-Ivey, R.; Kosar, T.; Zhang, B.

    2015-12-01

    Over the last 5 years we have created a community collaboratory, Vhub.org [Palma et al., J. App. Volc. 3:2 doi:10.1186/2191-5040-3-2], as a place to find volcanology-related resources, a venue for users to disseminate tools, teaching resources, and data, and an online platform to support collaborative efforts. As the community (current active users > 6000 from an estimated community of comparable size) embeds the tools in the collaboratory into educational and research workflows, it became imperative to: a) redesign tools into robust, open source, reusable software for online and offline usage/enhancement; b) share large datasets with remote collaborators and other users seamlessly and securely; c) support complex workflows for uncertainty analysis, validation and verification, and data assimilation with large data. The focus on tool development/redevelopment has been twofold: first, to use best practices in software engineering and new hardware like multi-core and graphics processing units; second, to enhance capabilities to support inverse modeling, uncertainty quantification using large ensembles and design of experiments, calibration, and validation. Software engineering practices we follow include open sourcing to facilitate community contributions, modularity, and reusability. Our initial targets are four popular tools on Vhub: TITAN2D, TEPHRA2, PUFF, and LAVA. Use of tools like these requires many observation-driven data sets, e.g. digital elevation models of topography, satellite imagery, and field observations on deposits. These data are often maintained in private repositories and shared by "sneaker-net". As a partial solution to this, we tested mechanisms using iRODS software for online sharing of private data with public metadata and access limits. Finally, we adapted workflow engines (e.g. Pegasus) to support the complex data and computing workflows needed for uses such as uncertainty quantification for hazard analysis using physical models.

  15. Large differences in land use emission quantifications implied by definition discrepancies

    NASA Astrophysics Data System (ADS)

    Stocker, B. D.; Joos, F.

    2015-03-01

    The quantification of CO2 emissions from anthropogenic land use and land use change (eLUC) is essential to understand the drivers of the atmospheric CO2 increase and to inform climate change mitigation policy. Reported values in synthesis reports are commonly derived from different approaches (observation-driven bookkeeping and process-modelling), but recent work has emphasized that inconsistencies between methods may imply substantial differences in eLUC estimates. However, a consistent quantification is lacking and no concise modelling protocol for the separation of primary and secondary components of eLUC has been established. Here, we review the conceptual differences between eLUC quantification methods and apply an Earth System Model to demonstrate that what is claimed to represent total eLUC differs by up to ~20% when quantified from ESMs vs. offline vegetation models. Under a future business-as-usual scenario, differences tend to increase further due to slowing land conversion rates and an increasing impact of altered environmental conditions on land-atmosphere fluxes. We establish how coupled Earth System Models may be applied to separate component fluxes of eLUC arising from the replacement of potential C sinks/sources and from the land use feedback, and show that secondary fluxes derived from offline vegetation models are conceptually and quantitatively identical to neither, nor to their sum. Therefore, we argue that synthesis studies and global carbon budget accounting should resort to the "least common denominator" of different methods, following the bookkeeping approach where only primary land use emissions are quantified under the assumption of constant environmental boundary conditions.

  16. Quantifying differences in land use emission estimates implied by definition discrepancies

    NASA Astrophysics Data System (ADS)

    Stocker, B. D.; Joos, F.

    2015-11-01

    The quantification of CO2 emissions from anthropogenic land use and land use change (eLUC) is essential to understand the drivers of the atmospheric CO2 increase and to inform climate change mitigation policy. Reported values in synthesis reports are commonly derived from different approaches (observation-driven bookkeeping and process-modelling), but recent work has emphasized that inconsistencies between methods may imply substantial differences in eLUC estimates. However, a consistent quantification is lacking and no concise modelling protocol for the separation of primary and secondary components of eLUC has been established. Here, we review differences between eLUC quantification methods and apply an Earth System Model (ESM) of Intermediate Complexity to quantify them. We find that the magnitude of effects due to merely conceptual differences between ESM- and offline vegetation model-based quantifications is ~20% today. Under a future business-as-usual scenario, differences tend to increase further due to slowing land conversion rates and an increasing impact of altered environmental conditions on land-atmosphere fluxes. We establish how coupled Earth System Models may be applied to separate secondary component fluxes of eLUC arising from the replacement of potential C sinks/sources and from the land use feedback, and show that secondary fluxes derived from offline vegetation models are conceptually and quantitatively identical to neither, nor to their sum. Therefore, we argue that synthesis studies should resort to the "least common denominator" of different methods, following the bookkeeping approach where only primary land use emissions are quantified under the assumption of constant environmental boundary conditions.

  17. Quantification of the xenoestrogens 4-tert.-octylphenol and bisphenol A in water and in fish tissue based on microwave assisted extraction, solid-phase extraction and liquid chromatography-mass spectrometry.

    PubMed

    Pedersen, S N; Lindholst, C

    1999-12-09

    Extraction methods were developed for quantification of the xenoestrogens 4-tert.-octylphenol (tOP) and bisphenol A (BPA) in water and in liver and muscle tissue from the rainbow trout (Oncorhynchus mykiss). The extraction of tOP and BPA from tissue samples was carried out using microwave-assisted solvent extraction (MASE) followed by solid-phase extraction (SPE). Water samples were extracted using SPE alone. For the quantification of tOP and BPA, liquid chromatography-mass spectrometry (LC-MS) equipped with an atmospheric pressure chemical ionisation (APCI) interface was applied. The combined methods for tissue extraction allow the use of small sample amounts of liver or muscle (typically 1 g), low volumes of solvent (20 ml), and short extraction times (25 min). Limits of quantification of tOP in tissue samples were found to be approximately 10 ng/g in muscle and 50 ng/g in liver (both based on 1 g of fresh tissue). The corresponding values for BPA were approximately 50 ng/g in both muscle and liver tissue. In water, the limit of quantification for tOP and BPA was approximately 0.1 µg/l (based on a 100 ml sample size).

  18. A Taiwanese Mandarin Main Concept Analysis (TM-MCA) for Quantification of Aphasic Oral Discourse

    ERIC Educational Resources Information Center

    Kong, Anthony Pak-Hin; Yeh, Chun-Chih

    2015-01-01

    Background: Various quantitative systems have been proposed to examine aphasic oral narratives in English. A clinical tool for assessing discourse produced by Cantonese-speaking persons with aphasia (PWA), namely Main Concept Analysis (MCA), was developed recently for quantifying the presence, accuracy and completeness of a narrative. Similar…

  19. PyQuant: A Versatile Framework for Analysis of Quantitative Mass Spectrometry Data.

    PubMed

    Mitchell, Christopher J; Kim, Min-Sik; Na, Chan Hyun; Pandey, Akhilesh

    2016-08-01

    Quantitative mass spectrometry data necessitates an analytical pipeline that captures the accuracy and comprehensiveness of the experiments. Currently, data analysis is often coupled to specific software packages, which restricts the analysis to a given workflow and precludes a more thorough characterization of the data by other complementary tools. To address this, we have developed PyQuant, a cross-platform mass spectrometry data quantification application that is compatible with existing frameworks and can be used as a stand-alone quantification tool. PyQuant supports most types of quantitative mass spectrometry data, including SILAC, NeuCode, (15)N, (13)C, or (18)O labeling and chemical methods such as iTRAQ or TMT, and provides the option of adding custom labeling strategies. In addition, PyQuant can perform specialized analyses such as quantifying isotopically labeled samples where the label has been metabolized into other amino acids and targeted quantification of selected ions independent of spectral assignment. PyQuant is capable of quantifying search results from popular proteomic frameworks such as MaxQuant, Proteome Discoverer, and the Trans-Proteomic Pipeline, in addition to several standalone search engines. We have found that PyQuant routinely quantifies a greater proportion of spectral assignments, with increases ranging from 25% to 45% in this study. Finally, PyQuant is capable of complementing spectral assignments between replicates to quantify ions missed because of lack of MS/MS fragmentation or omitted because of issues such as spectral quality or false discovery rates. This results in an increase of biologically useful data available for interpretation. In summary, PyQuant is a flexible mass spectrometry data quantification platform that is capable of interfacing with a variety of existing formats and is highly customizable, which permits easy configuration for custom analyses. © 2016 by The American Society for Biochemistry and Molecular Biology, Inc.

  20. Interferences in the direct quantification of bisphenol S in paper by means of thermochemolysis.

    PubMed

    Becerra, Valentina; Odermatt, Jürgen

    2013-02-01

    This article analyses the interferences in the quantification of traces of bisphenol S in paper by applying the direct analytical method "analytical pyrolysis gas chromatography-mass spectrometry" (Py-GC/MS) in conjunction with on-line derivatisation with tetramethylammonium hydroxide (TMAH). As the analytes are analysed simultaneously with the matrix, the interferences derive from the matrix. The investigated interferences are found in the analysis of paper samples that include bisphenol S derivative compounds. As free bisphenol S is the hydrolysis product of the bisphenol S derivative compounds, the detected amount of bisphenol S in the sample may be overestimated. It is found that the formation of free bisphenol S from the bisphenol S derivative compounds is enhanced in the presence of TMAH under pyrolytic conditions. In order to avoid this formation of free bisphenol S, trimethylsulphonium hydroxide (TMSH) is introduced as an alternative derivatisation reagent. Different parameters were optimised in the development of the quantification method with TMSH. The quantification method based on TMSH thermochemolysis has been validated in terms of reproducibility and accuracy. Copyright © 2012 Elsevier B.V. All rights reserved.

  1. Multivariate Analysis for Quantification of Plutonium(IV) in Nitric Acid Based on Absorption Spectra

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lines, Amanda M.; Adami, Susan R.; Sinkov, Sergey I.

    Development of more effective, reliable, and fast methods for monitoring process streams is a growing opportunity for analytical applications. Many fields can benefit from on-line monitoring, including the nuclear fuel cycle, where improved methods for monitoring radioactive materials will facilitate maintenance of proper safeguards and ensure safe and efficient processing of materials. On-line process monitoring with a focus on optical spectroscopy can provide a fast, non-destructive method for monitoring chemical species. However, identification and quantification of species can be hindered by the complexity of the solutions if bands overlap or show condition-dependent spectral features. Plutonium(IV) is one example of a species which displays significant spectral variation with changing nitric acid concentration. Single-variate analysis (i.e., Beer's Law) is difficult to apply to the quantification of Pu(IV) unless the nitric acid concentration is known and separate calibration curves have been made for all possible acid strengths. Multivariate, or chemometric, analysis is an approach that allows for the accurate quantification of Pu(IV) without a priori knowledge of the nitric acid concentration.
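
    A partial least squares calibration of the kind described is straightforward to sketch with scikit-learn: train on spectra spanning a range of Pu(IV) and nitric acid concentrations, then predict Pu(IV) in new spectra without being told the acid strength. The synthetic Gaussian-band spectra and concentration ranges below are placeholders for real absorption data.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(1)
    n_samples, n_wavelengths = 60, 200

    # Synthetic training set: absorbance = Pu band + acid-dependent shoulder + noise.
    pu = rng.uniform(0.01, 0.2, n_samples)    # mol/L Pu(IV), hypothetical range
    acid = rng.uniform(0.5, 8.0, n_samples)   # mol/L HNO3, hypothetical range
    wl = np.linspace(0, 1, n_wavelengths)
    spectra = (pu[:, None] * np.exp(-((wl - 0.5) / 0.05) ** 2)
               + 0.01 * acid[:, None] * np.exp(-((wl - 0.7) / 0.1) ** 2)
               + 0.001 * rng.standard_normal((n_samples, n_wavelengths)))

    pls = PLSRegression(n_components=4)
    pls.fit(spectra, pu)

    # Predict Pu(IV) for a held-out spectrum; acid concentration is never supplied.
    test = (0.12 * np.exp(-((wl - 0.5) / 0.05) ** 2)
            + 0.01 * 4.0 * np.exp(-((wl - 0.7) / 0.1) ** 2))
    pred = float(np.ravel(pls.predict(test[None, :]))[0])
    print("predicted Pu(IV) [mol/L]:", round(pred, 3))
    ```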

  2. Application of the NUREG/CR-6850 EPRI/NRC Fire PRA Methodology to a DOE Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tom Elicson; Bentley Harwood; Richard Yorg

    2011-03-01

    The application of the NUREG/CR-6850 EPRI/NRC fire PRA methodology to a DOE facility presented several challenges. This paper documents the process and discusses several insights gained during development of the fire PRA. A brief review of the tasks performed is provided, with particular focus on the following: • Tasks 5 and 14: Fire-induced risk model and fire risk quantification. A key lesson learned was to begin model development and quantification as early as possible in the project, using screening values and simplified modeling if necessary. • Tasks 3 and 9: Fire PRA cable selection and detailed circuit failure analysis. In retrospect, it would have been beneficial to perform the model development and quantification in two phases, with detailed circuit analysis applied during phase 2. This would have allowed for development of a robust model and quantification earlier in the project and would have provided insights into where to focus the detailed circuit analysis efforts. • Tasks 8 and 11: Scoping fire modeling and detailed fire modeling. More focus should be placed on detailed fire modeling and less on scoping fire modeling; this was the approach taken for the fire PRA. • Task 14: Fire risk quantification. Typically, multiple safe shutdown (SSD) components fail during a given fire scenario. Therefore, dependent failure analysis is critical to obtaining a meaningful fire risk quantification. Dependent failure analysis for the fire PRA presented several challenges, which are discussed in the full paper.

  3. Multiplex quantification of four DNA targets in one reaction with Bio-Rad droplet digital PCR system for GMO detection

    PubMed Central

    Dobnik, David; Štebih, Dejan; Blejec, Andrej; Morisset, Dany; Žel, Jana

    2016-01-01

    The advantages of digital PCR technology are by now well documented. One way to achieve better cost efficiency of the technique is to use it in a multiplexing strategy. Droplet digital PCR platforms, which include two fluorescence filters, support at least duplex reactions, and with some development and optimization higher multiplexing is possible. The present study not only shows the development of multiplex assays in droplet digital PCR, but also presents a first thorough evaluation of several parameters in such multiplex digital PCR. Two 4-plex assays were developed for quantification of 8 different DNA targets (7 genetically modified maize events and a maize endogene). Per assay, two of the targets were labelled with one fluorophore and two with another. As current analysis software does not support analysis beyond duplex reactions, a new R- and Shiny-based web application analysis tool (http://bit.ly/ddPCRmulti) was developed that automates the analysis of 4-plex results. In conclusion, the two developed multiplex assays are suitable for quantification of GMO maize events, and the same approach can be used in any other field with a need for accurate and reliable quantification of multiple DNA targets. PMID:27739510
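
    Absolute quantification in droplet digital PCR rests on the same Poisson correction in every fluorescence channel: from the fraction of negative droplets, the mean copies per droplet is λ = -ln(negative/total), and concentration follows from the droplet volume. A minimal sketch, assuming the commonly cited ~0.85 nL nominal droplet volume for Bio-Rad QX systems:

    ```python
    import math

    def ddpcr_concentration(n_negative, n_total, droplet_nl=0.85):
        """Copies per microlitre from droplet counts via Poisson correction.

        droplet_nl is the assumed droplet volume (~0.85 nL is a commonly
        cited nominal value for Bio-Rad QX systems; calibrate for real use).
        """
        lam = -math.log(n_negative / n_total)  # mean copies per droplet
        return lam / (droplet_nl * 1e-3)       # copies per uL of reaction

    # Example: 14,000 accepted droplets, 9,100 negative in one channel.
    print(f"{ddpcr_concentration(9100, 14000):.0f} copies/uL")
    ```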

  4. Quantification of the Water-Energy Nexus in Beijing City Based on Copula Analysis

    NASA Astrophysics Data System (ADS)

    Cai, J.; Cai, Y.

    2017-12-01

    Water and energy resources are intimately and highly interwoven, a relationship called the "water-energy nexus", which poses challenges for the sustainable management of both. In this research, the Copula analysis method is proposed, for the first time in the "water-energy nexus" field, to clarify the internal relationship between water and energy resources; it is a favorable tool for exploring the dependence among random variables. Beijing City, the capital of China, is chosen as a case study. The marginal distribution functions of water resource and energy resource are analyzed first. Then a bivariate Copula function is employed to construct the joint distribution function of the "water-energy nexus" in order to quantify the inherent relationship between water resource and energy resource. The results show that a lognormal distribution is most appropriate for the marginal distribution function of water resource, while a Weibull distribution better describes the marginal distribution function of energy resource. Furthermore, the bivariate Normal Copula function is the most suitable for constructing the joint distribution function of the "water-energy nexus" in Beijing City. These findings help to identify and quantify the "water-energy nexus" and can provide reasonable policy recommendations on the sustainable management of water and energy resources to promote regional coordinated development.
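
    The construction described, lognormal and Weibull marginals joined by a bivariate Normal (Gaussian) copula, can be sketched with scipy: fit the marginals, map each series to uniform scores through its fitted CDF, transform to normal scores, and estimate their correlation. The two series below are synthetic stand-ins for the Beijing water and energy data.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    water = rng.lognormal(mean=3.0, sigma=0.3, size=200)  # synthetic water use
    energy = water ** 0.5 * rng.weibull(2.0, 200) * 10    # correlated energy use

    # 1) Fit the marginal distributions chosen in the study.
    ln_shape, ln_loc, ln_scale = stats.lognorm.fit(water, floc=0)
    wb_shape, wb_loc, wb_scale = stats.weibull_min.fit(energy, floc=0)

    # 2) Probability-integral transform to uniforms, then to normal scores.
    u = stats.lognorm.cdf(water, ln_shape, ln_loc, ln_scale)
    v = stats.weibull_min.cdf(energy, wb_shape, wb_loc, wb_scale)
    z_u, z_v = stats.norm.ppf(u), stats.norm.ppf(v)

    # 3) The Gaussian copula is parameterized by the correlation of the scores.
    rho = np.corrcoef(z_u, z_v)[0, 1]
    print(f"Gaussian-copula correlation: {rho:.2f}")
    ```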

  5. Non-Euclidean phasor analysis for quantification of oxidative stress in ex vivo human skin exposed to sun filters using fluorescence lifetime imaging microscopy

    NASA Astrophysics Data System (ADS)

    Osseiran, Sam; Roider, Elisabeth M.; Wang, Hequn; Suita, Yusuke; Murphy, Michael; Fisher, David E.; Evans, Conor L.

    2017-12-01

    Chemical sun filters are commonly used as active ingredients in sunscreens due to their efficient absorption of ultraviolet (UV) radiation. Yet, it is known that these compounds can photochemically react with UV light and generate reactive oxygen species and oxidative stress in vitro, though this has yet to be validated in vivo. One label-free approach to probe oxidative stress is to measure and compare the relative endogenous fluorescence generated by cellular coenzymes nicotinamide adenine dinucleotides and flavin adenine dinucleotides. However, chemical sun filters are fluorescent, with emissive properties that contaminate endogenous fluorescent signals. To accurately distinguish the source of fluorescence in ex vivo skin samples treated with chemical sun filters, fluorescence lifetime imaging microscopy data were processed on a pixel-by-pixel basis using a non-Euclidean separation algorithm based on Mahalanobis distance and validated on simulated data. Applying this method, ex vivo samples exhibited a small oxidative shift when exposed to sun filters alone, though this shift was much smaller than that imparted by UV irradiation. Given the need for investigative tools to further study the clinical impact of chemical sun filters in patients, the reported methodology may be applied to visualize chemical sun filters and measure oxidative stress in patients' skin.
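
    The separation step, assigning each pixel to whichever fluorescence source it is closest to in Mahalanobis distance, is easy to sketch. The class statistics below are synthetic; in the study they would come from reference distributions of sun-filter and endogenous fluorescence in phasor space.

    ```python
    import numpy as np

    def mahalanobis_label(pixels, class_means, class_covs):
        """Assign each pixel (row) to the class with the smallest squared
        Mahalanobis distance to its mean."""
        dists = []
        for mu, cov in zip(class_means, class_covs):
            diff = pixels - mu
            inv = np.linalg.inv(cov)
            d2 = np.einsum("ij,jk,ik->i", diff, inv, diff)
            dists.append(d2)
        return np.argmin(np.stack(dists), axis=0)

    # Synthetic 2-D phasor coordinates for two fluorescence sources.
    rng = np.random.default_rng(3)
    endog = rng.multivariate_normal([0.4, 0.3], [[0.002, 0], [0, 0.001]], 300)
    filt = rng.multivariate_normal([0.6, 0.2], [[0.001, 0], [0, 0.003]], 300)
    pixels = np.vstack([endog, filt])

    labels = mahalanobis_label(
        pixels,
        class_means=[endog.mean(0), filt.mean(0)],
        class_covs=[np.cov(endog.T), np.cov(filt.T)],
    )
    print("fraction labelled as sun filter:", (labels == 1).mean())
    ```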

  6. Automated quantification of neuronal networks and single-cell calcium dynamics using calcium imaging

    PubMed Central

    Patel, Tapan P.; Man, Karen; Firestein, Bonnie L.; Meaney, David F.

    2017-01-01

    Background Recent advances in genetically engineered calcium and membrane potential indicators provide the potential to estimate the activation dynamics of individual neurons within larger, mesoscale networks (100s–1000+ neurons). However, a fully integrated automated workflow for the analysis and visualization of neural microcircuits from high-speed fluorescence imaging data is lacking. New method Here we introduce FluoroSNNAP, the Fluorescence Single Neuron and Network Analysis Package. FluoroSNNAP is an open-source, interactive software package developed in MATLAB for automated quantification of numerous biologically relevant features of both the calcium dynamics of single cells and network activity patterns. FluoroSNNAP integrates and improves upon existing tools for spike detection, synchronization analysis, and inference of functional connectivity, making it most useful to experimentalists with little or no programming knowledge. Results We apply FluoroSNNAP to characterize the activity patterns of neuronal microcircuits undergoing developmental maturation in vitro. Separately, we highlight the utility of single-cell analysis for phenotyping a mixed population of neurons expressing a human mutant variant of the microtubule-associated protein tau and wild-type tau. Comparison with existing method(s) We show the performance of semi-automated cell segmentation using spatiotemporal independent component analysis and significant improvement in detecting calcium transients using a template-based algorithm in comparison to peak-based or wavelet-based detection methods. Our software further enables automated analysis of microcircuits, which is an improvement over existing methods. Conclusions We expect the dissemination of this software will facilitate a comprehensive analysis of neuronal networks, promoting the rapid interrogation of circuits in health and disease. PMID:25629800

  7. Toward best practice framing of uncertainty in scientific publications: A review of Water Resources Research abstracts

    NASA Astrophysics Data System (ADS)

    Guillaume, Joseph H. A.; Helgeson, Casey; Elsawah, Sondoss; Jakeman, Anthony J.; Kummu, Matti

    2017-08-01

    Uncertainty is recognized as a key issue in water resources research, among other sciences. Discussions of uncertainty typically focus on tools and techniques applied within an analysis, e.g., uncertainty quantification and model validation. But uncertainty is also addressed outside the analysis, in writing scientific publications. The language that authors use conveys their perspective of the role of uncertainty when interpreting a claim—what we call here "framing" the uncertainty. This article promotes awareness of uncertainty framing in four ways. (1) It proposes a typology of eighteen uncertainty frames, addressing five questions about uncertainty. (2) It describes the context in which uncertainty framing occurs. This is an interdisciplinary topic, involving philosophy of science, science studies, linguistics, rhetoric, and argumentation. (3) We analyze the use of uncertainty frames in a sample of 177 abstracts from the Water Resources Research journal in 2015. This helped develop and tentatively verify the typology, and provides a snapshot of current practice. (4) We make provocative recommendations to achieve a more influential, dynamic science. Current practice in uncertainty framing might be described as carefully considered incremental science. In addition to uncertainty quantification and degree of belief (present in ˜5% of abstracts), uncertainty is addressed by a combination of limiting scope, deferring to further work (˜25%) and indicating evidence is sufficient (˜40%)—or uncertainty is completely ignored (˜8%). There is a need for public debate within our discipline to decide in what context different uncertainty frames are appropriate. Uncertainty framing cannot remain a hidden practice evaluated only by lone reviewers.

  8. Simultaneous determination of PPCPs, EDCs, and artificial sweeteners in environmental water samples using a single-step SPE coupled with HPLC-MS/MS and isotope dilution.

    PubMed

    Tran, Ngoc Han; Hu, Jiangyong; Ong, Say Leong

    2013-09-15

    A high-throughput method for the simultaneous determination of 24 pharmaceuticals and personal care products (PPCPs), endocrine disrupting chemicals (EDCs), and artificial sweeteners (ASs) was developed. The method was based on a single-step solid phase extraction (SPE) coupled with high performance liquid chromatography-tandem mass spectrometry (HPLC-MS/MS) and isotope dilution. In this study, a single-step SPE procedure was optimized for simultaneous extraction of all target analytes. Good recoveries (≥ 70%) were observed for all target analytes when extraction was performed using Chromabond(®) HR-X (500 mg, 6 mL) cartridges under acidic conditions (pH 2). HPLC-MS/MS parameters were optimized for the simultaneous analysis of the 24 PPCPs, EDCs, and ASs in a single injection. Quantification was performed using 13 isotopically labeled internal standards (ILIS), which efficiently corrects for losses of the analytes during the SPE procedure, matrix effects during HPLC-MS/MS, and fluctuations in MS/MS signal intensity due to the instrument. The method quantification limit (MQL) for most of the target analytes was below 10 ng/L in all water samples. The method was successfully applied to the simultaneous determination of PPCPs, EDCs, and ASs in raw wastewater, surface water, and groundwater samples collected in a local catchment area in Singapore. In conclusion, the developed method provides a valuable tool for investigating the occurrence, behavior, transport, and fate of PPCPs, EDCs, and ASs in the aquatic environment. Copyright © 2013 Elsevier B.V. All rights reserved.
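
    Isotope-dilution quantification itself reduces to a ratio: the analyte concentration is the analyte-to-internal-standard peak-area ratio scaled by the spiked ILIS concentration and a response factor from calibration. A hedged sketch of that arithmetic (the function name and the response factor of 1.0 are illustrative, not from the paper):

    ```python
    def isotope_dilution_conc(area_analyte, area_ilis, conc_ilis_ngL,
                              response_factor=1.0):
        """Concentration from an ILIS-normalized peak-area ratio.

        response_factor corrects for unequal response of analyte and its
        labelled analogue; it is determined from calibration standards and
        is set to 1.0 here purely for illustration.
        """
        return (area_analyte / area_ilis) * conc_ilis_ngL / response_factor

    # E.g. analyte peak 2.4x the ILIS peak with a 50 ng/L spike -> 120 ng/L.
    print(isotope_dilution_conc(2.4e5, 1.0e5, 50.0), "ng/L")
    ```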

  9. A computerized MRI biomarker quantification scheme for a canine model of Duchenne muscular dystrophy.

    PubMed

    Wang, Jiahui; Fan, Zheng; Vandenborne, Krista; Walter, Glenn; Shiloh-Malawsky, Yael; An, Hongyu; Kornegay, Joe N; Styner, Martin A

    2013-09-01

    Golden retriever muscular dystrophy (GRMD) is a widely used canine model of Duchenne muscular dystrophy (DMD). Recent studies have shown that magnetic resonance imaging (MRI) can be used to non-invasively detect consistent changes in both DMD and GRMD. In this paper, we propose a semiautomated system to quantify MRI biomarkers of GRMD. Our system was applied to a database of 45 MRI scans from 8 normal and 10 GRMD dogs in a longitudinal natural history study. We first segmented six proximal pelvic limb muscles using a semiautomated full muscle segmentation method. We then performed preprocessing, including intensity inhomogeneity correction, spatial registration of different image sequences, intensity calibration of T2-weighted and T2-weighted fat-suppressed images, and calculation of MRI biomarker maps. Finally, for each of the segmented muscles, we automatically measured MRI biomarkers of muscle volume, intensity statistics over MRI biomarker maps, and statistical image texture features. The muscle volume and the mean intensities in T2 value, fat, and water maps showed group differences between normal and GRMD dogs. For the statistical texture biomarkers, both the histogram and run-length matrix features showed obvious group differences between normal and GRMD dogs. The full muscle segmentation showed significantly less error and variability in the proposed biomarkers when compared to the standard, limited muscle range segmentation. The experimental results demonstrated that this quantification tool could reliably quantify MRI biomarkers in GRMD dogs, suggesting that it would also be useful for quantifying disease progression and measuring therapeutic effect in DMD patients.

  10. Unsupervised quantification of abdominal fat from CT images using Greedy Snakes

    NASA Astrophysics Data System (ADS)

    Agarwal, Chirag; Dallal, Ahmed H.; Arbabshirani, Mohammad R.; Patel, Aalpen; Moore, Gregory

    2017-02-01

    Adipose tissue has been associated with adverse consequences of obesity. Total adipose tissue (TAT) is divided into subcutaneous adipose tissue (SAT) and visceral adipose tissue (VAT). Intra-abdominal fat (VAT), located inside the abdominal cavity, is a major factor in the classic obesity-related pathologies. Since direct measurement of visceral and subcutaneous fat is not trivial, substitute metrics like waist circumference (WC) and body mass index (BMI) are used in clinical settings to quantify obesity. Abdominal fat can be assessed effectively using CT or MRI, but manual fat segmentation is rather subjective and time-consuming. Hence, an automatic and accurate quantification tool for abdominal fat is needed. The goal of this study is to extract TAT, VAT and SAT from abdominal CT in a fully automated, unsupervised fashion using energy minimization techniques. We applied a four-step framework consisting of 1) initial body contour estimation, 2) approximation of the body contour, 3) estimation of the inner abdominal contour using the Greedy Snakes algorithm, and 4) voting, to segment the subcutaneous and visceral fat. We validated our algorithm on 952 clinical abdominal CT images (from 476 patients with a very wide BMI range) collected from various radiology departments of Geisinger Health System. To our knowledge, this is the first study of its kind on such a large and diverse clinical dataset. Our algorithm obtained a 3.4% error for VAT segmentation compared to manual segmentation. These personalized and accurate measurements of fat can complement traditional population health driven obesity metrics such as BMI and WC.

  11. 3D visualization and quantification of bone and teeth mineralization for the study of osteo/dentinogenesis in mice models

    NASA Astrophysics Data System (ADS)

    Marchadier, A.; Vidal, C.; Ordureau, S.; Lédée, R.; Léger, C.; Young, M.; Goldberg, M.

    2011-03-01

    Research on bone and teeth mineralization in animal models is critical for understanding human pathologies. Genetically modified mice represent highly valuable models for the study of osteo/dentinogenesis defects and osteoporosis. Current investigations of mouse dental and skeletal phenotypes use destructive and time-consuming methods such as histology and scanning microscopy. Micro-CT imaging is quicker and provides high-resolution qualitative phenotypic description. However, reliable quantification of mineralization processes in mouse bone and teeth is still lacking. We have established novel CT imaging-based software for accurate qualitative and quantitative analysis of mouse mandibular bone and molars. Data were obtained from mandibles of mice lacking the fibromodulin gene, which is involved in mineralization processes. Mandibles were imaged with a micro-CT originally devoted to industrial applications (Viscom, X8060 NDT). 3D advanced visualization was performed using the VoxBox software (UsefulProgress) with ray casting algorithms. Comparison between control and defective mice mandibles was made by applying the same transfer function to each 3D data set, thus allowing detection of shape, colour, and density discrepancies. The 2D images of transverse slices of mandible and teeth were similar to, and even more accurate than, those obtained with scanning electron microscopy. Image processing of the molars allowed the 3D reconstruction of the pulp chamber, providing a unique tool for the quantitative evaluation of dentinogenesis. This new method is highly powerful for the study of oro-facial mineralization defects in mice models, complementary and even competitive to current histological and scanning microscopy approaches.

  12. The why and how of amino acid analytics in cancer diagnostics and therapy.

    PubMed

    Manig, Friederike; Kuhne, Konstantin; von Neubeck, Cläre; Schwarzenbolz, Uwe; Yu, Zhanru; Kessler, Benedikt M; Pietzsch, Jens; Kunz-Schughart, Leoni A

    2017-01-20

    Pathological alterations in cell functions are frequently accompanied by metabolic reprogramming, including modifications in amino acid metabolism. Amino acid detection is thus integral to the diagnosis of many hereditary metabolic diseases. The development of malignant diseases as metabolic disorders comes along with a complex dysregulation of genetic and epigenetic factors affecting metabolic enzymes. Cancer cells might transiently or permanently become auxotrophic for non-essential or semi-essential amino acids such as asparagine or arginine. Also, transformed cells are often more susceptible to local shortage of essential amino acids such as methionine than normal tissues. This offers new points of attack on unique metabolic features of cancer cells. To better understand these processes, highly sensitive methods for amino acid detection and quantification are required. Our review summarizes the main methodologies for amino acid detection, with a particular focus on applications in biomedicine and cancer, and provides a historical overview of the methodological prerequisites in amino acid analytics. We compare classical and modern approaches such as the combination of gas chromatography and liquid chromatography with mass spectrometry (GC-MS/LC-MS), the latter increasingly applied in clinical routine. We therefore illustrate an LC-MS workflow for analyzing arginine and methionine as well as their precursors and analogs in biological material. Pitfalls during protocol development are discussed, but LC-MS emerges as a reliable and sensitive tool for the detection of amino acids in biological matrices. Quantification is challenging, but of particular interest in cancer research, as targeting arginine and methionine turnover in cancer cells represents a novel treatment strategy. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Rapid amperometric detection of Escherichia coli in wastewater by measuring β-D-glucuronidase activity with disposable carbon sensors.

    PubMed

    Rochelet, Murielle; Solanas, Sébastien; Betelli, Laetitia; Chantemesse, Benoît; Vienney, Fabienne; Hartmann, Alain

    2015-09-10

    An assay for the indirect amperometric quantification of β-D-glucuronidase (GLUase) activity was developed for the rapid and specific detection of Escherichia coli (E. coli) in complex environmental samples. p-Aminophenyl β-D-glucopyranoside (PAPG) was selected as the electrochemical substrate for GLUase measurement, and the p-aminophenol (PAP) released during enzymatic hydrolysis was monitored by cyclic voltammetry with disposable carbon screen-printed sensors. The intensity of the measured anodic peak current was proportional to the amount of GLUase, and therefore to the number of E. coli in the tested sample. Once the substrate concentration and pH were optimized, a GLUase detection limit of 10 ng mL(-1) was achieved. Using a procedure involving filtration of the bacteria followed by their incubation with a substrate solution containing both the nonionic detergent Triton X-100 as permeabilization agent and Luria broth culture medium to support growth, filtered bacterial cells ranging from 5 × 10(4) to 10(8) CFU/membrane were detected within 3 h. The amperometric assay was applied to the determination of fecal contamination in raw and treated wastewater samples, and it compared successfully with conventional bacterial plating methods and uidA gene quantitative PCR. Owing to its ability to perform measurements in turbid media, the GLUase amperometric method is a reliable tool for the rapid and decentralized quantification of viable but also nonculturable E. coli in complex environmental samples. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. Effect of ultrasound on lactic acid production by Lactobacillus strains in date (Phoenix dactylifera var. Kabkab) syrup.

    PubMed

    Hashemi, Seyed Mohammad Bagher; Mousavi Khaneghah, Amin; Saraiva, Jorge A; Jambrak, Anet Režek; Barba, Francisco J; Mota, Maria J

    2018-03-01

    Date syrup is rich in fermentable sugars and may be used as a substrate for different microbial fermentations, including lactic acid fermentation processes. Beneficial effects of ultrasound (US) on bioprocesses have been reported for several microorganisms, owing to enhanced cell growth as well as improved yields and productivities. Therefore, US treatments (30 kHz, 100 W, 10-30 min) were applied to two lactobacilli (Lactobacillus helveticus PTCC 1332 and Lactobacillus acidophilus PTCC 1643) during fermentation using date syrup as substrate. The effects on lactic acid fermentation were evaluated by analyzing cell growth (dry cell weight and viable cell count), substrate consumption (quantification of glucose and fructose), and product formation (quantification of lactic acid) over time. The effects of US on cell membrane permeability were also evaluated. Both lactobacilli were able to grow well on date syrup without the addition of further ingredients. The US effects were highly dependent on treatment duration: treatments of 10 and 20 min stimulated lactobacilli growth, while extending the treatment to 30 min negatively affected cell growth. Similarly, the 10- and 20-min treatments increased sugar consumption and lactic acid production, in contrast to the 30-min treatment. All US treatments increased cell membrane permeability, with a more pronounced effect for longer treatments. The results of this work show that application of appropriate US treatments can be a useful tool for stimulating lactic acid production from date syrup, as well as for other fermentative processes that use date syrup as substrate.

  15. Extending the Range for Force Calibration in Magnetic Tweezers

    PubMed Central

    Daldrop, Peter; Brutzer, Hergen; Huhle, Alexander; Kauert, Dominik J.; Seidel, Ralf

    2015-01-01

    Magnetic tweezers are a widespread tool used to study the mechanics and the function of a large variety of biomolecules and biomolecular machines. This tool uses a magnetic particle and a strong magnetic field gradient to apply defined forces to the molecule of interest. Forces are typically quantified by analyzing the lateral fluctuations of the biomolecule-tethered particle in the direction perpendicular to the applied force. Since the magnetic field pins the anisotropy axis of the particle, the lateral fluctuations follow the geometry of a pendulum with a short pendulum length along and a long pendulum length perpendicular to the field lines. Typically, the short pendulum geometry is used for force calibration by power-spectral-density (PSD) analysis, because the movement of the bead in this direction can be approximated by a simple translational motion. Here, we provide a detailed analysis of the fluctuations according to the long pendulum geometry and show that for this direction, both the translational and the rotational motions of the particle have to be considered. We provide analytical formulas for the PSD of this coupled system that agree well with PSDs obtained in experiments and simulations and that finally allow a faithful quantification of the magnetic force for the long pendulum geometry. We furthermore demonstrate that this methodology allows the calibration of much larger forces than the short pendulum geometry in a tether-length-dependent manner. In addition, the accuracy of determination of the absolute force is improved. Our force calibration based on the long pendulum geometry will facilitate high-resolution magnetic-tweezers experiments that rely on short molecules and large forces, as well as highly parallelized measurements that use low frame rates. PMID:25992733
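
    In the simplest equipartition picture underlying PSD-based calibration, the lateral trap stiffness of the tethered bead is k = F/L, so the force follows from the variance of the lateral excursions: F = kB*T*L/<x^2>, with L the tether extension. The sketch below implements that estimator for the short-pendulum direction; the full PSD analysis with the rotational coupling derived in the paper for the long-pendulum geometry is beyond this illustration.

    ```python
    import numpy as np

    KB_T = 4.11e-21  # thermal energy at ~25 degC, in joules

    def force_from_fluctuations(x_m, tether_length_m):
        """Equipartition force estimate: F = kB*T*L / var(x).

        x_m: lateral bead positions (metres) in the short-pendulum direction.
        Ignores camera blur and finite-bandwidth corrections that proper
        PSD analysis handles.
        """
        return KB_T * tether_length_m / np.var(x_m)

    # Toy data: 30 nm RMS lateral fluctuations on a 1 um tether -> ~4.6 pN.
    rng = np.random.default_rng(4)
    x = rng.normal(0.0, 30e-9, 20000)
    print(f"{force_from_fluctuations(x, 1e-6) * 1e12:.1f} pN")
    ```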

  16. A Standardised Vocabulary for Identifying Benthic Biota and Substrata from Underwater Imagery: The CATAMI Classification Scheme

    PubMed Central

    Jordan, Alan; Rees, Tony; Gowlett-Holmes, Karen

    2015-01-01

    Imagery collected by still and video cameras is an increasingly important tool for minimal impact, repeatable observations in the marine environment. Data generated from imagery includes identification, annotation and quantification of biological subjects and environmental features within an image. To be long-lived and useful beyond their project-specific initial purpose, and to maximize their utility across studies and disciplines, marine imagery data should use a standardised vocabulary of defined terms. This would enable the compilation of regional, national and/or global data sets from multiple sources, contributing to broad-scale management studies and development of automated annotation algorithms. The classification scheme developed under the Collaborative and Automated Tools for Analysis of Marine Imagery (CATAMI) project provides such a vocabulary. The CATAMI classification scheme introduces Australian-wide acknowledged, standardised terminology for annotating benthic substrates and biota in marine imagery. It combines coarse-level taxonomy and morphology, and is a flexible, hierarchical classification that bridges the gap between habitat/biotope characterisation and taxonomy, acknowledging limitations when describing biological taxa through imagery. It is fully described, documented, and maintained through curated online databases, and can be applied across benthic image collection methods, annotation platforms and scoring methods. Following release in 2013, the CATAMI classification scheme was taken up by a wide variety of users, including government, academia and industry. This rapid acceptance highlights the scheme’s utility and the potential to facilitate broad-scale multidisciplinary studies of marine ecosystems when applied globally. Here we present the CATAMI classification scheme, describe its conception and features, and discuss its utility and the opportunities as well as challenges arising from its use. PMID:26509918

  17. Spot quantification in two dimensional gel electrophoresis image analysis: comparison of different approaches and presentation of a novel compound fitting algorithm

    PubMed Central

    2014-01-01

    Background Various computer-based methods exist for the detection and quantification of protein spots in two dimensional gel electrophoresis images. Area-based methods are commonly used for spot quantification: an area is assigned to each spot and the sum of the pixel intensities in that area, the so-called volume, is used as a measure of spot signal. Other methods use the optical density, i.e. the intensity of the most intense pixel of a spot, or calculate the volume from the parameters of a fitted function. Results In this study we compare the performance of different spot quantification methods using synthetic and real data. We propose a ready-to-use algorithm for spot detection and quantification that uses fitting of two dimensional Gaussian function curves for the extraction of data from two dimensional gel electrophoresis (2-DE) images. The algorithm implements fitting using logical compounds and is computationally efficient. The applicability of the compound fitting algorithm was evaluated for various simulated data and compared with other quantification approaches. We provide evidence that even if an incorrect bell-shaped function is used, the fitting method is superior to other approaches, especially when spots overlap. Finally, we validated the method with experimental data of urea-based 2-DE of Aβ peptides and re-analyzed published data sets. Our method showed higher precision and accuracy than other approaches when applied to exposure time series and standard gels. Conclusion Compound fitting as a quantification method for 2-DE spots shows several advantages over other approaches and can be combined with various spot detection methods. The algorithm was scripted in MATLAB (Mathworks) and is available as a supplemental file. PMID:24915860
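
    Fitting a single two-dimensional Gaussian to a spot and integrating the fitted surface to obtain the volume can be sketched with scipy's curve_fit; the compound-fitting logic the paper develops for overlapping spots goes beyond this single-spot illustration.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def gauss2d(coords, amp, x0, y0, sx, sy, offset):
        """2-D Gaussian on a background offset, returned flattened for curve_fit."""
        x, y = coords
        return (amp * np.exp(-((x - x0) ** 2 / (2 * sx ** 2)
                               + (y - y0) ** 2 / (2 * sy ** 2))) + offset).ravel()

    # Synthetic spot on a 40x40 patch with additive noise.
    x, y = np.meshgrid(np.arange(40), np.arange(40))
    rng = np.random.default_rng(5)
    img = gauss2d((x, y), 800, 19, 21, 3.0, 4.0, 50).reshape(40, 40)
    img += rng.normal(0, 10, img.shape)

    p0 = [img.max() - img.min(), 20, 20, 3, 3, img.min()]  # rough initial guess
    popt, _ = curve_fit(gauss2d, (x, y), img.ravel(), p0=p0)
    amp, sx, sy = popt[0], popt[3], popt[4]

    # Spot "volume" = integral of the fitted Gaussian above background.
    volume = 2 * np.pi * amp * sx * sy
    print(f"fitted volume: {volume:.0f} (true: {2 * np.pi * 800 * 3 * 4:.0f})")
    ```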

  18. Characterization and quantification of grape variety by means of shikimic acid concentration and protein fingerprint in still white wines.

    PubMed

    Chabreyrie, David; Chauvet, Serge; Guyon, François; Salagoïty, Marie-Hélène; Antinelli, Jean-François; Medina, Bernard

    2008-08-27

    Protein profiles, obtained by high-performance capillary electrophoresis (HPCE) of previously dialyzed white wines, combined with shikimic acid concentration and multivariate analysis, were used to determine the grape variety composition of a still white wine. Six varieties were studied through monovarietal wines made in the laboratory: Chardonnay (24 samples), Chenin (24), Petit Manseng (7), Sauvignon (37), Semillon (24), and Ugni Blanc (9). Mixtures were prepared in-house from authentic monovarietal wines according to a Plackett-Burman sampling plan. After protein peak area normalization, a data matrix was assembled containing the protein results of the wines (mixtures and monovarietal). Partial least-squares processing was applied to this matrix, allowing construction of a model that provided a varietal quantification precision of around 20% for most of the grape varieties studied. The model was applied to commercial samples from various geographical origins, providing encouraging results for control purposes.

  19. Multilevel and quasi-Monte Carlo methods for uncertainty quantification in particle travel times through random heterogeneous porous media

    NASA Astrophysics Data System (ADS)

    Crevillén-García, D.; Power, H.

    2017-08-01

    In this study, we apply four Monte Carlo simulation methods, namely, Monte Carlo, quasi-Monte Carlo, multilevel Monte Carlo and multilevel quasi-Monte Carlo to the problem of uncertainty quantification in the estimation of the average travel time during the transport of particles through random heterogeneous porous media. We apply the four methodologies to a model problem where the only input parameter, the hydraulic conductivity, is modelled as a log-Gaussian random field by using direct Karhunen-Loève decompositions. The random terms in such expansions represent the coefficients in the equations. Numerical calculations demonstrating the effectiveness of each of the methods are presented. A comparison of the computational cost incurred by each of the methods for three different tolerances is provided. The accuracy of the approaches is quantified via the mean square error.
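
    The multilevel idea is to telescope the expectation across grid resolutions, E[Q_L] = E[Q_0] + sum_l E[Q_l - Q_(l-1)], spending many cheap samples on coarse levels and few expensive ones on fine levels. A toy sketch with a stand-in "solver" whose discretization error decays with level is given below; a real implementation would solve the Darcy/transport problem on each grid and couple consecutive levels through common random inputs, as done here via shared samples.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    def solver(sample, level):
        """Stand-in for a PDE solve at grid level `level`: the exact quantity
        of interest is sample**2; discretization error decays as 2**-level."""
        return sample ** 2 + 2.0 ** (-level) * np.sin(40 * sample)

    def mlmc_estimate(n_per_level):
        """Telescoping-sum MLMC estimator; consecutive levels share the same
        random inputs (common random numbers) so corrections have low variance."""
        total = 0.0
        for level, n in enumerate(n_per_level):
            s = rng.uniform(0, 1, n)  # random input samples for this level
            fine = solver(s, level)
            corr = fine if level == 0 else fine - solver(s, level - 1)
            total += corr.mean()
        return total

    # Geometrically decreasing sample counts toward the expensive fine levels.
    print("MLMC estimate of E[Q]:",
          round(mlmc_estimate([40000, 10000, 2500, 600]), 4))
    # True E[sample**2] on U(0,1) is 1/3.
    ```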

  20. Multiplexed Liquid Chromatography-Multiple Reaction Monitoring Mass Spectrometry Quantification of Cancer Signaling Proteins

    PubMed Central

    Chen, Yi; Fisher, Kate J.; Lloyd, Mark; Wood, Elizabeth R.; Coppola, Domenico; Siegel, Erin; Shibata, David; Chen, Yian A.; Koomen, John M.

    2017-01-01

    Quantitative evaluation of protein expression across multiple cancer-related signaling pathways (e.g. Wnt/β-catenin, TGF-β, receptor tyrosine kinases (RTK), MAP kinases, NF-κB, and apoptosis) in tumor tissues may enable the development of a molecular profile for each individual tumor that can aid in the selection of appropriate targeted cancer therapies. Here, we describe the development of a broadly applicable protocol to develop and implement quantitative mass spectrometry assays using cell line models and frozen tissue specimens from colon cancer patients. Cell lines are used to develop peptide-based assays for protein quantification, which are incorporated into a method based on SDS-PAGE protein fractionation, in-gel digestion, and liquid chromatography-multiple reaction monitoring mass spectrometry (LC-MRM/MS). This analytical platform is then applied to frozen tumor tissues. This protocol can be broadly applied to the study of human disease using multiplexed LC-MRM assays. PMID:28808993

  1. Quantitative determination of clopidogrel and its metabolites in biological samples: a mini-review.

    PubMed

    Elsinghorst, Paul W

    2013-02-15

    Clopidogrel has been applied in antiplatelet therapy since 1998 and is the thienopyridine with the largest body of clinical experience. By 2011, clopidogrel (Plavix®) was the second top-selling drug in the world. Following complete patent expiry in 2012/2013, its use is expected to grow even further as generics enter the market. Prefaced by a brief description of clopidogrel metabolism, this review analyzes analytical methods for the quantification of clopidogrel and its metabolites in biological samples. The techniques applied to human plasma or serum are predominantly LC-MS and LC-MS/MS. The lowest level of clopidogrel quantification achieved is 5 pg/mL, the shortest runtime is 1.5 min, and almost 100% recovery has been reported using solid-phase extraction for sample preparation. Copyright © 2013 Elsevier B.V. All rights reserved.

  2. Probabilistic Methods for Uncertainty Propagation Applied to Aircraft Design

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Lin, Hong-Zong; Khalessi, Mohammad R.

    2002-01-01

    Three methods of probabilistic uncertainty propagation and quantification (the method of moments, Monte Carlo simulation, and a nongradient simulation search method) are applied to an aircraft analysis and conceptual design program to demonstrate design under uncertainty. The chosen example problems appear to have discontinuous design spaces and thus pose difficulties for many popular methods of uncertainty propagation and quantification. However, specific implementation features of the first and third methods chosen for this study enable successful propagation of small uncertainties through the program. Input uncertainties in two configuration design variables are considered. Uncertainties in aircraft weight are computed. The effects of specifying required levels of constraint satisfaction with specified levels of input uncertainty are also demonstrated. The results show, as expected, that designs under uncertainty are typically heavier and more conservative than those in which no input uncertainties exist.
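
    Of the three methods named above, plain Monte Carlo simulation is the easiest to illustrate. The sketch below propagates small input uncertainties in two configuration variables through a stand-in weight model; the actual study uses an aircraft conceptual-design code, so weight_model and all numbers here are hypothetical.

      # Hedged sketch of plain Monte Carlo uncertainty propagation; the model
      # and all numbers are illustrative stand-ins, not the NASA program.
      import numpy as np

      def weight_model(wing_area, aspect_ratio):
          # Placeholder response surface, not the aircraft analysis code.
          return 1000.0 + 12.0 * wing_area + 350.0 / aspect_ratio

      rng = np.random.default_rng(42)
      n = 10_000
      wing_area = rng.normal(150.0, 1.5, n)     # ~1% input uncertainty
      aspect_ratio = rng.normal(8.0, 0.08, n)

      weight = weight_model(wing_area, aspect_ratio)
      print(f"mean = {weight.mean():.1f}, std = {weight.std(ddof=1):.1f}")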

  3. Multilevel and quasi-Monte Carlo methods for uncertainty quantification in particle travel times through random heterogeneous porous media.

    PubMed

    Crevillén-García, D; Power, H

    2017-08-01

    In this study, we apply four Monte Carlo simulation methods, namely, Monte Carlo, quasi-Monte Carlo, multilevel Monte Carlo and multilevel quasi-Monte Carlo to the problem of uncertainty quantification in the estimation of the average travel time during the transport of particles through random heterogeneous porous media. We apply the four methodologies to a model problem where the only input parameter, the hydraulic conductivity, is modelled as a log-Gaussian random field by using a direct Karhunen-Loève decomposition, whose random coefficients carry the uncertainty into the governing equations. Numerical calculations demonstrating the effectiveness of each of the methods are presented. A comparison of the computational cost incurred by each of the methods for three different tolerances is provided. The accuracy of the approaches is quantified via the mean square error.

  4. Multilevel and quasi-Monte Carlo methods for uncertainty quantification in particle travel times through random heterogeneous porous media

    PubMed Central

    Power, H.

    2017-01-01

    In this study, we apply four Monte Carlo simulation methods, namely, Monte Carlo, quasi-Monte Carlo, multilevel Monte Carlo and multilevel quasi-Monte Carlo to the problem of uncertainty quantification in the estimation of the average travel time during the transport of particles through random heterogeneous porous media. We apply the four methodologies to a model problem where the only input parameter, the hydraulic conductivity, is modelled as a log-Gaussian random field by using a direct Karhunen–Loève decomposition, whose random coefficients carry the uncertainty into the governing equations. Numerical calculations demonstrating the effectiveness of each of the methods are presented. A comparison of the computational cost incurred by each of the methods for three different tolerances is provided. The accuracy of the approaches is quantified via the mean square error. PMID:28878974

  5. Application of information-theoretic measures to quantitative analysis of immunofluorescent microscope imaging.

    PubMed

    Shutin, Dmitriy; Zlobinskaya, Olga

    2010-02-01

    The goal of this contribution is to apply model-based information-theoretic measures to the quantification of relative differences between immunofluorescent signals. Several models for approximating the empirical fluorescence intensity distributions are considered, namely Gaussian, Gamma, Beta, and kernel densities. As distance measures the Hellinger distance and the Kullback-Leibler divergence are considered. For the Gaussian, Gamma, and Beta models, closed-form expressions for evaluating the distance as a function of the model parameters are obtained. The advantages of the proposed quantification framework over simple mean-based approaches are analyzed with numerical simulations. Two biological experiments are also considered. The first is the functional analysis of the p8 subunit of the TFIIH complex, responsible for a rare hereditary multi-system disorder, trichothiodystrophy group A (TTD-A). In the second experiment the proposed methods are applied to assess the UV-induced DNA lesion repair rate. A good agreement between our in vivo results and those obtained with an alternative in vitro measurement is established. We believe that the computational simplicity and the effectiveness of the proposed quantification procedure will make it very attractive for different analysis tasks in functional proteomics, as well as in high-content screening. Copyright 2009 Elsevier Ireland Ltd. All rights reserved.
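
    For the Gaussian model, the closed-form distance referred to above is short enough to state in code. The sketch below evaluates the Hellinger distance between two normal intensity models; this is a textbook identity of the kind the paper derives, not the paper's implementation, and the parameter values are illustrative.

      # Closed-form Hellinger distance between two Gaussian intensity models.
      import numpy as np

      def hellinger_gaussian(mu1, s1, mu2, s2):
          """Hellinger distance between N(mu1, s1^2) and N(mu2, s2^2)."""
          bc = (np.sqrt(2.0 * s1 * s2 / (s1**2 + s2**2))          # Bhattacharyya
                * np.exp(-((mu1 - mu2) ** 2) / (4.0 * (s1**2 + s2**2))))
          return np.sqrt(1.0 - bc)

      # Two hypothetical fluorescence intensity distributions:
      print(hellinger_gaussian(100.0, 15.0, 130.0, 20.0))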

  6. A new insert sample approach to paper spray mass spectrometry: a paper substrate with paraffin barriers.

    PubMed

    Colletes, T C; Garcia, P T; Campanha, R B; Abdelnur, P V; Romão, W; Coltro, W K T; Vaz, B G

    2016-03-07

    The analytical performance of paper spray (PS) using a new insert sample approach based on paper with paraffin barriers (PS-PB) is presented. The paraffin barrier is made using a simple, fast and cheap method based on stamping paraffin onto a paper surface. Typical operating conditions of paper spray, such as the solvent volume applied on the paper surface and the paper substrate type, are evaluated. A paper substrate with paraffin barriers shows better performance in the analysis of a range of typical analytes when compared to conventional PS-MS using normal paper (PS-NP) and PS-MS using paper with two rounded corners (PS-RC). PS-PB was applied to detect sugars and their inhibitors in sugarcane bagasse liquors from a second-generation ethanol process. Moreover, PS-PB performed well for the quantification of glucose in hydrolysis liquors, with good linearity (R² = 0.99) and limits of detection (2.77 mmol L⁻¹) and quantification (9.27 mmol L⁻¹), outperforming PS-NP and PS-RC. PS-PB also compared well with an HPLC-UV method for glucose quantification in hydrolysis liquor samples.

  7. Quantification of Short-Chain Chlorinated Paraffins by Deuterodechlorination Combined with Gas Chromatography-Mass Spectrometry.

    PubMed

    Gao, Yuan; Zhang, Haijun; Zou, Lili; Wu, Ping; Yu, Zhengkun; Lu, Xianbo; Chen, Jiping

    2016-04-05

    Analysis of short-chain chlorinated paraffins (SCCPs) is extremely difficult because of their complex composition, with thousands of isomers and homologues. A novel analytical method, deuterodechlorination combined with high-resolution gas chromatography-high-resolution mass spectrometry (HRGC-HRMS), was developed. SCCPs are deuterodechlorinated with LiAlD4, and the resulting deuterated n-alkanes of different chain lengths can be distinguished readily from each other on the basis of their retention time and fragment mass ([M]⁺) by HRGC-HRMS. Internal standard quantification of individual SCCP congeners was achieved, in which branched C10-CPs and branched C12-CPs were used as the extraction and reaction internal standards, respectively. Concentrations determined by this method deviated from the target SCCP concentrations by a factor of at most 1.26, and the relative standard deviations for quantification of total SCCPs were within 10%. After method validation, the method was applied to determine the congener compositions of SCCPs in commercial chlorinated paraffins and in environmental and biota samples. Low-chlorinated SCCP congeners (Cl1-4) were found to account for 32.4%-62.4% of the total SCCPs. The present method provides an attractive perspective for further studies on the toxicological and environmental characteristics of SCCPs.

  8. Analysis of actuator delay and its effect on uncertainty quantification for real-time hybrid simulation

    NASA Astrophysics Data System (ADS)

    Chen, Cheng; Xu, Weijie; Guo, Tong; Chen, Kai

    2017-10-01

    Uncertainties in structural properties can result in different responses in hybrid simulations. Quantification of the effect of these uncertainties would enable researchers to estimate the variance of structural responses observed in experiments. This poses challenges for real-time hybrid simulation (RTHS) due to the existence of actuator delay. Polynomial chaos expansion (PCE) projects the model outputs onto a basis of orthogonal stochastic polynomials to account for the influence of model uncertainties. In this paper, PCE is utilized to evaluate the effect of actuator delay on the maximum displacement from real-time hybrid simulation of a single-degree-of-freedom (SDOF) structure when accounting for uncertainties in structural properties. PCE is first applied to RTHS without delay to determine the order of the expansion, the number of sample points, and the method for coefficient calculation. PCE is then applied to RTHS with actuator delay. The mean, variance and Sobol indices are compared and discussed to evaluate the effects of actuator delay on uncertainty quantification for RTHS. Results show that the mean and the variance of the maximum displacement increase linearly and exponentially, respectively, with actuator delay. Sensitivity analysis through Sobol indices also indicates that the influence of any single random variable decreases while the coupling effect increases with increasing actuator delay.
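
    A minimal sketch of the non-intrusive PCE machinery may help here: fit probabilists' Hermite polynomials in a standard-normal input by least squares, then read the mean and variance off the coefficients. The response function, sample count, and expansion order below are assumptions for illustration, not the paper's RTHS model.

      # Minimal non-intrusive polynomial chaos sketch (not the paper's model).
      import math
      import numpy as np
      from numpy.polynomial.hermite_e import hermevander

      def response(z):
          # Hypothetical stand-in for the RTHS maximum displacement.
          return np.exp(0.3 * z) + 0.1 * z**2

      rng = np.random.default_rng(7)
      z = rng.standard_normal(500)          # samples of the random input
      order = 4
      Psi = hermevander(z, order)           # He_0(z) ... He_4(z) design matrix
      coef, *_ = np.linalg.lstsq(Psi, response(z), rcond=None)

      # Under N(0,1), E[He_j He_k] = k! on the diagonal, so:
      norms = np.array([math.factorial(k) for k in range(order + 1)], float)
      mean = coef[0]                           # PCE mean
      var = np.sum(coef[1:] ** 2 * norms[1:])  # PCE variance
      print(mean, var)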

  9. Mass spectrometry as a quantitative tool in plant metabolomics

    PubMed Central

    Jorge, Tiago F.; Mata, Ana T.

    2016-01-01

    Metabolomics is a research field used to acquire comprehensive information on the composition of a metabolite pool to provide a functional screen of the cellular state. Studies of the plant metabolome include the analysis of a wide range of chemical species with very diverse physico-chemical properties, and therefore powerful analytical tools are required for the separation, characterization and quantification of this vast compound diversity present in plant matrices. In this review, challenges in the use of mass spectrometry (MS) as a quantitative tool in plant metabolomics experiments are discussed, and important criteria for the development and validation of MS-based analytical methods provided. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644967

  10. Detection and monitoring of volatile and semivolatile pollutants in soil through different sensing strategies

    NASA Astrophysics Data System (ADS)

    De Cesare, Fabrizio; Macagnano, Antonella

    2013-04-01

    Pollutants increasingly threaten the health of habitats and their inhabitants. Decision makers need a proper evaluation of the impact of contaminants from different potential sources on soil quality and health, on the organisms living in soil, and of the risk that pollutants and their toxic effects transfer through the trophic chain to organisms in other environmental compartments, up to humans, so that they can act promptly to prevent environmental and health damage and monitor the exposure of individuals to toxicants. Reliable detection of pollutants in the environment and monitoring of their dynamics and fate are therefore of utmost importance.

    Chemical and physical techniques for detecting soil pollutants have been established for decades, but they can both over- and underestimate the bioavailable (and thus toxic) fraction of contaminants, and hence the real risk this fraction poses to organisms. Bioindicators (living organisms, or enzyme activities derived from them) can supply more reliable information on the bioavailable fraction of soil pollutants. In recent decades, a physicochemical technique, solid-phase microextraction (SPME) followed by GC-MS analysis, has been shown to give results similar to those obtained with pedofaunal populations used as bioindicators for quantifying bioavailable pollutants in soil.

    More recently, we applied an electronic nose (EN), an array of unspecific sensors that provides more qualitative than quantitative information on complex air samples, to soils contaminated with semivolatile organic compounds (SVOCs) such as polycyclic aromatic hydrocarbons (PAHs). The purpose-built EN device, equipped with suitable sensors, supplied information on the whole soil environment as well as on the presence and dynamics of contaminants, such as their biodegradation by soil microorganisms and the accompanying increase in CO2 release. These results were also related to SPME-GC/MS analyses, since a list of substances responsible for the EN's classification of contaminated versus uncontaminated soil samples could be identified.

    We also have evidence that more complex sensing devices can be used for in situ monitoring of contaminated soils. We designed and fabricated a multi-parametric hybrid sensing system assembled from several different sensors and sensing systems (single sensors and a sensor array), some commercially available and others designed, built, and tested for specificity in our laboratory. Its main target is to measure various soil parameters and volatile organic pollutants (VOCs) in soil, such as BTEX (benzene, toluene, ethylbenzene, and xylene), relating the quantification and behaviour of contaminants in soil (e.g. solubility, volatility, phase partitioning, adsorption and desorption) to the prevailing environmental conditions, by measuring the physical (temperature and moisture) and chemical (pH) parameters that can affect these processes. A dedicated sampling procedure (passive vs. active) was set up so that the VOC quantifications reflect the bioavailable fraction of pollutants. The system transmits the recorded values wirelessly to a data-collecting centre and was designed as a probe to be inserted into soil for in situ, semi-continuous monitoring of contaminated sites, reporting three levels of contamination and alarm in a form usable by unskilled staff; it may thus also serve decision makers in environmental risk assessment.

    Finally, an EN device has recently been applied to detect microbial activity and biomass in soil. The described sensing strategies might therefore be used both to monitor the presence and dynamics of pollutants during and after remediation, validating the effectiveness of the specific techniques applied at contaminated sites, and to evaluate the recovery of soil metabolic activity and active microbial biomass.

  11. Development of magnetic resonance technology for noninvasive boron quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bradshaw, K.M.

    1990-11-01

    Boron magnetic resonance imaging (MRI) and spectroscopy (MRS) were developed in support of the noninvasive boron quantification task of the Idaho National Engineering Laboratory (INEL) Power Burst Facility/Boron Neutron Capture Therapy (PBF/BNCT) program. The hardware and software described in this report are modifications specific to a GE Signa™ MRI system, release 3.X, and are necessary for boron magnetic resonance operation. The technology developed in this task has been applied to obtaining animal pharmacokinetic data for boron compounds (drug time response) and to the noninvasive in-vivo localization of boron in animal tissue. 9 refs., 21 figs.

  12. Quantification of choroidal neovascularization vessel length using optical coherence tomography angiography

    NASA Astrophysics Data System (ADS)

    Gao, Simon S.; Liu, Li; Bailey, Steven T.; Flaxel, Christina J.; Huang, David; Li, Dengwang; Jia, Yali

    2016-07-01

    Quantification of choroidal neovascularization (CNV) as visualized by optical coherence tomography angiography (OCTA) may be clinically important for diagnosing and tracking disease. Here, we present an automated algorithm that quantifies the vessel skeleton of CNV as vessel length. Initial segmentation of the CNV on en face angiograms was achieved using saliency-based detection and thresholding. A level set method was then used to refine vessel edges. Finally, a skeleton algorithm was applied to identify vessel centerlines. The algorithm was tested on nine OCTA scans from participants with CNV, and comparison of the algorithm's output with manual delineation showed good agreement.
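
    The final skeletonization step can be sketched with scikit-image, as below; the paper's pipeline precedes this with saliency-based detection and a level-set edge refinement, so the simple Otsu threshold and the random input image here are placeholders.

      # Sketch of skeleton-based vessel-length measurement (placeholders for
      # the paper's saliency detection and level-set refinement stages).
      import numpy as np
      from skimage.filters import threshold_otsu
      from skimage.morphology import skeletonize

      def cnv_vessel_length_px(en_face_angiogram):
          """Threshold an en face angiogram, skeletonize, count centerline px."""
          mask = en_face_angiogram > threshold_otsu(en_face_angiogram)
          return int(skeletonize(mask).sum())

      img = np.random.default_rng(3).random((256, 256))  # hypothetical input
      print(cnv_vessel_length_px(img))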

  13. Use of recurrence plot and recurrence quantification analysis in Taiwan unemployment rate time series

    NASA Astrophysics Data System (ADS)

    Chen, Wei-Shing

    2011-04-01

    The aim of this article is to answer the question of whether Taiwan's unemployment-rate dynamics are generated by a non-linear deterministic process. The paper applies recurrence plots and recurrence quantification analysis to the non-stationary hidden transition patterns of the unemployment rate of Taiwan. The case study uses time series data on Taiwan's unemployment rate from 1978/01 to 2010/06. The results show that recurrence techniques are able to identify various phases in the evolution of unemployment transitions in Taiwan.
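
    The core object of such an analysis is easy to sketch: a recurrence matrix and its simplest summary, the recurrence rate. The series below is synthetic, and the sketch omits the time-delay embedding that a full recurrence quantification analysis of the unemployment series would normally include.

      # Sketch of a recurrence plot and the recurrence-rate measure for a
      # univariate series (synthetic data; no time-delay embedding).
      import numpy as np

      def recurrence_matrix(x, eps):
          """R[i, j] = 1 where |x_i - x_j| <= eps."""
          return (np.abs(x[:, None] - x[None, :]) <= eps).astype(int)

      rng = np.random.default_rng(5)
      x = np.sin(np.linspace(0, 20, 300)) + 0.1 * rng.standard_normal(300)
      R = recurrence_matrix(x, eps=0.1)
      print(R.mean())  # recurrence rate: fraction of recurrent point pairs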

  14. Recognition and Quantification of Area Damaged by Oligonychus Perseae in Avocado Leaves

    NASA Astrophysics Data System (ADS)

    Díaz, Gloria; Romero, Eduardo; Boyero, Juan R.; Malpica, Norberto

    Measuring leaf damage is a basic tool in plant epidemiology research, but measuring the damaged area of a great number of leaves by hand is subjective and time-consuming. We investigate machine learning approaches for the objective segmentation and quantification of leaf area damaged by mites in avocado leaves. After extraction of the leaf veins, pixels are labeled with a look-up table generated using a Support Vector Machine with a polynomial kernel of degree 3 on the chrominance components of the YCrCb color space. Spatial information is included in the segmentation process by rating the degree of membership to a certain class and the homogeneity of the classified region. Results are presented on real images with different degrees of damage.
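
    The pixel-classification step can be sketched as follows with scikit-learn: an SVM with a degree-3 polynomial kernel trained on (Cr, Cb) chrominance pairs. The training pixels below are synthetic stand-ins for labeled damaged/healthy tissue, and the sketch omits the vein removal and spatial regularization the chapter describes.

      # Sketch of SVM pixel classification on chrominance components.
      import numpy as np
      from sklearn.svm import SVC

      rng = np.random.default_rng(11)
      X_train = np.vstack([rng.normal([140, 110], 5.0, (200, 2)),   # damaged
                           rng.normal([115, 130], 5.0, (200, 2))])  # healthy
      y_train = np.array([1] * 200 + [0] * 200)

      clf = SVC(kernel="poly", degree=3).fit(X_train, y_train)

      # Classifying every possible (Cr, Cb) pair once yields the look-up
      # table the chapter describes, making per-pixel labeling constant-time.
      pixels = rng.integers(0, 256, (1000, 2)).astype(float)
      damaged_fraction = clf.predict(pixels).mean()
      print(damaged_fraction)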

  15. PET/MRI for neurologic applications.

    PubMed

    Catana, Ciprian; Drzezga, Alexander; Heiss, Wolf-Dieter; Rosen, Bruce R

    2012-12-01

    PET and MRI provide complementary information in the study of the human brain. Simultaneous PET/MRI data acquisition allows the spatial and temporal correlation of the measured signals, creating opportunities impossible to realize using stand-alone instruments. This paper reviews the methodologic improvements and potential neurologic and psychiatric applications of this novel technology. We first present methods for improving the performance and information content of each modality by using the information provided by the other technique. On the PET side, we discuss methods that use the simultaneously acquired MRI data to improve the PET data quantification. On the MRI side, we present how improved PET quantification can be used to validate several MRI techniques. Finally, we describe promising research, translational, and clinical applications that can benefit from these advanced tools.

  16. Data for automated, high-throughput microscopy analysis of intracellular bacterial colonies using spot detection.

    PubMed

    Ernstsen, Christina L; Login, Frédéric H; Jensen, Helene H; Nørregaard, Rikke; Møller-Jensen, Jakob; Nejsum, Lene N

    2017-10-01

    Quantification of intracellular bacterial colonies is useful in strategies directed against bacterial attachment, subsequent cellular invasion, and intracellular proliferation. An automated, high-throughput microscopy method was established to quantify the number and size of intracellular bacterial colonies in infected host cells (Detection and quantification of intracellular bacterial colonies by automated, high-throughput microscopy, Ernstsen et al., 2017 [1]). The infected cells were imaged with a 10× objective, and the number of intracellular bacterial colonies, their size distribution, and the number of cell nuclei were automatically quantified using a spot-detection tool. The spot-detection output was exported to Excel, where data analysis was performed. In this article, micrographs and spot detection data are made available to facilitate implementation of the method.

  17. Application of Quality by Design Approach to Bioanalysis: Development of a Method for Elvitegravir Quantification in Human Plasma.

    PubMed

    Baldelli, Sara; Marrubini, Giorgio; Cattaneo, Dario; Clementi, Emilio; Cerea, Matteo

    2017-10-01

    The application of Quality by Design (QbD) principles in clinical laboratories can help to develop an analytical method through a systematic approach, providing a significant advance over the traditional heuristic and empirical methodology. In this work, we applied for the first time the QbD concept in the development of a method for drug quantification in human plasma using elvitegravir as the test molecule. The goal of the study was to develop a fast and inexpensive quantification method, with precision and accuracy as requested by the European Medicines Agency guidelines on bioanalytical method validation. The method was divided into operative units, and for each unit critical variables affecting the results were identified. A risk analysis was performed to select critical process parameters that should be introduced in the design of experiments (DoEs). Different DoEs were used depending on the phase of advancement of the study. Protein precipitation and high-performance liquid chromatography-tandem mass spectrometry were selected as the techniques to be investigated. For every operative unit (sample preparation, chromatographic conditions, and detector settings), a model based on factors affecting the responses was developed and optimized. The obtained method was validated and clinically applied with success. To the best of our knowledge, this is the first investigation thoroughly addressing the application of QbD to the analysis of a drug in a biological matrix applied in a clinical laboratory. The extensive optimization process generated a robust method compliant with its intended use. The performance of the method is continuously monitored using control charts.
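
    As a toy illustration of the DoE step, the sketch below enumerates a small full-factorial design for one operative unit. The factor names and levels are invented for illustration and are not the paper's actual critical process parameters, which it selected via risk analysis.

      # Toy full-factorial design of experiments for one operative unit.
      from itertools import product

      factors = {
          "precipitant_ratio": [2, 3, 4],   # volumes of precipitant per plasma
          "column_temp_C": [30, 40],
          "flow_mL_min": [0.3, 0.5],
      }
      design = [dict(zip(factors, levels))
                for levels in product(*factors.values())]
      for run in design:                    # 3 x 2 x 2 = 12 runs to execute
          print(run)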

  18. A simple, rapid and sensitive RP-HPLC-UV method for the simultaneous determination of sorafenib & paclitaxel in plasma and pharmaceutical dosage forms: Application to pharmacokinetic study.

    PubMed

    Khan, Ismail; Iqbal, Zafar; Khan, Abad; Hassan, Muhammad; Nasir, Fazle; Raza, Abida; Ahmad, Lateef; Khan, Amjad; Akhlaq Mughal, Muhammad

    2016-10-15

    A simple, economical, fast, and sensitive RP-HPLC-UV method has been developed for the simultaneous quantification of sorafenib and paclitaxel in biological samples and formulations, using piroxicam as an internal standard. The experimental conditions were optimized and the method was validated according to the standard guidelines. Separation of both analytes and the internal standard was achieved on a Discovery HS C18 column (250 mm × 4.6 mm, 5 μm) using acetonitrile and TFA (0.025%) in a ratio of 65:35 (V/V) as the mobile phase in isocratic mode, at a flow rate of 1 ml/min, a detection wavelength of 245 nm, and a column oven temperature of 25°C, within a short run time of 12 min. The lower limits of detection (LLOD) were 5 and 10 ng/ml and the lower limits of quantification (LLOQ) were 10 and 15 ng/ml for sorafenib and paclitaxel, respectively. Sorafenib, paclitaxel, and piroxicam (IS) were extracted from biological samples using acetonitrile as the precipitating and extraction solvent. The method is linear in the ranges of 15-20,000 ng/ml for paclitaxel and 10-5000 ng/ml for sorafenib. The method is sensitive and reliable, as judged by its intra-day and inter-day coefficients of variation. It was successfully applied to the quantification of the above-mentioned drugs in plasma and will be applied to sorafenib and paclitaxel pharmacokinetic studies in animal models. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Effects of Frequency Drift on the Quantification of Gamma-Aminobutyric Acid Using MEGA-PRESS

    PubMed Central

    Tsai, Shang-Yueh; Fang, Chun-Hao; Wu, Thai-Yu; Lin, Yi-Ru

    2016-01-01

    The MEGA-PRESS method is the most common method used to measure γ-aminobutyric acid (GABA) in the brain at 3T. It has been shown that the underestimation of the GABA signal due to B0 drift of up to 1.22 Hz/min can be reduced by post-frequency alignment. In this study, we show that underestimation of GABA can still occur even with post-frequency alignment when the B0 drift is up to 3.93 Hz/min, and that it can be reduced by applying a frequency shift threshold. A total of 23 subjects were scanned twice to assess short-term reproducibility, and 14 of them were scanned again after 2–8 weeks to evaluate long-term reproducibility. A linear regression analysis of the quantified GABA versus the frequency shift showed a negative correlation (P < 0.01), confirming underestimation of the GABA signal. When a frequency shift threshold of 0.125 ppm (15.5 Hz, or 1.79 Hz/min) was applied, the linear regression showed no statistically significant correlation (P > 0.05). Therefore, a frequency shift threshold of 0.125 ppm (15.5 Hz) can be used to reduce underestimation during GABA quantification. For data with a B0 drift up to 3.93 Hz/min, the coefficients of variation of short-term and long-term reproducibility for the GABA quantification were less than 10% when the frequency threshold was applied. PMID:27079873
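
    The screening rule itself is simple enough to sketch: flag scans whose total frequency shift exceeds the 0.125 ppm threshold. The per-transient offsets and the 3 T proton frequency used for the Hz-to-ppm conversion below are assumptions for illustration.

      # Sketch of the frequency-shift screening described above.
      import numpy as np

      PROTON_MHZ_AT_3T = 127.7   # 1H at 3 T, so 1 ppm corresponds to ~127.7 Hz
      THRESHOLD_PPM = 0.125      # ~15.5 Hz, per the study

      def exceeds_drift_threshold(freq_offsets_hz):
          """freq_offsets_hz: per-transient frequency offsets over the scan."""
          return np.ptp(freq_offsets_hz) / PROTON_MHZ_AT_3T > THRESHOLD_PPM

      offsets = np.linspace(0.0, 18.0, 320)    # hypothetical 18 Hz total drift
      print(exceeds_drift_threshold(offsets))  # True -> exclude or re-acquire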

  20. Lowering the quantification limit of the Qubit™ RNA HS assay using RNA spike-in.

    PubMed

    Li, Xin; Ben-Dov, Iddo Z; Mauro, Maurizio; Williams, Zev

    2015-05-06

    RNA quantification is often a prerequisite for RNA analyses such as RNA sequencing. However, the relatively low sensitivity and large sample consumption of traditional RNA quantification methods such as UV spectrophotometry, and even of the much more sensitive fluorescence-based assays such as the Qubit™ RNA HS Assay, are often inadequate for measuring the minute levels of RNA isolated from limited cell and tissue samples and biofluids. Thus, there is a pressing need for a more sensitive method to reliably and robustly detect trace levels of RNA without interference from DNA. To improve the quantification limit of the Qubit™ RNA HS Assay, we spiked in a known quantity of RNA to achieve the minimum reading required by the assay. Samples containing trace amounts of RNA were then added to the spike-in and measured as the reading increase over the RNA spike-in baseline. We determined the accuracy and precision of reading increases between 1 and 20 pg/μL, as well as the RNA specificity in this range, and compared them to those of RiboGreen®, another sensitive fluorescence-based RNA quantification assay. We then applied the Qubit™ Assay with RNA spike-in to quantify plasma RNA samples. The RNA spike-in improved the quantification limit of the Qubit™ RNA HS Assay 5-fold, from 25 pg/μL down to 5 pg/μL, while maintaining high specificity to RNA. This enabled quantification of RNA with original concentrations as low as 55.6 pg/μL, compared to 250 pg/μL for the standard assay, and decreased sample consumption from 5 to 1 ng. Plasma RNA samples that were not measurable by the Qubit™ RNA HS Assay were measurable by our modified method. The Qubit™ RNA HS Assay with RNA spike-in is able to quantify RNA with high specificity at 5-fold lower concentration and uses 5-fold less sample than the standard Qubit™ Assay.
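
    The arithmetic behind the protocol reduces to a baseline subtraction; the sketch below is one plausible reading of it (an assumption, not code from the paper), with illustrative numbers.

      # Sketch of the spike-in arithmetic: the trace sample is read as the
      # increase over the spike-in-only baseline (an assumed simplification).
      def sample_conc_pg_per_ul(reading_sample_plus_spike, reading_spike_only):
          """Reading increase attributable to the sample, in pg/uL."""
          return reading_sample_plus_spike - reading_spike_only

      # e.g. a baseline of 25.0 pg/uL rising to 28.2 pg/uL with the sample:
      print(sample_conc_pg_per_ul(28.2, 25.0))  # -> 3.2 pg/uL of sample RNA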

  1. Real-time quantitative PCR for retrovirus-like particle quantification in CHO cell culture.

    PubMed

    de Wit, C; Fautz, C; Xu, Y

    2000-09-01

    Chinese hamster ovary (CHO) cells have been widely used to manufacture recombinant proteins intended for human therapeutic use. Retrovirus-like particles, which are apparently defective and non-infectious, have been detected in all CHO cells by electron microscopy (EM). To assure the viral safety of CHO cell-derived biologicals, quantification of retrovirus-like particles in the production cell culture and demonstration of sufficient elimination of such particles by the downstream purification process are required for product market registration worldwide. EM, with a detection limit of 1×10⁶ particles/ml, is the standard retrovirus-like particle quantification method. The whole process, which requires a large amount of sample (3-6 litres), is labour-intensive, time-consuming, expensive, and subject to significant assay variability. In this paper, a novel real-time quantitative PCR assay (TaqMan assay) has been developed for the quantification of retrovirus-like particles. Each retrovirus particle contains two copies of the viral genomic particle RNA (pRNA) molecule; quantification of retrovirus particles can therefore be achieved by quantifying the pRNA copy number, i.e. every two copies of retroviral pRNA are equivalent to one retrovirus-like particle. The TaqMan assay takes advantage of the 5'→3' exonuclease activity of Taq DNA polymerase and utilizes the PRISM 7700 Sequence Detection System of PE Applied Biosystems (Foster City, CA, U.S.A.) for automated pRNA quantification through a dual-labelled fluorogenic probe. The TaqMan quantification technique is highly comparable to EM analysis and offers significant advantages over it, such as a higher sensitivity of less than 600 particles/ml, greater accuracy and reliability, higher sample throughput, more flexibility, and lower cost. The TaqMan assay should therefore be used as a substitute for EM analysis for retrovirus-like particle quantification in CHO cell-based production systems. Copyright 2000 The International Association for Biologicals.
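
    The copy-to-particle conversion is a one-liner, sketched below; the two-copies-per-particle factor is stated in the abstract, while the volume handling is an illustrative assumption.

      # Sketch of the particle arithmetic underlying the TaqMan readout: each
      # retrovirus-like particle carries two pRNA copies.
      def particles_per_ml(prna_copies, ml_of_culture_assayed):
          """Convert a measured pRNA copy number into particles per mL."""
          return (prna_copies / 2.0) / ml_of_culture_assayed

      # e.g. 1.2e4 copies measured from the equivalent of 0.001 mL of culture:
      print(f"{particles_per_ml(1.2e4, 0.001):.2e} particles/mL")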

  2. Bayesian forecasting and uncertainty quantifying of stream flows using Metropolis–Hastings Markov Chain Monte Carlo algorithm

    DOE PAGES

    Wang, Hongrui; Wang, Cheng; Wang, Ying; ...

    2017-04-05

    This paper presents a Bayesian approach using the Metropolis-Hastings Markov Chain Monte Carlo algorithm and applies it to daily river flow rate forecasting and uncertainty quantification for the Zhujiachuan River, using data collected from Qiaotoubao Gage Station and 13 other gage stations in the Zhujiachuan watershed in China. The proposed method is also compared with conventional maximum likelihood estimation (MLE) for parameter estimation and quantification of the associated uncertainties. While the Bayesian method performs similarly in estimating the mean daily flow rate, it outperforms the conventional MLE method in uncertainty quantification, providing a narrower credible interval than the MLE confidence interval, and thus a more precise estimate, by using the related information from regional gage stations. The Bayesian MCMC method may therefore be preferable for uncertainty analysis and risk management.
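
    A minimal random-walk Metropolis-Hastings sampler is sketched below to make the algorithm concrete; it targets the posterior of a single mean flow parameter under a Gaussian likelihood with a flat prior, using synthetic observations, and is not the paper's hydrological model.

      # Minimal random-walk Metropolis-Hastings sketch (synthetic data).
      import numpy as np

      rng = np.random.default_rng(0)
      data = rng.normal(5.0, 2.0, 100)   # stand-in daily flow observations
      sigma = 2.0                        # assumed known observation noise

      def log_post(mu):
          return -0.5 * np.sum((data - mu) ** 2) / sigma**2  # flat prior

      mu, chain = 0.0, []
      lp = log_post(mu)
      for _ in range(20_000):
          prop = mu + 0.5 * rng.standard_normal()       # random-walk proposal
          lp_prop = log_post(prop)
          if np.log(rng.random()) < lp_prop - lp:       # accept/reject step
              mu, lp = prop, lp_prop
          chain.append(mu)

      burned = np.array(chain[5_000:])                  # discard burn-in
      print(burned.mean(), np.percentile(burned, [2.5, 97.5]))  # 95% credible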

  3. Evaluation of the potential use of hybrid LC-MS/MS for active drug quantification applying the 'free analyte QC concept'.

    PubMed

    Jordan, Gregor; Onami, Ichio; Heinrich, Julia; Staack, Roland F

    2017-11-01

    Assessment of active drug exposure of biologics may be crucial for drug development. Typically, ligand-binding assay methods are used to provide free/active drug concentrations. To what extent hybrid LC-MS/MS procedures enable correct 'active' drug quantification is currently under consideration. Experimental & results: The relevance of appropriate extraction conditions was evaluated by a hybrid target-capture immuno-affinity LC-MS/MS method using total and free/active quality controls (QCs). A rapid extraction (10 min) provided correct results, whereas overnight incubation resulted in significant overestimation of the free/active drug (monoclonal antibody) concentration. Conventional total QCs were inappropriate for determining optimal method conditions, in contrast to free/active QCs. The 'free/active analyte QC concept' enables development of appropriate extraction conditions for correct active drug quantification by hybrid LC-MS/MS.

  4. Microwave-assisted extraction of green coffee oil and quantification of diterpenes by HPLC.

    PubMed

    Tsukui, A; Santos Júnior, H M; Oigman, S S; de Souza, R O M A; Bizzo, H R; Rezende, C M

    2014-12-01

    Microwave-assisted extraction (MAE) of 13 different green coffee beans (Coffea arabica L.) was compared to Soxhlet extraction for oil recovery. A full factorial design applied to the MAE time and temperature parameters allowed the development of a fast and mild method (10 min at 45°C), compared to a 4 h Soxhlet extraction. Quantification of the cafestol and kahweol diterpenes present in the coffee oil was performed by HPLC/UV and showed satisfactory linearity (R² = 0.9979), precision (CV 3.7%), recovery (<93%), limit of detection (0.0130 mg/mL), and limit of quantification (0.0406 mg/mL). The space-time yield calculated from the diterpene content of sample AT1 (Arabica green coffee) was six times higher than that of the traditional Soxhlet method. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. Quantification of N-acetyl- and N-glycolylneuraminic acids by a stable isotope dilution assay using high-performance liquid chromatography-tandem mass spectrometry.

    PubMed

    Allevi, Pietro; Femia, Eti Alessandra; Costa, Maria Letizia; Cazzola, Roberta; Anastasia, Mario

    2008-11-28

    The present report describes a method for the quantification of N-acetyl- and N-glycolylneuraminic acids without any derivatization, using their ¹³C₃-isotopologues as internal standards and a C18 reversed-phase column modified with decylboronic acid, which allows for the first time a complete chromatographic separation of the two analytes. The method is based on high-performance liquid chromatography coupled with electrospray ion-trap mass spectrometry. The limit of quantification of the method is 0.1 mg/L (2.0 ng on column) for both analytes. The calibration curves are linear for both sialic acids over the range 0.1-80 mg/L (2.0-1600 ng on column), with correlation coefficients greater than 0.997. The proposed method was applied to the quantitative determination of sialic acids released from fetuin as a model glycoprotein.

  6. A simple and efficient method for poly-3-hydroxybutyrate quantification in diazotrophic bacteria within 5 minutes using flow cytometry

    PubMed Central

    Alves, L.P.S.; Almeida, A.T.; Cruz, L.M.; Pedrosa, F.O.; de Souza, E.M.; Chubatsu, L.S.; Müller-Santos, M.; Valdameri, G.

    2017-01-01

    The conventional method for quantification of polyhydroxyalkanoates, based on whole-cell methanolysis and gas chromatography (GC), is laborious and time-consuming. In this work, a method based on flow cytometry of Nile red-stained bacterial cells was established to quantify poly-3-hydroxybutyrate (PHB) production by the diazotrophic, plant-associated bacteria Herbaspirillum seropedicae and Azospirillum brasilense. The method consists of three steps: i) cell permeabilization, ii) Nile red staining, and iii) analysis by flow cytometry. The method was optimized step by step and can be carried out in less than 5 min. The final results showed a high correlation coefficient (R² = 0.99) with a standard method based on methanolysis and GC. This method was successfully applied to the quantification of PHB in epiphytic bacteria isolated from rice roots. PMID:28099582

  7. Ferromagnetic resonance for the quantification of superparamagnetic iron oxide nanoparticles in biological materials

    PubMed Central

    Gamarra, Lionel F; daCosta-Filho, Antonio J; Mamani, Javier B; de Cassia Ruiz, Rita; Pavon, Lorena F; Sibov, Tatiana T; Vieira, Ernanni D; Silva, André C; Pontuschka, Walter M; Amaro, Edson

    2010-01-01

    This work presents a methodology based on the ferromagnetic resonance (FMR) technique for quantifying the amount of superparamagnetic iron oxide nanoparticles (SPIONs) administered to biological materials, in both in vivo and in vitro studies. The in vivo study consisted of the analysis of the elimination and biodistribution kinetics of SPIONs after intravenous administration in Wistar rats; the results were corroborated by X-ray fluorescence. For the in vitro study, a quantitative analysis of the concentration of SPIONs bound to specific AC133 monoclonal antibodies was carried out in order to detect the expression of the antigenic epitopes (CD133) on stem cells from human umbilical cord blood. In both studies FMR proved to be an efficient technique for SPION quantification per volume unit (in vivo) or per labeled cell (in vitro). PMID:20463936

  8. LFQuant: a label-free fast quantitative analysis tool for high-resolution LC-MS/MS proteomics data.

    PubMed

    Zhang, Wei; Zhang, Jiyang; Xu, Changming; Li, Ning; Liu, Hui; Ma, Jie; Zhu, Yunping; Xie, Hongwei

    2012-12-01

    Database-search-based methods for label-free quantification aim to reconstruct the peptide extracted-ion chromatogram from the identification information, which limits the search space and thus makes data processing much faster. The randomness of MS/MS sampling can be remedied by cross-assignment among different runs. Here, we present a new label-free fast quantitative analysis tool, LFQuant, for high-resolution LC-MS/MS proteomics data based on database searching. It is designed to accept raw data in two common formats (mzXML and Thermo RAW), and database search results from mainstream tools (MASCOT, SEQUEST, and X!Tandem), as input data. LFQuant can handle large-scale label-free data with fractionation such as SDS-PAGE and 2D LC. It is easy to use and provides handy user interfaces for data loading, parameter setting, quantitative analysis, and quantitative data visualization. LFQuant was compared with two common quantification software packages, MaxQuant and IDEAL-Q, on a replication data set and the UPS1 standard data set. The results show that LFQuant performs better than both in terms of precision and accuracy, and consumes significantly less processing time. LFQuant is freely available under the GNU General Public License v3.0 at http://sourceforge.net/projects/lfquant/. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. Quantification of Operational Risk Using A Data Mining

    NASA Technical Reports Server (NTRS)

    Perera, J. Sebastian

    1999-01-01

    What is Data Mining?
    - Data Mining is the process of finding actionable information hidden in raw data.
    - Data Mining helps find hidden patterns, trends, and important relationships often buried in a sea of data.
    - Typically, automated software tools based on advanced statistical analysis and data modeling technology can be utilized to automate the data mining process.

  10. Quantification of betaglucans, lipid and protein contents in whole oat groats (Avena sativa L.) using near infrared reflectance spectroscopy

    USDA-ARS?s Scientific Manuscript database

    Whole oat has been described as an important healthy food for humans due to its beneficial nutritional components. Near infrared reflectance spectroscopy (NIRS) is a powerful, fast, accurate and non-destructive analytical tool that can be substituted for some traditional chemical analysis. A total o...

  11. Ecosystem services in changing landscapes: An introduction

    Treesearch

    Louis Iverson; Cristian Echeverria; Laura Nahuelhual; Sandra Luque

    2014-01-01

    The concept of ecosystem services from landscapes is rapidly gaining momentum as a language to communicate values and benefits to scientists and lay alike. Landscape ecology has an enormous contribution to make to this field, and one could argue, uniquely so. Tools developed or adapted for landscape ecology are being increasingly used to assist with the quantification...

  12. Understanding the transmission dynamics of Leishmania donovani to provide robust evidence for interventions to eliminate visceral leishmaniasis in Bihar, India

    USDA-ARS?s Scientific Manuscript database

    Molecular tools enable the collection of accurate estimates of human blood index (HBI) in Phlebotomus argentipes. The refinement of a metacyclic-specific qPCR assay to identify L. donovani in P. argentipes would enable quantification of the entomological inoculation rate (EIR) for the first time. Li...

  13. Ecosystem evapotranspiration: challenges in measurements, estimates, and modeling

    Treesearch

    Devendra Amatya; S. Irmak; P. Gowda; Ge Sun; J.E. Nettles; K.R. Douglas-Mankin

    2016-01-01

    Evapotranspiration (ET) processes at the leaf to landscape scales in multiple land uses have important controls and feedbacks for local, regional, and global climate and water resource systems. Innovative methods, tools, and technologies for improved understanding and quantification of ET and crop water use are critical for adapting more effective management strategies...

  14. Understanding the transmission dynamics of Leishmania donovani to provide robust evidence for interventions to eliminate visceral leishmaniasis in Bihar, India

    USDA-ARS?s Scientific Manuscript database

    Molecular tools enable the collection of accurate estimates of human blood index (HBI) in P. argentipes. The refinement of a metacyclic-specific qPCR assay to identify L. donovani in P. argentipes would enable quantification of the entomological inoculation rate (EIR) for the first time. Likewise, a...

  15. CHEMICALLY ACTIVATED LUCIFERASE GENE EXPRESSION (CALUX) CELL BIOASSAY ANALYSIS FOR THE ESTIMATION OF DIOXIN-LIKE ACTIVITY: CRITICAL PARAMETERS OF THE CALUX PROCEDURE THAT IMPACT ASSAY RESULTS

    EPA Science Inventory

    The Chemically Activated Luciferase gene expression (CALUX) in vitro cell bioassay is an emerging bioanalytical tool that is increasingly being used for the screening and relative quantification of dioxins and dioxin-like compounds. Since CALUX analyses provide a biological respo...

  16. chipPCR: an R package to pre-process raw data of amplification curves.

    PubMed

    Rödiger, Stefan; Burdukiewicz, Michał; Schierack, Peter

    2015-09-01

    Both the quantitative real-time polymerase chain reaction (qPCR) and quantitative isothermal amplification (qIA) are standard methods for nucleic acid quantification. Numerous real-time read-out technologies have been developed. Despite the continuous interest in amplification-based techniques, there are only a few tools for pre-processing of amplification data. However, a transparent tool for precise control of raw data is indispensable in several scenarios, for example, during the development of new instruments. chipPCR is an R package for the pre-processing and quality analysis of raw data of amplification curves. The package takes advantage of R's S4 object model and offers an extensible environment. chipPCR contains tools for raw data exploration: normalization, baselining, imputation of missing values, a powerful wrapper for amplification curve smoothing and a function to detect the start and end of an amplification curve. The capabilities of the software are enhanced by the implementation of algorithms unavailable in R, such as a 5-point stencil for derivative interpolation. Simulation tools, statistical tests, plots for data quality management, amplification efficiency/quantification cycle calculation, and datasets from qPCR and qIA experiments are part of the package. Core functionalities are integrated in GUIs (web-based and standalone shiny applications), thus streamlining analysis and report generation. http://cran.r-project.org/web/packages/chipPCR. Source code: https://github.com/michbur/chipPCR. stefan.roediger@b-tu.de Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
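
    Two of the pre-processing steps the package covers, baselining and smoothing, can be sketched in a few lines. The sketch below is a Python analogue (not the chipPCR API, which is an R package), and the synthetic curve is a sigmoid plus noise.

      # Python analogue of amplification-curve baselining and smoothing.
      import numpy as np
      from scipy.signal import savgol_filter

      rng = np.random.default_rng(2)
      cycles = np.arange(1, 41)
      raw = (50 + 1000 / (1 + np.exp(-(cycles - 25) / 2))   # sigmoid signal
             + rng.normal(0, 15, cycles.size))              # plus noise

      baseline = raw[:10].mean()        # baseline from the early ground phase
      corrected = raw - baseline
      smoothed = savgol_filter(corrected, window_length=7, polyorder=3)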

  17. CORSEN, a new software dedicated to microscope-based 3D distance measurements: mRNA-mitochondria distance, from single-cell to population analyses.

    PubMed

    Jourdren, Laurent; Delaveau, Thierry; Marquenet, Emelie; Jacq, Claude; Garcia, Mathilde

    2010-07-01

    Recent improvements in microscopy technology allow detection of single molecules of RNA, but tools for large-scale automatic analysis of particle distributions are lacking. An increasing number of imaging studies emphasize the importance of mRNA localization in the definition of cell territory or the biogenesis of cell compartments. CORSEN is a new tool dedicated to three-dimensional (3D) distance measurements from imaging experiments, developed especially to assess the minimal distance between RNA molecules and cellular compartment markers. CORSEN includes a 3D segmentation algorithm for extracting and characterizing the cellular objects to be processed (surface determination, aggregate decomposition) for minimal-distance calculations. CORSEN's main contribution lies in exploratory statistical analysis, cell population characterization, and high-throughput assays, which are made possible by the implementation of batch-process analysis. We highlight CORSEN's utility for the study of the relative positions of mRNA molecules and mitochondria: CORSEN clearly discriminates mRNAs localized to the vicinity of mitochondria from those that are translated on free cytoplasmic polysomes. Moreover, it quantifies the cell-to-cell variation of mRNA localization and emphasizes the necessity for statistical approaches. This method can be extended to assess the evolution of the distance between specific mRNAs and other cellular structures in different cellular contexts. CORSEN was designed for the biologist community, with the concern to provide an easy-to-use and highly flexible tool that can be applied to diverse distance quantification issues.
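
    The core measurement, the minimal 3D distance from each mRNA spot to a compartment surface, can be sketched with a k-d tree. This is a reimplementation of the idea, not CORSEN itself, and all coordinates below are hypothetical.

      # Sketch of minimal 3D distance from mRNA spots to a mitochondrial
      # surface, via a k-d tree; coordinates are hypothetical (microns).
      import numpy as np
      from scipy.spatial import cKDTree

      rng = np.random.default_rng(4)
      mito_surface = rng.random((5000, 3)) * 10.0  # segmented surface points
      mrna_spots = rng.random((200, 3)) * 10.0     # detected mRNA positions

      tree = cKDTree(mito_surface)
      min_dist, _ = tree.query(mrna_spots)  # nearest-surface distance per spot
      print(min_dist.mean(), np.median(min_dist))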

  18. DYNAMO-HIA–A Dynamic Modeling Tool for Generic Health Impact Assessments

    PubMed Central

    Lhachimi, Stefan K.; Nusselder, Wilma J.; Smit, Henriette A.; van Baal, Pieter; Baili, Paolo; Bennett, Kathleen; Fernández, Esteve; Kulik, Margarete C.; Lobstein, Tim; Pomerleau, Joceline; Mackenbach, Johan P.; Boshuizen, Hendriek C.

    2012-01-01

    Background Currently, no standard tool is publicly available that allows researchers or policy-makers to quantify the impact of policies using epidemiological evidence within the causal framework of Health Impact Assessment (HIA). A standard tool should comply with three technical criteria (real-life population, dynamic projection, explicit risk-factor states) and three usability criteria (modest data requirements, rich model output, generally accessible) to be useful in the applied setting of HIA. With DYNAMO-HIA (Dynamic Modeling for Health Impact Assessment), we introduce such a generic software tool specifically designed to facilitate quantification in the assessment of the health impacts of policies. Methods and Results DYNAMO-HIA quantifies the impact of user-specified risk-factor changes on multiple diseases and in turn on overall population health, comparing one reference scenario with one or more intervention scenarios. The Markov-based modeling approach allows for explicit risk-factor states and simulation of a real-life population. A built-in parameter estimation module ensures that only standard population-level epidemiological evidence is required, i.e. data on incidence, prevalence, relative risks, and mortality. DYNAMO-HIA provides a rich output of summary measures – e.g. life expectancy and disease-free life expectancy – and detailed data – e.g. prevalences and mortality/survival rates – by age, sex, and risk-factor status over time. DYNAMO-HIA is controlled via a graphical user interface and is publicly available from the internet, ensuring general accessibility. We illustrate the use of DYNAMO-HIA with two example applications: a policy causing an overall increase in alcohol consumption and quantifying the disease-burden of smoking. Conclusion By combining modest data needs with general accessibility and user friendliness within the causal framework of HIA, DYNAMO-HIA is a potential standard tool for health impact assessment based on epidemiologic evidence. PMID:22590491

  19. Comparison of tobacco control scenarios: quantifying estimates of long-term health impact using the DYNAMO-HIA modeling tool.

    PubMed

    Kulik, Margarete C; Nusselder, Wilma J; Boshuizen, Hendriek C; Lhachimi, Stefan K; Fernández, Esteve; Baili, Paolo; Bennett, Kathleen; Mackenbach, Johan P; Smit, H A

    2012-01-01

    There are several types of tobacco control interventions/policies which can change future smoking exposure. The most basic intervention types are 1) smoking cessation interventions 2) preventing smoking initiation and 3) implementation of a nationwide policy affecting quitters and starters simultaneously. The possibility for dynamic quantification of such different interventions is key for comparing the timing and size of their effects. We developed a software tool, DYNAMO-HIA, which allows for a quantitative comparison of the health impact of different policy scenarios. We illustrate the outcomes of the tool for the three typical types of tobacco control interventions if these were applied in the Netherlands. The tool was used to model the effects of different types of smoking interventions on future smoking prevalence and on health outcomes, comparing these three scenarios with the business-as-usual scenario. The necessary data input was obtained from the DYNAMO-HIA database which was assembled as part of this project. All smoking interventions will be effective in the long run. The population-wide strategy will be most effective in both the short and long term. The smoking cessation scenario will be second-most effective in the short run, though in the long run the smoking initiation scenario will become almost as effective. Interventions aimed at preventing the initiation of smoking need a long time horizon to become manifest in terms of health effects. The outcomes strongly depend on the groups targeted by the intervention. We calculated how much more effective the population-wide strategy is, in both the short and long term, compared to quit smoking interventions and measures aimed at preventing the initiation of smoking. By allowing a great variety of user-specified choices, the DYNAMO-HIA tool is a powerful instrument by which the consequences of different tobacco control policies and interventions can be assessed.

  20. Comparison of Tobacco Control Scenarios: Quantifying Estimates of Long-Term Health Impact Using the DYNAMO-HIA Modeling Tool

    PubMed Central

    Kulik, Margarete C.; Nusselder, Wilma J.; Boshuizen, Hendriek C.; Lhachimi, Stefan K.; Fernández, Esteve; Baili, Paolo; Bennett, Kathleen; Mackenbach, Johan P.; Smit, H. A.

    2012-01-01

    Background There are several types of tobacco control interventions/policies which can change future smoking exposure. The most basic intervention types are 1) smoking cessation interventions 2) preventing smoking initiation and 3) implementation of a nationwide policy affecting quitters and starters simultaneously. The possibility for dynamic quantification of such different interventions is key for comparing the timing and size of their effects. Methods and Results We developed a software tool, DYNAMO-HIA, which allows for a quantitative comparison of the health impact of different policy scenarios. We illustrate the outcomes of the tool for the three typical types of tobacco control interventions if these were applied in the Netherlands. The tool was used to model the effects of different types of smoking interventions on future smoking prevalence and on health outcomes, comparing these three scenarios with the business-as-usual scenario. The necessary data input was obtained from the DYNAMO-HIA database which was assembled as part of this project. All smoking interventions will be effective in the long run. The population-wide strategy will be most effective in both the short and long term. The smoking cessation scenario will be second-most effective in the short run, though in the long run the smoking initiation scenario will become almost as effective. Interventions aimed at preventing the initiation of smoking need a long time horizon to become manifest in terms of health effects. The outcomes strongly depend on the groups targeted by the intervention. Conclusion We calculated how much more effective the population-wide strategy is, in both the short and long term, compared to quit smoking interventions and measures aimed at preventing the initiation of smoking. By allowing a great variety of user-specified choices, the DYNAMO-HIA tool is a powerful instrument by which the consequences of different tobacco control policies and interventions can be assessed. PMID:22384230

  1. Statistical modeling of isoform splicing dynamics from RNA-seq time series data.

    PubMed

    Huang, Yuanhua; Sanguinetti, Guido

    2016-10-01

    Isoform quantification is an important goal of RNA-seq experiments, yet it remains problematic for genes with low expression or several isoforms. These difficulties may in principle be ameliorated by exploiting correlated experimental designs, such as time series or dosage-response experiments. Time series RNA-seq experiments, in particular, are becoming increasingly popular, yet there are no methods that explicitly leverage the experimental design to improve isoform quantification. Here, we present DICEseq, the first isoform quantification method tailored to correlated RNA-seq experiments. DICEseq explicitly models the correlations between different RNA-seq experiments to aid the quantification of isoforms across experiments. Numerical experiments on simulated datasets show that DICEseq yields more accurate results than state-of-the-art methods, an advantage that can become considerable at low coverage levels. On real datasets, our results show that DICEseq provides substantially more reproducible and robust quantifications, increasing the correlation of estimates from replicate datasets by up to 10% on genes with low or moderate expression levels (the bottom third of all genes). Furthermore, DICEseq permits quantification of the trade-off between temporal sampling of RNA and depth of sequencing, frequently an important choice when planning experiments. Our results have strong implications for the design of RNA-seq experiments and offer a novel tool for improved analysis of such datasets. Python code is freely available at http://diceseq.sf.net. Contact: G.Sanguinetti@ed.ac.uk. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

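    The central idea, modelling correlation across time points so that neighbouring experiments inform each other, can be illustrated with a small Gaussian-process smoother over noisy per-time-point isoform-proportion estimates. This is only a toy analogue of the approach, with invented times, estimates, and kernel parameters; DICEseq itself works on the read-level likelihood.

      # Toy illustration of borrowing strength across time points: smooth noisy
      # per-time-point isoform-proportion estimates with a GP (RBF kernel).
      # Not DICEseq's actual model; all numbers are invented.
      import numpy as np

      t = np.array([0., 2., 4., 8., 12., 24.])                  # sampling times (h)
      psi_hat = np.array([0.30, 0.42, 0.55, 0.58, 0.47, 0.35])  # noisy estimates
      noise = 0.08                                              # per-point estimate s.d.

      def rbf(a, b, ell=6.0, var=0.05):
          return var * np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)

      K = rbf(t, t) + noise**2 * np.eye(len(t))
      t_star = np.linspace(0, 24, 49)
      K_s = rbf(t_star, t)
      psi_smooth = K_s @ np.linalg.solve(K, psi_hat - psi_hat.mean()) + psi_hat.mean()
      print(np.round(psi_smooth[::8], 3))
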
  2. Perfusion quantification in contrast-enhanced ultrasound (CEUS)--ready for research projects and routine clinical use.

    PubMed

    Tranquart, F; Mercier, L; Frinking, P; Gaud, E; Arditi, M

    2012-07-01

    With contrast-enhanced ultrasound (CEUS) now established as a valuable imaging modality for many applications, a more specific demand has recently emerged for quantifying perfusion and using measured parameters as objective indicators for various disease states. However, CEUS perfusion quantification remains challenging and is not well integrated in daily clinical practice. The development of VueBox™ alleviates existing limitations and enables quantification in a standardized way. VueBox™ operates as an off-line software application, after dynamic contrast-enhanced ultrasound (DCE-US) is performed. It enables linearization of DICOM clips, assessment of perfusion using patented curve-fitting models, and generation of parametric images by synthesizing perfusion information at the pixel level using color coding. VueBox™ is compatible with most of the available ultrasound platforms (nonlinear contrast-enabled), has the ability to process both bolus and disruption-replenishment kinetics loops, allows analysis results and their context to be saved, and generates analysis reports automatically. Specific features have been added to VueBox™, such as fully automatic in-plane motion compensation and an easy-to-use clip editor. Processing time has been reduced as a result of parallel programming optimized for multi-core processors. A long list of perfusion parameters is available for each of the two administration modes to address all possible demands currently reported in the literature for diagnosis or treatment monitoring. In conclusion, VueBox™ is a valid and robust quantification tool to be used for standardizing perfusion quantification and to improve the reproducibility of results across centers. © Georg Thieme Verlag KG Stuttgart · New York.

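    VueBox™'s own curve-fitting models are patented and not reproduced here, but the general bolus-kinetics workflow, fitting a parametric model to a linearised time-intensity curve and reading off perfusion parameters, can be sketched with a lognormal model commonly used in the DCE-US literature. All signal values below are simulated.

      # Generic bolus-kinetics fit on a linearised time-intensity curve;
      # the lognormal model here is illustrative, not VueBox's proprietary model.
      import numpy as np
      from scipy.optimize import curve_fit

      def lognormal_bolus(t, auc, mu, sigma, t0):
          t = np.clip(t - t0, 1e-9, None)
          return auc / (t * sigma * np.sqrt(2 * np.pi)) * \
                 np.exp(-(np.log(t) - mu)**2 / (2 * sigma**2))

      t = np.linspace(0.5, 60, 120)                       # seconds
      y = lognormal_bolus(t, 400, 2.6, 0.55, 2.0)
      y += np.random.default_rng(1).normal(0, 2, t.size)  # measurement noise

      popt, _ = curve_fit(lognormal_bolus, t, y, p0=(300, 2.0, 0.5, 1.0))
      fit = lognormal_bolus(t, *popt)
      print("AUC:", popt[0], "time to peak:", t[np.argmax(fit)],
            "peak enhancement:", fit.max())
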
  3. Quantification of Fel d 1 in house dust samples of cat allergic patients by using monoclonal antibody specific to a novel IgE-binding epitope.

    PubMed

    Tasaniyananda, Natt; Tungtrongchitr, Anchalee; Seesuay, Watee; Sakolvaree, Yuwaporn; Aiumurai, Pisinee; Indrawattana, Nitaya; Chaicumpa, Wanpen; Sookrung, Nitat

    2018-03-01

    Avoidance of allergen exposure is an effective measure for preventing naïve and allergic individuals from sensitization (primary intervention) and disease aggravation (secondary intervention), respectively. Regular monitoring of allergens in the environment is required for effective intervention. Thus, there is a need for cost-effective test kits for environmental allergen quantification. Objective: To develop a test kit for quantification of the cat major allergen, Fel d 1. A mouse monoclonal antibody (MAb) specific to the newly identified IgE-binding conformational epitope of the cat major allergen (Fel d 1) and rabbit polyclonal IgG to recombinant Fel d 1 were used as allergen capture and detection reagents, respectively. Native Fel d 1 was used in constructing a standard curve. Sixteen of 36 dust samples collected from houses of cat-allergic subjects in Bangkok contained Fel d 1 above 0.29 μg/gram of dust, which is considered a novel threshold level for causing cat allergy sensitization or symptoms. Among them, 7 samples contained the allergen at levels exceeding 2.35 μg/gram of dust, the level that would aggravate asthma. Results of allergen quantification using the locally made test kit showed strong correlation (r = 0.923) with quantification using commercial reagents. The assay using the MAb to the Fel d 1 IgE-binding epitope of this study has potential application as an economical and practical tool for cat allergy intervention measures, especially in localities where health resources are relatively limited.

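    Sandwich-assay quantification of this kind reads unknowns off a standard curve built from native Fel d 1, then converts to allergen per gram of dust and applies the reported thresholds. The sketch below assumes a four-parameter logistic curve; the optical densities and extraction factors are invented.

      # ELISA-style quantification against a 4-parameter-logistic standard curve,
      # then applying the 0.29 ug/g sensitization threshold quoted above.
      # Standards, ODs, and extraction factors are made-up numbers.
      import numpy as np
      from scipy.optimize import curve_fit

      def fourpl(x, a, d, c, b):
          return d + (a - d) / (1 + (x / c)**b)

      conc = np.array([0.05, 0.15, 0.5, 1.5, 5.0, 15.0])   # ng/mL standards
      od = np.array([0.08, 0.15, 0.38, 0.85, 1.55, 1.95])  # measured OD450

      p, _ = curve_fit(fourpl, conc, od, p0=(0.05, 2.0, 1.0, 1.0))

      def od_to_conc(y, a, d, c, b):
          return c * ((a - d) / (y - d) - 1)**(1 / b)

      ng_ml = od_to_conc(0.55, *p)                # unknown sample OD = 0.55
      ug_per_g = ng_ml * 2.0 / 0.1 / 1000         # assume 100 mg dust in 2 mL buffer
      print(round(ng_ml, 3), "ng/mL ->", round(ug_per_g, 4), "ug/g dust:",
            "sensitization risk" if ug_per_g > 0.29 else "below threshold")
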
  4. A simplified risk-ranking system for prioritizing toxic pollution sites in low- and middle-income countries.

    PubMed

    Caravanos, Jack; Gualtero, Sandra; Dowling, Russell; Ericson, Bret; Keith, John; Hanrahan, David; Fuller, Richard

    2014-01-01

    In low- and middle-income countries (LMICs), chemical exposures in the environment due to hazardous waste sites and toxic pollutants are typically poorly documented and their health impacts insufficiently quantified. Furthermore, there is often only limited understanding of the health and environmental consequences of point-source pollution problems, and little consensus on how to assess and rank them. The contributions of toxic environmental exposures to the global burden of disease are not well characterized. The aim of this study was to describe the simple but effective approach taken by Blacksmith Institute's Toxic Sites Identification Program to quantify and rank toxic exposures in LMICs. This system is already in use at more than 3000 sites in 48 countries, including India, Indonesia, China, Ghana, Kenya, Tanzania, Peru, Bolivia, Argentina, Uruguay, Armenia, Azerbaijan, and Ukraine. A hazard ranking formula, the Blacksmith Index (BI), takes into account important factors such as the scale of the pollution source, the size of the population possibly affected, and the exposure pathways, and is designed to be used reliably in low-resource settings by local personnel with limited training. Four representative case studies are presented, with varying locations, populations, pollutants, and exposure pathways. The BI was successfully applied to assess the extent and severity of environmental pollution problems at these sites. The BI is a risk-ranking tool that provides direct and straightforward characterization, quantification, and prioritization of toxic pollution sites in settings where time, money, or resources are limited. It will be an important and useful tool for addressing toxic pollution problems in LMICs. Although the BI does not have the sophistication of the US Environmental Protection Agency's Hazard Ranking System, the case studies presented here document the effectiveness of the BI in the field, especially in low-resource settings. Understanding of the risks posed by toxic pollution sites helps assure better use of resources to manage sites and mitigate risks to public health. Quantification of these hazards is an important input to assessments of the global burden of disease. Copyright © 2014 Icahn School of Medicine at Mount Sinai. Published by Elsevier Inc. All rights reserved.

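    The abstract names the ingredients of the Blacksmith Index, source scale, exposed population, and exposure pathways, without giving the formula itself, so the following is a deliberately hypothetical scoring function that only illustrates how such a field-usable risk-ranking tool can be structured.

      # Hypothetical stand-in for a BI-style risk score -- NOT the published
      # Blacksmith Index formula, whose exact form is not given in the abstract.
      import math
      from dataclasses import dataclass

      PATHWAY_WEIGHT = {"ingestion": 3, "inhalation": 2, "dermal": 1}  # assumed

      @dataclass
      class Site:
          name: str
          exceedance: float        # pollutant level / reference dose
          population: int          # people plausibly exposed
          pathways: tuple

          def score(self):
              pathway = sum(PATHWAY_WEIGHT[p] for p in self.pathways)
              # log-compress population so huge cities do not swamp toxicity
              return self.exceedance * pathway * math.log10(self.population + 1)

      sites = [
          Site("tannery outfall", 12.0, 8000, ("ingestion", "dermal")),
          Site("smelter slag", 40.0, 1500, ("inhalation",)),
      ]
      for s in sorted(sites, key=Site.score, reverse=True):
          print(s.name, round(s.score(), 1))
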
  5. Magnetic nanoparticles in different biological environments analyzed by magnetic particle spectroscopy

    NASA Astrophysics Data System (ADS)

    Löwa, Norbert; Seidel, Maria; Radon, Patricia; Wiekhorst, Frank

    2017-04-01

    Quantification of magnetic iron oxide nanoparticles (MNP) in biological systems like cells, tissue, or organs is of vital importance for the development of novel biomedical applications, e.g. magnetofection, drug targeting, or hyperthermia. Among others, the recently developed magnetic measurement technique magnetic particle spectroscopy (MPS) provides signals that are specific to MNP. MPS is based on the non-linear magnetic response of MNP exposed to a strong sinusoidal excitation field of up to 25 mT amplitude and 25 kHz frequency. So far, it has proven a powerful tool for the quantification of MNP in biological systems. In this study we investigated in detail the influence of typical biological media on the magnetic behavior of different MNP systems by MPS. The results reveal that the amplitude and shape (ratio of harmonics) of the MPS spectra allow changes in MNP magnetism caused by different physiological media to be monitored sensitively. Additionally, the observed linear correlation between MPS amplitude and shape alterations can be used to reduce the quantification uncertainty for MNP suspended in a biological environment.

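    The MPS observables mentioned above, harmonic amplitudes and their ratios, can be reproduced qualitatively with a toy forward model: a Langevin magnetisation driven by the 25 mT / 25 kHz excitation field, followed by an FFT. The particle moment and sampling grid are assumed values.

      # Toy MPS forward model: Langevin response to a sinusoidal drive,
      # harmonic amplitudes from an FFT. Particle parameters are illustrative.
      import numpy as np

      f0, B0 = 25e3, 25e-3                 # drive frequency (Hz), amplitude (T)
      m = 2e-18                            # particle moment (A*m^2), assumed
      kBT = 1.380649e-23 * 300
      periods, spp = 32, 64                # periods simulated, samples per period

      t = np.arange(periods * spp) / (f0 * spp)
      B = B0 * np.sin(2 * np.pi * f0 * t)
      x = m * B / kBT
      x = np.where(np.abs(x) < 1e-9, 1e-9, x)   # avoid 0/0 at zero crossings
      M = 1.0 / np.tanh(x) - 1.0 / x            # Langevin (nonlinear) response

      spec = np.abs(np.fft.rfft(M)) / len(M)
      A1, A3, A5 = (spec[k * periods] for k in (1, 3, 5))
      print(f"A3/A1 = {A3/A1:.3f}, A5/A3 = {A5/A3:.3f}")  # 'shape' metrics
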
  6. Uncertainty Quantification and Certification Prediction of Low-Boom Supersonic Aircraft Configurations

    NASA Technical Reports Server (NTRS)

    West, Thomas K., IV; Reuter, Bryan W.; Walker, Eric L.; Kleb, Bil; Park, Michael A.

    2014-01-01

    The primary objective of this work was to develop and demonstrate a process for accurate and efficient uncertainty quantification and certification prediction of low-boom, supersonic, transport aircraft. High-fidelity computational fluid dynamics models of multiple low-boom configurations were investigated, including the Lockheed Martin SEEB-ALR body of revolution, the NASA 69 Delta Wing, and the Lockheed Martin 1021-01 configuration. A nonintrusive polynomial chaos surrogate modeling approach was used to reduce the computational cost of propagating mixed inherent (aleatory) and model-form (epistemic) uncertainty from both the computational fluid dynamics model and the near-field to ground level propagation model. A methodology was also introduced to quantify the plausibility that a design will pass certification under uncertainty. Results of this study include the analysis of each of the three configurations of interest under inviscid and fully turbulent flow assumptions. A comparison of the uncertainty outputs and sensitivity analyses between the configurations is also given. The results of this study illustrate the flexibility and robustness of the developed framework as a tool for uncertainty quantification and certification prediction of low-boom, supersonic aircraft.

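    A minimal one-dimensional version of the nonintrusive polynomial chaos idea: fit a Legendre expansion to a handful of "expensive" model evaluations over a uniform uncertain input, then propagate uncertainty by sampling the cheap surrogate. The placeholder function stands in for a CFD/propagation chain; the study's mixed aleatory/epistemic treatment is not reproduced.

      # 1-D non-intrusive polynomial chaos surrogate (Legendre basis, uniform
      # input on [-1, 1]), fitted by least squares. Placeholder model only.
      import numpy as np
      from numpy.polynomial import legendre

      def expensive_model(xi):                 # stand-in for a CFD output
          return np.exp(0.4 * xi) + 0.3 * xi**2

      xi_train = np.linspace(-1, 1, 9)         # design of experiments
      y_train = expensive_model(xi_train)

      deg = 4
      V = legendre.legvander(xi_train, deg)    # Legendre design matrix
      coef, *_ = np.linalg.lstsq(V, y_train, rcond=None)

      xi_mc = np.random.default_rng(0).uniform(-1, 1, 100_000)
      y_mc = legendre.legval(xi_mc, coef)      # cheap surrogate sampling
      print("mean:", y_mc.mean(), "std:", y_mc.std())
      # Under the uniform measure, E[P_k] = 0 for k > 0, so mean == coef[0]:
      print("PCE mean coefficient:", coef[0])
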
  7. Quantification of early cutaneous manifestations of chronic venous insufficiency by automated analysis of photographic images: Feasibility and technical considerations.

    PubMed

    Becker, François; Fourgeau, Patrice; Carpentier, Patrick H; Ouchène, Amina

    2018-06-01

    We postulate that blue telangiectasia and brownish pigmentation at ankle level, early markers of chronic venous insufficiency, can be quantified for longitudinal studies of chronic venous disease in Caucasian people. Objectives and methods: To describe a photographic technique specially developed for this purpose, and to test the short-term reproducibility of the measures. The pictures were acquired using a dedicated photo stand to position the foot in a reproducible way, with a normalized lighting and acquisition protocol. The image analysis was performed with a tool developed using algorithms optimized to detect and quantify blue telangiectasia and brownish pigmentation and their relative surface in the region of interest. Results: The quantification of blue telangiectasia and brownish pigmentation using automated digital photo analysis is feasible. The short-term reproducibility is good for blue telangiectasia quantification; it is less accurate for brownish pigmentation. Conclusion: The blue telangiectasia of the corona phlebectatica and the ankle flare can be assessed using a clinimetric approach based on automated digital photo analysis.

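    At its core, this pixel-level quantification reduces to segmenting "blue" pixels inside a region of interest and reporting their relative surface. The colour rule, thresholds, and synthetic image below are placeholders; the published tool applies optimised algorithms to normalised, standardised photographs.

      # Sketch of relative-surface quantification of blue pixels in an ROI.
      # Thresholds and the synthetic image are assumptions for illustration.
      import numpy as np

      rng = np.random.default_rng(2)
      img = rng.integers(120, 200, size=(400, 400, 3)).astype(float)  # skin-ish RGB
      img[150:180, 100:300] = (70, 80, 160)      # paint a crude "telangiectasia"

      roi = np.zeros(img.shape[:2], dtype=bool)
      roi[100:300, 50:350] = True                # region of interest at ankle level

      r, g, b = img[..., 0], img[..., 1], img[..., 2]
      blue = (b > 120) & (b > r + 40) & (b > g + 40)   # assumed colour rule

      relative_surface = (blue & roi).sum() / roi.sum()
      print(f"blue telangiectasia surface: {100 * relative_surface:.2f}% of ROI")
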
  8. Customized Consensus Spectral Library Building for Untargeted Quantitative Metabolomics Analysis with Data Independent Acquisition Mass Spectrometry and MetaboDIA Workflow.

    PubMed

    Chen, Gengbo; Walmsley, Scott; Cheung, Gemmy C M; Chen, Liyan; Cheng, Ching-Yu; Beuerman, Roger W; Wong, Tien Yin; Zhou, Lei; Choi, Hyungwon

    2017-05-02

    Data-independent acquisition mass spectrometry (DIA-MS) coupled with liquid chromatography is a promising approach for rapid, automatic sampling of MS/MS data in untargeted metabolomics. However, wide isolation windows in DIA-MS generate MS/MS spectra containing a mixed population of fragment ions together with their precursor ions. A precursor-fragment ion map in a comprehensive MS/MS spectral library is therefore crucial for relative quantification of fragment ions uniquely representative of each precursor ion. However, existing reference libraries are not sufficient for this purpose, since the fragmentation patterns of small molecules can vary in different instrument setups. Here we developed a bioinformatics workflow called MetaboDIA to build customized MS/MS spectral libraries using a user's own data-dependent acquisition (DDA) data and to perform MS/MS-based quantification with DIA data, thus complementing conventional MS1-based quantification. MetaboDIA also allows users to build a spectral library directly from DIA data in studies with a large sample size. Using a marine algae data set, we show that quantification of fragment ions extracted with a customized MS/MS library can provide quantitative data as reliable as direct quantification of precursor ions based on MS1 data. To test its applicability in complex samples, we applied MetaboDIA to a clinical serum metabolomics data set, where we built a DDA-based spectral library containing consensus spectra for 1829 compounds. We then performed fragment ion quantification on the DIA data using this library, yielding sensitive differential expression analysis.

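    The consensus-library step can be miniaturised as follows: pool fragment peaks from replicate DDA spectra of one precursor, group them within an m/z tolerance, and keep majority peaks with averaged intensities. The tolerance, spectra, and majority rule are invented; MetaboDIA adds calibration, filtering, and library formats on top.

      # Consensus-spectrum building in miniature; numbers are invented.
      import numpy as np

      spectra = [                                   # (m/z, intensity) per DDA scan
          np.array([[85.03, 40.], [129.05, 100.], [173.08, 20.]]),
          np.array([[85.04, 35.], [129.06, 90.]]),
          np.array([[85.03, 50.], [129.05, 110.], [201.10, 5.]]),
      ]
      tol = 0.02                                    # m/z grouping tolerance (assumed)

      peaks = np.concatenate(spectra)
      peaks = peaks[np.argsort(peaks[:, 0])]
      # split into groups wherever the m/z gap exceeds the tolerance
      groups = np.split(peaks, np.where(np.diff(peaks[:, 0]) > tol)[0] + 1)

      consensus = [(g[:, 0].mean(), g[:, 1].mean()) for g in groups
                   if len(g) >= 2]                  # majority of the 3 spectra
      print(consensus)
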
  9. Quantification of maltol in Korean ginseng (Panax ginseng) products by high-performance liquid chromatography-diode array detector

    PubMed Central

    Jeong, Hyun Cheol; Hong, Hee-Do; Kim, Young-Chan; Rhee, Young Kyoung; Choi, Sang Yoon; Kim, Kyung-Tack; Kim, Sung Soo; Lee, Young-Chul; Cho, Chang-Won

    2015-01-01

    Background: Maltol, a phenolic compound, is produced by the browning reaction during high-temperature treatment of ginseng. Thus, maltol can be used as a marker for the quality control of various ginseng products manufactured by high-temperature treatment, including red ginseng. For the quantification of maltol in Korean ginseng products, an effective high-performance liquid chromatography-diode array detector (HPLC-DAD) method was developed. Materials and Methods: The HPLC-DAD method for maltol quantification, coupled with a liquid-liquid extraction (LLE) method, was developed and validated in terms of linearity, precision, and accuracy. HPLC separation was performed on a C18 column. Results: The LLE method and HPLC running conditions for maltol quantification were optimized. The calibration curve of maltol exhibited good linearity (R² = 1.00). The limit of detection of maltol was 0.26 μg/mL, and the limit of quantification was 0.79 μg/mL. The relative standard deviations (RSDs) of the intra- and inter-day experiments were <1.27% and 0.61%, respectively. The results of the recovery test were 101.35–101.75% with an RSD of 0.21–1.65%. The developed method was successfully applied to quantify maltol in three ginseng products manufactured by different methods. Conclusion: The validation results demonstrated that the proposed HPLC-DAD method is useful for the quantification of maltol in various ginseng products. PMID:26246746

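    The validation figures quoted above (linearity, LOD, LOQ) follow standard calibration arithmetic: a linear fit, then LOD = 3.3s/m and LOQ = 10s/m from the residual standard deviation s and slope m. The sketch uses invented standards, not the paper's data.

      # Calibration-curve arithmetic: linear fit, LOD/LOQ from residual s.d.
      import numpy as np

      conc = np.array([0.5, 1, 2, 5, 10, 20])        # ug/mL standards (invented)
      area = np.array([10.3, 20.9, 41.8, 104.0, 209.5, 418.9])

      slope, intercept = np.polyfit(conc, area, 1)
      resid = area - (slope * conc + intercept)
      s = resid.std(ddof=2)                          # 2 fitted parameters

      lod = 3.3 * s / slope
      loq = 10 * s / slope
      r2 = 1 - (resid**2).sum() / ((area - area.mean())**2).sum()
      print(f"R^2={r2:.4f}  LOD={lod:.3f} ug/mL  LOQ={loq:.3f} ug/mL")
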
  10. Mass spectrometry–based relative quantification of proteins in precatalytic and catalytically active spliceosomes by metabolic labeling (SILAC), chemical labeling (iTRAQ), and label-free spectral count

    PubMed Central

    Schmidt, Carla; Grønborg, Mads; Deckert, Jochen; Bessonov, Sergey; Conrad, Thomas; Lührmann, Reinhard; Urlaub, Henning

    2014-01-01

    The spliceosome undergoes major changes in protein and RNA composition during pre-mRNA splicing. Knowing the proteins—and their respective quantities—at each spliceosomal assembly stage is critical for understanding the molecular mechanisms and regulation of splicing. Here, we applied three independent mass spectrometry (MS)–based approaches for quantification of these proteins: (1) metabolic labeling by SILAC, (2) chemical labeling by iTRAQ, and (3) label-free spectral counting, to quantify the protein composition of the human spliceosomal precatalytic B and catalytic C complexes. In total we were able to quantify 157 proteins by at least two of the three approaches. Our quantification shows that only a very small subset of spliceosomal proteins (the U5 and U2 Sm proteins, a subset of U5 snRNP-specific proteins, and the U2 snRNP-specific proteins U2A′ and U2B′′) remains unaltered upon transition from the B to the C complex. The MS-based quantification approaches classify the majority of proteins as dynamically associated specifically with the B or the C complex. On the methodological side, we show that metabolically labeled spliceosomes are functionally active in terms of their assembly and splicing kinetics and can be utilized for quantitative studies. Moreover, we obtain consistent quantification results from all three methods, including the relatively straightforward and inexpensive label-free spectral counting technique. PMID:24448447

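    Of the three approaches, label-free spectral counting is the simplest to illustrate: normalised spectral abundance factors (NSAF) per complex, then a per-protein B/C ratio. The lengths and counts below are invented; SILAC and iTRAQ instead quantify via MS1 ratios and reporter ions.

      # NSAF-style spectral-count comparison; all numbers are invented.
      proteins = {            # name: (length, counts in B, counts in C)
          "U5-116K": (972, 48, 45),
          "PRP19":   (504, 30, 12),
          "SF3B1":   (1304, 80, 15),
      }

      def nsaf(col):
          # spectral counts divided by length, normalised to sum to 1
          saf = {n: v[col] / v[0] for n, v in proteins.items()}
          total = sum(saf.values())
          return {n: x / total for n, x in saf.items()}

      nsaf_b, nsaf_c = nsaf(1), nsaf(2)
      for name in proteins:
          print(name, "B/C ratio:", round(nsaf_b[name] / nsaf_c[name], 2))
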
  11. Ariadne's Thread: A Robust Software Solution Leading to Automated Absolute and Relative Quantification of SRM Data.

    PubMed

    Nasso, Sara; Goetze, Sandra; Martens, Lennart

    2015-09-04

    Selected reaction monitoring (SRM) MS is a highly selective and sensitive technique to quantify protein abundances in complex biological samples. To enhance the pace of large SRM studies, a validated, robust method to fully automate absolute quantification and substitute for interactive evaluation would be valuable. To address this demand, we present Ariadne, a Matlab software tool. To quantify monitored targets, Ariadne exploits metadata imported from the transition lists, and targets can be filtered according to mProphet output. Signal processing and statistical learning approaches are combined to compute peptide quantifications. To robustly estimate absolute abundances, the external calibration curve method is applied, ensuring linearity over the measured dynamic range. Ariadne was benchmarked against mProphet and Skyline by comparing its quantification performance on three different dilution series, featuring either noisy/smooth traces without background or smooth traces with complex background. Results, evaluated as efficiency, linearity, accuracy, and precision of quantification, showed that Ariadne's performance is independent of data smoothness and the presence of complex background, that Ariadne outperforms mProphet on the noisier data set, and that it improves Skyline's accuracy and precision 2-fold for the lowest-abundance dilution with complex background. Remarkably, Ariadne could statistically distinguish all abundances from one another, discriminating dilutions as low as 0.1 and 0.2 fmol. These results suggest that Ariadne offers reliable and automated analysis of large-scale SRM differential expression studies.

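    The closing claim, statistically distinguishing 0.1 from 0.2 fmol, ultimately rests on comparing replicate abundance estimates; a minimal version is a two-sample test on simulated replicates (all numbers invented).

      # Distinguishing two low dilutions with a Welch t-test on replicates.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)
      est_01 = rng.normal(0.10, 0.02, 6)   # replicate estimates at 0.1 fmol
      est_02 = rng.normal(0.20, 0.03, 6)   # replicate estimates at 0.2 fmol

      t, p = stats.ttest_ind(est_01, est_02, equal_var=False)
      print(f"Welch t={t:.2f}, p={p:.2g}")
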
  12. Quantification of taurine in energy drinks using ¹H NMR.

    PubMed

    Hohmann, Monika; Felbinger, Christine; Christoph, Norbert; Wachter, Helmut; Wiest, Johannes; Holzgrabe, Ulrike

    2014-05-01

    The consumption of so-called energy drinks is increasing, especially among adolescents. These beverages commonly contain considerable amounts of the amino sulfonic acid taurine, which is associated with a multitude of physiological effects. The customary method to control the legal limit of taurine in energy drinks is LC-UV/vis with postcolumn derivatization using ninhydrin. In this paper we describe the quantification of taurine in energy drinks by ¹H NMR as an alternative to existing methods of quantification. Variation of pH values revealed the separation of a distinct taurine signal in ¹H NMR spectra, which was applied for integration and quantification. Quantification was performed using external calibration (R² > 0.9999; linearity verified by Mandel's fitting test with a 95% confidence level) and PULCON. Taurine concentrations in 20 different energy drinks were analyzed using both ¹H NMR and LC-UV/vis. The deviation between ¹H NMR and LC-UV/vis results was always below the expanded measurement uncertainty of 12.2% for the LC-UV/vis method (95% confidence level) and at worst 10.4%. Due to the high accordance with LC-UV/vis data and adequate recovery rates (ranging between 97.1% and 108.2%), ¹H NMR measurement presents a suitable method to quantify taurine in energy drinks. Copyright © 2013 Elsevier B.V. All rights reserved.

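    The PULCON-style external calibration reduces to a proportionality between signal area, proton count, and concentration. The sketch below deliberately omits receiver-gain, pulse-length, and temperature corrections, and the integrals are invented.

      # Core qNMR arithmetic (PULCON-style, corrections omitted); values invented.
      c_ref = 5.0        # mM reference standard
      area_ref = 1.00    # normalised integral of the reference signal
      n_ref = 2          # protons under the reference signal

      area_sample = 1.38 # integral of a taurine CH2 triplet (hypothetical)
      n_sample = 2       # protons under the taurine signal

      c_sample = c_ref * (area_sample / area_ref) * (n_ref / n_sample)
      mg_per_l = c_sample * 125.15        # taurine molar mass, g/mol
      print(f"taurine: {c_sample:.2f} mM = {mg_per_l:.0f} mg/L")
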
  13. Simultaneous quantification of Δ9-tetrahydrocannabinol, 11-hydroxy-Δ9-tetrahydrocannabinol, and 11-nor-Δ9-tetrahydrocannabinol-9-carboxylic acid in human plasma using two-dimensional gas chromatography, cryofocusing, and electron impact-mass spectrometry

    PubMed Central

    Lowe, Ross H.; Karschner, Erin L.; Schwilke, Eugene W.; Barnes, Allan J.; Huestis, Marilyn A.

    2009-01-01

    A two-dimensional (2D) gas chromatography/electron impact-mass spectrometry (GC/EI-MS) method for simultaneous quantification of Δ9-tetrahydrocannabinol (THC), 11-hydroxy-Δ9-tetrahydrocannabinol (11-OH-THC), and 11-nor-Δ9-tetrahydrocannabinol-9-carboxylic acid (THCCOOH) in human plasma was developed and validated. The method employs 2D capillary GC and cryofocusing for enhanced resolution and sensitivity. THC, 11-OH-THC, and THCCOOH were extracted by precipitation with acetonitrile followed by solid-phase extraction. GC separation of trimethylsilyl derivatives of analytes was accomplished with two capillary columns in series coupled via a pneumatic Deans switch system. Detection and quantification were accomplished with a bench-top single quadrupole mass spectrometer operated in electron impact-selected ion monitoring mode. Limits of quantification (LOQ) were 0.125, 0.25 and 0.125 ng/mL for THC, 11-OH-THC, and THCCOOH, respectively. Accuracy ranged from 86.0 to 113.0% for all analytes. Intra- and inter-assay precision, as percent relative standard deviation, was less than 14.1% for THC, 11-OH-THC, and THCCOOH. The method was successfully applied to quantification of THC and its 11-OH-THC and THCCOOH metabolites in plasma specimens following controlled administration of THC. PMID:17640656

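    Quantification in selected ion monitoring mode typically runs through an internal-standard calibration: the analyte/IS area ratio is regressed on concentration and inverted for unknowns. The calibrators and ratios below are invented, and a deuterated IS is assumed rather than taken from the paper.

      # Internal-standard calibration: fit response ratio vs. concentration,
      # then invert for an unknown. All numbers are invented.
      import numpy as np

      conc = np.array([0.125, 0.25, 0.5, 1.0, 2.5, 5.0])          # ng/mL calibrators
      ratio = np.array([0.061, 0.124, 0.246, 0.502, 1.24, 2.51])  # analyte/IS areas

      slope, intercept = np.polyfit(conc, ratio, 1)
      unknown_ratio = 0.83
      print(round((unknown_ratio - intercept) / slope, 3), "ng/mL THC")
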
  14. Ranking Fragment Ions Based on Outlier Detection for Improved Label-Free Quantification in Data-Independent Acquisition LC-MS/MS

    PubMed Central

    Bilbao, Aivett; Zhang, Ying; Varesio, Emmanuel; Luban, Jeremy; Strambio-De-Castillia, Caterina; Lisacek, Frédérique; Hopfgartner, Gérard

    2016-01-01

    Data-independent acquisition LC-MS/MS techniques complement supervised methods for peptide quantification. However, due to the wide precursor isolation windows, these techniques are prone to interference at the fragment ion level, which in turn is detrimental for accurate quantification. The “non-outlier fragment ion” (NOFI) ranking algorithm has been developed to assign low priority to fragment ions affected by interference. By using the optimal subset of high-priority fragment ions, these interfered fragment ions are effectively excluded from quantification. NOFI represents each fragment ion as a vector of four dimensions related to chromatographic and MS fragmentation attributes and applies multivariate outlier detection techniques. Benchmarking conducted on a well-defined quantitative dataset (i.e., the SWATH Gold Standard) indicates that NOFI on average is able to accurately quantify 11-25% more peptides than the commonly used Top-N library intensity ranking method. The sum of the areas of the Top3-5 NOFIs produces similar coefficients of variation as the library intensity method but with more accurate quantification results. On a biologically relevant human dendritic cell digest dataset, NOFI properly assigns low priority ranks to 85% of annotated interferences, resulting in sensitivity values between 0.92 and 0.80, against 0.76 for the Spectronaut interference detection algorithm. PMID:26412574

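    The NOFI idea in miniature: describe each fragment ion by a small feature vector, score multivariate outlyingness by Mahalanobis distance, and quantify from the least outlying (highest-priority) fragments. The four features below are stand-ins for the published attributes, and a robust covariance estimator would be preferable in practice.

      # Mahalanobis-distance ranking of fragment ions; features are invented.
      import numpy as np

      rng = np.random.default_rng(4)
      # columns: RT shift, peak-shape correlation, intensity error, m/z error
      frags = rng.normal(0, 1, (10, 4))
      frags[2] += (4.0, -3.5, 5.0, 0.5)        # one fragment hit by interference

      mu = frags.mean(axis=0)
      cov = np.cov(frags, rowvar=False)        # a robust estimator (e.g. MCD)
      d2 = np.einsum('ij,jk,ik->i',            # would be used in practice
                     frags - mu, np.linalg.inv(cov), frags - mu)

      rank = np.argsort(d2)                    # low distance = high priority
      top_n = rank[:5]                         # quantify from the Top-5 "NOFIs"
      print("priority order:", rank, "| most outlier-like:", rank[-1])
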
  15. New approach for the quantification of processed animal proteins in feed using light microscopy.

    PubMed

    Veys, P; Baeten, V

    2010-07-01

    A revision of European Union's total feed ban on animal proteins in feed will need robust quantification methods, especially for control analyses, if tolerance levels are to be introduced, as for fishmeal in ruminant feed. In 2006, a study conducted by the Community Reference Laboratory for Animal Proteins in feedstuffs (CRL-AP) demonstrated the deficiency of the official quantification method based on light microscopy. The study concluded that the method had to be revised. This paper puts forward an improved quantification method based on three elements: (1) the preparation of permanent slides with an optical adhesive preserving all morphological markers of bones necessary for accurate identification and precision counting; (2) the use of a counting grid eyepiece reticle; and (3) new definitions for correction factors for the estimated portions of animal particles in the sediment. This revised quantification method was tested on feeds adulterated at different levels with bovine meat and bone meal (MBM) and fishmeal, and it proved to be effortless to apply. The results obtained were very close to the expected values of contamination levels for both types of adulteration (MBM or fishmeal). Calculated values were not only replicable, but also reproducible. The advantages of the new approach, including the benefits of the optical adhesive used for permanent slide mounting and the experimental conditions that need to be met to implement the new method correctly, are discussed.

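    With counts in hand, the estimate behind such a method is simple proportion arithmetic scaled by correction factors for the animal-particle share of the sediment. The formula and numbers below are a hypothetical illustration, not the paper's actual correction factors.

      # Hypothetical counting arithmetic: fraction of bone particles on the
      # counting grid, scaled by assumed sediment and correction factors.
      bone_particles = 18        # particles identified as MBM bone on the grid
      total_particles = 600      # all sediment particles counted
      sediment_share = 0.12      # sediment mass / total feed mass (assumed)
      correction = 2.5           # assumed correction factor for animal particles

      mbm_percent = 100 * (bone_particles / total_particles) \
                    * sediment_share * correction
      print(f"estimated MBM content: {mbm_percent:.2f}% of feed")
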
  16. Quantification of methionine and selenomethionine in biological samples using multiple reaction monitoring high performance liquid chromatography tandem mass spectrometry (MRM-HPLC-MS/MS).

    PubMed

    Vu, Dai Long; Ranglová, Karolína; Hájek, Jan; Hrouzek, Pavel

    2018-05-01

    Quantification of selenated amino acids currently relies on methods employing inductively coupled plasma mass spectrometry (ICP-MS). Although very accurate, these methods do not allow the simultaneous determination of standard amino acids, hampering comparison of the content of selenated versus non-selenated species such as methionine (Met) and selenomethionine (SeMet). This paper reports two approaches for the simultaneous quantification of Met and SeMet. In the first approach, standard enzymatic hydrolysis employing Protease XIV was applied for the preparation of samples. The second approach utilized methanesulfonic acid (MA) for the hydrolysis of samples, either in a reflux system or in a microwave oven, followed by derivatization with diethyl ethoxymethylenemalonate. The prepared samples were then analyzed by multiple reaction monitoring high performance liquid chromatography tandem mass spectrometry (MRM-HPLC-MS/MS). Both approaches provided platforms for the accurate determination of the selenium/sulfur substitution rate in Met. Moreover, the second approach also provided accurate simultaneous quantification of Met and SeMet with a low limit of detection, a low limit of quantification, and a wide linearity range, comparable to the commonly used gas chromatography mass spectrometry (GC-MS) method or ICP-MS. The novel method was validated using certified reference material in conjunction with the GC-MS reference method. Copyright © 2018. Published by Elsevier B.V.

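    Once both species are quantified against their own calibration curves, the substitution rate the method reports is simple arithmetic (values invented):

      # Selenium/sulfur substitution rate from quantified concentrations.
      met_umol = 12.4      # quantified Met (hypothetical)
      semet_umol = 3.1     # quantified SeMet (hypothetical)
      rate = semet_umol / (met_umol + semet_umol)
      print(f"Se/(Se+S) substitution in Met: {100 * rate:.1f}%")
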
  17. Aerosol-type retrieval and uncertainty quantification from OMI data

    NASA Astrophysics Data System (ADS)

    Kauppi, Anu; Kolmonen, Pekka; Laine, Marko; Tamminen, Johanna

    2017-11-01

    We discuss uncertainty quantification for aerosol-type selection in satellite-based atmospheric aerosol retrieval. The retrieval procedure uses precalculated aerosol microphysical models stored in look-up tables (LUTs) together with top-of-atmosphere (TOA) spectral reflectance measurements to solve for the aerosol characteristics. The forward model approximations cause systematic differences between the modelled and observed reflectance. Acknowledging this model discrepancy as a source of uncertainty allows us to produce more realistic uncertainty estimates and assists the selection of the most appropriate LUTs for each individual retrieval. This paper focuses on the aerosol microphysical model selection and characterisation of uncertainty in the retrieved aerosol type and aerosol optical depth (AOD). The concept of model evidence is used as a tool for model comparison. The method is based on a Bayesian inference approach, in which all uncertainties are described as a posterior probability distribution. When there is no single best-matching aerosol microphysical model, we use a statistical technique based on Bayesian model averaging to combine the AOD posterior probability densities of the best-fitting models to obtain an averaged AOD estimate. We also determine the shared evidence of the best-matching models of a certain main aerosol type in order to quantify how plausible it is that it represents the underlying atmospheric aerosol conditions. The developed method is applied to Ozone Monitoring Instrument (OMI) measurements using a multiwavelength approach for retrieving the aerosol type and AOD estimate with uncertainty quantification for cloud-free over-land pixels. Several larger pixel-set areas were studied in order to investigate the robustness of the developed method. We evaluated the retrieved AOD by comparison with ground-based measurements at example sites. We found that the uncertainty of AOD expressed by the posterior probability distribution reflects the difficulty of model selection. The posterior probability distribution can provide a comprehensive characterisation of the uncertainty in this kind of aerosol-type selection problem. As a result, the proposed method can account for the model error and also include the model selection uncertainty in the total uncertainty budget.

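    The Bayesian model averaging step can be shown concretely: each candidate aerosol model contributes an AOD posterior weighted by its normalised evidence, and the averaged density yields a single estimate with a widened uncertainty. The Gaussian posteriors and evidence values below are invented.

      # Bayesian model averaging of per-model AOD posteriors; numbers invented.
      import numpy as np

      tau = np.linspace(0.0, 1.0, 501)
      dtau = tau[1] - tau[0]

      def gauss(x, mu, sd):
          return np.exp(-0.5 * ((x - mu) / sd)**2) / (sd * np.sqrt(2 * np.pi))

      posteriors = [gauss(tau, 0.32, 0.04),      # p(tau | y, m) per LUT model
                    gauss(tau, 0.38, 0.05),
                    gauss(tau, 0.60, 0.10)]
      evidence = np.array([1.0, 0.7, 0.05])      # p(y | m), unnormalised
      w = evidence / evidence.sum()

      p_bma = sum(wi * pi for wi, pi in zip(w, posteriors))
      p_bma /= p_bma.sum() * dtau                # renormalise on the grid
      aod_mean = (tau * p_bma).sum() * dtau
      aod_sd = np.sqrt(((tau - aod_mean)**2 * p_bma).sum() * dtau)
      print(f"weights={np.round(w, 3)}  AOD={aod_mean:.3f} +/- {aod_sd:.3f}")
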
  18. Quantification of a biomarker of acetaminophen protein adducts in human serum by high-performance liquid chromatography-electrospray ionization-tandem mass spectrometry: clinical and animal model applications.

    PubMed

    Cook, Sarah F; King, Amber D; Chang, Yan; Murray, Gordon J; Norris, Hye-Ryun K; Dart, Richard C; Green, Jody L; Curry, Steven C; Rollins, Douglas E; Wilkins, Diana G

    2015-03-15

    The aims of this study were to develop, validate, and apply a high-performance liquid chromatography-electrospray ionization-tandem mass spectrometry (HPLC-ESI-MS/MS) method for quantification of protein-derived 3-(cystein-S-yl)-acetaminophen (APAP-Cys) in human serum. Formation of acetaminophen (APAP) protein adducts is thought to be a critical, early event in the development of APAP-induced hepatotoxicity, and quantification of these protein adducts in human serum represents a valuable tool for assessment of APAP exposure, metabolism, and toxicity. In the reported procedure, serum samples were first dialyzed or passed through gel filtration columns to remove APAP-Cys not covalently bound to proteins. Serum eluates were then subjected to enzymatic protease digestion to liberate protein-bound APAP-Cys. Norbuprenorphine-D3 was utilized as an internal standard (IS). APAP-Cys and IS were recovered from digested serum by protein precipitation with acetonitrile, and sample extracts were analyzed by HPLC-ESI-MS/MS. The method was validated by assessment of intra- and inter-assay accuracy and imprecision on two different analytical instrument platforms. APAP-Cys could be accurately quantified from 0.010 to 10μM, and intra- and inter-assay imprecision were <15% on both analytical instruments. APAP-Cys was stable in human serum for three freeze-thaw cycles and for 24h at ambient temperature. Extracted samples were stable when stored in refrigerated autosamplers for the typical duration of analysis or when stored at -20°C for six days. Results from process efficiency and matrix effect experiments indicated adequate recovery from human serum and insignificant ion suppression or enhancement. The utility and sensitivity of the reported procedure were illustrated by analysis of clinical samples collected from subjects taking chronic, therapeutic doses of APAP. Applicability to other biological matrices was also demonstrated by measurement of protein-derived APAP-Cys in plasma collected from APAP-treated mice, a common animal model of APAP-induced hepatotoxicity. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. A quantitative, selective and fast ultra-high performance liquid chromatography tandem mass spectrometry method for the simultaneous analysis of 33 basic drugs in hair (amphetamines, cocaine, opiates, opioids and metabolites).

    PubMed

    Fernández, María Del Mar Ramírez; Di Fazio, Vincent; Wille, Sarah M R; Kummer, Natalie; Samyn, Nele

    2014-08-15

    Forensic testing for drugs of abuse in hair has become a useful diagnostic tool in determining chronic drug use as well as examining long-term drug history through segmental analysis. However, sensitive and specific analytical methods are needed. A simple, rapid, and highly sensitive and specific method for the extraction and quantification of 33 opioids, opiates, cocaine, and amphetamines is presented. The method was fully validated according to international guidelines. Twenty milligrams of hair sample was pulverized and then incubated in the same disposable tube with methanol (under sonication at 45°C) for 4 h. After centrifugation the supernatant was evaporated to about 100 μL, and solid phase extraction (SPE) followed by separation and quantification using ultra-high performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) was carried out. Chromatographic separation was achieved using a BEH phenyl column eluted with 0.1% formic acid : methanol (0.1% formic acid). Selectivity of the method was achieved by a combination of retention time and two precursor-product ion transitions. Good intra-assay and inter-assay precision (relative standard deviations <15%) was observed for most of the compounds. The lower limit of quantification was fixed at the lowest calibrator in the linearity experiments and ranged from 0.006 to 0.063 ng/mg. No instability was observed in processed samples. Extraction efficiency varied from 37 to 107% (except for EDDP, with a recovery of 5%) and matrix effects ranged from 52 to 160%; for most of the compounds this was compensated by the internal standard (IS). The method was subsequently applied to authentic hair samples obtained from forensic and toxicology cases and to proficiency tests (obtaining z-scores lower than 1 for most of the compounds). The validation and actual sample analysis results show that this method is rugged, precise, accurate, and well suited for routine hair analysis. Copyright © 2014 Elsevier B.V. All rights reserved.

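    Recovery and matrix-effect percentages of the kind reported above usually come from the standard three-set post-extraction spiking design; the arithmetic is shown below with invented peak areas.

      # Matrix effect / recovery arithmetic (three-set design); areas invented.
      area_neat = 1000.0        # set A: standard in neat solvent
      area_post = 820.0         # set B: blank extract spiked after extraction
      area_pre = 640.0          # set C: blank sample spiked before extraction

      matrix_effect = 100 * area_post / area_neat        # <100% = suppression
      extraction_recovery = 100 * area_pre / area_post
      process_efficiency = 100 * area_pre / area_neat
      print(matrix_effect, extraction_recovery, process_efficiency)
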
  20. Spectral Entropies as Information-Theoretic Tools for Complex Network Comparison

    NASA Astrophysics Data System (ADS)

    De Domenico, Manlio; Biamonte, Jacob

    2016-10-01

    Any physical system can be viewed from the perspective that information is implicitly represented in its state. However, the quantification of this information when it comes to complex networks has remained largely elusive. In this work, we use techniques inspired by quantum statistical mechanics to define an entropy measure for complex networks and to develop a set of information-theoretic tools, based on network spectral properties, such as Rényi q entropy, generalized Kullback-Leibler and Jensen-Shannon divergences, the latter allowing us to define a natural distance measure between complex networks. First, we show that by minimizing the Kullback-Leibler divergence between an observed network and a parametric network model, inference of model parameter(s) by means of maximum-likelihood estimation can be achieved and model selection can be performed with appropriate information criteria. Second, we show that the information-theoretic metric quantifies the distance between pairs of networks and we can use it, for instance, to cluster the layers of a multilayer system. By applying this framework to networks corresponding to sites of the human microbiome, we perform hierarchical cluster analysis and recover with high accuracy existing community-based associations. Our results imply that spectral-based statistical inference in complex networks results in demonstrably superior performance as well as a conceptual backbone, filling a gap towards a network information theory.

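    A concrete toy instance of the spectral framework described above: build a density matrix from the graph Laplacian, rho = e^(-beta*L)/Z, and compute the Von Neumann entropy from its eigenvalues.

      # Von Neumann-style spectral entropy of a small graph.
      import numpy as np

      A = np.array([[0, 1, 1, 0],
                    [1, 0, 1, 0],
                    [1, 1, 0, 1],
                    [0, 0, 1, 0]], dtype=float)     # 4-node example graph
      L = np.diag(A.sum(axis=1)) - A                # combinatorial Laplacian

      beta = 1.0
      lam = np.linalg.eigvalsh(L)
      weights = np.exp(-beta * lam)
      p = weights / weights.sum()                   # eigenvalues of rho
      S = -(p * np.log(p)).sum()                    # entropy in nats
      print(f"S = {S:.4f}")
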