Science.gov

Sample records for quantification spatialisation vulnerabilite

  1. The Assessment of Vulnerabilities, Risks and Uncertainties

    NASA Astrophysics Data System (ADS)

    Birkmann, Jörn; Greiving, Stefan; Serdeczny, Olivia Maria

    The risks and potential consequences of climate change for people, production systems and ecosystems are closely intertwined with socioeconomic developments and framework conditions. The key concepts of "vulnerability", "risk" and "uncertainty" are examined more closely in order to clarify, among other things, how they are used in the newer risk approach of the Fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change (IPCC). The risk concept is distinguished from the vulnerability concept, shifting the focus to the consideration of hazard and exposure. The question of what is meant by uncertainty, and by the ranges of possible climate developments and of so-called socioeconomic development pathways, also plays an important role. Existing methods for investigating risks in the context of climate change, and the decision processes built on them, are discussed with a view to future adaptation measures.

  2. Race, space, place: notes on the racialisation and spatialisation of commercial sex work in Dubai, UAE.

    PubMed

    Mahdavi, Pardis

    2010-11-01

    This paper focuses on the perceived racialisation and resultant spatialisation of commercial sex in Dubai. In recent years, the sex industry in Dubai has grown to include women from the Middle East, Eastern Europe, East Asia and Africa. With the increase in sex workers of different nationalities has come a form of localised racism that is embedded in structures and desires seen within specific locations. The physical spatialisation of sex work hinges on perceived race and produces distinct income generating potential for women engaged in the sex industry in Dubai. The social and physical topography of Dubai is important in marginalising or privileging these various groups of sex workers, which correlates race, space and place with rights and assistance. I begin with a description of the multidirectional flows of causality between race, space, place and demand. I then discuss how these various groups are inversely spatialised within the discourse on assistance, protection and rights. The findings presented here are based on ethnographic research conducted with transnational migrants in the UAE in 2004, 2008 and 2009.

  3. RASOR Project: Rapid Analysis and Spatialisation of Risk, from Hazard to Risk using EO data

    NASA Astrophysics Data System (ADS)

    Rossi, Lauro; Rudari, Roberto

    2016-04-01

    Over recent decades, there has been a dramatic rise in disasters and in their impact on human populations. The escalating complexity of our societies is making risks increasingly difficult to understand and is changing the ways in which hazards interact with each other. The Rapid Analysis and Spatialisation Of Risk (RASOR) project developed a multi-hazard risk analysis platform to support the full cycle of disaster management. RASOR provides up-to-date hazard information across floods and geohazards, up-to-date exposure data from known sources and newly generated EO-based data, and quantitative characterisations of their vulnerabilities. RASOR also adapts the newly developed 12 m resolution global TanDEM-X Digital Elevation Model (DEM) to risk management applications, using it as a base layer to develop specific disaster scenarios. RASOR overlays archived and near-real-time very high resolution optical and radar satellite data, combined with in situ data, for both global and local applications. A scenario-driven query system allows users to project situations into the future and model multi-hazard risk both before and during an event. Applications at different case study sites are presented to illustrate the platform's potential.

  4. Calculation of Agricultural Nitrogen Quantity for EU15, spatialisation of the results to river basins using CORINE Land Cover

    NASA Astrophysics Data System (ADS)

    Campling, P.; Terres, J. M.; Vandewalle, S.; Crouzet, P.

    2003-04-01

    The objective of the study was the implementation of the OECD/Eurostat soil surface balance method to calculate nitrogen balances from agricultural sources for the whole European Union (EU) at the administrative or river basin level. The methodology combines statistics on crop areas and animal numbers with agronomic technical coefficients and Corine Land Cover data to spatialise agricultural nitrogen quantities through spatial modelling. Results for catchments show an EU average surplus of 60 kg N/ha. The distribution of the balances shows high surpluses in regions of intensive livestock farming (Flanders (B), the Netherlands, Brittany (FR)) and low or deficit values in the central areas of Spain, France and Italy. The effect of Corine Land Cover on the nitrogen balance calculations was also examined through scenario analysis. These simulations indicated a slight improvement in estimation when Corine Land Cover was used to spatialise the results of the soil surface balance model. A sensitivity analysis of the technical coefficients showed that the model is more sensitive to crop-related coefficients than to manure coefficients. The overall sensitivity analysis revealed the need to improve the quality of the technical coefficients, which require more consistency and harmonisation and a better reflection of regional differences.
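
    A minimal sketch of the soil surface balance arithmetic described above, in Python; the input categories follow the general OECD/Eurostat scheme, but the function name and all numbers are hypothetical placeholders, not values from the study:

        # Illustrative soil-surface nitrogen balance: surplus = inputs - outputs,
        # expressed per hectare. Categories and values are hypothetical.
        def nitrogen_surplus_kg_per_ha(fertiliser_n, manure_n, deposition_n,
                                       fixation_n, crop_uptake_n, area_ha):
            """Return the nitrogen surplus in kg N/ha for one region."""
            inputs = fertiliser_n + manure_n + deposition_n + fixation_n  # kg N
            outputs = crop_uptake_n            # kg N removed with the harvest
            return (inputs - outputs) / area_ha

        # Example region (all values in kg N, invented):
        print(nitrogen_surplus_kg_per_ha(120000, 80000, 15000, 10000, 165000, 1000.0))
        # -> 60.0 kg N/ha, matching the EU average surplus reported above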

  5. Dystrophin quantification

    PubMed Central

    Anthony, Karen; Arechavala-Gomeza, Virginia; Taylor, Laura E.; Vulin, Adeline; Kaminoh, Yuuki; Torelli, Silvia; Feng, Lucy; Janghra, Narinder; Bonne, Gisèle; Beuvin, Maud; Barresi, Rita; Henderson, Matt; Laval, Steven; Lourbakos, Afrodite; Campion, Giles; Straub, Volker; Voit, Thomas; Sewry, Caroline A.; Morgan, Jennifer E.; Flanigan, Kevin M.

    2014-01-01

    Objective: We formed a multi-institution collaboration in order to compare dystrophin quantification methods, reach a consensus on the most reliable method, and report its biological significance in the context of clinical trials. Methods: Five laboratories with expertise in dystrophin quantification performed a data-driven comparative analysis of a single reference set of normal and dystrophinopathy muscle biopsies using quantitative immunohistochemistry and Western blotting. We developed standardized protocols and assessed inter- and intralaboratory variability over a wide range of dystrophin expression levels. Results: Results from the different laboratories were highly concordant with minimal inter- and intralaboratory variability, particularly with quantitative immunohistochemistry. There was a good level of agreement between data generated by immunohistochemistry and Western blotting, although immunohistochemistry was more sensitive. Furthermore, mean dystrophin levels determined by alternative quantitative immunohistochemistry methods were highly comparable. Conclusions: Considering the biological function of dystrophin at the sarcolemma, our data indicate that quantitative immunohistochemistry and Western blotting, used in combination, are reliable biochemical outcome measures for Duchenne muscular dystrophy clinical trials, and that standardized protocols yield comparable results between competent laboratories. The methodology validated in our study will facilitate the development of experimental therapies focused on dystrophin production and their regulatory approval. PMID:25355828

  6. Quantification of endogenous retinoids.

    PubMed

    Kane, Maureen A; Napoli, Joseph L

    2010-01-01

    Numerous physiological processes require retinoids, including development, nervous system function, immune responsiveness, proliferation, differentiation, and all aspects of reproduction. Reliable retinoid quantification requires suitable handling and, in some cases, resolution of geometric isomers that have different biological activities. Here we describe procedures for reliable and accurate quantification of retinoids, including detailed descriptions for handling retinoids, preparing standard solutions, collecting samples and harvesting tissues, extracting samples, resolving isomers, and detecting with high sensitivity. Sample-specific strategies are provided for optimizing quantification. Approaches to evaluate assay performance also are provided. Retinoid assays described here for mice also are applicable to other organisms including zebrafish, rat, rabbit, and human and for cells in culture. Retinoid quantification, especially that of retinoic acid, should provide insight into many diseases, including Alzheimer's disease, type 2 diabetes, obesity, and cancer.

  7. Quantification of micro stickies

    Treesearch

    Mahendra. Doshi; Jeffrey. Dyer; Salman. Aziz; Kristine. Jackson; Said M. Abubakr

    1997-01-01

    The objective of this project was to compare the different methods for the quantification of micro stickies. The hydrophobic materials investigated in this project for the collection of micro stickies were Microfoam* (polypropylene packing material), low density polyethylene film (LDPE), high density polyethylene (HDPE; a flat piece from a square plastic bottle), paper...

  8. Quantificational logic of context

    SciTech Connect

    Buvac, Sasa

    1996-12-31

    In this paper we extend the Propositional Logic of Context to the quantificational (predicate calculus) case. This extension is important in the declarative representation of knowledge for two reasons. Firstly, since contexts are objects in the semantics which can be denoted by terms in the language and which can be quantified over, the extension enables us to express arbitrary first-order properties of contexts. Secondly, since the extended language is no longer only propositional, we can express that an arbitrary predicate calculus formula is true in a context. The paper describes the syntax and the semantics of a quantificational language of context, gives a Hilbert style formal system, and outlines a proof of the system's completeness.
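
    For illustration, a formula of the kind such an extension admits, written with the ist(c, phi) modality ("phi is true in context c") commonly used in this line of work; the instantiation step shown glosses over domain-variation subtleties, and the constant a is a placeholder:

        % Quantification over contexts combined with quantification inside a context:
        % in every context c, if "all x satisfy P" holds in c, then P(a) holds in c.
        \forall c\, \bigl( \mathrm{ist}(c, \forall x\, P(x)) \rightarrow \mathrm{ist}(c, P(a)) \bigr)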

  9. Semiautomatic quantification of angiogenesis.

    PubMed

    Boettcher, Markus; Gloe, Torsten; de Wit, Cor

    2010-07-01

    Angiogenesis is of major interest in developmental biology and cancer research. Different experimental approaches are available to study angiogenesis; they share the need for microscopy, image acquisition, and analysis. Problems encountered here include the size of the structures, which requires the generation of composite images, and the difficulty of quantifying angiogenic activity reliably and rapidly. Most graphic software packages lack some of the functions required for easy, semiautomatic quantification of angiogenesis and, consequently, multiple software packages or expensive programs have to be used to cover all necessary functions. A software package (AQuaL) to analyze angiogenic activity was developed using Java, which can be used platform-independently. It includes image acquisition relying on the Java Media Framework and an easy-to-use image alignment tool. Multiple overlapping images can be aligned and saved without limitations and without loss of resolution into a composite image, requiring only the selection of a single point representing a characteristic structure in adjacent images. Angiogenic activity can be quantified in composite images semiautomatically by assessing the area overgrown by cells after filtering and image binarization. In addition, tagging of capillary-like structures allows quantification of their length and branching pattern. Both methods deliver reliable, mutually correlating data, as exemplified in the aortic ring angiogenesis assay. The developed software provides modular functions specifically targeted at quantifying angiogenesis. Whereas the area measurement saves time, the length measurement provides additional information about branching patterns, which is required for a qualitative differentiation of capillary growth. (c) 2010 Elsevier Inc. All rights reserved.
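
    A toy sketch of the area-based quantification step described above (filtering aside): binarize an image against a threshold and report the covered area fraction. The threshold and the random stand-in image are placeholders; this is not AQuaL's actual algorithm:

        # Area-fraction quantification by image binarization (illustrative only).
        import numpy as np

        def overgrown_area_fraction(image, threshold):
            """Fraction of pixels whose intensity exceeds `threshold`."""
            binary = image > threshold     # binarize
            return binary.mean()           # area fraction covered by cells

        rng = np.random.default_rng(0)
        img = rng.random((512, 512))       # stand-in for a composite image
        print(f"covered area: {overgrown_area_fraction(img, 0.8):.1%}")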

  10. Wrappers, Aspects, Quantification and Events

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.

    2005-01-01

    Talk overview: Object Infrastructure Framework (OIF), a system developed to simplify building distributed applications by allowing independent implementation of multiple concerns. Essence and state of AOP. Trinity. Quantification over events. Current work on a generalized AOP technology.

  11. Uncertainty Quantification in Aeroelasticity

    NASA Astrophysics Data System (ADS)

    Beran, Philip; Stanford, Bret; Schrock, Christopher

    2017-01-01

    Physical interactions between a fluid and structure, potentially manifested as self-sustained or divergent oscillations, can be sensitive to many parameters whose values are uncertain. Of interest here are aircraft aeroelastic interactions, which must be accounted for in aircraft certification and design. Deterministic prediction of these aeroelastic behaviors can be difficult owing to physical and computational complexity. New challenges are introduced when physical parameters and elements of the modeling process are uncertain. By viewing aeroelasticity through a nondeterministic prism, where key quantities are assumed stochastic, one may gain insights into how to reduce system uncertainty, increase system robustness, and maintain aeroelastic safety. This article reviews uncertainty quantification in aeroelasticity using traditional analytical techniques not reliant on computational fluid dynamics; compares and contrasts this work with emerging methods based on computational fluid dynamics, which target richer physics; and reviews the state of the art in aeroelastic optimization under uncertainty. Barriers to continued progress, for example, the so-called curse of dimensionality, are discussed.

  12. Quantification of human responses

    NASA Technical Reports Server (NTRS)

    Steinlage, R. C.; Gantner, T. E.; Lim, P. Y. W.

    1992-01-01

    Human perception is a complex phenomenon which is difficult to quantify with instruments. For this reason, large panels of people are often used to elicit and aggregate subjective judgments. Print quality, taste, smell, sound quality of a stereo system, softness, and the grading of Olympic divers and skaters are some examples of situations where subjective measurements or judgments are paramount. We usually express what is in our minds through the medium of language, but languages offer only a limited choice of vocabulary, and as a result our verbalizations are only approximate expressions of what we really have in mind. For lack of better methods to quantify subjective judgments, it is customary to set up a numerical scale such as 1, 2, 3, 4, 5 or 1, 2, 3, ..., 9, 10 for characterizing human responses and subjective judgments, with no justification except that these scales are easy to understand and convenient to use. But such numerical scales are arbitrary simplifications of the complex human mind; the human mind is not restricted to such simple numerical variations. In fact, human responses and subjective judgments are psychophysical phenomena that are fuzzy entities and therefore difficult to handle with conventional mathematics and probability theory. The fuzzy mathematical approach provides a more realistic insight into understanding and quantifying human responses. This paper presents a method for quantifying human responses and subjective judgments without assuming a pattern of linear or numerical variation for human responses. In particular, the quantification and evaluation of linguistic judgments was investigated.
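
    As a toy illustration of the fuzzy alternative to a fixed numerical scale, a judgment can be represented as a membership function over the response axis rather than as a single score; the triangular shape and all numbers below are assumptions, not the paper's method:

        # A judgment like "about 7, somewhere between 6 and 9" becomes a
        # membership function rather than the single integer 7.
        def triangular_membership(x, low, peak, high):
            """Degree (0..1) to which response x belongs to the fuzzy judgment."""
            if x <= low or x >= high:
                return 0.0
            if x <= peak:
                return (x - low) / (peak - low)
            return (high - x) / (high - peak)

        for x in range(5, 11):
            print(x, round(triangular_membership(x, 6, 7, 9), 2))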

  13. Micro-RNA quantification using DNA polymerase and pyrophosphate quantification.

    PubMed

    Yu, Hsiang-Ping; Hsiao, Yi-Ling; Pan, Hung-Yin; Huang, Chih-Hung; Hou, Shao-Yi

    2011-12-15

    A rapid quantification method for micro-RNA based on DNA polymerase activity and pyrophosphate quantification has been developed. The tested micro-RNA serves as the primer, unlike the DNA primer used in DNA sequencing methods, and a DNA probe serves as the template for DNA replication. After DNA synthesis, pyrophosphate detection and quantification indicate the existence and quantity of the tested miRNA. Five femtomoles of the synthetic RNA could be detected. In 20-100 μg RNA samples purified from SiHa cells, the assay measured hsa-miR-16 and hsa-miR-21 at 0.34 fmol/μg RNA and 0.71 fmol/μg RNA, respectively. This simple and inexpensive assay takes less than 5 min after total RNA purification and preparation. The quantification is not affected by pre-miRNA, which cannot serve as the primer for DNA synthesis in this assay. The assay is general for the detection of any target RNA or DNA with a known matched DNA template probe, and could therefore be widely used for the detection of small RNAs, messenger RNAs, RNA viruses, and DNA.
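
    A back-of-envelope sketch of the readout logic, assuming (as in pyrosequencing) that one pyrophosphate molecule is released per nucleotide incorporated, so measured PPi divided by the known extension length of the template probe gives the amount of primer, i.e., of the target miRNA; the numbers are invented:

        # Stoichiometric back-calculation from pyrophosphate to miRNA (sketch).
        def mirna_fmol(ppi_fmol, extension_length_nt):
            # one PPi per incorporated nucleotide along the template probe
            return ppi_fmol / extension_length_nt

        print(mirna_fmol(ppi_fmol=170.0, extension_length_nt=50))  # -> 3.4 fmol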

  14. MAMA Software Features: Quantification Verification Documentation-1

    SciTech Connect

    Ruggiero, Christy E.; Porter, Reid B.

    2014-05-21

    This document reviews the verification of the basic shape quantification attributes in the MAMA software against hand calculations in order to show that the calculations are implemented mathematically correctly and give the expected quantification results.

  15. Absolute quantification of myocardial blood flow.

    PubMed

    Yoshinaga, Keiichiro; Manabe, Osamu; Tamaki, Nagara

    2016-07-21

    With the increasing availability of positron emission tomography (PET) myocardial perfusion imaging, the absolute quantification of myocardial blood flow (MBF) has become popular in clinical settings. Quantitative MBF provides important additional diagnostic or prognostic information beyond conventional visual assessment. The success of MBF quantification using PET/computed tomography (CT) has increased the demand for this quantitative diagnostic approach to be more accessible. In this regard, MBF quantification approaches have been developed using several other diagnostic imaging modalities, including single-photon emission computed tomography, CT, and cardiac magnetic resonance. This review addresses the clinical aspects of PET MBF quantification and the new approaches to MBF quantification.

  16. Statistical image quantification toward optimal scan fusion and change quantification

    NASA Astrophysics Data System (ADS)

    Potesil, Vaclav; Zhou, Xiang Sean

    2007-03-01

    Recent advances in imaging technology have brought new challenges and opportunities for automatic and quantitative analysis of medical images. With broader accessibility of more imaging modalities for more patients, fusion of modalities/scans from one time point and longitudinal analysis of changes across time points have become the two most critical differentiators to support more informed, more reliable and more reproducible diagnosis and therapy decisions. Unfortunately, scan fusion and longitudinal analysis are both inherently plagued by increased levels of statistical error. A lack of comprehensive analysis by imaging scientists and a lack of full awareness by physicians pose potential risks in clinical practice. In this paper, we discuss several key error factors affecting imaging quantification, study their interactions, and introduce a simulation strategy to establish general error bounds for change quantification across time. We quantitatively show that image resolution, voxel anisotropy, lesion size, eccentricity, and orientation all contribute to quantification error, and that there is an intricate relationship between voxel anisotropy and lesion shape in affecting quantification error. Specifically, when two or more scans are to be fused at the feature level, optimal linear fusion analysis reveals that scans with voxel anisotropy aligned with lesion elongation should receive a higher weight than other scans. As a result of such optimal linear fusion, we achieve a lower variance than with naïve averaging. Simulated experiments are used to validate the theoretical predictions. Future work based on the proposed simulation methods may lead to general guidelines and error lower bounds for quantitative image analysis and change detection.
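
    The fusion principle invoked here is classical inverse-variance weighting: when two scans measure the same quantity with different error variances, weighting each by the reciprocal of its variance minimizes the variance of the fused estimate. A generic numerical check (not the paper's full anisotropy model):

        # Inverse-variance fusion vs. naive averaging (generic statistics sketch).
        import numpy as np

        rng = np.random.default_rng(1)
        truth = 10.0
        var_a, var_b = 1.0, 4.0                      # scan A is more precise
        a = truth + rng.normal(0, np.sqrt(var_a), 100000)
        b = truth + rng.normal(0, np.sqrt(var_b), 100000)

        w_a = (1 / var_a) / (1 / var_a + 1 / var_b)  # optimal weight for scan A
        fused = w_a * a + (1 - w_a) * b
        naive = 0.5 * (a + b)

        print(fused.var(), naive.var())  # ~0.80 vs ~1.25: fusion beats averaging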

  17. MODEL VALIDATION AND UNCERTAINTY QUANTIFICATION.

    SciTech Connect

    Hemez, F.M.; Doebling, S.W.

    2000-10-01

    This session offers an open forum to discuss issues and directions of research in the areas of model updating, predictive quality of computer simulations, model validation and uncertainty quantification. Technical presentations review the state of the art in nonlinear dynamics and model validation for structural dynamics. A panel discussion introduces the discussion on technology needs, future trends and challenges ahead, with an emphasis placed on soliciting participation from the audience. One of the goals is to show, through invited contributions, how other scientific communities are approaching and solving difficulties similar to those encountered in structural dynamics. The session also serves the purpose of presenting the ongoing organization of technical meetings sponsored by the U.S. Department of Energy and dedicated to health monitoring, damage prognosis, model validation and uncertainty quantification in engineering applications. The session is part of the SD-2000 Forum, a forum to identify research trends and funding opportunities and to discuss the future of structural dynamics.

  18. Uncertainty Quantification for Airfoil Icing

    NASA Astrophysics Data System (ADS)

    DeGennaro, Anthony Matteo

    Ensuring the safety of airplane flight in icing conditions is an important and active arena of research in the aerospace community. Notwithstanding the research, development, and legislation aimed at certifying airplanes for safe operation, an analysis of the effects of icing uncertainties on certification quantities of interest is generally lacking. The central objective of this thesis is to examine and analyze problems in airfoil ice accretion from the standpoint of uncertainty quantification. We focus on three distinct areas: user-informed, data-driven, and computational uncertainty quantification. In the user-informed approach to uncertainty quantification, we discuss important canonical icing classifications and show how these categories can be modeled using a few shape parameters. We then investigate the statistical effects of these parameters. In the data-driven approach, we build statistical models of airfoil ice shapes from databases of actual ice shapes, and quantify the effects of these parameters. Finally, in the computational approach, we investigate the effects of uncertainty in the physics of the ice accretion process, by perturbing the input to an in-house numerical ice accretion code that we develop in this thesis.

  19. Hemispheric specialization in quantification processes.

    PubMed

    Pasini, M; Tessari, A

    2001-01-01

    Three experiments were carried out to study hemispheric specialization for subitizing (the rapid enumeration of small patterns) and counting (the serial quantification process based on formal principles). The experiments involved numerosity identification of dot patterns presented to one visual field, either with a tachistoscopic technique or with eye movements monitored through glasses, and comparison between centrally presented dot patterns and lateralized, tachistoscopically presented digits. Our experiments show a left visual field advantage in the identification and comparison tasks in the subitizing range, whereas a right visual field advantage was found in the comparison task for the counting range.

  20. Damage quantification in confined ceramics

    SciTech Connect

    Xu Yueping; Espinosa, Horacio D.

    1998-07-10

    Impact recovery experiments on confined ceramic rods and multi-layer ceramic targets are performed for failure identification and damage quantification. In-material stress measurements are made with manganin gauges, and velocity histories are recorded with interferometric techniques. Observations on recovered samples are made through optical microscopy. Microscopy results show that microcracking is the dominant failure mode in ceramic rods and multi-layer ceramic targets. Macrocrack surface per unit area is estimated on various sections along several orientations. A correlation between dynamic loading and crack density is established. Moreover, multiple penetrator defeat is observed in ceramic targets recovered from penetration experiments.

  1. Quantification of wastewater sludge dewatering.

    PubMed

    Skinner, Samuel J; Studer, Lindsay J; Dixon, David R; Hillis, Peter; Rees, Catherine A; Wall, Rachael C; Cavalida, Raul G; Usher, Shane P; Stickland, Anthony D; Scales, Peter J

    2015-10-01

    Quantification and comparison of the dewatering characteristics of fifteen sewage sludges from a range of digestion scenarios are described. The proposed method uses laboratory dewatering measurements and integrity analysis of the extracted material properties. These properties were used as inputs to a model of filtration, the output of which provides the dewatering comparison. This method is shown to be necessary for quantification and comparison of dewaterability, as the permeability and compressibility of the sludges vary by up to ten orders of magnitude in the range of solids concentrations of interest to industry. This causes a high sensitivity of the dewaterability comparison to the starting concentration of laboratory tests, making simple dewaterability comparisons based on parameters such as the specific resistance to filtration difficult. The new approach is demonstrated to be robust relative to traditional methods such as specific resistance to filtration analysis and has a built-in integrity check. Comparison of the quantified dewaterability of the fifteen sludges to their relative volatile solids content showed a very strong correlation in the volatile solids range from 40 to 80%. The data indicate that the volatile solids parameter is a strong indicator of the dewatering behaviour of sewage sludges. Copyright © 2015 Elsevier Ltd. All rights reserved.
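
    For context, the conventional specific-resistance-to-filtration analysis that the study improves upon is commonly based on the constant-pressure filtration equation, sketched here in a standard textbook form (not reproduced from the paper):

        % Constant-pressure filtration: t/V is linear in V, and the slope
        % yields the specific cake resistance alpha.
        \frac{t}{V} = \frac{\mu\,\alpha\,c}{2\,\Delta P\,A^{2}}\,V + \frac{\mu\,R_m}{\Delta P\,A}
        % t: filtration time, V: filtrate volume, mu: filtrate viscosity,
        % c: solids deposited per unit filtrate volume, A: filter area,
        % Delta P: applied pressure, R_m: filter medium resistance.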

  2. Detection and Quantification of Neurotransmitters in Dialysates

    PubMed Central

    Zapata, Agustin; Chefer, Vladimir I.; Shippenberg, Toni S.; Denoroy, Luc

    2010-01-01

    Sensitive analytical methods are needed for the separation and quantification of neurotransmitters obtained in microdialysate studies. This unit describes methods that permit quantification of nanomolar concentrations of monoamines and their metabolites (high-pressure liquid chromatography electrochemical detection), acetylcholine (HPLC-coupled to an enzyme reactor), and amino acids (HPLC-fluorescence detection; capillary electrophoresis with laser-induced fluorescence detection). PMID:19575473

  3. Processing and domain selection: Quantificational variability effects

    PubMed Central

    Harris, Jesse A.; Clifton, Charles; Frazier, Lyn

    2014-01-01

    Three studies investigated how readers interpret sentences with variable quantificational domains, e.g., The army was mostly in the capital, where mostly may quantify over individuals or parts (Most of the army was in the capital) or over times (The army was in the capital most of the time). It is proposed that a general conceptual economy principle, No Extra Times (Majewski 2006, in preparation), discourages the postulation of potentially unnecessary times, and thus favors the interpretation quantifying over parts. Disambiguating an ambiguously quantified sentence to a quantification over times interpretation was rated as less natural than disambiguating it to a quantification over parts interpretation (Experiment 1). In an interpretation questionnaire, sentences with similar quantificational variability were constructed so that both interpretations of the sentence would require postulating multiple times; this resulted in the elimination of the preference for a quantification over parts interpretation, suggesting the parts preference observed in Experiment 1 is not reducible to a lexical bias of the adverb mostly (Experiment 2). An eye movement recording study showed that, in the absence of prior evidence for multiple times, readers exhibit greater difficulty when reading material that forces a quantification over times interpretation than when reading material that allows a quantification over parts interpretation (Experiment 3). These experiments contribute to understanding readers’ default assumptions about the temporal properties of sentences, which is essential for understanding the selection of a domain for adverbial quantifiers and, more generally, for understanding how situational constraints influence sentence processing. PMID:25328262

  4. Advancing agricultural greenhouse gas quantification*

    NASA Astrophysics Data System (ADS)

    Olander, Lydia; Wollenberg, Eva; Tubiello, Francesco; Herold, Martin

    2013-03-01

    1. Introduction

    Better information on greenhouse gas (GHG) emissions and mitigation potential in the agricultural sector is necessary to manage these emissions and identify responses that are consistent with the food security and economic development priorities of countries. Critical activity data (what crops or livestock are managed in what way) are poor or lacking for many agricultural systems, especially in developing countries. In addition, the currently available methods for quantifying emissions and mitigation are often too expensive or complex or not sufficiently user friendly for widespread use. The purpose of this focus issue is to capture the state of the art in quantifying greenhouse gases from agricultural systems, with the goal of better understanding our current capabilities and near-term potential for improvement, with particular attention to quantification issues relevant to smallholders in developing countries. This work is timely in light of international discussions and negotiations around how agriculture should be included in efforts to reduce and adapt to climate change impacts, and considering that significant climate financing to developing countries in post-2012 agreements may be linked to their increased ability to identify and report GHG emissions (Murphy et al 2010, CCAFS 2011, FAO 2011).

    2. Agriculture and climate change mitigation

    The main agricultural GHGs—methane and nitrous oxide—account for 10%-12% of anthropogenic emissions globally (Smith et al 2008), or around 50% and 60% of total anthropogenic methane and nitrous oxide emissions, respectively, in 2005. Net carbon dioxide fluxes between agricultural land and the atmosphere linked to food production are relatively small, although significant carbon emissions are associated with degradation of organic soils for plantations in tropical regions (Smith et al 2007, FAO 2012). Population growth and shifts in dietary patterns toward more meat and dairy consumption will lead to

  5. Error models for uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Josset, L.; Scheidt, C.; Lunati, I.

    2012-12-01

    In groundwater modeling, uncertainty on the permeability field leads to a stochastic description of the aquifer system, in which the quantities of interest (e.g., groundwater fluxes or contaminant concentrations) are considered stochastic variables and described by their probability density functions (PDF) or by a finite number of quantiles. Uncertainty quantification is often evaluated using Monte Carlo simulations, which employ a large number of realizations. As this leads to prohibitive computational costs, techniques have to be developed to keep the problem computationally tractable. The Distance-based Kernel Method (DKM) [1] limits the computational cost of the uncertainty quantification by reducing the stochastic space: first, the realizations are clustered based on the response of a proxy; then, the full model is solved only for a subset of realizations defined by the clustering, and the quantiles are estimated from this limited number of realizations. Here, we present a slightly different strategy that employs an approximate model rather than a proxy: we use the Multiscale Finite Volume method (MsFV) [2,3] to compute an approximate solution for each realization, and to obtain a first assessment of the PDF. In this context, DKM is then used to identify a subset of realizations for which the exact model is solved and compared with the solution of the approximate model. This allows highlighting and correcting possible errors introduced by the approximate model, while keeping full statistical information on the ensemble of realizations. Here, we test several strategies to compute the model error, correct the approximate model and achieve an optimal PDF estimation. We present a case study in which we predict the breakthrough curve of an ideal tracer for an ensemble of realizations generated via Multiple Point Direct Sampling [4] with a training image obtained from a 2D section of the Herten permeability field [5]. [1] C. Scheidt and J. Caers, "Representing
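
    A generic sketch of the selection idea, under the simplifying assumption that plain k-means on the proxy responses stands in for the distance/kernel machinery of DKM: cluster the cheap proxy outputs, then run the expensive model only on one representative realization per cluster:

        # Cluster proxy responses and pick one representative per cluster (sketch).
        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(2)
        proxy = rng.random((500, 20))      # proxy response vector per realization

        km = KMeans(n_clusters=10, n_init=10, random_state=0).fit(proxy)
        # the realization closest to each cluster centre represents that cluster
        reps = [int(np.argmin(((proxy - c) ** 2).sum(axis=1)))
                for c in km.cluster_centers_]
        print("run the full model only for realizations:", sorted(reps))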

  6. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

  7. Direct qPCR quantification using the Quantifiler® Trio DNA quantification kit.

    PubMed

    Liu, Jason Yingjie

    2014-11-01

    The effectiveness of a direct quantification assay is essential to the adoption of the combined direct quantification/direct STR workflow. In this paper, the feasibility of using the Quantifiler® Trio DNA quantification kit for the direct quantification of forensic casework samples was investigated. Both low-level touch DNA samples and blood samples were collected on PE swabs and quantified directly. The increased sensitivity of the Quantifiler® Trio kit enabled the detection of less than 10 pg of DNA in unprocessed touch samples and also minimized the stochastic effect experienced by different targets in the same sample. The DNA quantity information obtained from a direct quantification assay using the Quantifiler® Trio kit can also be used to accurately estimate the optimal input DNA quantity for a direct STR amplification reaction. The correlation between the direct quantification results (Quantifiler® Trio kit) and the direct STR results (GlobalFiler™ PCR amplification kit(*)) for low-level touch DNA samples indicates that direct quantification using the Quantifiler® Trio DNA quantification kit is more reliable than the Quantifiler® Duo DNA quantification kit for predicting the STR results of unprocessed touch DNA samples containing less than 10 pg of DNA.
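
    For readers unfamiliar with qPCR quantification in general, quantity is read off a log-linear calibration of quantification cycle (Cq) against known standards. The sketch below is generic standard-curve math, not anything specific to the Quantifiler Trio chemistry, and all numbers are invented:

        # Generic qPCR standard curve: Cq is linear in log10(quantity).
        import numpy as np

        std_qty = np.array([50.0, 5.0, 0.5, 0.05])    # ng, serial dilution
        std_cq = np.array([24.1, 27.4, 30.8, 34.2])   # hypothetical measurements

        slope, intercept = np.polyfit(np.log10(std_qty), std_cq, 1)
        efficiency = 10 ** (-1 / slope) - 1           # ~1.0 means 100% efficient

        def quantity_ng(cq):
            return 10 ** ((cq - intercept) / slope)

        print(f"E = {efficiency:.0%}, unknown at Cq 29.0 -> {quantity_ng(29.0):.2f} ng")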

  8. MAMA Software Features: Visual Examples of Quantification

    SciTech Connect

    Ruggiero, Christy E.; Porter, Reid B.

    2014-05-20

    This document shows examples of the results from quantifying objects of certain sizes and types in the software. It is intended to give users a better feel for some of the quantification calculations, and, more importantly, to help users understand the challenges with using a small set of ‘shape’ quantification calculations for objects that can vary widely in shapes and features. We will add more examples to this in the coming year.

  9. Accessible quantification of multiparticle entanglement

    NASA Astrophysics Data System (ADS)

    Cianciaruso, Marco; Bromley, Thomas R.; Adesso, Gerardo

    2016-10-01

    Entanglement is a key ingredient for quantum technologies and a fundamental signature of quantumness in a broad range of phenomena encompassing many-body physics, thermodynamics, cosmology and life sciences. For arbitrary multiparticle systems, entanglement quantification typically involves nontrivial optimisation problems, and it may require demanding tomographical techniques. Here, we develop an experimentally feasible approach to the evaluation of geometric measures of multiparticle entanglement. Our framework provides analytical results for particular classes of mixed states of N qubits, and computable lower bounds to global, partial, or genuine multiparticle entanglement of any general state. For global and partial entanglement, useful bounds are obtained with minimum effort, requiring local measurements in just three settings for any N. For genuine entanglement, a number of measurements scaling linearly with N are required. We demonstrate the power of our approach to estimate and quantify different types of multiparticle entanglement in a variety of N-qubit states useful for quantum information processing and recently engineered in laboratories with quantum optics and trapped ion setups.

  10. Stirling Convertor Fasteners Reliability Quantification

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Korovaichuk, Igor; Kovacevich, Tiodor; Schreiber, Jeffrey G.

    2006-01-01

    Onboard Radioisotope Power Systems (RPS) being developed for NASA's deep-space science and exploration missions require reliable operation for up to 14 years and beyond. Stirling power conversion is a candidate for use in an RPS because it offers a multifold increase in the conversion efficiency of heat to electric power and a reduced inventory of radioactive material. Structural fasteners are responsible for maintaining the structural integrity of the Stirling power convertor, which is critical to ensure reliable performance during the entire mission. The design of fasteners involves variables related to fabrication, manufacturing, the behavior of the fastener and joining-part materials, the structural geometry of the joined components, the size and spacing of fasteners, mission loads, boundary conditions, etc. These variables have inherent uncertainties, which need to be accounted for in the reliability assessment. This paper describes these uncertainties along with a methodology to quantify reliability, and provides results of the analysis in terms of quantified reliability and the sensitivity of Stirling power conversion reliability to the design variables. Quantification of the reliability includes both structural and functional aspects of the joining components. Based on the results, the paper also describes guidelines to improve reliability and verification testing.

  11. Precise quantification of nanoparticle internalization.

    PubMed

    Gottstein, Claudia; Wu, Guohui; Wong, Benjamin J; Zasadzinski, Joseph Anthony

    2013-06-25

    Nanoparticles have opened new exciting avenues for both diagnostic and therapeutic applications in human disease, and targeted nanoparticles are increasingly used as specific drug delivery vehicles. The precise quantification of nanoparticle internalization is of importance to measure the impact of physical and chemical properties on the uptake of nanoparticles into target cells or into cells responsible for rapid clearance. Internalization of nanoparticles has been measured by various techniques, but comparability of data between different laboratories is impeded by lack of a generally accepted standardized assay. Furthermore, the distinction between associated and internalized particles has been a challenge for many years, although this distinction is critical for most research questions. Previously used methods to verify intracellular location are typically not quantitative and do not lend themselves to high-throughput analysis. Here, we developed a mathematical model which integrates the data from high-throughput flow cytometry measurements with data from quantitative confocal microscopy. The generic method described here will be a useful tool in biomedical nanotechnology studies. The method was then applied to measure the impact of surface coatings of vesosomes on their internalization by cells of the reticuloendothelial system (RES). RES cells are responsible for rapid clearance of nanoparticles, and the resulting fast blood clearance is one of the major challenges in biomedical applications of nanoparticles. Coating of vesosomes with long chain polyethylene glycol showed a trend for lower internalization by RES cells.

  12. Uncertainty Quantification in Climate Modeling

    NASA Astrophysics Data System (ADS)

    Sargsyan, K.; Safta, C.; Berry, R.; Debusschere, B.; Najm, H.

    2011-12-01

    We address challenges that sensitivity analysis and uncertainty quantification methods face when dealing with complex computational models. In particular, climate models are computationally expensive and typically depend on a large number of input parameters. We consider the Community Land Model (CLM), which consists of a nested computational grid hierarchy designed to represent the spatial heterogeneity of the land surface. Each computational cell can be composed of multiple land types, and each land type can incorporate one or more sub-models describing the spatial and depth variability. Even for simulations at a regional scale, the computational cost of a single run is quite high and the number of parameters that control the model behavior is very large. Therefore, the parameter sensitivity analysis and uncertainty propagation face significant difficulties for climate models. This work employs several algorithmic avenues to address some of the challenges encountered by classical uncertainty quantification methodologies when dealing with expensive computational models, specifically focusing on the CLM as a primary application. First of all, since the available climate model predictions are extremely sparse due to the high computational cost of model runs, we adopt a Bayesian framework that effectively incorporates this lack-of-knowledge as a source of uncertainty, and produces robust predictions with quantified uncertainty even if the model runs are extremely sparse. In particular, we infer Polynomial Chaos spectral expansions that effectively encode the uncertain input-output relationship and allow efficient propagation of all sources of input uncertainties to outputs of interest. Secondly, the predictability analysis of climate models strongly suffers from the curse of dimensionality, i.e. the large number of input parameters. While single-parameter perturbation studies can be efficiently performed in a parallel fashion, the multivariate uncertainty analysis
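
    A minimal, generic Polynomial Chaos sketch under stated assumptions: one standard-normal input, a cheap analytic stand-in for the expensive model, and ordinary least squares instead of the Bayesian regression described above:

        # Fit Hermite-polynomial (probabilists') PC coefficients to sparse runs.
        import numpy as np
        from numpy.polynomial import hermite_e as He

        rng = np.random.default_rng(3)
        xi = rng.standard_normal(40)       # sparse "model runs" at sampled inputs
        y = np.exp(0.3 * xi) + 0.01 * rng.standard_normal(40)  # model stand-in

        V = He.hermevander(xi, deg=4)      # Hermite basis evaluated at samples
        coeffs, *_ = np.linalg.lstsq(V, y, rcond=None)

        # The 0th PC coefficient approximates the output mean; compare with
        # a large Monte Carlo estimate of the true mean.
        print(coeffs[0], np.exp(0.3 * rng.standard_normal(10**6)).mean())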

  13. Uncertainty quantification for environmental models

    USGS Publications Warehouse

    Hill, Mary C.; Lu, Dan; Kavetski, Dmitri; Clark, Martyn P.; Ye, Ming

    2012-01-01

    Environmental models are used to evaluate the fate of fertilizers in agricultural settings (including soil denitrification), the degradation of hydrocarbons at spill sites, and water supply for people and ecosystems in small to large basins and cities—to mention but a few applications of these models. They also play a role in understanding and diagnosing potential environmental impacts of global climate change. The models are typically mildly to extremely nonlinear. The persistent demand for enhanced dynamics and resolution to improve model realism [17] means that lengthy individual model execution times will remain common, notwithstanding continued enhancements in computer power. In addition, high-dimensional parameter spaces are often defined, which increases the number of model runs required to quantify uncertainty [2]. Some environmental modeling projects have access to extensive funding and computational resources; many do not. The many recent studies of uncertainty quantification in environmental model predictions have focused on uncertainties related to data error and sparsity of data, expert judgment expressed mathematically through prior information, poorly known parameter values, and model structure (see, for example, [1,7,9,10,13,18]). Approaches for quantifying uncertainty include frequentist (potentially with prior information [7,9]), Bayesian [13,18,19], and likelihood-based. A few of the numerous methods, including some sensitivity and inverse methods with consequences for understanding and quantifying uncertainty, are as follows: Bayesian hierarchical modeling and Bayesian model averaging; single-objective optimization with error-based weighting [7] and multi-objective optimization [3]; methods based on local derivatives [2,7,10]; screening methods like OAT (one at a time) and the method of Morris [14]; FAST (Fourier amplitude sensitivity testing) [14]; the Sobol' method [14]; randomized maximum likelihood [10]; Markov chain Monte Carlo (MCMC) [10

  14. Uncertainty quantification in molecular dynamics

    NASA Astrophysics Data System (ADS)

    Rizzi, Francesco

    This dissertation focuses on uncertainty quantification (UQ) in molecular dynamics (MD) simulations. The application of UQ to molecular dynamics is motivated by the broad uncertainty characterizing MD potential functions and by the complexity of the MD setting, where even small uncertainties can be amplified to yield large uncertainties in the model predictions. Two fundamental, distinct sources of uncertainty are investigated in this work, namely parametric uncertainty and intrinsic noise. Intrinsic noise is inherently present in the MD setting, due to fluctuations originating from thermal effects. Averaging methods can be exploited to reduce the fluctuations, but due to finite sampling, this effect cannot be completely filtered, thus yielding a residual uncertainty in the MD predictions. Parametric uncertainty, on the contrary, is introduced in the form of uncertain potential parameters, geometry, and/or boundary conditions. We address the UQ problem in both its main components, namely the forward propagation, which aims at characterizing how uncertainty in model parameters affects selected observables, and the inverse problem, which involves the estimation of target model parameters based on a set of observations. The dissertation highlights the challenges arising when parametric uncertainty and intrinsic noise combine to yield non-deterministic, noisy MD predictions of target macroscale observables. Two key probabilistic UQ methods, namely Polynomial Chaos (PC) expansions and Bayesian inference, are exploited to develop a framework that enables one to isolate the impact of parametric uncertainty on the MD predictions and, at the same time, properly quantify the effect of the intrinsic noise. Systematic applications to a suite of problems of increasing complexity lead to the observation that an uncertain PC representation built via Bayesian regression is the most suitable model for the representation of uncertain MD predictions of target observables in the

  15. Separation and quantification of microalgal carbohydrates.

    PubMed

    Templeton, David W; Quinn, Matthew; Van Wychen, Stefanie; Hyman, Deborah; Laurens, Lieve M L

    2012-12-28

    Structural carbohydrates can constitute a large fraction of the dry weight of algal biomass, and thus accurate identification and quantification are important for summative mass closure. Two limitations to the accurate characterization of microalgal carbohydrates are the lack of a robust analytical procedure to hydrolyze polymeric carbohydrates to their respective monomers and the subsequent identification and quantification of those monosaccharides. We address the second limitation, chromatographic separation of monosaccharides, here by identifying optimum conditions for the resolution of a synthetic mixture of 13 microalgae-specific monosaccharides, comprising 8 neutral sugars, 2 amino sugars, 2 uronic acids and 1 alditol (myo-inositol as an internal standard). The synthetic 13-carbohydrate mix showed incomplete resolution across 11 traditional high performance liquid chromatography (HPLC) methods, but showed improved resolution and accurate quantification using high performance anion exchange chromatography (HPAEC), as well as alditol acetate derivatization followed by gas chromatography (for the neutral and amino sugars only). We demonstrate the application of monosaccharide quantification using optimized chromatography conditions after sulfuric acid analytical hydrolysis for three model algae strains, and compare the quantification and complexity of monosaccharides in analytical hydrolysates relative to a typical terrestrial feedstock, sugarcane bagasse.

  16. Carotid intraplaque neovascularization quantification software (CINQS).

    PubMed

    Akkus, Zeynettin; van Burken, Gerard; van den Oord, Stijn C H; Schinkel, Arend F L; de Jong, Nico; van der Steen, Antonius F W; Bosch, Johan G

    2015-01-01

    Intraplaque neovascularization (IPN) is an important biomarker of atherosclerotic plaque vulnerability. As IPN can be detected by contrast-enhanced ultrasound (CEUS), imaging biomarkers derived from CEUS may allow early prediction of plaque vulnerability. To select the best quantitative imaging biomarkers for prediction of plaque vulnerability, a systematic analysis of IPN with existing and new analysis algorithms is necessary. Currently available commercial contrast quantification tools are not applicable to quantitative analysis of carotid IPN due to substantial motion of the carotid artery, artifacts, and intermittent perfusion of plaques. We therefore developed a specialized software package called Carotid Intraplaque Neovascularization Quantification Software (CINQS), designed for effective and systematic comparison of sets of quantitative imaging biomarkers. CINQS includes several analysis algorithms for carotid IPN quantification and overcomes the limitations of current contrast quantification tools and existing carotid IPN quantification approaches. CINQS has a modular design which allows integrating new analysis tools. Wizard-like analysis tools and a graphical user interface facilitate its use. In this paper, we describe the concept, analysis tools, and performance of CINQS and present analysis results for 45 plaques of 23 patients. The results in 45 plaques showed excellent agreement with visual IPN scores for two quantitative imaging biomarkers (areas under the receiver operating characteristic curve of 0.92 and 0.93).

  17. Tumor Quantification in Clinical Positron Emission Tomography

    PubMed Central

    Bai, Bing; Bading, James; Conti, Peter S

    2013-01-01

    Positron emission tomography (PET) is used extensively in clinical oncology for tumor detection, staging and therapy response assessment. Quantitative measurements of tumor uptake, usually in the form of standardized uptake values (SUVs), have enhanced or replaced qualitative interpretation. In this paper we review the current status of tumor quantification methods and their applications to clinical oncology. Factors that impede quantitative assessment and limit its accuracy and reproducibility are summarized, with special emphasis on SUV analysis. We describe current efforts to improve the accuracy of tumor uptake measurements, characterize overall metabolic tumor burden and heterogeneity of tumor uptake, and account for the effects of image noise. We also summarize recent developments in PET instrumentation and image reconstruction and their impact on tumor quantification. Finally, we offer our assessment of the current development needs in PET tumor quantification, including practical techniques for fully quantitative, pharmacokinetic measurements. PMID:24312151
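
    The standardized uptake value mentioned above has a simple, widely used body-weight form: tissue activity concentration divided by injected dose per unit body mass. A sketch with invented numbers (decay correction to a common time point is assumed to have been done):

        # Body-weight SUV: tissue concentration / (injected dose / body mass).
        def suv_bw(tissue_kbq_per_ml, injected_dose_mbq, body_weight_kg):
            # Assume tissue density ~1 g/mL so kBq/mL ~ kBq/g; convert the dose
            # to kBq and the body weight to grams so the units cancel.
            dose_kbq_per_g = (injected_dose_mbq * 1000.0) / (body_weight_kg * 1000.0)
            return tissue_kbq_per_ml / dose_kbq_per_g

        print(suv_bw(tissue_kbq_per_ml=5.3, injected_dose_mbq=370.0,
                     body_weight_kg=70.0))   # -> ~1.0 for this example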

  18. Limitations and challenges of genetic barcode quantification

    PubMed Central

    Thielecke, Lars; Aranyossy, Tim; Dahl, Andreas; Tiwari, Rajiv; Roeder, Ingo; Geiger, Hartmut; Fehse, Boris; Glauche, Ingmar; Cornils, Kerstin

    2017-01-01

    Genetic barcodes are increasingly used to track individual cells and to quantitatively assess their clonal contributions over time. Although barcode quantification relies entirely on counting sequencing reads, detailed studies about the method’s accuracy are still limited. We report on a systematic investigation of the relation between barcode abundance and resulting read counts after amplification and sequencing, using cell mixtures that contain barcodes with known frequencies (“miniBulks”). We evaluated the influence of protocol modifications to identify potential sources of error and elucidate possible limitations of the quantification approach. Based on these findings, we designed an advanced barcode construct (BC32) to improve barcode calling and quantification, and to ensure sensitive detection of even highly diluted barcodes. Our results emphasize the importance of using curated barcode libraries to obtain interpretable quantitative data and underline the need for rigorous analyses of any utilized barcode library in terms of reliability and reproducibility. PMID:28256524

  19. HPC Analytics Support. Requirements for Uncertainty Quantification Benchmarks

    SciTech Connect

    Paulson, Patrick R.; Purohit, Sumit; Rodriguez, Luke R.

    2015-05-01

    This report outlines techniques for extending benchmark generation products so they support uncertainty quantification by benchmarked systems. We describe how uncertainty quantification requirements can be presented to candidate analytical tools supporting SPARQL. We describe benchmark data sets for evaluating uncertainty quantification, as well as an approach for using our benchmark generator to produce such data sets.

  20. Quantification of Cannabinoid Content in Cannabis

    NASA Astrophysics Data System (ADS)

    Tian, Y.; Zhang, F.; Jia, K.; Wen, M.; Yuan, Ch.

    2015-09-01

    Cannabis is an economically important plant that is used in many fields, in addition to being the most commonly consumed illicit drug worldwide. Monitoring the spatial distribution of cannabis cultivation and judging whether it is drug- or fiber-type cannabis is critical for governments and international communities to understand the scale of the illegal drug trade. The aim of this study was to investigate whether the cannabinoids content in cannabis could be spectrally quantified using a spectrometer and to identify the optimal wavebands for quantifying the cannabinoid content. Spectral reflectance data of dried cannabis leaf samples and the cannabis canopy were measured in the laboratory and in the field, respectively. Correlation analysis and the stepwise multivariate regression method were used to select the optimal wavebands for cannabinoid content quantification based on the laboratory-measured spectral data. The results indicated that the delta-9-tetrahydrocannabinol (THC) content in cannabis leaves could be quantified using laboratory-measured spectral reflectance data and that the 695 nm band is the optimal band for THC content quantification. This study provides prerequisite information for designing spectral equipment to enable immediate quantification of THC content in cannabis and to discriminate drug- from fiber-type cannabis based on THC content quantification in the field.
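
    A hedged sketch of the band-based quantification idea: a univariate regression of THC content on reflectance at the reported 695 nm optimum. The data pairs below are invented placeholders, not measurements from the study:

        # Regress THC content on reflectance at 695 nm (illustrative data).
        import numpy as np

        reflectance_695 = np.array([0.21, 0.25, 0.30, 0.34, 0.40, 0.45])
        thc_percent = np.array([3.1, 2.6, 2.2, 1.8, 1.2, 0.8])  # lab-assayed THC

        slope, intercept = np.polyfit(reflectance_695, thc_percent, 1)

        def predict(r):
            return slope * r + intercept

        print(f"THC = {slope:.1f} * R695 + {intercept:.1f}; "
              f"at R = 0.28 -> {predict(0.28):.1f}%")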

  1. Quantification of fluorescent reporters in plant cells.

    PubMed

    Pound, Michael; French, Andrew P; Wells, Darren M

    2015-01-01

    Fluorescent reporters are powerful tools for plant research. Many studies require accurate determination of fluorescence intensity and localization. Here, we describe protocols for the quantification of fluorescence intensity in plant cells from confocal laser scanning microscope images using semiautomated software and image analysis techniques.
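
    A generic sketch of the intensity measurement itself: mean reporter intensity inside a region-of-interest mask, with the mean of a background region subtracted. This is illustrative only; the protocols referenced above rely on dedicated semiautomated software:

        # Background-corrected mean fluorescence intensity in a ROI (sketch).
        import numpy as np

        def mean_roi_intensity(image, roi_mask, background_mask):
            background = image[background_mask].mean()
            return image[roi_mask].mean() - background

        rng = np.random.default_rng(4)
        img = rng.poisson(100, (256, 256)).astype(float)  # stand-in confocal slice
        roi = np.zeros_like(img, bool); roi[100:140, 100:140] = True
        bg = np.zeros_like(img, bool); bg[:20, :20] = True
        print(mean_roi_intensity(img, roi, bg))           # ~0 for uniform noise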

  2. Cues, quantification, and agreement in language comprehension.

    PubMed

    Tanner, Darren; Bulkes, Nyssa Z

    2015-12-01

    We investigated factors that affect the comprehension of subject-verb agreement in English, using quantification as a window into the relationship between morphosyntactic processes in language production and comprehension. Event-related brain potentials (ERPs) were recorded while participants read sentences with grammatical and ungrammatical verbs, in which the plurality of the subject noun phrase was either doubly marked (via overt plural quantification and morphological marking on the noun) or singly marked (via only plural morphology on the noun). Both acceptability judgments and the ERP data showed heightened sensitivity to agreement violations when quantification provided an additional cue to the grammatical number of the subject noun phrase, over and above plural morphology. This is consistent with models of grammatical comprehension that emphasize feature prediction in tandem with cue-based memory retrieval. Our results additionally contrast with those of prior studies that showed no effects of plural quantification on agreement in language production. These findings therefore highlight some nontrivial divergences in the cues and mechanisms supporting morphosyntactic processing in language production and comprehension.

  3. DOSCATs: Double standards for protein quantification

    PubMed Central

    Bennett, Richard J.; Simpson, Deborah M.; Holman, Stephen W.; Ryan, Sheila; Brownridge, Philip; Eyers, Claire E.; Colyer, John; Beynon, Robert J.

    2017-01-01

    The two most common techniques for absolute protein quantification are based on either mass spectrometry (MS) or on immunochemical techniques, such as western blotting (WB). Western blotting is most often used for protein identification or relative quantification, but can also be deployed for absolute quantification if appropriate calibration standards are used. MS-based techniques offer superior data quality and reproducibility, but WB offers greater sensitivity and accessibility to most researchers. It would be advantageous to apply both techniques for orthogonal quantification, but workflows rarely overlap. We describe DOSCATs (DOuble Standard conCATamers), novel calibration standards based on QconCAT technology, to unite these platforms. DOSCATs combine a series of epitope sequences concatenated with tryptic peptides in a single artificial protein to create internal tryptic peptide standards for MS as well as an intact protein bearing multiple linear epitopes. A DOSCAT protein was designed and constructed to quantify five proteins of the NF-κB pathway. For three target proteins, protein fold change and absolute copies-per-cell values measured by MS and WB were in excellent agreement. This demonstrates that DOSCATs can be used as multiplexed, dual-purpose standards, readily deployed in a single workflow, supporting a seamless quantitative transition from MS to WB. PMID:28368040

  4. Colour thresholding and objective quantification in bioimaging

    NASA Technical Reports Server (NTRS)

    Fermin, C. D.; Gerber, M. A.; Torre-Bueno, J. R.

    1992-01-01

    Computer imaging is rapidly becoming an indispensable tool for the quantification of variables in research and medicine. Whilst its use in medicine has largely been limited to qualitative observations, imaging in applied basic sciences, medical research and biotechnology demands objective quantification of the variables in question. In black and white densitometry (0-256 levels of intensity) the separation of subtle differences between closely related hues from stains is sometimes very difficult. True-colour and real-time video microscopy analysis offer choices not previously available with monochrome systems. In this paper we demonstrate the usefulness of colour thresholding, which has so far proven indispensable for proper objective quantification of the products of histochemical reactions and/or subtle differences in tissue and cells. In addition, we provide interested, but untrained readers with basic information that may assist decisions regarding the most suitable set-up for a project under consideration. Data from projects in progress at Tulane are shown to illustrate the advantage of colour thresholding over monochrome densitometry and for objective quantification of subtle colour differences between experimental and control samples.
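
    The advantage of colour thresholding over monochrome densitometry is that pixels are selected by hue rather than by intensity alone, so stains of similar darkness but different colour can be separated. A minimal sketch using scikit-image (an assumed tooling choice, not the system described in the paper):

      import numpy as np
      from skimage.color import rgb2hsv

      def hue_threshold(rgb_image, hue_range, min_saturation=0.2):
          # Select pixels whose hue falls inside hue_range (0-1 scale)
          # and that are saturated enough to carry colour information.
          hsv = rgb2hsv(rgb_image)
          h, s = hsv[..., 0], hsv[..., 1]
          mask = (h >= hue_range[0]) & (h <= hue_range[1]) & (s >= min_saturation)
          return mask, float(mask.mean())   # mask and stained-area fraction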

  6. Uncertainty quantification for porous media flows

    SciTech Connect

    Christie, Mike (E-mail: mike.christie@pet.hw.ac.uk); Demyanov, Vasily; Erbas, Demet

    2006-09-01

    Uncertainty quantification is an increasingly important aspect of many areas of computational science, where the challenge is to make reliable predictions about the performance of complex physical systems in the absence of complete or reliable data. Predicting flows of oil and water through oil reservoirs is an example of a complex system where accuracy in prediction is needed primarily for financial reasons. Simulation of fluid flow in oil reservoirs is usually carried out using large commercially written finite difference simulators solving conservation equations describing the multi-phase flow through the porous reservoir rocks. This paper examines a Bayesian Framework for uncertainty quantification in porous media flows that uses a stochastic sampling algorithm to generate models that match observed data. Machine learning algorithms are used to speed up the identification of regions in parameter space where good matches to observed data can be found.
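
    The sampling step of such a Bayesian framework can be illustrated with a plain Metropolis sampler: propose perturbed reservoir parameters, run the flow simulator, and accept or reject on the data misfit. A minimal sketch with a hypothetical simulate() stand-in for the reservoir simulator (the paper's machine-learning speed-up is omitted):

      import numpy as np

      rng = np.random.default_rng(0)

      def log_likelihood(params, simulate, observed, sigma):
          # Gaussian misfit between simulated and observed production data.
          misfit = simulate(params) - observed
          return -0.5 * np.sum((misfit / sigma) ** 2)

      def metropolis(simulate, observed, sigma, x0, steps=5000, scale=0.1):
          x = np.asarray(x0, dtype=float)
          lp = log_likelihood(x, simulate, observed, sigma)
          samples = []
          for _ in range(steps):
              cand = x + scale * rng.standard_normal(x.size)
              lp_cand = log_likelihood(cand, simulate, observed, sigma)
              if np.log(rng.random()) < lp_cand - lp:   # accept/reject
                  x, lp = cand, lp_cand
              samples.append(x.copy())
          return np.array(samples)   # ensemble of history-matched models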

  7. Whitepaper on Uncertainty Quantification for MPACT

    SciTech Connect

    Williams, Mark L.

    2015-12-17

    The MPACT code provides the ability to perform high-fidelity deterministic calculations to obtain a wide variety of detailed results for very complex reactor core models. However, MPACT currently does not have the capability to propagate the effects of input data uncertainties to provide uncertainties in the calculated results. This white paper discusses a potential method for MPACT uncertainty quantification (UQ) based on stochastic sampling.
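
    Stochastic-sampling UQ of the kind proposed here is conceptually simple: draw input-data realizations from their covariance, run the code on each, and read uncertainties off the output sample. A minimal sketch with a generic model() placeholder standing in for an MPACT calculation (illustrative only):

      import numpy as np

      rng = np.random.default_rng(1)

      def sampling_uq(model, mean_inputs, cov, n=200):
          # Draw correlated input realizations (e.g. nuclear data),
          # propagate each through the code, and summarize the output.
          draws = rng.multivariate_normal(mean_inputs, cov, size=n)
          outputs = np.array([model(d) for d in draws])
          return outputs.mean(), outputs.std(ddof=1)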

  8. Quantification of Amyloid Precursor Protein Isoforms Using Quantification Concatamer Internal Standard

    PubMed Central

    Chen, Junjun; Wang, Meiyao; Turko, Illarion V.

    2014-01-01

    It is likely that expression and/or post-translational generation of various protein isoforms can be indicative of initial pathological changes or pathology development. However, selective quantification of individual protein isoforms remains a challenge, because they simultaneously possess common and unique amino acid sequences. Quantification concatamer (QconCAT) internal standards were originally designed for large-scale proteome quantification and are artificial proteins that are concatamers of tryptic peptides for several proteins. We developed a QconCAT for quantification of various isoforms of amyloid precursor protein (APP). APP-QconCAT includes tryptic peptides that are common for all isoforms of APP concatenated with those tryptic peptides that are unique for specific APP isoforms. Isotope-labeled APP-QconCAT was expressed, purified, characterized, and further used for quantification of total APP, APP695, and amyloid-β (Aβ) in the human frontal cortex from control and severe Alzheimer’s disease donors. Potential biological implications of our quantitative measurements are discussed. It is also expected that using APP-QconCAT(s) will advance our understanding of the biological mechanisms by which various APP isoforms are involved in the pathogenesis of Alzheimer’s disease. PMID:23186391

  9. Quantification of amyloid precursor protein isoforms using quantification concatamer internal standard.

    PubMed

    Chen, Junjun; Wang, Meiyao; Turko, Illarion V

    2013-01-02

    It is likely that expression and/or post-translational generation of various protein isoforms can be indicative of initial pathological changes or pathology development. However, selective quantification of individual protein isoforms remains a challenge, because they simultaneously possess common and unique amino acid sequences. Quantification concatamer (QconCAT) internal standards were originally designed for large-scale proteome quantification and are artificial proteins that are concatamers of tryptic peptides for several proteins. We developed a QconCAT for quantification of various isoforms of amyloid precursor protein (APP). APP-QconCAT includes tryptic peptides that are common for all isoforms of APP concatenated with those tryptic peptides that are unique for specific APP isoforms. Isotope-labeled APP-QconCAT was expressed, purified, characterized, and further used for quantification of total APP, APP695, and amyloid-β (Aβ) in the human frontal cortex from control and severe Alzheimer's disease donors. Potential biological implications of our quantitative measurements are discussed. It is also expected that using APP-QconCAT(s) will advance our understanding of the biological mechanisms by which various APP isoforms are involved in the pathogenesis of Alzheimer's disease.

  10. Phylogenetic Quantification of Intra-tumour Heterogeneity

    PubMed Central

    Schwarz, Roland F.; Trinh, Anne; Sipos, Botond; Brenton, James D.; Goldman, Nick; Markowetz, Florian

    2014-01-01

    Intra-tumour genetic heterogeneity is the result of ongoing evolutionary change within each cancer. The expansion of genetically distinct sub-clonal populations may explain the emergence of drug resistance, and if so, would have prognostic and predictive utility. However, methods for objectively quantifying tumour heterogeneity have been missing and are particularly difficult to establish in cancers where predominant copy number variation prevents accurate phylogenetic reconstruction owing to horizontal dependencies caused by long and cascading genomic rearrangements. To address these challenges, we present MEDICC, a method for phylogenetic reconstruction and heterogeneity quantification based on a Minimum Event Distance for Intra-tumour Copy-number Comparisons. Using a transducer-based pairwise comparison function, we determine optimal phasing of major and minor alleles, as well as evolutionary distances between samples, and are able to reconstruct ancestral genomes. Rigorous simulations and an extensive clinical study show the power of our method, which outperforms state-of-the-art competitors in reconstruction accuracy, and additionally allows unbiased numerical quantification of tumour heterogeneity. Accurate quantification and evolutionary inference are essential to understand the functional consequences of tumour heterogeneity. The MEDICC algorithms are independent of the experimental techniques used and are applicable to both next-generation sequencing and array CGH data. PMID:24743184
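
    The flavour of a minimum event distance can be conveyed with a drastically simplified version: count the fewest segmental +/-1 copy-number events needed to turn a flat diploid genome into an observed profile, treating gains and losses independently. This ignores MEDICC's phasing and transducer machinery; a sketch only:

      def simple_event_distance(profile, baseline=2):
          # One event adds or removes one copy over a contiguous segment.
          gains = [max(0, c - baseline) for c in profile]
          losses = [max(0, baseline - c) for c in profile]

          def events(track):
              prev, total = 0, 0
              for depth in track:
                  total += max(0, depth - prev)  # segments opening here
                  prev = depth
              return total

          return events(gains) + events(losses)

      # simple_event_distance([2, 3, 3, 2, 1, 1, 2]) == 2:
      # one gain event over positions 2-3, one loss over positions 5-6.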

  11. Advances in forensic DNA quantification: a review.

    PubMed

    Lee, Steven B; McCord, Bruce; Buel, Eric

    2014-11-01

    This review focuses upon a critical step in forensic biology: detection and quantification of human DNA from biological samples. Determination of the quantity and quality of human DNA extracted from biological evidence is important for several reasons. Firstly, depending on the source and extraction method, the quality (purity and length) and quantity of the resultant DNA extract can vary greatly. This affects the downstream method, as the quantity of input DNA and its relative length can determine which genotyping procedure to use: standard short-tandem repeat (STR) typing, mini-STR typing or mitochondrial DNA sequencing. Secondly, because it is important in forensic analysis to preserve as much of the evidence as possible for retesting, it is important to determine the total DNA amount available prior to utilizing any destructive analytical method. Lastly, results from initial quantitative and qualitative evaluations permit a more informed interpretation of downstream analytical results. Newer quantitative techniques involving real-time PCR can reveal the presence of degraded DNA and PCR inhibitors, providing potential reasons for poor genotyping results and indicating which methods to use for downstream typing success. In general, the more information available, the easier it is to interpret and process the sample, resulting in a higher likelihood of successful DNA typing. The history of the development of quantitative methods has involved two main goals: improving the precision of the analysis and increasing the information content of the result. This review covers advances in forensic DNA quantification methods and recent developments in RNA quantification.

  12. Protein quantification using a cleavable reporter peptide.

    PubMed

    Duriez, Elodie; Trevisiol, Stephane; Domon, Bruno

    2015-02-06

    Peptide and protein quantification based on isotope dilution and mass spectrometry analysis are widely employed for the measurement of biomarkers and in system biology applications. The accuracy and reliability of such quantitative assays depend on the quality of the stable-isotope labeled standards. Although the quantification using stable-isotope labeled peptides is precise, the accuracy of the results can be severely biased by the purity of the internal standards, their stability and formulation, and the determination of their concentration. Here we describe a rapid and cost-efficient method to recalibrate stable isotope labeled peptides in a single LC-MS analysis. The method is based on the equimolar release of a protein reference peptide (used as surrogate for the protein of interest) and a universal reporter peptide during the trypsinization of a concatenated polypeptide standard. The quality and accuracy of data generated with such concatenated polypeptide standards are highlighted by the quantification of two clinically important proteins in urine samples and compared with results obtained with conventional stable isotope labeled reference peptides. Furthermore, the application of the UCRP standards in complex samples is described.
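
    The recalibration arithmetic follows from the equimolar release: the universal reporter peptide, read against an accurately quantified reporter calibrant, fixes the true concentration of the concatenated standard, which then scales the endogenous-to-reference peak-area ratio. A sketch of that two-step calculation (function and argument names are illustrative, not from the paper):

      def recalibrated_target_conc(area_endogenous, area_reference,
                                   area_reporter, area_reporter_calibrant,
                                   conc_reporter_calibrant):
          # Step 1: true concentration of the concatenated standard,
          # via the co-released universal reporter peptide.
          standard_conc = (conc_reporter_calibrant *
                           area_reporter / area_reporter_calibrant)
          # Step 2: target protein from the endogenous/reference ratio.
          return standard_conc * area_endogenous / area_reference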

  13. Precise protein quantification based on peptide quantification using iTRAQ™

    PubMed Central

    Boehm, Andreas M; Pütz, Stephanie; Altenhöfer, Daniela; Sickmann, Albert; Falk, Michael

    2007-01-01

    Background: Mass spectrometry-based quantification of peptides can be performed using the iTRAQ™ reagent in conjunction with mass spectrometry. This technology yields information about the relative abundance of single peptides. A method for the calculation of reliable quantification information is required in order to obtain biologically relevant data at the protein expression level. Results: A method comprising sound error estimation and statistical methods is presented that allows precise abundance analysis plus error calculation at the peptide as well as at the protein level. This yields the relevant information that is required for quantitative proteomics. Comparing the performance of our method, named Quant, with existing approaches, the error estimation is reliable and offers information for precise bioinformatic models. Quant is shown to generate results that are consistent with those produced by ProQuant™, thus validating both systems. Moreover, the results are consistent with those of Mascot™ 2.2. The MATLAB® scripts of Quant are freely available under the GNU Lesser General Public License. Conclusion: The software Quant demonstrates improvements in protein quantification using iTRAQ™. Precise quantification data can be obtained at the protein level when using error propagation and adequate visualization; Quant integrates both, and additionally provides the possibility to obtain more reliable results by calculating suitable quality measures. Peak area integration has been replaced by the sum of intensities, yielding more reliable quantification results. Additionally, Quant allows the combination of quantitative information obtained by iTRAQ™ with peptide and protein identifications from popular tandem MS identification tools. Hence Quant is a useful tool for the proteomics community and may help improve the analysis of proteomic experimental data. In addition, we have shown that a lognormal distribution fits the data of mass spectrometry-based quantification.
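
    One way to picture precise protein quantification from peptide-level ratios with error propagation is an inverse-variance weighted mean of peptide log-ratios (consistent with the lognormal behaviour noted above). This is a generic sketch, not the Quant implementation:

      import numpy as np

      def protein_ratio(peptide_ratios, peptide_sds):
          # Combine peptide-level iTRAQ ratios on the log scale with
          # inverse-variance weights, and propagate the error.
          logs = np.log(np.asarray(peptide_ratios))
          w = 1.0 / np.asarray(peptide_sds) ** 2
          mean_log = np.sum(w * logs) / np.sum(w)
          sd_log = np.sqrt(1.0 / np.sum(w))
          return np.exp(mean_log), sd_log   # ratio and its log-scale error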

  14. Spatialised fate factors for nitrate in catchments: modelling approach and implication for LCA results.

    PubMed

    Basset-Mens, Claudine; Anibar, Lamiaa; Durand, Patrick; van der Werf, Hayo M G

    2006-08-15

    The challenge for environmental assessment tools, such as Life Cycle Assessment (LCA), is to provide a holistic picture of the environmental impacts of a given system, while being relevant both at a global scale, i.e., for global impact categories such as climate change, and at a smaller scale, i.e., for regional impact categories such as aquatic eutrophication. To this end, the environmental mechanisms between emission and impact should be taken into account. For eutrophication in particular, which is one of the main impacts of farming systems, the fate factor of eutrophying pollutants in catchments, and particularly of nitrate, reflects one of these important and complex environmental mechanisms. We define this fate factor as the ratio of the amount of nitrate at the outlet of the catchment over the nitrate emitted from the catchment's soils. In LCA, this fate factor is most often assumed equal to 1, while the observed fate factor is generally less than 1. A generic approach for estimating the range of variation of nitrate fate factors in a region of intensive agriculture was proposed. This approach was based on the analysis of different catchment scenarios combining different catchment types and different effective rainfalls. The evolution over time of the nitrate fate factor as well as the steady-state fate factor for each catchment scenario was obtained using the INCA simulation model. In line with the general LCA model, the implications of the steady-state fate factors for nitrate were investigated for the eutrophication impact result in the framework of an LCA of pig production. A sensitivity analysis to the fraction of nitrate lost as N2O was presented for the climate change impact category. This study highlighted the difference between the observed fate factor at a given time, which aggregates both storage and transformation processes, and a "steady-state fate factor", specific to the system considered. The range of steady-state fate factors obtained for the study region was wide, from 0.44 to 0.86, depending primarily on the catchment type and secondarily on the effective rainfall. The sensitivity of the LCA of pig production to the fate factors was significant concerning eutrophication, but potentially much larger concerning climate change. The potential for producing improved eutrophication results by using spatially differentiated fate factors was demonstrated. Additionally, the urgent need for quantitative studies on the N2O/N2 ratio in riparian-zone denitrification was highlighted.
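
    In LCA terms, the fate factor simply scales the emission before characterization, so the 0.44-0.86 range found here translates directly into a roughly two-fold spread in the eutrophication contribution. A sketch of the arithmetic (the characterization factor is illustrative, not from the paper):

      def eutrophication_impact(no3_emitted_kg, fate_factor, cf_no3=0.1):
          # Impact = emission x catchment fate factor x characterization
          # factor (cf_no3 in kg PO4-eq per kg NO3, illustrative value).
          return no3_emitted_kg * fate_factor * cf_no3

      # Same field emission, different catchments:
      # eutrophication_impact(100, 0.44) -> 4.4 kg PO4-eq
      # eutrophication_impact(100, 0.86) -> 8.6 kg PO4-eq
      # versus 10.0 kg PO4-eq under the default LCA assumption of 1.0.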

  15. Spatialising the contentious politics of ADHD: networks and scalar strategies in health social movement activism.

    PubMed

    Edwards, Claire

    2014-09-01

    This paper explores the spatial dynamics of health social movement activism in the context of a specific condition, Attention Deficit Hyperactivity Disorder (ADHD). Deploying qualitative research conducted with Irish ADHD organisations, it examines how place and space affect activist networks and the dilemmas that emerge when local 'mobilisations' converge at national and transnational levels. ADHD activism in Ireland has been predominantly localist in orientation, but certain organisations have shifted their activism to the European scale as a means of gaining further political and epistemic recognition for the condition. The paper suggests that health social movement studies would benefit from an engagement with the geographies of inter-scalar relations in analysing organisations' action repertoires.

  16. The pitfalls of protein quantification in wastewater treatment studies.

    PubMed

    Avella, A C; Görner, T; de Donato, Ph

    2010-09-15

    Proteins, as one of the principal components of organic matter in wastewater, require adequate quantification to determine their concentration at the different stages of the wastewater treatment process. Recent studies have used the corrected Lowry method for protein quantification, arguing that this method can differentiate proteins from interfering humic substances. In this study, the classic Lowry method, the corrected Lowry method and a commercial assay kit were assessed for protein quantification in the presence of humic acid.

  17. Interference microscopes for tribology and corrosion quantification

    NASA Astrophysics Data System (ADS)

    Novak, Erik; Blewett, Nelson; Stout, Tom

    2007-06-01

    Interference microscopes remain one of the most accurate, repeatable, and versatile metrology systems for precision surface measurements. Such systems successfully measure materials in both research labs and production lines in the micro-optics, MEMS, data storage, medical device, and precision machining industries, to sub-nanometer vertical resolution. Increasingly, however, these systems are finding uses outside of traditional surface-measurement applications, including film thickness determination, environmental responses of materials, and determination of behavior under actuation. Most recently, these systems are enabling users to examine the behavior of materials over varying time-scales as they are used in cutting or grinding operations, or where the material is merely in continual contact with another, such as in medical implants. In particular, quantification of the wear of surfaces with varying coatings and under different conditions is of increasing value as tolerances decrease and consistency in final products becomes more valuable. Also, the response of materials in corrosive environments allows users to weigh the gains of various surface treatments against the cost of those treatments. Such quantification requires novel hardware and software for the system to ensure results are fast, accurate, and relevant. In this paper we explore three typical applications in tribology and corrosion. Deterioration of the cutting surfaces on a multi-blade razor is explored, with quantification of key surface features. Next, wear of several differently coated drill bits under similar use conditions is examined. Thirdly, in situ measurement of corrosion of several metal surfaces in harsh environmental conditions is performed. These case studies highlight how standard interference microscopes are evolving to serve novel industrial applications.

  18. Stereo-particle image velocimetry uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Sayantan; Charonko, John J.; Vlachos, Pavlos P.

    2017-01-01

    Particle image velocimetry (PIV) measurements are subject to multiple elemental error sources, and thus estimating the overall measurement uncertainty is challenging. Recent advances have led to a posteriori uncertainty estimation methods for planar two-component PIV. However, no complete methodology exists for uncertainty quantification in stereo PIV. In the current work, a comprehensive framework is presented to quantify the uncertainty stemming from stereo registration error and combine it with the underlying planar velocity uncertainties. The disparity in particle locations of the dewarped images is used to estimate the positional uncertainty of the world coordinate system, which is then propagated to the uncertainty in the calibration mapping function coefficients. Next, the calibration uncertainty is combined with the planar uncertainty fields of the individual cameras through an uncertainty propagation equation, and uncertainty estimates are obtained for all three velocity components. The methodology was tested with synthetic stereo PIV data for different light sheet thicknesses, with and without registration error, and also validated with an experimental vortex ring case from the 2014 PIV Challenge. A thorough sensitivity analysis was performed to assess the relative impact of the various parameters on the overall uncertainty. The results suggest that in the absence of any disparity, the stereo PIV uncertainty prediction method is more sensitive to the planar uncertainty estimates than to the angle uncertainty, although the latter is not negligible for non-zero disparity. Overall, the presented uncertainty quantification framework showed excellent agreement between the error and uncertainty RMS values for both the synthetic and the experimental data and demonstrated reliable uncertainty prediction coverage. This stereo PIV uncertainty quantification framework provides the first comprehensive treatment of the subject and potentially lays foundations applicable to volumetric PIV techniques.
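
    The combination step can be pictured as first-order uncertainty propagation through the stereo reconstruction: the three-component velocity is a function of the two planar fields, so its covariance follows from the Jacobian of that mapping plus an additive calibration/registration term. A generic sketch, not the authors' full framework:

      import numpy as np

      def stereo_uncertainty(J, planar_var, calib_var):
          # J: 3x4 Jacobian of (u,v,w) w.r.t. the two cameras' planar
          # displacements; planar_var: their variances (assumed
          # independent); calib_var: added variance from calibration.
          cov = J @ np.diag(planar_var) @ J.T + np.diag(calib_var)
          return np.sqrt(np.diag(cov))   # sigma_u, sigma_v, sigma_w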

  19. Simultaneous quantification of multiple magnetic nanoparticles

    NASA Astrophysics Data System (ADS)

    Rauwerdink, Adam M.; Giustini, Andrew J.; Weaver, John B.

    2010-11-01

    Distinct magnetic nanoparticle designs can have unique spectral responses to an AC magnetic field in a technique called the magnetic spectroscopy of Brownian motion (MSB). The spectra of the particles have been measured using desktop spectrometers and in vivo measurements. If multiple particle types are present in a region of interest, the unique spectral signatures allow for the simultaneous quantification of the various particles. We demonstrate such a potential experimentally with up to three particle types. This ability to concurrently detect multiple particles will enable new biomedical applications.
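
    Simultaneous quantification works because the measured spectrum of a mixture is, to a good approximation, a linear combination of each particle type's unit-concentration spectrum. A sketch of that unmixing step via nonnegative least squares (an assumed solver choice, not necessarily the authors'):

      import numpy as np
      from scipy.optimize import nnls

      def unmix_particles(mixture_spectrum, reference_spectra):
          # reference_spectra: one column per particle type, holding its
          # MSB spectrum at unit concentration. Returns concentrations.
          conc, residual = nnls(reference_spectra, mixture_spectrum)
          return conc, residual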

  20. Quantification and analysis of intramolecular interactions.

    PubMed

    Gonthier, Jérôme F; Corminboeuf, Clémence

    2014-01-01

    Non-covalent interactions play a prominent role in chemistry and biology. While a myriad of theoretical methods have been devised to quantify and analyze intermolecular interactions, the theoretical toolbox for the intramolecular analogues is much scarcer. Yet interactions within molecules govern fundamental phenomena as illustrated by the energetic differences between structural isomers. Their accurate quantification is of utmost importance. This paper gives an overview of the most common approaches able to probe intramolecular interactions and stresses both their characteristics and limitations. We finally introduce our recent theoretical approach, which represents the first step towards the development of an intramolecular version of Symmetry-Adapted Perturbation Theory (SAPT).

  1. Adjoint-Based Uncertainty Quantification with MCNP

    NASA Astrophysics Data System (ADS)

    Seifried, Jeffrey Edwin

    This work serves to quantify the instantaneous uncertainties in neutron transport simulations born from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.
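
    With adjoint-derived sensitivities in hand, the propagation itself is the standard first-order "sandwich rule": var(R) = S^T C S, where S is the sensitivity vector of the response and C the nuclear-data covariance matrix. A minimal sketch of that final step:

      import numpy as np

      def sandwich_uncertainty(sensitivities, covariance):
          # Relative response uncertainty from relative sensitivities
          # and a relative covariance matrix: var(R) = S^T C S.
          S = np.asarray(sensitivities)
          return float(np.sqrt(S @ covariance @ S))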

  2. NMR-based quantification of organic diphosphates

    PubMed Central

    Lenevich, Stepan

    2010-01-01

    Phosphorylated compounds are ubiquitous in life. Given their central role, many such substrates and analogues have been prepared for subsequent evaluation. Prior to biological experiments, it is typically necessary to determine the concentration of the target molecule in solution. Here we describe a method in which concentrations of stock solutions of organic diphosphates and bisphosphonates are quantified by 31P NMR spectroscopy on standard instrumentation, using a capillary tube containing a secondary standard. The method is specific and is applicable down to a concentration of 200 μM. The capillary tube provides the reference peak for quantification and deuterated solvent for locking. PMID:20833124
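
    The quantification itself is a peak-integral ratio against the capillary's secondary standard, normalized by the number of equivalent 31P nuclei behind each peak (a diphosphate contributes two). A sketch of the arithmetic (argument names are illustrative):

      def diphosphate_conc(integral_sample, integral_ref, conc_ref,
                           n_p_sample=2, n_p_ref=1):
          # Integrals scale with the number of contributing 31P nuclei,
          # so compare per-nucleus integrals before taking the ratio.
          return (conc_ref * (integral_sample / n_p_sample)
                  / (integral_ref / n_p_ref))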

  3. Adjoint-Based Uncertainty Quantification with MCNP

    SciTech Connect

    Seifried, Jeffrey E.

    2011-09-01

    This work serves to quantify the instantaneous uncertainties in neutron transport simulations born from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.

  4. QUANTIFICATION OF TISSUE PROPERTIES IN SMALL VOLUMES

    SciTech Connect

    Mourant, J.; et al.

    2000-12-01

    The quantification of tissue properties by optical measurements will facilitate the development of noninvasive methods of cancer diagnosis and detection. Optical measurements are sensitive to tissue structure which is known to change during tumorigenesis. The goals of the work presented in this paper were to verify that the primary scatterers of light in cells are structures much smaller than the nucleus and then to develop an optical technique that can quantify parameters of structures the same size as the scattering features in cells. Polarized, elastic back-scattering was found to be able to quantify changes in scattering properties for turbid media consisting of scatterers of the size found in tissue.

  5. Tutorial examples for uncertainty quantification methods.

    SciTech Connect

    De Bord, Sarah

    2015-08-01

    This report details the work accomplished during my 2015 SULI summer internship at Sandia National Laboratories in Livermore, CA. During this internship, I worked on multiple tasks with the common goal of making uncertainty quantification (UQ) methods more accessible to the general scientific community. As part of my work, I created a comprehensive numerical integration example to incorporate into the user manual of a UQ software package. Further, I developed examples involving heat transfer through a window to incorporate into tutorial lectures that serve as an introduction to UQ methods.
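
    A heat-transfer-through-a-window exercise of the kind described makes a compact Monte Carlo UQ demonstration: sample the uncertain inputs, push them through Q = U * A * dT, and summarize the output. The parameter values below are illustrative, not taken from the report:

      import numpy as np

      rng = np.random.default_rng(42)

      n = 100_000
      U = rng.normal(2.8, 0.3, n)     # W/(m^2 K), uncertain U-value
      A = 1.5                         # m^2, known window area
      dT = rng.normal(20.0, 2.0, n)   # K, uncertain temperature difference

      Q = U * A * dT                  # W, heat flow through the window
      print(f"mean Q = {Q.mean():.0f} W, std = {Q.std(ddof=1):.0f} W")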

  6. Quantification of Detergents Complexed with Membrane Proteins

    PubMed Central

    Chaptal, Vincent; Delolme, Frédéric; Kilburg, Arnaud; Magnard, Sandrine; Montigny, Cédric; Picard, Martin; Prier, Charlène; Monticelli, Luca; Bornert, Olivier; Agez, Morgane; Ravaud, Stéphanie; Orelle, Cédric; Wagner, Renaud; Jawhari, Anass; Broutin, Isabelle; Pebay-Peyroula, Eva; Jault, Jean-Michel; Kaback, H. Ronald; le Maire, Marc; Falson, Pierre

    2017-01-01

    Most membrane protein studies require the use of detergents, but because of the lack of a general, accurate and rapid method to quantify them, many uncertainties remain that hamper proper functional and structural data analyses. To solve this problem, we propose a method based on matrix-assisted laser desorption/ionization mass spectrometry (MALDI-TOF MS) that allows quantification of pure or mixed detergents in complex with membrane proteins. We validated the method with a wide variety of detergents and membrane proteins. We automated the process, thereby allowing routine quantification for a broad spectrum of usage. As a first illustration, we show how to obtain information on the amount of detergent in complex with a membrane protein, essential for liposome or nanodisc reconstitutions. Thanks to the method, we also show how to reliably and easily estimate the detergent corona diameter and select the smallest size, critical for favoring protein-protein contacts and triggering/promoting membrane protein crystallization, and how to visualize the detergent belt for cryo-EM studies. PMID:28176812

  7. Virus detection and quantification using electrical parameters

    NASA Astrophysics Data System (ADS)

    Ahmad, Mahmoud Al; Mustafa, Farah; Ali, Lizna M.; Rizvi, Tahir A.

    2014-10-01

    Here we identify and quantitate two similar viruses, human and feline immunodeficiency viruses (HIV and FIV), suspended in a liquid medium without labeling, using a semiconductor technique. The virus count was estimated by calculating the impurities inside a defined volume by observing the change in electrical parameters. Empirically, the virus count was similar to the absolute value of the ratio of the change of the virus suspension dopant concentration relative to the mock dopant over the change in virus suspension Debye volume relative to mock Debye volume. The virus type was identified by constructing a concentration-mobility relationship which is unique for each kind of virus, allowing for a fast (within minutes) and label-free virus quantification and identification. For validation, the HIV and FIV virus preparations were further quantified by a biochemical technique and the results obtained by both approaches corroborated well. We further demonstrate that the electrical technique could be applied to accurately measure and characterize silica nanoparticles that resemble the virus particles in size. Based on these results, we anticipate our present approach to be a starting point towards establishing the foundation for label-free electrical-based identification and quantification of an unlimited number of viruses and other nano-sized particles.
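
    Taken literally, the stated empirical relation is a ratio of two relative changes measured against the mock (virus-free) suspension. A sketch of that reading (a simplification; the extraction of dopant concentration and Debye volume from the underlying electrical measurements is not shown):

      def virus_count(dopant_virus, dopant_mock,
                      debye_vol_virus, debye_vol_mock):
          # |relative change in dopant concentration| over
          # |relative change in Debye volume|, per the text above.
          d_dopant = (dopant_virus - dopant_mock) / dopant_mock
          d_debye = (debye_vol_virus - debye_vol_mock) / debye_vol_mock
          return abs(d_dopant / d_debye)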

  8. Virus detection and quantification using electrical parameters

    PubMed Central

    Ahmad, Mahmoud Al; Mustafa, Farah; Ali, Lizna M.; Rizvi, Tahir A.

    2014-01-01

    Here we identify and quantitate two similar viruses, human and feline immunodeficiency viruses (HIV and FIV), suspended in a liquid medium without labeling, using a semiconductor technique. The virus count was estimated by calculating the impurities inside a defined volume by observing the change in electrical parameters. Empirically, the virus count was similar to the absolute value of the ratio of the change of the virus suspension dopant concentration relative to the mock dopant over the change in virus suspension Debye volume relative to mock Debye volume. The virus type was identified by constructing a concentration-mobility relationship which is unique for each kind of virus, allowing for a fast (within minutes) and label-free virus quantification and identification. For validation, the HIV and FIV virus preparations were further quantified by a biochemical technique and the results obtained by both approaches corroborated well. We further demonstrate that the electrical technique could be applied to accurately measure and characterize silica nanoparticles that resemble the virus particles in size. Based on these results, we anticipate our present approach to be a starting point towards establishing the foundation for label-free electrical-based identification and quantification of an unlimited number of viruses and other nano-sized particles. PMID:25355078

  9. Quantification of abdominal aortic deformation after EVAR

    NASA Astrophysics Data System (ADS)

    Demirci, Stefanie; Manstad-Hulaas, Frode; Navab, Nassir

    2009-02-01

    Quantification of abdominal aortic deformation is an important requirement for the evaluation of endovascular stenting procedures and the further refinement of stent graft design. During endovascular aortic repair (EVAR) treatment, the aortic shape is subject to severe deformation imposed by medical instruments such as guide wires, catheters, and the stent graft. This deformation can affect the flow characteristics and morphology of the aorta, which have been shown to be elicitors of stent graft failure and a reason for the reappearance of aneurysms. We present a method for quantifying the deformation of an aneurysmatic aorta imposed by an inserted stent graft device. The outline of the procedure includes initial rigid alignment of the two abdominal scans, segmentation of abdominal vessel trees, and automatic reduction of their centerline structures to one specified region of interest around the aorta. This is accomplished by preprocessing and remodeling of the pre- and postoperative aortic shapes before performing a non-rigid registration. We further narrow the resulting displacement fields to only include local non-rigid deformation and therefore eliminate all remaining global rigid transformations. Finally, deformations for specified locations can be calculated from the resulting displacement fields. In order to evaluate our method, experiments for the extraction of aortic deformation fields were conducted on 15 patient datasets from endovascular aortic repair (EVAR) treatment. A visual assessment of the registration results and an evaluation of the usage of deformation quantification were performed by two vascular surgeons and one interventional radiologist, all experts in EVAR procedures.
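
    The step of removing the remaining global rigid motion so that only local deformation survives can be sketched with the Kabsch algorithm on corresponding centerline points: fit the best rigid rotation after centering, subtract it, and measure what is left. A simplified stand-in for the paper's pipeline:

      import numpy as np

      def nonrigid_residual(points_pre, points_post):
          # points_*: (n, 3) corresponding centerline points before and
          # after stent graft insertion.
          p = points_pre - points_pre.mean(axis=0)
          q = points_post - points_post.mean(axis=0)
          U, _, Vt = np.linalg.svd(p.T @ q)
          d = np.sign(np.linalg.det(Vt.T @ U.T))
          R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T   # best rigid rotation
          residual = q - p @ R.T                    # non-rigid part only
          return np.linalg.norm(residual, axis=1)   # deformation magnitude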

  10. Quantification of ontogenetic allometry in ammonoids.

    PubMed

    Korn, Dieter

    2012-01-01

    Ammonoids are well-known objects used for studies on ontogeny and phylogeny, but a quantification of ontogenetic change has not yet been carried out. Their planispirally coiled conchs allow for the study of "longitudinal" ontogenetic data, that is, data on ontogenetic trajectories that can be obtained from a single specimen. Therefore, they provide a good model for ontogenetic studies of geometry in other shelled organisms. Using modifications of three cardinal conch dimensions, computer simulations can model artificial conchs. The trajectories of ontogenetic allometry of these simulations can be analyzed in great detail in a theoretical morphospace. A method for the classification of conch ontogeny and the quantification of the degree of allometry is proposed. Using high-precision cross-sections, the allometric conch growth of real ammonoids can be documented and compared. The members of the Ammonoidea show a wide variety of allometric growth, ranging from near isometry to monophasic, biphasic, or polyphasic allometry. Selected examples of Palaeozoic and Mesozoic ammonoids are shown with respect to their degree of change during ontogeny of the conch.

  11. Quantification of prebiotics in commercial infant formulas.

    PubMed

    Sabater, Carlos; Prodanov, Marin; Olano, Agustín; Corzo, Nieves; Montilla, Antonia

    2016-03-01

    Since breastfeeding is not always possible, infant formulas (IFs) are supplemented with prebiotic oligosaccharides, such as galactooligosaccharides (GOS) and/or fructooligosaccharides (FOS), to exert effects similar to those of breast milk. Nowadays, a great number of infant formulas enriched with prebiotics are available on the market; however, there are scarce data about their composition. In this study, two chromatographic methods (GC-FID and HPLC-RID) were used in combination for the quantification of carbohydrates present in commercial infant formulas. According to the results obtained by GC-FID for products containing prebiotics, the content of FOS, GOS and GOS/FOS was in the ranges of 1.6-5.0, 1.7-3.2, and 0.08-0.25/2.3-3.8 g/100 g of product, respectively. HPLC-RID analysis allowed quantification of maltodextrins with a degree of polymerization (DP) up to 19. The methodology proposed here may be used for routine quality control of infant formulas and other food ingredients containing prebiotics.

  12. Quantification noise in single cell experiments

    PubMed Central

    Reiter, M.; Kirchner, B.; Müller, H.; Holzhauer, C.; Mann, W.; Pfaffl, M. W.

    2011-01-01

    In quantitative single-cell studies, the critical issues are the low amount of nucleic acids present and the resulting experimental variation. In addition, biological data obtained from heterogeneous tissue do not reflect the expression behaviour of individual cells. These variations can derive from natural biological variance or can be introduced externally. Both have negative effects on the quantification result. The aim of this study is to make quantitative single-cell studies more transparent and reliable in order to fulfil the MIQE guidelines at the single-cell level. The technical variability introduced by RT, pre-amplification, evaporation, biological material and qPCR itself was evaluated by using RNA or DNA standards. Secondly, the biological expression variances of GAPDH, TNFα, IL-1β and TLR4 were measured in an mRNA profiling experiment in single lymphocytes. The quantification setup used was sensitive enough to detect single standard copies and transcripts from one solitary cell. Most variability was introduced by RT, followed by evaporation and pre-amplification. The qPCR analysis and the biological matrix introduced only minor variability. Both studies impressively demonstrate the heterogeneity of expression patterns in individual cells and clearly show today's limitations in quantitative single-cell expression analysis. PMID:21745823

  13. Detection and quantification of levoglucosan in atmospheric aerosols: a review.

    PubMed

    Schkolnik, Gal; Rudich, Yinon

    2006-05-01

    Levoglucosan is a tracer for biomass burning sources in atmospheric aerosol particles. Therefore, much effort has been recently put into developing methods for its quantification. This review describes and compares both established and emerging analytical methods for levoglucosan quantification in ambient aerosol samples, with the special needs of the environmental analytical chemist in mind.

  14. Exploring Potential ADS-B Vulnerabilities in the FAA’s NextGen Air Transportation System

    DTIC Science & Technology

    2011-06-01

    Indexed excerpts from the report: "... the reliance on aviation for transportation of people and goods, even a temporary loss could have devastating impacts. Consider the recent Iceland ..."; "... Ground Station Target Ghost Inject, with the exception that the target for the attack is an aircraft. Because there is no data correlation like that ..."; "... station for message injection generally proves most difficult. This is primarily due to expected data correlation at a ground station that would not ..."

  15. Integration der bodenkundlichen Filter- und Pufferfunktion in die hydrogeologische Vulnerabilitätsbewertung

    NASA Astrophysics Data System (ADS)

    Wirsing, Tobias; Neukum, Christoph; Goldscheider, Nico; Maier, Matthias

    2015-06-01

    Vulnerability maps are standard tools for the assessment of groundwater sensitivity to contamination. Due to their increased use in technical guidelines, vulnerability maps have become state-of-the-art tools in resource management. However, most approaches have been developed by hydrogeologists and soil scientists who incorporate the understanding of processes from their specific disciplines very well but have limitations in considering processes in other disciplines. The soil-specific database for vulnerability assessment has been significantly improved by soil scientists over the past several years in terms of quality, spatial extent and availability. Hence, it is time to integrate this database into hydrogeological concepts. This work presents a vulnerability mapping approach that considers a new soil database, available since 2014 for the entire Baden-Württemberg region at a scale of 1:50,000, adapting the well-established GLA and PI methods. Due to the newly-developed classification scheme for the protective function, this approach provides a more balanced and meaningful classification. This leads to a more distinct image of the study area and a better interpretation of vulnerability.

  16. Erratum: Erratum zu: Integration der bodenkundlichen Filter- und Pufferfunktion in die hydrogeologische Vulnerabilitätsbewertung

    NASA Astrophysics Data System (ADS)

    Wirsing, Tobias; Neukum, Christoph; Goldscheider, Nico; Maier, Matthias

    2015-09-01

    Vulnerability maps are standard tools for the assessment of groundwater sensitivity to contamination. Due to their increased use in technical guidelines, vulnerability maps have become state-of-the-art tools in resource management. However, most approaches have been developed by hydrogeologists and soil scientists who incorporate the understanding of processes from their specific disciplines very well but have limitations in considering processes in other disciplines. The soil-specific database for vulnerability assessment has been significantly improved by soil scientists over the past several years in terms of quality, spatial extent and availability. Hence, it is time to integrate this database into hydrogeological concepts. This work presents a vulnerability mapping approach that considers a new soil database, available since 2014 for the entire Baden-Württemberg region at a scale of 1:50,000, adapting the well-established GLA and PI methods. Due to the newly-developed classification scheme for the protective function, this approach provides a more balanced and meaningful classification. This leads to a more distinct image of the study area and a better interpretation of vulnerability.

  17. In vivo MRS metabolite quantification using genetic optimization

    NASA Astrophysics Data System (ADS)

    Papakostas, G. A.; Karras, D. A.; Mertzios, B. G.; van Ormondt, D.; Graveron-Demilly, D.

    2011-11-01

    The in vivo quantification of metabolite concentrations revealed in magnetic resonance spectroscopy (MRS) spectra is the main subject under investigation in this work. Significant contributions based on artificial intelligence tools, such as neural networks (NNs), have been presented lately with good results, but they show several drawbacks regarding quantification accuracy under difficult conditions. A general framework that casts the quantification procedure as an optimization problem, solved using a genetic algorithm (GA), is proposed in this paper. Two different lineshape models are examined, while two GA configurations are applied to artificial data. Moreover, the introduced quantification technique deals with overlapping metabolite peaks, a considerably difficult situation occurring under real conditions. Appropriate experiments on artificial MRS data have proved the efficiency of the introduced methodology, establishing it as a generic metabolite quantification procedure.
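
    The optimization view can be made concrete with a toy genetic algorithm that evolves the parameters of a single Lorentzian line to match a measured spectrum; real metabolite quantification evolves several overlapping lineshapes at once. A sketch under those simplifications, not the authors' configuration:

      import numpy as np

      rng = np.random.default_rng(3)

      def lorentzian(freq, amp, pos, width):
          return amp * width ** 2 / ((freq - pos) ** 2 + width ** 2)

      def fit_peak_ga(freq, spectrum, bounds, pop=60, gens=200):
          # bounds: [(amp_lo, amp_hi), (pos_lo, pos_hi), (width_lo, width_hi)]
          lo, hi = np.array(bounds, dtype=float).T
          X = rng.uniform(lo, hi, size=(pop, 3))

          def fitness(x):
              return -np.sum((lorentzian(freq, *x) - spectrum) ** 2)

          for _ in range(gens):
              f = np.array([fitness(x) for x in X])
              i, j = rng.integers(pop, size=(2, pop))      # tournaments
              parents = np.where((f[i] > f[j])[:, None], X[i], X[j])
              mates = parents[rng.permutation(pop)]
              alpha = rng.random((pop, 1))                 # blend crossover
              X = alpha * parents + (1 - alpha) * mates
              X += 0.02 * (hi - lo) * rng.standard_normal(X.shape)
              X = np.clip(X, lo, hi)                       # mutation, clipped
          return X[np.argmax([fitness(x) for x in X])]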

  18. Survey and Evaluate Uncertainty Quantification Methodologies

    SciTech Connect

    Lin, Guang; Engel, David W.; Eslinger, Paul W.

    2012-02-01

    The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that will develop and deploy state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately the widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models with uncertainty quantification, optimization, risk analysis and decision making capabilities. The CCSI Toolset will incorporate commercial and open-source software currently in use by industry and will also develop new software tools as necessary to fill technology gaps identified during execution of the project. The CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. The goal of CCSI is to deliver a toolset that can simulate the scale-up of a broad set of new carbon capture technologies from laboratory scale to full commercial scale. To provide a framework around which the toolset can be developed and demonstrated, we will focus on three Industrial Challenge Problems (ICPs) related to carbon capture technologies relevant to U.S. pulverized coal (PC) power plants. Post-combustion capture by solid sorbents is the technology focus of the initial ICP (referred to as ICP A). The goal of the uncertainty quantification (UQ) task (Task 6) is to provide a set of capabilities to the user community for the quantification of uncertainties associated with the carbon capture processes being modeled.

  19. Uncertainty quantification in DIC with Kriging regression

    NASA Astrophysics Data System (ADS)

    Wang, Dezhi; DiazDelaO, F. A.; Wang, Weizhuo; Lin, Xiaoshan; Patterson, Eann A.; Mottershead, John E.

    2016-03-01

    A Kriging regression model is developed as a post-processing technique for the treatment of measurement uncertainty in classical subset-based Digital Image Correlation (DIC). Regression is achieved by regularising the sample-point correlation matrix using a local, subset-based, assessment of the measurement error with assumed statistical normality and based on the Sum of Squared Differences (SSD) criterion. This leads to a Kriging-regression model in the form of a Gaussian process representing uncertainty on the Kriging estimate of the measured displacement field. The method is demonstrated using numerical and experimental examples. Kriging estimates of displacement fields are shown to be in excellent agreement with 'true' values for the numerical cases and in the experimental example uncertainty quantification is carried out using the Gaussian random process that forms part of the Kriging model. The root mean square error (RMSE) on the estimated displacements is produced and standard deviations on local strain estimates are determined.
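
    The regression itself can be reproduced with a standard Gaussian-process toolkit: an RBF kernel for the smooth displacement field plus a white-noise kernel standing in for the subset-level measurement error, with the posterior standard deviation as the local uncertainty estimate. A sketch using scikit-learn (an assumed tooling choice, not the authors' implementation):

      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, WhiteKernel

      def krige_displacements(xy, u, noise_level=1e-2):
          # xy: (n, 2) subset centres; u: measured displacement component.
          kernel = 1.0 * RBF(length_scale=10.0) + WhiteKernel(noise_level)
          gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
          gp.fit(xy, u)
          u_hat, u_std = gp.predict(xy, return_std=True)
          return u_hat, u_std   # regularized field and its uncertainty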

  20. Quantification of Osteon Morphology Using Geometric Histomorphometrics.

    PubMed

    Dillon, Scott; Cunningham, Craig; Felts, Paul

    2016-03-01

    Many histological methods in forensic anthropology utilize combinations of traditional histomorphometric parameters which may not accurately describe the morphology of microstructural features. Here, we report the novel application of a geometric morphometric method suitable when considering structures without anatomically homologous landmarks for the quantification of complete secondary osteon size and morphology. The method is tested for its suitability in the measurement of intact secondary osteons using osteons digitized from transverse femoral diaphyseal sections prepared from two human individuals. The results of methodological testing demonstrate the efficacy of the technique when applied to intact secondary osteons. In providing accurate characterization of micromorphology within the robust mathematical framework of geometric morphometrics, this method may surpass traditional histomorphometric variables currently employed in forensic research and practice. A preliminary study of the intersectional histomorphometric variation within the femoral diaphysis is made using this geometric histomorphometric method to demonstrate its potential.

  1. Uncertainty quantification of acoustic emission filtering techniques

    NASA Astrophysics Data System (ADS)

    Zárate, Boris A.; Caicedo, Juan M.; Ziehl, Paul

    2012-04-01

    This paper compares six different filtering protocols used in Acoustic Emission (AE) monitoring of fatigue crack growth. The filtering protocols are combinations of three different filtering techniques based on Swansong-like filters and load filters. The filters are compared deterministically and probabilistically. The deterministic comparison is based on the coefficient of determination of the resulting AE data, while the probabilistic comparison is based on quantification of the uncertainty of the different filtering protocols. The uncertainty of a filtering protocol is quantified by calculating the entropy of the probability distribution of selected AE and fracture mechanics parameters under that protocol. The methodology is useful in cases where several filtering protocols are available and there is no reason to choose one over the others. Acoustic Emission data from a compact tension specimen tested under cyclic load are used for the comparison.
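
    The entropy calculation behind the probabilistic comparison is a plain Shannon entropy over the empirical distribution of a filtered parameter; a lower value means the protocol constrains that parameter more tightly. A minimal sketch (the binning choice is illustrative):

      import numpy as np

      def shannon_entropy(values, bins=30):
          # Entropy of the histogram of an AE or fracture-mechanics
          # parameter after a given filtering protocol.
          hist, _ = np.histogram(values, bins=bins)
          p = hist / hist.sum()
          p = p[p > 0]
          return float(-np.sum(p * np.log(p)))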

  2. Quantification of adipose tissue insulin sensitivity.

    PubMed

    Søndergaard, Esben; Jensen, Michael D

    2016-06-01

    In metabolically healthy humans, adipose tissue is exquisitely sensitive to insulin. Similar to muscle and liver, adipose tissue lipolysis is insulin resistant in adults with central obesity and type 2 diabetes. Perhaps uniquely, however, insulin resistance in adipose tissue may directly contribute to the development of insulin resistance in muscle and liver because of the increased delivery of free fatty acids to those tissues. It has been hypothesized that adipose tissue insulin resistance may precede other metabolic defects in obesity and type 2 diabetes. Therefore, precise and reproducible quantification of adipose tissue insulin sensitivity, in vivo, in humans, is an important measure. Unfortunately, no consensus exists on how to determine adipose tissue insulin sensitivity. We review the methods available to quantitate adipose tissue insulin sensitivity and discuss their strengths and weaknesses.

  3. Feature isolation and quantification of evolving datasets

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Identifying and isolating features is an important part of visualization and a crucial step for the analysis and understanding of large time-dependent data sets (either from observation or simulation). In this proposal, we address these concerns, namely the investigation and implementation of basic 2D and 3D feature based methods to enhance current visualization techniques and provide the building blocks for automatic feature recognition, tracking, and correlation. These methods incorporate ideas from scientific visualization, computer vision, image processing, and mathematical morphology. Our focus is in the area of fluid dynamics, and we show the applicability of these methods to the quantification and tracking of three-dimensional vortex and turbulence bursts.

  4. Usage of human reliability quantification methods.

    PubMed

    Grozdanovic, Miroljub

    2005-01-01

    Human reliability quantification (HRQ) methods are becoming increasingly important in risk and accident assessment for the systems to which these terms usually apply: high-tech industrial systems, including nuclear and chemical plants. These methods began to develop intensively after numerous accidents caused by human error or by inadequate actions of the people who controlled and managed complex technological processes. For existing systems, but also for new ones, it is important to assess the possibility of an accident. Determination of possible preventive activities, which include the influence of human error on the safety of a system, is also required. These are the main goals of HRQ methods. This paper presents the use of the Absolute Probability Judgement (APJ) and Success Likelihood Index Method (SLIM) HRQ techniques in control and management centres for electric-power systems in Belgrade and for railway traffic in Nis (both in Serbia and Montenegro).
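
    SLIM, for instance, combines expert ratings of performance-shaping factors into a Success Likelihood Index, SLI = sum(w_i * r_i), and converts it to a human error probability via the usual log-linear calibration log10(HEP) = a * SLI + b, with a and b fixed by two tasks of known HEP. A sketch of that arithmetic:

      def slim_hep(ratings, weights, a, b):
          # ratings/weights: one pair per performance-shaping factor;
          # weights should sum to 1. a, b come from calibration tasks
          # whose human error probabilities are already known.
          sli = sum(w * r for w, r in zip(weights, ratings))
          return 10 ** (a * sli + b)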

  5. Human cytomegalovirus: propagation, quantification, and storage.

    PubMed

    Britt, William J

    2010-08-01

    Human cytomegalovirus (HCMV) is the largest and perhaps the most structurally complex member of the family of human herpesviruses. It is the prototypic virus of the beta-herpesvirus subfamily. As with other cytomegaloviruses, HCMV is exquisitely species specific and undergoes lytic replication only in cells of human origin. In addition, its replication is limited almost entirely to primary cells and a limited number of transformed cell lines. Together with its prolonged replicative cycle of approximately 48 hr, the propagation and quantification of HCMV can present technical challenges. In this brief set of protocols, the propagation of laboratory strains of HCMV and their quantitation is described. In a third series of protocols, the concentration and gradient purification of HCMV for more specialized downstream applications is described.

  6. Recurrence quantification analysis of global stock markets

    NASA Astrophysics Data System (ADS)

    Bastos, João A.; Caiado, Jorge

    2011-04-01

    This study investigates the presence of deterministic dependencies in international stock markets using recurrence plots and recurrence quantification analysis (RQA). The results are based on a large set of free float-adjusted market capitalization stock indices covering a period of 15 years. The statistical tests suggest that the dynamics of stock prices in emerging markets are characterized by higher values of the RQA measures than those of their developed counterparts. The behavior of stock markets during critical financial events, such as the burst of the technology bubble, the Asian currency crisis, and the recent subprime mortgage crisis, is analyzed by performing RQA in sliding windows. It is shown that during these events stock markets exhibit a distinctive behavior, characterized by temporary decreases in the fraction of recurrence points contained in diagonal and vertical structures.
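
    For readers unfamiliar with RQA, the sketch below builds a recurrence matrix from an embedded time series and computes two standard measures: recurrence rate and determinism (the fraction of recurrence points on diagonal lines). The embedding parameters, threshold, and synthetic return series are illustrative, and for brevity the line of identity is not excluded as it usually would be.

```python
# Sketch of recurrence-plot construction and two RQA measures.
import numpy as np

def embed(x, dim=3, tau=1):
    """Time-delay embedding of a univariate series."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

def recurrence_matrix(x, dim=3, tau=1, eps=0.1):
    X = embed(np.asarray(x, float), dim, tau)
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    return (d < eps).astype(int)

def determinism(R, lmin=2):
    """Fraction of recurrence points on diagonals of length >= lmin."""
    n = R.shape[0]
    in_lines = 0
    for k in range(-(n - 1), n):          # scan every diagonal
        run = 0
        for v in list(np.diagonal(R, offset=k)) + [0]:  # sentinel flush
            if v:
                run += 1
            else:
                if run >= lmin:
                    in_lines += run
                run = 0
    total = R.sum()
    return in_lines / total if total else 0.0

rng = np.random.default_rng(0)
returns = rng.standard_normal(300) * 0.01     # stand-in for index returns
R = recurrence_matrix(returns, eps=0.02)
print("recurrence rate:", R.mean(), "determinism:", determinism(R))
```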

  7. Quantification of variability in trichome patterns

    PubMed Central

    Greese, Bettina; Hülskamp, Martin; Fleck, Christian

    2014-01-01

    While pattern formation is studied in various areas of biology, little is known about the noise leading to variations between individual realizations of the pattern. One prominent example of de novo pattern formation in plants is the patterning of trichomes on Arabidopsis leaves, which involves genetic regulation and cell-to-cell communication. These processes are potentially variable due to, e.g., the abundance of cell components or environmental conditions. To deepen the understanding of the regulatory processes underlying pattern formation, it is crucial to quantitatively analyze the variability in naturally occurring patterns. Here, we review recent approaches toward the characterization of noise in trichome initiation. We present methods for the quantification of spatial patterns, which are the basis for data-driven mathematical modeling and enable the analysis of noise from different sources. Besides the insight gained on trichome formation, the examination of observed trichome patterns also shows that highly regulated biological processes can be substantially affected by variability. PMID:25431575

  9. Poliovirus: Generation, Quantification, Propagation, Purification, and Storage

    PubMed Central

    Burrill, Cecily P.; Strings, Vanessa R.; Andino, Raul

    2016-01-01

    Poliovirus (PV) is the prototypical picornavirus. It is a non-enveloped RNA virus with a small (~7.5 kb) genome of positive polarity. It has long served as a model to study RNA virus biology, pathogenesis, and evolution. cDNA clones of several strains are available, and infectious virus can be produced by the transfection of in vitro transcribed viral genomes into an appropriate host cell. PV infects many human and non-human primate cell lines including HeLa and HeLa S3 cells, and can grow to high titer in culture. Protocols for the production, propagation, quantification, and purification of PV are presented. A separate chapter concerning the generation and characterization of PV mutants will also be presented. PMID:23686830

  10. Uncertainty Quantification of Equilibrium Climate Sensitivity

    NASA Astrophysics Data System (ADS)

    Lucas, D. D.; Brandon, S. T.; Covey, C. C.; Domyancic, D. M.; Johannesson, G.; Klein, R.; Tannahill, J.; Zhang, Y.

    2011-12-01

    Significant uncertainties exist in the temperature response of the climate system to changes in the levels of atmospheric carbon dioxide. We report progress to quantify the uncertainties of equilibrium climate sensitivity using perturbed parameter ensembles of the Community Earth System Model (CESM). Through a strategic initiative at the Lawrence Livermore National Laboratory, we have been developing uncertainty quantification (UQ) methods and incorporating them into a software framework called the UQ Pipeline. We have applied this framework to generate a large number of ensemble simulations using Latin Hypercube and other schemes to sample up to three dozen uncertain parameters in the atmospheric (CAM) and sea ice (CICE) model components of CESM. The parameters sampled are related to many highly uncertain processes, including deep and shallow convection, boundary layer turbulence, cloud optical and microphysical properties, and sea ice albedo. An extensive ensemble database comprising more than 46,000 simulated climate-model-years of recent climate conditions has been assembled. This database is being used to train surrogate models of CESM responses and to perform statistical calibrations of the CAM and CICE models given observational data constraints. The calibrated models serve as a basis for propagating uncertainties forward through climate change simulations using a slab ocean model configuration of CESM. This procedure is being used to quantify the probability density function of equilibrium climate sensitivity accounting for uncertainties in climate model processes. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and was funded by the Uncertainty Quantification Strategic Initiative Laboratory Directed Research and Development Project at LLNL under project tracking code 10-SI-013. (LLNL-ABS-491765)
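
    The Latin Hypercube scheme mentioned above is easy to sketch: each parameter range is split into equal-probability strata, one sample is drawn per stratum, and the strata are shuffled independently per dimension. The parameter names and ranges below are placeholders, not the actual CAM/CICE parameters from the study.

```python
# Minimal Latin Hypercube sampler for perturbed-parameter ensembles.
import numpy as np

def latin_hypercube(n_samples, bounds, rng=None):
    rng = np.random.default_rng(rng)
    d = len(bounds)
    # One stratified uniform draw per (sample, dimension) cell...
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, d))) / n_samples
    # ...with an independent permutation of strata per dimension.
    for j in range(d):
        u[:, j] = u[rng.permutation(n_samples), j]
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    return lo + u * (hi - lo)

# Hypothetical uncertain parameters: convective timescale (s),
# ice albedo (-), cloud droplet radius (um).
bounds = [(1800.0, 14400.0), (0.4, 0.9), (4.0, 14.0)]
print(latin_hypercube(8, bounds, rng=42))
```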

  11. Quantification of uncertainties for application in detonation simulation

    NASA Astrophysics Data System (ADS)

    Zheng, Miao; Ma, Zhibo

    2016-06-01

    Numerical simulation has become an important means of designing detonation systems, and quantification of its uncertainty is necessary for reliability certification. In quantifying this uncertainty, the most important steps are to analyze how the uncertainties arise and propagate, and how the simulations progress from benchmark models to new models. Based on the practical needs of engineering and on verification & validation technology, a framework for the quantification of uncertainty (QU) is put forward for the case in which simulation is used for scientific prediction of a detonation system. An example is offered to describe the general idea of quantifying simulation uncertainties.

  12. Quantification of low levels of fluorine content in thin films

    NASA Astrophysics Data System (ADS)

    Ferrer, F. J.; Gil-Rostra, J.; Terriza, A.; Rey, G.; Jiménez, C.; García-López, J.; Yubero, F.

    2012-03-01

    Fluorine quantification in thin film samples containing different amounts of fluorine atoms was accomplished by combining proton-Rutherford Backscattering Spectrometry (p-RBS) and proton induced gamma-ray emission (PIGE) using proton beams of 1550 and 2330 keV for p-RBS and PIGE measurements, respectively. The capabilities of the proposed quantification method are illustrated with examples of the analysis of a series of samples of fluorine-doped tin oxides, fluorinated silica, and fluorinated diamond-like carbon films. It is shown that this procedure allows the quantification of F contents as low as 1 at.% in thin films with thicknesses in the 100-400 nm range.

  13. Wireless accelerometer reflex quantification system characterizing response and latency.

    PubMed

    LeMoyne, Robert; Coroian, Cristian; Mastroianni, Timothy

    2009-01-01

    The evaluation of the deep tendon reflex is a standard aspect of a neurological evaluation and is frequently evoked through the patellar tendon reflex. Important features of the reflex are response and latency, which provide insight into the status of peripheral neuropathy and upper motor neuron syndrome. A wireless accelerometer reflex quantification system has been developed, tested, and evaluated. The reflex input is derived from a potential energy setting. Wireless accelerometers characterize the reflex hammer strike and reflex response acceleration waveforms, enabling the quantification of reflex response and latency. Spectral analysis of the reflex response acceleration waveform elucidates the frequency domain, opening the potential for new reflex classification metrics. The wireless accelerometer reflex quantification system yields accurate and consistent quantification of reflex response and latency.
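
    A minimal sketch of how latency could be extracted from two accelerometer traces is shown below, using a simple threshold-over-baseline onset detector on synthetic signals. The sampling rate, threshold factor, and waveforms are assumptions for illustration, not the system's actual processing.

```python
# Sketch: latency between hammer-strike and leg-response onsets.
import numpy as np

def onset_index(signal, fs, k=5.0, baseline_s=0.1):
    """First index where |signal| exceeds k x std of the initial baseline."""
    n0 = int(baseline_s * fs)
    thresh = k * np.std(signal[:n0])
    above = np.nonzero(np.abs(signal) > thresh)[0]
    return above[0] if above.size else None

fs = 1000.0                        # assumed sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(1)
# Synthetic data: strike at 0.20 s, response at 0.23 s (30 ms latency).
hammer = 0.01 * rng.standard_normal(t.size)
hammer[t > 0.20] += 3.0 * np.exp(-(t[t > 0.20] - 0.20) / 0.01)
leg = 0.01 * rng.standard_normal(t.size)
leg[t > 0.23] += 1.5 * np.sin(2 * np.pi * 8 * (t[t > 0.23] - 0.23))

i_strike, i_resp = onset_index(hammer, fs), onset_index(leg, fs)
print(f"latency = {(i_resp - i_strike) / fs * 1e3:.1f} ms")
```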

  14. Recent application of quantification II in Japanese medical research.

    PubMed Central

    Suzuki, T; Kudo, A

    1979-01-01

    Hayashi's Quantification II is a method of multivariate discriminant analysis for handling attribute data as predictor variables. It is very useful in medical research for estimation, diagnosis, prognosis, evaluation of epidemiological factors, and other problems based on a multiplicity of attribute data. In Japan, this method is so well known that most computer program packages include the Hayashi Quantification, but the method still seems unfamiliar to researchers outside Japan. In view of this situation, we introduce 19 selected articles representing recent applications of Quantification II in Japanese medical research. In reviewing these papers, special mention is made of how far the researchers were satisfied with the findings provided by the method. At the same time, some recommendations are made about terminology and program packages. A brief discussion of the background of the quantification methods is also given, with special reference to the Behaviormetric Society of Japan. PMID:540587

  15. Multiphysics modeling and uncertainty quantification for an active composite reflector

    NASA Astrophysics Data System (ADS)

    Peterson, Lee D.; Bradford, S. C.; Schiermeier, John E.; Agnes, Gregory S.; Basinger, Scott A.

    2013-09-01

    A multiphysics, high resolution simulation of an actively controlled, composite reflector panel is developed to extrapolate from ground test results to flight performance. The subject test article has previously demonstrated sub-micron corrected shape under a controlled laboratory thermal load. This paper develops a model of the on-orbit performance of the panel under realistic thermal loads, with an active heater control system, and performs an uncertainty quantification of the predicted response. The primary contribution of this paper is the first reported application of the Sandia-developed Sierra mechanics simulation tools to a spacecraft multiphysics simulation of a closed-loop system, including uncertainty quantification. The simulation was developed with sufficient resolution to capture the residual panel shape error that remains after the thermal and mechanical control loops are closed. An uncertainty quantification analysis was performed to assess the predicted tolerance in the closed-loop wavefront error. Key tools used for the uncertainty quantification are also described.

  16. A quantification model for the structure of clay materials.

    PubMed

    Tang, Liansheng; Sang, Haitao; Chen, Haokun; Sun, Yinlei; Zhang, Longjian

    2016-07-04

    In this paper, the quantification of clay structure is explicitly explained, and the approach and goals of quantification are discussed. The authors consider that the purpose of quantifying clay structure is to determine parameters that can quantitatively characterize the impact of clay structure on macro-mechanical behaviour. According to system theory and the law of energy conservation, a quantification model for the structure characteristics of clay materials is established, three quantitative parameters (i.e., deformation structure potential, strength structure potential and comprehensive structure potential) are proposed, and the corresponding tests are conducted. The experimental results show that these quantitative parameters accurately reflect, respectively, the influence of clay structure on deformation behaviour, its influence on strength behaviour, and the relative magnitude of the structural influence captured by the first two parameters. These quantitative parameters have explicit mechanical meanings and can be used to characterize the structural influences of clay on its mechanical behaviour.

  17. Software-assisted serum metabolite quantification using NMR.

    PubMed

    Jung, Young-Sang; Hyeon, Jin-Seong; Hwang, Geum-Sook

    2016-08-31

    The goal of metabolomics is to analyze a whole metabolome under a given set of conditions, so accurate and reliable quantitation of metabolites is crucial. Absolute concentration is more valuable than relative concentration; however, the methods most commonly used in NMR-based serum metabolic profiling, bin-based and full-data-point peak quantification, provide only relative concentration levels of metabolites and are not reliable when metabolite peaks overlap in a spectrum. In this study, we present the software-assisted serum metabolite quantification (SASMeQ) method, which allows us to identify and quantify metabolites in NMR spectra using Chenomx software. This software uses the ERETIC2 utility from TopSpin to add a digitally synthesized peak to a spectrum. The SASMeQ method will advance NMR-based serum metabolic profiling by providing an accurate and reliable method for absolute quantification that is superior to bin-based quantification. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Sentinel Lymph Node Biopsy: Quantification of Lymphedema Risk Reduction

    DTIC Science & Technology

    2006-10-01

    Sentinel Lymph Node Biopsy: Quantification of Lymphedema Risk Reduction. Principal Investigator: Andrea L. Cheville, M.D. Contracting Organization: University of... Grant Number: DAMD17-00-1-0649. Abstract: Lymphedema is a common complication of primary breast cancer therapy. It

  19. A regularized method for peptide quantification.

    PubMed

    Yang, Chao; Yang, Can; Yu, Weichuan

    2010-05-07

    Peptide abundance estimation is generally the first step in protein quantification. In peptide abundance estimation, peptide overlap and peak intensity variation are two key challenges. The main objective of this paper is to estimate peptide abundance by taking advantage of the peptide isotopic distribution and the smoothness of the peptide elution profile. Our method solves the peptide overlap problem and provides a way to control the variance of the estimate. We compare our method with a commonly used method on simulated data sets and on two real data sets of standard protein mixtures. The results show that our method achieves more accurate estimation of peptide abundance on different samples. Our method includes a variance-related parameter. Considering the well-known trade-off between the variance and the bias of estimation, one should not focus solely on reducing the variance in real applications. A suggestion for parameter selection is given based on the discussion of variance and bias. Matlab source codes and detailed experimental results are available at http://bioinformatics.ust.hk/PeptideQuant/peptidequant.htm.
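
    The core idea, fitting overlapping isotopic envelopes with a variance-controlling regularization parameter, can be sketched as a ridge-regularized least-squares problem. The templates, noise level, and regularization weight below are invented for illustration and stand in for the paper's actual model.

```python
# Sketch: resolve two overlapping peptide isotopic envelopes by
# regularized least squares, x_hat = argmin ||A x - y||^2 + lam ||x||^2.
# The ridge weight lam plays the role of the variance-related parameter.
import numpy as np

def isotope_template(mono_bin, ratios, n_bins):
    """Unit-area isotopic envelope starting at a given m/z bin."""
    t = np.zeros(n_bins)
    for i, r in enumerate(ratios):
        t[mono_bin + i] = r
    return t / t.sum()

n_bins = 20
# Hypothetical envelopes whose isotopic peaks partially overlap.
A = np.column_stack([
    isotope_template(4, [1.0, 0.8, 0.4, 0.15], n_bins),
    isotope_template(6, [1.0, 0.9, 0.5, 0.2], n_bins),
])

rng = np.random.default_rng(0)
x_true = np.array([500.0, 200.0])
y = A @ x_true + rng.normal(0, 2.0, n_bins)   # noisy observed spectrum

lam = 1e-3
x_hat = np.linalg.solve(A.T @ A + lam * np.eye(2), A.T @ y)
print("true:", x_true, "estimated:", np.round(x_hat, 1))
```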

  20. Quantification of perceived macro-uniformity

    NASA Astrophysics Data System (ADS)

    Lee, Ki-Youn; Bang, Yousun; Choh, Heui-Keun

    2011-01-01

    Macro-uniformity refers to the subjective impression of overall uniformity in a print sample. Through the efforts of the INCITS W1.1 team, macro-uniformity has been categorized into five types of attributes: banding, streaks, mottle, gradients, and moiré patterns, and ruler samples have been generated with perceptual scales. The W1.1 macro-uniformity ruler is useful for judging levels of print defect, but it is not easy to reproduce samples having the same perceptual scales at different times in different places. An objective quantification method is more helpful and convenient for developers analyzing print quality and designing printing system components. In this paper, we propose a method for measuring perceived macro-uniformity for a given print using a flat-bed scanner. First, banding, 2D noise, and gradients are separately measured and converted to perceptual scales based on the subjective results for each attribute. The correlation coefficients between the measured values of the attributes and the perceptual scales are 0.92, 0.97, and 0.86, respectively. Another subjective test is performed to find the relationship between overall macro-uniformity and the three attributes. The weighting factors are obtained from the experimental results, and the final macro-uniformity grade is determined by the weighted sum of the attributes.

  1. Quantification of periodic breathing in premature infants

    PubMed Central

    Mohr, Mary A.; Fairchild, Karen D.; Patel, Manisha; Sinkin, Robert A.; Clark, Matthew T.; Moorman, J. Randall; Lake, Douglas E.; Kattwinkel, John; Delos, John B.

    2015-01-01

    Background Periodic breathing (PB), regular cycles of short apneic pauses and breaths, is common in newborn infants. To characterize normal and potentially pathologic PB, we used our automated apnea detection system and developed a novel method for quantifying PB. We identified a preterm infant who died of SIDS and who, on review of her breathing pattern while in the NICU, had exaggerated PB. Methods We analyzed the chest impedance signal for short apneic pauses and developed a wavelet transform method to identify repetitive 10–40 second cycles of apnea/breathing. Clinical validation was performed to distinguish PB from apnea clusters and determine the wavelet coefficient cutoff having optimum diagnostic utility. We applied this method to analyze the chest impedance signals throughout the entire NICU stays of all 70 infants born at 32 weeks’ gestation admitted over a two-and-a-half year period. This group includes an infant who died of SIDS and her twin. Results For infants of 32 weeks’ gestation, the fraction of time spent in PB peaks 7–14 days after birth at 6.5%. During that time the infant that died of SIDS spent 40% of each day in PB and her twin spent 15% of each day in PB. Conclusions This wavelet transform method allows quantification of normal and potentially pathologic PB in NICU patients. PMID:26012526
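
    A rough sketch of the wavelet idea follows: convolve a binary apnea indicator with unit-energy Morlet wavelets tuned to 10-40 s periods and flag epochs of high power. The synthetic signal, sampling rate, and power cutoff are assumptions for illustration; the paper chose its wavelet-coefficient cutoff by clinical validation.

```python
# Sketch: flag periodic-breathing epochs by wavelet power in the
# 10-40 s cycle band of a binary apnea signal.
import numpy as np

def morlet(fs, freq, n_cycles=6):
    """Unit-energy complex Morlet wavelet at one frequency."""
    sigma = n_cycles / (2 * np.pi * freq)          # time-domain width, s
    t = np.arange(-4 * sigma, 4 * sigma, 1 / fs)
    w = np.exp(2j * np.pi * freq * t) * np.exp(-t**2 / (2 * sigma**2))
    return w / np.sqrt(np.sum(np.abs(w) ** 2))

fs = 2.0                                  # assumed chest-signal rate, Hz
t = np.arange(0, 3600, 1 / fs)            # one synthetic hour
apnea = np.zeros_like(t)                  # 1 = apneic pause, 0 = breathing
bout = (t > 1200) & (t < 1800)            # synthetic PB bout, minutes 20-30
apnea[bout & (t % 20 < 5)] = 1.0          # 5 s pause every 20 s cycle

# Max wavelet power over the 10-40 s period band at each time point.
power = np.zeros_like(t)
for period in (10, 15, 20, 30, 40):
    w = morlet(fs, 1.0 / period)
    p = np.abs(np.convolve(apnea - apnea.mean(), w, mode="same")) ** 2
    power = np.maximum(power, p)

cutoff = 0.25 * power.max()               # illustrative cutoff only
print(f"fraction of hour flagged as PB: {(power > cutoff).mean():.3f}")
```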

  2. Quantification of biological aging in young adults.

    PubMed

    Belsky, Daniel W; Caspi, Avshalom; Houts, Renate; Cohen, Harvey J; Corcoran, David L; Danese, Andrea; Harrington, HonaLee; Israel, Salomon; Levine, Morgan E; Schaefer, Jonathan D; Sugden, Karen; Williams, Ben; Yashin, Anatoli I; Poulton, Richie; Moffitt, Terrie E

    2015-07-28

    Antiaging therapies show promise in model organism research. Translation to humans is needed to address the challenges of an aging global population. Interventions to slow human aging will need to be applied to still-young individuals. However, most human aging research examines older adults, many with chronic disease. As a result, little is known about aging in young humans. We studied aging in 954 young humans, the Dunedin Study birth cohort, tracking multiple biomarkers across three time points spanning their third and fourth decades of life. We developed and validated two methods by which aging can be measured in young adults, one cross-sectional and one longitudinal. Our longitudinal measure allows quantification of the pace of coordinated physiological deterioration across multiple organ systems (e.g., pulmonary, periodontal, cardiovascular, renal, hepatic, and immune function). We applied these methods to assess biological aging in young humans who had not yet developed age-related diseases. Young individuals of the same chronological age varied in their "biological aging" (declining integrity of multiple organ systems). Already, before midlife, individuals who were aging more rapidly were less physically able, showed cognitive decline and brain aging, self-reported worse health, and looked older. Measured biological aging in young adults can be used to identify causes of aging and evaluate rejuvenation therapies.

  3. Quantification of the vocal folds’ dynamic displacements

    NASA Astrophysics Data System (ADS)

    del Socorro Hernández-Montes, María; Muñoz, Silvino; De La Torre, Manuel; Flores, Mauricio; Pérez, Carlos; Mendoza-Santoyo, Fernando

    2016-05-01

    Fast dynamic data acquisition techniques are required to investigate the motional behavior of the vocal folds (VFs) when they are subjected to a steady air-flow through the trachea. High-speed digital holographic interferometry (DHI) is a non-invasive full-field-of-view technique that has proved its usefulness to study rapid and non-repetitive object movements. Hence it is an ideal technique used here to measure VF displacements and vibration patterns at 2000 fps. Analyses from a set of 200 displacement images showed that VFs’ vibration cycles are established along their width (y) and length (x). Furthermore, the maximum deformation for the right and left VFs’ area may be quantified from these images, which in itself represents an important result in the characterization of this structure. At a controlled air pressure, VF displacements fall within the range ~100-1740 nm, with a calculated precision and accuracy that yields a variation coefficient of 1.91%. High-speed acquisition of full-field images of VFs and their displacement quantification are on their own significant data in the study of their functional and physiological behavior since voice quality and production depend on how they vibrate, i.e. their displacement amplitude and frequency. Additionally, the use of high speed DHI avoids prolonged examinations and represents a significant scientific and technological alternative contribution in advancing the knowledge and working mechanisms of these tissues.

  4. Uncertainty quantification for systems of conservation laws

    SciTech Connect

    Poette, Gael; Despres, Bruno; Lucor, Didier

    2009-04-20

    Uncertainty quantification through stochastic spectral methods has recently been applied to several kinds of non-linear stochastic PDEs. In this paper, we introduce a formalism based on kinetic theory to tackle uncertain hyperbolic systems of conservation laws with Polynomial Chaos (PC) methods. The idea is to introduce a new variable, the entropic variable, in bijection with our vector of unknowns, which we expand on the polynomial basis: by performing a Galerkin projection, we obtain a deterministic system of conservation laws. We state several properties of this deterministic system for a general uncertain system of conservation laws. We then apply the method to the inviscid Burgers' equation with random initial conditions and present preliminary results for the Euler system. We systematically compare results from our new approach to results from the stochastic Galerkin method. In the vicinity of discontinuities, the new method bounds the oscillations due to the Gibbs phenomenon to a certain range through the entropy of the system, without the use of any adaptive random space discretizations. It is found to be more precise than the stochastic Galerkin method for smooth cases and, above all, for discontinuous cases.

  5. Quantification of moving target cyber defenses

    NASA Astrophysics Data System (ADS)

    Farris, Katheryn A.; Cybenko, George

    2015-05-01

    Current network and information systems are static, making it simple for attackers to maintain an advantage. Adaptive defenses, such as Moving Target Defenses (MTD), have been developed as potential "game-changers" in an effort to increase the attacker's workload. With many new methods being developed, it is difficult to accurately quantify and compare their overall costs and effectiveness. This paper compares the tradeoffs between current approaches to the quantification of MTDs. We present results from an expert opinion survey on quantifying the overall effectiveness and the upfront and operating costs of a selected set of MTD techniques. We find that gathering informed scientific opinions can be advantageous for evaluating such new technologies, as it offers a more comprehensive assessment. We end by presenting a coarse ordering of a set of MTD techniques from most to least dominant. Seven out of 23 methods rank as the more dominant techniques; five of these are techniques of either address space layout randomization or instruction set randomization, and the remaining two are applicable to software and computer platforms. Among the techniques that performed worst are those primarily aimed at network randomization.

  6. Quality Quantification of Evaluated Cross Section Covariances

    SciTech Connect

    Varet, S.; Dossantos-Uzarralde, P.

    2015-01-15

    Presently, several methods are used to estimate the covariance matrix of evaluated nuclear cross sections. Because the resulting covariance matrices can be different according to the method used and according to the assumptions of the method, we propose a general and objective approach to quantify the quality of the covariance estimation for evaluated cross sections. The first step consists in defining an objective criterion. The second step is computation of the criterion. In this paper the Kullback-Leibler distance is proposed for the quality quantification of a covariance matrix estimation and its inverse. It is based on the distance to the true covariance matrix. A method based on the bootstrap is presented for the estimation of this criterion, which can be applied with most methods for covariance matrix estimation and without the knowledge of the true covariance matrix. The full approach is illustrated on the {sup 85}Rb nucleus evaluations and the results are then used for a discussion on scoring and Monte Carlo approaches for covariance matrix estimation of the cross section evaluations.
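
    For zero-mean Gaussians, the Kullback-Leibler distance between an estimated and a reference covariance has a closed form, and the bootstrap idea can be sketched in a few lines. The 2x2 covariance and sample sizes below are synthetic stand-ins; the paper applies this machinery to 85Rb cross-section evaluations.

```python
# Sketch: score a covariance estimate by KL distance to a reference,
# and bootstrap that score when the true covariance is unknown.
import numpy as np

def kl_gauss(cov_est, cov_ref):
    """KL( N(0, cov_ref) || N(0, cov_est) ) for zero-mean Gaussians."""
    k = cov_ref.shape[0]
    inv_est = np.linalg.inv(cov_est)
    return 0.5 * (np.trace(inv_est @ cov_ref) - k
                  + np.log(np.linalg.det(cov_est) / np.linalg.det(cov_ref)))

rng = np.random.default_rng(0)
true_cov = np.array([[1.0, 0.6], [0.6, 2.0]])
data = rng.multivariate_normal([0.0, 0.0], true_cov, size=200)

# Bootstrap distribution of the KL score of the sample covariance.
scores = []
for _ in range(500):
    resampled = data[rng.integers(0, len(data), len(data))]
    scores.append(kl_gauss(np.cov(resampled.T), np.cov(data.T)))
print("bootstrap mean KL:", np.mean(scores),
      "95th percentile:", np.quantile(scores, 0.95))
```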

  7. Broadband acoustic quantification of stratified turbulence.

    PubMed

    Lavery, Andone C; Geyer, W Rockwell; Scully, Malcolm E

    2013-07-01

    High-frequency broadband acoustic scattering techniques have enabled the remote, high-resolution imaging and quantification of highly salt-stratified turbulence in an estuary. Turbulent salinity spectra in the stratified shear layer have been measured acoustically and by in situ turbulence sensors. The acoustic frequencies used span 120-600 kHz, which, for the highly stratified and dynamic estuarine environment, correspond to wavenumbers in the viscous-convective subrange (500-2500 m-1). The acoustically measured spectral levels are in close agreement with spectral levels measured with closely co-located micro-conductivity probes. The acoustically measured spectral shapes allow discrimination between scattering dominated by turbulent salinity microstructure and suspended sediments or swim-bladdered fish, the two primary sources of scattering observed in the estuary in addition to turbulent salinity microstructure. The direct comparison of salinity spectra inferred acoustically and by the in situ turbulence sensors provides a test of both the acoustic scattering model and the quantitative skill of acoustical remote sensing of turbulence dissipation in a strongly sheared and salt-stratified estuary.

  8. Uncertainty quantification in reacting flow modeling.

    SciTech Connect

    Le Maître, Olivier P.; Reagan, Matthew T.; Knio, Omar M.; Ghanem, Roger Georges; Najm, Habib N.

    2003-10-01

    Uncertainty quantification (UQ) in the computational modeling of physical systems is important for scientific investigation, engineering design, and model validation. In this work we develop techniques for UQ based on spectral and pseudo-spectral polynomial chaos (PC) expansions, and we apply these constructions in computations of reacting flow. We develop and compare both intrusive and non-intrusive spectral PC techniques. In the intrusive construction, the deterministic model equations are reformulated using Galerkin projection into a set of equations for the time evolution of the field variable PC expansion mode strengths. The mode strengths relate specific parametric uncertainties to their effects on model outputs. The non-intrusive construction uses sampling of many realizations of the original deterministic model, and projects the resulting statistics onto the PC modes, arriving at the PC expansions of the model outputs. We investigate and discuss the strengths and weaknesses of each approach, and identify their utility under different conditions. We also outline areas where ongoing and future research are needed to address challenges with both approaches.
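
    The non-intrusive construction can be sketched for a single standard-normal uncertain input: sample a model, then project the outputs onto probabilists' Hermite polynomials by Monte Carlo averaging. The toy model below stands in for an expensive reacting-flow solve.

```python
# Sketch of non-intrusive polynomial chaos: Monte Carlo projection of
# sampled model outputs onto a Hermite basis.
import math
import numpy as np
from numpy.polynomial.hermite_e import hermeval

def model(xi):
    # Placeholder for an expensive deterministic reacting-flow solve.
    return np.exp(0.3 * xi) + 0.1 * xi**2

order = 4
rng = np.random.default_rng(0)
xi = rng.standard_normal(20000)           # samples of the uncertain input
y = model(xi)

# Project: c_k = E[y He_k(xi)] / E[He_k^2], with E[He_k^2] = k!
coeffs = [np.mean(y * hermeval(xi, [0] * k + [1])) / math.factorial(k)
          for k in range(order + 1)]

# Mean and variance read directly off the PC coefficients.
mean_pc = coeffs[0]
var_pc = sum(c**2 * math.factorial(k) for k, c in enumerate(coeffs[1:], 1))
print("PC mean/var:", mean_pc, var_pc, "  MC mean/var:", y.mean(), y.var())
```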

  9. Quantification of ethnic differences in facial profile.

    PubMed

    Sheridan, C S; Thomas, C D; Clement, J G

    1997-03-01

    The concept of facial aesthetics is becoming increasingly important and with the expanding application of orthodontic, orthognathic, plastic and reconstructive techniques to patients from continually diversifying ethnic backgrounds, it is timely that more elaborate methods for the evaluation of facial form are adopted. The aim of the present study was to further investigate the use of Fourier shape analysis in the quantification of facial profile and to investigate differences between racial groups. One hundred and twenty-two undergraduate dental students were photographed and surveyed for information pertaining to ethnic origin. Student's t-tests revealed significant differences (p < 0.05) in higher-order (fourth- and above) Fourier harmonics between male and female profiles, as well as between intervention and non-intervention groups. A comparison of multiple means test revealed significant differences (p < 0.05) in the third-order Fourier harmonic (vertex projection) between the Asian group and three other groups--Anglo-Celtic, Eastern European and Western European. Differences correlated with convexity in the lower third of the face, which was demonstrated by Fourier reconstruction.
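
    One common variant of Fourier shape analysis treats the sampled (x, y) profile as a complex signal and reads harmonic amplitudes off its FFT; the sketch below illustrates this on a synthetic convex curve. This is a generic illustration, not necessarily the exact formulation used in the study.

```python
# Sketch: harmonic amplitudes of a 2D profile via the FFT of x + iy.
import numpy as np

def fourier_harmonics(x, y, n_harmonics=8):
    """Harmonic amplitudes of a sampled profile, position-invariant."""
    z = np.asarray(x, float) + 1j * np.asarray(y, float)
    z = z - z.mean()                      # remove position
    F = np.fft.fft(z) / len(z)
    amps = np.abs(F[1:n_harmonics + 1])
    return amps / amps[0]                 # normalize for scale invariance

# Synthetic "profile": a gently convex curve with a 3rd-harmonic bump.
s = np.linspace(0, np.pi, 128)
x = s
y = 0.2 * np.sin(s) + 0.05 * np.sin(3 * s)
print(np.round(fourier_harmonics(x, y), 3))
```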

  10. Detection of aneuploidies by paralogous sequence quantification

    PubMed Central

    Deutsch, S; Choudhury, U; Merla, G; Howald, C; Sylvan, A; Antonarakis, S

    2004-01-01

    Background: Chromosomal aneuploidies are a common cause of congenital disorders associated with cognitive impairment and multiple dysmorphic features. Pre-natal diagnosis of aneuploidies is most commonly performed by the karyotyping of fetal cells obtained by amniocentesis or chorionic villus sampling, but this method is labour-intensive and requires about 14 days to complete. Methods: We have developed a PCR based method for the detection of targeted chromosome number abnormalities termed paralogous sequence quantification (PSQ), based on the use of paralogous genes. Paralogous sequences have a high degree of sequence identity, but accumulate nucleotide substitutions in a locus specific manner. These sequence differences, which we term paralogous sequence mismatches (PSMs), can be quantified using pyrosequencing technology, to estimate the relative dosage between different chromosomes. We designed 10 assays for the detection of trisomies of chromosomes 13, 18, and 21 and sex chromosome aneuploidies. Results: We evaluated the performance of this method on 175 DNAs, highly enriched for abnormal samples. A correct and unambiguous diagnosis was given for 119 out of 120 aneuploid samples as well as for all the controls. One sample which gave an intermediate value for the chromosome 13 assays could not be diagnosed. Conclusions: Our data suggest that PSQ is a robust, easy-to-interpret, and easy-to-set-up method for the diagnosis of common aneuploidies, and can be performed in less than 48 h, representing a competitive alternative for widespread use in diagnostic laboratories. PMID:15591276
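
    The dosage logic behind PSQ can be illustrated with a toy classifier: the pyrosequencing peak fraction of the test-chromosome paralogue is near 0.50 (1:1) for disomy and shifts toward 0.60 (3:2) for trisomy of the test chromosome. The cutoffs below are illustrative, not the published assay thresholds.

```python
# Sketch of the PSQ dosage-ratio logic (illustrative cutoffs only).
def classify_psq(test_peak, reference_peak, lo=0.54, hi=0.57):
    """Classify a sample from paralogue peak heights at one PSM."""
    fraction = test_peak / (test_peak + reference_peak)
    if fraction < lo:
        return fraction, "disomy"
    if fraction > hi:
        return fraction, "trisomy"
    return fraction, "indeterminate"

for peaks in [(102, 98), (148, 101), (113, 96)]:
    print(peaks, "->", classify_psq(*peaks))
```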

  11. Mouse Polyomavirus: Propagation, Purification, Quantification, and Storage.

    PubMed

    Horníková, Lenka; Žíla, Vojtěch; Španielová, Hana; Forstová, Jitka

    2015-08-03

    Mouse polyomavirus (MPyV) is a member of the Polyomaviridae family, which comprises non-enveloped tumorigenic viruses infecting various vertebrates including humans and causing different pathogenic responses in the infected organisms. Despite the variations in host tropism and pathogenicity, the structure of the virions of these viruses is similar. The capsid, with icosahedral symmetry (diameter 45 nm, T = 7d), is composed of a shell of 72 capsomeres of structural proteins, arranged around the nucleocore containing approximately 5-kbp-long circular dsDNA in complex with cellular histones. MPyV has been one of the most studied polyomaviruses and serves as a model virus for studies of the mechanisms of cell transformation and virus trafficking, and for use in nanotechnology. It can be propagated in primary mouse cells (e.g., in whole mouse embryo cells) or in mouse epithelial or fibroblast cell lines. In this unit, propagation, purification, quantification, and storage of MPyV virions are presented.

  12. Legionella spp. isolation and quantification from greywater

    PubMed Central

    Rodríguez-Martínez, Sara; Blanky, Marina; Friedler, Eran; Halpern, Malka

    2015-01-01

    Legionella, an opportunistic human pathogen whose natural environment is water, is transmitted to humans through the inhalation of contaminated aerosols. Legionella has been isolated from a high diversity of water types. Due to its importance as a pathogen, two ISO protocols have been developed for its monitoring. However, these two protocols are not suitable for analyzing Legionella in greywater (GW). GW is domestic wastewater excluding the inputs from toilets and kitchen. It can serve as an alternative water source, mainly for toilet flushing and garden irrigation, both of which produce aerosols that pose a risk of Legionella infection. Hence, before reuse, GW has to be treated and its quality needs to be monitored. The difficulty of Legionella isolation from GW lies in the very high load of contaminant bacteria. Here we describe a modification of ISO protocol 11731:1998 that enables the isolation and quantification of Legionella from GW samples. The following modifications were made:•To enable isolation of Legionella from greywater, a pre-filtration step that removes coarse matter is recommended.•Legionella can be isolated after a combined acid-thermic treatment that eliminates the high load of contaminant bacteria in the sample. PMID:26740925

  13. Characterization and quantification of biochar alkalinity.

    PubMed

    Fidel, Rivka B; Laird, David A; Thompson, Michael L; Lawrinenko, Michael

    2017-01-01

    Lack of knowledge regarding the nature of biochar alkalis has hindered understanding of pH-sensitive biochar-soil interactions. Here we investigate the nature of biochar alkalinity and present a cohesive suite of methods for its quantification. Biochars produced from cellulose, corn stover and wood feedstocks had significant low-pKa organic structural (0.03-0.34 meq g-1), other organic (0-0.92 meq g-1), carbonate (0.02-1.5 meq g-1), and other inorganic (0-0.26 meq g-1) alkalinities. All four categories of biochar alkalinity contributed to total biochar alkalinity and are therefore relevant to pH-sensitive soil processes. Total biochar alkalinity was strongly correlated with base cation concentration, but biochar alkalinity was not a simple function of elemental composition, soluble ash, fixed carbon, or volatile matter content. More research is needed to characterize soluble biochar alkalis other than carbonates and to establish predictive relationships among biochar production parameters and the composition of biochar alkalis. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Numerical Uncertainty Quantification for Radiation Analysis Tools

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha

    2007-01-01

    Recently a new emphasis has been placed on engineering applications of space radiation analyses, and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used in radiation analysis for vehicle design and mission planning has begun. This paper addresses two sources of uncertainty in geometric discretization that must be quantified in order to understand the total uncertainty in estimating space radiation exposures. The first source is ray tracing: as the number of rays increases, the associated uncertainty decreases, but the computational expense increases, so a cost-benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose-versus-depth curves that is needed to determine the radiation exposure. The question, then, is how many shield thicknesses are needed to obtain an accurate result, so convergence testing is performed to quantify the uncertainty associated with interpolating over different shield thickness spatial grids.

  16. Uncertainty Quantification in High Throughput Screening ...

    EPA Pesticide Factsheets

    Using uncertainty quantification, we aim to improve the quality of modeling data from high throughput screening assays for use in risk assessment. ToxCast is a large-scale screening program that analyzes thousands of chemicals using over 800 assays representing hundreds of biochemical and cellular processes, including endocrine disruption, cytotoxicity, and zebrafish development. Over 2.6 million concentration response curves are fit to models to extract parameters related to potency and efficacy. Models built on ToxCast results are being used to rank and prioritize the toxicological risk of tested chemicals and to predict the toxicity of tens of thousands of chemicals not yet tested in vivo. However, the data size also presents challenges. When fitting the data, the choice of models, model selection strategy, and hit call criteria must reflect the need for computational efficiency and robustness, requiring hard and somewhat arbitrary cutoffs. When coupled with unavoidable noise in the experimental concentration response data, these hard cutoffs cause uncertainty in model parameters and the hit call itself. The uncertainty will then propagate through all of the models built on the data. Left unquantified, this uncertainty makes it difficult to fully interpret the data for risk assessment. We used bootstrap resampling methods to quantify the uncertainty in fitting models to the concentration response data. Bootstrap resampling determines confidence intervals for
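
    A sketch of the bootstrap procedure the abstract describes: resample the concentration-response points with replacement, refit a Hill model each time, and read a confidence interval off the resampled potency (AC50) values. The data, model form, and bounds below are synthetic stand-ins, not the actual ToxCast pipeline.

```python
# Sketch: bootstrap confidence interval on AC50 from a Hill-curve fit.
import numpy as np
from scipy.optimize import curve_fit

def hill(c, top, ac50, n):
    """Hill concentration-response curve."""
    return top / (1.0 + (ac50 / c) ** n)

rng = np.random.default_rng(0)
conc = np.logspace(-2, 2, 9)                        # uM, synthetic design
resp = hill(conc, 100.0, 1.5, 1.2) + rng.normal(0, 8.0, conc.size)

ac50_boot = []
for _ in range(1000):
    idx = rng.integers(0, conc.size, conc.size)     # resample points
    try:
        popt, _ = curve_fit(hill, conc[idx], resp[idx], p0=[100.0, 1.0, 1.0],
                            bounds=([0.0, 1e-3, 0.1], [500.0, 1e3, 10.0]))
        ac50_boot.append(popt[1])
    except RuntimeError:                            # non-converged refit
        continue

lo, hi = np.percentile(ac50_boot, [2.5, 97.5])
print(f"AC50 95% bootstrap interval: [{lo:.2f}, {hi:.2f}] uM")
```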

  17. Uncertainty Quantification of Modelling of Equiaxed Solidification

    NASA Astrophysics Data System (ADS)

    Fezi, K.; Krane, M. J. M.

    2016-07-01

    Numerical simulations of metal alloy solidification are used to gain insight into physical phenomena that cannot be observed experimentally. Validation of such models has often been done through comparison to sparse experimental data, where agreement can be misinterpreted due to both model and experimental uncertainty. Uncertainty quantification (UQ) and sensitivity analysis are performed on a transient model of solidification of Al-4.5 wt.% Cu in a rectangular cavity, with equiaxed (grain refined) solidification morphology. This model solves equations for momentum, temperature, and species conservation; UQ and sensitivity analysis are performed for the degree of macrosegregation. A Smolyak sparse grid algorithm is used to select input values to construct a response surface fit to model outputs. The response surface is then used as a surrogate for the solidification model to determine the sensitivities and probability density functions of the model outputs. Uncertain model inputs of interest include the secondary dendrite arm spacing, equiaxed particle size, and fraction solid at which the rigid mushy zone forms. A similar analysis was also performed on a transient model of direct chill casting of the same alloy.

  18. Raman spectroscopic quantification of milk powder constituents.

    PubMed

    McGoverin, C M; Clark, A S S; Holroyd, S E; Gordon, K C

    2010-07-12

    Raman spectroscopy has significant potential for the quantification of food products. Milk powder is an important foodstuff and ingredient that is produced on a large scale (over 20 million tonnes per annum). Raman spectroscopy, unlike near- and mid-infrared spectroscopies, has not been used extensively to quantify milk powder constituents. The effect of sample presentation on spectroscopic calibrations of protein and fat for 136 New Zealand milk powders was assessed using Raman spectroscopy. Prediction models were produced to quantify a protein concentration range of 32.19-37.65% w/w for skim milk powder, and a protein concentration range of 23.34-25.02% w/w and a fat concentration range of 26.26-29.68% w/w for whole milk powder (ratios of prediction to deviation exceeded 2.6, with one exception). The resultant calibrations were not influenced by sample orientation; the sample temperature during data collection did affect the calibrations. Calcium fortification in the form of calcium carbonate was identified within a sub-set of samples, reinforcing the efficacy of Raman spectroscopy for identifying both crystalline and non-crystalline constituents within milk powder.

  19. Classification and quantification of leaf curvature

    PubMed Central

    Liu, Zhongyuan; Jia, Liguo; Mao, Yanfei; He, Yuke

    2010-01-01

    Various mutants of Arabidopsis thaliana deficient in polarity, cell division, and auxin response are characterized by certain types of leaf curvature. However, comparison of curvature for clarification of gene function can be difficult without a quantitative measurement of curvature. Here, a novel method for classification and quantification of leaf curvature is reported. Twenty-two mutant alleles from Arabidopsis mutants and transgenic lines deficient in leaf flatness were selected. The mutants were classified according to the direction, axis, position, and extent of leaf curvature. Based on a global measure of whole leaves and a local measure of four regions in the leaves, the curvature index (CI) was proposed to quantify the leaf curvature. The CI values accounted for the direction, axis, position, and extent of leaf curvature in all of the Arabidopsis mutants grown in growth chambers. Comparison of CI values between mutants reveals the spatial and temporal variations of leaf curvature, indicating the strength of the mutant alleles and the activities of the corresponding genes. Using the curvature indices, the extent of curvature in a complicated genetic background becomes quantitative and comparable, thus providing a useful tool for defining the genetic components of leaf development and for breeding new varieties with leaf curvature desirable for the efficient capture of sunlight for photosynthesis and high yields. PMID:20400533

  20. Quantification of bromophenols in Islay whiskies.

    PubMed

    Bendig, Paul; Lehnert, Katja; Vetter, Walter

    2014-04-02

    Two single malt whiskies from the Scottish island of Islay, Laphroaig and Lagavulin, are characterized by an iodine-like flavor associated with marine environments. In this study we investigated whether this flavor impression could be due to bromophenols, which are character-impact compounds of marine fish and shrimps. We developed a method suited to the determination of dibromo- and tribromophenols in whisky. Aliquots were O-acetylated, and quantification was carried out by gas chromatography with electron-capture negative ion mass spectrometry (GC/ECNI-MS). Both Islay whiskies contained more than 400 ng/L bromophenols, with 2,6-dibromophenol being the most relevant homologue (>300 ng/L in each). These concentrations are at least 1 order of magnitude higher than the taste threshold of 2,6-dibromophenol in water. A third Islay whisky, Bowmore, contained ∼100 ng/L bromophenols, while seventeen other whiskies from other regions of Scotland as well as from the USA, Ireland, and Germany contained at least 1 order of magnitude less than the two whiskies with the marine taste. Accordingly, bromophenols may contribute to the marine flavor and taste of Laphroaig and Lagavulin.

  1. Quantification of furandiones in ambient aerosol

    NASA Astrophysics Data System (ADS)

    Al-Naiema, Ibrahim M.; Roppo, Hannah M.; Stone, Elizabeth A.

    2017-03-01

    Furandiones are products of the photooxidation of anthropogenic volatile organic compounds (VOCs), like toluene, and contribute to secondary organic aerosol (SOA). Because few molecular tracers of anthropogenic SOA are used to assess this source in ambient aerosol, developing a quantification method for furandiones is of great importance. In this study, we developed a direct and highly sensitive gas chromatography-mass spectrometry method for the quantitative analysis of furandiones in fine particulate matter that is largely free from interference by structurally related dicarboxylic acids. Our application of this method in Iowa City, IA provides the first ambient measurements of four furandiones: 2,5-furandione, 3-methyl-2,5-furandione, dihydro-2,5-furandione, and dihydro-3-methyl-2,5-furandione. Furandiones were detected in all collected samples at a daily average concentration of 9.1 ± 3.8 ng m-3. The developed method allows for the accurate measurement of furandione concentrations in ambient aerosol, which will support future evaluation of these compounds as tracers for anthropogenic SOA and assessment of their potential health impacts.

  3. Uncertainty quantification in flood risk assessment

    NASA Astrophysics Data System (ADS)

    Blöschl, Günter; Hall, Julia; Kiss, Andrea; Parajka, Juraj; Perdigão, Rui A. P.; Rogger, Magdalena; Salinas, José Luis; Viglione, Alberto

    2017-04-01

    Uncertainty is inherent to flood risk assessments because of the complexity of the human-water system, which is characterised by nonlinearities and interdependencies, because of limited knowledge about system properties and because of cognitive biases in human perception and decision-making. On top of the uncertainty associated with the assessment of the existing risk to extreme events, additional uncertainty arises because of temporal changes in the system due to climate change, modifications of the environment, population growth and the associated increase in assets. Novel risk assessment concepts are needed that take into account all these sources of uncertainty. They should be based on the understanding of how flood extremes are generated and how they change over time. They should also account for the dynamics of risk perception of decision makers and population in the floodplains. In this talk we discuss these novel risk assessment concepts through examples from Flood Frequency Hydrology, Socio-Hydrology and Predictions Under Change. We believe that uncertainty quantification in flood risk assessment should lead to a robust approach of integrated flood risk management aiming at enhancing resilience rather than searching for optimal defense strategies.

  4. Comparison of analysis methods for airway quantification

    NASA Astrophysics Data System (ADS)

    Odry, Benjamin L.; Kiraly, Atilla P.; Novak, Carol L.; Naidich, David P.

    2012-03-01

    Diseased airways have been recognized for several years as a possible contributing factor to airflow limitation in Chronic Obstructive Pulmonary Disease (COPD). Quantification of disease severity through the evaluation of airway dimensions - wall thickness and lumen diameter - has gained increased attention, thanks to the availability of multi-slice computed tomography (CT). Novel approaches have focused on automated methods of measurement as a faster and more objective means than the visual assessment routinely employed in the clinic. Since the Full-Width Half-Maximum (FWHM) method of airway measurement was introduced two decades ago [1], several new techniques for quantifying airways have been detailed in the literature, but no approach has truly become a standard for such analysis. Our own research group has presented two alternative approaches for determining airway dimensions, one involving a minimum path and the other active contours [2, 3]. With an increasing number of techniques dedicated to the same goal, we decided to take a step back and analyze the differences between these methods. We consequently put to the test our two methods of analysis and the FWHM approach. We first measured a set of 5 airways from a phantom of known dimensions. Then we compared measurements from the three methods to those of two independent readers, performed on 35 airways in 5 patients. We elaborate on the differences of each approach and draw conclusions on which could be defined as the best one.

  5. Uncertainty quantification in volumetric Particle Image Velocimetry

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Sayantan; Charonko, John; Vlachos, Pavlos

    2016-11-01

    Particle Image Velocimetry (PIV) uncertainty quantification is challenging due to coupled sources of elemental uncertainty and complex data reduction procedures in the measurement chain. Recent developments in this field have led to uncertainty estimation methods for planar PIV, but no framework exists for three-dimensional volumetric PIV. In volumetric PIV the measurement uncertainty is a function of the reconstructed three-dimensional particle locations, which in turn are very sensitive to the accuracy of the calibration mapping function. Furthermore, the iterative correction to the camera mapping function using triangulated particle locations in space (volumetric self-calibration) has its own associated uncertainty due to image noise and ghost particle reconstructions. Here we first quantify the uncertainty in the triangulated particle position, which is a function of the particle-detection and mapping-function uncertainties. The location uncertainty is then combined with the three-dimensional cross-correlation uncertainty, which is estimated as an extension of the 2D PIV uncertainty framework. Finally, the overall measurement uncertainty is quantified using an uncertainty propagation equation. The framework is tested with both simulated and experimental cases. For the simulated cases, the variation of the estimated uncertainty with the elemental volumetric PIV error sources is also evaluated. The results show reasonable prediction of standard uncertainty with good coverage.
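
    As a minimal sketch of the final propagation step, and assuming independent error sources, the position uncertainty can be mapped through the local displacement gradient and combined in quadrature with the cross-correlation uncertainty; all numbers below are placeholders, not values from the paper.

```python
# Sketch: quadrature combination of two elemental uncertainty sources
# into a per-component displacement uncertainty (placeholder values).
import numpy as np

sigma_pos = 0.12     # triangulated particle position uncertainty, voxels
grad_u = 0.05        # local displacement gradient, 1/voxel
sigma_corr = 0.09    # 3D cross-correlation uncertainty, voxels

# Position error enters the displacement through the local gradient;
# the two contributions are assumed independent here.
sigma_u = np.sqrt((grad_u * sigma_pos) ** 2 + sigma_corr ** 2)
print(f"combined displacement uncertainty: {sigma_u:.3f} voxels")
```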

  6. Quantification of cocontraction in spastic cerebral palsy.

    PubMed

    Ikeda, A J; Abel, M F; Granata, K P; Damiano, D L

    1998-12-01

    Antagonist cocontraction was hypothesized to limit net moment production in children with spastic diplegic cerebral palsy (CP). A second hypothesis was that cocontraction would vary with joint angle. To test these hypotheses, surface EMG activity and moment data from the quadriceps and hamstrings muscle groups were obtained from children with CP and compared with those of normally developing children during isometric flexion and extension exertions. A biomechanical model was developed to predict the individual moments produced by the agonist and antagonist muscle groups. Cocontraction was defined as the percentage of the net moment that was negated by the antagonist moment. The model performed well in predicting the measured moment, as illustrated by high R2 correlation coefficients and low prediction errors. The mean maximum moment produced was greater in normally developing children than in children with CP in both flexion and extension. Antagonist cocontraction during extension was greater in children with CP (12.2 +/- 14.4%) than in normally developing children (4.9 +/- 3.8%), implying that antagonist cocontraction is one explanation for the observed extension weakness in children with CP. However, during flexion, cocontraction was not significantly different between the two groups. Cocontraction differed significantly with joint angle in both groups during flexion, and in the normally developing children during extension. Although quantifying coactivation based on EMG activity alone produced similar results, it underestimated the effect of the antagonist. The quantification of cocontraction has potential applications for characterizing spastic muscle dysfunction and thereby improving clinical outcomes in children with CP.

  7. Quantification of isotopic turnover in agricultural systems

    NASA Astrophysics Data System (ADS)

    Braun, A.; Auerswald, K.; Schnyder, H.

    2012-04-01

    The isotopic turnover, which is a proxy for the metabolic rate, is gaining scientific importance. It is quantified for an increasing range of organisms, from microorganisms through plants to animals, including agricultural livestock. Additionally, the isotopic turnover is analyzed on different scales, from organs to organisms to ecosystems and even to the biosphere. In particular, the quantification of the isotopic turnover of specific tissues within the same organism, e.g. organs like liver and muscle and products like milk and faeces, has brought new insights that improve the understanding of nutrient cycles and fluxes. Thus, knowledge of the isotopic turnover is important in many areas, including physiology (e.g. milk synthesis), ecology (e.g. soil retention time of water), and medical science (e.g. cancer diagnosis). So far, the isotopic turnover is quantified by applying time-, cost- and expertise-intensive tracer experiments. Usually, this comprises two isotopic equilibration periods: a first equilibration period with a constant isotopic input signal is followed by a second equilibration period with a distinct constant isotopic input signal. This yields a smooth signal change from the first to the second signal in the object under consideration. This approach presents at least three major problems. (i) The input signals must be controlled isotopically, which is almost impossible in many realistic cases like free-ranging animals. (ii) Both equilibration periods may be very long, especially when the turnover rate of the object under consideration is very slow, which aggravates the first problem. (iii) The detection of small or slow pools is improved by large isotopic signal changes, but large isotopic changes also involve a considerable change in the input material; e.g. animal studies are usually carried out as diet-switch experiments, where the diet is switched between C3 and C4 plants, since C3 and C4 plants differ strongly in their isotopic signal.
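
    The diet-switch experiment sketched above is conventionally analyzed by fitting an exponential relaxation between the two equilibrium signals; a minimal sketch with synthetic data (all numbers invented):

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # After a switch between isotopically distinct diets (e.g. C3 -> C4), the
    # tissue signal relaxes from the old equilibrium d0 toward the new one d1
    # with turnover rate k.
    def diet_switch(t, d0, d1, k):
        return d1 + (d0 - d1) * np.exp(-k * t)

    t = np.array([0, 2, 5, 10, 20, 40, 80], dtype=float)   # days after switch
    delta = np.array([-27.0, -25.1, -22.4, -18.9, -14.9, -12.3, -11.6])  # per mil

    p, _ = curve_fit(diet_switch, t, delta, p0=(-27.0, -11.5, 0.1))
    print(f"turnover rate k = {p[2]:.3f}/day, half-life = {np.log(2)/p[2]:.1f} days")
    ```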

  8. Quantification and Propagation of Nuclear Data Uncertainties

    NASA Astrophysics Data System (ADS)

    Rising, Michael E.

    The use of several uncertainty quantification and propagation methodologies is investigated in the context of prompt fission neutron spectrum (PFNS) uncertainties and their impact on critical reactor assemblies. First, the first-order linear Kalman filter is used as a nuclear data evaluation and uncertainty quantification tool, combining available PFNS experimental data and a modified version of the Los Alamos (LA) model. The experimental covariance matrices, not generally given in the EXFOR database, are computed using the GMA methodology used by the IAEA to establish more appropriate correlations within each experiment. Then, using systematics relating the LA model parameters across a suite of isotopes, the PFNS for both the uranium and plutonium actinides are evaluated, leading to a new evaluation including cross-isotope correlations. Next, an alternative evaluation approach, the unified Monte Carlo (UMC) method, is studied for the evaluation of the PFNS for the n(0.5 MeV)+Pu-239 fission reaction and compared to the Kalman filter. The UMC approach to nuclear data evaluation is implemented in a variety of ways to test convergence toward the Kalman filter results and to determine the nonlinearities present in the LA model. Ultimately, the UMC approach is shown to be comparable to the Kalman filter for a realistic data evaluation of the PFNS and is capable of capturing the nonlinearities present in the LA model. Next, the impact that the PFNS uncertainties have on important critical assemblies is investigated. Using the PFNS covariance matrices in the ENDF/B-VII.1 nuclear data library, the uncertainties of the effective multiplication factor, leakage, and spectral indices of the Lady Godiva and Jezebel critical assemblies are quantified. Principal component analysis on the PFNS covariance matrices shows that only 2-3 principal components are needed to retain the PFNS uncertainties. Then, using the polynomial chaos expansion (PCE) on the uncertain output
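
    The principal component truncation mentioned above can be illustrated with a plain eigendecomposition; a generic sketch, not the author's code, with a toy covariance standing in for the ENDF/B-VII.1 PFNS matrices:

    ```python
    import numpy as np

    # Keep the leading eigenpairs of a covariance matrix until a target
    # fraction of the total variance is retained.
    def principal_components(cov, keep_fraction=0.99):
        vals, vecs = np.linalg.eigh(cov)           # symmetric covariance matrix
        order = np.argsort(vals)[::-1]             # largest eigenvalues first
        vals, vecs = vals[order], vecs[:, order]
        explained = np.cumsum(vals) / np.sum(vals)
        n_keep = int(np.searchsorted(explained, keep_fraction)) + 1
        return vals[:n_keep], vecs[:, :n_keep]

    rng = np.random.default_rng(0)
    A = rng.normal(size=(30, 3))                   # three underlying modes
    cov = A @ A.T + 1e-6 * np.eye(30)              # toy 30x30 covariance
    vals, vecs = principal_components(cov)
    print(f"{len(vals)} principal components retained")
    ```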

  9. Quantification of water in hydrous ringwoodite

    SciTech Connect

    Thomas, Sylvia -Monique; Jacobsen, Steven D.; Bina, Craig R.; Reichart, Patrick; Moser, Marcus; Hauri, Erik H.; Koch-Muller, Monika; Smyth, Joseph R.; Dollinger, Gunther

    2015-01-28

    Here, ringwoodite, γ-(Mg,Fe)2SiO4, in the lower 150 km of Earth’s mantle transition zone (410-660 km depth) can incorporate up to 1.5-2 wt% H2O as hydroxyl defects. We present a mineral-specific IR calibration for the absolute water content in hydrous ringwoodite by combining results from Raman spectroscopy, secondary ion mass spectrometry (SIMS) and proton-proton (pp) scattering on a suite of synthetic Mg- and Fe-bearing hydrous ringwoodites. H2O concentrations in the crystals studied here range from 0.46 to 1.7 wt% H2O (absolute methods), with the maximum H2O in the same sample giving 2.5 wt% by SIMS calibration. Anchoring our spectroscopic results to absolute H-atom concentrations from pp-scattering measurements, we report frequency-dependent integrated IR-absorption coefficients for water in ringwoodite ranging from 78180 to 158880 L mol-1 cm-2, depending upon the frequency of the OH absorption. We further report a linear wavenumber IR calibration for H2O quantification in hydrous ringwoodite across the Mg2SiO4-Fe2SiO4 solid solution, which will lead to more accurate estimations of the water content in both laboratory-grown and naturally occurring ringwoodites. Re-evaluation of the IR spectrum of a natural hydrous ringwoodite inclusion in diamond indicates that the crystal contains 1.43 ± 0.27 wt% H2O, thus confirming a near-maximum amount of H2O for this sample from the transition zone.
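
    Assuming the calibration is applied through the usual Beer-Lambert relation, a hedged sketch of converting an integrated OH absorbance into a water content; every numerical input below is a placeholder, not a value from the paper:

    ```python
    # c (mol/L) = A_int / (eps_int * t), then mol/L -> wt% via the molar mass
    # of H2O and the mineral density.
    M_H2O = 18.015  # g/mol

    def water_wt_percent(A_int, eps_int, thickness_cm, density_g_per_L):
        """A_int: integrated absorbance (cm^-2); eps_int: integrated absorption
        coefficient (L mol^-1 cm^-2); density: mineral density in g/L."""
        c_mol_per_L = A_int / (eps_int * thickness_cm)
        return 100.0 * c_mol_per_L * M_H2O / density_g_per_L

    # Placeholder: 50 um thick crystal, eps_int within the reported range
    print(water_wt_percent(A_int=2000.0, eps_int=1.0e5,
                           thickness_cm=0.005, density_g_per_L=3900.0))
    ```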

  10. GPU-accelerated voxelwise hepatic perfusion quantification

    NASA Astrophysics Data System (ADS)

    Wang, H.; Cao, Y.

    2012-09-01

    Voxelwise quantification of hepatic perfusion parameters from dynamic contrast enhanced (DCE) imaging greatly contributes to the assessment of liver function in response to radiation therapy. However, the efficiency of estimating hepatic perfusion parameters voxel-by-voxel in the whole liver using a dual-input single-compartment model requires substantial improvement for routine clinical applications. In this paper, we utilize the parallel computation power of a graphics processing unit (GPU) to accelerate the computation while maintaining the same accuracy as the conventional method. Using the compute unified device architecture (CUDA), the hepatic perfusion computations over multiple voxels are run across the GPU blocks concurrently but independently. At each voxel, the nonlinear least-squares fitting of the time series of the liver DCE data to the compartmental model is distributed to multiple threads in a block, and the computations for different time points are performed simultaneously and synchronously. An efficient fast Fourier transform in a block is also developed for the convolution computation in the model. The GPU computations of the voxel-by-voxel hepatic perfusion images are compared with those performed on the CPU using simulated DCE data and experimental DCE MR images from patients. The computation speed is improved by 30 times using an NVIDIA Tesla C2050 GPU compared to a 2.67 GHz Intel Xeon CPU processor. To obtain liver perfusion maps with 626,400 voxels in a patient's liver, the GPU-accelerated voxelwise computation takes 0.9 min, compared to 110 min on the CPU, while the perfusion parameters from the two methods differ by less than 10⁻⁶. The method will be useful for generating liver perfusion images in clinical settings.
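
    For orientation, a CPU reference sketch of the per-voxel fit that the paper parallelizes on the GPU; the dual-input single-compartment model is written generically, and the rate-constant names (k1a, k1p, k2) are assumptions, not the paper's notation:

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    # Liver uptake driven by arterial (Ca) and portal-venous (Cp) inputs:
    # C(t) = [k1a*Ca + k1p*Cp] convolved with exp(-k2*t).
    def model(params, t, Ca, Cp):
        k1a, k1p, k2 = params
        dt = t[1] - t[0]
        inflow = k1a * Ca + k1p * Cp
        kernel = np.exp(-k2 * t)
        return dt * np.convolve(inflow, kernel)[: len(t)]  # causal convolution

    def fit_voxel(t, Ca, Cp, C_voxel):
        res = least_squares(lambda p: model(p, t, Ca, Cp) - C_voxel,
                            x0=[0.1, 0.1, 0.1], bounds=(0, np.inf))
        return res.x

    t = np.linspace(0, 60, 121)
    Ca = np.exp(-((t - 10) / 4) ** 2)   # toy arterial input function
    Cp = np.exp(-((t - 18) / 6) ** 2)   # toy portal input function
    C = model([0.05, 0.15, 0.08], t, Ca, Cp)
    print(fit_voxel(t, Ca, Cp, C))      # recovers ~[0.05, 0.15, 0.08]
    ```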

  11. Quantification of asphaltene precipitation by scaling equation

    NASA Astrophysics Data System (ADS)

    Janier, Josefina Barnachea; Jalil, Mohamad Afzal B. Abd.; Samin, Mohamad Izhar B. Mohd; Karim, Samsul Ariffin B. A.

    2015-02-01

    Asphaltene precipitation from crude oil is one of the issues facing the oil industry. The deposition of asphaltene occurs during production, transportation and separation processes. The injection of carbon dioxide (CO2) during enhanced oil recovery (EOR) is believed to contribute much to the precipitation of asphaltene. Precipitation can be affected by changes in the temperature and pressure of the crude oil; however, a reduction in pressure contributes much more to the instability of asphaltene than temperature does. This paper discusses the quantification of precipitated asphaltene in crude oil at different high pressures and at constant temperature. The derived scaling equation was based on reservoir conditions, with variation in the amount of carbon dioxide (CO2) mixed with Dulang, a light crude oil sample used in the experiment, to assess the stability of asphaltene. A FluidEval PVT cell with a Solid Detection System (SDS) was the instrument used to gain experimental knowledge of the behavior of the fluid at reservoir conditions. Two conditions were followed in the conduct of the experiment: first, a 45 cc light crude oil sample was mixed with 18 cc (40%) of CO2; second, the same amount of crude oil sample was mixed with 27 cc (60%) of CO2. Results showed that the 45 cc crude oil sample combined with 18 cc (40%) of CO2 gas had a saturation pressure of 1498.37 psi and an asphaltene onset point of 1620 psi. For the same amount of crude oil combined with 27 cc (60%) of CO2, the saturation pressure was 2046.502 psi and the asphaltene onset point was 2230 psi. The derivation of the scaling equation considered reservoir temperature, pressure, bubble point pressure, mole percent of the precipitant (the injected CO2 gas), and the gas molecular weight. The scaling equation resulted in a third-order polynomial that can be used to quantify the amount of asphaltene in crude oil.
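
    A minimal sketch of fitting a third-order polynomial scaling equation of the kind the abstract describes; the scaled variable and the data points are invented placeholders:

    ```python
    import numpy as np

    x = np.array([0.2, 0.4, 0.6, 0.8, 1.0, 1.2])             # scaled variable
    w_asph = np.array([0.15, 0.34, 0.62, 1.01, 1.55, 2.24])  # wt% precipitated

    coeffs = np.polyfit(x, w_asph, deg=3)   # third-order polynomial fit
    print(np.poly1d(coeffs))
    print("predicted at x = 0.9:", np.polyval(coeffs, 0.9))
    ```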

  12. Quantification of water in hydrous ringwoodite

    DOE PAGES

    Thomas, Sylvia -Monique; Jacobsen, Steven D.; Bina, Craig R.; ...

    2015-01-28

    Here, ringwoodite, γ-(Mg,Fe)2SiO4, in the lower 150 km of Earth’s mantle transition zone (410-660 km depth) can incorporate up to 1.5-2 wt% H2O as hydroxyl defects. We present a mineral-specific IR calibration for the absolute water content in hydrous ringwoodite by combining results from Raman spectroscopy, secondary ion mass spectrometry (SIMS) and proton-proton (pp) scattering on a suite of synthetic Mg- and Fe-bearing hydrous ringwoodites. H2O concentrations in the crystals studied here range from 0.46 to 1.7 wt% H2O (absolute methods), with the maximum H2O in the same sample giving 2.5 wt% by SIMS calibration. Anchoring our spectroscopic results to absolute H-atom concentrations from pp-scattering measurements, we report frequency-dependent integrated IR-absorption coefficients for water in ringwoodite ranging from 78180 to 158880 L mol-1 cm-2, depending upon the frequency of the OH absorption. We further report a linear wavenumber IR calibration for H2O quantification in hydrous ringwoodite across the Mg2SiO4-Fe2SiO4 solid solution, which will lead to more accurate estimations of the water content in both laboratory-grown and naturally occurring ringwoodites. Re-evaluation of the IR spectrum of a natural hydrous ringwoodite inclusion in diamond indicates that the crystal contains 1.43 ± 0.27 wt% H2O, thus confirming a near-maximum amount of H2O for this sample from the transition zone.

  13. Quantification of wound oedema after dermatological surgery.

    PubMed

    McGrath, E J; Kersey, P

    2009-12-01

    Postoperative wound oedema causing increased suture tension is thought to be a possible cause of the scars known as suture marks. Quantification of such oedema has not previously been reported in the literature. Measures to accommodate wound oedema may include the adoption of alternative suture techniques and the use of more elastic suture materials. The aims of this study were to quantify wound expansion after skin surgery, to identify any contributory factors, and to determine the ability of eight commonly used skin suture materials to stretch under increasing tension. Forty consecutive adult patients attending a dermatology department for routine skin surgery in December 2002 were recruited. Details including body site, nature of the lesion excised and dimensions of the open wound were recorded. The distance between the entry and exit points of an untied suture at the time of skin surgery was measured and then remeasured 24 h postoperatively. The ability of eight different suture materials to stretch when an increasing force was applied was measured by hanging standard weights from the sutures and measuring the suture length for each force applied. Thirty-nine patients completed the study. All wounds expanded postoperatively, with a mean lateral expansion of 1.0 mm. There was a strong association between the width of the unsutured wound after excision and the subsequent wound expansion. Sutures commonly used in skin surgery were found to be relatively inelastic at forces under 0.2 kg. The monofilament Novofil (Davis & Geck, Danbury, CT, U.S.A.) exhibited the greatest degree of stretch of those tested. There is considerable oedema in the first 24 h after skin surgery, particularly with wider excisions. This needs to be considered when choosing suture materials and techniques to avoid excessive suture tension.

  14. Extended quantification of the generalized recurrence plot

    NASA Astrophysics Data System (ADS)

    Riedl, Maik; Marwan, Norbert; Kurths, Jürgen

    2016-04-01

    The generalized recurrence plot is a modern tool for the quantification of complex spatial patterns. Its application spans the analysis of trabecular bone structures, Turing structures, turbulent spatial plankton patterns, and fractals. It is also successfully applied to the description of spatio-temporal dynamics and the detection of regime shifts, such as in the complex Ginzburg-Landau equation. The recurrence-plot-based determinism is a central measure in this framework, quantifying the level of regularity in temporal and spatial structures. We extend this measure for the generalized recurrence plot by considering symmetry operations beyond simple translation. It is tested not only on two-dimensional regular patterns and noise but also on complex spatial patterns, reconstructing the parameter space of the complex Ginzburg-Landau equation. The extended version of the determinism yields values that are consistent with the original recurrence plot approach. Furthermore, the proposed method allows a split of the determinism into parts based on the laminar and non-laminar regions of the two-dimensional pattern of the complex Ginzburg-Landau equation. A comparison of these parts with a standard method of image classification, the co-occurrence matrix approach, shows differences especially in the description of patterns associated with turbulence. In that case, it seems that the extended version of the determinism allows a distinction between phase turbulence and defect turbulence by means of their spatial patterns. This ability of the proposed method promises new insights into other systems with turbulent dynamics in climatology, biology, ecology, and the social sciences, for example.
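
    For orientation, a sketch of the classical one-dimensional recurrence-plot determinism that the paper generalizes (the symmetry-extended spatial version is not reproduced here); DET is the fraction of recurrence points lying on diagonal lines of length at least lmin:

    ```python
    import numpy as np

    def determinism(x, eps, lmin=2):
        R = (np.abs(x[:, None] - x[None, :]) < eps).astype(int)  # recurrence matrix
        n = len(x)
        on_lines = 0
        for k in range(-(n - 1), n):                   # scan every diagonal
            run = 0
            for v in np.append(np.diagonal(R, k), 0):  # trailing 0 flushes the run
                if v:
                    run += 1
                else:
                    if run >= lmin:
                        on_lines += run
                    run = 0
        return on_lines / R.sum()

    x = np.sin(np.linspace(0, 20 * np.pi, 400))  # regular signal -> DET near 1
    print(determinism(x, eps=0.1))
    ```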

  15. Quantification of microvessels in canine lymph nodes.

    PubMed

    Tonar, Zbyněk; Egger, Gunter F; Witter, Kirsti; Wolfesberger, Birgitt

    2008-10-01

    Quantification of microvessels in tumors is mostly based on counts of vessel profiles in tumor hot spots. Drawbacks of this method include low reproducibility and large interobserver variance, mainly as a result of individual differences in sampling of image fields for analysis. Our aim was to test an unbiased method for quantifying microvessels in healthy and tumorous lymph nodes of dogs. The endothelium of blood vessels was detected in paraffin sections by a combination of immunohistochemistry (von Willebrand factor) and lectin histochemistry (wheat germ agglutinin) in comparison with detection of basal laminae by laminin immunohistochemistry or silver impregnation. Systematic uniform random sampling of 50 image fields was performed during photo-documentation. An unbiased counting frame (area 113,600 microm(2)) was applied to each micrograph. The total area sampled from each node was 5.68 mm(2). Vessel profiles were counted according to stereological counting rules. Inter- and intraobserver variabilities were tested. The application of systematic uniform random sampling was compared with the counting of vessel profiles in hot spots. The unbiased estimate of the number of vessel profiles per unit area ranged from 100.5 +/- 44.0/mm(2) to 442.6 +/- 102.5/mm(2) in contrast to 264 +/- 72.2/mm(2) to 771.0 +/- 108.2/mm(2) in hot spots. The advantage of using systematic uniform random sampling is its reproducibility, with reasonable interobserver and low intraobserver variance. This method also allows for the possibility of using archival material, because staining quality is not limiting as it is for image analysis, and artifacts can easily be excluded. However, this method is comparatively time-consuming.
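
    The density estimate itself reduces to simple arithmetic once profiles have been counted with the unbiased frame; the counts below are invented for illustration:

    ```python
    # Unbiased counting frame of 113,600 um^2, systematic uniform random fields.
    frame_area_mm2 = 113600 / 1e6        # frame area in mm^2
    counts = [12, 9, 15, 11, 8]          # vessel profiles per sampled field

    mean_count = sum(counts) / len(counts)
    density = mean_count / frame_area_mm2   # profiles per mm^2
    print(f"~{density:.0f} vessel profiles/mm^2")
    ```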

  16. GPU-accelerated voxelwise hepatic perfusion quantification.

    PubMed

    Wang, H; Cao, Y

    2012-09-07

    Voxelwise quantification of hepatic perfusion parameters from dynamic contrast enhanced (DCE) imaging greatly contributes to the assessment of liver function in response to radiation therapy. However, the efficiency of estimating hepatic perfusion parameters voxel-by-voxel in the whole liver using a dual-input single-compartment model requires substantial improvement for routine clinical applications. In this paper, we utilize the parallel computation power of a graphics processing unit (GPU) to accelerate the computation while maintaining the same accuracy as the conventional method. Using the compute unified device architecture (CUDA), the hepatic perfusion computations over multiple voxels are run across the GPU blocks concurrently but independently. At each voxel, the nonlinear least-squares fitting of the time series of the liver DCE data to the compartmental model is distributed to multiple threads in a block, and the computations for different time points are performed simultaneously and synchronously. An efficient fast Fourier transform in a block is also developed for the convolution computation in the model. The GPU computations of the voxel-by-voxel hepatic perfusion images are compared with those performed on the CPU using simulated DCE data and experimental DCE MR images from patients. The computation speed is improved by 30 times using an NVIDIA Tesla C2050 GPU compared to a 2.67 GHz Intel Xeon CPU processor. To obtain liver perfusion maps with 626,400 voxels in a patient's liver, the GPU-accelerated voxelwise computation takes 0.9 min, compared to 110 min on the CPU, while the perfusion parameters from the two methods differ by less than 10(-6). The method will be useful for generating liver perfusion images in clinical settings.

  17. Quantification and Reconstruction in Photoacoustic Tomography

    NASA Astrophysics Data System (ADS)

    Guo, Zijian

    Optical absorption is closely associated with many physiologically important parameters, such as the concentration and oxygen saturation of hemoglobin. Conventionally, accurate quantification in PAT requires knowledge of the optical fluence attenuation, acoustic pressure attenuation, and detection bandwidth. We circumvent this requirement by quantifying the optical absorption coefficients from the acoustic spectra of PA signals acquired at multiple optical wavelengths. We demonstrate the method using optical-resolution photoacoustic microscopy (OR-PAM) in the optical ballistic regime and acoustic-resolution photoacoustic microscopy (AR-PAM) in the optical diffusive regime. The data acquisition speed in photoacoustic computed tomography (PACT) is limited by the laser repetition rate and the number of parallel ultrasound detection channels. Reconstructing an image with fewer measurements can effectively accelerate the data acquisition and reduce the system cost. We adapted compressed sensing (CS) for the reconstruction in PACT. CS-based PACT was implemented as a non-linear conjugate gradient descent algorithm and tested with both phantom and in vivo experiments. Speckles have been considered ubiquitous in all scattering-based coherent imaging technologies. As a coherent imaging modality based on optical absorption, photoacoustic (PA) tomography (PAT) is generally devoid of speckles. PAT suppresses speckles by building up prominent boundary signals, via a mechanism similar to that of specular reflection. When imaging absorbing targets with smooth boundaries, the speckle visibility in PAT, defined as the ratio of the square root of the average power of the speckles to that of the boundaries, is inversely proportional to the square root of the absorber density. If the surfaces of the absorbing targets have uncorrelated height fluctuations, however, the boundary features may become fully developed speckles. The findings were validated by simulations.

  18. Bistatic Sonar and Quantification of Seafloor Processes

    NASA Astrophysics Data System (ADS)

    McCloghrie, P.; BLONDEL, P.; Pace, N. G.; Heald, G. J.; Brothers, R.

    2001-12-01

    Sonar has proved the best tool for investigation of seafloor processes. Calibrated sonars provide a wealth of quantitative information unattainable through other means, but are limited to backscattering geometries, where the acoustic source and receiver are on the same platform. Recent developments in acoustic theory and design have shown the advantage of bistatic instruments, where the source and receiver(s) are physically decoupled and can be anywhere in the water column. Although there are many theoretical studies of bistatic scattering, their applicability is limited by the very low number of actual experiments. Using the large tank facilities at the University of Bath, we are currently investigating the bistatic scattering from several realistic types of seabeds and different morphologies, at high frequency (250 kHz). These studies are used in conjunction with state-of-the-art acoustic models (Blondel et al., 2001), and compared with data from complementary sea trials (Pace et al., in prep.). The results show the huge potential of bistatic systems for accurate and detailed mapping of seafloor structures and their topography. We also demonstrate how the relative contributions of surface and volume processes to acoustic scattering change with the imaging geometry, and how this can be used to maximise the information gained during mapping. Versatile and efficient, bistatic systems can be deployed from surface vessels, ROVs or AUVs. These new tools can be used for Rapid Environmental Assessment, to study sediment transport and deposition, and to access the detailed morphology of the seabed and the near sub-surface. They can in particular be used for the investigation of the small-scale structure of sand ridges and ripples, the distribution of tidal or glacial deposits on the seabed, and the quantification of multi-scale surface roughness in sedimentary and non-sedimentary terrains alike. Crown Copyright 2001 DERA. Published with the permission of the Defence

  19. GPU-Accelerated Voxelwise Hepatic Perfusion Quantification

    PubMed Central

    Wang, H; Cao, Y

    2012-01-01

    Voxelwise quantification of hepatic perfusion parameters from dynamic contrast enhanced (DCE) imaging greatly contributes to the assessment of liver function in response to radiation therapy. However, the efficiency of estimating hepatic perfusion parameters voxel-by-voxel in the whole liver using a dual-input single-compartment model requires substantial improvement for routine clinical applications. In this paper, we utilize the parallel computation power of a graphics processing unit (GPU) to accelerate the computation while maintaining the same accuracy as the conventional method. Using CUDA-GPU, the hepatic perfusion computations over multiple voxels are run across the GPU blocks concurrently but independently. At each voxel, the non-linear least-squares fitting of the time series of the liver DCE data to the compartmental model is distributed to multiple threads in a block, and the computations for different time points are performed simultaneously and synchronously. An efficient fast Fourier transform in a block is also developed for the convolution computation in the model. The GPU computations of the voxel-by-voxel hepatic perfusion images are compared with those performed on the CPU using simulated DCE data and experimental DCE MR images from patients. The computation speed is improved by 30 times using an NVIDIA Tesla C2050 GPU compared to a 2.67 GHz Intel Xeon CPU processor. To obtain liver perfusion maps with 626,400 voxels in a patient’s liver, the GPU-accelerated voxelwise computation takes 0.9 min, compared to 110 min on the CPU, while the perfusion parameters from the two methods differ by less than 10⁻⁶. The method will be useful for generating liver perfusion images in clinical settings. PMID:22892645

  20. Uncertainty quantification for optical model parameters

    DOE PAGES

    Lovell, A. E.; Nunes, F. M.; Sarich, J.; ...

    2017-02-21

    Although uncertainty quantification has been making its way into nuclear theory, these methods have yet to be explored in the context of reaction theory. For example, it is well known that different parameterizations of the optical potential can result in different cross sections, but these differences have not been systematically studied and quantified. The purpose of our work is to investigate the uncertainties in nuclear reactions that result from fitting a given model to elastic-scattering data, as well as to study how these uncertainties propagate to the inelastic and transfer channels. We use statistical methods to determine a best fit and create corresponding 95% confidence bands. A simple model of the process is fit to elastic-scattering data and used to predict either inelastic or transfer cross sections. In this initial work, we assume that our model is correct, and the only uncertainties come from the variation of the fit parameters. Here, we study a number of reactions involving neutron and deuteron projectiles with energies in the range of 5–25 MeV/u, on targets with mass A=12–208. We investigate the correlations between the parameters in the fit. The case of deuterons on 12C is discussed in detail: the elastic-scattering fit and the prediction of 12C(d,p)13C transfer angular distributions, using both uncorrelated and correlated χ2 minimization functions. The general features for all cases are compiled in a systematic manner to identify trends. This work shows that, in many cases, the correlated χ2 functions (in comparison to the uncorrelated χ2 functions) provide a more natural parameterization of the process. These correlated functions do, however, produce broader confidence bands. Further optimization may require improvement in the models themselves and/or more information included in the fit.
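
    A sketch contrasting the correlated and uncorrelated chi-squared functions discussed above; the residual vector and covariance matrix are toy values:

    ```python
    import numpy as np

    def chi2_correlated(residuals, cov):
        # weights residuals by the inverse of the full covariance matrix
        return residuals @ np.linalg.solve(cov, residuals)

    def chi2_uncorrelated(residuals, cov):
        # keeps only the diagonal (independent-errors assumption)
        return np.sum(residuals**2 / np.diag(cov))

    r = np.array([0.5, 0.6])             # toy residuals at two angles
    cov = np.array([[0.25, 0.20],
                    [0.20, 0.25]])       # strongly correlated errors
    print(chi2_correlated(r, cov), chi2_uncorrelated(r, cov))
    ```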

  1. Automated and nonbiased regional quantification of functional neuroimaging data.

    PubMed

    Parker, Jason G; Speidel, Benjamin; Sherwood, Matthew S

    2015-02-01

    In the quantification of functional neuroimaging data, region-of-interest (ROI) analysis can be used to assess a variety of properties of the activation signal, but taken alone these properties are susceptible to noise and may fail to accurately describe overall regional involvement. Here, the authors present and evaluate an automated method for quantification and localization of functional neuroimaging data that combines multiple properties of the activation signal to generate rank-order lists of regional activation results. The proposed automated quantification method, referred to as neuroimaging results decomposition (NIRD), begins by decomposing an activation map into a hierarchical list of ROIs using a digital atlas. In an intermediate step, the ROIs are rank-ordered according to extent, mean intensity, and total intensity. A final rank-order list (NIRD average rank) is created by sorting the ROIs according to the average of their ranks from the intermediate step. The authors hypothesized that NIRD average rank would have improved regional quantification accuracy compared to all other quantitative metrics, including methods based on properties of statistical clusters. To test their hypothesis, NIRD rankings were directly compared to three common cluster-based methods using simulated fMRI data both with and without realistic head motion. For both the no-motion and motion datasets, an analysis of variance found that significant differences between the quantification methods existed (F = 64.8, p < 0.0001 for no motion; F = 55.2, p < 0.0001 for motion), and a post-hoc test found that NIRD average rank was the most accurate quantification method tested (p < 0.05 for both datasets). Furthermore, all variants of the NIRD method were found to be significantly more accurate than the cluster-based methods in all cases. These results confirm their hypothesis and demonstrate that the proposed NIRD methodology provides improved regional quantification accuracy compared to
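
    A sketch of the NIRD average-rank step as described above; the ROI names and values are hypothetical:

    ```python
    # Rank ROIs by extent, mean intensity, and total intensity, then sort by
    # the average of the three ranks (rank 1 = largest value).
    rois = {
        "precentral_gyrus": {"extent": 420, "mean": 3.1, "total": 1302.0},
        "sma":              {"extent": 130, "mean": 4.0, "total": 520.0},
        "cerebellum":       {"extent": 610, "mean": 2.2, "total": 1342.0},
    }

    def average_rank(rois):
        metrics = ["extent", "mean", "total"]
        ranks = {name: [] for name in rois}
        for m in metrics:
            ordered = sorted(rois, key=lambda n: rois[n][m], reverse=True)
            for rank, name in enumerate(ordered, start=1):
                ranks[name].append(rank)
        return sorted(rois, key=lambda n: sum(ranks[n]) / len(metrics))

    print(average_rank(rois))   # rank-order list, most involved region first
    ```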

  2. Quantitative Proteomics via High Resolution MS Quantification: Capabilities and Limitations.

    PubMed

    Higgs, Richard E; Butler, Jon P; Han, Bomie; Knierman, Michael D

    2013-01-01

    Recent improvements in the mass accuracy and resolution of mass spectrometers have led to renewed interest in label-free quantification using data from the primary mass spectrum (MS1) acquired from data-dependent proteomics experiments. The capacity for higher specificity quantification of peptides from samples enriched for proteins of biological interest offers distinct advantages for hypothesis generating experiments relative to immunoassay detection methods or prespecified peptide ions measured by multiple reaction monitoring (MRM) approaches. Here we describe an evaluation of different methods to post-process peptide level quantification information to support protein level inference. We characterize the methods by examining their ability to recover a known dilution of a standard protein in background matrices of varying complexity. Additionally, the MS1 quantification results are compared to a standard, targeted, MRM approach on the same samples under equivalent instrument conditions. We show the existence of multiple peptides with MS1 quantification sensitivity similar to the best MRM peptides for each of the background matrices studied. Based on these results we provide recommendations on preferred approaches to leveraging quantitative measurements of multiple peptides to improve protein level inference.

  3. Cytomegalovirus quantification: where to next in optimising patient management?

    PubMed

    Atkinson, Claire; Emery, Vincent C

    2011-08-01

    Over the years, quantification of human cytomegalovirus (HCMV) load in blood has become a mainstay of clinical management, helping to direct deployment of antiviral therapy, assess response to therapy and highlight cases of drug resistance. The review focuses on a brief historical perspective of HCMV quantification and the ways in which viral load is being used to improve patient management, drawing on the published literature and also personal experience at the Royal Free Hospital. Quantification of HCMV is essential for efficient patient management. The ability to use real-time quantitative PCR to drive pre-emptive therapy has improved patient management after transplantation, although the threshold viral loads for deployment differ between laboratories. The field would benefit from access to a universal standard for quantification. We see that HCMV quantification will continue to be central to delivering individualised patient management and facilitating multicentre trials of new antiviral agents and vaccines in a variety of clinical settings. Copyright © 2011 Elsevier B.V. All rights reserved.

  4. GMO quantification: valuable experience and insights for the future.

    PubMed

    Milavec, Mojca; Dobnik, David; Yang, Litao; Zhang, Dabing; Gruden, Kristina; Zel, Jana

    2014-10-01

    Cultivation and marketing of genetically modified organisms (GMOs) have been unevenly adopted worldwide. To facilitate international trade and to provide information to consumers, labelling requirements have been set up in many countries. Quantitative real-time polymerase chain reaction (qPCR) is currently the method of choice for detection, identification and quantification of GMOs. This has been critically assessed and the requirements for the method performance have been set. Nevertheless, there are challenges that should still be highlighted, such as measuring the quantity and quality of DNA, and determining the qPCR efficiency, possible sequence mismatches, characteristics of taxon-specific genes and appropriate units of measurement, as these remain potential sources of measurement uncertainty. To overcome these problems and to cope with the continuous increase in the number and variety of GMOs, new approaches are needed. Statistical strategies of quantification have already been proposed and expanded with the development of digital PCR. The first attempts have been made to use next-generation sequencing also for quantitative purposes, although accurate quantification of the contents of GMOs using this technology is still a challenge for the future, especially for mixed samples. New approaches are also needed for the quantification of stacks, and for the potential quantification of organisms produced by new plant breeding techniques.

  5. Absolute and relative quantification of RNA modifications via biosynthetic isotopomers

    PubMed Central

    Kellner, Stefanie; Ochel, Antonia; Thüring, Kathrin; Spenkuch, Felix; Neumann, Jennifer; Sharma, Sunny; Entian, Karl-Dieter; Schneider, Dirk; Helm, Mark

    2014-01-01

    In the resurging field of RNA modifications, quantification is a bottleneck blocking many exciting avenues. With currently over 150 known nucleoside alterations, detection and quantification methods must encompass multiple modifications for a comprehensive profile. LC–MS/MS approaches offer a perspective for comprehensive parallel quantification of all the various modifications found in total RNA of a given organism. By feeding 13C-glucose as sole carbon source, we have generated a stable isotope-labeled internal standard (SIL-IS) for bacterial RNA, which facilitates relative comparison of all modifications. While conventional SIL-IS approaches require the chemical synthesis of single modifications in weighable quantities, this SIL-IS consists of a nucleoside mixture covering all detectable RNA modifications of Escherichia coli, yet in small and initially unknown quantities. For absolute in addition to relative quantification, those quantities were determined by a combination of external calibration and sample spiking of the biosynthetic SIL-IS. For each nucleoside, we thus obtained a very robust relative response factor, which permits direct conversion of the MS signal to absolute amounts of substance. The application of the validated SIL-IS allowed highly precise quantification with standard deviations <2% during a 12-week period, and a linear dynamic range that was extended by two orders of magnitude. PMID:25129236
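
    The conversion from MS signal to an absolute amount via a relative response factor reduces to simple arithmetic; a hedged sketch with invented numbers:

    ```python
    # amount = (light/heavy signal ratio) * spiked SIL-IS amount / RRF, where
    # the RRF corrects for any response difference between analyte and standard.
    def absolute_amount(signal_light, signal_heavy, spike_amount_pmol, rrf=1.0):
        return (signal_light / signal_heavy) * spike_amount_pmol / rrf

    # e.g. a modified nucleoside at 2.3x the heavy-standard signal, 0.5 pmol spiked
    print(absolute_amount(2.3e6, 1.0e6, 0.5), "pmol")
    ```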

  6. Quantitative Proteomics via High Resolution MS Quantification: Capabilities and Limitations

    PubMed Central

    Higgs, Richard E.; Butler, Jon P.; Han, Bomie; Knierman, Michael D.

    2013-01-01

    Recent improvements in the mass accuracy and resolution of mass spectrometers have led to renewed interest in label-free quantification using data from the primary mass spectrum (MS1) acquired from data-dependent proteomics experiments. The capacity for higher specificity quantification of peptides from samples enriched for proteins of biological interest offers distinct advantages for hypothesis generating experiments relative to immunoassay detection methods or prespecified peptide ions measured by multiple reaction monitoring (MRM) approaches. Here we describe an evaluation of different methods to post-process peptide level quantification information to support protein level inference. We characterize the methods by examining their ability to recover a known dilution of a standard protein in background matrices of varying complexity. Additionally, the MS1 quantification results are compared to a standard, targeted, MRM approach on the same samples under equivalent instrument conditions. We show the existence of multiple peptides with MS1 quantification sensitivity similar to the best MRM peptides for each of the background matrices studied. Based on these results we provide recommendations on preferred approaches to leveraging quantitative measurements of multiple peptides to improve protein level inference. PMID:23710359

  7. Uncertainty Quantification Techniques of SCALE/TSUNAMI

    SciTech Connect

    Rearden, Bradley T; Mueller, Don

    2011-01-01

    The Standardized Computer Analysis for Licensing Evaluation (SCALE) code system developed at Oak Ridge National Laboratory (ORNL) includes Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI). The TSUNAMI code suite can quantify the predicted change in system responses, such as keff, reactivity differences, or ratios of fluxes or reaction rates, due to changes in the energy-dependent, nuclide-reaction-specific cross-section data. Where uncertainties in the neutron cross-section data are available, the sensitivity of the system to the cross-section data can be applied to propagate the uncertainties in the cross-section data to an uncertainty in the system response. Uncertainty quantification is useful for identifying potential sources of computational biases and highlighting parameters important to code validation. Traditional validation techniques often examine one or more average physical parameters to characterize a system and identify applicable benchmark experiments. However, with TSUNAMI correlation coefficients are developed by propagating the uncertainties in neutron cross-section data to uncertainties in the computed responses for experiments and safety applications through sensitivity coefficients. The bias in the experiments, as a function of their correlation coefficient with the intended application, is extrapolated to predict the bias and bias uncertainty in the application through trending analysis or generalized linear least squares techniques, often referred to as 'data adjustment.' Even with advanced tools to identify benchmark experiments, analysts occasionally find that the application models include some feature or material for which adequately similar benchmark experiments do not exist to support validation. For example, a criticality safety analyst may want to take credit for the presence of fission products in spent nuclear fuel. In such cases, analysts sometimes rely on 'expert judgment' to select an
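
    The propagation step described here is commonly written as the "sandwich rule"; a generic sketch, not the SCALE/TSUNAMI implementation, with illustrative numbers:

    ```python
    import numpy as np

    # Variance of a response R: var(R) = s^T C s, with s the sensitivity
    # coefficients of R to each cross section and C the cross-section
    # covariance matrix (relative quantities here).
    s = np.array([0.30, -0.12, 0.05])
    C = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.16]])

    var_R = s @ C @ s
    print(f"relative uncertainty in response: {np.sqrt(var_R):.3%}")
    ```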

  8. Uncertainty quantification for holographic interferometric images

    NASA Astrophysics Data System (ADS)

    Centauri, Laurie Ann

    Current comparison methods for experimental and simulated holographic interferometric images are qualitative in nature. Previous comparisons of holographic interferometric images with computational fluid dynamics (CFD) simulations for validation have been performed qualitatively, through visual comparison by a data analyst. By validating the experiments and CFD simulations in a quantifiable manner using a consistency analysis, the validation becomes a repeatable process that gives a consistency measure and a range of inputs over which the experiments and CFD simulations give consistent results. The uncertainty in four holographic interferometric experiments was quantified for use in a data collaboration with CFD simulations for the purpose of validation. The model uncertainty from image processing, the measurement uncertainty from experimental data variation, and the scenario uncertainty from the bias and parameter uncertainty were quantified. The scenario uncertainty was determined through comparison with an analytical solution at the helium inlet (height x = 0), including the uncertainty in the experimental parameters from historical weather data. The model uncertainty was calculated through a Box-Behnken sensitivity analysis of three image-processing code parameters. The measurement uncertainty was determined through a statistical analysis of the time-average and standard deviation of the interference fringe positions. An experimental design matrix of CFD simulations was performed by Weston Eldredge using a Box-Behnken design with helium velocity, temperature, and air co-flow velocity as parameters, to provide simulated measurements for the data collaboration dataset. Over 3,200 holographic interferometric images were processed through the course of this study. When each permutation of these images is taken into account through all the image-processing steps, the total number of images processed is over 13,000.

  9. Methane bubbling: from speculation to quantification

    NASA Astrophysics Data System (ADS)

    Grinham, A. R.; Dunbabin, M.; Yuan, Z.

    2013-12-01

    magnitude from 500 to 100 000 mg m-2 d-1 depending on time of day and water depth. Average storage bubble flux rates between reservoirs varied by two orders of magnitude, from 1 200 to 15 000 mg m-2 d-1, with the primary driver likely to be catchment forest cover. The relative contribution of bubbling to total fluxes varied from 10% to more than 90% depending on the reservoir and time of sampling. This method was consistently shown to greatly improve the spatial mapping and quantification of methane bubbling rates from reservoir surfaces and to reduce the uncertainty associated with determining the relative contribution of bubbling to total flux.

  10. Extended Forward Sensitivity Analysis for Uncertainty Quantification

    SciTech Connect

    Haihua Zhao; Vincent A. Mousseau

    2011-09-01

    Verification and validation (V&V) are playing more important roles in quantifying uncertainties and realizing high-fidelity simulations in engineering system analyses, such as transients in a complex nuclear reactor system. Traditional V&V in reactor system analysis focused more on the validation part, or did not differentiate verification from validation. The traditional approach to uncertainty quantification is based on a 'black box' approach: the simulation tool is treated as an unknown signal generator, a distribution of inputs according to assumed probability density functions is sent in, and the distribution of the outputs is measured and correlated back to the original input distribution. The 'black box' method mixes numerical errors with all other uncertainties. It is also not efficient for performing sensitivity analysis. In contrast to the 'black box' method, a more efficient sensitivity approach can take advantage of intimate knowledge of the simulation code. In these types of approaches, equations for the propagation of uncertainty are constructed and the sensitivities are directly solved for as variables in the simulation. This paper presents forward sensitivity analysis as a method to aid uncertainty quantification. By including the time step, and potentially the spatial step, as special sensitivity parameters, the forward sensitivity method is extended as a method to quantify numerical errors. Note that by integrating local truncation errors over the whole system through the forward sensitivity analysis process, the generated time step and spatial step sensitivity information reflects global numerical errors. The discretization errors can be systematically compared against uncertainties due to other physical parameters. This extension makes the forward sensitivity method a much more powerful tool for aiding uncertainty quantification. By knowing the relative sensitivity of the time and space steps compared with other physical parameters of interest, the simulation is allowed
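
    A minimal sketch of the forward sensitivity idea on a toy decay ODE: the sensitivity s = du/dk is solved alongside the state by augmenting the system, instead of re-running the code with a perturbed k. This illustrates the general technique, not the paper's reactor-analysis code:

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # du/dt = -k*u; sensitivity equation ds/dt = d(rhs)/dk + d(rhs)/du * s
    def rhs(t, y, k):
        u, s = y
        return [-k * u, -u - k * s]

    k, u0 = 0.5, 1.0
    sol = solve_ivp(rhs, (0.0, 4.0), [u0, 0.0], args=(k,), rtol=1e-8)
    u_T, s_T = sol.y[:, -1]
    # analytic check: du/dk at t=T is -T*u0*exp(-k*T)
    print(s_T, -4.0 * np.exp(-0.5 * 4.0))
    ```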

  11. Uncertainty Quantification in Climate Modeling and Projection

    SciTech Connect

    Qian, Yun; Jackson, Charles; Giorgi, Filippo; Booth, Ben; Duan, Qingyun; Forest, Chris; Higdon, Dave; Hou, Z. Jason; Huerta, Gabriel

    2016-05-01

    The projection of future climate is one of the most complex problems undertaken by the scientific community. Although scientists have been striving to better understand the physical basis of the climate system and to improve climate models, the overall uncertainty in projections of future climate has not been significantly reduced (e.g., from the IPCC AR4 to AR5). With the rapid increase of complexity in Earth system models, reducing uncertainties in climate projections becomes extremely challenging. Since uncertainties always exist in climate models, interpreting the strengths and limitations of future climate projections is key to evaluating risks, and climate change information for use in Vulnerability, Impact, and Adaptation (VIA) studies should be provided with both well-characterized and well-quantified uncertainty. The workshop aimed at providing participants, many of them from developing countries, information on strategies to quantify the uncertainty in climate model projections and assess the reliability of climate change information for decision-making. The program included a mixture of lectures on fundamental concepts in Bayesian inference and sampling, applications, and hands-on computer laboratory exercises employing software packages for Bayesian inference, Markov Chain Monte Carlo methods, and global sensitivity analyses. The lectures covered a range of scientific issues underlying the evaluation of uncertainties in climate projections, such as the effects of uncertain initial and boundary conditions, uncertain physics, and limitations of observational records. Progress in quantitatively estimating uncertainties in hydrologic, land surface, and atmospheric models at both regional and global scales was also reviewed. The application of Uncertainty Quantification (UQ) concepts to coupled climate system models is still in its infancy. The Coupled Model Intercomparison Project (CMIP) multi-model ensemble currently represents the primary data for

  12. Rapid and portable electrochemical quantification of phosphorus.

    PubMed

    Kolliopoulos, Athanasios V; Kampouris, Dimitrios K; Banks, Craig E

    2015-04-21

    Phosphorus is one of the key indicators of eutrophication levels in natural waters, where it exists mainly as dissolved phosphorus. Various analytical protocols exist to provide off-site analysis, but a point-of-site analysis is required. The current standard method recommended by the Environmental Protection Agency (EPA) for the detection of total phosphorus is colorimetric and based upon the color of a phosphomolybdate complex formed as a result of the reaction between orthophosphate and molybdate ions, where ascorbic acid and antimony potassium tartrate are added and serve as reducing agents. Prior to the measurements, all forms of phosphorus are converted into orthophosphates via sample digestion (heating and acidifying). The work presented here details an electrochemical adaptation of this EPA-recommended colorimetric approach for the measurement of dissolved phosphorus in water samples using screen-printed graphite macroelectrodes for the first time. This novel indirect electrochemical sensing protocol allows the determination of orthophosphates over the range from 0.5 to 20 μg L(-1) in ideal pH 1 solutions utilizing cyclic voltammetry, with a limit of detection (3σ) found to correspond to 0.3 μg L(-1) of phosphorus. The reaction time and the influence of foreign ions (potential interferents) upon this electroanalytical protocol were also investigated; it was found that a reaction time of 5 min, which is essential in the standard colorimetric approach, is not required in the newly proposed electrochemically adapted protocol. The proposed electrochemical method was independently validated through the quantification of orthophosphates and total dissolved phosphorus in polluted water samples (canal water samples) with ion chromatography and ICP-OES, respectively. This novel electrochemical protocol exhibits advantages over the established EPA-recommended colorimetric determination of total phosphorus, with lower detection limits and shorter experimental times

  13. SERS Quantification and Characterization of Proteins and Other Biomolecules.

    PubMed

    Feliu, Neus; Hassan, Moustapha; Garcia Rico, Eduardo; Cui, Daxiang; Parak, Wolfgang; Alvarez-Puebla, Ramon

    2017-09-26

    Changes in protein expression levels and protein structure may indicate genomic mutations and may be related to some diseases. Therefore, the precise quantification and characterization of proteins can be used for disease diagnosis. Compared with several other alternative methods, surface-enhanced Raman scattering (SERS) spectroscopy is regarded as an excellent choice for the quantification and structural characterization of proteins. Herein, we review the main advances in using plasmonic nanostructures as SERS sensing platforms for this purpose. Three design approaches, including direct SERS, indirect SERS, and SERS-encoded nanoparticles, are discussed with a view to developing new, precise approaches for the quantification and characterization of proteins. Although this review focuses on proteins, the detection of other biomolecules is also discussed in order to highlight the concepts underlying SERS-based sensors.

  14. Next generation of food allergen quantification using mass spectrometric systems.

    PubMed

    Koeberl, Martina; Clarke, Dean; Lopata, Andreas L

    2014-08-01

    Food allergies are increasing worldwide and becoming a public health concern. Food legislation requires detailed declarations of potential allergens in food products and therefore an increased capability to analyze for the presence of food allergens. Currently, antibody-based methods are mainly utilized to quantify allergens; however, these methods have several disadvantages. Recently, mass spectrometry (MS) techniques have been developed and applied to food allergen analysis. At present, 46 allergens from 11 different food sources have been characterized using different MS approaches and some specific signature peptides have been published. However, quantification of allergens using MS is not routinely employed. This review compares the different aspects of food allergen quantification using advanced MS techniques including multiple reaction monitoring. The latter provides low limits of quantification for multiple allergens in simple or complex food matrices, while being robust and reproducible. This review provides an overview of current approaches to analyze food allergens, with specific focus on MS systems and applications.

  15. Isobaric Labeling-Based Relative Quantification in Shotgun Proteomics

    PubMed Central

    2015-01-01

    Mass spectrometry plays a key role in relative quantitative comparisons of proteins in order to understand their functional role in biological systems upon perturbation. Here, we review studies that examine different aspects of isobaric labeling-based relative quantification for shotgun proteomic analysis. In particular, we focus on different types of isobaric reagents and their reaction chemistry (e.g., amine-, carbonyl-, and sulfhydryl-reactive). Various factors, such as ratio compression, reporter ion dynamic range, and others, cause an underestimation of changes in the relative abundance of proteins across samples, undermining the ability of the isobaric labeling approach to be truly quantitative. These factors, and the suggested combinations of experimental design and optimal data acquisition methods to increase the precision and accuracy of the measurements, will be discussed. Finally, the extended application of the isobaric labeling-based approach in hyperplexing strategies, targeted quantification, and phosphopeptide analysis is also examined. PMID:25337643

  16. Quantification of neural functional connectivity during an active avoidance task.

    PubMed

    Silva, Catia S; Hazrati, Mehrnaz K; Keil, Andreas; Principe, Jose C

    2016-08-01

    Many behavioral and cognitive processes are associated with spatiotemporally dynamic communication between brain areas. Thus, the quantification of functional connectivity with high temporal resolution is highly desirable for capturing in vivo brain function. However, brain functional network quantification from EEG recordings has commonly been used in a qualitative manner. In this paper, we consider pairwise dependence measures as random variables and estimate the pdf for each electrode of the arrangement. A metric based on the quadratic Cauchy-Schwarz mutual information quantifies these pdfs. We present the results by brain region, simplifying the analysis and visualization drastically. The proposed metric of functional connectivity quantification is applied to temporal dependencies of the brain network that can be related to the task.
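
    A hedged sketch of a Gaussian-kernel estimator of the quadratic Cauchy-Schwarz mutual information from information-theoretic learning; the kernel width and the data are illustrative, and this is not the authors' EEG pipeline:

    ```python
    import numpy as np

    def gauss_gram(x, sigma):
        d = x[:, None] - x[None, :]
        return np.exp(-d**2 / (2 * sigma**2))

    def cs_qmi(x, y, sigma=0.5):
        Kx, Ky = gauss_gram(x, sigma), gauss_gram(y, sigma)
        v_joint = np.mean(Kx * Ky)                            # joint potential
        v_marg = np.mean(Kx) * np.mean(Ky)                    # marginal product
        v_cross = np.mean(Kx.mean(axis=1) * Ky.mean(axis=1))  # cross potential
        return np.log(v_joint * v_marg / v_cross**2)          # >= 0 by Cauchy-Schwarz

    rng = np.random.default_rng(1)
    x = rng.normal(size=500)
    print(cs_qmi(x, x + 0.1 * rng.normal(size=500)))  # dependent: clearly > 0
    print(cs_qmi(x, rng.normal(size=500)))            # independent: near 0
    ```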

  17. Development of a Protein Standard Absolute Quantification (PSAQ™) assay for the quantification of Staphylococcus aureus enterotoxin A in serum.

    PubMed

    Adrait, Annie; Lebert, Dorothée; Trauchessec, Mathieu; Dupuis, Alain; Louwagie, Mathilde; Masselon, Christophe; Jaquinod, Michel; Chevalier, Benoît; Vandenesch, François; Garin, Jérôme; Bruley, Christophe; Brun, Virginie

    2012-06-06

    Enterotoxin A (SEA) is a staphylococcal virulence factor which is suspected to worsen septic shock prognosis. However, the presence of SEA in the blood of sepsis patients has never been demonstrated. We have developed a mass spectrometry-based assay for the targeted and absolute quantification of SEA in serum. To enhance sensitivity and specificity, we combined an immunoaffinity-based sample preparation with mass spectrometry analysis in the selected reaction monitoring (SRM) mode. Absolute quantification of SEA was performed using the PSAQ™ method (Protein Standard Absolute Quantification), which uses a full-length isotope-labeled SEA as the internal standard. The lower limit of detection (LLOD) and lower limit of quantification (LLOQ) were estimated at 352 pg/mL and 1057 pg/mL, respectively. SEA recovery after immunocapture was determined to be 7.8 ± 1.4%. Therefore, we assumed that less than 1 femtomole of each SEA proteotypic peptide was injected on the liquid chromatography column before SRM analysis. From a 6-point titration experiment, quantification accuracy was determined to be 77%, and precision at the LLOQ was lower than 5%. With this sensitive PSAQ-SRM assay, we expect to contribute to deciphering the pathophysiological role of SEA in severe sepsis. This article is part of a Special Issue entitled: Proteomics: The clinical link. Copyright © 2011 Elsevier B.V. All rights reserved.

  18. Graphene oxide interfaces in serum based autoantibody quantification.

    PubMed

    Xu, Qiao; Cheng, Ho; Lehr, Joshua; Patil, Amol V; Davis, Jason J

    2015-01-06

    A reliable quantification of protein markers will undoubtedly underpin profound developments in disease surveillance, diagnostics, and improved therapy. Although there potentially exist numerous means of achieving this, electrochemical impedimetric techniques offer a scale of sensitivity, cost, convenience, and flexibility with which few alternatives can compete. Though there have been marked developments in electroanalytical protein detection, the demands associated with accessing the inherent assay sensitivity in complex biological media largely remain. We report herein the use of cysteamine-graphene oxide modified gold microelectrode arrays in underpinning the ultrasensitive and entirely label-free non-faradaic quantification of Parkinson's-relevant autoantibodies in human serum.

  19. Brief review of uncertainty quantification for particle image velocimetry

    NASA Astrophysics Data System (ADS)

    Farias, M. H.; Teixeira, R. S.; Koiller, J.; Santos, A. M.

    2016-07-01

    Metrological studies of particle image velocimetry (PIV) are recent in the literature, and efforts to evaluate the uncertainty quantification (UQ) of PIV velocity fields are in evidence. A short review of the main sources of uncertainty in PIV and the available methodologies for their quantification is therefore presented. In addition, the potential of some mathematical techniques from the area of geometric mechanics and control that could interest the fluids UQ community is highlighted. “We must measure what is measurable and make measurable what cannot be measured” (Galileo)

  20. Quantification is Neither Necessary Nor Sufficient for Measurement

    NASA Astrophysics Data System (ADS)

    Mari, Luca; Maul, Andrew; Torres Irribarra, David; Wilson, Mark

    2013-09-01

    Being an infrastructural, widespread activity, measurement is laden with stereotypes. Some of these concern the role of measurement in the relation between quality and quantity. In particular, it is sometimes argued or assumed that quantification is necessary for measurement; it is also sometimes argued or assumed that quantification is sufficient for or synonymous with measurement. To assess the validity of these positions the concepts of measurement and quantitative evaluation should be independently defined and their relationship analyzed. We contend that the defining characteristic of measurement should be the structure of the process, not a feature of its results. Under this perspective, quantitative evaluation is neither sufficient nor necessary for measurement.

  1. A highly sensitive quantification of phytosterols through an inexpensive derivatization.

    PubMed

    Liu, Songbai; Ruan, Huina

    2013-01-01

    A highly sensitive HPLC-based method for the quantification of phytosterols has been developed, based on derivatization with a benzoyl chromophore. Introduction of the benzoyl chromophore to phytosterols via simple and inexpensive derivatization greatly improved the UV response at 254 nm. Quantification of phytosterols was effectively performed by HPLC analysis with methyl benzoate as the internal standard after derivatization. This new method demonstrated outstanding recovery (>95%) and excellent sensitivity (ng level) and was applicable to sterols from either plant or animal sources. The method is generally useful in phytosterol studies.

  2. Uncertainty Quantification for Safety Verification Applications in Nuclear Power Plants

    NASA Astrophysics Data System (ADS)

    Boafo, Emmanuel

    There is increasing interest in computational reactor safety analysis in systematically replacing conservative calculations with best-estimate calculations augmented by quantitative uncertainty analysis methods. This has been enabled by recent regulatory requirements that permit the use of such methods in reactor safety analysis. Stochastic uncertainty quantification methods have shown great promise, as they are better suited to capturing the complexities of real engineering problems. This study proposes a framework for performing uncertainty quantification based on the stochastic approach, which can be applied to enhance safety analysis.
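
    In its simplest form, the stochastic approach propagates input uncertainties through the model by random sampling and summarizes the output distribution. A minimal sketch with a toy algebraic surrogate in place of a reactor code; the input distributions and the surrogate itself are invented for illustration.

```python
import numpy as np

# Monte Carlo propagation of input uncertainties through a model.
rng = np.random.default_rng(42)
n = 10_000
power = rng.normal(1.00, 0.02, n)      # normalized power, 2% std (assumed)
gap_k = rng.uniform(0.8, 1.2, n)       # gap conductance multiplier (assumed)

def peak_clad_temp(power, gap_k):
    """Toy surrogate for a safety figure of merit (K), not a reactor code."""
    return 600.0 + 450.0 * power / gap_k

t = peak_clad_temp(power, gap_k)
print("mean %.1f K, 95th percentile %.1f K" % (t.mean(), np.percentile(t, 95)))
```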

  3. A quick colorimetric method for total lipid quantification in microalgae.

    PubMed

    Byreddy, Avinesh R; Gupta, Adarsha; Barrow, Colin J; Puri, Munish

    2016-06-01

    Discovering microalgae with high lipid productivity is among the key milestones for achieving sustainable biodiesel production. Current methods of lipid quantification are time-intensive and costly. A rapid colorimetric method based on the sulfo-phospho-vanillin (SPV) reaction was developed for the quantification of microbial lipids, to facilitate screening for lipid-producing microalgae. The method was successfully tested on marine thraustochytrid strains and vegetable oils. The colorimetric results correlated well with gravimetric estimates. The new method is less time-consuming than gravimetric analysis and is quantitative for lipid determination, even in the presence of carbohydrates, proteins and glycerol.

  4. Reliability quantification and visualization for electric microgrids

    NASA Astrophysics Data System (ADS)

    Panwar, Mayank

    …and parallel with the area Electric Power System (EPS), (3) includes the local EPS and may include portions of the area EPS, and (4) is intentionally planned. A more reliable electric power grid requires microgrids to operate in tandem with the EPS. Reliability can be quantified through various performance metrics; in North America, this is done through North American Electric Reliability Corporation (NERC) metrics. The microgrid differs significantly from the traditional EPS, especially at the asset level, because of the heterogeneity of its assets. Thus, its performance cannot be quantified by the same metrics used for the EPS. Some of the NERC metrics are calculated and interpreted in this work to quantify performance for a single asset and for groups of assets in a microgrid. Two further metrics are introduced for system-level performance quantification. The next step is a better representation of the large amount of data generated by the microgrid. Visualization is one such representation, which is explored in detail, and a graphical user interface (GUI) is developed as a deliverable tool for the operator for informed decision making and planning. Electronic appendices I and II contain data and MATLAB® program codes for the analysis and visualization in this work.
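
    To give a flavor of asset- versus system-level quantification, the sketch below computes a simple availability metric per asset and a capacity-weighted aggregate. This is a generic simplification in the spirit of NERC-style performance reporting, not the thesis's actual metrics, and the asset data are invented.

```python
# name: (service_hours, forced_outage_hours, rated_kW) -- invented data
assets = {
    "diesel_genset": (8400, 360, 500),
    "pv_inverter":   (8600, 160, 250),
    "battery_pcs":   (8700,  60, 250),
}

# Asset-level metric: availability = service / (service + forced outage)
for name, (sh, foh, kw) in assets.items():
    print(f"{name}: availability = {sh / (sh + foh):.4f}")

# One possible system-level metric: capacity-weighted availability
num = sum(kw * sh / (sh + foh) for sh, foh, kw in assets.values())
den = sum(kw for *_, kw in assets.values())
print(f"system (capacity-weighted): {num / den:.4f}")
```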

  5. Improved Strategies and Optimization of Calibration Models for Real-time PCR Absolute Quantification

    EPA Science Inventory

    Real-time PCR absolute quantification applications rely on the use of standard curves to make estimates of DNA target concentrations in unknown samples. Traditional absolute quantification approaches dictate that a standard curve must accompany each experimental run. However, t...

  6. Rapid Quantification of Energy Absorption and Dissipation Metrics for PPE Padding Materials

    DTIC Science & Technology

    2010-01-22

    Report documentation fragments only; recoverable information: “Rapid Quantification of Energy Absorption & Dissipation Metrics for PPE Padding Materials: Final Report” (funding number 55332EGII; Engineering Seminar Series, Spring 2009). No abstract is recoverable.

  7. Enlisting Ecosystem Benefits: Quantification and Valuation of Ecosystem Services to Inform Installation Management

    DTIC Science & Technology

    2015-05-27

    Table-of-contents fragments only; recoverable information: “Enlisting Ecosystem Services: Quantification and Valuation of Ecosystem Services to Inform Installation Management” (project RC-201113, April 2015), including a section on the quantification, valuation, and mapping of ecosystem services and their tradeoffs. No abstract is recoverable.

  8. Colorimetric Quantification and in Situ Detection of Collagen

    ERIC Educational Resources Information Center

    Esteban, Francisco J.; del Moral, Maria L.; Sanchez-Lopez, Ana M.; Blanco, Santos; Jimenez, Ana; Hernandez, Raquel; Pedrosa, Juan A.; Peinado, Maria A.

    2005-01-01

    A simple multidisciplinary and inexpensive laboratory exercise is proposed, in which the undergraduate student may correlate biochemical and anatomical findings. The entire practical session can be completed in one 2.5-3 hour laboratory period, and consists of the quantification of collagen and total protein content from tissue sections--without…

  9. Infectious Viral Quantification of Chikungunya Virus-Virus Plaque Assay.

    PubMed

    Kaur, Parveen; Lee, Regina Ching Hua; Chu, Justin Jang Hann

    2016-01-01

    The plaque assay is an essential method for quantification of infectious virus titer. Cells infected with virus particles are overlaid with a viscous substrate. A suitable incubation period results in the formation of plaques, which can be fixed and stained for visualization. Here, we describe a method for measuring Chikungunya virus (CHIKV) titers via virus plaque assays.
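
    The titer calculation behind the assay is simple: plaque-forming units per milliliter are the mean plaque count divided by the product of the plated dilution and the inoculum volume. A minimal sketch with invented counts:

```python
def titer_pfu_per_ml(plaque_counts, dilution, inoculum_ml):
    """Infectious titer from a plaque assay:
    PFU/mL = mean plaque count / (dilution x inoculum volume).
    `dilution` is the final dilution plated, e.g. 1e-6."""
    mean_count = sum(plaque_counts) / len(plaque_counts)
    return mean_count / (dilution * inoculum_ml)

# Hypothetical duplicate wells: 42 and 38 plaques at the 10^-6 dilution,
# 0.1 mL inoculum -> 4.00e8 PFU/mL
print(f"{titer_pfu_per_ml([42, 38], 1e-6, 0.1):.2e} PFU/mL")
```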

  10. Adenovirus particle quantification in cell lysates using light scattering.

    PubMed

    Hohl, Adrian; Ramms, Anne Sophie; Dohmen, Christian; Mantwill, Klaus; Bielmeier, Andrea; Kolk, Andreas; Ruppert, Andreas; Nawroth, Roman; Holm, Per Sonne

    2017-08-15

    Adenoviral vector production for therapeutic applications is a well-established routine process. However, current methods for measuring adenovirus particle titers as a quality characteristic require highly purified virus preparations. While purified virus is typically obtained only in the last step of downstream purification, rapid and reliable methods for adenovirus particle quantification in intermediate products and crude lysates, which would allow optimization and validation of cell culture and intermediate downstream processing steps, are currently not at hand. Light scattering is an established method for measuring virus particle size; however, because of cellular impurities, adequate quantification of adenovirus particles in cell lysates by light scattering has so far been impossible. This report describes a new light scattering method for measuring virus concentration in enzymatically conditioned, non-purified cell lysates. Samples are incubated with phospholipase A2 and benzonase and filtered through a 0.22 µm filter cartridge prior to quantification by light scattering. Our results show that this treatment provides a precise method for the fast and easy determination of total adenovirus particle numbers in cell lysates and is useful for monitoring virus recovery throughout downstream processing.

  11. Uncertainty quantification in nanomechanical measurements using the atomic force microscope

    Treesearch

    Ryan Wagner; Robert Moon; Jon Pratt; Gordon Shaw; Arvind Raman

    2011-01-01

    Quantifying uncertainty in measured properties of nanomaterials is a prerequisite for the manufacture of reliable nanoengineered materials and products. Yet, rigorous uncertainty quantification (UQ) is rarely applied for material property measurements with the atomic force microscope (AFM), a widely used instrument that can measure properties at nanometer scale...

  12. Quantification of confocal images of biofilms grown on irregular surfaces

    PubMed Central

    Ross, Stacy Sommerfeld; Tu, Mai Han; Falsetta, Megan L.; Ketterer, Margaret R.; Kiedrowski, Megan R.; Horswill, Alexander R.; Apicella, Michael A.; Reinhardt, Joseph M.; Fiegel, Jennifer

    2014-01-01

    Bacterial biofilms grow on many types of surfaces, including flat surfaces such as glass and metal and irregular surfaces such as rocks, biological tissues and polymers. While laser scanning confocal microscopy can provide high-resolution images of biofilms grown on any surface, quantification of biofilm-associated bacteria is currently limited to bacteria grown on flat surfaces. This can limit researchers studying irregular surfaces to qualitative analysis or quantification of only the total bacteria in an image. In this work, we introduce a new algorithm called modified connected volume filtration (MCVF) to quantify bacteria grown on top of an irregular surface that is fluorescently labeled or reflective. Using the MCVF algorithm, two new quantification parameters are introduced. The modified substratum coverage parameter enables quantification of the connected-biofilm bacteria on top of the surface and on the imaging substratum. The utility of MCVF and the modified substratum coverage parameter were shown with Pseudomonas aeruginosa and Staphylococcus aureus biofilms grown on human airway epithelial cells. A second parameter, the percent association, provides quantified data on the colocalization of the bacteria with a labeled component, including bacteria within a labeled tissue. The utility of quantifying the bacteria associated with the cell cytoplasm was demonstrated with Neisseria gonorrhoeae biofilms grown on cervical epithelial cells. This algorithm provides more flexibility and quantitative ability to researchers studying biofilms grown on a variety of irregular substrata. PMID:24632515
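
    The core of such a filtration can be illustrated with standard 3-D connected-component labeling: keep only the bacterial components connected to the labeled surface. The sketch below, on a synthetic stack, illustrates the idea only; it is not the published MCVF algorithm, and the masks are random stand-ins.

```python
import numpy as np
from scipy import ndimage

def connected_biofilm_volume(bacteria, surface):
    """Simplified connected-volume filtration: label 3-D components of
    the bacterial mask and keep only those touching the (irregular)
    surface mask. Returns (kept voxels, total components, kept components)."""
    labels, n = ndimage.label(bacteria)                 # 3-D connectivity
    touching = np.unique(labels[surface & (labels > 0)])
    kept = np.isin(labels, touching) & (labels > 0)
    return kept.sum(), n, len(touching)

# Synthetic 64x64x32 stack: random "bacteria", surface = bottom 2 slices
rng = np.random.default_rng(1)
bact = rng.random((64, 64, 32)) > 0.97
surf = np.zeros_like(bact)
surf[..., :2] = True
print(connected_biofilm_volume(bact, surf))
```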

  13. Single cell genomic quantification by non-fluorescence nonlinear microscopy

    NASA Astrophysics Data System (ADS)

    Kota, Divya; Liu, Jing

    2017-02-01

    Human epidermal growth factor receptor 2 (Her2) is a gene which plays a major role in breast cancer development. The quantification of Her2 expression in single cells is limited by several drawbacks of existing fluorescence-based single-molecule techniques, such as low signal-to-noise ratio (SNR) and strong autofluorescence and background signals from biological components. For rigorous genomic quantification, a robust method of orthogonal detection is highly desirable, and we demonstrated it with two non-fluorescent imaging techniques: transient absorption microscopy (TAM) and second harmonic generation (SHG). In TAM, gold nanoparticles (AuNPs) are chosen as orthogonal probes for the detection of single molecules, giving background-free quantification of single mRNA transcripts. In SHG, emission from barium titanium oxide (BTO) nanoprobes was demonstrated, which allows a stable signal beyond the autofluorescence window. Her2 mRNA was specifically labeled with nanoprobes conjugated with antibodies or oligonucleotides and quantified at single-copy sensitivity in cancer cells and tissues. Furthermore, a non-fluorescent super-resolution concept, named second harmonic super-resolution microscopy (SHaSM), was proposed to quantify individual Her2 transcripts in cancer cells beyond the diffraction limit. These non-fluorescent imaging modalities will provide new dimensions in biomarker quantification at single-molecule sensitivity in turbid biological samples, offering a strong cross-platform strategy for clinical monitoring at single-cell resolution.

  14. Two-pass alignment improves novel splice junction quantification.

    PubMed

    Veeneman, Brendan A; Shukla, Sudhanshu; Dhanasekaran, Saravana M; Chinnaiyan, Arul M; Nesvizhskii, Alexey I

    2016-01-01

    Discovery of novel splicing from RNA sequence data remains a critical and exciting focus of transcriptomics, but reduced alignment power impedes expression quantification of novel splice junctions. Here, we profile performance characteristics of two-pass alignment, which separates splice junction discovery from quantification. Per sample, across a variety of transcriptome sequencing datasets, two-pass alignment improved quantification of at least 94% of simulated novel splice junctions, and provided as much as 1.7-fold deeper median read depth over those splice junctions. We further demonstrate that two-pass alignment works by increasing alignment of reads to splice junctions by short lengths, and that potential alignment errors are readily identifiable by simple classification. Taken together, two-pass alignment promises to advance quantification and discovery of novel splicing events. Contact: arul@med.umich.edu, nesvi@med.umich.edu Availability and implementation: Two-pass alignment was implemented here as sequential alignment, genome indexing, and re-alignment steps with STAR. Full parameters are provided in Supplementary Table 2. Supplementary information: Supplementary data are available at Bioinformatics online.
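
    A minimal sketch of the per-sample two-pass workflow, driving STAR from Python. The paths, sample names, and thread count are placeholders, not the paper's parameters (those are in its Supplementary Table 2). The second pass inserts the junctions discovered in the first pass via --sjdbFileChrStartEnd; newer STAR releases can also collapse this into one invocation with --twopassMode Basic.

```python
import os
import subprocess

for d in ("pass1", "pass2"):
    os.makedirs(d, exist_ok=True)

common = ["STAR", "--runThreadN", "8",
          "--genomeDir", "star_index",
          "--readFilesIn", "sample_R1.fastq.gz", "sample_R2.fastq.gz",
          "--readFilesCommand", "zcat"]

# Pass 1: discover splice junctions (STAR writes pass1/SJ.out.tab)
subprocess.run([*common, "--outFileNamePrefix", "pass1/"], check=True)

# Pass 2: re-align with the discovered junctions inserted on the fly,
# so novel junctions compete on an equal footing with annotated ones
subprocess.run([*common, "--sjdbFileChrStartEnd", "pass1/SJ.out.tab",
                "--outFileNamePrefix", "pass2/"], check=True)
```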

  15. Two-pass alignment improves novel splice junction quantification

    PubMed Central

    Veeneman, Brendan A.; Shukla, Sudhanshu; Dhanasekaran, Saravana M.; Chinnaiyan, Arul M.; Nesvizhskii, Alexey I.

    2016-01-01

    Motivation: Discovery of novel splicing from RNA sequence data remains a critical and exciting focus of transcriptomics, but reduced alignment power impedes expression quantification of novel splice junctions. Results: Here, we profile performance characteristics of two-pass alignment, which separates splice junction discovery from quantification. Per sample, across a variety of transcriptome sequencing datasets, two-pass alignment improved quantification of at least 94% of simulated novel splice junctions, and provided as much as 1.7-fold deeper median read depth over those splice junctions. We further demonstrate that two-pass alignment works by increasing alignment of reads to splice junctions by short lengths, and that potential alignment errors are readily identifiable by simple classification. Taken together, two-pass alignment promises to advance quantification and discovery of novel splicing events. Contact: arul@med.umich.edu, nesvi@med.umich.edu Availability and implementation: Two-pass alignment was implemented here as sequential alignment, genome indexing, and re-alignment steps with STAR. Full parameters are provided in Supplementary Table 2. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26519505

  16. Quantified PIRT and Uncertainty Quantification for Computer Code Validation

    NASA Astrophysics Data System (ADS)

    Luo, Hu

    This study is intended to investigate and propose a systematic method of uncertainty quantification for computer code validation applications. Uncertainty quantification has gained more and more attention in recent years. The U.S. Nuclear Regulatory Commission (NRC) requires that the use of realistic best-estimate (BE) computer codes follow the rigorous Code Scaling, Application and Uncertainty (CSAU) methodology. In CSAU, the Phenomena Identification and Ranking Table (PIRT) was developed to identify important code uncertainty contributors. To support and examine the traditional PIRT with quantified judgments, this study proposes a novel approach, the Quantified PIRT (QPIRT), to identify important code models and parameters for uncertainty quantification. Dimensionless analysis of the code field equations, generating dimensionless (pi) groups from code simulation results, serves as the foundation of QPIRT. Uncertainty quantification using the DAKOTA code is proposed based on a sampling approach. Nonparametric statistical theory identifies the fixed number of code runs needed to assure 95 percent probability with 95 percent confidence in the code uncertainty intervals.
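
    The 95/95 run count comes from the non-parametric Wilks criterion: with coverage γ and confidence β, the first-order requirement is the smallest n with 1 − γⁿ ≥ β, which gives 59 runs for γ = β = 0.95. A short sketch that also handles higher orders (using the m-th largest output as the bound):

```python
import math

def wilks_runs(coverage=0.95, confidence=0.95, order=1):
    """Smallest n such that the `order`-th largest of n runs is a
    one-sided upper tolerance bound for the `coverage` quantile at the
    given `confidence` (non-parametric Wilks criterion)."""
    n = order
    while True:
        # P(fewer than `order` samples exceed the coverage quantile)
        p_low = sum(math.comb(n, k) * (1 - coverage)**k * coverage**(n - k)
                    for k in range(order))
        if 1 - p_low >= confidence:
            return n
        n += 1

print(wilks_runs())            # 59: the classic 95/95 first-order count
print(wilks_runs(order=2))     # 93 runs when the 2nd largest is used
```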

  17. Normal Databases for the Relative Quantification of Myocardial Perfusion

    PubMed Central

    Rubeaux, Mathieu; Xu, Yuan; Germano, Guido; Berman, Daniel S.; Slomka, Piotr J.

    2016-01-01

    Purpose of review Myocardial perfusion imaging (MPI) with SPECT is performed clinically worldwide to detect and monitor coronary artery disease (CAD). MPI allows an objective quantification of myocardial perfusion at stress and rest. This established technique relies on normal databases to compare patient scans against reference normal limits. In this review, we aim to introduce the process of MPI quantification with normal databases and describe the associated perfusion quantitative measures that are used. Recent findings New equipment and new software reconstruction algorithms have been introduced which require the development of new normal limits. The appearance and regional count variations of a normal MPI scan may differ between these new scanners and standard Anger cameras. Therefore, these new systems may require the determination of new normal limits to achieve optimal accuracy in relative myocardial perfusion quantification. Accurate diagnostic and prognostic results rivaling those obtained by expert readers can be obtained by this widely used technique. Summary Throughout this review, we emphasize the importance of the different normal databases and the need for specific databases relative to distinct imaging procedures. Use of appropriate normal limits allows optimal quantification of MPI by taking into account subtle image differences due to the hardware and software used, and the population studied. PMID:28138354

  18. Normal Databases for the Relative Quantification of Myocardial Perfusion.

    PubMed

    Rubeaux, Mathieu; Xu, Yuan; Germano, Guido; Berman, Daniel S; Slomka, Piotr J

    2016-08-01

    Myocardial perfusion imaging (MPI) with SPECT is performed clinically worldwide to detect and monitor coronary artery disease (CAD). MPI allows an objective quantification of myocardial perfusion at stress and rest. This established technique relies on normal databases to compare patient scans against reference normal limits. In this review, we aim to introduce the process of MPI quantification with normal databases and describe the associated perfusion quantitative measures that are used. New equipment and new software reconstruction algorithms have been introduced which require the development of new normal limits. The appearance and regional count variations of a normal MPI scan may differ between these new scanners and standard Anger cameras. Therefore, these new systems may require the determination of new normal limits to achieve optimal accuracy in relative myocardial perfusion quantification. Accurate diagnostic and prognostic results rivaling those obtained by expert readers can be obtained by this widely used technique. Throughout this review, we emphasize the importance of the different normal databases and the need for specific databases relative to distinct imaging procedures. Use of appropriate normal limits allows optimal quantification of MPI by taking into account subtle image differences due to the hardware and software used, and the population studied.

  19. Quantification of Wheat Grain Arabinoxylans Using a Phloroglucinol Colorimetric Assay

    USDA-ARS?s Scientific Manuscript database

    Arabinoxylans (AX) play a critical role in end-use quality and nutrition of wheat (Triticum aestivum L.). An efficient, accurate method of AX quantification is desirable as AX plays an important role in processing, end use quality and human health. The objective of this work was to evaluate a stand...

  20. Quantification and Single-Spore Detection of Phakopsora pachyrhizi

    USDA-ARS?s Scientific Manuscript database

    The microscopic identification and quantification of Phakopsora pachyrhizi spores from environmental samples, spore traps, and laboratory specimens can represent a challenge. Such reports, especially from passive spore traps, commonly describe the number of “rust-like” spores; for other forensic sa...

  1. Identification and Quantification Soil Redoximorphic Features by Digital Image Processing

    USDA-ARS?s Scientific Manuscript database

    Soil redoximorphic features (SRFs) have provided scientists and land managers with insight into relative soil moisture for approximately 60 years. The overall objective of this study was to develop a new method of SRF identification and quantification from soil cores using a digital camera and imag...

  2. The Role of Uncertainty Quantification for Reactor Physics

    SciTech Connect

    Salvatores, M.; Aliberti, G.; Palmiotti, G.

    2015-01-01

    The quantification of uncertainties is a crucial step in design. Comparing a priori uncertainties with target accuracies allows needs and priorities for uncertainty reduction to be defined. In view of their impact, the uncertainty analysis requires a reliability assessment of the uncertainty data used. The choice of the appropriate approach and the consistency of different approaches are discussed.

  3. Macroscopic inspection of ape feces: what's in a quantification method?

    PubMed

    Phillips, Caroline A; McGrew, William C

    2014-06-01

    Macroscopic inspection of feces has been used to investigate primate diet. The limitations of this method for identifying food items to species level have long been recognized, but ascertaining some aspects of diet (e.g., folivory) is achievable by quantifying food items in feces. Quantification methods applied include rating food items on a scale of abundance, estimating their percentage volume, and weighing food items. However, whether composition data differ depending on which quantification method is used during macroscopic inspection had not been verified. We analyzed feces collected from ten adult chimpanzees (Pan troglodytes schweinfurthii) of the Kanyawara community in Kibale National Park, Uganda. We compare dietary composition totals obtained using different quantification methods and ascertain whether sieve mesh size influences the calculated totals. Finally, this study validates findings against direct observation of feeding by the same individuals from whom the fecal samples had been collected. The choice of quantification method and sieve mesh size influenced the diet composition totals obtained, and hence estimates of folivory and frugivory. However, our findings were based on the assumption that fibrous matter contained pith and leaf fragments only, which remains to be verified. We advocate that macroscopic inspection of feces can be a valuable tool to provide a generalized overview of dietary composition for primate populations. As most populations remain unhabituated, scrutinizing and validating indirect measures is important if they are to be applied to further understand inter- and intra-species dietary variation.

  4. 15 CFR 990.52 - Injury assessment-quantification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 15 Commerce and Foreign Trade 3 2011-01-01 2011-01-01 false Injury assessment-quantification. 990.52 Section 990.52 Commerce and Foreign Trade Regulations Relating to Commerce and Foreign Trade (Continued) NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE OIL POLLUTION...

  5. 15 CFR 990.52 - Injury assessment-quantification.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 15 Commerce and Foreign Trade 3 2014-01-01 2014-01-01 false Injury assessment-quantification. 990.52 Section 990.52 Commerce and Foreign Trade Regulations Relating to Commerce and Foreign Trade (Continued) NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE OIL POLLUTION...

  6. 15 CFR 990.52 - Injury assessment-quantification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 15 Commerce and Foreign Trade 3 2010-01-01 2010-01-01 false Injury assessment-quantification. 990.52 Section 990.52 Commerce and Foreign Trade Regulations Relating to Commerce and Foreign Trade (Continued) NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE OIL POLLUTION...

  7. 15 CFR 990.52 - Injury assessment-quantification.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 15 Commerce and Foreign Trade 3 2012-01-01 2012-01-01 false Injury assessment-quantification. 990.52 Section 990.52 Commerce and Foreign Trade Regulations Relating to Commerce and Foreign Trade (Continued) NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE OIL POLLUTION...

  8. 15 CFR 990.52 - Injury assessment-quantification.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 15 Commerce and Foreign Trade 3 2013-01-01 2013-01-01 false Injury assessment-quantification. 990.52 Section 990.52 Commerce and Foreign Trade Regulations Relating to Commerce and Foreign Trade (Continued) NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE OIL POLLUTION...

  9. Current Issues in the Quantification of Federal Reserved Water Rights

    NASA Astrophysics Data System (ADS)

    Brookshire, David S.; Watts, Gary L.; Merrill, James L.

    1985-11-01

    This paper examines the quantification of federal reserved water rights from legal, institutional, and economic perspectives. Special attention is directed toward Indian reserved water rights and the concept of practicably irrigable acreage. We conclude by examining current trends and exploring alternative approaches to the dilemma of quantifying Indian reserved water rights.

  10. Improved Quantification of Cerebral Vein Oxygenation Using Partial Volume Correction.

    PubMed

    Ward, Phillip G D; Fan, Audrey P; Raniga, Parnesh; Barnes, David G; Dowe, David L; Ng, Amanda C L; Egan, Gary F

    2017-01-01

    Purpose: Quantitative susceptibility mapping (QSM) enables cerebral venous characterization and physiological measurements, such as oxygen extraction fraction (OEF). The exquisite sensitivity of QSM to deoxygenated blood makes it possible to image small veins; however partial volume effects must be addressed for accurate quantification. We present a new method, Iterative Cylindrical Fitting (ICF), to estimate voxel-based partial volume effects for susceptibility maps and use it to improve OEF quantification of small veins with diameters between 1.5 and 4 voxels. Materials and Methods: Simulated QSM maps were generated to assess the performance of the ICF method over a range of vein geometries with varying echo times and noise levels. The ICF method was also applied to in vivo human brain data to assess the feasibility and behavior of OEF measurements compared to the maximum intensity voxel (MIV) method. Results: Improved quantification of OEF measurements was achieved for vessels with contrast to noise greater than 3.0 and vein radii greater than 0.75 voxels. The ICF method produced improved quantitative accuracy of OEF measurement compared to the MIV approach (mean OEF error 7.7 vs. 12.4%). The ICF method provided estimates of vein radius (mean error <27%) and partial volume maps (root mean-squared error <13%). In vivo results demonstrated consistent estimates of OEF along vein segments. Conclusion: OEF quantification in small veins (1.5-4 voxels in diameter) had lower error when using partial volume estimates from the ICF method.

  11. Improved Quantification of Cerebral Vein Oxygenation Using Partial Volume Correction

    PubMed Central

    Ward, Phillip G. D.; Fan, Audrey P.; Raniga, Parnesh; Barnes, David G.; Dowe, David L.; Ng, Amanda C. L.; Egan, Gary F.

    2017-01-01

    Purpose: Quantitative susceptibility mapping (QSM) enables cerebral venous characterization and physiological measurements, such as oxygen extraction fraction (OEF). The exquisite sensitivity of QSM to deoxygenated blood makes it possible to image small veins; however partial volume effects must be addressed for accurate quantification. We present a new method, Iterative Cylindrical Fitting (ICF), to estimate voxel-based partial volume effects for susceptibility maps and use it to improve OEF quantification of small veins with diameters between 1.5 and 4 voxels. Materials and Methods: Simulated QSM maps were generated to assess the performance of the ICF method over a range of vein geometries with varying echo times and noise levels. The ICF method was also applied to in vivo human brain data to assess the feasibility and behavior of OEF measurements compared to the maximum intensity voxel (MIV) method. Results: Improved quantification of OEF measurements was achieved for vessels with contrast to noise greater than 3.0 and vein radii greater than 0.75 voxels. The ICF method produced improved quantitative accuracy of OEF measurement compared to the MIV approach (mean OEF error 7.7 vs. 12.4%). The ICF method provided estimates of vein radius (mean error <27%) and partial volume maps (root mean-squared error <13%). In vivo results demonstrated consistent estimates of OEF along vein segments. Conclusion: OEF quantification in small veins (1.5–4 voxels in diameter) had lower error when using partial volume estimates from the ICF method. PMID:28289372

  12. Literacy and Language Education: The Quantification of Learning

    ERIC Educational Resources Information Center

    Gibb, Tara

    2015-01-01

    This chapter describes international policy contexts of adult literacy and language assessment and the shift toward standardization through measurement tools. It considers the implications the quantification of learning outcomes has for pedagogy and practice and for the social inclusion of transnational migrants.

  13. The Role of Uncertainty Quantification for Reactor Physics

    SciTech Connect

    Salvatores, Massimo; Palmiotti, Giuseppe; Aliberti, G.

    2015-01-01

    The quantification of uncertainties is a crucial step in design. Comparing a priori uncertainties with target accuracies allows needs and priorities for uncertainty reduction to be defined. In view of their impact, the uncertainty analysis requires a reliability assessment of the uncertainty data used. The choice of the appropriate approach and the consistency of different approaches are discussed.

  14. Detection and Quantification of Magnetically Labeled Cells by Cellular MRI

    PubMed Central

    Liu, Wei; Frank, Joseph A.

    2008-01-01

    Labeling cells with superparamagnetic iron oxide (SPIO) nanoparticles, paramagnetic contrast agent (gadolinium) or perfluorocarbons allows for the possibility of tracking single or clusters of labeled cells within target tissues following either direct implantation or intravenous injection. This review summarizes the practical issues regarding detection and quantification of magnetically labeled cells with various MRI contrast agents with a focus on SPIO nanoparticles. PMID:18995978

  15. Rapid Quantification and Validation of Lipid Concentrations within Liposomes

    PubMed Central

    Roces, Carla B.; Kastner, Elisabeth; Stone, Peter; Lowry, Deborah; Perrie, Yvonne

    2016-01-01

    Quantification of the lipid content in liposomal adjuvants for subunit vaccine formulation is of extreme importance, since this concentration impacts both efficacy and stability. In this paper, we outline a high performance liquid chromatography-evaporative light scattering detector (HPLC-ELSD) method that allows for the rapid and simultaneous quantification of lipid concentrations within liposomal systems prepared by three liposomal manufacturing techniques (lipid film hydration, high shear mixing, and microfluidics). The ELSD system was used to quantify four lipids: 1,2-dimyristoyl-sn-glycero-3-phosphocholine (DMPC), cholesterol, dimethyldioctadecylammonium (DDA) bromide, and d-(+)-trehalose 6,6′-dibehenate (TDB). The developed method offers rapidity, high sensitivity, direct linearity, and good consistency of the responses (R² > 0.993 for the four lipids tested). The corresponding limit of detection (LOD) and limit of quantification (LOQ) were 0.11 and 0.36 mg/mL (DMPC), 0.02 and 0.80 mg/mL (cholesterol), 0.06 and 0.20 mg/mL (DDA), and 0.05 and 0.16 mg/mL (TDB), respectively. HPLC-ELSD was shown to be a rapid and effective method for the quantification of lipids within liposome formulations without the need for lipid extraction processes. PMID:27649231

  16. A Subspace Approach to Spectral Quantification for MR Spectroscopic Imaging.

    PubMed

    Li, Yudu; Lam, Fan; Clifford, Bryan; Liang, Zhi-Pei

    2017-08-18

    To provide a new approach for incorporating both spatial and spectral priors into the solution of the spectral quantification problem for magnetic resonance spectroscopic imaging (MRSI). A novel signal model is proposed, which represents the spectral distributions of each molecule as a subspace and the entire spectrum as a union-of-subspaces. Based on this model, the spectral quantification can be solved in two steps: a) subspace estimation based on the empirical distributions of the spectral parameters estimated using spectral priors, and b) parameter estimation for the union-of-subspaces model incorporating spatial priors. The proposed method has been evaluated using both simulated and experimental data, producing impressive results. The proposed union-of-subspaces representation of spatiospectral functions provides an effective computational framework for solving the MRSI spectral quantification problem with spatiospectral constraints. The proposed approach transforms how the MRSI spectral quantification problem is solved and enables efficient and effective use of spatiospectral priors to improve parameter estimation. The resulting algorithm is expected to be useful for a wide range of quantitative metabolic imaging studies using MRSI.
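
    The union-of-subspaces model can be pictured with plain linear algebra: stack a small basis per molecule into one design matrix and fit the spectrum by least squares, then read off each molecule's component. The sketch below uses random stand-in bases and omits the method's spatial priors and nonlinear spectral parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
n_points = 256
# One low-dimensional basis per molecule (learned elsewhere, e.g. from
# spectral priors); random stand-ins for three molecules here
bases = [rng.standard_normal((n_points, r)) for r in (3, 2, 4)]

A = np.hstack(bases)            # the union of subspaces as one design matrix
y = A @ rng.standard_normal(A.shape[1]) + 0.01 * rng.standard_normal(n_points)

coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Split coefficients back per molecule and reconstruct each component
split = np.cumsum([b.shape[1] for b in bases])[:-1]
components = [B @ c for B, c in zip(bases, np.split(coef, split))]
print([float(np.linalg.norm(c)) for c in components])
```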

  17. Literacy and Language Education: The Quantification of Learning

    ERIC Educational Resources Information Center

    Gibb, Tara

    2015-01-01

    This chapter describes international policy contexts of adult literacy and language assessment and the shift toward standardization through measurement tools. It considers the implications the quantification of learning outcomes has for pedagogy and practice and for the social inclusion of transnational migrants.

  18. Droplet digital PCR for absolute quantification of pathogens.

    PubMed

    Gutiérrez-Aguirre, Ion; Rački, Nejc; Dreo, Tanja; Ravnikar, Maja

    2015-01-01

    The recent advent of different digital PCR (dPCR) platforms is enabling the expansion of this technology for research and diagnostic applications worldwide. The main principle of dPCR, as in other PCR-based methods including quantitative PCR (qPCR), is the specific amplification of a nucleic acid target. The distinctive feature of dPCR is the separation of the reaction mixture into thousands to millions of partitions, followed by real-time or end-point detection of the amplification. The distribution of target sequences into partitions is described by the Poisson distribution, thus allowing accurate and absolute quantification of the target from the ratio of positive partitions to total partitions at the end of the reaction. This removes the need for reference materials with known target concentrations and increases the accuracy of quantification at low target concentrations compared with qPCR. dPCR has also shown higher resilience to inhibitors in a number of different types of samples. In this chapter we describe the droplet digital PCR (ddPCR) workflow for the detection and quantification of pathogens using the Bio-Rad QX100 droplet digital platform. As an example, we present the quantification of the quarantine plant-pathogenic bacterium Erwinia amylovora.
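
    The Poisson correction underlying absolute quantification is compact enough to state in full: with k positive partitions out of N, the mean copies per partition is λ = −ln(1 − k/N), and the concentration follows by dividing by the partition volume. A sketch, where the 0.85 nL droplet volume is assumed as a typical figure for droplet platforms:

```python
import math

def ddpcr_concentration(positive, total, droplet_nl=0.85):
    """Absolute target concentration (copies/uL) from droplet counts:
    lambda = -ln(1 - k/N) copies per droplet (Poisson), divided by the
    droplet volume (0.85 nL assumed as typical)."""
    lam = -math.log(1.0 - positive / total)   # mean copies per droplet
    return lam / (droplet_nl * 1e-3)          # nL -> uL

# e.g. 4,500 positive droplets out of 15,000 accepted -> ~420 copies/uL
print(f"{ddpcr_concentration(4500, 15000):.0f} copies/uL")
```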

  19. Colorimetric Quantification and in Situ Detection of Collagen

    ERIC Educational Resources Information Center

    Esteban, Francisco J.; del Moral, Maria L.; Sanchez-Lopez, Ana M.; Blanco, Santos; Jimenez, Ana; Hernandez, Raquel; Pedrosa, Juan A.; Peinado, Maria A.

    2005-01-01

    A simple multidisciplinary and inexpensive laboratory exercise is proposed, in which the undergraduate student may correlate biochemical and anatomical findings. The entire practical session can be completed in one 2.5-3 hour laboratory period, and consists of the quantification of collagen and total protein content from tissue sections--without…

  20. Performance of the Real-Q EBV Quantification Kit for Epstein-Barr Virus DNA Quantification in Whole Blood.

    PubMed

    Huh, Hee Jae; Park, Jong Eun; Kim, Ji Youn; Yun, Sun Ae; Lee, Myoung Keun; Lee, Nam Yong; Kim, Jong Won; Ki, Chang Seok

    2017-03-01

    There has been increasing interest in standardized and quantitative Epstein-Barr virus (EBV) DNA testing for the management of EBV disease. We evaluated the performance of the Real-Q EBV Quantification Kit (BioSewoom, Korea) in whole blood (WB). Nucleic acid extraction and real-time PCR were performed by using the MagNA Pure 96 (Roche Diagnostics, Germany) and 7500 Fast real-time PCR system (Applied Biosystems, USA), respectively. Assay sensitivity, linearity, and conversion factor were determined by using the World Health Organization international standard diluted in EBV-negative WB. We used 81 WB clinical specimens to compare performance of the Real-Q EBV Quantification Kit and artus EBV RG PCR Kit (Qiagen, Germany). The limit of detection (LOD) and limit of quantification (LOQ) for the Real-Q kit were 453 and 750 IU/mL, respectively. The conversion factor from EBV genomic copies to IU was 0.62. The linear range of the assay was from 750 to 10⁶ IU/mL. Viral load values measured with the Real-Q assay were on average 0.54 log₁₀ copies/mL higher than those measured with the artus assay. The Real-Q assay offered good analytical performance for EBV DNA quantification in WB.

  1. Performance of the Real-Q EBV Quantification Kit for Epstein-Barr Virus DNA Quantification in Whole Blood

    PubMed Central

    Huh, Hee Jae; Park, Jong Eun; Kim, Ji-Youn; Yun, Sun Ae; Lee, Myoung-Keun; Lee, Nam Yong; Kim, Jong-Won

    2017-01-01

    There has been increasing interest in standardized and quantitative Epstein-Barr virus (EBV) DNA testing for the management of EBV disease. We evaluated the performance of the Real-Q EBV Quantification Kit (BioSewoom, Korea) in whole blood (WB). Nucleic acid extraction and real-time PCR were performed by using the MagNA Pure 96 (Roche Diagnostics, Germany) and 7500 Fast real-time PCR system (Applied Biosystems, USA), respectively. Assay sensitivity, linearity, and conversion factor were determined by using the World Health Organization international standard diluted in EBV-negative WB. We used 81 WB clinical specimens to compare performance of the Real-Q EBV Quantification Kit and artus EBV RG PCR Kit (Qiagen, Germany). The limit of detection (LOD) and limit of quantification (LOQ) for the Real-Q kit were 453 and 750 IU/mL, respectively. The conversion factor from EBV genomic copies to IU was 0.62. The linear range of the assay was from 750 to 10⁶ IU/mL. Viral load values measured with the Real-Q assay were on average 0.54 log₁₀ copies/mL higher than those measured with the artus assay. The Real-Q assay offered good analytical performance for EBV DNA quantification in WB. PMID:28029001

  2. Proteomics of Microparticles with SILAC Quantification (PROMIS-Quan): A Novel Proteomic Method for Plasma Biomarker Quantification*

    PubMed Central

    Harel, Michal; Oren-Giladi, Pazit; Kaidar-Person, Orit; Shaked, Yuval; Geiger, Tamar

    2015-01-01

    Unbiased proteomic analysis of plasma samples holds the promise to reveal clinically invaluable disease biomarkers. However, the tremendous dynamic range of the plasma proteome has so far hampered the identification of such low abundant markers. To overcome this challenge we analyzed the plasma microparticle proteome, and reached an unprecedented depth of over 3000 plasma proteins in single runs. To add a quantitative dimension, we developed PROMIS-Quan—PROteomics of MIcroparticles with Super-Stable Isotope Labeling with Amino Acids in Cell Culture (SILAC) Quantification, a novel mass spectrometry-based technology for plasma microparticle proteome quantification. PROMIS-Quan enables a two-step relative and absolute SILAC quantification. First, plasma microparticle proteomes are quantified relative to a super-SILAC mix composed of cell lines from distinct origins. Next, the absolute amounts of selected proteins of interest are quantified relative to the super-SILAC mix. We applied PROMIS-Quan to prostate cancer and compared plasma microparticle samples of healthy individuals and prostate cancer patients. We identified in total 5374 plasma-microparticle proteins, and revealed a predictive signature of three proteins that were elevated in the patient-derived plasma microparticles. Finally, PROMIS-Quan enabled determination of the absolute quantitative changes in prostate specific antigen (PSA) upon treatment. We propose PROMIS-Quan as an innovative platform for biomarker discovery, validation, and quantification in both the biomedical research and in the clinical worlds. PMID:25624350

  3. Statistical challenges in the quantification of gunshot residue evidence.

    PubMed

    Gauriot, Romain; Gunaratnam, Lawrence; Moroni, Rossana; Reinikainen, Tapani; Corander, Jukka

    2013-09-01

    The discharging of a gun results in the formation of extremely small particles known as gunshot residues (GSR). These may be deposited on the skin and clothing of the shooter, on other persons present, and on nearby items or surfaces. Several factors and their complex interactions affect the number of detectable GSR particles, which can deeply influence the conclusions drawn from likelihood ratios or posterior probabilities for prosecution hypotheses of interest. We present Bayesian network models for casework examples and demonstrate that probabilistic quantification of GSR evidence can be very sensitive to the assumptions concerning the model structure, prior probabilities, and the likelihood components. This finding has considerable implications for the use of statistical quantification of GSR evidence in the legal process.
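
    The sensitivity the authors report can be seen even in a single-node toy version of such a model: the likelihood ratio for an observed particle count, under Poisson means assumed for the prosecution and defence hypotheses, moves by orders of magnitude as those means change. The means below are invented solely to show this sensitivity; the paper's Bayesian networks are far richer.

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    return lam**k * exp(-lam) / factorial(k)

# LR for k observed GSR particles: prosecution hypothesis (suspect fired,
# mean lam_p) vs defence hypothesis (background transfer, mean lam_d)
k = 3
for lam_p, lam_d in [(5.0, 0.2), (5.0, 0.8), (2.0, 0.8)]:
    lr = poisson_pmf(k, lam_p) / poisson_pmf(k, lam_d)
    print(f"lam_p={lam_p}, lam_d={lam_d}: LR = {lr:.1f}")
```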

  4. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    SciTech Connect

    Langenbrunner, James R.; Booker, Jane M; Hemez, Francois M; Salazar, Issac F; Ross, Timothy J

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. Realistically assessing uncertainty requires the engineer or scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for the assessment. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore PRAs lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with the words 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step in the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.

  5. Good quantification practices of flavours and fragrances by mass spectrometry

    PubMed Central

    Begnaud, Frédéric

    2016-01-01

    Over the past 15 years, chromatographic techniques with mass spectrometric detection have been increasingly used to monitor the rapidly expanding list of regulated flavour and fragrance ingredients. This trend entails a need for good quantification practices suitable for complex media, especially for multi-analyte methods. In this article, we present the experimental precautions needed to perform the analyses and ways to process the data according to the most recent approaches. This notably includes the identification of analytes during their quantification, and method validation based on accuracy profiles when applied to real matrices. A brief survey of application studies based on such practices is given. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644977

  6. Compositional Solution Space Quantification for Probabilistic Software Analysis

    NASA Technical Reports Server (NTRS)

    Borges, Mateus; Pasareanu, Corina S.; Filieri, Antonio; d'Amorim, Marcelo; Visser, Willem

    2014-01-01

    Probabilistic software analysis aims at quantifying how likely a target event is to occur during program execution. Current approaches rely on symbolic execution to identify the conditions to reach the target event and try to quantify the fraction of the input domain satisfying these conditions. Precise quantification is usually limited to linear constraints, while only approximate solutions can be provided in general through statistical approaches. However, statistical approaches may fail to converge to an acceptable accuracy within a reasonable time. We present a compositional statistical approach for the efficient quantification of solution spaces for arbitrarily complex constraints over bounded floating-point domains. The approach leverages interval constraint propagation to improve the accuracy of the estimation by focusing the sampling on the regions of the input domain containing the sought solutions. Preliminary experiments show significant improvement on previous approaches both in results accuracy and analysis time.
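
    Stripped of the compositional and interval-propagation machinery, the statistical core is hit-or-miss estimation of the satisfying fraction of a bounded input box. A sketch with an arbitrary nonlinear constraint standing in for a path condition; interval constraint propagation, as in the paper, would first focus the sampling on boxes that can actually contain solutions.

```python
import numpy as np

# Hit-or-miss estimate of the fraction of a bounded floating-point
# domain satisfying a (made-up) path condition, with a 95% CI.
rng = np.random.default_rng(7)
n = 1_000_000
x = rng.uniform(-10.0, 10.0, n)
y = rng.uniform(-10.0, 10.0, n)

satisfied = (np.sin(x) * y + x**2 < 5.0) & (x * y > 1.0)
p = satisfied.mean()
stderr = np.sqrt(p * (1 - p) / n)
print(f"P(target) ~ {p:.4f} +/- {1.96 * stderr:.4f} (95% CI)")
```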

  7. Experimental validation of 2D uncertainty quantification for DIC.

    SciTech Connect

    Reu, Phillip L.

    2010-06-01

    Because digital image correlation (DIC) has become such an important and standard tool in the toolbox of experimental mechanicists, a complete uncertainty quantification of the method is needed. It should be remembered that each DIC setup and series of images will have a unique uncertainty based on the calibration quality and the image and speckle quality of the analyzed images. Any pretest work done with a calibrated DIC stereo-rig to quantify the errors using known shapes and translations, while useful, does not necessarily reveal the uncertainty of a later test. This is particularly true in high-speed applications, where actual test images are often less than ideal. Work on the mathematical underpinnings of DIC uncertainty quantification has previously been completed and published; this paper presents the corresponding experimental work used to check the validity of the uncertainty equations.

  8. Multiscale recurrence quantification analysis of order recurrence plots

    NASA Astrophysics Data System (ADS)

    Xu, Mengjia; Shang, Pengjian; Lin, Aijing

    2017-03-01

    In this paper, we propose a new method, multiscale recurrence quantification analysis (MSRQA), to analyze the structure of order recurrence plots. MSRQA is based on order patterns over a range of time scales. Compared with conventional recurrence quantification analysis (RQA), MSRQA can reveal richer and more recognizable information on the local characteristics of diverse systems, successfully describing their recurrence properties. Both synthetic series and stock market indexes exhibit recurrence properties at large time scales that differ considerably from those at a single time scale. Some systems present clearer recurrence patterns at large time scales. These results demonstrate that the new approach is effective for distinguishing three similar stock market systems and for revealing inherent differences among them.
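
    A minimal sketch of the two ingredients, order-pattern recurrence and coarse-graining, is given below. It computes only the recurrence rate; published RQA offers many further measures, and the test series is synthetic.

```python
import numpy as np

def order_recurrence_rate(x, dim=3, scale=1):
    """Recurrence rate of an order recurrence plot: times i, j recur when
    their length-`dim` embedding vectors share the same rank (order)
    pattern. `scale` > 1 first coarse-grains the series by averaging
    non-overlapping windows, as in multiscale analyses."""
    if scale > 1:
        m = len(x) // scale
        x = x[:m * scale].reshape(m, scale).mean(axis=1)
    # order pattern of each embedding vector, encoded by its argsort
    patterns = np.array([np.argsort(x[i:i + dim])
                         for i in range(len(x) - dim + 1)])
    same = (patterns[:, None, :] == patterns[None, :, :]).all(axis=-1)
    return same.mean()

rng = np.random.default_rng(3)
series = np.cumsum(rng.standard_normal(600))   # toy nonstationary series
for s in (1, 4, 16):
    print(f"scale {s}: RR = {order_recurrence_rate(series, scale=s):.3f}")
```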

  9. Quantification of viable helminth eggs in samples of sewage sludge.

    PubMed

    Rocha, Maria Carolina Vieira da; Barés, Monica Eboly; Braga, Maria Cristina Borba

    2016-10-15

    For the application of sewage sludge as fertilizer, the absence of pathogenic organisms, such as viable helminth eggs, is of fundamental importance. Thus, these organisms must be quantified with reliable and accurate methodologies. Nevertheless, to date there is no consensus on the adoption of a universal methodology for the detection and quantification of viable helminth eggs. It is therefore necessary to instigate a debate on the different protocols currently in use and to assemble relevant information to assist in the development of a more comprehensive and accurate method for quantifying viable helminth eggs in samples of sewage sludge and its derivatives.

  10. Uncertainty Quantification for Large-Scale Ice Sheet Modeling

    SciTech Connect

    Ghattas, Omar

    2016-02-05

    This report summarizes our work to develop advanced forward and inverse solvers and uncertainty quantification capabilities for a nonlinear 3D full Stokes continental-scale ice sheet flow model. The components include: (1) forward solver: a new state-of-the-art parallel adaptive scalable high-order-accurate mass-conservative Newton-based 3D nonlinear full Stokes ice sheet flow simulator; (2) inverse solver: a new adjoint-based inexact Newton method for solution of deterministic inverse problems governed by the above 3D nonlinear full Stokes ice flow model; and (3) uncertainty quantification: a novel Hessian-based Bayesian method for quantifying uncertainties in the inverse ice sheet flow solution and propagating them forward into predictions of quantities of interest such as ice mass flux to the ocean.

  11. Standardless Quantification of Heavy Elements by Electron Probe Microanalysis.

    PubMed

    Moy, Aurélien; Merlet, Claude; Dugne, Olivier

    2015-08-04

    Absolute Mα and Mβ X-ray intensities were measured for the elements Pt, Au, Pb, U, and Th by electron impact for energies ranging from 6 to 38 keV. Experimental data were obtained by measuring the X-ray intensity emitted from bulk samples with an electron microprobe using high-resolution wavelength-dispersive spectrometers. Recorded X-ray intensities were converted into absolute X-ray yields by evaluation of the detector efficiency and then compared with X-ray intensities calculated by means of Monte Carlo simulations. Simulated Mα and Mβ X-ray intensities were found to be in good agreement with the measurements, allowing their use in standardless quantification methods. A procedure and a software program were developed to accurately obtain virtual standard values. Standardless quantifications of Pb and U were tested on standards of PbS, PbTe, PbCl2, vanadinite, and UO2.

  12. Uncertainty Quantification and Validation for RANS Turbulence Models

    NASA Astrophysics Data System (ADS)

    Oliver, Todd; Moser, Robert

    2011-11-01

    Uncertainty quantification and validation procedures for RANS turbulence models are developed and applied. The procedures used here rely on a Bayesian view of probability. In particular, the uncertainty quantification methodology requires stochastic model development, model calibration, and model comparison, all of which are pursued using tools from Bayesian statistics. Model validation is also pursued in a probabilistic framework. The ideas and processes are demonstrated on a channel flow example. Specifically, a set of RANS models--including Baldwin-Lomax, Spalart-Allmaras, k-ε, k-ω, and v²-f--and uncertainty representations are analyzed using DNS data for fully-developed channel flow. Predictions of various quantities of interest and the validity (or invalidity) of the various models for making those predictions will be examined. This work is supported by the Department of Energy [National Nuclear Security Administration] under Award Number [DE-FC52-08NA28615].

  13. Collaborative framework for PIV uncertainty quantification: comparative assessment of methods

    NASA Astrophysics Data System (ADS)

    Sciacchitano, Andrea; Neal, Douglas R.; Smith, Barton L.; Warner, Scott O.; Vlachos, Pavlos P.; Wieneke, Bernhard; Scarano, Fulvio

    2015-07-01

    A posteriori uncertainty quantification of particle image velocimetry (PIV) data is essential to obtain accurate estimates of the uncertainty associated with a given experiment. This is particularly relevant when measurements are used to validate computational models or in design and decision processes. In spite of the importance of the subject, the first PIV uncertainty quantification (PIV-UQ) methods have been developed only in the last three years. The present work is a comparative assessment of four approaches recently proposed in the literature: the uncertainty surface method (Timmins et al 2012), the particle disparity approach (Sciacchitano et al 2013), the peak ratio criterion (Charonko and Vlachos 2013) and the correlation statistics method (Wieneke 2015). The analysis is based upon experiments conducted for this specific purpose, where several measurement techniques are employed simultaneously. The performances of the above approaches are surveyed across different measurement conditions and flow regimes.

  14. Bquant - Novel script for batch quantification of LCMS data.

    PubMed

    Rožman, Marko; Petrović, Mira

    2016-01-01

    Quantitative target analysis by liquid chromatography coupled to mass spectrometry (LCMS) is ubiquitous in environmental, metabolomic and toxicological studies. Targeted LCMS methods are capable of the simultaneous determination of literally hundreds of analytes. Although acquiring the instrumental data is very fast, data post-processing, i.e. quantification, can be a time-consuming step and/or dependent on various commercial software packages. To address this drawback, a Wolfram Mathematica script for batch quantification of LCMS data was created. The script works with the direct outputs of the integration algorithms of different instrument control software packages, or with custom-created outputs. Key benefits of the Bquant script are:
    •a simple and automated routine for batch-mode quantification
    •a vast improvement in processing time (especially compared with manual interpretation)
    •data can be quickly re-analysed using different inputs
    The script was validated on various datasets, some of which are provided as working examples.

  15. Metering error quantification under voltage and current waveform distortion

    NASA Astrophysics Data System (ADS)

    Wang, Tao; Wang, Jia; Xie, Zhi; Zhang, Ran

    2017-09-01

    With the integration of more and more renewable energy sources and distorting loads into the power grid, voltage and current waveform distortion causes metering errors in smart meters. Because of its negative effects on metering accuracy and fairness, the combined error of energy metering is an important subject of study. In this paper, after comparing theoretical metering values with recorded values under different meter modes for linear and nonlinear loads, a quantification method for the metering mode error under waveform distortion is proposed. Based on the metering and time-division multiplier principles, a quantification method for the metering accuracy error is also proposed. Combining the analysis of mode error and accuracy error, a comprehensive error analysis method is presented that is suitable for renewable energy sources and nonlinear loads. The proposed method has been verified by simulation.

  16. Analytical Methods for the Quantification of Histamine and Histamine Metabolites.

    PubMed

    Bähre, Heike; Kaever, Volkhard

    2017-03-21

    The endogenous metabolite histamine (HA) is synthesized in various mammalian cells but can also be ingested from exogenous sources. It is involved in a plethora of physiological and pathophysiological processes. So far, four different HA receptors (H1R-H4R) have been described and numerous HAR antagonists have been developed. Contemporary investigations regarding the various roles of HA and its main metabolites have been hampered by the lack of highly specific and sensitive analytic methods for all of these analytes. Liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS) is the method of choice for identification and sensitive quantification of many low-molecular weight endogenous metabolites. In this chapter, different methodological aspects of HA quantification as well as recommendations for LC-MS/MS methods suitable for analysis of HA and its main metabolites are summarized.

  17. Feature quantification and abnormal detection on cervical squamous epithelial cells.

    PubMed

    Zhao, Mingzhu; Chen, Lei; Bian, Linjie; Zhang, Jianhua; Yao, Chunyan; Zhang, Jianwei

    2015-01-01

    Feature analysis and classification detection of abnormal cells from images for pathological analysis are important issues for the realization of computer-assisted disease diagnosis. This paper studies a method for cervical squamous epithelial cells. Based on the cervical cytological classification standard and expert diagnostic experience, expressive descriptors are extracted according to the morphology, color, and texture features of cervical squamous epithelial cells. Further, quantitative descriptors related to cytopathology are derived as well, including morphological difference degree, cell hyperkeratosis, and deeply stained degree. The relationship between quantified values and pathological features can be established by these descriptors. Finally, an effective method is proposed for detecting abnormal cells based on feature quantification. Integrated with clinical experience, the method can realize fast abnormal cell detection and preliminary cell classification.

  18. Quantification of Trace Chemicals Using Vehicle Cabin Atmosphere Monitor

    NASA Technical Reports Server (NTRS)

    Lee, Seungwon; Mandrake, Lukas; Bornstein, Benjamin; Bue, Brian

    2009-01-01

    A system to monitor the concentrations of trace chemicals in the cabin atmosphere is one of the most critical components of long-duration human flight missions. The Vehicle Cabin Atmosphere Monitor (VCAM) is a miniature gas chromatograph mass spectrometer system to be used to detect and quantify trace chemicals in the International Space Station. We developed an autonomous computational process to quantify trace chemicals for use in VCAM. The process involves the design of a measured-signal quantification scheme, the construction of concentration curves (i.e., the relationship between concentration and ion count measured by VCAM), a decision rule for applying high- or low-gain concentration curves, and the detection of saturation, low signals, and outliers. When the developed quantification process is applied, the average concentration errors for most of the trace chemicals are found to be between 14% and 66%.

  19. Targeted Quantification of the Glycated Peptides of Human Serum Albumin.

    PubMed

    Vannuruswamy, Garikapati; Korwar, Arvind M; Jagadeeshaprasad, Mashanipalya G; Kulkarni, Mahesh J

    2017-01-01

    Glycated human serum albumin (HSA) serves as an important marker for monitoring glycemic status. Developing methods for unambiguous identification and quantification of glycated peptides of HSA using high-throughput technologies such as mass spectrometry has great clinical significance. The following protocol describes the construction of reference spectral libraries for Amadori-modified lysine (AML)-, N(ε)-(carboxymethyl)lysine (CML)-, and N(ε)-(carboxyethyl)lysine (CEL)-modified peptides of synthetically modified HSA using high-resolution mass spectrometers. The protocol also describes workflows for unambiguous identification and quantification of glycated peptides of HSA in clinical plasma using the standard spectral libraries by various mass spectrometry approaches such as parallel reaction monitoring (PRM), sequential window acquisition of all theoretical fragment ion spectra (SWATH), and MS(E).

  20. High-throughput quantification of hydroxyproline for determination of collagen.

    PubMed

    Hofman, Kathleen; Hall, Bronwyn; Cleaver, Helen; Marshall, Susan

    2011-10-15

    An accurate and high-throughput assay for collagen is essential for collagen research and the development of collagen products. Hydroxyproline is routinely assayed to provide a measurement for collagen quantification. The current method for determining hydroxyproline is limited by the time required for sample preparation, namely acid hydrolysis and neutralization prior to assay. This work describes alkali hydrolysis conditions that, when combined with the colorimetric assay defined by Woessner, provide a high-throughput, accurate method for measuring hydroxyproline.
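
    Collagen content is then obtained from the measured hydroxyproline through a fixed conversion factor; a factor near 7.5 (hydroxyproline taken as roughly 13-14% of collagen by mass) is common for mammalian collagen, but the exact value is tissue-dependent and is an assumption here, not a figure from the paper:

      def collagen_from_hyp(hyp_mg, factor=7.5):
          """Estimate collagen mass (mg) from hydroxyproline mass (mg).

          factor ~ 7.5 assumes hydroxyproline is ~13.3% of collagen by mass;
          adjust for the tissue and species under study (assumed value).
          """
          return hyp_mg * factor

      print(collagen_from_hyp(0.40))  # 0.40 mg Hyp -> ~3.0 mg collagen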

  1. The Challenges of Credible Thermal Protection System Reliability Quantification

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.

    2013-01-01

    The paper discusses several of the challenges associated with developing a credible reliability estimate for a human-rated crew capsule thermal protection system. The process of developing such a credible estimate is subject to the quantification, modeling and propagation of numerous uncertainties within a probabilistic analysis. Specific investment recommendations to improve the reliability prediction, chosen among various potential testing and programmatic options, are then developed through Bayesian analysis.

  2. Visualization and Quantification of Rotor Tip Vortices in Helicopter Flows

    NASA Technical Reports Server (NTRS)

    Kao, David L.; Ahmad, Jasim U.; Holst, Terry L.

    2015-01-01

    This paper presents an automated approach for effective extraction, visualization, and quantification of vortex core radii from the Navier-Stokes simulations of a UH-60A rotor in forward flight. We adopt a scaled Q-criterion to determine vortex regions and then perform vortex core profiling in these regions to calculate vortex core radii. This method provides an efficient way of visualizing and quantifying the blade tip vortices. Moreover, the vortex core radii are displayed graphically in a plane.
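
    The (unscaled) Q-criterion used for vortex detection is Q = 0.5(||Ω||² − ||S||²), where S and Ω are the symmetric and antisymmetric parts of the velocity gradient tensor; rotation-dominated regions have Q > 0. A minimal sketch on a gridded velocity field (the paper's scaling of Q is not reproduced here):

      import numpy as np

      def q_criterion(u, v, w, dx=1.0, dy=1.0, dz=1.0):
          """Q = 0.5 * (||Omega||^2 - ||S||^2) on a uniform grid."""
          # Velocity gradient tensor via central differences: J[..., i, j] = du_i/dx_j.
          grads = [np.gradient(f, dx, dy, dz) for f in (u, v, w)]
          J = np.stack([np.stack(g, axis=-1) for g in grads], axis=-2)
          S = 0.5 * (J + np.swapaxes(J, -1, -2))      # strain-rate tensor
          O = 0.5 * (J - np.swapaxes(J, -1, -2))      # rotation tensor
          return 0.5 * (np.sum(O**2, axis=(-1, -2)) - np.sum(S**2, axis=(-1, -2)))

      # Example: a solid-body-rotation column should give Q > 0 in its core.
      x, y, z = np.meshgrid(np.linspace(-1, 1, 32), np.linspace(-1, 1, 32),
                            np.linspace(0, 1, 8), indexing="ij")
      u, v, w = -y, x, np.zeros_like(x)
      Q = q_criterion(u, v, w, 2/31, 2/31, 1/7)
      print(Q[16, 16, 4] > 0)  # True near the vortex core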

  3. Improved Quantification of Plasma Catecholamines by the Radioenzymic Kit Method.

    DTIC Science & Technology

    1982-11-01

    Only report-documentation fragments of this record survive extraction (report SAM-TR-82-40). An improved version of the radioenzymic kit method for plasma catecholamine analyses is reported; the departure features the use of "mean net standards" instead of individual internal standards. Indexing terms include human/swine plasma catecholamine assays, plasma catecholamines by kit method, and enzymic catecholamine determination.

  4. Quantification of intraocular surgery motions with an electromagnetic tracking system.

    PubMed

    Son, Ji; Bourges, Jean-Louis; Culjat, Martin O; Nistor, Vasile; Dutson, Erik P; Carman, Gregory P; Hubschman, Jean Pierre

    2009-01-01

    Motion tracking was performed during a combined phacoemulsification (PKE) and pars plana vitrectomy (PPV) procedure on a pig eyeball. The UCLA Laparoscopic Training System (UCLA-LTS), which consists of electromagnetic sensors attached to the surgical tools to measure three-dimensional spatial vectors, was modified to enable quantification of intraocular surgery motions. The range of motion and time taken to complete the given task were successfully recorded.

  5. Quantification of Thermal Lensing Using an Artificial Eye

    DTIC Science & Technology

    2013-01-01

    Only report-documentation fragments of this record survive extraction. The work, "Quantification of Thermal Lensing Using an Artificial Eye" by Erica L. Towle (Air Force Research Laboratory, Fort Sam Houston, TX, and the Biomedical ...), John M. Rickman, Andrew K. ..., was performed under contract FA8650-08-6930 (program element 0603231F) and appeared in the IEEE Journal of Selected Topics in Quantum Electronics (accepted April 22, 2013).

  6. Semantics and Quantification in Natural Language Question Answering

    DTIC Science & Technology

    1977-11-01

    Only fragments of this report survive extraction. They describe a meaning-representation language (MRL) for natural-language question answering, with sections on designators, propositions, commands, quantification, and the specification of the MRL syntax, and note behavior that is "not universally so (for example, when a function is applied to a quantified noun phrase - see Functional Nesting below)". Keywords: query languages, question-answering, semantic interpretation, semantic rules, syntax-semantics interaction.

  7. Universal Quantification in a Constraint-Based Planner

    NASA Technical Reports Server (NTRS)

    Golden, Keith; Frank, Jeremy; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Constraints and universal quantification are both useful in planning, but handling universally quantified constraints presents some novel challenges. We present a general approach to proving the validity of universally quantified constraints. The approach essentially consists of checking that the constraint is not violated for any member of the universe. We show that this approach can sometimes be applied even when variable domains are infinite, and we present some useful special cases where this can be done efficiently.
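
    For a finite universe, the validity check described above is a direct universal test; the constraint and domain below are hypothetical, and the sketch deliberately omits the paper's harder case of infinite variable domains:

      from typing import Callable, Iterable

      def holds_universally(universe: Iterable, constraint: Callable[[object], bool]) -> bool:
          """True iff the constraint holds for every member of the universe,
          i.e. no member violates it."""
          return all(constraint(x) for x in universe)

      # Hypothetical planning check: "every file in the directory is under 1 MB".
      files = [("a.log", 120_000), ("b.log", 480_000), ("core", 9_000_000)]
      print(holds_universally(files, lambda f: f[1] < 1_000_000))  # False: "core" violates it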

  8. Color correction for automatic fibrosis quantification in liver biopsy specimens

    PubMed Central

    Murakami, Yuri; Abe, Tokiya; Hashiguchi, Akinori; Yamaguchi, Masahiro; Saito, Akira; Sakamoto, Michiie

    2013-01-01

    Context: For a precise and objective quantification of liver fibrosis, quantitative evaluations through image analysis have been utilized. However, manual operations are required in most cases for extracting fiber areas because of the color variation included in digital pathology images. Aims: The purpose of this research is to propose a color correction method for whole slide images (WSIs) of Elastica van Gieson (EVG)-stained liver biopsy tissue specimens and to realize automated operation of image analysis for fibrosis quantification. Materials and Methods: Our experimental dataset consisted of 38 WSIs of liver biopsy specimens collected from 38 chronic viral hepatitis patients from multiple medical facilities, stained with EVG and scanned at ×20 using a NanoZoomer 2.0-HT (Hamamatsu Photonics K.K., Hamamatsu, Japan). Color correction was performed by modifying the color distribution of a target WSI so as to fit the reference, where the color distribution was modeled by a set of two triangle pyramids. Using color-corrected WSIs, fibrosis quantification was performed based on tissue classification analysis. Statistical Analysis Used: Spearman's rank correlation coefficients were calculated between liver stiffness measured by transient elastography and the median area ratio of collagen fibers calculated from tissue classification results. Results: Statistical analysis showed a significant correlation of r = 0.61-0.68 even when tissue classifiers were trained using a subset of WSIs, while the correlation coefficients were reduced to r = 0.40-0.50 without color correction. Conclusions: Fibrosis quantification accompanied by the proposed color correction method could provide an objective evaluation tool for liver fibrosis, which complements semi-quantitative histologic evaluation systems. PMID:24524002

  9. Neutron-Encoded Protein Quantification by Peptide Carbamylation

    NASA Astrophysics Data System (ADS)

    Ulbrich, Arne; Merrill, Anna E.; Hebert, Alexander S.; Westphall, Michael S.; Keller, Mark P.; Attie, Alan D.; Coon, Joshua J.

    2014-01-01

    We describe a chemical tag for duplex proteome quantification using neutron encoding (NeuCode). The method utilizes the straightforward, efficient, and inexpensive carbamylation reaction. We demonstrate the utility of NeuCode carbamylation by accurately measuring quantitative ratios from tagged yeast lysates mixed in known ratios and by applying this method to quantify differential protein expression in mice fed either a control or a high-fat diet.

  10. A refined methodology for modeling volume quantification performance in CT

    NASA Astrophysics Data System (ADS)

    Chen, Baiyu; Wilson, Joshua; Samei, Ehsan

    2014-03-01

    The utility of a CT lung nodule volume quantification technique depends on the precision of the quantification. To enable the evaluation of quantification precision, we previously developed a mathematical model that related precision to image resolution and noise properties in uniform backgrounds in terms of an estimability index (e'). The e' was shown to predict empirical precision across 54 imaging and reconstruction protocols, but with different correlation qualities for FBP and iterative reconstruction (IR) due to the non-linearity of IR impacted by anatomical structure. To better account for the non-linearity of IR, this study aimed to refine the noise characterization of the model in the presence of textured backgrounds. Repeated scans of an anthropomorphic lung phantom were acquired. Subtracted images were used to measure the image quantum noise, which was then used to adjust the noise component of the e' calculation measured from a uniform region. In addition to the model refinement, the validation of the model was further extended to 2 nodule sizes (5 and 10 mm) and 2 segmentation algorithms. Results showed that the magnitude of IR's quantum noise was significantly higher in structured backgrounds than in uniform backgrounds (ASiR, 30-50%; MBIR, 100-200%). With the refined model, the correlation between e' values and empirical precision no longer depended on the reconstruction algorithm. In conclusion, the model with refined noise characterization reflected the non-linearity of iterative reconstruction in structured backgrounds, and further showed successful prediction of quantification precision across a variety of nodule sizes, dose levels, slice thicknesses, reconstruction algorithms, and segmentation software.

  11. 3-Nitrotyrosine quantification methods: Current concepts and future challenges.

    PubMed

    Teixeira, Dulce; Fernandes, Rúben; Prudêncio, Cristina; Vieira, Mónica

    2016-06-01

    Measurement of 3-nitrotyrosine (3-NT) in biological samples can be used as a biomarker of nitrosative stress, since it is very stable and suitable for analysis. Increased 3-NT levels in biological samples have been associated with several physiological and pathological conditions. Different methods have been described for the detection and quantification of this molecule, such as (i) immunological methods; (ii) liquid chromatography, namely high-pressure liquid chromatography (HPLC)-based methods that use ultraviolet-visible (UV/VIS) absorption, electrochemical (ECD) and diode array (DAD) detection, liquid chromatography-mass spectrometry (LC-MS) and liquid chromatography-tandem mass spectrometry (LC-MS/MS); (iii) gas chromatography, such as gas chromatography-mass spectrometry (GC-MS) and gas chromatography-tandem mass spectrometry (GC-MS/MS). A literature review on nitrosative stress, protein nitration, and 3-NT quantification methods was carried out. This review covers the different methods for analysis of 3-NT that have been developed during the last years as well as the latest advances in the field. Overall, all methods present positive and negative aspects, although it is clear that chromatography-based methods offer good sensitivity and specificity. In this regard, GC-based methods exhibit the highest sensitivity in the quantification of 3-NT, although they require a prior, time-consuming derivatization step. Conversely, HPLC does not require such a derivatization step, although it is not as accurate as GC. It becomes clear that all the methods described in this literature review, although accurate for 3-NT quantification, need to be improved regarding both sensitivity and specificity. Moreover, optimization of the protocols that have been described is clearly needed. Copyright © 2016 Elsevier B.V. and Société Française de Biochimie et Biologie Moléculaire (SFBBM). All rights reserved.

  12. Incorporating Functional Gene Quantification into Traditional Decomposition Models

    NASA Astrophysics Data System (ADS)

    Todd-Brown, K. E.; Zhou, J.; Yin, H.; Wu, L.; Tiedje, J. M.; Schuur, E. A. G.; Konstantinidis, K.; Luo, Y.

    2014-12-01

    Incorporating new genetic quantification measurements into traditional substrate-pool models represents a substantial challenge. These decomposition models are built around the idea that substrate availability, together with environmental drivers, limits carbon dioxide respiration rates. In this paradigm, microbial communities optimally adapt to a given substrate and environment on much shorter time scales than the carbon flux of interest. By characterizing the relative shift in biomass of these microbial communities, we informed previously poorly constrained parameters in traditional decomposition models. In this study we coupled a 9 month laboratory incubation study with quantitative gene measurements, traditional CO2 flux measurements, and initial soil organic carbon quantification. GeoChip 5.0 was used to quantify the functional genes associated with carbon cycling at 2 weeks, 3 months and 9 months. We then combined the genes which 'collapsed' over the experiment and assumed that this tracked the relative change in the biomass associated with the 'fast' pool. We further assumed that this biomass was proportional to the 'fast' SOC pool and were thus able to constrain the relative change in the fast SOC pool in our 3-pool decomposition model. We found that the biomass quantification described above, combined with traditional CO2 flux and SOC measurements, improves the transfer coefficient estimation in traditional decomposition models. Transfer coefficients are very difficult to characterize using traditional CO2 flux measurements, so DNA quantification provides new and significant information about the system. Over a 100 year simulation, these new biologically informed parameters resulted in an additional 10% of SOC loss over the traditionally informed parameters.
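
    A three-pool, first-order decomposition model of the kind being constrained here lets each soil-carbon pool decay at its own rate, with transfer coefficients routing a fraction of the decayed carbon between pools and the remainder respired as CO2. A minimal sketch with made-up rates and transfer fractions:

      import numpy as np

      # Pools: [fast, slow, passive] SOC (g C); made-up decay rates k (1/yr)
      # and transfer fractions a[i, j] of decayed pool j routed into pool i.
      k = np.array([2.0, 0.1, 0.005])
      a = np.array([[0.0, 0.0, 0.0],
                    [0.3, 0.0, 0.0],     # 30% of fast-pool decay enters the slow pool
                    [0.0, 0.1, 0.0]])    # 10% of slow-pool decay enters the passive pool

      def step(C, dt=0.01):
          decay = k * C                       # first-order decay of each pool
          dC = a @ decay - decay              # transfer gains minus decay losses
          co2 = (1 - a.sum(axis=0)) * decay   # the untransferred fraction is respired
          return C + dC * dt, co2.sum() * dt

      C = np.array([100.0, 800.0, 1500.0])
      total_co2 = 0.0
      for _ in range(int(100 / 0.01)):        # 100-year forward-Euler simulation
          C, flux = step(C)
          total_co2 += flux
      print(np.round(C, 1), round(total_co2, 1))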

  13. Quantification Of Margins And Uncertainties: A Bayesian Approach (full Paper)

    SciTech Connect

    Wallstrom, Timothy C

    2008-01-01

    Quantification of Margins and Uncertainties (QMU) is 'a formalism for dealing with the reliability of complex technical systems, and the confidence which can be placed in estimates of that reliability' (Eardley et al., 2005). In this paper, we show how QMU may be interpreted in the framework of Bayesian statistical inference, using a probabilistic network. The Bayesian approach clarifies the probabilistic underpinnings of the formalism, and shows how the formalism can be used for decision-making.

  14. Quantification of jasmonic and salicylic acids in rice seedling leaves.

    PubMed

    Cho, Kyoungwon; Han, Oksoo; Tamogami, Shigeru; Shibato, Junko; Kubo, Akihiro; Agrawal, Ganesh Kumar; Rakwal, Randeep

    2013-01-01

    Jasmonic acid (JA) and salicylic acid (SA) are critical signaling components involved in various aspects of plant growth, development, and defense. Their constitutive levels vary from plant to plant and also from tissue to tissue within the same plant. Moreover, their quantitative levels change when a plant is exposed to biotic and abiotic stresses. To better understand JA- and SA-mediated signaling and metabolic pathways, it is important to precisely quantify their levels in plants, tissues, and organs. However, their extraction and quantification are not trivial and remain technically challenging. Efforts have been made in various laboratories to develop a simple, standard procedure that can be utilized for quantification of JA and SA. Here, we present the experimental procedure and our decade of experience in extracting and quantifying them in an absolute manner in leaves of rice seedlings. This method has been applied to both monocotyledonous and dicotyledonous plants for absolute quantification of JA and SA. As collaboration is key to rapid progress in science and technology, we are always open to sharing our experience in this field with any active research group, with the aim of improving the procedure further and eventually connecting the importance of JA and SA quantitative levels with networks of signaling and metabolic pathways in plants.

  15. Qualification and quantification of fish protein in prepared surimi crabstick.

    PubMed

    Reed, Z H; Park, J W

    2008-06-01

    Species identification and protein quantification in surimi crabstick were achieved using sodium dodecyl sulfate-polyacrylamide gel electrophoresis (SDS-PAGE). When the Lowry and Kjeldahl protein determination methods were compared, the former showed more consistent results. Densitometric scanning of the gels was used for quantification of total fish protein as well as total egg white protein. The lower molecular weight proteins, 30 kDa and lower, proved to be the most useful in fish species identification as well as egg white protein addition. Using a combination of the myosin heavy chain band and the species-specific myosin light chain (Alaska pollock: 22.5 kDa; Pacific whiting: 24.4 kDa) proved the most accurate in calculating fish protein content of the crabstick sample, while for those samples that contained egg white, quantification was accomplished from the densitometric analysis of the overlapping bands of actin (45 kDa) from fish and ovalbumin from egg white. Lysozyme (14.3 kDa) proved to be a unique protein band in determining the presence of egg white when the content of dried egg white was equal to or exceeded 0.5% of the total weight of the final crabstick.

  16. Quantification of Efficiency of Beneficiation of Lunar Regolith

    NASA Technical Reports Server (NTRS)

    Trigwell, Steve; Lane, John; Captain, James; Weis, Kyle; Quinn, Jacqueline; Watanabe, Fumiya

    2011-01-01

    Electrostatic beneficiation of lunar regolith is being researched at Kennedy Space Center to enhance the ilmenite concentration of the regolith for the production of oxygen in in-situ resource utilization on the lunar surface. Ilmenite enrichment of up to 200% was achieved using lunar simulants. For the most accurate quantification of the regolith particles, standard petrographic methods are typically followed, but in order to optimize the process, many hundreds of samples were generated in this study, which made the standard analysis methods time prohibitive. In the current studies, X-ray photoelectron spectroscopy (XPS) and scanning electron microscopy/energy dispersive spectroscopy (SEM/EDS) were used, which could automatically and quickly analyze many separated fractions of lunar simulant. In order to test the accuracy of the quantification, test mixture samples of known quantities of ilmenite (2, 5, 10, and 20 wt%) in silica (pure quartz powder) were analyzed by XPS and EDS. The results showed that quantification of low concentrations of ilmenite in silica could be accurately achieved by both XPS and EDS, knowing the limitations of the techniques.

  17. Analytical strategies for the global quantification of intact proteins.

    PubMed

    Collier, Timothy S; Muddiman, David Charles

    2012-09-01

    The quantification of intact proteins is a relatively recent development in proteomics. In eukaryotic organisms, proteins are present as multiple isoforms as the result of variations in genetic code, alternative splicing, post-translational modification and other processing events. Understanding the identities and biological functions of these isoforms and how their concentrations vary across different states is the central goal of proteomics. To date, the bulk of proteomics research utilizes a "bottom-up" approach, digesting proteins into their more manageable constitutive peptides, but sacrificing information about the specific isoform and combinations of post-translational modifications present on the protein. Very specific strategies for protein quantification such as the enzyme-linked immunosorbent assay and Western blot are commonplace in laboratories and clinics, but impractical for the study of global biological changes. Herein, we describe strategies for the quantification of intact proteins, their distinct advantages, and challenges to their employment. Techniques contained in this review include the more traditional and widely employed methodology of differential gel electrophoresis and more recently developed mass spectrometry-based techniques including metabolic labeling, chemical labeling, and label-free methodologies.

  18. Gas plume quantification in downlooking hyperspectral longwave infrared images

    NASA Astrophysics Data System (ADS)

    Turcotte, Caroline S.; Davenport, Michael R.

    2010-10-01

    Algorithms have been developed to support quantitative analysis of a gas plume using down-looking airborne hyperspectral long-wave infrared (LWIR) imagery. The resulting gas quantification "GQ" tool estimates the quantity of one or more gases at each pixel, and estimates uncertainty based on factors such as atmospheric transmittance, background clutter, and plume temperature contrast. GQ uses gas-insensitive segmentation algorithms to classify the background very precisely so that it can infer gas quantities from the differences between plume-bearing pixels and similar non-plume pixels. It also includes MODTRAN-based algorithms to iteratively assess various profiles of air temperature, water vapour, and ozone, and select the one that implies smooth emissivity curves for the (unknown) materials on the ground. GQ then uses a generalized least-squares (GLS) algorithm to simultaneously estimate the most likely mixture of background (terrain) material and foreground plume gases. Cross-linking of plume temperature to the estimated gas quantity is very non-linear, so the GLS solution was iteratively assessed over a range of plume temperatures to find the best fit to the observed spectrum. Quantification errors due to local variations in the camera-to-pixel distance were suppressed using a subspace projection operator. Lacking detailed depth-maps for real plumes, the GQ algorithm was tested on synthetic scenes generated by the Digital Imaging and Remote Sensing Image Generation (DIRSIG) software. Initial results showed pixel-by-pixel gas quantification errors of less than 15% for a Freon 134a plume.
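
    The generalized least-squares step (estimating the most likely mixture of background and plume-gas spectra) reduces to the standard GLS estimator b = (A'WA)^(-1) A'Wy, with A holding the candidate spectra, y the observed spectrum, and W an inverse noise-covariance weighting. A minimal sketch with synthetic spectra; the iteration over plume temperature is omitted:

      import numpy as np

      rng = np.random.default_rng(2)
      n_bands = 60

      # Synthetic column spectra: background level plus two gas absorption features.
      A = np.column_stack([
          np.ones(n_bands),                                     # background level
          np.exp(-0.5 * ((np.arange(n_bands) - 20) / 3) ** 2),  # gas 1 feature
          np.exp(-0.5 * ((np.arange(n_bands) - 42) / 4) ** 2),  # gas 2 feature
      ])
      beta_true = np.array([1.0, 0.30, 0.12])
      noise_sd = 0.01 * (1 + np.arange(n_bands) / n_bands)      # heteroscedastic noise
      y = A @ beta_true + rng.normal(0, noise_sd)

      # GLS: weight each band by its inverse noise variance.
      W = np.diag(1.0 / noise_sd**2)
      beta_hat = np.linalg.solve(A.T @ W @ A, A.T @ W @ y)
      print(np.round(beta_hat, 3))   # close to [1.0, 0.30, 0.12]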

  19. Leveraging transcript quantification for fast computation of alternative splicing profiles

    PubMed Central

    Alamancos, Gael P.; Pagès, Amadís; Trincado, Juan L.; Bellora, Nicolás; Eyras, Eduardo

    2015-01-01

    Alternative splicing plays an essential role in many cellular processes and bears major relevance in the understanding of multiple diseases, including cancer. High-throughput RNA sequencing allows genome-wide analyses of splicing across multiple conditions. However, the increasing number of available data sets represents a major challenge in terms of computation time and storage requirements. We describe SUPPA, a computational tool to calculate relative inclusion values of alternative splicing events, exploiting fast transcript quantification. SUPPA's accuracy, assessed against experimentally validated events using simulated as well as real RNA-sequencing data, is comparable and sometimes superior to that of standard methods. We assess the variability in terms of the choice of annotation and provide evidence that using complete transcripts rather than more transcripts per gene provides better estimates. Moreover, SUPPA coupled with de novo transcript reconstruction methods does not achieve accuracies as high as using quantification of known transcripts, but remains comparable to existing methods. Finally, we show that SUPPA is more than 1000 times faster than standard methods. Coupled with fast transcript quantification, SUPPA provides inclusion values at a much higher speed than existing methods without compromising accuracy, thereby facilitating the systematic splicing analysis of large data sets with limited computational resources. The software is implemented in Python 2.7 and is available under the MIT license at https://bitbucket.org/regulatorygenomicsupf/suppa. PMID:26179515
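
    The relative inclusion value SUPPA computes is the percent spliced-in (PSI): the summed abundance (e.g. TPM) of the transcripts that include an event divided by the summed abundance of all transcripts compatible with the event. A minimal sketch with hypothetical transcript abundances:

      def psi(tpm: dict, inclusion_ids: set) -> float:
          """PSI = sum(TPM of event-including transcripts) / sum(TPM of all
          transcripts compatible with the event)."""
          total = sum(tpm.values())
          if total == 0:
              return float("nan")
          return sum(v for t, v in tpm.items() if t in inclusion_ids) / total

      # Hypothetical exon-skipping event: ENST01/ENST02 include the exon, ENST03 skips it.
      tpm = {"ENST01": 12.0, "ENST02": 3.0, "ENST03": 5.0}
      print(round(psi(tpm, {"ENST01", "ENST02"}), 3))  # 0.75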

  20. Outcome quantification using SPHARM-PDM toolbox in orthognathic surgery

    PubMed Central

    Cevidanes, Lucia; Zhu, HongTu; Styner, Martin

    2011-01-01

    Purpose Quantification of surgical outcomes in longitudinal studies has led to significant progress in the treatment of dentofacial deformity, both by offering options to patients who might not otherwise have been recommended for treatment and by clarifying the selection of appropriate treatment methods. Most existing surgical treatments have not been assessed in a systematic way. This paper presents the quantification of surgical outcomes in orthognathic surgery via our localized shape analysis framework. Methods In our setting, planning and surgical simulation is performed using the surgery planning software CMFapp. We then employ the SPHARM-PDM to measure the difference between pre-surgery and virtually simulated post-surgery models. This SPHARM-PDM shape framework is validated for use with craniofacial structures via simulating known 3D surgical changes within CMFapp. Results Our results show that SPHARM-PDM analysis accurately measures surgical displacements, compared with known displacement values. Visualization of color maps of virtually simulated surgical displacements describe corresponding surface distances that precisely describe location of changes, and difference vectors indicate directionality and magnitude of changes. Conclusions SPHARM-PDM-based quantification of surgical outcome is feasible. When compared to prior solutions, our method has the potential to make the surgical planning process more flexible, increase the level of detail and accuracy of the plan, yield higher operative precision and control and enhance the follow-up and documentation of clinical cases. PMID:21161693

  1. Efficient inversion and uncertainty quantification of a tephra fallout model

    NASA Astrophysics Data System (ADS)

    White, J. T.; Connor, C. B.; Connor, L.; Hasenaka, T.

    2017-01-01

    An efficient and effective inversion and uncertainty quantification approach is proposed for estimating eruption parameters given a data set collected from a tephra deposit. The approach is model independent and here is applied using Tephra2, a code that simulates advective and dispersive tephra transport and deposition. The Levenberg-Marquardt algorithm is combined with formal Tikhonov and subspace regularization to invert eruption parameters; a linear equation for conditional uncertainty propagation is used to estimate posterior parameter uncertainty. Both the inversion and uncertainty analysis support simultaneous analysis of the full eruption and wind field parameterization. The combined inversion/uncertainty quantification approach is applied to the 1992 eruption of Cerro Negro and the 2011 Kirishima-Shinmoedake eruption. While eruption mass uncertainty is reduced by inversion against tephra isomass data, considerable uncertainty remains for many eruption and wind field parameters, such as plume height. Supplementing the inversion data set with tephra granulometry data is shown to further reduce the uncertainty of most eruption and wind field parameters. The eruption mass of the 2011 Kirishima-Shinmoedake eruption is 0.82 × 10¹⁰ kg to 2.6 × 10¹⁰ kg, with 95% confidence; the total eruption mass for the 1992 Cerro Negro eruption is 4.2 × 10¹⁰ kg to 7.3 × 10¹⁰ kg, with 95% confidence. These results indicate that eruption classification and characterization of eruption parameters can be significantly improved through this uncertainty quantification approach.
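
    Levenberg-Marquardt with Tikhonov regularization can be sketched by augmenting the data residuals with a penalty pulling parameters toward preferred values and handing the stacked residual vector to a least-squares solver. The forward model below is a trivial stand-in, not Tephra2:

      import numpy as np
      from scipy.optimize import least_squares

      rng = np.random.default_rng(3)

      # Stand-in forward model: "deposit mass" decaying with distance, driven by
      # two eruption parameters (log total mass, plume-height proxy). Not Tephra2.
      dist = np.linspace(5, 50, 12)                     # sampling sites (km)
      def forward(theta):
          log_mass, height = theta
          return np.exp(log_mass) * np.exp(-dist / height)

      theta_true = np.array([2.0, 15.0])
      obs = forward(theta_true) * rng.lognormal(0, 0.1, dist.size)

      theta0 = np.array([1.0, 10.0])                    # prior / initial guess
      lam = 0.1                                         # Tikhonov weight

      def residuals(theta):
          data_res = (obs - forward(theta)) / (0.1 * obs)   # weighted data misfit
          reg_res = np.sqrt(lam) * (theta - theta0)         # Tikhonov penalty
          return np.concatenate([data_res, reg_res])

      sol = least_squares(residuals, theta0, method="lm")   # Levenberg-Marquardt
      print(np.round(sol.x, 2))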

  2. Theoretical limitations of quantification for noncompetitive sandwich immunoassays.

    PubMed

    Woolley, Christine F; Hayes, Mark A; Mahanti, Prasun; Douglass Gilman, S; Taylor, Tom

    2015-11-01

    Immunoassays exploit the highly selective interaction between antibodies and antigens to provide a vital method for biomolecule detection at low concentrations. Developers and practitioners of immunoassays have long known that non-specific binding often restricts immunoassay limits of quantification (LOQs). Aside from non-specific binding, most efforts by analytical chemists to reduce the LOQ for these techniques have focused on improving the signal amplification methods and minimizing the limitations of the detection system. However, with detection technology now capable of sensing single-fluorescence molecules, this approach is unlikely to lead to dramatic improvements in the future. Here, fundamental interactions based on the law of mass action are analytically connected to signal generation, replacing the four- and five-parameter fittings commercially used to approximate sigmoidal immunoassay curves and allowing quantitative consideration of non-specific binding and statistical limitations in order to understand the ultimate detection capabilities of immunoassays. The restrictions imposed on limits of quantification by instrumental noise, non-specific binding, and counting statistics are discussed based on equilibrium relations for a sandwich immunoassay. Understanding the maximal capabilities of immunoassays for each of these regimes can greatly assist in the development and evaluation of immunoassay platforms. While many studies suggest that single molecule detection is possible through immunoassay techniques, here, it is demonstrated that the fundamental limit of quantification (precision of 10 % or better) for an immunoassay is approximately 131 molecules and this limit is based on fundamental and unavoidable statistical limitations.
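
    The counting-statistics part of this limit follows from Poisson statistics: if the assay ultimately counts N captured molecules, the relative precision is 1/sqrt(N), so the 10% precision criterion alone requires

      \frac{\sigma_N}{N} = \frac{1}{\sqrt{N}} \le 0.10 \quad\Rightarrow\quad N \ge \left(\frac{1}{0.10}\right)^{2} = 100,

    of the same order as the paper's figure of approximately 131 molecules, which additionally accounts for the remaining statistical factors in the equilibrium model.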

  3. Initial water quantification results using neutron computed tomography

    NASA Astrophysics Data System (ADS)

    Heller, A. K.; Shi, L.; Brenizer, J. S.; Mench, M. M.

    2009-06-01

    Neutron computed tomography is an important imaging tool in the field of non-destructive testing and in fundamental research for many engineering applications. Contrary to X-rays, neutrons can be attenuated by some light materials, such as hydrogen, but can penetrate many heavy materials. Thus, neutron computed tomography is useful in obtaining important three-dimensional information about a sample's interior structure and material properties that other traditional methods cannot provide. The neutron computed tomography system at the Pennsylvania State University's Radiation Science and Engineering Center is being utilized to develop a water quantification technique for investigation of water distribution in fuel cells under normal conditions. A hollow aluminum cylinder test sample filled with a known volume of water was constructed for purposes of testing the quantification technique. Transmission images of the test sample at different angles were easily acquired through the synthesis of a dedicated image acquisition computer driving a rotary table controller and an in-house developed synchronization software package. After data acquisition, Octopus (version 8.2) and VGStudio Max (version 1.2) were used to perform cross-sectional and three-dimensional reconstructions of the sample, respectively. The initial reconstructions and water quantification results are presented.
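
    Per-pixel water thickness follows from the Beer-Lambert relation I = I0 exp(−μt), so t = −ln(I/I0)/μ; summing thickness over pixels (or water voxels in the reconstruction) gives volume. The attenuation coefficient below is a typical literature value for thermal neutrons in water, assumed here rather than taken from the paper:

      import numpy as np

      MU_WATER = 3.5  # cm^-1, approximate thermal-neutron attenuation in water (assumed)

      def water_thickness(I, I0, mu=MU_WATER):
          """Per-pixel water thickness (cm) from transmission I/I0 (Beer-Lambert)."""
          return -np.log(np.asarray(I) / I0) / mu

      # Example: 50% transmission corresponds to roughly 0.2 cm of water.
      print(np.round(water_thickness(0.5, 1.0), 3))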

  4. HPLC Quantification of astaxanthin and canthaxanthin in Salmonidae eggs.

    PubMed

    Tzanova, Milena; Argirova, Mariana; Atanasov, Vasil

    2017-04-01

    Astaxanthin and canthaxanthin are naturally occurring antioxidants referred to as xanthophylls. They are used as food additives in fish farms to improve the organoleptic qualities of salmonid products and to prevent reproductive diseases. This study reports the development and single-laboratory validation of a rapid method for quantification of astaxanthin and canthaxanthin in eggs of rainbow trout (Oncorhynchus mykiss) and brook trout (Salvelinus fontinalis M.). An advantage of the proposed method is the perfect combination of selective extraction of the xanthophylls and analysis of the extract by high-performance liquid chromatography and photodiode array detection. The method validation was carried out in terms of linearity, accuracy, precision, recovery and limits of detection and quantification. The method was applied for simultaneous quantification of the two xanthophylls in eggs of rainbow trout and brook trout after their selective extraction. The results show that astaxanthin accumulations in salmonid fish eggs are larger than those of canthaxanthin. As the levels of these two xanthophylls affect fish fertility, this method can be used to improve the nutritional quality and to minimize the occurrence of the M74 syndrome in fish populations. Copyright © 2016 John Wiley & Sons, Ltd.

  5. Simultaneous quantification of sialyloligosaccharides from human milk by capillary electrophoresis

    PubMed Central

    Bao, Yuanwu; Zhu, Libin; Newburg, David S.

    2007-01-01

    The acidic oligosaccharides of human milk are predominantly sialyloligosaccharides. Pathogens that bind sialic acid-containing glycans on their host mucosal surfaces may be inhibited by human milk sialyloligosaccharides, but testing this hypothesis requires their reliable quantification in milk. Sialyloligosaccharides have been quantified by anion exchange HPLC, reverse or normal phase HPLC, and capillary electrophoresis (CE) of fluorescent derivatives; in milk, these oligosaccharides have been analyzed by high pH anion exchange chromatography with pulsed amperometric detection, and, in our laboratory, by CE with detection at 205 nm. The novel method described herein uses a running buffer of aqueous 200 mM NaH2PO4 at pH 7.05 containing 100 mM SDS made 45% (v/v) with methanol to baseline resolve five oligosaccharides, and separate all 12. This allows automated simultaneous quantification of the 12 major sialyloligosaccharides of human milk in a single 35-minute run. This method revealed differences in sialyloligosaccharide concentrations between less and more mature milk from the same donors. Individual donors also varied in expression of sialyloligosaccharides in their milk. Thus, the facile quantification of sialyloligosaccharides by this method is suitable for measuring variation in expression of specific sialyloligosaccharides in milk and their relationship to decreased risk of specific diseases in infants. PMID:17761135

  6. Radio-frequency energy quantification in magnetic resonance imaging

    NASA Astrophysics Data System (ADS)

    Alon, Leeor

    Mapping of radio frequency (RF) energy deposition has been challenging for 50+ years, especially when scanning patients in the magnetic resonance imaging (MRI) environment. As a result, electromagnetic simulation software is often used for estimating the specific absorption rate (SAR), the rate of RF energy deposition in tissue. The thesis presents challenges associated with aligning information provided by electromagnetic simulation and MRI experiments. Because of the limitations of simulations, experimental methods for the quantification of SAR were established. A system for quantification of the total RF energy deposition was developed for parallel transmit MRI (a system that uses multiple antennas to excite and image the body). The system is capable of monitoring and predicting channel-by-channel RF energy deposition and whole body SAR, and of tracking potential hardware failures that occur in the transmit chain and may cause the deposition of excessive energy into patients. Similarly, we demonstrated that local RF power deposition can be mapped and predicted for parallel transmit systems based on a series of MRI temperature mapping acquisitions. As part of this work, we developed tools for optimal reconstruction of temperature maps from MRI acquisitions. The tools developed for temperature mapping paved the way for utilizing MRI as a diagnostic tool for evaluating the safety of RF/microwave-emitting devices. Quantification of the RF energy was demonstrated for both MRI-compatible and non-MRI-compatible devices (such as cell phones), with the advantages of being noninvasive and of providing millimeter resolution and high accuracy.
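
    The quantity being mapped, SAR, is defined pointwise from the electric field, and temperature-based estimation uses the initial heating slope; both relations are standard, stated here for orientation rather than taken from the thesis:

      \mathrm{SAR}(\mathbf{r}) = \frac{\sigma(\mathbf{r})\,\lvert E(\mathbf{r})\rvert^{2}}{2\,\rho(\mathbf{r})}, \qquad \mathrm{SAR} \approx c \left.\frac{dT}{dt}\right|_{t=0},

    where σ is the tissue conductivity, ρ the mass density, |E| the peak electric-field amplitude, and c the specific heat capacity.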

  7. Consistency of flow quantifications in tridirectional phase-contrast MRI

    NASA Astrophysics Data System (ADS)

    Unterhinninghofen, R.; Ley, S.; Dillmann, R.

    2009-02-01

    Tridirectionally encoded phase-contrast MRI is a technique to non-invasively acquire time-resolved velocity vector fields of blood flow. These may not only be used to analyze pathological flow patterns, but also to quantify flow at arbitrary positions within the acquired volume. In this paper we examine the validity of this approach by analyzing the consistency of related quantifications instead of comparing it with an external reference measurement. Datasets of the thoracic aorta were acquired from 6 pigs, 1 healthy volunteer and 3 patients with artificial aortic valves. Using in-house software an elliptical flow quantification plane was placed manually at 6 positions along the descending aorta where it was rotated to 5 different angles. For each configuration flow was computed based on the original data and data that had been corrected for phase offsets. Results reveal that quantifications are more dependent on changes in position than on changes in angle. Phase offset correction considerably reduces this dependency. Overall consistency is good with a maximum variation coefficient of 9.9% and a mean variation coefficient of 7.2%.

  8. Quantification of Lung PET Images: Challenges and Opportunities.

    PubMed

    Chen, Delphine L; Cheriyan, Joseph; Chilvers, Edwin R; Choudhury, Gourab; Coello, Christopher; Connell, Martin; Fisk, Marie; Groves, Ashley M; Gunn, Roger N; Holman, Beverley F; Hutton, Brian F; Lee, Sarah; MacNee, William; Mohan, Divya; Parr, David; Subramanian, Deepak; Tal-Singer, Ruth; Thielemans, Kris; van Beek, Edwin J R; Vass, Laurence; Wellen, Jeremy W; Wilkinson, Ian; Wilson, Frederick J

    2017-02-01

    Millions of people are affected by respiratory diseases, leading to a significant health burden globally. Because of the current insufficient knowledge of the underlying mechanisms that lead to the development and progression of respiratory diseases, treatment options remain limited. To overcome this limitation and understand the associated molecular changes, noninvasive imaging techniques such as PET and SPECT have been explored for biomarker development, with (18)F-FDG PET imaging being the most studied. The quantification of pulmonary molecular imaging data remains challenging because of variations in tissue, air, blood, and water fractions within the lungs. The proportions of these components further differ depending on the lung disease. Therefore, different quantification approaches have been proposed to address these variabilities. However, no standardized approach has been developed to date. This article reviews the data evaluating (18)F-FDG PET quantification approaches in lung diseases, focusing on methods to account for variations in lung components and the interpretation of the derived parameters. The diseases reviewed include acute respiratory distress syndrome, chronic obstructive pulmonary disease, and interstitial lung diseases such as idiopathic pulmonary fibrosis. Based on review of prior literature, ongoing research, and discussions among the authors, suggested considerations are presented to assist with the interpretation of the derived parameters from these approaches and the design of future studies.

  9. Improving quantification of intravascular fluorescence imaging using structural information

    NASA Astrophysics Data System (ADS)

    Mallas, Georgios; Brooks, Dana H.; Rosenthal, Amir; Nika Nudelman, R.; Mauskapf, Adam; Jaffer, Farouc A.; Ntziachristos, Vasilis

    2012-10-01

    Intravascular near-infrared fluorescence (iNIRF) imaging can enable the in vivo visualization of biomarkers of vascular pathology, including high-risk plaques. The technique resolves the bio-distribution of systemically administered fluorescent probes with molecular specificity in the vessel wall. However, the geometrical variations that may occur in the distance between fibre-tip and vessel wall can lead to signal intensity variations and challenge quantification. Herein we examined whether the use of anatomical information about the cross-sectional vessel morphology, obtained from co-registered intravascular ultrasound (IVUS), can lead to quantification improvements when fibre-tip to vessel wall distance variations are present. The algorithm developed employs a photon propagation model derived from phantom experiments, which is used to calculate the relative attenuation of fluorescence signals as they are collected over 360° along the vessel wall, and utilizes it to restore accurate fluorescence readings. The findings herein point to quantification improvements when employing hybrid iNIRF, with possible implications for the clinical detection of high-risk plaques or blood vessel theranostics.

  10. Automatic quantification of neo-vasculature from micro-CT

    NASA Astrophysics Data System (ADS)

    Mallya, Yogish; Narayanan, A. K.; Zagorchev, Lyubomir

    2009-02-01

    Angiogenesis is the process of formation of new blood vessels as outgrowths of pre-existing ones. It occurs naturally during development, tissue repair, and abnormally in pathologic diseases such as cancer. It is associated with proliferation of blood vessels/tubular sprouts that penetrate deep into tissues to supply nutrients and remove waste products. The process starts with migration of endothelial cells. As the cells move towards the target area they form small tubular sprouts recruited from the parent vessel. The sprouts grow in length due to migration, proliferation, and recruitment of new endothelial cells and the process continues until the target area becomes fully vascular. Accurate quantification of sprout formation is very important for evaluation of treatments for ischemia as well as angiogenesis inhibitors and plays a key role in the battle against cancer. This paper presents a technique for automatic quantification of newly formed blood vessels from Micro-CT volumes of tumor samples. A semiautomatic technique based on interpolation of Bezier curves was used to segment out the cancerous growths. Small vessels as determined by their diameter within the segmented tumors were enhanced and quantified with a multi-scale 3-D line detection filter. The same technique can be easily extended for quantification of tubular structures in other 3-D medical imaging modalities. Experimental results are presented and discussed.

  11. Nuclear and mitochondrial DNA quantification of various forensic materials.

    PubMed

    Andréasson, H; Nilsson, M; Budowle, B; Lundberg, H; Allen, M

    2006-12-01

    Due to the different types and quality of forensic evidence materials, their DNA content can vary substantially, and particularly low quantities can impact the results in an identification analysis. In this study, the quantity of mitochondrial and nuclear DNA was determined in a variety of materials using a previously described real-time PCR method. DNA quantification in the roots and distal sections of plucked and shed head hairs revealed large variations in DNA content particularly between the root and the shaft of plucked hairs. Also large intra- and inter-individual variations were found among hairs. In addition, DNA content was estimated in samples collected from fingerprints and accessories. The quantification of DNA on various items also displayed large variations, with some materials containing large amounts of nuclear DNA while no detectable nuclear DNA and only limited amounts of mitochondrial DNA were seen in others. Using this sensitive real-time PCR quantification assay, a better understanding was obtained regarding DNA content and variation in commonly analysed forensic evidence materials and this may guide the forensic scientist as to the best molecular biology approach for analysing various forensic evidence materials.

  12. Performance of high-throughput DNA quantification methods

    PubMed Central

    Haque, Kashif A; Pfeiffer, Ruth M; Beerman, Michael B; Struewing, Jeff P; Chanock, Stephen J; Bergen, Andrew W

    2003-01-01

    Background The accuracy and precision of estimates of DNA concentration are critical factors for efficient use of DNA samples in high-throughput genotype and sequence analyses. We evaluated the performance of spectrophotometric (OD) DNA quantification and compared it to two fluorometric quantification methods, the PicoGreen® assay (PG) and a novel real-time quantitative genomic PCR assay (QG) specific to a region at the human BRCA1 locus. Twenty-two lymphoblastoid cell line DNA samples with an initial concentration of ~350 ng/µL were diluted to 20 ng/µL. DNA concentration was estimated by OD and further diluted to 5 ng/µL. The concentrations of multiple aliquots of the final dilution were measured by the OD, QG and PG methods. The effects of manual and robotic laboratory sample handling procedures on the estimates of DNA concentration were assessed using variance components analyses. Results The OD method was the DNA quantification method most concordant with the reference sample among the three methods evaluated. A large fraction of the total variance for all three methods (36.0-95.7%) was explained by sample-to-sample variation, whereas the amount of variance attributable to sample handling was small (0.8-17.5%). Residual error (3.2-59.4%), corresponding to un-modelled factors, contributed to the total variation to a greater extent than the sample handling procedures. Conclusion The application of a specific DNA quantification method to a particular molecular genetic laboratory protocol must take into account the accuracy and precision of the specific method, as well as the requirements of the experimental workflow with respect to sample volumes and throughput. While OD was the most concordant and precise DNA quantification method in this study, the information provided by the quantitative PCR assay regarding the suitability of DNA samples for PCR may be an essential factor for some protocols, despite the decreased concordance and precision of this method.

  13. Systematic development of a group quantification method using evaporative light scattering detector for relative quantification of ginsenosides in ginseng products.

    PubMed

    Lee, Gwang Jin; Shin, Byong-Kyu; Yu, Yun-Hyun; Ahn, Jongsung; Kwon, Sung Won; Park, Jeong Hill

    2016-09-05

    The determination of the contents of multiple components in ginseng products has come to the fore owing to demands for in-depth information, but the associated industries face the high cost of securing pure standards for the continuous quality evaluation of the products. This study aimed to develop a prospective high-performance liquid chromatography-evaporative light scattering detector (HPLC-ELSD) method for relative quantification of ginsenosides in ginseng products without a considerable change from the conventional gradient analysis. We investigated the effects of mobile phase composition and elution bandwidth, which are potential variables affecting the ELSD response in gradient analysis. Similar ELSD response curves of nine major ginsenosides were obtained under identical flow injection conditions, and the response increased as the percentage of organic solvent increased. The nine ginsenosides were divided into three groups to confirm the effect of elution bandwidth. The ELSD response significantly decreased for the late-eluted ginsenoside in each group under isocratic conditions. Taking these two effects into consideration, stepwise changes of the gradient condition were carried out to arrive at a group quantification method. The inconsistent responses of the nine ginsenosides were reconstituted into three normalized responses by the stepwise changes of the gradient condition, and this result enabled relative quantification within the individual groups. The applicability was confirmed by comparing the ginsenoside contents in a base material of ginseng products determined by the direct and group quantification methods. The largest difference in the determination results from the two methods was 8.26%, and the difference in total contents was only 0.91%.

  14. Uncertainty quantification for characterization of high enthalpy facilities

    NASA Astrophysics Data System (ADS)

    Villedieu, N.; Cappaert, J.; Garcia Galache, J. P.; Magin, T. E.

    2013-06-01

    The postflight analysis of a space mission requires accurate determination of the free-stream conditions along the trajectory. The Mach number, temperature, and pressure conditions can be rebuilt from the heat flux and pressure measured on the spacecraft by means of a Flush Air Data System (FADS). This instrumentation comprises a set of sensors flush mounted in the thermal protection system to measure the static pressure (pressure taps) and heat flux (calorimeters). Because experimental data suffer from errors, this methodology needs to integrate quantification of uncertainties. Epistemic uncertainties in the models for chemistry in the bulk and at the wall (surface catalysis) should also be taken into account. To study this problem it is necessary to solve a stochastic backward problem. This paper focuses on a preliminary sensitivity analysis of the forward problem to understand which uncertainties need to be accounted for. In section 2, the uncertainty quantification methodologies used in this work are presented. Section 3 is dedicated to the one-dimensional (1D) simulations of the shock layer to identify which chemical reactions of the mechanism need to be accounted for in the Uncertainty Quantification (UQ). After this triage procedure, the two-dimensional (2D) axisymmetric flow around the blunt nose of EXPERT (EXPErimental Reentry Test-bed) was simulated for two trajectory points, and the propagation of the uncertainties on the stagnation pressure and heat flux was studied. For this study, the open source software DAKOTA from Sandia National Laboratory [1] is coupled with two in-house codes: SHOCKING, which simulates the evolution of the chemical relaxation in the shock layer [2], and COSMIC, which simulates axisymmetric chemically reacting flows [3].

  15. Automated epicardial fat volume quantification from non-contrast CT

    NASA Astrophysics Data System (ADS)

    Ding, Xiaowei; Terzopoulos, Demetri; Diaz-Zamudio, Mariana; Berman, Daniel S.; Slomka, Piotr J.; Dey, Damini

    2014-03-01

    Epicardial fat volume (EFV) is now regarded as a significant imaging biomarker for cardiovascular risk stratification. Manual or semi-automated quantification of EFV includes tedious and careful contour drawing of the pericardium on fine image features. We aimed to develop and validate a fully-automated, accurate algorithm for EFV quantification from non-contrast CT using active contours and multiple-atlas registration. This is a knowledge-based model that can segment both the heart and pericardium accurately by initializing the location and shape of the heart at large scale from multiple co-registered atlases and locking itself onto the pericardium actively. The deformation process is driven by pericardium detection, extracting only the white contours representing the pericardium in the CT images. Following this step, we can calculate the fat volume within this region (epicardial fat) using the standard fat attenuation range. We validated our algorithm on CT datasets from 15 patients who underwent routine assessment of coronary calcium. Epicardial fat volume quantified by the algorithm (69.15 +/- 8.25 cm3) and the expert (69.46 +/- 8.80 cm3) showed excellent correlation (r = 0.96, p < 0.0001) with no significant differences by comparison of individual data points (p = 0.9). The algorithm achieved a Dice overlap of 0.93 (range 0.88 - 0.95). The total time was less than 60 s on a standard Windows computer. Our results show that fast, accurate, automated, knowledge-based quantification of epicardial fat volume from non-contrast CT is feasible. To our knowledge, this is also the first fully automated algorithm reported for this task.
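
    The "standard fat attenuation range" step amounts to counting segmented voxels whose CT numbers fall inside the fat window (commonly about −190 to −30 HU) and multiplying by the voxel volume; the window below is the conventional one, not necessarily the authors' exact choice:

      import numpy as np

      def fat_volume_cm3(hu, mask, voxel_mm3, lo=-190, hi=-30):
          """Fat volume within a pericardial mask: count voxels whose HU falls
          in the fat window and convert the voxel count to cm^3."""
          fat = (hu >= lo) & (hu <= hi) & mask
          return fat.sum() * voxel_mm3 / 1000.0

      # Synthetic example: 64^3 volume, 0.7 x 0.7 x 2.5 mm voxels.
      rng = np.random.default_rng(4)
      hu = rng.normal(-60, 80, (64, 64, 64))
      mask = np.zeros((64, 64, 64), dtype=bool)
      mask[16:48, 16:48, 16:48] = True   # stand-in pericardial segmentation
      print(round(fat_volume_cm3(hu, mask, 0.7 * 0.7 * 2.5), 1))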

  16. Quantification of breast arterial calcification using full field digital mammography.

    PubMed

    Molloi, Sabee; Xu, Tong; Ducote, Justin; Iribarren, Carlos

    2008-04-01

    Breast arterial calcification is commonly detected on some mammograms. Previous studies indicate that breast arterial calcification is evidence of general atherosclerotic vascular disease and it may be a useful marker of coronary artery disease. It can potentially be a useful tool for assessment of coronary artery disease in women since mammography is widely used as a screening tool for early detection of breast cancer. However, there are currently no available techniques for quantification of calcium mass using mammography. The purpose of this study was to determine whether it is possible to quantify breast arterial calcium mass using standard digital mammography. An anthropomorphic breast phantom along with a vessel calcification phantom was imaged using a full field digital mammography system. Densitometry was used to quantify calcium mass. A calcium calibration measurement was performed at each phantom thickness and beam energy. The known (K) and measured (M) calcium mass on 5 and 9 cm thickness phantoms were related by M=0.964K-0.288 mg (r=0.997 and SEE=0.878 mg) and M=1.004K+0.324 mg (r=0.994 and SEE=1.32 mg), respectively. The results indicate that accurate calcium mass measurements can be made without correction for scatter glare as long as careful calcium calibration is made for each breast thickness. The results also indicate that composition variations and differences of approximately 1 cm between calibration phantom and breast thickness introduce only minimal error in calcium measurement. The uncertainty in magnification is expected to cause up to 5% and 15% error in calcium mass for 5 and 9 cm breast thicknesses, respectively. In conclusion, a densitometry technique for quantification of breast arterial calcium mass was validated using standard full field digital mammography. The results demonstrated the feasibility and potential utility of the densitometry technique for accurate quantification of breast arterial calcium mass using standard digital mammography.
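    Given the reported calibration lines, recovering the true calcium mass from a densitometry reading is a one-line inversion of M = aK + b. A small sketch using the slopes and intercepts quoted above:

    ```python
    def known_from_measured(measured_mg, slope, intercept_mg):
        """Invert the calibration M = slope*K + intercept to estimate the
        true calcium mass K (mg) from a densitometry measurement M (mg)."""
        return (measured_mg - intercept_mg) / slope

    # coefficients reported for the 5 cm and 9 cm phantom thicknesses
    calibrations = {"5 cm": (0.964, -0.288), "9 cm": (1.004, 0.324)}
    for thickness, (a, b) in calibrations.items():
        print(f"{thickness}: M = 10.0 mg -> K = {known_from_measured(10.0, a, b):.2f} mg")
    ```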

  17. Quantification of breast arterial calcification using full field digital mammography

    PubMed Central

    Molloi, Sabee; Xu, Tong; Ducote, Justin; Iribarren, Carlos

    2008-01-01

    Breast arterial calcification is commonly detected on some mammograms. Previous studies indicate that breast arterial calcification is evidence of general atherosclerotic vascular disease and it may be a useful marker of coronary artery disease. It can potentially be a useful tool for assessment of coronary artery disease in women since mammography is widely used as a screening tool for early detection of breast cancer. However, there are currently no available techniques for quantification of calcium mass using mammography. The purpose of this study was to determine whether it is possible to quantify breast arterial calcium mass using standard digital mammography. An anthropomorphic breast phantom along with a vessel calcification phantom was imaged using a full field digital mammography system. Densitometry was used to quantify calcium mass. A calcium calibration measurement was performed at each phantom thickness and beam energy. The known (K) and measured (M) calcium mass on 5 and 9 cm thickness phantoms were related by M=0.964K−0.288 mg (r=0.997 and SEE=0.878 mg) and M=1.004K+0.324 mg (r=0.994 and SEE=1.32 mg), respectively. The results indicate that accurate calcium mass measurements can be made without correction for scatter glare as long as careful calcium calibration is made for each breast thickness. The results also indicate that composition variations and differences of approximately 1 cm between calibration phantom and breast thickness introduce only minimal error in calcium measurement. The uncertainty in magnification is expected to cause up to 5% and 15% error in calcium mass for 5 and 9 cm breast thicknesses, respectively. In conclusion, a densitometry technique for quantification of breast arterial calcium mass was validated using standard full field digital mammography. The results demonstrated the feasibility and potential utility of the densitometry technique for accurate quantification of breast arterial calcium mass using standard digital mammography.

  18. Quantification of breast arterial calcification using full field digital mammography

    SciTech Connect

    Molloi, Sabee; Xu Tong; Ducote, Justin; Iribarren, Carlos

    2008-04-15

    Breast arterial calcification is commonly detected on some mammograms. Previous studies indicate that breast arterial calcification is evidence of general atherosclerotic vascular disease and it may be a useful marker of coronary artery disease. It can potentially be a useful tool for assessment of coronary artery disease in women since mammography is widely used as a screening tool for early detection of breast cancer. However, there are currently no available techniques for quantification of calcium mass using mammography. The purpose of this study was to determine whether it is possible to quantify breast arterial calcium mass using standard digital mammography. An anthropomorphic breast phantom along with a vessel calcification phantom was imaged using a full field digital mammography system. Densitometry was used to quantify calcium mass. A calcium calibration measurement was performed at each phantom thickness and beam energy. The known (K) and measured (M) calcium mass on 5 and 9 cm thickness phantoms were related by M=0.964K-0.288 mg (r=0.997 and SEE=0.878 mg) and M=1.004K+0.324 mg (r=0.994 and SEE=1.32 mg), respectively. The results indicate that accurate calcium mass measurements can be made without correction for scatter glare as long as careful calcium calibration is made for each breast thickness. The results also indicate that composition variations and differences of approximately 1 cm between calibration phantom and breast thickness introduce only minimal error in calcium measurement. The uncertainty in magnification is expected to cause up to 5% and 15% error in calcium mass for 5 and 9 cm breast thicknesses, respectively. In conclusion, a densitometry technique for quantification of breast arterial calcium mass was validated using standard full field digital mammography. The results demonstrated the feasibility and potential utility of the densitometry technique for accurate quantification of breast arterial calcium mass using standard digital mammography.

  19. Quantification of brain endocannabinoid levels: methods, interpretations and pitfalls

    PubMed Central

    Buczynski, Matthew W; Parsons, Loren H

    2010-01-01

    Endocannabinoids play an important role in a diverse range of neurophysiological processes including neural development, neuroimmune function, synaptic plasticity, pain, reward and affective state. This breadth of influence and evidence for altered endocannabinoid signalling in a variety of neuropathologies has fuelled interest in the accurate quantification of these lipids in brain tissue. Established methods for endocannabinoid quantification primarily employ solvent-based lipid extraction with further sample purification by solid phase extraction. In recent years in vivo microdialysis methods have also been developed for endocannabinoid sampling from the brain interstitial space. However, considerable variability in estimates of endocannabinoid content has led to debate regarding the physiological range of concentrations present in various brain regions. This paper provides a critical review of factors that influence the quantification of brain endocannabinoid content as determined by lipid extraction from bulk tissue and by in vivo microdialysis. A variety of methodological issues are discussed including analytical approaches, endocannabinoid extraction and purification, post-mortem changes in brain endocannabinoid content, cellular reactions to microdialysis probe implantation and caveats related to lipid sampling from the extracellular space. The application of these methods for estimating brain endocannabinoid content and the effects of endocannabinoid clearance inhibition are discussed. The benefits, limitations and pitfalls associated with each approach are emphasized, with an eye toward the appropriate interpretation of data gathered by each method. This article is part of a themed issue on Cannabinoids. To view the editorial for this themed issue visit http://dx.doi.org/10.1111/j.1476-5381.2010.00831.x PMID:20590555

  20. Targeted proteomic quantification on quadrupole-orbitrap mass spectrometer.

    PubMed

    Gallien, Sebastien; Duriez, Elodie; Crone, Catharina; Kellmann, Markus; Moehring, Thomas; Domon, Bruno

    2012-12-01

    There is an immediate need for improved methods to systematically and precisely quantify large sets of peptides in complex biological samples. To date protein quantification in biological samples has been routinely performed on triple quadrupole instruments operated in selected reaction monitoring mode (SRM), and two major challenges remain. Firstly, the number of peptides to be included in one survey experiment needs to be increased to routinely reach several hundreds, and secondly, the degree of selectivity should be improved so as to reliably discriminate the targeted analytes from background interferences. High resolution and accurate mass (HR/AM) analysis on the recently developed Q-Exactive mass spectrometer can potentially address these issues. This instrument presents a unique configuration: it is constituted of an orbitrap mass analyzer equipped with a quadrupole mass filter as the front-end for precursor ion mass selection. This configuration enables new quantitative methods based on HR/AM measurements, including targeted analysis in MS mode (single ion monitoring) and in MS/MS mode (parallel reaction monitoring). The ability of the quadrupole to select a restricted m/z range allows one to overcome the dynamic range limitations associated with trapping devices, and the MS/MS mode provides an additional stage of selectivity. When applied to targeted protein quantification in urine samples and benchmarked with the reference SRM technique, the quadrupole-orbitrap instrument exhibits similar or better performance in terms of selectivity, dynamic range, and sensitivity. This high performance is further enhanced by leveraging the multiplexing capability of the instrument to design novel acquisition methods and apply them to large targeted proteomic studies for the first time, as demonstrated on 770 tryptic yeast peptides analyzed in one 60-min experiment. The increased quality of quadrupole-orbitrap data has the potential to improve existing protein

  1. Quantification of Carnosine-Aldehyde Adducts in Human Urine.

    PubMed

    da Silva Bispo, Vanderson; Di Mascio, Paolo; Medeiros, Marisa

    2014-10-01

    Lipid peroxidation generates several reactive carbonyl species, including 4-hydroxy-2-nonenal (HNE), acrolein (ACR), 4-hydroxy-2-hexenal (HHE) and malondialdehyde. One major pathway of aldehyde detoxification is conjugation with glutathione, catalyzed by glutathione-S-transferases, or, alternatively, conjugation with endogenous histidine-containing dipeptides, such as carnosine (CAR). In this study, on-line reverse-phase high-performance liquid chromatography (HPLC) separation with tandem mass spectrometry detection was utilized for the accurate quantification of CAR-ACR, CAR-HHE and CAR-HNE adducts in human urinary samples from non-smoking young adults. Standard adducts were prepared and isolated by HPLC. The results showed the presence of a new product from the reaction of CAR with ACR. This new adduct was completely characterized by HPLC/MS-MSn, 1H NMR, COSY and HSQC. A new HPLC/MS/MS methodology employing stable isotope-labeled internal standards (CAR-HHE-d5 and CAR-HNE-d11) was developed for adduct quantification. This methodology permits quantification of 10 pmol of CAR-HHE and 1 pmol of CAR-ACR and CAR-HNE. Accurate determinations in human urine samples were performed and showed 4.65±1.71 nmol/mg creatinine for CAR-ACR, 5.13±1.76 for CAR-HHE and 5.99±3.19 for CAR-HNE. Our results indicate that the carnosine pathway can be an important detoxification route for α,β-unsaturated aldehydes. Moreover, carnosine adducts may be useful as redox stress indicators. Copyright © 2014. Published by Elsevier Inc.

  2. [Quantification of liver iron concentration using 1-Tesla MRI].

    PubMed

    Alústiza Echeverría, J M; Castiella Eguzkiza, A; Zapata Morcillo, E; Jáuregui Garmendia, L; Gabilondo Aguirregabiria, A; Paloc, C

    2008-01-01

    To evaluate the quantification of liver iron concentration using 1-Tesla magnetic resonance imaging (MRI) and its ability to diagnose or rule out hemochromatosis. To evaluate the role of 1.5-Tesla MRI in inconclusive cases. Between 2002 and 2006, we used 1-Tesla MRI (Gandon method) and liver biopsy to quantify the liver iron concentration in 31 patients. Moreover, we used 1.5-Tesla MRI (according to Alústiza's model) and liver biopsy to determine the liver iron concentration in 10 additional patients and to check the results of 10 patients in whom 1-Tesla MRI detected iron overload. In the first group of 31 patients, liver biopsy classified the liver iron concentration as normal (<36 μmol Fe/g) in 11 patients, as hemosiderosis (36-80 μmol Fe/g) in 15, and as hemochromatosis (>80 μmol Fe/g) in 5. The correlation with the values calculated at MRI was 100% in the 5 cases with hemochromatosis; in the 15 patients with hemosiderosis, 5 were correctly classified and the liver iron concentration was overestimated in 10; of the 11 patients with normal liver iron concentration, 6 were correctly classified and 5 were overestimated. Quantification >80 at MRI has a sensitivity and negative predictive value of 100% and specificity of 50% for the diagnosis of hemochromatosis. Quantification <36 at MRI has a positive predictive value and specificity of 100% to identify the absence of iron overload. In the 10 patients with liver biopsy that underwent 1.5-Tesla MRI, there was a high correlation between the two techniques. The reliability of the evaluation of liver iron concentration using 1-Tesla MRI is useful for ruling out hemochromatosis and identifying patients without iron overload. We observed a tendency to overestimate liver iron concentration both in patients with overload and in those without, and this limits the reliability of the technique. 1.5-Tesla MRI is a good alternative for quantifying liver iron concentration more precisely.

  3. Current peptidomics: Applications, purification, identification, quantification, and functional analysis

    PubMed Central

    Dallas, David C.; Guerrero, Andres; Parker, Evan A.; Robinson, Randall C.; Gan, Junai; German, J. Bruce; Barile, Daniela; Lebrilla, Carlito B.

    2015-01-01

    Peptidomics is an emerging field branching from proteomics that targets endogenously produced protein fragments. Endogenous peptides are often functional within the body—and can be both beneficial and detrimental. This review covers the use of peptidomics in understanding digestion and in identifying functional peptides and biomarkers. Various techniques for peptide and glycopeptide extraction, both at analytical and preparative scales, and available options for peptide detection with MS are discussed. Current algorithms for peptide sequence determination, and both analytical and computational techniques for quantification, are compared. Techniques for statistical analysis, sequence mapping, enzyme prediction, and peptide function and structure prediction are explored. PMID:25429922

  4. Experimental validation of equations for 2D DIC uncertainty quantification.

    SciTech Connect

    Reu, Phillip L.; Miller, Timothy J.

    2010-03-01

    Uncertainty quantification (UQ) equations have been derived for predicting matching uncertainty in two-dimensional digital image correlation (DIC) a priori. These equations include terms that represent the image noise and image contrast. Researchers at the University of South Carolina have extended previous 1D work to calculate matching errors in 2D. These 2D equations have been coded into a Sandia National Laboratories UQ software package to predict the uncertainty for DIC images. This paper presents those equations and the resulting error surfaces for trial speckle images. Comparison of the UQ results with experimentally subpixel-shifted images is also discussed.
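    The closed-form expressions themselves are not reproduced in the abstract. As a rough illustration of the idea, the sketch below uses the commonly cited 1D estimate in which the matching standard deviation grows with image noise and shrinks with the sum of squared subset intensity gradients (SSSIG); the exact form is an assumption here, not the paper's equations.

    ```python
    import numpy as np

    def predicted_matching_std(subset, noise_std):
        """A priori 1D DIC matching uncertainty, taken here as
        sqrt(2) * noise_std / sqrt(SSSIG), with SSSIG the sum of squared
        intensity gradients over the subset (assumed, commonly cited form)."""
        gx = np.gradient(subset.astype(float), axis=1)
        sssig = float(np.sum(gx**2))
        return np.sqrt(2.0) * noise_std / np.sqrt(sssig)

    rng = np.random.default_rng(2)
    speckle = rng.integers(0, 256, size=(21, 21))  # toy speckle subset
    print(f"predicted sigma_u = {predicted_matching_std(speckle, 2.0):.4f} px")
    ```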

  5. Trace cancer biomarker quantification using polystyrene-functionalized gold nanorods

    PubMed Central

    Wu, Jian; Li, Wei; Hajisalem, Ghazal; Lukach, Ariella; Kumacheva, Eugenia; Hof, Fraser; Gordon, Reuven

    2014-01-01

    We demonstrate the application of polystyrene-functionalized gold nanorods (AuNRs) as a platform for surface enhanced Raman scattering (SERS) quantification of the exogenous cancer biomarker Acetyl Amantadine (AcAm). We utilize the hydrophobicity of the polystyrene attached to the AuNR surface to capture the hydrophobic AcAm from solution, followed by drying and detection using SERS. We achieve a detection limit of 16 ng/mL using this platform. This result shows clinical potential for low-cost early cancer detection. PMID:25574423

  6. Selected methods for quantification of community exposure to aircraft noise

    NASA Technical Reports Server (NTRS)

    Edge, P. M., Jr.; Cawthorn, J. M.

    1976-01-01

    A review of the state-of-the-art for the quantification of community exposure to aircraft noise is presented. Physical aspects, people response considerations, and practicalities of useful application of scales of measure are included. Historical background up through the current technology is briefly presented. The developments of both single-event and multiple-event scales are covered. Selective choice is made of scales currently in the forefront of interest, and recommended methodology is presented for use in computer programming to translate aircraft noise data into predictions of community noise exposure. Brief consideration is given to future programming developments and to supportive research needs.

  7. Quantification of quantum discord in an antiferromagnetic Heisenberg compound

    SciTech Connect

    Singh, H.; Chakraborty, T.; Mitra, C.

    2014-04-24

    An experimental quantification of concurrence and quantum discord from heat capacity (Cp) measurements performed on a solid state system is reported. In this work, thermodynamic measurements were performed on copper nitrate (CN, Cu(NO3)2·2.5H2O) single crystals, an alternating antiferromagnetic Heisenberg spin-1/2 system. CN, being a weakly dimerized antiferromagnet, is an ideal system for investigating correlations between spins. Theoretical expressions were used to obtain concurrence and quantum discord curves as a function of temperature from the heat capacity data of a real macroscopic system, CN.

  8. Solid-phase colorimetric method for the quantification of fucoidan.

    PubMed

    Lee, Jung Min; Shin, Z-U; Mavlonov, Gafurjon T; Abdurakhmonov, Ibrokhim Y; Yi, Tae-Hoo

    2012-11-01

    We describe a simple, selective, and rapid method for the determination of fucoidans based on methylene blue staining of sulfated polysaccharides immobilized on filter paper and subsequent optical density measurement (at 663 nm) of the dye eluted from the paper. This solid-phase method allows selective determination of 1-20 μg of fucoidan in the presence of potentially interfering compounds (alginic acid, DNA, salts, proteins, and detergents). Further, we demonstrate an alternative way of using image processing software for fucoidan quantification without extraction of the methylene blue dye from the stained spots of the fucoidan-dye complex.

  9. Uncertainty Quantification and Statistical Engineering for Hypersonic Entry Applications

    NASA Technical Reports Server (NTRS)

    Cozmuta, Ioana

    2011-01-01

    NASA has invested significant resources in developing and validating a mathematical construct for TPS margin management: a) tailorable for low/high reliability missions; b) tailorable for ablative/reusable TPS; c) Uncertainty Quantification and Statistical Engineering are valuable tools not exploited enough; and d) strategies combining both theoretical tools and experimental methods need to be defined. The main reason for this lecture is to give a flavor of where UQ and SE could contribute, in the hope that the broader community will work with us to improve these areas.

  10. Preliminary Results on Uncertainty Quantification for Pattern Analytics

    SciTech Connect

    Stracuzzi, David John; Brost, Randolph; Chen, Maximillian Gene; Malinas, Rebecca; Peterson, Matthew Gregor; Phillips, Cynthia A.; Robinson, David G.; Woodbridge, Diane

    2015-09-01

    This report summarizes preliminary research into uncertainty quantification for pattern analytics within the context of the Pattern Analytics to Support High-Performance Exploitation and Reasoning (PANTHER) project. The primary focus of PANTHER was to make large quantities of remote sensing data searchable by analysts. The work described in this report adds nuance to both the initial data preparation steps and the search process. Search queries are transformed from "does the specified pattern exist in the data?" to "how certain is the system that the returned results match the query?" We show example results for both data processing and search, and discuss a number of possible improvements for each.

  11. Evaluation of different systems for clinical quantification of varicose veins.

    PubMed

    Cornu-Thénard, A; De Vincenzi, I; Maraval, M

    1991-04-01

    One hundred twenty-five lower limbs with varicose veins were studied clinically, essentially by palpation. Two specialists in venous pathology scored the severity of the varicose veins from 0 to 20. Comparison between the different clinical parameters and the scores of the specialists showed that two systems of clinical quantification gave good results and were easy to use. One system is the maximum diameter of the largest varicose vein; the other system is the sum of maximum diameters over 7 sections (3 for thigh, 3 for leg, 1 for foot). This latter system gives a more precise evaluation of the clinical severity of the varicose veins.

  12. Aspect-Oriented Programming is Quantification and Implicit Invocation

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.; Friedman, Daniel P.; Koga, Dennis (Technical Monitor)

    2001-01-01

    We propose that the distinguishing characteristic of Aspect-Oriented Programming (AOP) languages is that they allow programming by making quantified programmatic assertions over programs that lack local notation indicating the invocation of these assertions. This suggests that AOP systems can be analyzed with respect to three critical dimensions: the kinds of quantifications allowed, the nature of the interactions that can be asserted, and the mechanism for combining base-level actions with asserted actions. Consequences of this perspective are the recognition that certain systems are not AOP and that some mechanisms are meta-level: they are sufficiently expressive to allow straightforwardly programming an AOP system within them.

  13. Uncertainty quantification in fission cross section measurements at LANSCE

    SciTech Connect

    Tovesson, F.

    2015-01-09

    Neutron-induced fission cross sections have been measured for several isotopes of uranium and plutonium at the Los Alamos Neutron Science Center (LANSCE) over a wide range of incident neutron energies. The total uncertainties in these measurements are in the range of 3–5% above 100 keV of incident neutron energy and result from uncertainties in the target, neutron source, and detector system. The individual sources of uncertainty are assumed to be uncorrelated; however, correlations in the cross section across neutron energy bins are considered. The quantification of the uncertainty contributions is described here.

  14. Uncertainty Quantification in Fission Cross Section Measurements at LANSCE

    SciTech Connect

    Tovesson, F.

    2015-01-15

    Neutron-induced fission cross sections have been measured for several isotopes of uranium and plutonium at the Los Alamos Neutron Science Center (LANSCE) over a wide range of incident neutron energies. The total uncertainties in these measurements are in the range of 3–5% above 100 keV of incident neutron energy and result from uncertainties in the target, neutron source, and detector system. The individual sources of uncertainty are assumed to be uncorrelated; however, correlations in the cross section across neutron energy bins are considered. The quantification of the uncertainty contributions is described here.

  15. Uncertainty quantification in fission cross section measurements at LANSCE

    DOE PAGES

    Tovesson, F.

    2015-01-09

    Neutron-induced fission cross sections have been measured for several isotopes of uranium and plutonium at the Los Alamos Neutron Science Center (LANSCE) over a wide range of incident neutron energies. The total uncertainties in these measurements are in the range of 3–5% above 100 keV of incident neutron energy and result from uncertainties in the target, neutron source, and detector system. The individual sources of uncertainty are assumed to be uncorrelated; however, correlations in the cross section across neutron energy bins are considered. The quantification of the uncertainty contributions is described here.

  16. Development of magnetic resonance technology for noninvasive boron quantification

    SciTech Connect

    Bradshaw, K.M.

    1990-11-01

    Boron magnetic resonance imaging (MRI) and spectroscopy (MRS) were developed in support of the noninvasive boron quantification task of the Idaho National Engineering Laboratory (INEL) Power Burst Facility/Boron Neutron Capture Therapy (PBF/BNCT) program. The hardware and software described in this report are modifications specific to a GE Signa™ MRI system, release 3.X, and are necessary for boron magnetic resonance operation. The technology developed in this task has been applied to obtaining animal pharmacokinetic data of boron compounds (drug time response) and the noninvasive in-vivo localization of boron in animal tissue. 9 refs., 21 figs.

  17. Automatic quantification of neurite outgrowth by means of image analysis

    NASA Astrophysics Data System (ADS)

    Van de Wouwer, Gert; Nuydens, Rony; Meert, Theo; Weyn, Barbara

    2004-07-01

    A system for the quantification of neurite outgrowth in in-vitro experiments is described. The system is developed for routine use in a high-throughput setting and therefore needs to be fast, cheap, and robust. It relies on automated digital microscopical imaging of microtiter plates. Image analysis is applied to extract features that characterise neurite outgrowth. The system is tested in a dose-response experiment on PC12 cells treated with Taxol. The performance of the system and its ability to measure changes in neuronal morphology are studied.
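    The abstract does not spell out the image-analysis pipeline; a minimal sketch of a typical neurite-outgrowth measurement (threshold, clean, skeletonize, count skeleton pixels as a length proxy) might look as follows, assuming scikit-image is available. The streak-shaped toy input is purely illustrative.

    ```python
    import numpy as np
    from skimage.filters import threshold_otsu
    from skimage.morphology import remove_small_objects, skeletonize

    def total_neurite_length_px(image):
        """Binarize with Otsu's threshold, drop small debris, skeletonize,
        and use the skeleton pixel count as a proxy for neurite length."""
        binary = image > threshold_otsu(image)
        binary = remove_small_objects(binary, min_size=50)
        return int(skeletonize(binary).sum())

    rng = np.random.default_rng(3)
    img = rng.random((256, 256))
    img[100:104, 20:230] += 2.0  # fake bright, neurite-like streak
    print(total_neurite_length_px(img), "skeleton pixels")
    ```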

  18. Source-Code Instrumentation and Quantification of Events

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.; Havelund, Klaus; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Aspect Oriented Programming (AOP) is making quantified programmatic assertions over programs that otherwise are not annotated to receive these assertions. Varieties of AOP systems are characterized by which quantified assertions they allow, what they permit in the actions of the assertions (including how the actions interact with the base code), and what mechanisms they use to achieve the overall effect. Here, we argue that all quantification is over dynamic events, and describe our preliminary work in developing a system that maps dynamic events to transformations over source code. We discuss possible applications of this system, particularly with respect to debugging concurrent systems.

  19. Molecular nonlinear dynamics and protein thermal uncertainty quantification

    PubMed Central

    Xia, Kelin; Wei, Guo-Wei

    2014-01-01

    This work introduces molecular nonlinear dynamics (MND) as a new approach for describing protein folding and aggregation. By using a mode system, we show that the MND of disordered proteins is chaotic while that of folded proteins exhibits intrinsically low dimensional manifolds (ILDMs). The stability of ILDMs is found to strongly correlate with protein energies. We propose a novel method for protein thermal uncertainty quantification based on persistently invariant ILDMs. Extensive comparison with experimental data and the state-of-the-art methods in the field validate the proposed new method for protein B-factor prediction. PMID:24697365

  20. Quantification and stability of salbutamol in human urine.

    PubMed

    Forsdahl, Guro; Gmeiner, Günter

    2004-01-01

    A sensitive method for the quantification of free salbutamol in human urine is described. Sample clean-up is performed using SPE on a mixed-phase extraction column. Derivatisation is performed with N-methyl-N-trimethylsilyltrifluoroacetamide (MSTFA) and the extract is analysed by GC-MS. The method was found to be suitable for use in the doping field, where a cut-off limit of 1 μg salbutamol/mL urine is set by the International Olympic Committee (IOC) and approved by the World Anti-Doping Agency (WADA). Above that value a doping violation occurs. In addition, the stability of salbutamol in human urine has been evaluated.

  1. Assessing Spontaneous Combustion Instability with Recurrence Quantification Analysis

    NASA Technical Reports Server (NTRS)

    Eberhart, Chad J.; Casiano, Matthew J.

    2016-01-01

    Spontaneous instabilities can pose a significant challenge to the verification of combustion stability, and characterizing their onset is an important avenue of improvement for stability assessments of liquid propellant rocket engines. Recurrence Quantification Analysis (RQA) is used here to explore nonlinear combustion dynamics that might give insight into instability. Multiple types of patterns representative of different dynamical states are identified within fluctuating chamber pressure data, and markers for impending instability are found. A class of metrics which describe these patterns is also calculated. RQA metrics are compared with and interpreted against another metric from nonlinear time series analysis, the Hurst exponent, to help better distinguish between stable and unstable operation.
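    As a concrete illustration of RQA on a scalar signal (not the authors' implementation), the sketch below builds a recurrence matrix and computes two standard metrics, recurrence rate (RR) and determinism (DET); the 0.2-sigma threshold is a common rule of thumb and an assumption here.

    ```python
    import numpy as np

    def rqa_metrics(x, eps=None, lmin=2):
        """Recurrence matrix R[i,j] = 1 where |x_i - x_j| < eps, plus
        recurrence rate (RR) and determinism (DET): the fraction of
        recurrent points on diagonal lines of length >= lmin."""
        x = np.asarray(x, dtype=float)
        if eps is None:
            eps = 0.2 * x.std()                    # common rule of thumb
        R = (np.abs(x[:, None] - x[None, :]) < eps).astype(int)
        n = len(x)
        rr = R.sum() / n**2
        diag_points = 0
        for k in range(1, n):                      # both triangles via factor 2
            d = np.diagonal(R, offset=k)
            edges = np.flatnonzero(np.diff(np.r_[0, d, 0]))
            runs = edges[1::2] - edges[::2]        # lengths of runs of ones
            diag_points += 2 * runs[runs >= lmin].sum()
        det = diag_points / max(R.sum() - n, 1)    # exclude the main diagonal
        return rr, det

    t = np.linspace(0.0, 20.0 * np.pi, 600)
    sig = np.sin(t) + 0.1 * np.random.default_rng(4).standard_normal(t.size)
    rr, det = rqa_metrics(sig)
    print(f"RR = {rr:.3f}, DET = {det:.3f}")
    ```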

  2. Quantification of protein concentration using UV absorbance and Coomassie dyes.

    PubMed

    Noble, James E

    2014-01-01

    The measurement of a solubilized protein concentration in solution is an important assay in biochemistry research and development labs for applications ranging from enzymatic studies to providing data for biopharmaceutical lot release. Spectrophotometric protein quantification assays are methods that use UV and visible spectroscopy to rapidly determine the concentration of protein, relative to a standard, or using an assigned extinction coefficient. Where multiple samples need measurement, and/or the sample volume and concentration are limited, preparations of the Coomassie dye commonly known as the Bradford assay can be used. © 2014 Elsevier Inc. All rights reserved.
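    For the extinction-coefficient route, the arithmetic is a direct application of the Beer-Lambert law. A minimal sketch (the BSA constants are approximate literature values, quoted here only as an example):

    ```python
    def protein_conc_mg_per_ml(a280, epsilon_M_cm, mw_da, path_cm=1.0):
        """Beer-Lambert estimate: c [mol/L] = A280 / (epsilon * path),
        converted to mg/mL via the molar mass (1 g/L = 1 mg/mL).
        Assumes a baseline-corrected absorbance."""
        return a280 / (epsilon_M_cm * path_cm) * mw_da / 1000.0

    # e.g. BSA, with approximate epsilon ~ 43,824 /M/cm and MW ~ 66,430 Da
    print(f"{protein_conc_mg_per_ml(0.66, 43824.0, 66430.0):.2f} mg/mL")
    ```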

  3. Chemical agent detection and quantification with imaging spectrometry

    NASA Astrophysics Data System (ADS)

    Ifarraguerri, Augustin I.

    1999-10-01

    Passive standoff detection of chemical warfare (CW) agents is currently achieved by remote sensing infrared spectrometry in the 8 - 12 micrometer atmospheric window with the aid of automatic spectral analysis algorithms. Introducing an imaging capability would allow for rapid wide-area reconnaissance and mapping of vapor clouds, as well as reduce false alarms by exploiting the added spatial information. This paper contains an overview of the CW agent standoff detection problem and the challenges associated with developing imaging LWIR hyperspectral sensors for the detection and quantification of vapor clouds, as well as a discussion of spectral processing techniques which can be used to exploit the added data dimensionality.

  4. Growth and Quantification of MERS-CoV Infection

    PubMed Central

    Coleman, Christopher M.; Frieman, Matthew B.

    2015-01-01

    Middle East respiratory syndrome coronavirus (MERS-CoV) is an emerging highly pathogenic respiratory virus. Although MERS-CoV only emerged in 2012, we and others have developed assays to grow and quantify infectious MERS-CoV and RNA products of replication in vitro. MERS-CoV is able to infect a range of cell types, but replicates to high titers in Vero E6 cells. Protocols for the propagation and quantification of MERS-CoV are presented. PMID:26344219
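    The protocols themselves are in the chapter's methods; the titer arithmetic behind a standard plaque assay, though, is simple enough to sketch (generic formula, not specific to this chapter):

    ```python
    def titer_pfu_per_ml(plaque_count, dilution, inoculum_ml):
        """Plaque-assay titer: PFU/mL = plaques / (dilution x inoculum)."""
        return plaque_count / (dilution * inoculum_ml)

    # 42 plaques on the 1e-5 dilution plate, 0.1 mL inoculum per well
    print(f"{titer_pfu_per_ml(42, 1e-5, 0.1):.2e} PFU/mL")
    ```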

  5. Progressive damage state evolution and quantification in composites

    NASA Astrophysics Data System (ADS)

    Patra, Subir; Banerjee, Sourav

    2016-04-01

    Precursor damage state quantification can be helpful for the safety and operation of aircraft and defense equipment. Damage develops in composite materials in the form of matrix cracking, fiber breakage and debonding, etc. However, detection and quantification of the damage modes at their very early stage is not possible unless modifications of the existing indispensable techniques are conceived, particularly for the quantification of multiscale damage at an early stage. Here, we present a novel nonlocal-mechanics-based damage detection technique for precursor damage state quantification. Micro-continuum physics is used by modifying the Christoffel equation. American Society for Testing and Materials (ASTM) standard woven carbon fiber (CFRP) specimens were tested under tension-tension fatigue loading at intervals of 25,000 cycles up to 500,000 cycles. Scanning Acoustic Microscopy (SAM) and Optical Microscopy (OM) were used to examine the damage development at the same intervals. Surface Acoustic Wave (SAW) velocity profiles on a representative volume element (RVE) of the specimen were calculated at regular intervals of 50,000 cycles. Nonlocal parameters were calculated from the micromorphic wave dispersion curve at a particular frequency of 50 MHz. We used a previously formulated parameter called "damage entropy," a measure of damage growth in the material calculated with the loading cycle. Damage entropy (DE) was calculated at every pixel on the RVE, and the mean DE was plotted at loading intervals of 25,000 cycles. Growth of DE with fatigue loading cycles was observed. Optical imaging was also performed at intervals of 25,000 cycles to investigate the development of damage inside the material. We also calculated the mean value of the SAW velocity and plotted it against the fatigue cycle, which was further correlated with the damage entropy. Statistical analysis of the Surface Acoustic Wave profile (SAW) obtained at different

  6. Predicting Human Age with Bloodstains by sjTREC Quantification

    PubMed Central

    Wang, Huan; Wang, Hong-sheng; Lu, Hui-ling; Sun, Hong-yu

    2012-01-01

    The age-related decline of signal joint T-cell receptor rearrangement excision circles (sjTRECs) in human peripheral blood has been demonstrated in our previous study and other reports. Until now, only a few studies on sjTREC detection in bloodstain samples have been reported, based on small samples of subjects of a limited age range, although bloodstains are much more frequently encountered in forensic practice. In the present study, we adopted the sensitive TaqMan real-time quantitative polymerase chain reaction (qPCR) method to perform sjTREC quantification in bloodstains from individuals ranging from 0–86 years old (n = 264). The results revealed that sjTREC contents in human bloodstains declined in an age-dependent manner (r = −0.8712). The formula of age estimation was Age = −7.1815Y − 42.458 ± 9.42 (Y: dCt(TBP−sjTREC); 9.42: standard error). Furthermore, we tested for the influence of short or long storage time by analyzing fresh and stored bloodstains from the same individuals. Remarkably, no statistically significant difference in sjTREC contents was found between the fresh and old DNA samples over a 4-week storage time. However, a significant loss (0.16–1.93 dCt) in sjTREC contents was detected after 1.5 years of storage in 31 samples. Moreover, preliminary sjTREC quantification from bloodstains up to 20 years old showed that though the sjTREC contents were detectable in all samples and highly correlated with donor age, a time-dependent decrease in the correlation coefficient r was found, suggesting that the predicting accuracy of the described assay would deteriorate in aged samples. Our findings show that sjTREC quantification might also be suitable for age prediction in bloodstains, and future research into the time-dependent or other potential impacts on sjTREC quantification might allow further improvement of the predicting accuracy. PMID:22879970
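    The regression in the abstract turns a measured dCt directly into a point age estimate; a one-function sketch:

    ```python
    def predict_age_years(dct_tbp_minus_sjtrec):
        """Age from the abstract's regression, Age = -7.1815*Y - 42.458,
        with Y the dCt between the TBP and sjTREC qPCR signals; the
        reported standard error is +/- 9.42 years."""
        return -7.1815 * dct_tbp_minus_sjtrec - 42.458

    print(f"{predict_age_years(-10.0):.1f} years (+/- 9.42)")
    ```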

  7. Quantification of Lung Metastases from In Vivo Mouse Models.

    PubMed

    Chang, Joan; Erler, Janine T

    2016-01-01

    Cancer research has made significant progress in terms of understanding and targeting primary tumors; however, the challenge remains for the successful treatment of metastatic cancers. This highlights the importance to use in vivo models to study the metastatic process, as well as for preclinical testing of compounds that could inhibit metastasis. As a result, proper quantification of metastases from in vivo models is of the utmost significance. Here, we provide a detailed protocol for collecting and handling lung tissues from mice, and guidance for subsequent analysis of metastases, as well as interpretation of data.

  8. Uncertainty quantification in ion-solid interaction simulations

    NASA Astrophysics Data System (ADS)

    Preuss, R.; von Toussaint, U.

    2017-02-01

    Within the framework of Bayesian uncertainty quantification we propose a non-intrusive reduced-order spectral approach (polynomial chaos expansion) to the simulation of ion-solid interactions. The method not only reduces the number of function evaluations but provides simultaneously a quantitative measure for which combinations of inputs have the most important impact on the result. It is applied to SDTRIM-simulations (Möller et al., 1988) with several uncertain and Gaussian distributed input parameters (i.e. angle, projectile energy, surface binding energy, target composition) and the results are compared to full-grid based approaches and sampling based methods with respect to reliability, efficiency and scalability.
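    For a single Gaussian input, the non-intrusive idea can be shown with plain Gauss-Hermite quadrature: a handful of deterministic solver runs at the quadrature nodes yields the output mean and variance. The `sputter_yield` function is a hypothetical stand-in for an SDTRIM run, and the numbers are illustrative only.

    ```python
    import numpy as np

    def sputter_yield(energy_eV):
        """Hypothetical smooth response standing in for an SDTRIM simulation."""
        return 1e-3 * energy_eV**0.7

    mu, sigma = 1000.0, 100.0                 # projectile energy ~ N(mu, sigma^2)
    nodes, weights = np.polynomial.hermite_e.hermegauss(8)  # probabilists' rule

    evals = sputter_yield(mu + sigma * nodes)  # 8 deterministic solver runs
    norm = np.sqrt(2.0 * np.pi)
    mean = np.sum(weights * evals) / norm
    second = np.sum(weights * evals**2) / norm
    print(f"mean = {mean:.4f}, std = {np.sqrt(second - mean**2):.4f}")
    ```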

  9. Nuclear magnetic resonance-based quantification of organic diphosphates.

    PubMed

    Lenevich, Stepan; Distefano, Mark D

    2011-01-15

    Phosphorylated compounds are ubiquitous in life. Given their central role, many such substrates and analogs have been prepared for subsequent evaluation. Prior to biological experiments, it is typically necessary to determine the concentration of the target molecule in solution. Here we describe a method where concentrations of stock solutions of organic diphosphates and bisphosphonates are quantified using (31)P nuclear magnetic resonance (NMR) spectroscopy with standard instrumentation using a capillary tube with a secondary standard. The method is specific and is applicable down to a concentration of 200 μM. The capillary tube provides the reference peak for quantification and deuterated solvent for locking.

  10. Quantification of skin wrinkles using low coherence interferometry

    NASA Astrophysics Data System (ADS)

    Oh, Jung-Taek; Kim, Beop-Min; Son, Sang-Ryoon; Lee, Sang-Won; Kim, Dong-Yoon; Kim, Youn-Soo

    2004-07-01

    We measure skin wrinkle topology by means of low coherence interferometry (LCI), which forms the basis of optical coherence tomography (OCT). The skin topology obtained using LCI and a corresponding 2-D fast Fourier transform allow quantification of skin wrinkles. It took approximately 2 minutes to obtain a 2.1 mm × 2.1 mm topological image with 4 μm and 16 μm resolutions in the axial and transverse directions, respectively. Measurement examples show the particular case of skin contour change after anti-wrinkle cosmeceutical treatments and in atopic dermatitis.

  11. A recipe for EFT uncertainty quantification in nuclear physics

    NASA Astrophysics Data System (ADS)

    Furnstahl, R. J.; Phillips, D. R.; Wesolowski, S.

    2015-03-01

    The application of effective field theory (EFT) methods to nuclear systems provides the opportunity to rigorously estimate the uncertainties originating in the nuclear Hamiltonian. Yet this is just one source of uncertainty in the observables predicted by calculations based on nuclear EFTs. We discuss the goals of uncertainty quantification in such calculations and outline a recipe to obtain statistically meaningful error bars for their predictions. We argue that the different sources of theory error can be accounted for within a Bayesian framework, as we illustrate using a toy model.

  12. Recurrence plots and recurrence quantification analysis of human motion data

    NASA Astrophysics Data System (ADS)

    Josiński, Henryk; Michalczuk, Agnieszka; Świtoński, Adam; Szczesna, Agnieszka; Wojciechowski, Konrad

    2016-06-01

    The authors present exemplary application of recurrence plots, cross recurrence plots and recurrence quantification analysis for the purpose of exploration of experimental time series describing selected aspects of human motion. Time series were extracted from treadmill gait sequences which were recorded in the Human Motion Laboratory (HML) of the Polish-Japanese Academy of Information Technology in Bytom, Poland by means of the Vicon system. Analysis was focused on the time series representing movements of hip, knee, ankle and wrist joints in the sagittal plane.

  13. Prospective Comparison of Liver Stiffness Measurements between Two Point Shear Wave Elastography Methods: Virtual Touch Quantification and Elastography Point Quantification.

    PubMed

    Yoo, Hyunsuk; Lee, Jeong Min; Yoon, Jeong Hee; Lee, Dong Ho; Chang, Won; Han, Joon Koo

    2016-01-01

    To prospectively compare the technical success rate and reliable measurements of virtual touch quantification (VTQ) elastography and elastography point quantification (ElastPQ), and to correlate liver stiffness (LS) measurements obtained by the two elastography techniques. Our study included 85 patients, 80 of whom were previously diagnosed with chronic liver disease. The technical success rate and reliable measurements of the two kinds of point shear wave elastography (pSWE) techniques were compared by χ2 analysis. LS values measured using the two techniques were compared and correlated via the Wilcoxon signed-rank test, Spearman correlation coefficient, and 95% Bland-Altman limit of agreement. The intraobserver reproducibility of ElastPQ was determined by the 95% Bland-Altman limit of agreement and the intraclass correlation coefficient (ICC). The two pSWE techniques showed similar technical success rates (98.8% for VTQ vs. 95.3% for ElastPQ, p = 0.823) and reliable LS measurements (95.3% for VTQ vs. 90.6% for ElastPQ, p = 0.509). The mean LS measurements obtained by VTQ (1.71 ± 0.47 m/s) and ElastPQ (1.66 ± 0.41 m/s) were not significantly different (p = 0.209). The LS measurements obtained by the two techniques showed strong correlation (r = 0.820); in addition, the 95% limit of agreement of the two methods was 27.5% of the mean. Finally, the ICC of repeat ElastPQ measurements was 0.991. Virtual touch quantification and ElastPQ showed similar technical success rates and reliable measurements, with strongly correlated LS measurements. However, the two methods are not interchangeable due to the large limit of agreement.
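    The 95% limit-of-agreement statistic used above is straightforward to compute; a small sketch with made-up stiffness values:

    ```python
    import numpy as np

    def bland_altman(a, b):
        """95% Bland-Altman limits of agreement between two methods:
        bias (mean difference) +/- 1.96 * SD of the differences."""
        diff = np.asarray(a, float) - np.asarray(b, float)
        bias = diff.mean()
        half_width = 1.96 * diff.std(ddof=1)
        return bias, bias - half_width, bias + half_width

    vtq = [1.2, 1.5, 1.8, 2.1, 1.4]      # toy liver stiffness values, m/s
    elastpq = [1.1, 1.6, 1.7, 2.0, 1.5]
    print("bias, lower, upper =", bland_altman(vtq, elastpq))
    ```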

  14. Accurate quantification of supercoiled DNA by digital PCR.

    PubMed

    Dong, Lianhua; Yoo, Hee-Bong; Wang, Jing; Park, Sang-Ryoul

    2016-04-11

    Digital PCR (dPCR) as an enumeration-based quantification method is capable of quantifying the DNA copy number without the help of standards. However, it can generate false results when the PCR conditions are not optimized. A recent international comparison (CCQM P154) showed that most laboratories significantly underestimated the concentration of supercoiled plasmid DNA by dPCR. Mostly, supercoiled DNAs are linearized before dPCR to avoid such underestimations. The present study was conducted to overcome this problem. In the bilateral comparison, the National Institute of Metrology, China (NIM) optimized and applied dPCR for supercoiled DNA determination, whereas Korea Research Institute of Standards and Science (KRISS) prepared the unknown samples and quantified them by flow cytometry. In this study, several factors like selection of the PCR master mix, the fluorescent label, and the position of the primers were evaluated for quantifying supercoiled DNA by dPCR. This work confirmed that a 16S PCR master mix avoided poor amplification of the supercoiled DNA, whereas HEX labels on dPCR probe resulted in robust amplification curves. Optimizing the dPCR assay based on these two observations resulted in accurate quantification of supercoiled DNA without preanalytical linearization. This result was validated in close agreement (101~113%) with the result from flow cytometry.
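    The enumeration step behind dPCR is Poisson arithmetic: from the fraction of positive partitions, the mean copies per partition is recovered and scaled by partition volume. A generic sketch (droplet count and volume are illustrative):

    ```python
    import numpy as np

    def copies_per_ul(n_positive, n_total, partition_vol_nl):
        """dPCR quantification: lambda = -ln(1 - p) corrects for partitions
        holding more than one copy; scale by partition volume for copies/uL."""
        p = n_positive / n_total
        lam = -np.log1p(-p)                      # -ln(1 - p)
        return lam / (partition_vol_nl * 1e-3)   # nL -> uL

    # e.g. 12,000 positive out of 20,000 droplets of 0.85 nL each
    print(f"{copies_per_ul(12000, 20000, 0.85):.0f} copies/uL")
    ```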

  15. Accurate quantification of supercoiled DNA by digital PCR

    PubMed Central

    Dong, Lianhua; Yoo, Hee-Bong; Wang, Jing; Park, Sang-Ryoul

    2016-01-01

    Digital PCR (dPCR) as an enumeration-based quantification method is capable of quantifying the DNA copy number without the help of standards. However, it can generate false results when the PCR conditions are not optimized. A recent international comparison (CCQM P154) showed that most laboratories significantly underestimated the concentration of supercoiled plasmid DNA by dPCR. Mostly, supercoiled DNAs are linearized before dPCR to avoid such underestimations. The present study was conducted to overcome this problem. In the bilateral comparison, the National Institute of Metrology, China (NIM) optimized and applied dPCR for supercoiled DNA determination, whereas Korea Research Institute of Standards and Science (KRISS) prepared the unknown samples and quantified them by flow cytometry. In this study, several factors like selection of the PCR master mix, the fluorescent label, and the position of the primers were evaluated for quantifying supercoiled DNA by dPCR. This work confirmed that a 16S PCR master mix avoided poor amplification of the supercoiled DNA, whereas HEX labels on dPCR probe resulted in robust amplification curves. Optimizing the dPCR assay based on these two observations resulted in accurate quantification of supercoiled DNA without preanalytical linearization. This result was validated in close agreement (101~113%) with the result from flow cytometry. PMID:27063649

  16. Guided Wave Delamination Detection and Quantification With Wavefield Data Analysis

    NASA Technical Reports Server (NTRS)

    Tian, Zhenhua; Campbell Leckey, Cara A.; Seebo, Jeffrey P.; Yu, Lingyu

    2014-01-01

    Unexpected damage can occur in aerospace composites due to impact events or material stress during off-nominal loading events. In particular, laminated composites are susceptible to delamination damage due to weak transverse tensile and inter-laminar shear strengths. Developments of reliable and quantitative techniques to detect delamination damage in laminated composites are imperative for safe and functional optimally-designed next-generation composite structures. In this paper, we investigate guided wave interactions with delamination damage and develop quantification algorithms by using wavefield data analysis. The trapped guided waves in the delamination region are observed from the wavefield data and further quantitatively interpreted by using different wavenumber analysis methods. The frequency-wavenumber representation of the wavefield shows that new wavenumbers are present and correlate to trapped waves in the damage region. These new wavenumbers are used to detect and quantify the delamination damage through the wavenumber analysis, which can show how the wavenumber changes as a function of wave propagation distance. The location and spatial duration of the new wavenumbers can be identified, providing a useful means not only for detecting the presence of delamination damage but also allowing for estimation of the delamination size. Our method has been applied to detect and quantify real delamination damage with complex geometry (grown using a quasi-static indentation technique). The detection and quantification results show the location, size, and shape of the delamination damage.
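    The core wavenumber step is a space-time Fourier transform of the measured wavefield; new wavenumber content confined to a region of the plate indicates trapped, delamination-related modes. A minimal sketch on a synthetic single-mode wavefield:

    ```python
    import numpy as np

    def frequency_wavenumber(wavefield, dt, dx):
        """2D FFT of a u(t, x) wavefield into the frequency-wavenumber
        domain; returns magnitude plus the frequency and wavenumber axes."""
        spectrum = np.fft.fftshift(np.fft.fft2(wavefield))
        freqs = np.fft.fftshift(np.fft.fftfreq(wavefield.shape[0], dt))   # Hz
        waveno = np.fft.fftshift(np.fft.fftfreq(wavefield.shape[1], dx))  # cycles/m
        return np.abs(spectrum), freqs, waveno

    # toy wavefield: one 100 kHz guided mode with k = 200 rad/m
    t = np.linspace(0.0, 1e-4, 256)[:, None]
    x = np.linspace(0.0, 0.2, 128)[None, :]
    u = np.sin(2.0 * np.pi * 1e5 * t - 200.0 * x)
    mag, f, k = frequency_wavenumber(u, 1e-4 / 255, 0.2 / 127)
    peak = np.unravel_index(np.argmax(mag), mag.shape)
    print(f"peak at f = {f[peak[0]]/1e3:.1f} kHz, k = {k[peak[1]]:.1f} cycles/m")
    ```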

  17. Investigations of Some Liquid Matrixes for Analyte Quantification by MALDI

    NASA Astrophysics Data System (ADS)

    Moon, Jeong Hee; Park, Kyung Man; Ahn, Sung Hee; Lee, Seong Hoon; Kim, Myung Soo

    2015-06-01

    Sample inhomogeneity is one of the obstacles preventing the generation of reproducible mass spectra by MALDI and to their use for the purpose of analyte quantification. As a potential solution to this problem, we investigated MALDI with some liquid matrixes prepared by nonstoichiometric mixing of acids and bases. Out of 27 combinations of acids and bases, liquid matrixes could be produced from seven. When the overall spectral features were considered, two liquid matrixes using α-cyano-4-hydroxycinnamic acid as the acid and 3-aminoquinoline and N,N-diethylaniline as bases were the best choices. In our previous study of MALDI with solid matrixes, we found that three requirements had to be met for the generation of reproducible spectra and for analyte quantification: (1) controlling the temperature by fixing the total ion count, (2) plotting the analyte-to-matrix ion ratio versus the analyte concentration as the calibration curve, and (3) keeping the matrix suppression below a critical value. We found that the same requirements had to be met in MALDI with liquid matrixes as well. In particular, although the liquid matrixes tested here were homogeneous, they failed to display spot-to-spot spectral reproducibility unless the first requirement above was met. We also found that analyte-derived ions could not be produced efficiently by MALDI with the above liquid matrixes unless the analyte was sufficiently basic. In this sense, MALDI processes with solid and liquid matrixes should be regarded as complementary techniques rather than as competing ones.

  18. Quantification of Methylated Selenium, Sulfur, and Arsenic in the Environment

    PubMed Central

    Vriens, Bas; Ammann, Adrian A.; Hagendorfer, Harald; Lenz, Markus; Berg, Michael; Winkel, Lenny H. E.

    2014-01-01

    Biomethylation and volatilization of trace elements may contribute to their redistribution in the environment. However, quantification of volatile, methylated species in the environment is complicated by a lack of straightforward and field-deployable air sampling methods that preserve element speciation. This paper presents a robust and versatile gas trapping method for the simultaneous preconcentration of volatile selenium (Se), sulfur (S), and arsenic (As) species. Using HPLC-HR-ICP-MS and ESI-MS/MS analyses, we demonstrate that volatile Se and S species efficiently transform into specific non-volatile compounds during trapping, which enables the deduction of the original gaseous speciation. With minor adaptations, the presented HPLC-HR-ICP-MS method also allows for the quantification of 13 non-volatile methylated species and oxyanions of Se, S, and As in natural waters. Application of these methods in a peatland indicated that, at the selected sites, fluxes varied between 190–210 ng Se·m−2·d−1, 90–270 ng As·m−2·d−1, and 4–14 µg S·m−2·d−1, and contained at least 70% methylated Se and S species. In the surface water, methylated species were particularly abundant for As (>50% of total As). Our results indicate that methylation plays a significant role in the biogeochemical cycles of these elements. PMID:25047128

  19. Instantaneous Wavenumber Estimation for Damage Quantification in Layered Plate Structures

    NASA Technical Reports Server (NTRS)

    Mesnil, Olivier; Leckey, Cara A. C.; Ruzzene, Massimo

    2014-01-01

    This paper illustrates the application of instantaneous and local wavenumber damage quantification techniques for high frequency guided wave interrogation. The proposed methodologies can be considered as first steps towards a hybrid structural health monitoring/nondestructive evaluation (SHM/NDE) approach for damage assessment in composites. The challenges and opportunities related to the considered type of interrogation and signal processing are explored through the analysis of numerical data obtained via EFIT simulations of damage in CFRP plates. Realistic damage configurations are modeled from x-ray CT scan data of plates subjected to actual impacts, in order to accurately predict wave-damage interactions in terms of scattering and mode conversions. Simulation data are utilized to enhance the information provided by instantaneous and local wavenumbers and mitigate the complexity related to the multi-modal content of the plate response. Signal processing strategies considered for this purpose include modal decoupling through filtering in the frequency/wavenumber domain, the combination of displacement components, and the exploitation of polarization information for the various modes as evaluated through the dispersion analysis of the considered laminate lay-up sequence. The results presented assess the effectiveness of the proposed wavefield processing techniques as a hybrid SHM/NDE technique for damage detection and quantification in composite, plate-like structures.

  20. Quantification of human urinary exosomes by nanoparticle tracking analysis.

    PubMed

    Oosthuyzen, Wilna; Sime, Nicole E L; Ivy, Jessica R; Turtle, Emma J; Street, Jonathan M; Pound, John; Bath, Louise E; Webb, David J; Gregory, Christopher D; Bailey, Matthew A; Dear, James W

    2013-12-01

    Exosomes are vesicles that are released from the kidney into urine. They contain protein and RNA from the glomerulus and all sections of the nephron and represent a reservoir for biomarker discovery. Current methods for the identification and quantification of urinary exosomes are time consuming and only semi-quantitative. Nanoparticle tracking analysis (NTA) counts and sizes particles by measuring their Brownian motion in solution. In this study, we applied NTA to human urine and identified particles with a range of sizes. Using antibodies against the exosomal proteins CD24 and aquaporin 2 (AQP2), conjugated to a fluorophore, we could identify a subpopulation of CD24- and AQP2-positive particles of characteristic exosomal size. Extensive pre-NTA processing of urine was not necessary. However, the intra-assay variability in the measurement of exosome concentration was significantly reduced when an ultracentrifugation step preceded NTA. Without any sample processing, NTA tracked exosomal AQP2 upregulation induced by desmopressin stimulation of kidney collecting duct cells. Nanoparticle tracking analysis was also able to track changes in exosomal AQP2 concentration that followed desmopressin treatment of mice and a patient with central diabetes insipidus. When urine was stored at room temperature, 4°C or frozen, nanoparticle concentration was reduced; freezing at -80°C with the addition of protease inhibitors produced the least reduction. In conclusion, with appropriate sample storage, NTA has potential as a tool for the characterization and quantification of extracellular vesicles in human urine.
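    NTA converts each particle's measured diffusion coefficient to a hydrodynamic diameter through the Stokes-Einstein relation; a small sketch (water viscosity at 25 °C assumed):

    ```python
    import numpy as np

    def hydrodynamic_diameter_nm(diff_m2_s, temp_K=298.15, viscosity_Pa_s=8.9e-4):
        """Stokes-Einstein relation used by NTA: d = kT / (3*pi*eta*D)."""
        k_B = 1.380649e-23  # Boltzmann constant, J/K
        return k_B * temp_K / (3.0 * np.pi * viscosity_Pa_s * diff_m2_s) * 1e9

    # an exosome-sized particle diffusing at ~4.4e-12 m^2/s in water
    print(f"{hydrodynamic_diameter_nm(4.4e-12):.0f} nm")
    ```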

  1. Quantification of human urinary exosomes by nanoparticle tracking analysis

    PubMed Central

    Oosthuyzen, Wilna; Sime, Nicole E L; Ivy, Jessica R; Turtle, Emma J; Street, Jonathan M; Pound, John; Bath, Louise E; Webb, David J; Gregory, Christopher D; Bailey, Matthew A; Dear, James W

    2013-01-01

    Exosomes are vesicles that are released from the kidney into urine. They contain protein and RNA from the glomerulus and all sections of the nephron and represent a reservoir for biomarker discovery. Current methods for the identification and quantification of urinary exosomes are time consuming and only semi-quantitative. Nanoparticle tracking analysis (NTA) counts and sizes particles by measuring their Brownian motion in solution. In this study, we applied NTA to human urine and identified particles with a range of sizes. Using antibodies against the exosomal proteins CD24 and aquaporin 2 (AQP2), conjugated to a fluorophore, we could identify a subpopulation of CD24- and AQP2-positive particles of characteristic exosomal size. Extensive pre-NTA processing of urine was not necessary. However, the intra-assay variability in the measurement of exosome concentration was significantly reduced when an ultracentrifugation step preceded NTA. Without any sample processing, NTA tracked exosomal AQP2 upregulation induced by desmopressin stimulation of kidney collecting duct cells. Nanoparticle tracking analysis was also able to track changes in exosomal AQP2 concentration that followed desmopressin treatment of mice and a patient with central diabetes insipidus. When urine was stored at room temperature, 4°C or frozen, nanoparticle concentration was reduced; freezing at −80°C with the addition of protease inhibitors produced the least reduction. In conclusion, with appropriate sample storage, NTA has potential as a tool for the characterization and quantification of extracellular vesicles in human urine. PMID:24060994

  2. Quantification of red blood cells using atomic force microscopy.

    PubMed

    O'Reilly, M; McDonnell, L; O'Mullane, J

    2001-01-01

    For humans, the sizes and shapes of their red blood cells are important indicators of well-being. In this study, the feasibility of using the atomic force microscope (AFM) to provide the sizes and shapes of red blood cells has been investigated. An immobilisation procedure has been developed that enabled red blood cells to be reliably imaged by contact AFM in air. The shapes of the red blood cells were readily apparent in the AFM images. Various cell quantification parameters were investigated, including thickness, width, surface area and volume. Excellent correlation was found between the AFM-derived immobilised mean cell volume (IMCV) parameter and the mean cell volume (MCV) parameter used in current haematological practice. The correlation between MCV and IMCV values has validated the immobilisation procedure by demonstrating that the significant cell shrinkage that occurs during immobilisation and drying does not introduce quantification artifacts. Reliable IMCV values were obtained by quantifying 100 red blood cells, and this typically required 3-5 AFM images of 100 μm × 100 μm area. This work has demonstrated that the AFM can provide in a single test the red blood cell size and shape data needed in the assessment of human health.

  3. Quantification of HEV RNA by Droplet Digital PCR

    PubMed Central

    Nicot, Florence; Cazabat, Michelle; Lhomme, Sébastien; Marion, Olivier; Sauné, Karine; Chiabrando, Julie; Dubois, Martine; Kamar, Nassim; Abravanel, Florence; Izopet, Jacques

    2016-01-01

    The sensitivity of real-time PCR for hepatitis E virus (HEV) RNA quantification differs greatly among techniques. Standardized tools that measure the real quantity of virus are needed. We assessed the performance of a reverse transcription droplet digital PCR (RT-ddPCR) assay that gives absolute quantities of HEV RNA. Analytical and clinical validation was done on HEV genotypes 1, 3 and 4, and was based on open reading frame (ORF)3 amplification. The within-run and between-run reproducibilities were very good, the analytical sensitivity was 80 HEV RNA international units (IU)/mL and linearities of HEV genotype 1, 3 and 4 were very similar. Clinical validation based on 45 samples of genotype 1, 3 or 4 gave results that correlated well with a validated reverse transcription quantitative PCR (RT-qPCR) assay (Spearman rs = 0.89, p < 0.0001). The RT-ddPCR assay is a sensitive method and could be a promising tool for standardizing HEV RNA quantification in various sample types. PMID:27548205
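
Droplet digital PCR derives absolute concentration from the fraction of positive droplets through Poisson statistics, which is what makes it calibration-free. A minimal sketch; the droplet volume is a typical value for commercial ddPCR systems, assumed here rather than taken from the paper:

```python
import math

def ddpcr_copies_per_ul(positive, total, droplet_volume_nl=0.85):
    """Absolute target concentration from droplet counts. The mean number
    of copies per droplet follows from Poisson statistics on the fraction
    of positive droplets; the droplet volume (a typical ~0.85 nL) is an
    assumed instrument constant, not a value from the paper."""
    p = positive / total
    lam = -math.log(1.0 - p)                 # mean copies per droplet
    return lam / (droplet_volume_nl * 1e-3)  # nL -> uL, copies per uL

print(round(ddpcr_copies_per_ul(2500, 15000), 1))   # ~214.5 copies/uL
```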

  4. Simple and inexpensive quantification of ammonia in whole blood.

    PubMed

    Ayyub, Omar B; Behrens, Adam M; Heligman, Brian T; Natoli, Mary E; Ayoub, Joseph J; Cunningham, Gary; Summar, Marshall; Kofinas, Peter

    2015-01-01

    Quantification of ammonia in whole blood has applications in the diagnosis and management of many hepatic diseases, including cirrhosis and rare urea cycle disorders, amounting to more than 5 million patients in the United States. Current techniques for ammonia measurement suffer from limited range, poor resolution, false positives or large, complex sensor set-ups. Here we demonstrate a technique utilizing inexpensive reagents and simple methods for quantifying ammonia in 100 μL of whole blood. The sensor comprises a modified form of the indophenol reaction, which resists sources of destructive interference in blood, in conjunction with a cation-exchange membrane. The presented sensing scheme is selective against other amine-containing molecules such as amino acids and has a shelf life of at least 50 days. Additionally, the resulting system has high sensitivity and allows for the accurate, reliable quantification of ammonia in whole human blood samples over a range of at least 25 to 500 μM, which is clinically relevant for rare hyperammonemic disorders and liver disease. Furthermore, concentrations of 50 and 100 μM ammonia could be reliably discerned with p = 0.0001.

  5. [Quantification of ocular dominance for better management of eye disease].

    PubMed

    Chaumillon, R; Alahyane, N; Senot, P; Vergne, J; Lemoine, C; Doré-Mazars, K; Blouin, J; Vergilino-Perez, D; Guillaume, A

    2015-04-01

    The dominant eye is defined as the one we unconsciously choose when we have to perform monocular tasks. In the field of clinical neuro-ophthalmology, it is well-established that ocular dominance plays a key role in several eye diseases. Furthermore, the accurate quantification of ocular dominance is crucial with regard to certain surgical techniques. However, classical preoperative tests cannot determine the amount of ocular dominance. In order to obtain further insight into the phenomenon of ocular dominance, we study its influence at behavioral and neurophysiological levels (experiments 1 and 2). Based on these new data, we suggest a method to improve quantification of ocular dominance (experiment 3). We demonstrate that ocular dominance has an influence on hand movements and on interhemispheric transfer time. Moreover, we show that an analysis of the dynamics of saccades allows us to sort out participants with strong or weak ocular dominance. In conclusion, this better understanding of the phenomenon of ocular dominance, coupled with the analysis of saccadic dynamics, might, in the short or medium term, lead to the establishment of a quick and straightforward battery of tests allowing determination of the amount of ocular dominance for each patient.

  6. Investigations of Some Liquid Matrixes for Analyte Quantification by MALDI.

    PubMed

    Moon, Jeong Hee; Park, Kyung Man; Ahn, Sung Hee; Lee, Seong Hoon; Kim, Myung Soo

    2015-10-01

    Sample inhomogeneity is one of the obstacles preventing the generation of reproducible mass spectra by MALDI and to their use for the purpose of analyte quantification. As a potential solution to this problem, we investigated MALDI with some liquid matrixes prepared by nonstoichiometric mixing of acids and bases. Out of 27 combinations of acids and bases, liquid matrixes could be produced from seven. When the overall spectral features were considered, two liquid matrixes using α-cyano-4-hydroxycinnamic acid as the acid and 3-aminoquinoline and N,N-diethylaniline as bases were the best choices. In our previous study of MALDI with solid matrixes, we found that three requirements had to be met for the generation of reproducible spectra and for analyte quantification: (1) controlling the temperature by fixing the total ion count, (2) plotting the analyte-to-matrix ion ratio versus the analyte concentration as the calibration curve, and (3) keeping the matrix suppression below a critical value. We found that the same requirements had to be met in MALDI with liquid matrixes as well. In particular, although the liquid matrixes tested here were homogeneous, they failed to display spot-to-spot spectral reproducibility unless the first requirement above was met. We also found that analyte-derived ions could not be produced efficiently by MALDI with the above liquid matrixes unless the analyte was sufficiently basic. In this sense, MALDI processes with solid and liquid matrixes should be regarded as complementary techniques rather than as competing ones.

  7. Simple and Inexpensive Quantification of Ammonia in Whole Blood

    PubMed Central

    Ayyub, Omar B.; Behrens, Adam M.; Heligman, Brian T.; Natoli, Mary E.; Ayoub, Joseph J.; Cunningham, Gary; Summar, Marshall; Kofinas, Peter

    2015-01-01

    Quantification of ammonia in whole blood has applications in the diagnosis and management of many hepatic diseases, including cirrhosis and rare urea cycle disorders, amounting to more than 5 million patients in the United States. Current techniques for ammonia measurement suffer from limited range, poor resolution, false positives or large, complex sensor set-ups. Here we demonstrate a technique utilizing inexpensive reagents and simple methods for quantifying ammonia in 100 μL of whole blood. The sensor comprises a modified form of the indophenol reaction, which resists sources of destructive interference in blood, in conjunction with a cation-exchange membrane. The presented sensing scheme is selective against other amine-containing molecules such as amino acids and has a shelf life of at least 50 days. Additionally, the resulting system has high sensitivity and allows for the accurate, reliable quantification of ammonia in whole human blood samples over a range of at least 25 to 500 μM, which is clinically relevant for rare hyperammonemic disorders and liver disease. Furthermore, concentrations of 50 and 100 μM ammonia could be reliably discerned with p = 0.0001. PMID:25936660

  8. Ultrasound strain imaging for quantification of tissue function: cardiovascular applications

    NASA Astrophysics Data System (ADS)

    de Korte, Chris L.; Lopata, Richard G. P.; Hansen, Hendrik H. G.

    2013-03-01

    With ultrasound imaging, the motion and deformation of tissue can be measured. Tissue can be deformed by applying a force to it, and the resulting deformation is a function of its mechanical properties. Quantification of this resulting tissue deformation to assess the mechanical properties of tissue is called elastography. If the tissue under interrogation is actively deforming, the deformation is directly related to its function, and quantification of this deformation is normally referred to as 'strain imaging'. Elastography can be used for atherosclerotic plaque characterization, while the contractility of the heart or skeletal muscles can be assessed with strain imaging. We developed radio frequency (RF)-based ultrasound methods to assess the deformation at higher resolution and with higher accuracy than commercial methods using conventional image data (Tissue Doppler Imaging and 2D speckle tracking methods). However, the improvement in accuracy is mainly achieved when measuring strain along the ultrasound beam direction, i.e., in 1D. We further extended this method to multiple directions and further improved precision by using compounding of data acquired at multiple beam-steered angles. In arteries, the presence of vulnerable plaques may lead to acute events like stroke and myocardial infarction. Consequently, timely detection of these plaques is of great diagnostic value. Non-invasive ultrasound strain compounding is currently being evaluated as a diagnostic tool to identify the vulnerability of plaques. In the heart, we determined the strain locally and at high resolution, resulting in a local assessment, in contrast to conventional global functional parameters like cardiac output or shortening fraction.
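
RF-based strain imaging of the kind described rests on time-delay estimation between pre- and post-deformation RF lines; axial strain is then the spatial gradient of the displacement estimate. A simplified 1D sketch (window, hop and search-range parameters are illustrative; the authors' multi-angle compounding is not reproduced):

```python
import numpy as np

def axial_displacement(rf_pre, rf_post, win=64, hop=32, max_lag=10):
    """Estimate local axial displacement (in samples) by normalized
    cross-correlation of windowed pre-/post-deformation RF lines."""
    shifts = []
    for start in range(0, len(rf_pre) - win - max_lag, hop):
        ref = rf_pre[start:start + win]
        best_lag, best_rho = 0, -np.inf
        for lag in range(-max_lag, max_lag + 1):
            lo = start + lag
            if lo < 0 or lo + win > len(rf_post):
                continue
            rho = np.corrcoef(ref, rf_post[lo:lo + win])[0, 1]
            if rho > best_rho:
                best_rho, best_lag = rho, lag
        shifts.append(best_lag)
    return np.array(shifts)

# Axial strain is the spatial derivative of the displacement, e.g.:
# strain = np.gradient(axial_displacement(pre, post)) / hop
```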

  9. Sludge quantification at water treatment plant and its management scenario.

    PubMed

    Ahmad, Tarique; Ahmad, Kafeel; Alam, Mehtab

    2017-08-15

    A large volume of sludge is generated at water treatment plants during the purification of surface water for potable supplies. Handling and disposal of sludge require careful attention from civic bodies, plant operators, and environmentalists. Quantification of the sludge produced at the treatment plants is important to develop suitable management strategies for its economical and environmentally friendly disposal. The present study deals with the quantification of sludge using an empirical relation between turbidity, suspended solids, and coagulant dosing. Seasonal variation has a significant effect on the raw water quality received at the water treatment plants, and consequently sludge generation also varies. Yearly production of the sludge in a water treatment plant at Ghaziabad, India, is estimated to be 29,700 ton. Sustainable disposal of such a quantity of sludge is a challenging task under stringent environmental legislation. Several beneficial reuses of sludge in civil engineering and constructional work have been identified globally, such as raw material in manufacturing cement, bricks, and artificial aggregates, as cementitious material, and sand substitute in preparing concrete and mortar. About 54 to 60% sand, 24 to 28% silt, and 16% clay constitute the sludge generated at the water treatment plant under investigation. Characteristics of the sludge are found suitable for its potential utilization as locally available construction material for safe disposal. An overview of the sustainable management scenario involving beneficial reuses of the sludge has also been presented.
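
The abstract does not reproduce the fitted empirical relation, but the generic mass balance used at conventional alum plants illustrates how turbidity, suspended solids and coagulant dose combine into a sludge estimate. All coefficients below are illustrative textbook defaults, not the study's values:

```python
def daily_sludge_kg(flow_ml_per_day, turbidity_ntu, alum_dose_mg_l,
                    ss_per_ntu=1.3, solids_per_alum=0.44):
    """Dry sludge solids produced per day at a conventional alum plant,
    using a generic textbook mass balance (not the relation fitted in the
    study): suspended solids are estimated from turbidity, and each mg/L
    of alum dosed yields ~0.44 mg/L of hydroxide solids. Both
    coefficients are illustrative defaults."""
    solids_mg_l = ss_per_ntu * turbidity_ntu + solids_per_alum * alum_dose_mg_l
    return flow_ml_per_day * solids_mg_l   # (ML/day) * (mg/L) == kg/day

# e.g. 50 ML/day of raw water at 120 NTU with 30 mg/L alum:
print(round(daily_sludge_kg(50, 120, 30)))   # ~8460 kg/day
```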

  10. High-throughput quantification of early stages of phagocytosis.

    PubMed

    Yeo, Jeremy Changyu; Wall, Adam Alexander; Stow, Jennifer Lea; Hamilton, Nicholas Ahti

    2013-09-01

    Phagocytosis--the engulfment of cells and foreign bodies--is an important cellular process in innate immunity, development, and disease. Quantification of various stages of phagocytosis, especially in a rapid screening fashion, is an invaluable tool for elucidating protein function during this process. However, current methods for assessing phagocytosis are largely limited to flow cytometry and manual image-based assays, providing limited information. Here, we present an image-based, semi-automated phagocytosis assay to rapidly quantitate three distinct stages during the early engulfment of opsonized beads. Captured images are analyzed using the image-processing software ImageJ and quantified using a macro. Modifications to this method allowed quantification of phagocytosis only in fluorescently labeled transfected cells. Additionally, the time course of bead internalization could be measured using this approach. The assay could discriminate perturbations to stages of phagocytosis induced by known pharmacological inhibitors of filamentous actin and phosphoinositol-3-kinase. Our methodology offers the ability to automatically categorize large amounts of image data into the three early stages of phagocytosis within minutes, clearly demonstrating its potential value in investigating aberrant phagocytosis when manipulating proteins of interest in drug screens and disease.

  11. A critical view on microplastic quantification in aquatic organisms.

    PubMed

    Vandermeersch, Griet; Van Cauwenberghe, Lisbeth; Janssen, Colin R; Marques, Antonio; Granby, Kit; Fait, Gabriella; Kotterman, Michiel J J; Diogène, Jorge; Bekaert, Karen; Robbens, Johan; Devriese, Lisa

    2015-11-01

    Microplastics, plastic particles and fragments smaller than 5 mm, are ubiquitous in the marine environment. Ingestion and accumulation of microplastics have previously been demonstrated for diverse marine species ranging from zooplankton to bivalves and fish, implying the potential for microplastics to accumulate in the marine food web. In this way, microplastics can potentially impact food safety and human health. Although a few methods to quantify microplastics in biota have been described, no comparison and/or intercalibration of these techniques have been performed. Here we conducted a literature review on all available extraction and quantification methods. Two of these methods, involving wet acid destruction, were used to evaluate the presence of microplastics in field-collected mussels (Mytilus galloprovincialis) from three different "hotspot" locations in Europe (Po estuary, Italy; Tagus estuary, Portugal; Ebro estuary, Spain). An average of 0.18±0.14 total microplastics g⁻¹ w.w. for the Acid mix Method and 0.12±0.04 total microplastics g⁻¹ w.w. for the Nitric acid Method was established. Additionally, in a pilot study an average load of 0.13±0.14 total microplastics g⁻¹ w.w. was recorded in commercial mussels (Mytilus edulis and M. galloprovincialis) from five European countries (France, Italy, Denmark, Spain and The Netherlands). A detailed analysis and comparison of methods indicated the need for further research to develop a standardised operating protocol for microplastic quantification and monitoring.

  12. Applying uncertainty quantification to multiphase flow computational fluid dynamics

    SciTech Connect

    Gel, A; Garg, R; Tong, C; Shahnam, M; Guenther, C

    2013-07-01

    Multiphase computational fluid dynamics plays a major role in design and optimization of fossil fuel based reactors. There is a growing interest in accounting for the influence of uncertainties associated with physical systems to increase the reliability of computational simulation based engineering analysis. The U.S. Department of Energy's National Energy Technology Laboratory (NETL) has recently undertaken an initiative to characterize uncertainties associated with computer simulation of reacting multiphase flows encountered in energy producing systems such as a coal gasifier. The current work presents the preliminary results in applying non-intrusive parametric uncertainty quantification and propagation techniques with NETL's open-source multiphase computational fluid dynamics software MFIX. For this purpose an open-source uncertainty quantification toolkit, PSUADE developed at the Lawrence Livermore National Laboratory (LLNL) has been interfaced with MFIX software. In this study, the sources of uncertainty associated with numerical approximation and model form have been neglected, and only the model input parametric uncertainty with forward propagation has been investigated by constructing a surrogate model based on data-fitted response surface for a multiphase flow demonstration problem. Monte Carlo simulation was employed for forward propagation of the aleatory type input uncertainties. Several insights gained based on the outcome of these simulations are presented such as how inadequate characterization of uncertainties can affect the reliability of the prediction results. Also a global sensitivity study using Sobol' indices was performed to better understand the contribution of input parameters to the variability observed in response variable.
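
The forward-propagation step described (sampling input uncertainties and pushing them through a data-fitted response surface) can be sketched in a few lines; the surrogate below is a stand-in toy function, not an MFIX or PSUADE artifact:

```python
import numpy as np

rng = np.random.default_rng(0)

def surrogate(x):
    """Stand-in response surface (e.g. a fitted carbon-conversion model);
    a real study would fit this to a design of MFIX simulation runs."""
    drag_mult, particle_dia = x[..., 0], x[..., 1]
    return 0.7 + 0.2 * drag_mult - 0.1 * particle_dia ** 2

# Aleatory input uncertainties, forward-propagated by Monte Carlo:
n = 100_000
inputs = np.column_stack([
    rng.normal(1.0, 0.1, n),     # drag-model multiplier (hypothetical)
    rng.uniform(0.8, 1.2, n),    # normalized particle diameter (hypothetical)
])
response = surrogate(inputs)
print(f"mean={response.mean():.4f}, std={response.std():.4f}")
print(f"central 95% interval: ({np.quantile(response, 0.025):.4f}, "
      f"{np.quantile(response, 0.975):.4f})")
```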

  13. Cell-based quantification of molecular biomarkers in histopathology specimens.

    PubMed

    Al-Kofahi, Yousef; Lassoued, Wiem; Grama, Kedar; Nath, Sumit K; Zhu, Jianliang; Oueslati, Ridha; Feldman, Michael; Lee, William M F; Roysam, Badrinath

    2011-07-01

    To investigate the use of a computer-assisted technology for objective, cell-based quantification of molecular biomarkers in specified cell types in histopathology specimens, with the aim of advancing current visual estimation and pixel-level (rather than cell-based) quantification methods. Tissue specimens were multiplex-immunostained to reveal cell structures, cell type markers, and analytes, and imaged with multispectral microscopy. The image data were processed with novel software that automatically delineates and types each cell in the field, measures morphological features, and quantifies analytes in different subcellular compartments of specified cells. The methodology was validated with the use of cell blocks composed of differentially labelled cultured cells mixed in known proportions, and evaluated on human breast carcinoma specimens for quantifying human epidermal growth factor receptor 2, estrogen receptor, progesterone receptor, Ki67, phospho-extracellular signal-related kinase, and phospho-S6. Automated cell-level analyses closely matched human assessments, but, predictably, differed from pixel-level analyses of the same images. Our method reveals the type, distribution, morphology and biomarker state of each cell in the field, and allows multiple biomarkers to be quantified over specified cell types, regardless of their abundance. It is ideal for studying specimens from patients in clinical trials of targeted therapeutic agents, for investigating minority stromal cell subpopulations, and for phenotypic characterization to personalize therapy and prognosis.

  14. Cell-based quantification of molecular biomarkers in histopathology specimens

    PubMed Central

    Al-Kofahi, Yousef; Lassoued, Wiem; Grama, Kedar; Nath, Sumit K; Zhu, Jianliang; Oueslati, Ridha; Feldman, Michael; Lee, William M F; Roysam, Badrinath

    2011-01-01

    Aims To investigate the use of a computer-assisted technology for objective, cell-based quantification of molecular biomarkers in specified cell types in histopathology specimens, with the aim of advancing current visual estimation or pixel-level (rather than cell-based) quantification methods. Methods and results Tissue specimens were multiplex-immunostained to reveal cell structures, cell type markers, and analytes, and imaged with multispectral microscopy. The image data were processed with novel software that automatically delineates and types each cell in the field, measures morphological features, and quantifies analytes in different subcellular compartments of specified cells. The methodology was validated with the use of cell blocks composed of differentially labelled cultured cells mixed in known proportions, and evaluated on human breast carcinoma specimens for quantifying human epidermal growth factor receptor 2, oestrogen receptor, progesterone receptor, Ki67, phospho-extracellular signal-related kinase, and phospho-S6. Automated cell-level analyses closely matched human assessments, but, predictably, differed from pixel-level analyses of the same images. Conclusions Our method reveals the type, distribution, morphology and biomarker state of each cell in the field, and allows multiple biomarkers to be quantified over specified cell types, regardless of abundance. It is ideal for studying specimens from patients in clinical trials of targeted therapeutic agents, for investigating minority stromal cell subpopulations, and for phenotypic characterization to personalize therapy and prognosis. PMID:21771025

  15. Quantification of nerve agent biomarkers in human serum and urine.

    PubMed

    Røen, Bent Tore; Sellevåg, Stig Rune; Lundanes, Elsa

    2014-12-02

    A novel method for rapid and sensitive quantification of the nerve agent metabolites ethyl, isopropyl, isobutyl, cyclohexyl, and pinacolyl methylphosphonic acid has been established by combining salting-out assisted liquid-liquid extraction (SALLE) and online solid phase extraction-liquid chromatography-tandem mass spectrometry (SPE-LC-MS/MS). The procedure allows confirmation of nerve agent exposure within 30 min from receiving a sample, with very low detection limits for the biomarkers of 0.04-0.12 ng/mL. Sample preparation by SALLE was performed in less than 10 min, with a common procedure for both serum and urine. Analyte recoveries of 70-100% were obtained using tetrahydrofuran as extraction solvent and Na₂SO₄ to achieve phase separation. After SALLE, selective analyte retention was obtained on a ZrO₂ column by Lewis acid-base and hydrophilic interactions with acetonitrile/1% CH₃COOH (82/18) as the loading mobile phase. The phosphonic acids were backflush-desorbed onto a polymeric zwitterionic column at pH 9.8 and separated by hydrophilic interaction liquid chromatography. The method was linear (R² ≥ 0.995) from the limits of quantification to 50 ng/mL, and the within- and between-assay repeatability at 20 ng/mL were below 5% and 10% relative standard deviation, respectively.

  16. Quantification of punctate iron sources using magnetic resonance phase.

    PubMed

    McAuley, Grant; Schrag, Matthew; Sipos, Pál; Sun, Shu-Wei; Obenaus, Andre; Neelavalli, Jaladhar; Haacke, E Mark; Holshouser, Barbara; Madácsi, Ramóna; Kirsch, Wolff

    2010-01-01

    Iron-mediated tissue damage is present in cerebrovascular and neurodegenerative diseases and neurotrauma. Brain microbleeds are often present in these maladies and are assuming increasing clinical importance. Because brain microbleeds present a source of pathologic iron to the brain, the noninvasive quantification of this iron pool is potentially valuable. Past efforts to quantify brain iron have focused on content estimation within distributed brain regions. In addition, conventional approaches using "magnitude" images have met significant limitations. In this study, a technique is presented to quantify the iron content of punctate samples using phase images. Samples are modeled as magnetic dipoles and phase shifts due to local dipole field perturbations are mathematically related to sample iron content and radius using easily recognized geometric features in phase images. Phantoms containing samples of a chitosan-ferric oxyhydroxide composite (which serves as a mimic for hemosiderin) were scanned with a susceptibility-weighted imaging sequence at 11.7 T. Plots relating sample iron content and radius to phase image features were compared to theoretical predictions. The primary result is the validation of the technique by the excellent agreement between theory and the iron content plot. This research is a potential first step toward quantification of punctate brain iron sources such as brain microbleeds.
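
The dipole model referred to here has a standard closed form: the field perturbation outside a small spherical source, and the gradient-echo phase it induces. Written out as a sketch (sign conventions vary between implementations):

```latex
% Field perturbation outside a small spherical source of radius a and
% susceptibility shift \Delta\chi, and the gradient-echo phase it induces
% at echo time T_E:
\Delta B_z(r,\theta) = \frac{\Delta\chi}{3}
    \left(\frac{a}{r}\right)^{3}\left(3\cos^{2}\theta - 1\right) B_0,
\qquad
\Delta\phi(r,\theta) = -\gamma\,\Delta B_z(r,\theta)\,T_E .
```

Because Δχ scales with the iron load of a hemosiderin-like source, geometric features of the measured phase pattern constrain both the radius a and the iron content.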

  17. A Spanish model for quantification and management of construction waste.

    PubMed

    Solís-Guzmán, Jaime; Marrero, Madelyn; Montes-Delgado, Maria Victoria; Ramírez-de-Arellano, Antonio

    2009-09-01

    Currently, construction and demolition waste (C&D waste) is a worldwide issue that concerns not only governments but also the building actors involved in construction activity. In Spain, a new national decree has been regulating the production and management of C&D waste since February 2008. The present work describes the waste management model that has inspired this decree: the Alcores model implemented with good results in Los Alcores Community (Seville, Spain). A detailed model is also provided to estimate the volume of waste that is expected to be generated on the building site. The quantification of C&D waste volume, from the project stage, is essential for the building actors to properly plan and control its disposal. This quantification model has been developed by studying 100 dwelling projects, especially their bill of quantities, and defining three coefficients to estimate the demolished volume (CT), the wreckage volume (CR) and the packaging volume (CE). Finally, two case studies are included to illustrate the usefulness of the model to estimate C&D waste volume in both new construction and demolition projects.
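
In outline, the model applies its three coefficients to project quantities to obtain expected waste volumes. A deliberately simplified sketch, applying hypothetical coefficient values to a single built volume rather than to the full bill of quantities the paper uses:

```python
def cd_waste_volume(built_volume_m3, ct=0.02, cr=0.11, ce=0.08):
    """Apply the model's three coefficients, demolished volume (CT),
    wreckage volume (CR) and packaging volume (CE), to a built volume.
    The paper derives the coefficients from the bills of quantities of
    100 dwelling projects; the values used here are purely illustrative."""
    return {
        "demolition": ct * built_volume_m3,
        "wreckage":   cr * built_volume_m3,
        "packaging":  ce * built_volume_m3,
        "total":      (ct + cr + ce) * built_volume_m3,
    }

print(cd_waste_volume(5000))   # a hypothetical ~5,000 m3 dwelling block
```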

  18. Pancreas++: Automated Quantification of Pancreatic Islet Cells in Microscopy Images

    PubMed Central

    Chen, Hongyu; Martin, Bronwen; Cai, Huan; Fiori, Jennifer L.; Egan, Josephine M.; Siddiqui, Sana; Maudsley, Stuart

    2013-01-01

    The microscopic image analysis of pancreatic Islet of Langerhans morphology is crucial for the investigation of diabetes and metabolic diseases. Besides the general size of the islet, the percentage and relative position of glucagon-containing alpha- and insulin-containing beta-cells is also important for pathophysiological analyses, especially in rodents. Hence, the ability to identify, quantify and spatially locate peripheral, and “involuted” alpha-cells in the islet core is an important analytical goal. There is a dearth of software available for the automated and sophisticated positional quantification of multiple cell types in the islet core. Manual analytical methods for these analyses, while relatively accurate, can suffer from a slow throughput rate as well as user-based biases. Here we describe a newly developed pancreatic islet analytical software program, Pancreas++, which facilitates the fully automated, non-biased, and highly reproducible investigation of islet area and alpha- and beta-cell quantity as well as position within the islet for either single or large batches of fluorescent images. We demonstrate the utility and accuracy of Pancreas++ by comparing its performance to other pancreatic islet size and cell type (alpha, beta) quantification methods. Our Pancreas++ analysis was significantly faster than other methods, while still retaining low error rates and a high degree of result correlation with the manually generated reference standard. PMID:23293605

  19. A Spanish model for quantification and management of construction waste

    SciTech Connect

    Solis-Guzman, Jaime Marrero, Madelyn; Montes-Delgado, Maria Victoria; Ramirez-de-Arellano, Antonio

    2009-09-15

    Currently, construction and demolition waste (C and D waste) is a worldwide issue that concerns not only governments but also the building actors involved in construction activity. In Spain, a new national decree has been regulating the production and management of C and D waste since February 2008. The present work describes the waste management model that has inspired this decree: the Alcores model implemented with good results in Los Alcores Community (Seville, Spain). A detailed model is also provided to estimate the volume of waste that is expected to be generated on the building site. The quantification of C and D waste volume, from the project stage, is essential for the building actors to properly plan and control its disposal. This quantification model has been developed by studying 100 dwelling projects, especially their bill of quantities, and defining three coefficients to estimate the demolished volume (CT), the wreckage volume (CR) and the packaging volume (CE). Finally, two case studies are included to illustrate the usefulness of the model to estimate C and D waste volume in both new construction and demolition projects.

  20. Concurrent quantification of tryptophan and its major metabolites

    PubMed Central

    Lesniak, Wojciech G.; Jyoti, Amar; Mishra, Manoj K.; Louissaint, Nicolette; Romero, Roberto; Chugani, Diane C.; Kannan, Sujatha; Kannan, Rangaramanujam M.

    2014-01-01

    An imbalance in tryptophan (TRP) metabolites is associated with several neurological and inflammatory disorders. Therefore, analytical methods allowing for simultaneous quantification of TRP and its major metabolites would be highly desirable, and may be valuable as potential biomarkers. We have developed a HPLC method for concurrent quantitative determination of tryptophan, serotonin, 5-hydroxyindoleacetic acid, kynurenine, and kynurenic acid in tissue and fluids. The method utilizes the intrinsic spectroscopic properties of TRP and its metabolites that enable UV absorbance and fluorescence detection by HPLC, without additional labeling. The origin of the peaks related to analytes of interest was confirmed by UV–Vis spectral patterns using a PDA detector and mass spectrometry. The developed methods were validated in rabbit fetal brain and amniotic fluid at gestational day 29. Results are in excellent agreement with those reported in the literature for the same regions. This method allows for rapid quantification of tryptophan and four of its major metabolites concurrently. A change in the relative ratios of these metabolites can provide important insights in predicting the presence and progression of neuroinflammation in disorders such as cerebral palsy, autism, multiple sclerosis, Alzheimer disease, and schizophrenia. PMID:24036037

  1. Quantification of Partially Ordered Sets with Application to Special Relativity

    NASA Astrophysics Data System (ADS)

    Bahreyni, Newshaw; Knuth, Kevin H.

    2011-03-01

    A partially ordered set is a set of elements ordered by a binary ordering relation. We have shown that a subset of a partially ordered set can be quantified by projecting elements onto a pair of chains where the elements of each chain are quantified by real numbers. This results in a quantification based on pairs of real numbers (pair). Intervals, defined by pairs of elements, can be quantified similarly. A pair can be decomposed into a sum of a symmetric pair and an antisymmetric pair and mapped to a unique scalar which results in the Minkowskian form. Changing the basis of quantification from one pair of chains to another, under special conditions, leads to the generalized Lorentz transformation for pairs. We apply these results to a causally-ordered set of events by identifying a chain of events with an observer equipped with a clock in an inertial frame. We obtain the Minkowski metric of flat space-time as well as Lorentz transformations, which results in there being a maximum invariant speed. We find that the mathematics of special relativity arises from quantifying causal relationships among events, and requires neither the principle of relativity nor the fact that the speed of light is constant.
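
The decomposition summarized above can be written compactly. Following the abstract's construction (a sketch of the algebra, not the full derivation):

```latex
% A pair (\Delta p, \Delta q) quantifying an interval decomposes into
% symmetric and antisymmetric parts,
\Delta t = \tfrac{1}{2}\,(\Delta p + \Delta q), \qquad
\Delta x = \tfrac{1}{2}\,(\Delta p - \Delta q),
% and the unique scalar formed from the pair is the product
\Delta s^{2} = \Delta p\,\Delta q = \Delta t^{2} - \Delta x^{2},
% which has the Minkowskian form of the flat space-time interval.
```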

  2. Dimensionality reduction for uncertainty quantification of nuclear engineering models.

    SciTech Connect

    Roderick, O.; Wang, Z.; Anitescu, M.

    2011-01-01

    The task of uncertainty quantification consists of relating the available information on uncertainties in the model setup to the resulting variation in the outputs of the model. Uncertainty quantification plays an important role in complex simulation models of nuclear engineering, where better understanding of uncertainty results in greater confidence in the model and in the improved safety and efficiency of engineering projects. In our previous work, we have shown that the effect of uncertainty can be approximated by polynomial regression with derivatives (PRD): a hybrid regression method that uses first-order derivatives of the model output as additional fitting conditions for a polynomial expansion. Numerical experiments have demonstrated the advantage of this approach over classical methods of uncertainty analysis: in precision, computational efficiency, or both. To obtain derivatives, we used automatic differentiation (AD) on the simulation code; hand-coded derivatives are acceptable for simpler models. We now present improvements on the method. We use a tuned version of the method of snapshots, a technique based on proper orthogonal decomposition (POD), to set up the reduced order representation of essential information on uncertainty in the model inputs. The automatically obtained sensitivity information is required to set up the method. Dimensionality reduction in combination with PRD allows analysis on a larger dimension of the uncertainty space (>100), at modest computational cost.
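
PRD augments an ordinary polynomial least-squares system with rows encoding output derivatives, which is what lets a few model runs constrain many coefficients. A 1D sketch with toy data standing in for model outputs and AD-supplied gradients:

```python
import numpy as np

def prd_fit(x, y, dy, degree=3):
    """Polynomial regression with derivatives (PRD), 1D sketch: solve the
    least-squares system whose rows encode both model outputs y and model
    output derivatives dy (e.g. from automatic differentiation)."""
    powers = np.arange(degree + 1)
    rows_val = x[:, None] ** powers                              # f(x_i)
    rows_der = powers * x[:, None] ** np.maximum(powers - 1, 0)  # f'(x_i)
    a = np.vstack([rows_val, rows_der])
    coeffs, *_ = np.linalg.lstsq(a, np.concatenate([y, dy]), rcond=None)
    return coeffs

x = np.array([0.0, 0.5, 1.0])
y = x**3 - x                  # toy model outputs
dy = 3 * x**2 - 1             # toy derivatives
print(np.round(prd_fit(x, y, dy), 6))   # ~[0, -1, 0, 1]
```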

  3. Volumetric loss quantification using ultrasonic inductively coupled transducers

    NASA Astrophysics Data System (ADS)

    Gong, Peng; Hay, Thomas R.; Greve, David W.; Oppenheim, Irving J.

    2015-03-01

    The pulse-echo method is widely used for plate and pipe thickness measurement. However, the pulse echo method does not work well for detecting localized volumetric loss in thick-wall tubes, as created by erosion damage, when the morphology of volumetric loss is irregular and can reflect ultrasonic pulses away from the transducer, making it difficult to detect an echo. In this paper, we propose a novel method using an inductively coupled transducer to generate longitudinal waves propagating in a thick-wall aluminum tube for the volumetric loss quantification. In the experiment, longitudinal waves exhibit diffraction effects during the propagation which can be explained by the Huygens-Fresnel principle. The diffractive waves are also shown to be significantly delayed by the machined volumetric loss on the inside surface of the thick-wall aluminum tube. It is also shown that the inductively coupled transducers can generate and receive similar ultrasonic waves to those from wired transducers, and the inductively coupled transducers perform as well as the wired transducers in the volumetric loss quantification when other conditions are the same.

  4. Selected Reaction Monitoring Mass Spectrometry for Absolute Protein Quantification.

    PubMed

    Manes, Nathan P; Mann, Jessica M; Nita-Lazar, Aleksandra

    2015-08-17

    Absolute quantification of target proteins within complex biological samples is critical to a wide range of research and clinical applications. This protocol provides step-by-step instructions for the development and application of quantitative assays using selected reaction monitoring (SRM) mass spectrometry (MS). First, likely quantotypic target peptides are identified based on numerous criteria. This includes identifying proteotypic peptides, avoiding sites of posttranslational modification, and analyzing the uniqueness of the target peptide to the target protein. Next, crude external peptide standards are synthesized and used to develop SRM assays, and the resulting assays are used to perform qualitative analyses of the biological samples. Finally, purified, quantified, heavy isotope labeled internal peptide standards are prepared and used to perform isotope dilution series SRM assays. Analysis of all of the resulting MS data is presented. This protocol was used to accurately assay the absolute abundance of proteins of the chemotaxis signaling pathway within RAW 264.7 cells (a mouse monocyte/macrophage cell line). The quantification of Gi2 (a heterotrimeric G-protein α-subunit) is described in detail.
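
The quantification step of isotope-dilution SRM reduces to simple arithmetic once peak areas are integrated: the endogenous amount is the light-to-heavy area ratio scaled by the spiked standard. A sketch with hypothetical numbers:

```python
def absolute_amount_fmol(light_area, heavy_area, heavy_spike_fmol):
    """Isotope-dilution SRM arithmetic: the endogenous ('light') peptide
    amount equals the light/heavy transition peak-area ratio scaled by
    the known amount of heavy-labeled internal standard spiked in."""
    return (light_area / heavy_area) * heavy_spike_fmol

# Hypothetical integrated areas, with 25 fmol of standard spiked in:
print(absolute_amount_fmol(8.4e5, 2.1e5, 25.0))   # 100.0 fmol endogenous
```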

  5. Automatic Segmentation and Quantification of Filamentous Structures in Electron Tomography

    PubMed Central

    Loss, Leandro A.; Bebis, George; Chang, Hang; Auer, Manfred; Sarkar, Purbasha; Parvin, Bahram

    2016-01-01

    Electron tomography is a promising technology for imaging ultrastructures at nanoscale resolutions. However, image and quantitative analyses are often hindered by high levels of noise, staining heterogeneity, and material damage either as a result of the electron beam or sample preparation. We have developed and built a framework that allows for automatic segmentation and quantification of filamentous objects in 3D electron tomography. Our approach consists of three steps: (i) local enhancement of filaments by Hessian filtering; (ii) detection and completion (e.g., gap filling) of filamentous structures through tensor voting; and (iii) delineation of the filamentous networks. Our approach allows for quantification of filamentous networks in terms of their compositional and morphological features. We first validate our approach using a set of specifically designed synthetic data. We then apply our segmentation framework to tomograms of plant cell walls that have undergone different chemical treatments for polysaccharide extraction. The subsequent compositional and morphological analyses of the plant cell walls reveal their organizational characteristics and the effects of the different chemical protocols on specific polysaccharides. PMID:28090597

  6. PROSAD: a powerful platform for instrument calibration and quantification.

    PubMed

    Floridia, Matteo; Cristoni, Simone

    2014-03-15

    There is a critical need for instrument calibration and correction procedures that can improve the quality of mass spectral quantitative results. Currently, mass spectrometry (MS) technologies suffer from certain biases related to instrumental responses, which tend to restrict its application. To overcome these biases, we developed the PROgressive SAmple Dosage (PROSAD) platform and tested it. PROSAD, an optimized sample preparation and data analysis method, is used in conjunction with a liquid chromatography (LC)/MS system and a low-voltage ionization source (e.g., no-discharge atmospheric pressure chemical ionization (ND-APCI)). The mass spectrometers used for this report were an HCT Ultra ion trap, a LTQ XL Orbitrap, and a TSQ Vantage triple-stage quadrupole. The PROSAD elaborative system, because of its dedicated mathematical algorithm, provided a dynamic linear calibration check and correction. We tested PROSAD using a leucomalachite green-fish homogenate assay. Atrazine in tea-matrix samples was also quantified. Better quantification was achieved using PROSAD compared with the classic linear, static calibration procedure in both test cases. PROSAD provides a dynamically optimized calibration curve that affords increased stability, accuracy, and precision for the quantification of MS data.

  7. Image quantification of high-throughput tissue microarray

    NASA Astrophysics Data System (ADS)

    Wu, Jiahua; Dong, Junyu; Zhou, Huiyu

    2006-03-01

    Tissue microarray (TMA) technology allows rapid visualization of molecular targets in thousands of tissue specimens at a time and provides valuable information on expression of proteins within tissues at a cellular and sub-cellular level. TMA technology overcomes the bottleneck of traditional tissue analysis and allows it to catch up with the rapid advances in lead discovery. Studies using TMA on immunohistochemistry (IHC) can produce a large number of images for interpretation within a very short time. Manual interpretation does not allow accurate quantitative analysis of staining to be undertaken. Automatic image capture and analysis has been shown to be superior to manual interpretation. The aim of this work is to develop a truly high-throughput and fully automated image capture and analysis system. We develop a robust colour segmentation algorithm using hue-saturation-intensity (HSI) colour space to provide quantification of signal intensity and partitioning of staining on high-throughput TMA. Initial segmentation results and quantification data have been achieved on 16,000 TMA colour images over 23 different tissue types.
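
An HSI-based colour segmentation of the kind described converts RGB pixels to hue, saturation and intensity and keeps pixels inside a stain-specific window. A self-contained sketch; the window bounds are illustrative, not the authors' calibrated values:

```python
import numpy as np

def rgb_to_hsi(rgb):
    """Classic HSI conversion for an RGB image scaled to [0, 1];
    returns hue in radians, saturation, and intensity."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    intensity = (r + g + b) / 3.0
    saturation = 1.0 - np.minimum(np.minimum(r, g), b) / np.maximum(intensity, 1e-8)
    num = 0.5 * ((r - g) + (r - b))
    den = np.sqrt((r - g) ** 2 + (r - b) * (g - b)) + 1e-8
    hue = np.arccos(np.clip(num / den, -1.0, 1.0))
    return np.where(b > g, 2 * np.pi - hue, hue), saturation, intensity

def stain_mask(rgb, hue_lo=3.5, hue_hi=5.0, sat_min=0.15, int_max=0.85):
    """Flag pixels whose HSI coordinates fall inside a stain-specific
    window; the bounds here are illustrative, not calibrated values."""
    h, s, i = rgb_to_hsi(rgb)
    return (h > hue_lo) & (h < hue_hi) & (s > sat_min) & (i < int_max)

# stain_mask(img).mean() then gives the stained-area fraction per image.
```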

  8. Automated quantification of nuclear immunohistochemical markers with different complexity.

    PubMed

    López, Carlos; Lejeune, Marylène; Salvadó, María Teresa; Escrivà, Patricia; Bosch, Ramón; Pons, Lluis E; Alvaro, Tomás; Roig, Jordi; Cugat, Xavier; Baucells, Jordi; Jaén, Joaquín

    2008-03-01

    Manual quantification of immunohistochemically stained nuclear markers is still laborious and subjective, and the use of computerized systems for digital image analysis has not yet resolved the problems of nuclear clustering. In this study, we designed a new automatic procedure for quantifying various immunohistochemical nuclear markers with variable clustering complexity. This procedure consisted of two combined macros. The first, developed with commercial software, enabled the analysis of the digital images using color and morphological segmentation, including a masking process. All information extracted with this first macro was automatically exported to an Excel datasheet, where a second macro composed of four different algorithms analyzed all the information and calculated the definitive number of positive nuclei for each image. One hundred and eighteen images with different levels of clustering complexity were analyzed and compared with the manual quantification obtained by a trained observer. Statistical analysis indicated great reliability (intra-class correlation coefficient > 0.950) and no significant differences between the two methods. Bland-Altman plots and Kaplan-Meier curves indicated that the results of both methods were concordant for around 90% of the analyzed images. In conclusion, this new automated procedure is an objective, faster and reproducible method that has an excellent level of accuracy, even with digital images of high complexity.

  9. Genome-scale Proteome Quantification by DEEP SEQ Mass Spectrometry

    PubMed Central

    Zhou, Feng; Lu, Yu; Ficarro, Scott B.; Adelmant, Guillaume; Jiang, Wenyu; Luckey, C. John; Marto, Jarrod A.

    2013-01-01

    Advances in chemistry and massively parallel detection underlie DNA sequencing platforms that are poised for application in personalized medicine. In stark contrast, systematic generation of protein-level data lags well behind genomics in virtually every aspect: depth of coverage, throughput, ease of sample preparation, and experimental time. Here, to bridge this gap, we develop an approach based on simple detergent lysis and single-enzyme digest, extreme, orthogonal separation of peptides, and true nanoflow LC-MS/MS that provides high peak capacity and ionization efficiency. This automated, deep efficient peptide sequencing and quantification (DEEP SEQ) mass spectrometry platform provides genome-scale proteome coverage equivalent to RNA-seq ribosomal profiling and accurate quantification for multiplexed isotope labels. In a model of the embryonic to epiblast transition in murine stem cells, we unambiguously quantify 11,352 gene products that span 70% of Swiss-Prot and capture protein regulation across the full detectable range of high-throughput gene expression and protein translation. PMID:23863870

  10. Quantification of biofilm biomass by staining: Non-toxic safranin can replace the popular crystal violet.

    PubMed

    Ommen, Pernille; Zobek, Natalia; Meyer, Rikke Louise

    2017-10-01

    Crystal violet staining is commonly used for quantification of biofilm formation, although it is highly toxic. Here we test safranin as a non-toxic replacement. Safranin staining provided similar results as crystal violet, but with higher reproducibility. We therefore recommend safranin staining for biofilm biomass quantification.

  11. Advanced Quantification of Plutonium Ionization Potential to Support Nuclear Forensic Evaluations by Resonance Ionization Mass Spectrometry

    DTIC Science & Technology

    2015-06-01

    Advanced Quantification of Plutonium Ionization Potential to Support Nuclear Forensic Evaluations by Resonance Ionization Mass Spectrometry; thesis by Craig T... The work applies resonance ionization mass spectrometry (RIMS) to problems related to nuclear forensics and, in particular, to the analysis and quantification of the debris from nuclear...

  12. 21 CFR 530.24 - Procedure for announcing analytical methods for drug residue quantification.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... drug residue quantification. 530.24 Section 530.24 Food and Drugs FOOD AND DRUG ADMINISTRATION...-Producing Animals § 530.24 Procedure for announcing analytical methods for drug residue quantification. (a... extralabel use drug residues above the safe levels established under § 530.22 for extralabel use of an...

  13. Magnetic Resonance Spectroscopy: An Objective Technique for the Quantification of Prostate Cancer Pathologies

    DTIC Science & Technology

    2005-02-01

    Magnetic Resonance Spectroscopy: An Objective Technique for the Quantification of Prostate Cancer Pathologies (award W81XWH-04-1-0190). Only report-form and reference-list fragments survive in this record, among them: "...breast cancer tissue." NMR Biomed 2002;15(5):327-337; and Ala-Korpela M, Posio P, Mattila S, Korhonen A, Williams SR, "Absolute quantification of..."

  14. Quantification of L-Citrulline and other physiologic amino acids in watermelon and selected cucurbits

    USDA-ARS's Scientific Manuscript database

    High performance liquid chromatography of dabsyl derivatives of amino acids was employed for quantification of physiologic amino acids in cucurbits. This method is particularly useful because the dabsyl derivatives of glutamine and citrulline are sufficiently separated to allow quantification of ea...

  15. 21 CFR 530.24 - Procedure for announcing analytical methods for drug residue quantification.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 6 2013-04-01 2013-04-01 false Procedure for announcing analytical methods for...-Producing Animals § 530.24 Procedure for announcing analytical methods for drug residue quantification. (a) FDA may issue an order announcing a specific analytical method or methods for the quantification of...

  16. 21 CFR 530.24 - Procedure for announcing analytical methods for drug residue quantification.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 6 2014-04-01 2014-04-01 false Procedure for announcing analytical methods for...-Producing Animals § 530.24 Procedure for announcing analytical methods for drug residue quantification. (a) FDA may issue an order announcing a specific analytical method or methods for the quantification of...

  17. 21 CFR 530.24 - Procedure for announcing analytical methods for drug residue quantification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 6 2011-04-01 2011-04-01 false Procedure for announcing analytical methods for...-Producing Animals § 530.24 Procedure for announcing analytical methods for drug residue quantification. (a) FDA may issue an order announcing a specific analytical method or methods for the quantification of...

  18. The quantification of hydrogen and methane in contaminated groundwater: validation of robust procedures for sampling and quantification.

    PubMed

    Dorgerloh, Ute; Becker, Roland; Theissen, Hubert; Nehls, Irene

    2010-10-06

    A number of currently recommended sampling techniques for the determination of hydrogen in contaminated groundwater were compared regarding their practical proficiency in field campaigns. Key characteristics of appropriate sampling procedures are reproducibility of results and robustness against varying field conditions such as hydrostatic pressure, aquifer flow, and biological activity. Laboratory set-ups were used to investigate the most promising techniques. Bubble stripping with gas sampling bulbs yielded reproducible recovery of hydrogen and methane, which could be verified for groundwater sampled in two field campaigns. The methane content of the groundwater was confirmed by analysis of directly pumped samples, thus supporting the trueness of the stripping results. Laboratory set-ups and field campaigns revealed that bubble stripping of hydrogen may depend on the type of pump used. Concentrations of dissolved hydrogen after bubble stripping with an electrically driven submersible pump were about one order of magnitude higher than those obtained from diffusion sampling. The gas chromatographic determination of hydrogen and methane requires manual injection of gas samples and detection by a pulsed discharge detector (PDD), and allows limits of quantification of 3 nM dissolved hydrogen and 1 µg L⁻¹ dissolved methane in groundwater. The combined standard uncertainty of the bubble stripping and GC/PDD quantification of hydrogen in field samples was 7% at 7.8 nM and 18% at 78 nM.

  19. Accurate Quantification of Cardiovascular Biomarkers in Serum Using Protein Standard Absolute Quantification (PSAQ™) and Selected Reaction Monitoring*

    PubMed Central

    Huillet, Céline; Adrait, Annie; Lebert, Dorothée; Picard, Guillaume; Trauchessec, Mathieu; Louwagie, Mathilde; Dupuis, Alain; Hittinger, Luc; Ghaleh, Bijan; Le Corvoisier, Philippe; Jaquinod, Michel; Garin, Jérôme; Bruley, Christophe; Brun, Virginie

    2012-01-01

    Development of new biomarkers needs to be significantly accelerated to improve diagnostic, prognostic, and toxicity monitoring as well as therapeutic follow-up. Biomarker evaluation is the main bottleneck in this development process. Selected Reaction Monitoring (SRM) combined with stable isotope dilution has emerged as a promising option to speed this step, particularly because of its multiplexing capacities. However, analytical variabilities because of upstream sample handling or incomplete trypsin digestion still need to be resolved. In 2007, we developed the PSAQ™ method (Protein Standard Absolute Quantification), which uses full-length isotope-labeled protein standards to quantify target proteins. In the present study we used clinically validated cardiovascular biomarkers (LDH-B, CKMB, myoglobin, and troponin I) to demonstrate that the combination of PSAQ and SRM (PSAQ-SRM) allows highly accurate biomarker quantification in serum samples. A multiplex PSAQ-SRM assay was used to quantify these biomarkers in clinical samples from myocardial infarction patients. Good correlation between PSAQ-SRM and ELISA assay results was found and demonstrated the consistency between these analytical approaches. Thus, PSAQ-SRM has the capacity to improve both accuracy and reproducibility in protein analysis. This will be a major contribution to efficient biomarker development strategies. PMID:22080464

  20. Detection and quantification of beef and pork materials in meat products by duplex droplet digital PCR

    PubMed Central

    Cai, Yicun; He, Yuping; Lv, Rong; Chen, Hongchao; Wang, Qiang

    2017-01-01

    Meat products often consist of meat from multiple animal species, and adulteration and mislabeling of food products can negatively affect consumers. Therefore, a cost-effective and reliable method for identification and quantification of animal species in meat products is required. In this study, we developed a duplex droplet digital PCR (dddPCR) detection and quantification system to simultaneously identify and quantify the source of meat in samples containing a mixture of beef (Bos taurus) and pork (Sus scrofa) in a single digital PCR reaction tube. Mixed meat samples of known composition were used to test the accuracy and applicability of this method. The limit of detection (LOD) and the limit of quantification (LOQ) of this detection and quantification system were also identified. We conclude that our dddPCR detection and quantification system is suitable for quality control and routine analyses of meat products. PMID:28771608

  1. Quantification in MALDI-TOF mass spectrometry of modified polymers.

    PubMed

    Walterová, Zuzana; Horský, Jiří

    2011-05-05

    MALDI-TOF mass spectrometry quantification is hampered by the poor reproducibility of the signal intensity and by molecular-mass and compositional discrimination. The addition of a suitable compound as an internal standard increases reproducibility and allows a calibration curve to be constructed. The concept was also verified with synthetic polymers, but no instructions for practical implementation were given [H. Chen, M. He, J. Pei, H. He, Anal. Chem. 75 (2003) 6531-6535], even though synthetic polymers are generally non-uniform with respect to molecular mass and composition, and access to a polymer of the same molecular mass distribution and composition as that of the quantified one is thus the exception rather than the rule. On the other hand, relative quantification of polymers, e.g., the content of the precursor polymer in a batch of a modified polymer, is usually sought. In this particular case, the pure precursor is usually available and the modified polymer can serve as an internal standard. However, the calibration curve still cannot be constructed, and the use of the internal standard has to be combined with the method of standard addition, in which the precursor polymer is added directly to the analyzed sample. The experiments with simulated modified polymers, mixtures of poly(ethylene glycol) (PEG) and poly(ethylene glycol) monomethyl ether (MPEG) of similar molecular-mass distribution, revealed a power dependence of the PEG/MPEG signal-intensity ratio (MS ratio) on the PEG/MPEG concentration ratio in the mixture (gravimetric ratio). The result was obtained using standard procedures and instrumentation, which means that the basic assumption of the standard-addition method, i.e., the proportionality of the MS and gravimetric ratios, generally cannot be taken for granted. Therefore, the multi-point combined internal-standard standard-addition method was developed and experimentally verified for the quantification of the precursor in modified polymers.
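
Under the reported power dependence of the MS ratio on the gravimetric ratio, the multi-point internal-standard standard-addition method amounts to a three-parameter fit in which the unknown precursor content appears as an offset. A sketch with synthetic, self-consistent data (not measurements from the paper):

```python
import numpy as np
from scipy.optimize import curve_fit

def ms_ratio(added, amplitude, exponent, x0):
    """Power-law response reported for PEG/MPEG mixtures: the measured MS
    intensity ratio vs. the gravimetric ratio, which at each standard
    addition equals the unknown native precursor ratio x0 plus the
    added amount."""
    return amplitude * (x0 + added) ** exponent

# Synthetic multi-point standard-addition data (hypothetical):
added = np.array([0.0, 0.05, 0.10, 0.20, 0.40])
observed = np.array([0.044, 0.092, 0.137, 0.222, 0.382])

(amplitude, exponent, x0), _ = curve_fit(ms_ratio, added, observed,
                                         p0=(1.0, 1.0, 0.05))
print(f"estimated precursor content x0 = {x0:.3f}")   # ~0.04
```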

  2. Quantification of anti-Leishmania antibodies in saliva of dogs.

    PubMed

    Cantos-Barreda, Ana; Escribano, Damián; Bernal, Luis J; Cerón, José J; Martínez-Subiela, Silvia

    2017-08-15

    Detection of serum anti-Leishmania antibodies by quantitative or qualitative techniques has been the most used method to diagnose Canine Leishmaniosis (CanL). Nevertheless, saliva may represent an alternative to blood because it is easy to collect, painless and non-invasive in comparison with serum. In this study, two time-resolved immunofluorometric assays (TR-IFMAs) for quantification of anti-Leishmania IgG2 and IgA antibodies in saliva were developed and validated, and their ability to distinguish Leishmania-seronegative from seropositive dogs was evaluated. The analytical study was performed by evaluation of assay precision, sensitivity and accuracy. In addition, serum from 48 dogs (21 Leishmania-seropositive and 27 Leishmania-seronegative) was analyzed by TR-IFMAs. The assays were precise, with intra- and inter-assay coefficients of variation lower than 11%, and showed a high level of accuracy, as determined by linearity under dilution (R²=0.99) and recovery tests (>88.60%). Anti-Leishmania IgG2 antibodies in saliva were significantly higher in the seropositive group than in the seronegative group (p<0.0001), whereas no significant differences in anti-Leishmania IgA antibodies between the groups were observed. Furthermore, TR-IFMA for quantification of anti-Leishmania IgG2 antibodies in saliva showed larger differences between seropositive and seronegative dogs than the commercial assay used in serum. In conclusion, the TR-IFMAs developed may be used to quantify anti-Leishmania IgG2 and IgA antibodies in canine saliva with adequate precision, analytical sensitivity and accuracy. Quantification of anti-Leishmania IgG2 antibodies in saliva could potentially be used to evaluate the humoral response in CanL. However, IgA in saliva seemed not to have diagnostic value for this disease. For future studies, it would be desirable to evaluate the ability of the IgG2 assay to detect dogs with subclinical disease or with low antibody titers in serum and also to study

  3. Automatic quantification of subarachnoid hemorrhage on noncontrast CT.

    PubMed

    Boers, A M; Zijlstra, I A; Gathier, C S; van den Berg, R; Slump, C H; Marquering, H A; Majoie, C B

    2014-12-01

    Quantification of blood after SAH on initial NCCT is an important radiologic measure to predict patient outcome and guide treatment decisions. In current scales, hemorrhage volume and density are not accounted for. The purpose of this study was to develop and validate a fully automatic method for SAH volume and density quantification. The automatic method is based on a relative density increase due to the presence of blood from different brain structures in NCCT. The method incorporates density variation due to partial volume effect, beam-hardening, and patient-specific characteristics. For validation, automatic volume and density measurements were compared with manual delineation on NCCT images of 30 patients by 2 radiologists. The agreement with the manual reference was compared with interobserver agreement by using the intraclass correlation coefficient and Bland-Altman analysis for volume and density. The automatic measurement successfully segmented the hemorrhage of all 30 patients and showed high correlation with the manual reference standard for hemorrhage volume (intraclass correlation coefficient = 0.98 [95% CI, 0.96-0.99]) and hemorrhage density (intraclass correlation coefficient = 0.80 [95% CI, 0.62-0.90]) compared with intraclass correlation coefficient = 0.97 (95% CI, 0.77-0.99) and 0.98 (95% CI, 0.89-0.99) for manual interobserver agreement. Mean SAH volume and density were, respectively, 39.3 ± 31.5 mL and 62.2 ± 5.9 Hounsfield units for automatic measurement versus 39.7 ± 32.8 mL and 61.4 ± 7.3 Hounsfield units for manual measurement. The accuracy of the automatic method was excellent, with limits of agreement of -12.9 to 12.1 mL and -7.6 to 9.2 Hounsfield units. The automatic volume and density quantification is very accurate compared with manual assessment. As such, it has the potential to provide important determinants in clinical practice and research. © 2014 by American Journal of Neuroradiology.
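
    For illustration, a minimal sketch of the Bland-Altman limits-of-agreement computation used above, on hypothetical paired automatic/manual volume measurements (the study's own data are not reproduced here):

```python
# Minimal sketch of a Bland-Altman agreement analysis; data are synthetic.
import numpy as np

auto = np.array([12.1, 35.4, 60.2, 18.9, 44.0])    # automatic volumes (mL), hypothetical
manual = np.array([13.0, 34.1, 61.5, 19.8, 42.7])  # manual reference volumes (mL)

diff = auto - manual
bias = diff.mean()                     # mean difference (systematic bias)
half_width = 1.96 * diff.std(ddof=1)   # half-width of the 95% limits of agreement

print(f"bias = {bias:.2f} mL")
print(f"limits of agreement: {bias - half_width:.2f} to {bias + half_width:.2f} mL")
```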

  4. Critical points of DNA quantification by real-time PCR--effects of DNA extraction method and sample matrix on quantification of genetically modified organisms.

    PubMed

    Cankar, Katarina; Stebih, Dejan; Dreo, Tanja; Zel, Jana; Gruden, Kristina

    2006-08-14

    Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs), quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve; therefore, similarity of PCR efficiency for the sample and the standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition, 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess variability of PCR performance within each sample matrix. Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results; therefore, it was chosen as the primary criterion by which to
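
    For illustration, a minimal sketch of how amplification efficiency, the parameter the authors single out as crucial, is derived from a standard curve; the dilution series and Cq values below are hypothetical:

```python
# Minimal sketch: PCR efficiency from the slope of a standard curve,
# E = 10**(-1/slope) - 1, where E = 1.0 corresponds to perfect doubling.
import numpy as np

copies = np.array([1e2, 1e3, 1e4, 1e5, 1e6])    # template copies per reaction
cq = np.array([31.8, 28.4, 25.1, 21.7, 18.3])   # measured quantification cycles

slope, intercept = np.polyfit(np.log10(copies), cq, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0

print(f"slope = {slope:.2f}, amplification efficiency = {efficiency:.1%}")
```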

  5. Critical points of DNA quantification by real-time PCR – effects of DNA extraction method and sample matrix on quantification of genetically modified organisms

    PubMed Central

    Cankar, Katarina; Štebih, Dejan; Dreo, Tanja; Žel, Jana; Gruden, Kristina

    2006-01-01

    Background: Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs), quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve; therefore, similarity of PCR efficiency for the sample and the standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Results: Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition, 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess variability of PCR performance within each sample matrix. Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results, therefore it was chosen as the primary

  6. Quantification of airway deposition of intact and fragmented pollens.

    PubMed

    Horváth, Alpár; Balásházy, Imre; Farkas, Arpád; Sárkány, Zoltán; Hofmann, Werner; Czitrovszky, Aladár; Dobos, Erik

    2011-12-01

    Although pollen is one of the most widespread agents that can cause allergy, its airway transport and deposition are far from fully explored. The objective of this study was to characterize the airway deposition of pollens and to contribute to the debate on the increasing number of asthma attacks registered after thunderstorms. For the quantification of the deposition of inhaled pollens in the airways, computer simulations were performed. Our results demonstrated that smaller and fragmented pollens may penetrate into the thoracic airways and deposit there, supporting the theory that fragmented pollen particles are responsible for the increasing incidence of asthma attacks following thunderstorms. Pollen deposition results also suggest that children are the most exposed to the allergic effects of pollens. Finally, pollens between 0.5 and 20 μm deposit more efficiently in the lungs of asthmatics than in healthy lungs, especially in the bronchial region.

  7. Negative dielectrophoresis spectroscopy for rare analyte quantification in biological samples

    NASA Astrophysics Data System (ADS)

    Kirmani, Syed Abdul Mannan; Gudagunti, Fleming Dackson; Velmanickam, Logeeshan; Nawarathna, Dharmakeerthi; Lima, Ivan T., Jr.

    2017-03-01

    We propose the use of negative dielectrophoresis (DEP) spectroscopy as a technique to improve the detection limit of rare analytes in biological samples. We observe a significant dependence of the negative DEP force on functionalized polystyrene beads at the edges of interdigitated electrodes with respect to the frequency of the electric field. We measured the velocity of repulsion for 0% and 0.8% conjugation of avidin with biotin-functionalized polystyrene beads using our automated software, which performs real-time image processing on the Rayleigh scattering from the beads. A significant difference in the velocity of the beads was observed in the presence of as few as 80 molecules of avidin per biotin-functionalized bead. This technology can be applied to the detection and quantification of rare analytes, which can be useful in the diagnosis and treatment of diseases such as cancer and myocardial infarction, with the use of polystyrene beads functionalized with antibodies against the target biomarkers.

  8. Aspect-Oriented Programming is Quantification and Obliviousness

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.; Friedman, Daniel P.; Norvig, Peter (Technical Monitor)

    2000-01-01

    This paper proposes that the distinguishing characteristic of Aspect-Oriented Programming (AOP) systems is that they allow programming by making quantified programmatic assertions over programs written by programmers oblivious to such assertions. Thus, AOP systems can be analyzed with respect to three critical dimensions: the kinds of quantifications allowed, the nature of the actions that can be asserted, and the mechanism for combining base-level actions with asserted actions. Consequences of this perspective are the recognition that certain systems are not AOP and that some mechanisms are expressive enough to allow programming an AOP system within them. A corollary is that while AOP can be applied to Object-Oriented Programming, it is an independent concept applicable to other programming styles.

  9. From Quantification to Visualization: A Taxonomy of Uncertainty Visualization Approaches

    PubMed Central

    Potter, Kristin; Rosen, Paul; Johnson, Chris R.

    2014-01-01

    Quantifying uncertainty is an increasingly important topic across many domains. The uncertainties present in data come with many diverse representations having originated from a wide variety of disciplines. Communicating these uncertainties is a task often left to visualization without clear connection between the quantification and visualization. In this paper, we first identify frequently occurring types of uncertainty. Second, we connect those uncertainty representations to ones commonly used in visualization. We then look at various approaches to visualizing this uncertainty by partitioning the work based on the dimensionality of the data and the dimensionality of the uncertainty. We also discuss noteworthy exceptions to our taxonomy along with future research directions for the uncertainty visualization community. PMID:25663949

  10. Uncertainty Quantification For Gas-Surface Interaction In Plasmatron Facility

    NASA Astrophysics Data System (ADS)

    Villedieu, N.; Panerai, F.; Chazot, O.; Magin, T. E.

    2011-08-01

    To design Thermal Protection Systems for atmospheric re-entry, it is crucial to take into account the catalytic properties of the material. At the von Karman Institute, these properties are determined by combining experiments performed in the Plasmatron facility with boundary-layer code simulations. Many uncertainties are involved in this process, in both the experimental data and the physical models. The aim of this article is to develop an uncertainty quantification methodology to compute the error bars on the rebuilt enthalpy and the effective catalytic recombination coefficient due to the uncertainties in the experimental data. The purpose is also to understand which uncertainties have the largest impact on the error. We have coupled the open-source software DAKOTA from Sandia National Laboratories with the VKI boundary-layer code.

  11. Reliability and discriminatory power of methods for dental plaque quantification

    PubMed Central

    RAGGIO, Daniela Prócida; BRAGA, Mariana Minatel; RODRIGUES, Jonas Almeida; FREITAS, Patrícia Moreira; IMPARATO, José Carlos Pettorossi; MENDES, Fausto Medeiros

    2010-01-01

    Objective This in situ study evaluated the discriminatory power and reliability of methods of dental plaque quantification and the relationship between visual indices (VI) and fluorescence camera (FC) to detect plaque. Material and Methods Six volunteers used palatal appliances with six bovine enamel blocks presenting different stages of plaque accumulation. The presence of plaque with and without disclosing was assessed using VI. Images were obtained with FC and digital camera in both conditions. The area covered by plaque was assessed. Examinations were done by two independent examiners. Data were analyzed by Kruskal-Wallis and Kappa tests to compare different conditions of samples and to assess the inter-examiner reproducibility. Results Some methods presented adequate reproducibility. The Turesky index and the assessment of area covered by disclosed plaque in the FC images presented the highest discriminatory powers. Conclusions The Turesky index and images with FC with disclosing present good reliability and discriminatory power in quantifying dental plaque. PMID:20485931
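
    For illustration, a minimal sketch of the kappa statistic used here to assess inter-examiner reproducibility, computed on hypothetical paired plaque scores (Cohen's kappa; the study's own data are not reproduced):

```python
# Minimal sketch: Cohen's kappa for two examiners scoring the same samples.
import numpy as np

scores_a = np.array([0, 1, 2, 2, 3, 1, 0, 2, 3, 1])  # examiner 1, hypothetical
scores_b = np.array([0, 1, 2, 1, 3, 1, 0, 2, 2, 1])  # examiner 2, hypothetical

categories = np.unique(np.concatenate([scores_a, scores_b]))
observed = np.mean(scores_a == scores_b)             # observed agreement
# expected chance agreement from the two marginal score distributions
expected = sum(np.mean(scores_a == c) * np.mean(scores_b == c) for c in categories)
kappa = (observed - expected) / (1.0 - expected)
print(f"Cohen's kappa = {kappa:.2f}")
```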

  12. Technological and Analytical Methods for Arabinoxylan Quantification from Cereals.

    PubMed

    Döring, Clemens; Jekle, Mario; Becker, Thomas

    2016-01-01

    Arabinoxylan (AX) is the major nonstarch polysaccharide contained in various types of grains. AX consists of a backbone of β-(1,4)-linked D-xylopyranosyl residues with randomly linked α-L-arabinofuranosyl units. Once isolated and included as a food additive, AX affects foodstuff attributes and has positive effects on human health. AX can be classified into water-extractable and water-unextractable AX. For isolating AX from its natural matrix, a range of methods has been developed, adapted, and improved. This review presents a survey of the commonly used extraction methods for AX and the influence of the different techniques. It also provides a brief overview of the structural and technological impact of AX as a dough additive. A concluding section summarizes different detection methods for analyzing and quantifying AX.

  13. Quantification of thermophilic Campylobacter spp. in broilers during meat processing.

    PubMed

    Klein, Günter; Reich, Felix; Beckmann, Lutz; Atanassova, Viktoria

    2007-10-01

    Campylobacter spp. is a common cause of gastrointestinal illness. Since animal products, especially poultry meat, are an important source of human outbreaks of campylobacteriosis, tracing back to processing and initial production is of great interest. Samples were collected at a German poultry slaughterhouse for the estimation of the prevalence of Campylobacter at different processing steps. Quantification of Campylobacter in each of the samples was also performed. Out of 99 samples examined, 51 (51.5%) were positive for Campylobacter, with bacterial counts ranging from log10 6.5 cfu per sample for carcasses to log10 3.6 cfu per ml for scalding water. The Campylobacter isolates (n = 51) were subtyped by pulsed-field gel electrophoresis using SmaI and KpnI restriction enzymes. Molecular typing showed a multitude of strains with different molecular patterns. Strains found in cloacal swabs before processing could also be isolated from carcasses at different processing steps.

  14. Sensitive digital quantification of DNA methylation in clinical samples

    PubMed Central

    Li, Meng; Chen, Wei-dong; Papadopoulos, Nickolas; Goodman, Steven; Bjerregaard, Niels Christian; Laurberg, Søren; Levin, Bernard; Juhl, Hartmut; Arber, Nadir; Moinova, Helen; Durkee, Kris; Schmidt, Kerstin; He, Yiping; Diehl, Frank; Velculescu, Victor E; Zhou, Shibin; Diaz, Luis A; Kinzler, Kenneth W; Markowitz, Sanford D; Vogelstein, Bert

    2010-01-01

    Abnormally methylated genes are increasingly being used as cancer biomarkers [1, 2]. For clinical applications, it is important to precisely determine the number of methylated molecules in the analyzed sample. Here we describe a digital approach that can enumerate one methylated molecule out of ~5000 unmethylated molecules. Individual DNA fragments can be amplified and analyzed either by flow cytometry or next generation sequencing instruments. Using methylated vimentin as a biomarker, we tested 191 plasma samples and detected cancer cases with 59% sensitivity (95% CI, 48%–70%) and 93% specificity (95% CI, 86%–97%). Using the same assay, we analyzed 80 stool samples and demonstrated 45% sensitivity for detecting colorectal adenomas (23%–68%), 41% sensitivity for detecting cancer (21%–64%), and 95% specificity (82%–99%). This digital quantification of rare methylation events should be applicable to diagnostic evaluations of clinical samples, to preclinical assessments of new epigenetic biomarkers, and to quantitative analyses of epigenetic biology. PMID:19684580

  15. Fluorescent Probes for H2S Detection and Quantification.

    PubMed

    Feng, Wei; Dymock, Brian W

    2015-01-01

    Many diverse, sensitive and structurally novel fluorescent probes have recently been reported for H2S detection. Quantification of H2S requires a selective chemosensor which will react only with H2S against a background of high concentrations of other thiols or reducing agents. Most published probes are able to quantify H2S selectively in a simple in vitro system, with the most sensitive probes able to detect H2S at below 100 nM concentrations. A subset of probes also has utility in sensing H2S in living cells, and there are now several with specific sub-cellular localization and a few cases of in vivo applications. Biologists studying H2S now have a wide range of tools to assist them in further understanding the role of H2S in biology.

  16. High-Throughput Thermodynamic Modeling and Uncertainty Quantification for ICME

    NASA Astrophysics Data System (ADS)

    Otis, Richard A.; Liu, Zi-Kui

    2017-05-01

    One foundational component of integrated computational materials engineering (ICME) and the Materials Genome Initiative is computational thermodynamics based on the CALPHAD (calculation of phase diagrams) method. The CALPHAD method, pioneered by Kaufman, has enabled the development of thermodynamic, atomic mobility, and molar volume databases of individual phases in the full space of temperature, composition, and sometimes pressure for technologically important multicomponent engineering materials, along with sophisticated computational tools for using the databases. In this article, our recent efforts are presented in terms of developing new computational tools for high-throughput modeling and uncertainty quantification based on high-throughput, first-principles calculations and the CALPHAD method, along with their potential propagation to downstream ICME modeling and simulations.

  17. Quantification of chromatin condensation level by image processing.

    PubMed

    Irianto, Jerome; Lee, David A; Knight, Martin M

    2014-03-01

    The level of chromatin condensation is related to the silencing/activation of chromosomal territories and therefore impacts on gene expression. Chromatin condensation changes during cell cycle progression and differentiation, and is influenced by various physicochemical and epigenetic factors. This study describes a validated experimental technique to quantify chromatin condensation. A novel image processing procedure is developed using Sobel edge detection to quantify the level of chromatin condensation from nuclei images taken by confocal microscopy. The algorithm was developed in MATLAB and used to quantify different levels of chromatin condensation in chondrocyte nuclei achieved through alteration in osmotic pressure. The resulting chromatin condensation parameter (CCP) is in good agreement with independent multi-observer qualitative visual assessment. This image processing technique thereby provides a validated, unbiased parameter for rapid and highly reproducible quantification of the level of chromatin condensation. Copyright © 2013 IPEM. Published by Elsevier Ltd. All rights reserved.
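
    For illustration, a rough Python analogue (not the authors' MATLAB implementation) of a Sobel-based edge-density metric in the spirit of the CCP; the edge threshold used here is an assumption:

```python
# Minimal sketch of an edge-density condensation metric; the data-driven
# threshold is an assumption, not the published algorithm.
import numpy as np
from scipy import ndimage

def condensation_parameter(nucleus_img, nucleus_mask):
    """Fraction of nuclear pixels flagged as chromatin edges."""
    img = nucleus_img.astype(float)
    grad = np.hypot(ndimage.sobel(img, axis=0), ndimage.sobel(img, axis=1))
    vals = grad[nucleus_mask]
    edges = grad > vals.mean() + vals.std()   # simple data-driven edge threshold
    return edges[nucleus_mask].sum() / nucleus_mask.sum()

# Toy usage: a synthetic nucleus with one bright condensed spot.
img = np.zeros((64, 64))
img[20:30, 20:30] = 1.0
mask = np.ones_like(img, dtype=bool)
print(f"CCP ~ {condensation_parameter(img, mask):.3f}")
```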

  18. In situ quantification and visualization of lithium transport with neutrons.

    PubMed

    Liu, Danny X; Wang, Jinghui; Pan, Ke; Qiu, Jie; Canova, Marcello; Cao, Lei R; Co, Anne C

    2014-09-01

    A real-time quantification of Li transport using a nondestructive neutron method to measure the Li distribution upon charge and discharge in a Li-ion cell is reported. By using in situ neutron depth profiling (NDP), we probed the onset of lithiation in a high-capacity Sn anode and visualized the enrichment of Li atoms on the surface followed by their propagation into the bulk. The delithiation process shows the removal of Li near the surface, which leads to a decreased coulombic efficiency, likely because of trapped Li within the intermetallic material. The developed in situ NDP provides exceptional sensitivity in the temporal and spatial measurement of Li transport within the battery material. This diagnostic tool opens up possibilities to understand rates of Li transport and their distribution to guide materials development for efficient storage mechanisms. Our observations provide important mechanistic insights for the design of advanced battery materials.

  19. Raman spectroscopy for DNA quantification in cell nucleus.

    PubMed

    Okotrub, K A; Surovtsev, N V; Semeshin, V F; Omelyanchuk, L V

    2015-01-01

    Here we demonstrate the feasibility of a novel approach to quantify DNA in cell nuclei. This approach is based on spectroscopic analysis of Raman light scattering, and avoids the problem of nonstoichiometric binding of dyes to DNA, as it directly measures the signal from DNA. Quantitative analysis of the nuclear DNA contribution to the Raman spectrum could be reliably performed using the intensity of a phosphate mode at 1096 cm-1. When compared to the known DNA standards from cells of different animals, our results matched those values within an error of 10%. We therefore suggest that this approach will be useful to expand the list of DNA standards, to properly adjust the duration of hydrolysis in Feulgen staining, to assay the applicability of fuchsines for DNA quantification, as well as to measure DNA content in cells with complex hydrolysis patterns, when Feulgen densitometry is inappropriate.

  20. Regulation and quantification of cellular mitochondrial morphology and content.

    PubMed

    Tronstad, Karl J; Nooteboom, Marco; Nilsson, Linn I H; Nikolaisen, Julie; Sokolewicz, Maciek; Grefte, Sander; Pettersen, Ina K N; Dyrstad, Sissel; Hoel, Fredrik; Willems, Peter H G M; Koopman, Werner J H

    2014-01-01

    Mitochondria play a key role in signal transduction, redox homeostasis and cell survival, which extends far beyond their classical functioning in ATP production and energy metabolism. In living cells, mitochondrial content ("mitochondrial mass") depends on the cell-controlled balance between mitochondrial biogenesis and degradation. These processes are intricately linked to changes in net mitochondrial morphology and spatiotemporal positioning ("mitochondrial dynamics"), which are governed by mitochondrial fusion, fission and motility. It is becoming increasingly clear that mitochondrial mass and dynamics, as well as mitochondrial ultrastructure and volume, are mechanistically linked to mitochondrial function and to cellular physiology. This means that proper quantification of mitochondrial morphology and content is of prime importance in understanding mitochondrial and cellular physiology in health and disease. This review first presents how cellular mitochondrial content is regulated at the level of mitochondrial biogenesis, degradation and dynamics. Next, we discuss how mitochondrial dynamics and content can be analyzed, with a special emphasis on quantitative live-cell microscopy strategies.

  1. Ash Fusion Quantification by Means of Thermal Analysis

    NASA Astrophysics Data System (ADS)

    Hansen, Lone A.; Frandsen, Flemming J.; Dam-Johansen, Kim

    A new experimental method for quantification of ash melting has been developed. Using the new method, a conventional STA apparatus is employed, and the melting is detected as endothermic reactions involving no change in mass. The DSC signal is transferred into a melting curve (showing the melt fraction in the ash as a function of temperature) either by simple comparison of the areas below the melting curve or by accounting for the relevant melting enthalpies. The execution of the measurement is simple and the repeatability of the results is very good. The subsequent conversion of the STA curves to a melting curve requires knowledge of the identity of chemical species in the ash and the involved chemistry.
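
    For illustration, a minimal sketch of the simpler area-ratio route from a baseline-corrected DSC endotherm to a melting curve; the signal below is synthetic, and the enthalpy-weighted variant would additionally require the species-specific melting enthalpies mentioned above:

```python
# Minimal sketch: melt fraction as the cumulative fraction of endothermic
# peak area; the DSC trace is a synthetic, baseline-corrected endotherm.
import numpy as np
from scipy.integrate import cumulative_trapezoid

temp = np.linspace(900, 1400, 501)                 # temperature, deg C
dsc = np.exp(-0.5 * ((temp - 1150) / 40.0) ** 2)   # baseline-corrected endotherm

area = cumulative_trapezoid(dsc, temp, initial=0.0)
melt_fraction = area / area[-1]                    # 0 at onset, 1 when fully molten

print(f"melt fraction at 1150 C: {np.interp(1150, temp, melt_fraction):.2f}")
```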

  2. Microplastics in Baltic bottom sediments: Quantification procedures and first results.

    PubMed

    Zobkov, M; Esiukova, E

    2017-01-30

    Microplastics in the marine environment are known to be a global ecological problem, but there are still no standardized analysis procedures for their quantification. The first breakthrough in this direction was the NOAA Laboratory Methods for quantifying synthetic particles in water and sediments, but fiber counts have been found to be underestimated with this approach. We propose modifications to these methods that allow microplastics in bottom sediments, including small fibers, to be analyzed. Addition of an internal standard to sediment samples and occasional empty runs are advised for analysis quality control. The microplastics extraction efficiency using the proposed modifications is 92±7%. The distribution of microplastics in bottom sediments of the Russian part of the Baltic Sea is presented. Microplastic particles were found in all of the samples, with an average concentration of 34±10 items/kg DW, of the same order of magnitude as reported in neighboring studies.

  3. Automated quantification of lung structures from optical coherence tomography images

    PubMed Central

    Pagnozzi, Alex M.; Kirk, Rodney W.; Kennedy, Brendan F.; Sampson, David D.; McLaughlin, Robert A.

    2013-01-01

    Characterization of the size of lung structures can aid in the assessment of a range of respiratory diseases. In this paper, we present a fully automated segmentation and quantification algorithm for the delineation of large numbers of lung structures in optical coherence tomography images, and the characterization of their size using the stereological measure of median chord length. We demonstrate this algorithm on scans acquired with OCT needle probes in fresh, ex vivo tissues from two healthy animal models: pig and rat. Automatically computed estimates of lung structure size were validated against manual measures. In addition, we present 3D visualizations of the lung structures using the segmentation calculated for each data set. This method has the potential to provide an in vivo indicator of structural remodeling caused by a range of respiratory diseases, including chronic obstructive pulmonary disease and pulmonary fibrosis. PMID:24298402
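
    For illustration, a minimal sketch of the stereological median chord length computed from a binary segmentation mask, measured here along image rows (the authors' exact estimator may differ):

```python
# Minimal sketch: median chord length of segmented structures in a binary
# mask (True inside structures), with chords taken along image rows.
import numpy as np

def median_chord_length(mask, pixel_size_um=1.0):
    chords = []
    for row in mask:
        # locate starts/ends of runs of consecutive True pixels in this row
        padded = np.concatenate([[0], row.astype(int), [0]])
        starts = np.flatnonzero(np.diff(padded) == 1)
        ends = np.flatnonzero(np.diff(padded) == -1)
        chords.extend(ends - starts)
    return np.median(chords) * pixel_size_um if chords else 0.0

# Toy usage: a single 5-pixel structure with 10 um pixels -> 50 um.
mask = np.zeros((4, 10), dtype=bool)
mask[1, 2:7] = True
print(median_chord_length(mask, pixel_size_um=10.0))
```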

  4. Optimization of encoded hydrogel particles for nucleic acid quantification.

    PubMed

    Pregibon, Daniel C; Doyle, Patrick S

    2009-06-15

    The accurate quantification of nucleic acids is of utmost importance for clinical diagnostics, drug discovery, and basic science research. These applications require the concurrent measurement of multiple targets while demanding high-throughput analysis, high sensitivity, specificity between closely related targets, and a wide dynamic range. In an attempt to create a technology that can simultaneously meet these demands, we recently developed a method of multiplexed analysis using encoded hydrogel particles. Here, we demonstrate tuning of hydrogel porosity with semi-interpenetrating networks of poly(ethylene glycol), develop a quantitative model to understand hybridization kinetics, and use the findings from these studies to enhance particle design for nucleic acid detection. With an optimized particle design and efficient fluorescent labeling scheme, we demonstrate subattomole sensitivity and single-nucleotide specificity for small RNA targets.

  5. Enhanced techniques for asymmetry quantification in brain imagery

    NASA Astrophysics Data System (ADS)

    Liu, Xin; Imielinska, Celina; Rosiene, Joel; Connolly, E. S.; D'Ambrosio, Anthony L.

    2006-03-01

    We present an automated generic methodology for symmetry identification and asymmetry quantification, a novel method for identifying and delineating brain pathology by analyzing the opposing sides of the brain and utilizing its inherent left-right symmetry. After the symmetry axis has been detected, we apply non-parametric statistical tests operating on pairs of samples to identify initial seed points, defined as the pixels where the most statistically significant differences appear. Local region growing is then performed on the difference map, with the seeds aggregating until all 8-way-connected high signals from the difference map are captured. We illustrate the capability of our method with examples ranging from tumors in patient MR data to animal stroke data. The validation results on rat stroke data have shown that this approach has promise to achieve high precision and full automation in segmenting lesions in reflectionally symmetrical objects.
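
    For illustration, a minimal sketch of the seed-driven capture of 8-way-connected high signals on a difference map; the statistical seed selection is assumed to have been done upstream, and the threshold is illustrative:

```python
# Minimal sketch: keep every 8-connected high-signal component of the
# difference map that contains at least one seed pixel.
import numpy as np
from scipy import ndimage

def grow_from_seeds(diff_map, seeds, threshold):
    """Return a mask of 8-connected high-signal components containing a seed."""
    high = diff_map > threshold
    labels, _ = ndimage.label(high, structure=np.ones((3, 3)))  # 8-connectivity
    seed_labels = {labels[r, c] for r, c in seeds if labels[r, c] > 0}
    return np.isin(labels, sorted(seed_labels))

# Toy usage: a 3x3 blob containing the seed is kept; an isolated pixel is not.
diff = np.zeros((8, 8))
diff[2:5, 2:5] = 5.0
diff[6, 6] = 5.0
print(grow_from_seeds(diff, seeds=[(3, 3)], threshold=1.0).sum())  # -> 9
```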

  6. Quantification of osteolytic bone lesions in a preclinical rat trial

    NASA Astrophysics Data System (ADS)

    Fränzle, Andrea; Bretschi, Maren; Bäuerle, Tobias; Giske, Kristina; Hillengass, Jens; Bendl, Rolf

    2013-10-01

    In breast cancer, most of the patients who die have developed bone metastases as the disease progressed. Bone metastases in breast cancer are mainly bone-destructive (osteolytic). To understand pathogenesis and to analyse response to different treatments, animal models, in our case rats, are examined. For the assessment of treatment response to bone-remodelling therapies, exact segmentations of osteolytic lesions are needed. Manual segmentations are not only time-consuming but also lack reproducibility. Computerized segmentation tools are essential. In this paper we present an approach for the computerized quantification of osteolytic lesion volumes using a comparison to a healthy reference model. The presented qualitative and quantitative evaluation of the reconstructed bone volumes shows that the automatically segmented lesion volumes complete the missing bone in a reasonable way.

  7. Detection and Quantification of Mitochondrial Fusion Using Imaging Flow Cytometry.

    PubMed

    Nascimento, Aldo; Lannigan, Joanne; Kashatus, David

    2017-07-05

    Mitochondria are dynamic organelles that perform several vital cellular functions. Requisite for these functions are mitochondrial fusion and fission. Despite the increasing importance of mitochondrial dynamics in a range of cellular processes, there exist limited methods for robust quantification of mitochondrial fission and fusion. Currently, the most widely used method to measure mitochondrial fusion is the polyethylene glycol (PEG) fusion assay. While this assay can provide useful information regarding fusion activity, the reliance on manual selection of rare fusion events is time consuming and may introduce selection bias. By utilizing the image-capture features and colocalization analysis of imaging flow cytometry in combination with the PEG fusion assay, we are able to develop a high-throughput method to detect and quantify mitochondrial fusion activity. © 2017 by John Wiley & Sons, Inc. Copyright © 2017 John Wiley & Sons, Inc.

  8. [Quantification in psychiatry: from psychometrics to quantitative psychiatry].

    PubMed

    Pichot, P

    1994-01-01

    The development of quantitative techniques for analysing psychopathological states is reviewed from the 18th century to the present day. As far back as the 19th century, Quetelet, Louis and Galton introduced and advocated the use of quantitative methods in the medical and psychological sciences. The advent of psychometrics dates back to 1905, when Alfred Binet published his Intelligence Scale. The construction of instruments such as the Wechsler and MMPI scales in the forties marked the beginning of the use of psychometrics in psychiatry. At the end of World War II, historical factors (the selection and guidance of military recruits) in conjunction with technical advances (the beginnings of psychopharmacology, the development of multivariate statistics and the arrival of the first computers) favored the growth of quantitative psychopathology, which subsequently took four different courses: 1. psychometrics proper; 2. symptom-quantifying assessment scales such as the BPRS or Hamilton scales; 3. new nosological models constructed using quantified psychopathological data and mathematical procedures; 4. diagnostic systems relying on operationalized criteria based on psychopathological quantification, such as DSM-III.

  9. Quantification of Diffuse Hydrothermal Flows Using Multibeam Sonar

    NASA Astrophysics Data System (ADS)

    Ivakin, A. N.; Jackson, D. R.; Bemis, K. G.; Xu, G.

    2014-12-01

    The Cabled Observatory Vent Imaging Sonar (COVIS) deployed at the Main Endeavour node of the NEPTUNE Canada observatory has provided acoustic time series extending over 2 years. These include 3D images of plume scattering strength and Doppler velocity measurements, as well as 2D images showing regions of diffuse flow. The diffuse-flow images display the level of decorrelation between sonar echoes from transmissions separated by 0.2 s. The present work aims to provide further information on the strength of diffuse flows. Two approaches are used: measurement of the dependence of decorrelation on lag, and measurement of the phase shift of sonar echoes, with lags in 3-hour increments up to several days. The phase shifts and decorrelation are linked to variations of temperature above the seabed, which allows quantification of those variations, their magnitudes, spatial and temporal scales, and energy spectra. These techniques are illustrated using COVIS data obtained near the Grotto vent complex.
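
    For illustration, a minimal sketch of ping-to-ping correlation and phase-shift estimation between two complex echo series (synthetic data; expressing decorrelation as 1 - |rho| is one common convention, not necessarily the COVIS processing chain):

```python
# Minimal sketch: complex correlation coefficient and phase shift between
# two pings; the second ping is the first one phase-shifted plus noise.
import numpy as np

rng = np.random.default_rng(0)
ping1 = rng.standard_normal(256) + 1j * rng.standard_normal(256)
noise = 0.2 * (rng.standard_normal(256) + 1j * rng.standard_normal(256))
ping2 = np.exp(1j * 0.3) * ping1 + noise

xcorr = np.vdot(ping1, ping2)   # sum of conj(ping1) * ping2
rho = abs(xcorr) / np.sqrt(np.vdot(ping1, ping1).real * np.vdot(ping2, ping2).real)
phase = np.angle(xcorr)         # echo phase shift, radians

print(f"|rho| = {rho:.3f}, decorrelation = {1 - rho:.3f}, phase = {phase:.3f} rad")
```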

  10. Quantification of human angiotensinogen by a novel sandwich ELISA

    PubMed Central

    Suzaki, Yuki; Ozawa, Yuri; Kobori, Hiroyuki

    2007-01-01

    The urinary angiotensinogen excretion rate shows a clear relationship to kidney angiotensin II content, suggesting that urinary angiotensinogen may serve as an index of angiotensin II-dependent hypertension in rats. However, simple and accurate methods to measure human angiotensinogen are unavailable at this time. We have developed two antibodies and a sensitive and specific quantification ELISA system for human angiotensinogen, applicable to human subjects. The ELISA is able to detect human angiotensinogen in the range of 0.01-1 μg/well (R² = 0.9945) using standard ELISA plates. This ELISA will be a useful tool to investigate the relationship between urinary angiotensinogen excretion rates and the response to antihypertensive drugs in hypertensive human subjects. PMID:16793172

  11. On the Quantification of Incertitude in Astrophysical Simulation Codes

    NASA Astrophysics Data System (ADS)

    Hoffman, Melissa; Katz, Maximilian P.; Willcox, Donald E.; Ferson, Scott; Swesty, F. Douglas; Calder, Alan

    2017-01-01

    We present a pedagogical study of uncertainty quantification (UQ) due to epistemic uncertainties (incertitude) in astrophysical modeling using the stellar evolution software instrument MESA (Modules and Experiments for Stellar Astrophysics). We present a general methodology for UQ and examine the specific case of stars evolving from the main sequence to carbon/oxygen white dwarfs. Our study considers two epistemic variables: the wind parameters during the Red Giant and Asymptotic Giant branch phases of evolution. We choose uncertainty intervals for each variable, and use these as input to MESA simulations. Treating MESA as a "black box," we apply two UQ techniques, Cauchy deviates and Quadratic Response Surface Models, to obtain bounds for the final white dwarf masses. Our study is a proof of concept applicable to other computational problems to enable a more robust understanding of incertitude. This work was supported in part by the US Department of Energy under grant DE-FG02-87ER40317.

  12. UQTools: The Uncertainty Quantification Toolbox - Introduction and Tutorial

    NASA Technical Reports Server (NTRS)

    Kenny, Sean P.; Crespo, Luis G.; Giesy, Daniel P.

    2012-01-01

    UQTools is the short name for the Uncertainty Quantification Toolbox, a software package designed to efficiently quantify the impact of parametric uncertainty on engineering systems. UQTools is a MATLAB-based software package and was designed to be discipline independent, employing very generic representations of the system models and uncertainty. Specifically, UQTools accepts linear and nonlinear system models and permits arbitrary functional dependencies between the system's measures of interest and the probabilistic or non-probabilistic parametric uncertainty. One of the most significant features incorporated into UQTools is the theoretical development centered on homothetic deformations and their application to set bounding and approximating failure probabilities. Beyond the set bounding technique, UQTools provides a wide range of probabilistic and uncertainty-based tools to solve key problems in science and engineering.

  13. Segmentation and quantification of adipose tissue by magnetic resonance imaging

    PubMed Central

    Chen, Jun; Shen, Wei

    2016-01-01

    In this brief review, introductory concepts in animal and human adipose tissue segmentation using proton magnetic resonance imaging (MRI) and computed tomography are summarized in the context of obesity research. Adipose tissue segmentation and quantification using spin relaxation-based (e.g., T1-weighted, T2-weighted), relaxometry-based (e.g., T1-, T2-, T2*-mapping), chemical-shift selective, and chemical-shift encoded water–fat MRI pulse sequences are briefly discussed. The continuing interest to classify subcutaneous and visceral adipose tissue depots into smaller sub-depot compartments is mentioned. The use of a single slice, a stack of slices across a limited anatomical region, or a whole body protocol is considered. Common image post-processing steps and emerging atlas-based automated segmentation techniques are noted. Finally, the article identifies some directions of future research, including a discussion on the growing topic of brown adipose tissue and related segmentation considerations. PMID:26336839

  14. Quantification of intracerebral steal in patients with arteriovenous malformation

    SciTech Connect

    Homan, R.W.; Devous, M.D. Sr.; Stokely, E.M.; Bonte, F.J.

    1986-08-01

    Eleven patients with angiographically and/or pathologically proved arteriovenous malformations (AVMs) were studied using dynamic, single-photon-emission computed tomography (DSPECT). Quantification of regional cerebral blood flow in structurally normal areas remote from the AVM disclosed areas of decreased flow compared with normal controls in eight of 11 patients examined. Areas of hypoperfusion correlated with altered function as manifested by epileptogenic foci and impaired cognitive function. Dynamic, single-photon-emission computed tomography provides a noninvasive technique to monitor quantitatively hemodynamic changes associated with AVMs. Our findings suggest that such changes are present in the majority of patients with AVMs and that they may be clinically significant. The potential application of regional cerebral blood flow imaging by DSPECT in the management of patients with AVMs is discussed.

  15. Fluorimetric quantification of brimonidine tartrate in eye drops.

    PubMed

    Sunitha, G; Bhagirath, R; Alapati, V R; Ramakrishna, K; Subrahmanyam, C V S; Anumolu, P D

    2013-11-01

    A simple and sensitive spectrofluorimetric method has been developed for the estimation of brimonidine tartrate in pure form and in eye drops. Linearity was obeyed in the range of 0.2-3.0 μg/ml in dimethyl formamide as solvent, at an emission wavelength (λem) of 530 nm after excitation (λex) at 389 nm, with a good correlation coefficient of 0.998. The limit of detection and limit of quantification for this method were 22.0 and 72.0 ng/ml, respectively. The developed method was statistically validated as per International Conference on Harmonisation guidelines. The percentage relative standard deviation values were found to be less than 2 for accuracy and precision studies. The results obtained were in good agreement with the labelled amounts of the marketed formulations. The proposed method was effectively applied to routine quality control analysis of brimonidine tartrate in eye drops.
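
    For illustration, a minimal sketch of the ICH-style LOD/LOQ estimate from a calibration line (LOD = 3.3σ/S and LOQ = 10σ/S, with σ the residual standard deviation and S the slope); the fluorescence readings below are hypothetical:

```python
# Minimal sketch: LOD/LOQ from a linear calibration fit; data are synthetic.
import numpy as np

conc = np.array([0.2, 0.5, 1.0, 1.5, 2.0, 3.0])              # ug/ml
signal = np.array([41.0, 98.5, 201.0, 296.0, 405.0, 600.5])  # fluorescence (a.u.)

slope, intercept = np.polyfit(conc, signal, 1)
residuals = signal - (slope * conc + intercept)
sigma = residuals.std(ddof=2)            # residual SD (two fitted parameters)

lod = 3.3 * sigma / slope                # ug/ml
loq = 10.0 * sigma / slope               # ug/ml
print(f"LOD = {lod * 1000:.0f} ng/ml, LOQ = {loq * 1000:.0f} ng/ml")
```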

  16. Quantification of Ceroid and Lipofuscin in Skeletal Muscle

    PubMed Central

    Tohma, Hatice; Hepworth, Anna R.; Shavlakadze, Thea; Grounds, Miranda D.; Arthur, Peter G.

    2011-01-01

    Ceroid and lipofuscin are autofluorescent granules thought to be generated as a consequence of chronic oxidative stress. Because ceroid and lipofuscin are persistent in tissue, their measurement can provide a lifetime history of exposure to chronic oxidative stress. Although ceroid and lipofuscin can be measured by quantification of autofluorescent granules, current methods rely on subjective assessment. Furthermore, there has not been any evaluation of the variables affecting quantitative measurements. This article describes a simple statistical approach that can be readily applied to quantitate ceroid and lipofuscin. Furthermore, it is shown that several factors, including magnification, tissue thickness and tissue level, can affect precision and sensitivity. After optimizing for these factors, the authors show that ceroid and lipofuscin can be measured reproducibly in the skeletal muscle of dystrophic mice (ceroid) and aged mice (lipofuscin). PMID:21804079

  17. Performance and Limitations of Phosphate Quantification: Guidelines for Plant Biologists.

    PubMed

    Kanno, Satomi; Cuyas, Laura; Javot, Hélène; Bligny, Richard; Gout, Elisabeth; Dartevelle, Thibault; Hanchi, Mohamed; Nakanishi, Tomoko M; Thibaud, Marie-Christine; Nussaume, Laurent

    2016-04-01

    Phosphate (Pi) is a macronutrient that is essential for plant life. Several regulatory components involved in Pi homeostasis have been identified, revealing a very high complexity at the cellular and subcellular levels. Determining the Pi content of plants is crucial to understanding this regulation, and short real-time 33Pi uptake imaging experiments have shown Pi movement to be highly dynamic. Furthermore, gene modulation by Pi is finely controlled by the localization of this ion at the tissue, cellular and subcellular levels. Deciphering these regulations requires access to, and quantification of, the Pi pool in the various plant compartments. This review presents the different techniques available to measure, visualize and trace Pi in plants, with a discussion of future prospects.

  18. Parallel quantification of lectin-glycan interaction using ultrafiltration.

    PubMed

    Takeda, Yoichi; Seko, Akira; Sakono, Masafumi; Hachisu, Masakazu; Koizumi, Akihiko; Fujikawa, Kohki; Ito, Yukishige

    2013-06-28

    Using an ultrafiltration membrane, a simple method for screening protein-ligand interactions was developed. The procedure comprises three steps: mixing the ligand with the protein, ultrafiltration of the solution, and quantification of unbound ligand by HPLC. By conducting the analysis with variable protein concentrations, affinity constants are easily obtained. Multiple ligands can be analyzed simultaneously as a mixture when the ligand concentrations are controlled. The feasibility of this method for lectin-glycan interaction analysis was examined using fluorescently labeled high-mannose-type glycans and recombinant intracellular lectins or endo-α-mannosidase mutants. The estimated Ka values of malectin and VIP36 were indeed in good agreement with those evaluated by conventional methods such as isothermal titration calorimetry (ITC) or frontal affinity chromatography (FAC). Finally, several mutants of endo-α-mannosidase were produced and their affinities for monoglucosylated glycans were evaluated.
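
    For illustration, a minimal sketch of how an affinity constant can be extracted from such ultrafiltration data: for a 1:1 interaction with protein in excess, the unbound ligand fraction follows f = 1/(1 + Ka[P]). The data below are synthetic:

```python
# Minimal sketch: fit Ka from the unbound ligand fraction measured (by HPLC)
# at several protein concentrations; 1:1 binding with protein in excess.
import numpy as np
from scipy.optimize import curve_fit

protein = np.array([0.0, 2e-6, 5e-6, 1e-5, 2e-5, 5e-5])     # protein conc (M)
f_unbound = np.array([1.00, 0.83, 0.67, 0.51, 0.33, 0.17])  # from HPLC peak areas

def unbound_fraction(p, ka):
    return 1.0 / (1.0 + ka * p)

(ka_fit,), _ = curve_fit(unbound_fraction, protein, f_unbound, p0=[1e5])
print(f"Ka = {ka_fit:.2e} M^-1")
```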

  19. Experimental investigations for uncertainty quantification in brake squeal analysis

    NASA Astrophysics Data System (ADS)

    Renault, A.; Massa, F.; Lallemand, B.; Tison, T.

    2016-04-01

    The aim of this paper is to improve the correlation between experimental and numerical predictions of unstable frequencies for automotive brake systems under uncertainty. First, an experimental quantification of uncertainty and a discussion analysing the contributions of uncertainty to a numerical squeal simulation are proposed. Frequency and transient simulations are performed considering nominal values of the model parameters, determined experimentally. The obtained results are compared with those derived from experimental tests to highlight the limitations of deterministic simulations. The effects of the different kinds of uncertainty detected in the working conditions of the brake system (the pad boundary condition, the brake system material properties and the pad surface topography) are discussed by defining different unstable mode classes. Finally, a correlation between experimental and numerical results considering uncertainty is successfully proposed for an industrial brake system. Results from the different comparisons also reveal a major influence of the pad topography and, consequently, of the contact distribution.

  20. Cytofluorometric Quantification of Cell Death Elicited by NLR Proteins.

    PubMed

    Sica, Valentina; Manic, Gwenola; Kroemer, Guido; Vitale, Ilio; Galluzzi, Lorenzo

    2016-01-01

    Nucleotide-binding domain and leucine-rich repeat containing (NLR) proteins, also known as NOD-like receptors, are critical components of the molecular machinery that senses intracellular danger signals to initiate an innate immune response against invading pathogens or endogenous sources of hazard. The best characterized effect of NLR signaling is the secretion of various cytokines with immunostimulatory effects, including interleukin (IL)-1β and IL-18. Moreover, at least under specific circumstances, NLRs can promote regulated variants of cell death. Here, we detail two protocols for the cytofluorometric quantification of cell death-associated parameters that can be conveniently employed to assess the lethal activity of specific NLRs or their ligands.

  1. Quantification of tidal parameters from Solar System data

    NASA Astrophysics Data System (ADS)

    Lainey, Valéry

    2016-11-01

    Tidal dissipation is the main driver of the orbital evolution of natural satellites and a key to understanding exoplanetary system configurations. Despite its importance, its quantification from observations still remains difficult for most objects of our own Solar System. In this work, we review the method that has been used to determine the tidal parameters directly from observations, with emphasis on the Love number k_2 and the tidal quality factor Q. Up-to-date values of these tidal parameters are summarized. Last, the possible determination of the tidal ratio k_2/Q of Uranus and Neptune is assessed. This may be particularly relevant for coming astrometric campaigns and future space missions focused on these systems.

  2. Graphene wrinkling induced by monodisperse nanoparticles: facile control and quantification

    PubMed Central

    Vejpravova, Jana; Pacakova, Barbara; Endres, Jan; Mantlikova, Alice; Verhagen, Tim; Vales, Vaclav; Frank, Otakar; Kalbac, Martin

    2015-01-01

    Controlled wrinkling of single-layer graphene (1-LG) at the nanometer scale was achieved by introducing monodisperse nanoparticles (NPs), with size comparable to the strain coherence length, underneath the 1-LG. A typical fingerprint of the delaminated fraction is identified as a substantial contribution to the principal Raman modes of the 1-LG (G and G'). Correlation analysis of the Raman shifts of the G and G' modes clearly resolved the 1-LG in contact with, and delaminated from, the substrate. The intensity of the Raman features of the delaminated 1-LG increases linearly with the amount of wrinkles, as determined by advanced processing of atomic force microscopy data. Our study thus offers a universal approach for both fine tuning and facile quantification of graphene topography up to ~60% wrinkling. PMID:26530787

  3. Quantification of Posterior Globe Flattening: Methodology Development and Validation

    NASA Technical Reports Server (NTRS)

    Lumpkins, Sarah B.; Garcia, Kathleen M.; Sargsyan, Ashot E.; Hamilton, Douglas R.; Berggren, Michael D.; Ebert, Douglas

    2012-01-01

    Microgravity exposure affects visual acuity in a subset of astronauts and mechanisms may include structural changes in the posterior globe and orbit. Particularly, posterior globe flattening has been implicated in the eyes of several astronauts. This phenomenon is known to affect some terrestrial patient populations and has been shown to be associated with intracranial hypertension. It is commonly assessed by magnetic resonance imaging (MRI), computed tomography (CT) or B-mode ultrasound (US), without consistent objective criteria. NASA uses a semiquantitative scale of 0-3 as part of eye/orbit MRI and US analysis for occupational monitoring purposes. The goal of this study was to initiate development of an objective quantification methodology to monitor small changes in posterior globe flattening.

  4. Quantification of Posterior Globe Flattening: Methodology Development and Validation

    NASA Technical Reports Server (NTRS)

    Lumpkins, S. B.; Garcia, K. M.; Sargsyan, A. E.; Hamilton, D. R.; Berggren, M. D.; Antonsen, E.; Ebert, D.

    2011-01-01

    Microgravity exposure affects visual acuity in a subset of astronauts, and mechanisms may include structural changes in the posterior globe and orbit. Particularly, posterior globe flattening has been implicated in several astronauts. This phenomenon is known to affect some terrestrial patient populations, and has been shown to be associated with intracranial hypertension. It is commonly assessed by magnetic resonance imaging (MRI), computed tomography (CT), or B-mode ultrasound (US), without consistent objective criteria. NASA uses a semi-quantitative scale of 0-3 as part of eye/orbit MRI and US analysis for occupational monitoring purposes. The goal of this study was to initiate development of an objective quantification methodology for posterior globe flattening.

  5. Thermostability of Biological Systems: Fundamentals, Challenges, and Quantification

    PubMed Central

    He, Xiaoming

    2011-01-01

    This review examines the fundamentals of and challenges in engineering/understanding the thermostability of biological systems over a wide temperature range (from the cryogenic to the hyperthermic regime). Applications of bio-thermostability engineering to either destroy unwanted biologicals or stabilize useful ones for the treatment of diseases in modern medicine are first introduced. Studies on the biological responses to cryogenic and hyperthermic temperatures for the various applications are reviewed to understand the mechanisms of thermal (both cryogenic and hyperthermic) injury and its quantification at the molecular, cellular and tissue/organ levels. Methods for quantifying the thermophysical processes of the various applications are then summarized, accounting for the effects of blood perfusion, metabolism, water transport across the cell plasma membrane, and phase transitions (both equilibrium and non-equilibrium, such as ice formation and glass transition) of water. The review concludes with a summary of the status quo and future perspectives in engineering the thermostability of biological systems. PMID:21769301

  6. Aliphatic dipeptide tags for multi-2-plex protein quantification.

    PubMed

    Suh, Min-Soo; Seo, Jongcheol; Thangadurai, T D; Rhee, Young Ho; Shin, Seung Koo; Yoon, Hye-Joo

    2011-04-21

    The mass-balanced 1H/2H-isotope dipeptide tag (MBIT) is diversified into aliphatic tags for multiplexed protein quantification. Aliphatic MBITs are based on the N-acetyl-Xxx-Ala dipeptide, where Xxx is an artificial amino acid with a linear alkyl side chain from C2H5 to C8H17 (C2-C8 tags). 1H/2H isotopes are encoded in the methyl groups of N-acetyl and Ala to yield a pair of isobaric tags with 2-plex quantitation signals separated by 3 Da. C2-C5 tags are prepared by solid-phase synthesis, while C6-C8 tags are synthesized by olefin metathesis in solution. These aliphatic tags are made reactive toward the primary amines of peptides, and the relative abundances of quantitation signals are characterized using both matrix-assisted laser desorption ionization and electrospray ionization tandem mass spectrometry. MBIT-linked peptides co-migrate in reverse-phase liquid chromatography (LC), and their tandem mass spectra exhibit 2-plex quantitation signals as well as sequence ions in similar abundances. As the length of the alkyl side chain increases, C2-C8 tags show a stepwise increase in both the LC retention time and the relative abundance of quantitation signals. In addition, the quantitation linearity is well maintained over a 15-250 fmol range. The multiplexing capability of aliphatic MBITs is demonstrated by applying three different tags (C6-C8 tags) to the quantification of yeast heat shock proteins expressed under four different physiological conditions.

  7. Quantification of the genetic risk of environmental mutagens

    SciTech Connect

    Ehling, U.H.

    1988-03-01

    Screening methods are used for hazard identification. Assays for heritable mutations in mammals are used for the confirmation of short-term test results and for the quantification of the genetic risk. There are two main approaches to making genetic risk estimates. One of these, termed the direct method, expresses risk in terms of the expected frequency of genetic changes induced per unit. The other, referred to as the doubling dose method or the indirect method, expresses risk in relation to the observed incidence of genetic disorders now present in man. The indirect method uses experimental data only for the calculation of the doubling dose. The quality of the risk estimate depends on the assumption of persistence of the induced mutations and on the ability to determine the current incidence of genetic diseases. The difficulty of improving the estimates of the current incidence of genetic diseases, or of the persistence of the genes in the population, led to the development of an alternative method, the direct estimation of the genetic risk. The direct estimation uses experimental data on the induced frequency of dominant mutations in mice. For the verification of these quantifications, one can use the data of Hiroshima and Nagasaki. According to the estimate obtained with the direct method, one would expect fewer than 1 radiation-induced dominant cataract in 19,000 children with one or both parents exposed. The expected overall frequency of dominant mutations in the first generation would be 20-25, based on radiation-induced dominant cataract mutations. It is estimated that 10 times more recessive than dominant mutations are induced. The same approaches can be used to determine the impact of chemical mutagens.
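
    For illustration, a minimal sketch contrasting the two estimation routes described above, with entirely illustrative numbers (not the paper's values):

```python
# Minimal sketch of the indirect (doubling dose) and direct risk-estimation
# routes; every number below is hypothetical and for illustration only.
baseline_incidence = 10_000e-6   # assumed incidence of dominant disorders per birth
doubling_dose_gy = 1.0           # assumed dose doubling the spontaneous mutation rate
dose_gy = 0.1                    # assumed parental gonadal dose

# Indirect method: induced first-generation burden relative to current incidence.
indirect_first_gen = baseline_incidence * (dose_gy / doubling_dose_gy)

# Direct method: scale an induced per-gray mutation rate measured in mice.
induced_rate_per_gy = 20e-6      # hypothetical dominant mutations per gamete per Gy
direct_first_gen = induced_rate_per_gy * dose_gy

print(f"indirect: {indirect_first_gen:.2e} per birth")
print(f"direct:   {direct_first_gen:.2e} per birth")
```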

  8. Quantification of Carbohydrates in Grape Tissues Using Capillary Zone Electrophoresis

    PubMed Central

    Zhao, Lu; Chanon, Ann M.; Chattopadhyay, Nabanita; Dami, Imed E.; Blakeslee, Joshua J.

    2016-01-01

    Soluble sugars play an important role in freezing tolerance in both herbaceous and woody plants, functioning in both the reduction of freezing-induced dehydration and the cryoprotection of cellular constituents. The quantification of soluble sugars in plant tissues is, therefore, essential in understanding freezing tolerance. While a number of analytical techniques and methods have been used to quantify sugars, most of these are expensive and time-consuming due to complex sample preparation procedures which require the derivatization of the carbohydrates being analyzed. Analysis of soluble sugars using capillary zone electrophoresis (CZE) under alkaline conditions with direct UV detection has previously been used to quantify simple sugars in fruit juices. However, it was unclear whether CZE-based methods could be successfully used to quantify the broader range of sugars present in complex plant extracts. Here, we present the development of an optimized CZE method capable of separating and quantifying mono-, di-, and tri-saccharides isolated from plant tissues. This optimized CZE method employs a column electrolyte buffer containing 130 mM NaOH, pH 13.0, creating a current of 185 μA when a separation voltage of 10 kV is employed. The optimized CZE method provides limits-of-detection (an average of 1.5 ng/μL) for individual carbohydrates comparable or superior to those obtained using gas chromatography–mass spectrometry, and allows resolution of non-structural sugars and cell wall components (structural sugars). The optimized CZE method was successfully used to quantify sugars from grape leaves and buds, and is a robust tool for the quantification of plant sugars found in vegetative and woody tissues. The increased analytical efficiency of this CZE method makes it ideal for use in high-throughput metabolomics studies designed to quantify plant sugars. PMID:27379118

  9. Longitudinal MRI quantification of muscle degeneration in Duchenne muscular dystrophy.

    PubMed

    Godi, Claudia; Ambrosi, Alessandro; Nicastro, Francesca; Previtali, Stefano C; Santarosa, Corrado; Napolitano, Sara; Iadanza, Antonella; Scarlato, Marina; Natali Sora, Maria Grazia; Tettamanti, Andrea; Gerevini, Simonetta; Cicalese, Maria Pia; Sitzia, Clementina; Venturini, Massimo; Falini, Andrea; Gatti, Roberto; Ciceri, Fabio; Cossu, Giulio; Torrente, Yvan; Politi, Letterio S

    2016-08-01

    The aim of this study was to evaluate the usefulness of magnetic resonance imaging (MRI) in detecting the progression of Duchenne muscular dystrophy (DMD) by quantification of fat infiltration (FI) and muscle volume index (MVI, a residual-to-total muscle volume ratio). Twenty-six patients (baseline age: 5-12 years) with genetically proven DMD were longitudinally analyzed with lower limb 3T MRI, force measurements, and functional tests (Gowers, 10-m time, North Star Ambulatory Assessment, 6-min walking test). Five age-matched controls were also examined, with a total of 85 MRI studies. Semiquantitative (scores) and quantitative MRI (qMRI) analyses (signal intensity ratio [SIR], lower limb MVI, and individual muscle MVI) were carried out. Permutation and regression analyses according to both age and functional test outcomes were calculated. Age-related quantitative reference curves of SIRs and MVIs were generated. FI was present in the glutei and adductor magnus of all patients from the age of 5, with a proximal-to-distal progression and selective sparing of sartorius and gracilis. Patients' qMRI measures were significantly different from controls' and among age classes. qMRI measures were more sensitive than force measurements and functional tests in assessing disease progression, allowing quantification also after loss of ambulation. Age-related curves with percentile values were calculated for SIRs and MVIs, to provide a reference background for future experimental therapy trials. SIRs and MVIs significantly correlated with all clinical measures, and could reliably predict functional outcomes and loss of ambulation. qMRI-based indexes are sensitive measures that can track the progression of DMD and represent a valuable tool for follow-up and clinical studies.

  10. Evaluation of computer-assisted quantification of carotid artery stenosis.

    PubMed

    Biermann, Christina; Tsiflikas, Ilias; Thomas, Christoph; Kasperek, Bernadette; Heuschmid, Martin; Claussen, Claus D

    2012-04-01

    The purpose of this study was to evaluate the influence of advanced software assistance on the assessment of carotid artery stenosis, in particular the inter-observer variability among readers with different levels of experience. Forty patients with suspected carotid artery stenosis received head and neck dual-energy CT angiography as part of their pre-interventional workup. Four blinded readers with different levels of experience performed standard imaging interpretation. At least 1 day later, they repeated the quantification using advanced vessel analysis software including automatic dual-energy bone and hard-plaque removal, automatic and semiautomatic vessel segmentation, and creation of curved planar reformations. Results were evaluated for the reproducibility of stenosis quantification across readers by calculating kappa and correlation values. Consensus reading of the two most experienced readers was used as the standard of reference. For standard imaging interpretation, the experienced readers reached very good (k = 0.85) and good (k = 0.78) inter-observer agreement, whereas the inexperienced readers achieved only moderate (k = 0.6) and fair (k = 0.24) results. Sensitivity values of 80%, 91%, 83%, and 77% and specificity values of 100%, 84%, 82%, and 53% were achieved for significant area stenosis >70%. For grading with the advanced vessel analysis software, all readers achieved good inter-observer agreement (k = 0.77, 0.72, 0.71, and 0.77), with specificity values of 97%, 95%, 95%, and 93% and sensitivity values of 84%, 78%, 86%, and 92%. In conclusion, when supported by advanced vessel analysis software, experienced readers are able to achieve good reproducibility. Even inexperienced readers achieve good results in the assessment of carotid artery stenosis when using advanced vessel analysis software.

  11. Uncertainty Quantification for GPM-era Precipitation Measurements

    NASA Astrophysics Data System (ADS)

    Tian, Yudong

    2014-05-01

    Uncertainty quantification will remain a challenge for GPM-era precipitation measurements. Our studies with TRMM-era products can provide useful guidance and improved procedures. For satellite-borne precipitation measurements, uncertainty originates from many error sources, including sampling errors, systematic errors and random errors. This presentation summarizes our efforts to quantify these errors in six different TRMM-era precipitation products (3B42, 3B42RT, CMORPH, PERSIANN, NRL and GSMaP), and proposes improved error modeling and validation procedures for GPM-era products. For systematic errors, we devised an error decomposition scheme to separate errors in precipitation estimates into three independent components: hit biases, missed precipitation and false precipitation (Tian et al., 2009). This decomposition scheme reveals more error features and provides a better link to the error sources than conventional analysis, because in the latter these error components tend to cancel one another when aggregated or averaged in space or time. To evaluate the random errors, we calculated the measurement spread from the ensemble of these six quasi-independent products, and produced a global map of measurement uncertainties (Tian and Peters-Lidard, 2010). The map yields a global view of the error characteristics and their regional and seasonal variations. More recently, we have established the fitness of a multiplicative error model to predict the uncertainties when ground validation data are not available (Tian et al., 2013), and have shown that this model is superior to the commonly-used additive error model in describing and predicting the uncertainty in precipitation measurements. Thus we propose an improved procedure based on error decomposition and the multiplicative error model for GPM-era uncertainty quantification.
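
    The hit/miss/false decomposition is easy to state in code. Below is a minimal sketch in the spirit of Tian et al. (2009), assuming collocated satellite and reference (gauge) rain arrays and an arbitrary 0.1 mm/h rain/no-rain threshold; all values are made up, and numpy is assumed.

    ```python
    import numpy as np

    def decompose_errors(sat, ref, thresh=0.1):
        """Split the total bias into hit bias, missed and false precipitation."""
        hit = (sat >= thresh) & (ref >= thresh)
        miss = (sat < thresh) & (ref >= thresh)
        false = (sat >= thresh) & (ref < thresh)
        return {"hit_bias": float(np.sum(sat[hit] - ref[hit])),
                "missed": float(-np.sum(ref[miss])),   # negative contribution to total bias
                "false": float(np.sum(sat[false]))}

    sat = np.array([0.0, 1.2, 3.0, 0.5, 0.0])
    ref = np.array([0.4, 1.0, 2.0, 0.0, 0.0])
    parts = decompose_errors(sat, ref)
    print(parts, "sum =", sum(parts.values()))  # the components add up to the total bias
    ```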

  12. Quantification and localization of mast cells in periapical lesions.

    PubMed

    Mahita, V N; Manjunatha, B S; Shah, R; Astekar, M; Purohit, S; Kovvuru, S

    2015-01-01

    Periapical lesions occur in response to chronic irritation in periapical tissue, generally resulting from an infected root canal. The specific etiological agents of induction, the participating cell populations, and the growth factors associated with the maintenance and resolution of periapical lesions are incompletely understood. Among the cells found in periapical lesions, mast cells have been implicated in the inflammatory mechanism. This study therefore aimed to localize and quantify mast cells in periapical granuloma and radicular cyst and to assess the possible role they play in these lesions. A total of 30 previously diagnosed cases, 15 periapical granulomas and 15 radicular cysts, were selected along with their case details from the department of oral pathology. The gender distribution was 8 males (53.3%) and 7 females (46.7%) in the periapical granuloma cases and 10 males (66.7%) and 5 females (33.3%) in the radicular cyst cases. The statistical analysis used was the unpaired t-test. The mean mast cell counts in periapical granuloma were 12.40 (0.99%) in subepithelial and 7.13 (0.83%) in deeper connective tissue. The mean mast cell counts in subepithelial and deeper connective tissue of radicular cyst were 17.64 (1.59%) and 12.06 (1.33%), respectively; the difference was statistically significant. No statistically significant difference was noted between males and females. Mast cells were more numerous in radicular cysts. Based on the concept that mast cells play a critical role in the induction of inflammation, it is logical to use therapeutic agents that alter mast cell function and secretion to thwart inflammation at its earliest phases. These findings suggest a possible role of mast cells in the pathogenesis of periapical lesions.
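
    The comparison of counts between the two lesion types rests on a standard unpaired t-test, which takes one line with scipy. The counts below are invented stand-ins, not the study data.

    ```python
    from scipy import stats

    granuloma = [10, 12, 13, 11, 14, 12, 15, 11, 13, 12, 14, 13, 12, 11, 13]
    cyst = [16, 18, 17, 19, 16, 18, 17, 20, 18, 17, 19, 18, 16, 17, 18]

    t, p = stats.ttest_ind(granuloma, cyst)  # Student's t-test, equal variances assumed
    print(f"t = {t:.2f}, p = {p:.4g}")
    ```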

  14. Quantification of regional fat volume in rat MRI

    NASA Astrophysics Data System (ADS)

    Sacha, Jaroslaw P.; Cockman, Michael D.; Dufresne, Thomas E.; Trokhan, Darren

    2003-05-01

    Multiple initiatives in the pharmaceutical and beauty care industries are directed at identifying therapies for weight management. Body composition measurements are critical for such initiatives. Imaging technologies that can be used to measure body composition noninvasively include DXA (dual energy x-ray absorptiometry) and MRI (magnetic resonance imaging). Unlike other approaches, MRI provides the ability to perform localized measurements of fat distribution. Several factors complicate the automatic delineation of fat regions and quantification of fat volumes, including motion artifacts, field non-uniformity, brightness and contrast variations, chemical shift misregistration, and ambiguity in delineating anatomical structures. We have developed an approach that deals practically with those challenges. The approach is implemented in a package, the Fat Volume Tool, for automatic detection of fat tissue in MR images of the rat abdomen, including automatic discrimination between abdominal and subcutaneous regions. We suppress motion artifacts using masking based on detection of implicit landmarks in the images. Adaptive object extraction is used to compensate for intensity variations. This approach enables us to perform fat tissue detection and quantification in a fully automated manner. The package can also operate in manual mode, which can be used for verification of the automatic analysis or for performing supervised segmentation. In supervised segmentation, the operator can interact with the automatic segmentation procedures to touch up or completely overwrite intermediate segmentation steps; these interventions steer the automatic segmentation steps that follow, improving the efficiency and quality of the final segmentation. Semi-automatic segmentation tools (interactive region growing, live-wire, etc.) improve both the accuracy and throughput of the operator when working in manual mode.

  15. Bayesian Proteoform Modeling Improves Protein Quantification of Global Proteomic Measurements

    SciTech Connect

    Webb-Robertson, Bobbie-Jo M.; Matzke, Melissa M.; Datta, Susmita; Payne, Samuel H.; Kang, Jiyun; Bramer, Lisa M.; Nicora, Carrie D.; Shukla, Anil K.; Metz, Thomas O.; Rodland, Karin D.; Smith, Richard D.; Tardiff, Mark F.; McDermott, Jason E.; Pounds, Joel G.; Waters, Katrina M.

    2014-12-01

    As the capability of mass spectrometry-based proteomics has matured, tens of thousands of peptides can be measured simultaneously, which has the benefit of offering a systems view of protein expression. However, a major challenge is that with an increase in throughput, protein quantification estimation from the native measured peptides has become a computational task. A limitation to existing computationally-driven protein quantification methods is that most ignore protein variation, such as alternate splicing of the RNA transcript and post-translational modifications or other possible proteoforms, which will affect a significant fraction of the proteome. The consequence of this assumption is that statistical inference at the protein level, and consequently downstream analyses, such as network and pathway modeling, have only limited power for biomarker discovery. Here, we describe a Bayesian model (BP-Quant) that uses statistically derived peptides signatures to identify peptides that are outside the dominant pattern, or the existence of multiple over-expressed patterns to improve relative protein abundance estimates. It is a research-driven approach that utilizes the objectives of the experiment, defined in the context of a standard statistical hypothesis, to identify a set of peptides exhibiting similar statistical behavior relating to a protein. This approach infers that changes in relative protein abundance can be used as a surrogate for changes in function, without necessarily taking into account the effect of differential post-translational modifications, processing, or splicing in altering protein function. We verify the approach using a dilution study from mouse plasma samples and demonstrate that BP-Quant achieves similar accuracy as the current state-of-the-art methods at proteoform identification with significantly better specificity. BP-Quant is available as a MatLab ® and R packages at https://github.com/PNNL-Comp-Mass-Spec/BP-Quant.

  16. The applications of statistical quantification techniques in nanomechanics and nanoelectronics.

    PubMed

    Mai, Wenjie; Deng, Xinwei

    2010-10-08

    Although nanoscience and nanotechnology have been developing for approximately two decades and have achieved numerous breakthroughs, experimental results from nanomaterials still show a higher noise level and poorer repeatability than those from bulk materials, which remains a practical issue and challenges many techniques for the quantification of nanomaterials. This work proposes a physical-statistical modeling approach and a global-fitting statistical method that use all the available discrete data or quasi-continuous curves to quantify a few targeted physical parameters, which can provide more accurate, efficient and reliable parameter estimates, and give reasonable physical explanations. In the resonance method for measuring the elastic modulus of ZnO nanowires (Zhou et al 2006 Solid State Commun. 139 222-6), our statistical technique gives E = 128.33 GPa instead of the original E = 108 GPa, and unveils a negative bias adjustment f0, attributed to a systematic bias in measuring the length of the nanowires. In the electronic measurement of the resistivity of a Mo nanowire (Zach et al 2000 Science 290 2120-3), the proposed method automatically identified the importance of accounting for the Ohmic contact resistance in the model of Ohmic behavior in nanoelectronics experiments. The 95% confidence interval of the resistivity from the proposed one-step procedure is (3.57 ± 0.0274) × 10^-5 Ω cm, which should be a more reliable and precise estimate. The statistical quantification technique should find wide applications in obtaining better estimates in the presence of the various systematic errors and bias effects that become more significant at the nanoscale.
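
    The "global fitting" idea can be sketched as fitting one shared modulus E and one shared bias term f0 across many nanowires at once, rather than wire by wire. The cantilever beam model, density, geometries and noise below are illustrative assumptions, not the paper's data; numpy and scipy are assumed.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    BETA1SQ = 1.875 ** 2   # first flexural mode constant of a cantilever
    RHO = 5606.0           # ZnO density, kg/m^3

    def f_res(geometry, E, f0):
        d, L = geometry    # diameters and lengths, m
        return BETA1SQ * d / (8 * np.pi * L ** 2) * np.sqrt(E / RHO) + f0

    rng = np.random.default_rng(0)
    d = np.array([80e-9, 120e-9, 150e-9, 200e-9])
    L = np.array([5e-6, 8e-6, 9e-6, 12e-6])
    f_meas = f_res((d, L), 128e9, -2e4) * (1 + 0.01 * rng.standard_normal(4))  # synthetic

    (E_hat, f0_hat), _ = curve_fit(f_res, (d, L), f_meas, p0=(100e9, 0.0))
    print(f"E = {E_hat / 1e9:.1f} GPa, f0 = {f0_hat / 1e3:.1f} kHz")
    ```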

  17. Rapid quantification method for Legionella pneumophila in surface water.

    PubMed

    Wunderlich, Anika; Torggler, Carmen; Elsässer, Dennis; Lück, Christian; Niessner, Reinhard; Seidel, Michael

    2016-03-01

    World-wide legionellosis outbreaks caused by evaporative cooling systems have shown that there is a need for rapid screening methods for Legionella pneumophila in water. Antibody-based methods for the quantification of L. pneumophila are rapid, non-laborious, and relatively cheap, but not sensitive enough to be established as a screening method for surface and drinking water. Therefore, preconcentration methods have to be applied in advance to reach the needed sensitivity. In a basic test, monolithic adsorption filtration (MAF) was used as the primary preconcentration method, adsorbing L. pneumophila with high efficiency. Ten-liter water samples were concentrated in 10 min and further reduced to 1 mL by centrifugal ultrafiltration (CeUF). The quantification of L. pneumophila strains belonging to the monoclonal subtype Bellingham was performed via flow-based chemiluminescence sandwich microarray immunoassays (CL-SMIA) in 36 min. The whole analysis process takes 90 min. A polyclonal antibody (pAb) against L. pneumophila serogroup 1-12 and a monoclonal antibody (mAb) against the L. pneumophila SG 1 strain Bellingham were immobilized on a microarray chip. Without preconcentration, the detection limits were 4.0 × 10^3 and 2.8 × 10^3 CFU/mL as determined by pAb and mAb 10/6, respectively. For samples processed by MAF-CeUF prior to SMIA detection, the limit of detection (LOD) could be decreased to 8.7 CFU/mL and 0.39 CFU/mL, respectively. A recovery of 99.8 ± 15.9% was achieved for concentrations between 1 and 1000 CFU/mL. The established combined analytical method is sensitive enough for rapid screening of surface and drinking water, allowing fast hygiene control of L. pneumophila.

  18. Nanoscale elemental quantification in heterostructured SiGe nanowires

    NASA Astrophysics Data System (ADS)

    Hourani, W.; Periwal, P.; Bassani, F.; Baron, T.; Patriarche, G.; Martinez, E.

    2015-04-01

    The nanoscale chemical characterization of axial heterostructured Si1-xGex nanowires (NWs) has been performed using scanning Auger microscopy (SAM) through local spectroscopy, line-scan and depth profile measurements. Local Auger profiles are realized with sufficient lateral resolution to resolve individual nanowires. Axial and radial composition heterogeneities are highlighted. Our results confirm the phenomenon of Ge radial growth forming a Ge shell around the nanowire. Moreover, quantification is performed after verifying the absence of preferential sputtering of Si or Ge on a bulk SiGe sample. Hence, reliable results are obtained for heterostructured NW diameters larger than 100 nm. However, for smaller sizes, we have noticed that the sensitivity factors evaluated from bulk samples cannot be used because of edge effects occurring for highly topographical features and a modified contribution of backscattered electrons. Electronic supplementary information (ESI) available. See DOI: 10.1039/c4nr07503j
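
    Quantification with relative sensitivity factors, as calibrated here on the bulk SiGe reference, amounts to normalizing each element's Auger intensity by its factor: x_i = (I_i/S_i) / sum_j(I_j/S_j). The intensities and factors below are made-up illustrations, not values from the record.

    ```python
    intensities = {"Si": 1200.0, "Ge": 800.0}  # peak-to-peak Auger intensities (hypothetical)
    sensitivity = {"Si": 0.35, "Ge": 0.55}     # relative sensitivity factors (hypothetical)

    norm = {el: i / sensitivity[el] for el, i in intensities.items()}
    total = sum(norm.values())
    print({el: round(v / total, 3) for el, v in norm.items()})  # atomic fractions
    ```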

  19. Quantification of training in competitive sports. Methods and applications.

    PubMed

    Hopkins, W G

    1991-09-01

    The training of competitive athletes can be assessed by retrospective questionnaires, diaries, physiological monitoring and direct observation of training behaviour. Questionnaires represent the most economical, most comprehensive and least accurate method. Diaries are more valid, but their drawbacks for long term quantitative studies are poor compliance and difficulties in processing the data they generate. Physiological monitoring (of oxygen consumption, heart rate or blood lactate concentration) provides objective measures of training intensity, and direct observation gives valid measures of most aspects of training; however, these methods are impractical for continuous, long term use. Coaches and athletes quantify training for purposes of motivation, systematisation of training and training prescription, but there has been little study of the use of training quantification by these practitioners. Motivation and systematisation are probably achieved best with diaries. Direct observation appears to be the best method of ensuring compliance with a training prescription, although heart rate monitoring is also a promising method for prescribing endurance training intensity. Sport scientists quantify training to study its effects on the performance and health status of competitive athletes. Most studies have been descriptive rather than experimental, and unvalidated questionnaires have been the predominant method of assaying training. The main areas of research include performance prediction and enhancement, overtraining, reproductive dysfunction, injury, illness, and nutritional status. Training has substantial effects in all of these areas. There is a need for more experimental studies that utilise validated measures of training to investigate how to reduce sports injuries and enhance competitive sports performance. More attention could also be given to methodological issues of training quantification.

  20. Respiratory Mucosal Proteome Quantification in Human Influenza Infections

    PubMed Central

    Marion, Tony; Elbahesh, Husni; Thomas, Paul G.; DeVincenzo, John P.; Webby, Richard; Schughart, Klaus

    2016-01-01

    Respiratory influenza virus infections represent a serious threat to human health. Underlying medical conditions and genetic make-up predispose some influenza patients to more severe forms of disease. To date, only a few studies have been performed in patients to correlate a selected group of cytokines and chemokines with influenza infection. Therefore, we evaluated the potential of a novel multiplex micro-proteomics technology, SOMAscan, to quantify proteins in the respiratory mucosa of influenza A and B infected individuals. The analysis included but was not limited to quantification of cytokines and chemokines detected in previous studies. SOMAscan quantified more than 1,000 secreted proteins in small nasal wash volumes from infected and healthy individuals. Our results illustrate the utility of micro-proteomic technology for analysis of proteins in small volumes of respiratory mucosal samples. Furthermore, when we compared nasal wash samples from influenza-infected patients with viral load ≥ 2^8 and increased IL-6 and CXCL10 to healthy controls, we identified 162 differentially-expressed proteins between the two groups. This number greatly exceeds the number of DEPs identified in previous studies in human influenza patients. Most of the identified proteins were associated with the host immune response to infection, and changes in protein levels of 151 of the DEPs were significantly correlated with viral load. Most importantly, SOMAscan identified differentially expressed proteins heretofore not associated with respiratory influenza infection in humans. Our study is the first report of the use of SOMAscan to screen nasal secretions. It establishes a precedent for micro-proteomic quantification of proteins that reflect ongoing response to respiratory infection. PMID:27088501

  2. Quantification of neonatal amplitude-integrated EEG patterns.

    PubMed

    Thorngate, Lauren; Foreman, Shuyuann Wang; Thomas, Karen A

    2013-12-01

    Amplitude-integrated EEG (aEEG) is increasingly used in research with premature infants; however, comprehensive interpretation is limited by the lack of simple approaches for reliably quantifying and summarizing the data. This exploratory naturalistic study of neonates in the Neonatal Intensive Care Unit (NICU) examined operational measures for quantifying continuity and discontinuity, measured by aEEG, as components of infant brain function. One single-channel aEEG recording per infant was obtained without disruption of nursing care practices in 24 infants with a mean postmenstrual age (PMA) of 33.11 weeks (SD 3.49), a mean age of 2.62 weeks (SD 1.35), and a mean birth weight of 1.39 kg (SD 0.73). Quantification of continuity and discontinuity included the bandwidth and lower border of the aEEG, the calculated proportion of time with signal amplitude below 10 μV, and peak counts; variance of the bandwidth and lower border denoted cycling. Group mean bandwidth was 52.98 μV (SD 27.62). The median peak count in 60-second epochs averaged 3.63 (SD 1.74), while the median proportion < 10 μV was 22% (SD 0.20). The group mean of the within-subject aggregated median lower border was 6.20 μV (SD 2.13), and the group mean lower-border standard deviation was 3.96 μV. The proportion < 10 μV showed a strong negative correlation with the natural log of the median lower border (r = -0.906, p < .0001) after controlling for PMA. This study introduces a novel quantification process based on counting peaks and the proportion of time < 10 μV. Expanded definitions and analytic techniques will strengthen the application of existing scoring systems in naturalistic research settings and clinical practice. © 2013 Elsevier Ireland Ltd. All rights reserved.
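
    The two proposed measures reduce to simple operations on the aEEG amplitude trend. A toy sketch on a synthetic trace follows; the 1 Hz sampling rate and the signal shape are assumptions, and numpy/scipy are assumed.

    ```python
    import numpy as np
    from scipy.signal import find_peaks

    fs = 1.0                                           # trend samples per second (assumed)
    t = np.arange(0, 600)                              # ten minutes of recording
    amp = 8 + 6 * np.abs(np.sin(2 * np.pi * t / 180))  # synthetic amplitude trace, uV

    prop_below_10 = np.mean(amp < 10.0)                # proportion of time < 10 uV
    peaks, _ = find_peaks(amp, height=10.0)            # peak count above 10 uV
    per_epoch = len(peaks) / (len(t) / (60 * fs))      # peaks per 60 s epoch

    print(f"proportion < 10 uV: {prop_below_10:.2f}, peaks per epoch: {per_epoch:.2f}")
    ```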

  3. Stochastic methods for uncertainty quantification in radiation transport

    SciTech Connect

    Fichtl, Erin D; Prinja, Anil K; Warsa, James S

    2009-01-01

    The use of generalized polynomial chaos (gPC) expansions is investigated for uncertainty quantification in radiation transport. The gPC represents second-order random processes in terms of an expansion of orthogonal polynomials of random variables and is used to represent the uncertain input(s) and unknown(s). We assume a single uncertain input, the total macroscopic cross section, although this does not represent a limitation of the approaches considered here. Two solution methods are examined: the Stochastic Finite Element Method (SFEM) and the Stochastic Collocation Method (SCM). The SFEM entails taking Galerkin projections onto the orthogonal basis, which, for fixed source problems, yields a linear system of fully coupled equations for the PC coefficients of the unknown. For k-eigenvalue calculations, the SFEM system is non-linear and a Newton-Krylov method is employed to solve it. The SCM utilizes a suitable quadrature rule to compute the moments or PC coefficients of the unknown(s), thus the SCM solution involves a series of independent deterministic transport solutions. The accuracy and efficiency of the two methods are compared and contrasted. The PC coefficients are used to compute the moments and probability density functions of the unknown(s), which are shown to be accurate by comparing with Monte Carlo results. Our work demonstrates that stochastic spectral expansions are a viable alternative to sampling-based uncertainty quantification techniques since both provide a complete characterization of the distribution of the flux and the k-eigenvalue. Furthermore, it is demonstrated that, unlike perturbation methods, SFEM and SCM can handle large parameter uncertainty.
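
    The SCM is easy to demonstrate on a toy problem: attenuation through a slab, phi(x) = exp(-Sigma x), with a lognormal total cross section. Each Gauss-Hermite node is one independent deterministic "transport" solve. The problem and all parameters below are illustrative choices, not values from the paper; numpy is assumed.

    ```python
    import numpy as np

    x = 1.0                               # slab thickness (illustrative units)
    mu, sigma = np.log(1.0), 0.3          # ln(Sigma) ~ N(mu, sigma^2) (assumed)

    nodes, weights = np.polynomial.hermite_e.hermegauss(8)  # probabilists' rule
    w = weights / np.sqrt(2 * np.pi)      # normalize weights to a probability measure

    Sigma = np.exp(mu + sigma * nodes)    # cross section at each collocation node
    phi = np.exp(-Sigma * x)              # one deterministic solve per node

    mean = np.sum(w * phi)
    var = np.sum(w * (phi - mean) ** 2)
    print(f"E[phi] = {mean:.4f}, Var[phi] = {var:.2e}")
    ```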

  4. Uncertainty quantification applied to the radiological characterization of radioactive waste.

    PubMed

    Zaffora, B; Magistris, M; Saporta, G; Chevalier, J-P

    2017-09-01

    This paper describes the process adopted at the European Organization for Nuclear Research (CERN) to quantify uncertainties affecting the characterization of very-low-level radioactive waste. Radioactive waste is a by-product of the operation of high-energy particle accelerators. Radioactive waste must be characterized to ensure its safe disposal in final repositories. Characterizing radioactive waste means establishing the list of radionuclides together with their activities. The estimated activity levels are compared to the limits given by the national authority for waste disposal. The quantification of the uncertainty affecting the concentration of the radionuclides is therefore essential not only to estimate the acceptability of the waste in the final repository but also to control the sorting, volume reduction and packaging phases of the characterization process. The characterization method consists of estimating the activity of produced radionuclides either by experimental methods or statistical approaches. The uncertainties are estimated using classical statistical methods and uncertainty propagation. A mixed multivariate random vector is built to generate random input parameters for the activity calculations. The random vector is a robust tool to account for the unknown radiological history of legacy waste. This analytical technique is also particularly useful to generate random chemical compositions of materials when the trace element concentrations are not available or cannot be measured. The methodology was validated using a waste population of legacy copper activated at CERN. The methodology introduced here represents a first approach for the uncertainty quantification (UQ) of the characterization process of waste produced at particle accelerators. Copyright © 2017 Elsevier Ltd. All rights reserved.
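
    The core of such a propagation can be sketched as plain Monte Carlo: sample the uncertain inputs, push each sample through the activation calculation, and read off percentiles. The saturation-activity formula is standard, but every distribution and number below is an illustrative assumption, not CERN data; numpy is assumed.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000
    conc = rng.lognormal(np.log(1e-6), 0.5, n)   # trace 59Co mass fraction, g/g (assumed)
    flux = rng.normal(1e10, 1e9, n)              # neutron flux, n/cm^2/s (assumed)
    xsec = rng.normal(37.2e-24, 1.0e-24, n)      # capture cross section, cm^2 (assumed)

    N_A, M = 6.022e23, 58.93
    lam = np.log(2) / (5.27 * 3.15e7)            # 60Co decay constant, 1/s
    t_irr = 3.15e7                               # one year of irradiation, s

    A = conc * N_A / M * flux * xsec * (1 - np.exp(-lam * t_irr))  # activity, Bq/g
    print(f"median {np.median(A):.2e} Bq/g, 95% interval "
          f"[{np.percentile(A, 2.5):.2e}, {np.percentile(A, 97.5):.2e}]")
    ```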

  5. Event-Specific Quantification of Radiation Belt Radial Diffusion

    NASA Astrophysics Data System (ADS)

    Tu, W.; Sarris, T. E.; Ozeke, L.

    2016-12-01

    Recently, there has been a great emphasis on developing event-specific inputs for radiation belt models, since these have proven critical for reproducing the observed radiation belt dynamics during strong events. For example, our DREAM3D simulation of the 8-9 October 2012 storm demonstrates that an event-specific chorus wave model and seed population are critical to reproduce the strong enhancement of MeV electrons in this event. However, the observed fast electron dropout preceding the enhancement was not captured by the simulation, which could be due to the combined effects of fast outward radial diffusion of radiation belt electrons with magnetopause shadowing and enhanced electron precipitation. Without an event-specific quantification of radial diffusion, we cannot resolve the relative contribution of outward radial diffusion and precipitation to the observed electron dropout or realistically reproduce the dynamics during the event. In this work, we provide physical quantification of radial diffusion specific to the October 2012 event by including both real-time and global distributions of ULF waves from a constellation of wave measurements and event-specific estimation of ULF wave mode structure. The global maps of ULF waves during the event are constructed by combining the real-time measurements from the Van Allen Probes, THEMIS, and GOES satellites in space and a large array of ground magnetometers. The real-time ULF wave mode structure is then estimated using the new Cross-Wavelet Transform technique, applied to various azimuthally aligned pairs of ULF wave measurements that are located at the same L shells. The cross power and phase differences between the time series are calculated using the technique, based on which the wave power per mode number is estimated. Finally, the physically estimated radial diffusion coefficients specific to the event are applied to the DREAM3D model to quantify the relative contribution of radial diffusion to the electron dynamics.

  6. Disease quantification on PET/CT images without object delineation

    NASA Astrophysics Data System (ADS)

    Tong, Yubing; Udupa, Jayaram K.; Odhner, Dewey; Wu, Caiyun; Fitzpatrick, Danielle; Winchell, Nicole; Schuster, Stephen J.; Torigian, Drew A.

    2017-03-01

    The derivation of quantitative information from images to make quantitative radiology (QR) clinically practical continues to face a major image analysis hurdle because of image segmentation challenges. This paper presents a novel approach to disease quantification (DQ) via positron emission tomography/computed tomography (PET/CT) images that explores how to decouple DQ methods from explicit dependence on object segmentation through the use of only object recognition results to quantify disease burden. The concept of an object-dependent disease map is introduced to express disease severity without performing explicit delineation and partial volume correction of either objects or lesions. The parameters of the disease map are estimated from a set of training image data sets. The idea is illustrated on 20 lung lesions and 20 liver lesions derived from 18F-2-fluoro-2-deoxy-D-glucose (FDG)-PET/CT scans of patients with various types of cancers and also on 20 NEMA PET/CT phantom data sets. Our preliminary results show that, on phantom data sets, "disease burden" can be estimated to within 2% of known absolute true activity. Notwithstanding the difficulty in establishing true quantification on patient PET images, our results achieve 8% deviation from "true" estimates, with slightly larger deviations for small and diffuse lesions, where establishing ground truth is questionable, and smaller deviations for larger lesions, where ground truth can be established more reliably. We are currently exploring extensions of the approach to include fully automated body-wide DQ, extensions to just CT or magnetic resonance imaging (MRI) alone, to PET/CT performed with radiotracers other than FDG, and other functional forms of disease maps.

  7. Quantification of Amikacin in Bronchial Epithelial Lining Fluid in Neonates

    PubMed Central

    Tayman, C.; El-Attug, M. N.; Adams, E.; Van Schepdael, A.; Debeer, A.; Allegaert, K.; Smits, A.

    2011-01-01

    Amikacin efficacy is based on peak concentrations and the possibility of reaching therapeutic levels at the infection site. This study aimed to describe amikacin concentrations in the epithelial lining fluid (ELF) through bronchoalveolar lavage (BAL) in newborns. BAL fluid was collected in ventilated neonates treated with intravenous (i.v.) amikacin. Clinical characteristics, amikacin therapeutic drug monitoring serum concentrations, and the concentrations of urea in plasma were extracted from the individual patient files. Amikacin and urea BAL fluid concentrations were determined using liquid chromatography with pulsed electrochemical detection (LC-PED) and capillary electrophoresis with capacitively coupled contactless conductivity detection (CE-C4D), respectively. ELF amikacin concentrations were converted from BAL fluid concentrations through quantification of dilution (urea in plasma/urea in BAL fluid) during the BAL procedure. Twenty-two observations in 17 neonates (postmenstrual age, 31.9 [range, 25.1 to 41] weeks; postnatal age, 3.5 [range, 2 to 37] days) were collected. Median trough and peak amikacin serum concentrations were 2.1 (range, 1 to 7.1) mg/liter and 39.1 (range, 24.1 to 73.2) mg/liter; the median urea plasma concentration was 30 (8 to 90) mg/dl. The median amikacin concentration in ELF was 6.5 mg/liter, the minimum measured concentration was 1.5 mg/liter, and the maximum (peak) was 23 mg/liter. The highest measured ELF concentration was reached between 6 and 14.5 h after i.v. amikacin administration, and an estimated terminal elimination half-life was 8 to 10 h. The median and highest (peak) ELF amikacin concentrations observed in our study population were, respectively, 6.5 and 23 mg/liter. Despite the frequent use of amikacin in neonatal (pulmonary) infections, this is the first report of amikacin quantification in ELF in newborns. PMID:21709076
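
    The urea dilution correction that converts BAL to ELF concentrations is a one-line ratio; the numbers below are illustrative (a tenfold dilution), not patient data.

    ```python
    def elf_concentration(c_bal, urea_plasma, urea_bal):
        """ELF concentration = BAL concentration x (urea_plasma / urea_BAL)."""
        return c_bal * urea_plasma / urea_bal

    print(elf_concentration(0.65, urea_plasma=30.0, urea_bal=3.0), "mg/liter")  # -> 6.5
    ```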

  9. Quantification of blood flow and topology in developing vascular networks.

    PubMed

    Kloosterman, Astrid; Hierck, Beerend; Westerweel, Jerry; Poelma, Christian

    2014-01-01

    Since fluid dynamics plays a critical role in vascular remodeling, quantification of the hemodynamics is crucial to gain more insight into this complex process. Better understanding of vascular development can improve prediction of the process, and may eventually even be used to influence the vascular structure. In this study, a methodology to quantify hemodynamics and network structure of developing vascular networks is described. The hemodynamic parameters and topology are derived from detailed local blood flow velocities, obtained by in vivo micro-PIV measurements. The use of such detailed flow measurements is shown to be essential, as blood vessels with a similar diameter can have a large variation in flow rate. Measurements are performed in the yolk sacs of seven chicken embryos at two developmental stages between HH 13+ and 17+. A large range of flow velocities (1 µm/s to 1 mm/s) is measured in blood vessels with diameters in the range of 25-500 µm. The quality of the data sets is investigated by verifying the flow balances in the branching points. This shows that the quality of the data sets of the seven embryos is comparable for all stages observed, and the data is suitable for further analysis with known accuracy. When comparing two subsequently characterized networks of the same embryo, vascular remodeling is observed in all seven networks. However, the character of remodeling in the seven embryos differs and can be non-intuitive, which confirms the necessity of quantification. To illustrate the potential of the data, we present a preliminary quantitative study of key network topology parameters and we compare these with theoretical design rules.
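
    The flow-balance check at a bifurcation compares the parent flow with the sum of the daughter flows, each computed as mean velocity times cross-sectional area. A sketch with invented micro-PIV-like values follows; the parabolic-profile correction a real analysis might apply is ignored here for simplicity.

    ```python
    import numpy as np

    def flow_rate(v_um_s, d_um):
        """Volumetric flow from mean velocity and vessel diameter, in um^3/s."""
        return v_um_s * np.pi * (d_um / 2.0) ** 2

    parent = flow_rate(800.0, 100.0)
    daughters = flow_rate(625.0, 80.0) + flow_rate(700.0, 75.0)
    print(f"relative imbalance at branch: {(parent - daughters) / parent:.1%}")
    ```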

  10. Directional biases in phylogenetic structure quantification: a Mediterranean case study

    PubMed Central

    Molina-Venegas, Rafael; Roquet, Cristina

    2014-01-01

    Recent years have seen an increasing effort to incorporate phylogenetic hypotheses into the study of community assembly processes. The incorporation of such evolutionary information has been eased by the emergence of specialized software for the automatic estimation of partially resolved supertrees based on published phylogenies. Despite this growing interest in the use of phylogenies in ecological research, very few studies have attempted to quantify the potential biases related to the use of partially resolved phylogenies and to branch length accuracy, and no work has examined how tree shape may affect inference of community phylogenetic metrics. In this study, using a large plant community and elevational dataset, we tested the influence of phylogenetic resolution and branch length information on the quantification of phylogenetic structure, and also explored the impact of tree shape (stemminess) on the loss of accuracy in phylogenetic structure quantification due to phylogenetic resolution. For this purpose, we used 9 sets of phylogenetic hypotheses of varying resolution and branch lengths to calculate three indices of phylogenetic structure: the mean phylogenetic distance (NRI), the mean nearest taxon distance (NTI) and phylogenetic diversity (stdPD) metrics. The NRI metric was the least sensitive to phylogenetic resolution, stdPD showed an intermediate sensitivity, and NTI was the most sensitive one; NRI was also less sensitive to branch length accuracy than NTI and stdPD, the degree of sensitivity being strongly dependent on the dating method and the sample size. Directional biases were generally towards type II errors. Interestingly, we detected that tree shape influenced the accuracy loss derived from the lack of phylogenetic resolution, particularly for NRI and stdPD. We conclude that well-resolved molecular phylogenies with accurate branch length information are needed to identify the underlying phylogenetic structure of communities.
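
    An NRI-style calculation is a standardized effect size of the observed mean pairwise distance (MPD) against a tip-shuffling null model. The sketch below uses a random stand-in distance matrix instead of a real cophenetic matrix from a dated phylogeny; numpy is assumed.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_pool, community = 50, np.arange(8)        # 8 community taxa from a 50-taxon pool
    D = rng.uniform(10, 200, (n_pool, n_pool))
    D = (D + D.T) / 2
    np.fill_diagonal(D, 0)                      # stand-in cophenetic distance matrix

    def mpd(taxa):
        sub = D[np.ix_(taxa, taxa)]
        return sub[np.triu_indices(len(taxa), k=1)].mean()

    obs = mpd(community)
    null = np.array([mpd(rng.choice(n_pool, len(community), replace=False))
                     for _ in range(999)])
    print(f"NRI = {-(obs - null.mean()) / null.std():.2f}")  # > 0 suggests clustering
    ```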

  11. A Simple Optical Coherence Tomography Quantification Method for Choroidal Neovascularization

    PubMed Central

    Sulaiman, Rania S.; Quigley, Judith; Qi, Xiaoping; O'Hare, Michael N.; Grant, Maria B.; Boulton, Michael E.

    2015-01-01

    Purpose: Therapeutic efficacy is routinely assessed by measurement of lesion size using flatmounted choroids and confocal microscopy in the laser-induced choroidal neovascularization (L-CNV) rodent model. We investigated whether optical coherence tomography (OCT) quantification, using an ellipsoid volume measurement, was comparable to standard ex vivo evaluation methods for this model and whether this approach could be used to monitor treatment-related lesion changes. Methods: Bruch's membrane was ruptured by argon laser in the dilated eyes of C57BL/6J mice, followed by intravitreal injections of anti-VEGF164 or vehicle, or no injection. In vivo OCT images were acquired using Micron III or InVivoVue systems at 7, 10, and/or 14 days post-laser and neovascular lesion volume was calculated as an ellipsoid. Subsequently, lesion volume was compared to that calculated from confocal Z-stack images of agglutinin-stained choroidal flatmounts. Results: Ellipsoid volume measurement of orthogonal 2-dimensional OCT images obtained from different imaging systems correlated with ex vivo lesion volumes for L-CNV (Spearman's ρ=0.82, 0.75, and 0.82 at days 7, 10, and 14, respectively). Ellipsoid volume calculation allowed temporal monitoring and evaluation of CNV lesions in response to antivascular endothelial growth factor treatment. Conclusions: Ellipsoid volume measurements allow rapid, quantitative use of OCT for the assessment of CNV lesions in vivo. This novel method can be used with different OCT imaging systems with sensitivity to distinguish between treatment conditions. It may serve as a useful adjunct to the standard ex vivo confocal quantification, to assess therapeutic efficacy in preclinical models of CNV, and in models of other ocular diseases. PMID:26060878
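
    If the lesion's orthogonal extents are read off the OCT scans, the volume follows from the standard ellipsoid formula V = (4/3)π(a/2)(b/2)(c/2). Which diameters the authors pair up is not spelled out in this record, so the example below is a generic sketch with made-up extents.

    ```python
    import math

    def ellipsoid_volume(d1, d2, d3):
        """Ellipsoid volume from three orthogonal diameters."""
        return (4.0 / 3.0) * math.pi * (d1 / 2) * (d2 / 2) * (d3 / 2)

    # horizontal, vertical and depth extents of a lesion in mm (illustrative):
    print(f"{ellipsoid_volume(0.30, 0.24, 0.10):.4f} mm^3")
    ```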

  12. Detection and quantification of Bacillus cereus group in milk by droplet digital PCR.

    PubMed

    Porcellato, Davide; Narvhus, Judith; Skeie, Siv Borghild

    2016-08-01

    Droplet digital PCR (ddPCR) is one of the newest and most promising methods for the detection and quantification of molecular targets by PCR. Here, we optimized and used a new ddPCR assay for the detection and quantification of the Bacillus cereus group in milk. We also compared the ddPCR to a standard qPCR assay. The new ddPCR assay showed a similar coefficient of determination and a better limit of detection compared to the qPCR assay during quantification of the target molecules in the samples. However, the ddPCR assay has a limitation during quantification of a high number of target molecules. This new assay was then tested for the quantification of the B. cereus group in 90 milk samples obtained over three months from two different dairies and the milk was stored at different temperatures before sampling. The ddPCR assay showed good agreement with the qPCR assay for the quantification of the B. cereus group in milk, and due to its lower detection limit more samples were detected as positive. The new ddPCR assay is a promising method for the quantification of target bacteria in low concentration in milk. Copyright © 2016 Elsevier B.V. All rights reserved.
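
    At the heart of ddPCR is a Poisson correction: the mean number of copies per droplet is -ln(fraction of negative droplets), and the droplet volume converts this to a concentration. The droplet volume and counts below are rough, illustrative values, not data from the study.

    ```python
    import math

    def ddpcr_copies_per_ul(n_total, n_positive, droplet_volume_nl=0.85):
        neg_fraction = (n_total - n_positive) / n_total
        copies_per_droplet = -math.log(neg_fraction)             # Poisson mean
        return copies_per_droplet / (droplet_volume_nl * 1e-3)   # copies per uL

    print(f"{ddpcr_copies_per_ul(15000, 2300):.1f} copies/uL")
    ```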

  13. Absolute protein quantification of the yeast chaperome under conditions of heat shock.

    PubMed

    Mackenzie, Rebecca J; Lawless, Craig; Holman, Stephen W; Lanthaler, Karin; Beynon, Robert J; Grant, Chris M; Hubbard, Simon J; Eyers, Claire E

    2016-08-01

    Chaperones are fundamental to regulating the heat shock response, mediating protein recovery from thermal-induced misfolding and aggregation. Using the QconCAT strategy and selected reaction monitoring (SRM) for absolute protein quantification, we have determined copy per cell values for 49 key chaperones in Saccharomyces cerevisiae under conditions of normal growth and heat shock. This work extends a previous chemostat quantification study by including up to five Q-peptides per protein to improve confidence in protein quantification. In contrast to the global proteome profile of S. cerevisiae in response to heat shock, which remains largely unchanged as determined by label-free quantification, many of the chaperones are upregulated with an average two-fold increase in protein abundance. Interestingly, eight of the significantly upregulated chaperones are direct gene targets of heat shock transcription factor-1. By performing absolute quantification of chaperones under heat stress for the first time, we were able to evaluate the individual protein-level response. Furthermore, this SRM data was used to calibrate label-free quantification values for the proteome in absolute terms, thus improving relative quantification between the two conditions. This study significantly enhances the largely transcriptomic data available in the field and illustrates a more nuanced response at the protein level. © 2016 The Authors. Proteomics Published by Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Integral quantification accuracy estimation for reporter ion-based quantitative proteomics (iQuARI).

    PubMed

    Vaudel, Marc; Burkhart, Julia M; Radau, Sonja; Zahedi, René P; Martens, Lennart; Sickmann, Albert

    2012-10-05

    With the increasing popularity of comparative studies of complex proteomes, reporter ion-based quantification methods such as iTRAQ and TMT have become commonplace in biological studies. Their appeal derives from simple multiplexing and quantification of several samples at reasonable cost. This advantage comes with a known shortcoming, however: precursors of different species can interfere, thus reducing the quantification accuracy. Recently, two methods that alleviate interference through novel experimental designs were brought to the community. Before setting up a new workflow, tuning the system, optimizing identification and quantification rates, etc., one legitimately asks: is it really worth the effort, time and money? The question is not easy to answer, since the interference is heavily sample and system dependent. Moreover, to date there was no method allowing inline estimation of error rates for reporter quantification. We therefore introduce a method called iQuARI to compute false discovery rates for reporter ion based quantification experiments as easily as Target/Decoy FDR for identification. With it, scientists can accurately estimate the amount of interference in their sample on their system and then consider removing shadows, a task for which reporter ion quantification might not be the solution of choice.

  16. MRCQuant - an accurate LC-MS relative isotopic quantification algorithm on TOF instruments

    PubMed Central

    2011-01-01

    Background: Relative isotope abundance quantification, which can be used for peptide identification and differential peptide quantification, plays an important role in liquid chromatography-mass spectrometry (LC-MS)-based proteomics. However, several major issues exist in the relative isotopic quantification of peptides on time-of-flight (TOF) instruments: LC peak boundary detection, thermal noise suppression, interference removal and mass drift correction. We propose to use the Maximum Ratio Combining (MRC) method to extract MS signal templates for interference detection/removal and LC peak boundary detection. In our method, MRCQuant, MS templates are extracted directly from experimental values, and the mass drift in each LC-MS run is automatically captured and compensated. We compared the quantification accuracy of MRCQuant to that of another representative LC-MS quantification algorithm (msInspect) using datasets downloaded from a public data repository. Results: MRCQuant showed significant improvement in the number of accurately quantified peptides. Conclusions: MRCQuant effectively addresses major issues in the relative quantification of LC-MS-based proteomics data, and it provides improved performance in the quantification of low abundance peptides. PMID:21406110

  17. A new objective method for acquisition and quantification of reflex receptive fields.

    PubMed

    Jensen, Michael Brun; Manresa, José Biurrun; Andersen, Ole Kæseler

    2015-03-01

    The nociceptive withdrawal reflex (NWR) is a polysynaptic spinal reflex correlated with pain perception. Assessment of this objective physiological measure constitutes the core of existing methods for quantification of reflex receptive fields (RRFs), which, however, still suffer from a certain degree of subjective involvement. This article proposes a strictly objective methodology for RRF quantification based on automated identification of NWR thresholds (NWR-Ts). Nociceptive withdrawal reflex thresholds were determined for 10 individual stimulation sites using an interleaved up-down staircase method. Reflexes were detected from electromyography by evaluation of interval peak z scores and application of conduction velocity analysis. Reflex receptive field areas were quantified from interpolated mappings of NWR-Ts and compared with existing RRF quantifications. A total of 3 repeated measures were performed in 2 different sessions to evaluate the test-retest reliability of the various quantifications, using coefficients of repeatability (CRs) and hypothetical sample sizes. The novel quantifications based on identification of NWR-Ts showed a similar level of reliability within and between sessions, whereas existing quantifications all demonstrated worse between-session than within-session reliability. The NWR-T-based quantifications required a smaller sample size than any of the existing RRF measures to detect a clinically relevant effect in a crossover study design involving more than 1 session. Of all measures, quantification from mapping of inversed NWR-Ts demonstrated superior reliability both within (CR, 0.25) and between sessions (CR, 0.28). The study presents a more reliable and robust quantification of the RRF to be used as a biomarker of pain hypersensitivity in clinical and experimental research.
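
    An up-down staircase can be sketched with a toy deterministic "subject": raise the stimulus after a non-response, lower it after a reflex, and average the last reversal midpoints. Everything below (step size, detection model, units) is an illustrative assumption, not the study's protocol.

    ```python
    def staircase(true_threshold=12.0, start=5.0, step=1.0, n_trials=30):
        intensity, history = start, []
        for _ in range(n_trials):
            reflex = intensity >= true_threshold   # stand-in for EMG-based detection
            history.append((intensity, reflex))
            intensity += -step if reflex else step
        # reversals: trials where the response changed relative to the previous one
        reversals = [(history[i - 1][0] + history[i][0]) / 2
                     for i in range(1, len(history))
                     if history[i][1] != history[i - 1][1]]
        return sum(reversals[-6:]) / len(reversals[-6:])   # mean of last reversal midpoints

    print(f"estimated NWR threshold ~ {staircase():.1f} mA")
    ```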

  18. Label-free relative quantification method for low-abundance glycoproteins in human serum by micrOTOF-Q.

    PubMed

    Hao, Piliang; Ren, Yan; Xie, Yongming

    2009-06-01

    In this study, a label-free relative quantification strategy was developed for quantifying low-abundance glycoproteins in human serum. It included three steps: (1) immunodepletion of 12 high-abundance proteins, (2) enrichment of low-abundance glycoproteins by a multi-lectin column, (3) relative quantification between different samples by micrOTOF-Q. We also evaluated the specificity and efficiency of the immunodepletion, the accuracy of protein quantification, and the possible influence of immunodepletion, glycoprotein enrichment, trypsin digestion and peptide ionization on quantification. In conclusion, the relative quantification method can be effectively applied to the screening of low-abundance biomarkers.

  19. Objective Morphological Quantification of Microscopic Images Using a Fast Fourier Transform (FFT) Analysis.

    PubMed

    Taylor, Samuel E; Cao, Tuoxin; Talauliker, Pooja M; Lifshitz, Jonathan

    Quantification of immunohistochemistry (IHC) and immunofluorescence (IF) using image intensity depends on a number of variables. These variables add a subjective complexity in keeping a standard within and between laboratories. Fast Fourier Transformation (FFT) algorithms, however, allow for a rapid and objective quantification (via statistical analysis) using cell morphologies when the microscopic structures are oriented or aligned. Quantification of alignment is given in terms of a ratio of FFT intensity to the intensity of an orthogonal angle, giving a numerical value of the alignment of the microscopic structures. This allows for a more objective analysis than alternative approaches, which rely upon relative intensities.

  20. Objective Morphological Quantification of Microscopic Images Using a Fast Fourier Transform (FFT) Analysis

    PubMed Central

    Taylor, Samuel E.; Cao, Tuoxin; Talauliker, Pooja M.; Lifshitz, Jonathan

    2016-01-01

    Quantification of immunohistochemistry (IHC) and immunofluorescence (IF) using image intensity depends on a number of variables. These variables add a subjective complexity in keeping a standard within and between laboratories. Fast Fourier Transformation (FFT) algorithms, however, allow for a rapid and objective quantification (via statistical analysis) using cell morphologies when the microscopic structures are oriented or aligned. Quantification of alignment is given in terms of a ratio of FFT intensity to the intensity of an orthogonal angle, giving a numerical value of the alignment of the microscopic structures. This allows for a more objective analysis than alternative approaches, which rely upon relative intensities. PMID:27134700

  1. UV-Vis as quantification tool for solubilized lignin following a single-shot steam process.

    PubMed

    Lee, Roland A; Bédard, Charles; Berberi, Véronique; Beauchet, Romain; Lavoie, Jean-Michel

    2013-09-01

    In this short communication, UV-Vis was used as an analytical tool for the quantification of lignin concentrations in aqueous media. A significant correlation was determined between absorbance and the concentration of lignin in solution. For this study, lignin was produced from different types of biomass (willow, aspen, softwood, canary grass and hemp) using steam processes. Quantification was performed at 212, 225, 237, 270, 280 and 287 nm. UV-Vis quantification of lignin was found suitable for different types of biomass, making this a time-saving analytical method that could serve as a Process Analytical Tool (PAT) in biorefineries utilizing steam processes or comparable approaches.
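
    The underlying quantification step is an ordinary Beer-Lambert calibration: fit absorbance against known lignin concentrations at a fixed wavelength, then invert the fitted line for unknown samples. A minimal sketch with made-up calibration points (the paper's actual calibration data and wavelength-specific coefficients are not reproduced here):

      import numpy as np

      # Hypothetical calibration standards: lignin concentration (mg/L)
      # vs. absorbance at one of the working wavelengths.
      conc = np.array([0.0, 10.0, 20.0, 40.0, 80.0])
      absorbance = np.array([0.002, 0.151, 0.298, 0.601, 1.204])

      slope, intercept = np.polyfit(conc, absorbance, 1)  # Beer-Lambert: A = eps*l*c

      def lignin_conc(a_sample):
          # Invert the calibration line to get concentration from absorbance.
          return (a_sample - intercept) / slope

      print(lignin_conc(0.45))   # ~30 mg/L for this synthetic calibration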

  2. Comparison of GC and HPLC for the quantification of organic acids in coffee.

    PubMed

    Jham, Gulab N; Fernandes, Sergio A; Garcia, Cleverson Fernando; da Silva, Alexsandro Araujo

    2002-01-01

    A GC and an HPLC method for the quantification of organic acids (OAs) in coffee have been compared. The GC procedure, employing trimethylsilyl derivatives, was found to be very tedious. The HPLC method, which employed an ion-exchange column using a flow gradient of water containing 1% phosphoric acid and UV detection (210 nm), was found to be much simpler for the quantification of eight organic acids (oxalic, succinic, fumaric, malic, tartaric, citric, quinic and fumaric acids) in four representative coffee samples. The HPLC procedure was more convenient than that described in the literature, since no pre-purification was required for quantification of the OAs.

  3. Uncertainty Quantification Bayesian Framework for Porous Media Flows

    NASA Astrophysics Data System (ADS)

    Demyanov, V.; Christie, M.; Erbas, D.

    2005-12-01

    Uncertainty quantification is an increasingly important aspect of many areas of applied science, where the challenge is to make reliable predictions about the performance of complex physical systems in the absence of complete or reliable data. Predicting flows of fluids through subsurface reservoirs is an example of a complex system where accuracy in prediction is needed (e.g., in the oil industry it is essential for financial reasons). Simulation of fluid flow in oil reservoirs is usually carried out using large, commercially written finite-difference simulators solving conservation equations describing the multi-phase flow through the porous reservoir rocks, which is a highly computationally expensive task. This work examines a Bayesian framework for uncertainty quantification in porous media flows that uses a stochastic sampling algorithm to generate models that match observed time series data. The framework is flexible for a wide range of general physical/statistical parametric models, which are used to describe the underlying hydro-geological process in its temporal dynamics. The approach is based on exploration of the parameter space and updating of the prior beliefs about the most likely model definitions. Optimization problems for highly parametric physical models usually have multiple solutions, which affect the uncertainty of the predictions. A stochastic search algorithm (e.g., a genetic algorithm) makes it possible to identify multiple "good enough" models in the parameter space. Furthermore, inference over the generated model ensemble via an MCMC-based algorithm evaluates the posterior probability of the generated models and quantifies the uncertainty of the predictions. A machine learning algorithm - artificial neural networks (ANNs) - is used to speed up the identification of regions in parameter space where good matches to observed data can be found. The adaptive nature of ANNs allows different ways of integrating them into the Bayesian framework: as direct time
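
    As a purely illustrative aside, the core of such a framework - stochastic sampling of model parameters weighted by their data mismatch - can be sketched in a few lines. The sketch below is a generic random-walk Metropolis sampler applied to an invented two-parameter decline-curve model; it is not the authors' code, and every name and number in it is assumed:

      import numpy as np

      def metropolis(misfit, x0, n_steps=5000, prop_scale=0.1, rng=None):
          # Random-walk Metropolis: accept a move with probability
          # exp(old_misfit - new_misfit), so the chain samples the posterior
          # implied by a Gaussian data-mismatch likelihood.
          rng = rng or np.random.default_rng()
          x, m = np.array(x0, float), misfit(x0)
          chain = []
          for _ in range(n_steps):
              xp = x + rng.normal(0, prop_scale, size=x.size)
              mp = misfit(xp)
              if np.log(rng.uniform()) < m - mp:   # accept/reject
                  x, m = xp, mp
              chain.append(x.copy())
          return np.array(chain)

      # Toy "history match": fit decline-curve parameters to observed rates.
      t = np.linspace(0, 10, 50)
      obs = 100 * np.exp(-0.3 * t) + np.random.default_rng(2).normal(0, 2, t.size)
      misfit = lambda p: 0.5 * np.sum((p[0] * np.exp(-p[1] * t) - obs) ** 2) / 2.0**2
      samples = metropolis(misfit, x0=[80.0, 0.5])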

  4. Development of hydrate risk quantification in oil and gas production

    NASA Astrophysics Data System (ADS)

    Chaudhari, Piyush N.

    order to reduce the parametric study that may require a long duration of time using The Colorado School of Mines Hydrate Kinetic Model (CSMHyK). The evolution of the hydrate plugging risk along flowline-riser systems is modeled for steady-state and transient operations, considering the effect of several critical parameters such as oil-hydrate slip, duration of shut-in, and water droplet size on a subsea tieback system. This research presents a novel platform for quantification of the hydrate plugging risk, which in turn will play an important role in improving and optimizing current hydrate management strategies. The predictive strength of the hydrate risk quantification and hydrate prediction models will have a significant impact on flow assurance engineering and design with respect to building safe and efficient hydrate management techniques for future deep-water developments.

  5. CURVATURE EFFECT QUANTIFICATION FOR IN-VIVO IR THERMOGRAPHY

    PubMed Central

    Cheng, Tze-Yuan; Deng, Daxiang; Herman, Cila

    2013-01-01

    Medical infrared (IR) imaging has become an important diagnostic tool over recent years. However, one underlying problem in medical diagnostics is associated with accurate quantification of body surface temperatures. This problem is caused by the artifacts induced by the curvature of objects, which leads to inaccurate temperature mapping and biased diagnostic results. Therefore, in our study, an experiment-based analysis is conducted to address the curvature effects on the 3D temperature reconstruction of the IR thermography image. For quantification purposes, an isothermal copper plate with a flat surface and a cylindrical metal container filled with water are imaged. For the flat surface, the tilting angle measured from the camera axis was varied incrementally from 0° to 60°, such that the effects of surface viewing angle and travel distance on the measured temperature could be explored. On the cylindrical curved surface, the points viewed from 0° to 90° with respect to the camera axis are simultaneously imaged at different temperature levels. The experimental data obtained for the flat surface indicate that both viewing-angle and distance effects become noticeable for angles over 40°. The travel distance contributes a minor change when compared with the viewing angle. The experimental results from the curved surface indicate that the curvature effect becomes pronounced when the viewing angle is larger than 60°. The measurement error on the curved surface is compared with the simulation using the non-dielectric model, and the normalized temperature difference relative to a 0° viewing angle was analyzed at six temperature levels. These results indicate that the linear formula associated with directional emissivity is a reasonable approximation for the measurement error, and the normalized error curves change consistently with viewing angle at various temperatures. Therefore, the analysis in this study implies that the directional emissivity based on the non

  6. Coral Pigments: Quantification Using HPLC and Detection by Remote Sensing

    NASA Technical Reports Server (NTRS)

    Cottone, Mary C.

    1995-01-01

    Widespread coral bleaching (loss of pigments of symbiotic dinoflagellates), and the corresponding decline in coral reef health worldwide, mandates the monitoring of coral pigmentation. Samples of the corals Porites compressa and P. lobata were collected from a healthy reef at Puako, Hawaii, and chlorophyll (chl) a, peridinin, and Beta-carotene (Beta-car) were quantified using reverse-phase high-performance liquid chromatography (HPLC). Detailed procedures are presented for the extraction of the coral pigments in 90% acetone, and the separation, identification, and quantification of the major zooxanthellar pigments using spectrophotometry and a modification of the HPLC system described by Mantoura and Llewellyn (1983). Beta-apo-8-carotenal was found to be inadequate as an internal standard, due to coelution with chl b and/or chl a allomer in the sample extracts. Improvements are suggested, which may result in better resolution of the major pigments and greater accuracy in quantification. Average concentrations of peridinin, chl a, and Beta-car in corals on the reef were 5.01, 8.59, and 0.29 micro-grams/cm(exp 2), respectively. Average concentrations of peridinin and Beta-car did not differ significantly between the two coral species sampled; however, the mean chl a concentration in P. compressa specimens (7.81 micro-grams/cm(exp 2)) was significantly lower than that in P. lobata specimens (9.96 micro-grams/cm(exp 2)). Chl a concentrations determined spectrophotometrically were significantly higher than those generated through HPLC, suggesting that spectrophotometry overestimates chl a concentrations. The average ratio of chl a-to-peridinin concentrations was 1.90, with a large (53%) coefficient of variation and a significant difference between the two species sampled. Additional data are needed before conclusions can be drawn regarding average pigment concentrations in healthy corals and the consistency of the chl a/peridinin ratio. The HPLC pigment concentration values

  7. Satellite Re-entry Modeling and Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Horsley, M.

    2012-09-01

    LEO trajectory modeling is a fundamental aerospace capability and has applications in many areas of aerospace, such as maneuver planning, sensor scheduling, re-entry prediction, collision avoidance, risk analysis, and formation flying. Somewhat surprisingly, modeling the trajectory of an object in low Earth orbit is still a challenging task. This is primarily due to the large uncertainty in the upper atmospheric density, about 15-20% (1-sigma) for most thermosphere models. Other contributions come from our inability to precisely model future solar and geomagnetic activity, the potentially unknown shape, material construction and attitude history of the satellite, and intermittent, noisy tracking data. Current methods to predict a satellite's re-entry trajectory typically involve making a single prediction, with the uncertainty dealt with in an ad hoc manner, usually based on past experience. However, due to the extreme speed of a LEO satellite, even small uncertainties in the re-entry time translate into a very large uncertainty in the location of the re-entry event. Currently, most methods simply update the re-entry estimate on a regular basis. This results in a wide range of estimates that are literally spread over the entire globe. With no understanding of the underlying distribution of potential impact points, the sequence of impact points predicted by the current methodology is largely useless until just a few hours before re-entry. This paper will discuss the development of a set of High Performance Computing (HPC)-based capabilities to support near real-time quantification of the uncertainty inherent in uncontrolled satellite re-entries. An appropriate management of the uncertainties is essential for a rigorous treatment of the re-entry/LEO trajectory problem. The development of HPC-based tools for re-entry analysis is important as it will allow a rigorous and robust approach to risk assessment by decision makers in an operational setting. Uncertainty

  8. Collaborative framework for PIV uncertainty quantification: the experimental database

    NASA Astrophysics Data System (ADS)

    Neal, Douglas R.; Sciacchitano, Andrea; Smith, Barton L.; Scarano, Fulvio

    2015-07-01

    The uncertainty quantification of particle image velocimetry (PIV) measurements has recently become a topic of great interest, as shown by the appearance of several different methods within the past few years. These approaches have different working principles, merits and limitations, which have been speculated upon in subsequent studies. This paper reports a unique experiment that has been performed specifically to test the efficacy of PIV uncertainty methods. The case of a rectangular jet, as previously studied by Timmins et al. (2012) and Wilson and Smith (2013b), is used. The novel aspect of the experiment is simultaneous velocity measurements using two different time-resolved PIV systems and a hot-wire anemometry (HWA) system. The first PIV system, called the PIV measurement system ('PIV-MS'), is intended for nominal measurements of which the uncertainty is to be evaluated. It is based on a single camera and features a dynamic velocity range (DVR) representative of typical PIV experiments. The second PIV system, called the 'PIV-HDR' (high dynamic range) system, features a significantly higher DVR obtained with a higher digital imaging resolution. The hot-wire is placed in close proximity to the PIV measurement domain. The three measurement systems were carefully set to simultaneously measure the flow velocity at the same time and location. The comparison between the PIV-HDR system and the HWA provides an estimate of the measurement precision of the reference velocity for evaluation of the instantaneous error in the measurement system. The discrepancy between the PIV-MS and the reference data provides the measurement error, which is later used to assess the different uncertainty quantification methods proposed in the literature. A detailed comparison of the uncertainty estimation methods based on the present datasets is presented in a second paper from Sciacchitano et al. (2015). Furthermore, this database offers the potential to be used for

  9. Modeling transport phenomena and uncertainty quantification in solidification processes

    NASA Astrophysics Data System (ADS)

    Fezi, Kyle S.

    Direct chill (DC) casting is the primary processing route for wrought aluminum alloys. This semicontinuous process consists of primary cooling as the metal is pulled through a water cooled mold followed by secondary cooling with a water jet spray and free falling water. To gain insight into this complex solidification process, a fully transient model of DC casting was developed to predict the transport phenomena of aluminum alloys for various conditions. This model is capable of solving mixture mass, momentum, energy, and species conservation equations during multicomponent solidification. Various DC casting process parameters were examined for their effect on transport phenomena predictions in an alloy of commercial interest (aluminum alloy 7050). The practice of placing a wiper to divert cooling water from the ingot surface was studied and the results showed that placement closer to the mold causes remelting at the surface and increases susceptibility to bleed outs. Numerical models of metal alloy solidification, like the one previously mentioned, are used to gain insight into physical phenomena that cannot be observed experimentally. However, uncertainty in model inputs cause uncertainty in results and those insights. The analysis of model assumptions and probable input variability on the level of uncertainty in model predictions has not been calculated in solidification modeling as yet. As a step towards understanding the effect of uncertain inputs on solidification modeling, uncertainty quantification (UQ) and sensitivity analysis were first performed on a transient solidification model of a simple binary alloy (Al-4.5wt.%Cu) in a rectangular cavity with both columnar and equiaxed solid growth models. This analysis was followed by quantifying the uncertainty in predictions from the recently developed transient DC casting model. The PRISM Uncertainty Quantification (PUQ) framework quantified the uncertainty and sensitivity in macrosegregation, solidification

  10. Computer Model Inversion and Uncertainty Quantification in the Geosciences

    NASA Astrophysics Data System (ADS)

    White, Jeremy T.

    The subject of this dissertation is use of computer models as data analysis tools in several different geoscience settings, including integrated surface water/groundwater modeling, tephra fallout modeling, geophysical inversion, and hydrothermal groundwater modeling. The dissertation is organized into three chapters, which correspond to three individual publication manuscripts. In the first chapter, a linear framework is developed to identify and estimate the potential predictive consequences of using a simple computer model as a data analysis tool. The framework is applied to a complex integrated surface-water/groundwater numerical model with thousands of parameters. Several types of predictions are evaluated, including particle travel time and surface-water/groundwater exchange volume. The analysis suggests that model simplifications have the potential to corrupt many types of predictions. The implementation of the inversion, including how the objective function is formulated, what minimum of the objective function value is acceptable, and how expert knowledge is enforced on parameters, can greatly influence the manifestation of model simplification. Depending on the prediction, failure to specifically address each of these important issues during inversion is shown to degrade the reliability of some predictions. In some instances, inversion is shown to increase, rather than decrease, the uncertainty of a prediction, which defeats the purpose of using a model as a data analysis tool. In the second chapter, an efficient inversion and uncertainty quantification approach is applied to a computer model of volcanic tephra transport and deposition. The computer model simulates many physical processes related to tephra transport and fallout. The utility of the approach is demonstrated for two eruption events. In both cases, the importance of uncertainty quantification is highlighted by exposing the variability in the conditioning provided by the observations used for

  11. The Effect of AOP on Software Engineering, with Particular Attention to OIF and Event Quantification

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Filman, Robert; Korsmeyer, David (Technical Monitor)

    2003-01-01

    We consider the impact of Aspect-Oriented Programming on Software Engineering and, in particular, analyze two AOP systems for their software engineering effects: one performs component wrapping, the other quantification over events.

  12. Best practices for metabolite quantification in drug development: updated recommendation from the European Bioanalysis Forum.

    PubMed

    Timmerman, Philip; Blech, Stefan; White, Stephen; Green, Martha; Delatour, Claude; McDougall, Stuart; Mannens, Geert; Smeraglia, John; Williams, Stephen; Young, Graeme

    2016-06-01

    Metabolite quantification and profiling continues to grow in importance in today's drug development. The guidance provided by the 2008 FDA Metabolites in Safety Testing Guidance and the subsequent ICH M3(R2) Guidance (2009) has led to a more streamlined process to assess metabolite exposures in preclinical and clinical studies in industry. In addition, the European Bioanalysis Forum (EBF) identified an opportunity to refine the strategies on metabolite quantification considering the experience to date with their recommendation paper on the subject dating from 2010 and integrating the recent discussions on the tiered approach to bioanalytical method validation with focus on metabolite quantification. The current manuscript summarizes the discussion and recommendations from a recent EBF Focus Workshop into an updated recommendation for metabolite quantification in drug development.

  13. Quantification of hesperidin in citrus-based foods using a fungal diglycosidase.

    PubMed

    Mazzaferro, Laura S; Breccia, Javier D

    2012-10-15

    A simple enzymatic-spectrophotometric method for hesperidin quantification was developed by means of a specific fungal enzyme. The method utilises the diglycosidase α-rhamnosyl-β-glucosidase (EC 3.2.1.168) to quantitatively hydrolyse hesperidin to hesperetin, and the latter is measured by its intrinsic absorbance in the UV range at 323 nm. The application of this method to quantify hesperidin in orange (Citrus sinensis) juices was shown to be reliable in comparison with the standard method for flavonoid quantification (high-performance liquid chromatography, HPLC). The enzymatic method was found to have a limit of quantification of 1.8 μM (1.1 mg/L) hesperidin, similar to the limit usually achieved by HPLC. Moreover, it could be applied directly to raw juice, without sample extraction. This feature eliminated the sample pre-treatment, which is mandatory for HPLC, with a consequent reduction of the time required for the quantification.
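
    A limit of quantification like the 1.8 μM reported here is conventionally estimated with the 10-sigma rule: LOQ = 10 x (blank standard deviation) / (calibration slope). The numbers below are invented but chosen so the result lands near the paper's figure; they are not the authors' calibration data:

      import numpy as np

      # Hypothetical calibration of absorbance at 323 nm vs. hesperetin (uM).
      conc = np.array([0.0, 2.0, 5.0, 10.0, 20.0])
      absorb = np.array([0.003, 0.041, 0.102, 0.205, 0.409])
      slope, _ = np.polyfit(conc, absorb, 1)

      sigma_blank = 0.0037             # assumed std. dev. of repeated blank readings
      loq = 10 * sigma_blank / slope   # common 10-sigma definition of the LOQ
      lod = 3.3 * sigma_blank / slope  # matching 3.3-sigma LOD
      print(round(loq, 2), "uM")       # ~1.8 uM with these invented values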

  14. Monte Carlo Simulation for Quantification of Light Transport Features in Apples

    USDA-ARS?s Scientific Manuscript database

    Light interaction with turbid biological materials involves absorption and scattering. Quantitative understanding of light propagation features in the fruit is critical to designing better optical systems for inspection of food quality. This article reports on the quantification of light propagation...

  15. Quantification of Rare Single-Molecule Species Based on Fluorescence Lifetime.

    PubMed

    Liu, Cong; Rastogi, Ajay; Yeh, Hsin-Chih

    2017-04-11

    Single-molecule tracking combined with fluorescence lifetime analysis can be a powerful tool for direct molecular quantification in solution. However, it is not clear what molecular identification accuracy and number of single-molecule tracks are required to achieve a precise quantification of rare molecular species. Here we carry out this calculation based on experimentally obtained single-molecule lifetime data and an unbiased ratio estimator. Our results indicate that even at the molecular identification accuracy of 0.99999, 1.8 million tracks are still required in order to achieve 95% confidence level in rare species quantification with relative error less than ±5%. Our work highlights the fundamental challenges that we are facing in precise single-molecule identification and quantification without amplification.
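
    The scale of the sampling problem follows from elementary binomial statistics: for a species present at fraction p, the 95% confidence half-width of the estimated fraction is roughly 1.96*sqrt(p(1-p)/N), so holding the relative error to 5% forces N to grow as 1/p. A back-of-the-envelope sketch of that calculation (idealized, ignoring the misidentification term the authors also account for):

      import numpy as np

      def tracks_needed(p, rel_err=0.05, z=1.96):
          # Number of single-molecule tracks N such that the binomial estimate
          # of a rare fraction p has a 95% CI half-width <= rel_err * p
          # (perfect molecular identification assumed).
          return int(np.ceil((z / rel_err) ** 2 * (1 - p) / p))

      print(tracks_needed(1e-3))   # ~1.5 million tracks for a 0.1% species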

  16. Support vector machine learning-based cerebral blood flow quantification for arterial spin labeling MRI.

    PubMed

    Wang, Ze

    2014-07-01

    To develop a multivariate machine learning classification-based cerebral blood flow (CBF) quantification method for arterial spin labeling (ASL) perfusion MRI. The label and control images of ASL MRI were separated using a machine-learning algorithm, the support vector machine (SVM). The perfusion-weighted image was subsequently extracted from the multivariate (all voxels) SVM classifier. Using the same pre-processing steps, the proposed method was compared with the standard ASL CBF quantification method using synthetic data and in-vivo ASL images. As compared with the conventional univariate approach, the proposed ASL CBF quantification method significantly improved the spatial signal-to-noise ratio (SNR) and image appearance of ASL CBF images. The multivariate machine learning-based classification is useful for ASL CBF quantification. Copyright © 2014 Wiley Periodicals, Inc.
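
    The core idea - train a classifier to separate label from control volumes and read a perfusion-weighted image off the learned hyperplane - can be sketched with scikit-learn on synthetic data. This is a schematic illustration only; the paper's pre-processing and in-vivo handling are not reproduced, and all sizes and values below are invented:

      import numpy as np
      from sklearn.svm import SVC

      # Synthetic ASL series: 20 label/control pairs, each volume flattened
      # to a voxel vector; perfusion adds signal only to control volumes.
      rng = np.random.default_rng(3)
      n_pairs, n_vox = 20, 500
      perfusion = np.zeros(n_vox)
      perfusion[200:260] = 0.8                              # "perfused" voxels
      control = rng.normal(100, 1, (n_pairs, n_vox)) + perfusion
      label = rng.normal(100, 1, (n_pairs, n_vox))

      X = np.vstack([control, label])
      y = np.array([1] * n_pairs + [0] * n_pairs)           # control vs. label

      clf = SVC(kernel="linear").fit(X, y)
      # The multivariate perfusion-weighted map is read off the hyperplane
      # normal: voxels that discriminate control from label carry perfusion.
      pw_map = clf.coef_.ravel()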

  17. Phase identification and quantification in a devitrified glass using homo- and heteronuclear solid-state NMR.

    PubMed

    Tricot, Grégory; Delevoye, Laurent; Palavit, Gérard; Montagne, Lionel

    2005-11-14

    A complex mixture resulting from the devitrification of an aluminophosphate glass has been studied for the first time using a combination of homo- and heteronuclear solid-state NMR sequences that offers the advantage of subsequent quantification.

  18. Efficient capture and simple quantification of circulating tumor cells using quantum dots and magnetic beads.

    PubMed

    Min, Hyegeun; Jo, Seong-Min; Kim, Hak-Sung

    2015-06-03

    Circulating tumor cells (CTCs) are valuable biomarkers for monitoring the status of cancer patients and drug efficacy. However, the number of CTCs in the blood is extremely low, and the isolation and detection of CTCs with high efficiency and sensitivity remain a challenge. Here, we present an approach to the efficient capture and simple quantification of CTCs using quantum dots and magnetic beads. Anti-EpCAM antibody-conjugated quantum dots are used for the targeting and quantification of CTCs, and quantum-dot-attached CTCs are isolated using anti-IgG-modified magnetic beads. Our approach is shown to result in a capture efficiency of about 70%-80%, enabling the simple quantification of captured CTCs based on the fluorescence intensity of the quantum dots. The present method can be used effectively for the efficient capture and simple quantification of CTCs for cancer diagnosis and monitoring.

  19. A flow quantification method using fluid dynamics regularization and MR tagging.

    PubMed

    Jiraraksopakun, Yuttapong; McDougall, Mary P; Wright, Steven M; Ji, Jim X

    2010-06-01

    This paper presents a new method for improved flow analysis and quantification using MRI. The method incorporates fluid dynamics to regularize the flow quantification from tagged MR images. Specifically, the flow quantification is formulated as a minimization problem based on the following: 1) the Navier-Stokes equation governing the fluid dynamics; 2) the flow continuity equation and boundary conditions; and 3) the data consistency constraint. The minimization is carried out using a genetic algorithm. This method is tested using both computer simulations and MR flow experiments. The results are evaluated using flow vector fields from the computational fluid dynamics software package as a reference, which show that the new method can achieve more realistic and accurate flow quantifications than the conventional method.

  20. Survey of Quantification and Distance Functions Used for Internet-based Weak-link Sociological Phenomena

    DTIC Science & Technology

    2016-03-01

    Final report on the survey of quantification and distance functions used for Internet-based weak-link sociological phenomena. Only fragments of the report documentation survive extraction; the recoverable text discusses improving the efficiency of the computation to speed up search, motivated by the explosive increase of webpages and Internet surfers.

  1. Recurrence quantification as potential bio-markers for diagnosis of pre-cancer

    NASA Astrophysics Data System (ADS)

    Mukhopadhyay, Sabyasachi; Pratiher, Sawon; Barman, Ritwik; Pratiher, Souvik; Pradhan, Asima; Ghosh, Nirmalya; Panigrahi, Prasanta K.

    2017-03-01

    In this paper, the spectroscopy signals are analyzed using recurrence plots (RPs), and recurrence quantification analysis (RQA) parameters are extracted from the RPs in order to classify the tissues into normal and different precancerous grades. Three RQA parameters were quantified to extract the important features in the spectroscopy data. These features were fed to different classifiers for classification. Simulation results validate the efficacy of recurrence quantification as a potential bio-marker for diagnosis of pre-cancer.
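
    As background, a recurrence plot thresholds the pairwise distances between points of a (possibly embedded) signal, and RQA parameters summarize its texture. A minimal Python sketch computing two common parameters, recurrence rate and determinism, on a toy signal (no embedding, main diagonal kept for simplicity; the paper does not specify which three parameters were used, so these are illustrative):

      import numpy as np

      def rqa_measures(x, eps, lmin=2):
          # Recurrence plot plus two basic RQA parameters: recurrence rate
          # (density of recurrent points) and determinism (fraction of
          # recurrent points lying on diagonal lines of length >= lmin).
          x = np.asarray(x, float)
          rp = (np.abs(x[:, None] - x[None, :]) < eps).astype(int)
          rr = rp.mean()
          n = len(x)
          diag_pts = 0
          for k in range(-(n - 1), n):
              run = 0
              for v in list(np.diagonal(rp, offset=k)) + [0]:
                  if v:
                      run += 1
                  else:
                      if run >= lmin:
                          diag_pts += run
                      run = 0
          det = diag_pts / max(rp.sum(), 1)
          return rr, det

      rr, det = rqa_measures(np.sin(np.linspace(0, 20, 200)), eps=0.1)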

  2. Preparation, imaging, and quantification of bacterial surface motility assays.

    PubMed

    Morales-Soto, Nydia; Anyan, Morgen E; Mattingly, Anne E; Madukoma, Chinedu S; Harvey, Cameron W; Alber, Mark; Déziel, Eric; Kearns, Daniel B; Shrout, Joshua D

    2015-04-07

    Bacterial surface motility, such as swarming, is commonly examined in the laboratory using plate assays that necessitate specific concentrations of agar and sometimes inclusion of specific nutrients in the growth medium. The preparation of such explicit media and surface growth conditions serves to provide the favorable conditions that allow not just bacterial growth but coordinated motility of bacteria over these surfaces within thin liquid films. Reproducibility of swarm plate and other surface motility plate assays can be a major challenge. Especially for more "temperate swarmers" that exhibit motility only within agar ranges of 0.4%-0.8% (wt/vol), minor changes in protocol or laboratory environment can greatly influence swarm assay results. "Wettability", or water content at the liquid-solid-air interface of these plate assays, is often a key variable to be controlled. An additional challenge in assessing swarming is how to quantify observed differences between any two (or more) experiments. Here we detail a versatile two-phase protocol to prepare and image swarm assays. We include guidelines to circumvent the challenges commonly associated with swarm assay media preparation and quantification of data from these assays. We specifically demonstrate our method using bacteria that express fluorescent or bioluminescent genetic reporters like green fluorescent protein (GFP), luciferase (lux operon), or cellular stains to enable time-lapse optical imaging. We further demonstrate the ability of our method to track competing swarming species in the same experiment.

  3. Interactive image quantification tools in nuclear material forensics

    NASA Astrophysics Data System (ADS)

    Porter, Reid; Ruggiero, Christy; Hush, Don; Harvey, Neal; Kelly, Patrick; Scoggins, Wayne; Tandon, Lav

    2011-03-01

    Morphological and microstructural features visible in microscopy images of nuclear materials can give information about the processing history of a nuclear material. Extraction of these attributes currently requires a subject matter expert in both microscopy and nuclear material production processes, and is a time consuming, and at least partially manual task, often involving multiple software applications. One of the primary goals of computer vision is to find ways to extract and encode domain knowledge associated with imagery so that parts of this process can be automated. In this paper we describe a user-in-the-loop approach to the problem which attempts to both improve the efficiency of domain experts during image quantification as well as capture their domain knowledge over time. This is accomplished through a sophisticated user-monitoring system that accumulates user-computer interactions as users exploit their imagery. We provide a detailed discussion of the interactive feature extraction and segmentation tools we have developed and describe our initial results in exploiting the recorded user-computer interactions to improve user productivity over time.

  4. VESGEN Software for Mapping and Quantification of Vascular Regulators

    NASA Technical Reports Server (NTRS)

    Parsons-Wingerter, Patricia A.; Vickerman, Mary B.; Keith, Patricia A.

    2012-01-01

    VESsel GENeration (VESGEN) Analysis is automated software that maps and quantifies the effects of vascular regulators on vascular morphology by analyzing important vessel parameters. Quantification parameters include vessel diameter, length, branch points, density, and fractal dimension. For vascular trees, measurements are reported as dependent functions of vessel branching generation. VESGEN maps and quantifies vascular morphological events according to fractal-based vascular branching generation. It also relies on careful imaging of branching and networked vascular form. It was developed as a plug-in for ImageJ (National Institutes of Health, USA). VESGEN uses the image-processing concepts of 8-neighbor pixel connectivity, skeleton, and distance map to analyze 2D, black-and-white (binary) images of vascular trees, networks, and tree-network composites. VESGEN maps typically 5 to 12 (or more) generations of vascular branching, starting from a single parent vessel. These generations are tracked and measured for critical vascular parameters that include vessel diameter, length, density and number, and tortuosity per branching generation. The effects of vascular therapeutics and regulators on vascular morphology and branching tested in human clinical or laboratory animal experimental studies are quantified by comparing vascular parameters with those of control groups. VESGEN provides a user interface to both guide and allow control over the user's vascular analysis process. An option is provided to select a morphological tissue type of vascular trees, networks or tree-network composites, which determines the general collections of algorithms, intermediate images, and output images and measurements that will be produced.
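
    The skeleton-plus-distance-map idea that VESGEN builds on can be illustrated compactly: the Euclidean distance transform of a binary vessel mask gives the local radius at each medial-axis pixel. A hypothetical sketch using scipy and scikit-image (not the VESGEN/ImageJ implementation; the toy mask and the pixel-center correction are assumptions):

      import numpy as np
      from scipy.ndimage import distance_transform_edt
      from skimage.morphology import skeletonize

      def vessel_diameters(binary_img):
          # Estimate local vessel diameters from a binary vessel mask:
          # the Euclidean distance map gives the radius at each pixel of the
          # medial-axis skeleton, so diameter ~= 2 * distance - 1
          # (the -1 is a crude pixel-center correction).
          skel = skeletonize(binary_img.astype(bool))
          dist = distance_transform_edt(binary_img)
          return 2.0 * dist[skel] - 1.0

      # Toy mask: a horizontal vessel 7 pixels thick.
      img = np.zeros((50, 50), dtype=bool)
      img[22:29, 5:45] = True
      print(vessel_diameters(img).mean())   # roughly 7 pixels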

  5. Mesh refinement for uncertainty quantification through model reduction

    SciTech Connect

    Li, Jing Stinis, Panos

    2015-01-01

    We present a novel way of deciding when and where to refine a mesh in probability space in order to facilitate uncertainty quantification in the presence of discontinuities in random space. A discontinuity in random space makes the application of generalized polynomial chaos expansion techniques prohibitively expensive, because for discontinuous problems the expansion converges very slowly. An alternative to using higher-order terms in the expansion is to divide the random space into smaller elements where a lower-degree polynomial is adequate to describe the randomness. In general, the partition of the random space is a dynamic process, since some areas of the random space, particularly around the discontinuity, need more refinement than others as time evolves. In the current work, we propose a way to decide when and where to refine the random-space mesh based on the use of a reduced model. The idea is that a good reduced model can monitor accurately, within a random-space element, the cascade of activity to higher-degree terms in the chaos expansion. In turn, this facilitates the efficient allocation of computational resources to the areas of random space where they are most needed. For the Kraichnan–Orszag system, the prototypical system for studying discontinuities in random space, we present theoretical results which show why the proposed method is sound and numerical results which corroborate the theory.
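
    A simplified stand-in for the refinement decision is a coefficient-decay indicator: if the highest-degree polynomial-chaos coefficients in an element still carry a non-negligible share of the variance, the local expansion is converging slowly (e.g., near a discontinuity) and the element is a candidate for splitting. The paper monitors this cascade with a reduced model; the sketch below uses the raw coefficients directly, and all thresholds are invented:

      import numpy as np

      def needs_refinement(pc_coeffs, tail_frac=0.05):
          # Refinement indicator for one random-space element: flag it when
          # the last two polynomial-chaos modes carry more than tail_frac of
          # the element's variance (mean term excluded).
          c = np.asarray(pc_coeffs, float)
          var = np.sum(c[1:] ** 2)       # total variance of the expansion
          tail = np.sum(c[-2:] ** 2)     # energy in the highest-degree modes
          return var > 0 and tail / var > tail_frac

      print(needs_refinement([1.0, 0.5, 0.3, 0.25, 0.2]))     # True: slow decay
      print(needs_refinement([1.0, 0.5, 0.05, 0.004, 1e-4]))  # False: fast decay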

  6. Characterisation and quantification of regional diurnal SST cycles from SEVIRI

    NASA Astrophysics Data System (ADS)

    Karagali, I.; Høyer, J. L.

    2014-09-01

    Hourly SST (sea surface temperature) fields from the geostationary Spinning Enhanced Visible and Infrared Imager (SEVIRI) offer a unique opportunity for the characterisation and quantification of the diurnal cycle of SST in the Atlantic Ocean, the Mediterranean Sea and the northern European shelf seas. Six years of SST fields from SEVIRI are validated against the Advanced Along-Track Scanning Radiometer (AATSR) Reprocessed for Climate (ARC) data set. The overall SEVIRI-AATSR bias is -0.07 K, and the standard deviation is 0.51 K, based on more than 53 × 10^6 match-ups. Identification of the diurnal signal requires an SST foundation temperature field representative of well-mixed conditions, which typically occur at night-time or under moderate and strong winds. Such fields are generated from the SEVIRI archive and are validated against pre-dawn SEVIRI SSTs and night-time SSTs from drifting buoys. The different methodologies tested for the foundation temperature fields reveal variability introduced by averaging night-time SSTs over many days compared to single-day, pre-dawn values. Diurnal warming is most pronounced in the Mediterranean and Baltic seas, while weaker diurnal signals are found in the tropics. Longer diurnal warming duration is identified in the high latitudes compared to the tropics. The maximum monthly mean diurnal signal can be up to 0.5 K in specific regions.
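
    Operationally, the diurnal signal is just hourly SST minus a foundation temperature; the methodologies compared in the paper differ in how that foundation field is built. A toy sketch using a same-day pre-dawn mean as the foundation (one simple choice among those the paper tests; all values invented):

      import numpy as np

      def diurnal_warming(hourly_sst, pre_dawn_hours=(2, 3, 4, 5)):
          # Diurnal signal as hourly SST minus a foundation temperature
          # taken as the same day's pre-dawn mean.
          hourly_sst = np.asarray(hourly_sst, float)   # 24 hourly values (K)
          foundation = hourly_sst[list(pre_dawn_hours)].mean()
          return hourly_sst - foundation

      # Synthetic day with a 0.5 K afternoon peak.
      hours = np.arange(24)
      sst = 288.0 + 0.5 * np.exp(-0.5 * ((hours - 14) / 3.0) ** 2)
      print(diurnal_warming(sst).max())   # ~0.5 K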

  7. Characterisation and quantification of regional diurnal SST cycles from SEVIRI

    NASA Astrophysics Data System (ADS)

    Karagali, I.; Høyer, J. L.

    2014-04-01

    Hourly SST fields from the geostationary Spinning Enhanced Visible and Infrared Imager (SEVIRI) offer a unique opportunity for the characterisation and quantification of the diurnal cycle of SST in the Atlantic Ocean, the Mediterranean Sea and the northern European shelf seas. Six years of SST fields from the SEVIRI dataset are validated against the polar-orbiting Advanced Along-Track Scanning Radiometer (AATSR) archive to identify biases in the SEVIRI data. Identification of the diurnal signal requires a night-time SST field representative of foundation temperatures, i.e. well-mixed conditions and free of any diurnal signal. Such fields are generated from the SEVIRI archive and are validated against pre-dawn SEVIRI SSTs and night-time SSTs from drifting buoys. The overall SEVIRI-AATSR bias is -0.07 K, and the standard deviation is 0.51 K, based on more than 53 × 10^6 match-ups. The different methodologies tested for the foundation temperature fields reveal variability introduced by averaging night-time SSTs over many days compared to single-day, pre-dawn values. Diurnal warming is most pronounced in the Mediterranean and Baltic Seas, while the smallest diurnal signals are found in the tropics. Longer diurnal warming duration is identified in the high latitudes compared to the tropics. The mean diurnal signal of monthly mean SST can be up to 0.5 K in specific regions.

  8. Quantification of human upper extremity nerves and fascicular anatomy.

    PubMed

    Brill, Natalie A; Tyler, Dustin J

    2017-09-01

    In this study we provide detailed quantification of upper extremity nerve and fascicular anatomy. The purpose is to provide values and trends in neural features useful for clinical applications and neural interface device design. Nerve cross-sections were taken from 4 ulnar, 4 median, and 3 radial nerves from 5 arms of 3 human cadavers. Quantified nerve features included cross-sectional area, minor diameter, and major diameter. Fascicular features analyzed included count, perimeter, area, and position. Mean fascicular diameters were 0.57 ± 0.39, 0.6 ± 0.3, 0.5 ± 0.26 mm in the upper arm and 0.38 ± 0.18, 0.47 ± 0.18, 0.4 ± 0.27 mm in the forearm of ulnar, median, and radial nerves, respectively. Mean fascicular diameters were inversely proportional to fascicle count. Detailed quantitative anatomy of upper extremity nerves is a resource for design of neural electrodes, guidance in extraneural procedures, and improved neurosurgical planning. Muscle Nerve 56: 463-471, 2017. © 2016 Wiley Periodicals, Inc.

  9. Accuracy and Precision of Radioactivity Quantification in Nuclear Medicine Images

    PubMed Central

    Frey, Eric C.; Humm, John L.; Ljungberg, Michael

    2012-01-01

    The ability to reliably quantify activity in nuclear medicine has a number of increasingly important applications. Dosimetry for targeted therapy treatment planning or for approval of new imaging agents requires accurate estimation of the activity in organs, tumors, or voxels at several imaging time points. Another important application is the use of quantitative metrics derived from images, such as the standard uptake value commonly used in positron emission tomography (PET), to diagnose and follow treatment of tumors. These measures require quantification of organ or tumor activities in nuclear medicine images. However, there are a number of physical, patient, and technical factors that limit the quantitative reliability of nuclear medicine images. There have been a large number of improvements in instrumentation, including the development of hybrid single-photon emission computed tomography/computed tomography and PET/computed tomography systems, and reconstruction methods, including the use of statistical iterative reconstruction methods, which have substantially improved the ability to obtain reliable quantitative information from planar, single-photon emission computed tomography, and PET images. PMID:22475429

  10. Uncertainty quantification in nanomechanical measurements using the atomic force microscope.

    PubMed

    Wagner, Ryan; Moon, Robert; Pratt, Jon; Shaw, Gordon; Raman, Arvind

    2011-11-11

    Quantifying uncertainty in measured properties of nanomaterials is a prerequisite for the manufacture of reliable nanoengineered materials and products. Yet, rigorous uncertainty quantification (UQ) is rarely applied for material property measurements with the atomic force microscope (AFM), a widely used instrument that can measure properties at nanometer scale resolution of both inorganic and biological surfaces and nanomaterials. We present a framework to ascribe uncertainty to local nanomechanical properties of any nanoparticle or surface measured with the AFM by taking into account the main uncertainty sources inherent in such measurements. We demonstrate the framework by quantifying uncertainty in AFM-based measurements of the transverse elastic modulus of cellulose nanocrystals (CNCs), an abundant, plant-derived nanomaterial whose mechanical properties are comparable to Kevlar fibers. For a single, isolated CNC the transverse elastic modulus was found to have a mean of 8.1 GPa and a 95% confidence interval of 2.7-20 GPa. A key result is that multiple replicates of force-distance curves do not sample the important sources of uncertainty, which are systematic in nature. The dominant source of uncertainty is the nondimensional photodiode sensitivity calibration rather than the cantilever stiffness or Z-piezo calibrations. The results underscore the great need for, and open a path towards, quantifying and minimizing uncertainty in AFM-based material property measurements of nanoparticles, nanostructured surfaces, thin films, polymers and biomaterials.
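
    The framework's Monte Carlo flavor can be illustrated by propagating assumed calibration uncertainties through a toy modulus model. The scaling below (modulus proportional to the stiffness and Z-piezo factors and inversely proportional to sensitivity squared) and all percentages are invented for illustration; the true sensitivity of the modulus to each factor depends on the contact-mechanics model used:

      import numpy as np

      rng = np.random.default_rng(4)
      n = 100_000

      # Assumed relative uncertainties for the main calibration factors
      # (illustrative values; the paper identifies photodiode sensitivity
      # as dominant and notes these sources are systematic).
      sens = rng.normal(1.0, 0.15, n)     # photodiode sensitivity
      stiff = rng.normal(1.0, 0.05, n)    # cantilever stiffness
      zpiezo = rng.normal(1.0, 0.02, n)   # Z-piezo calibration

      E_nominal = 8.1                     # GPa, nominal transverse modulus
      # Toy propagation model: scale the nominal modulus by the factors.
      E = E_nominal * stiff * zpiezo / sens**2
      lo, hi = np.percentile(E, [2.5, 97.5])
      print(f"95% CI: {lo:.1f}-{hi:.1f} GPa")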

  11. Non-intrusive uncertainty quantification using reduced cubature rules

    NASA Astrophysics Data System (ADS)

    van den Bos, L. M. M.; Koren, B.; Dwight, R. P.

    2017-03-01

    For the purpose of uncertainty quantification with collocation, a method is proposed for generating families of one-dimensional nested quadrature rules with positive weights and symmetric nodes. This is achieved through a reduction procedure: we start with a high-degree quadrature rule with positive weights and remove nodes while preserving symmetry and positivity. This is shown to be always possible, by a lemma depending primarily on Carathéodory's theorem. The resulting one-dimensional rules can be used within a Smolyak procedure to produce sparse multi-dimensional rules, but weight positivity is then lost. As a remedy, the reduction procedure is applied directly to multi-dimensional tensor-product cubature rules. This makes it possible to produce a family of sparse cubature rules with positive weights, competitive with Smolyak rules. Finally, the positivity constraint is relaxed to allow more flexibility in the removal of nodes. This gives a second family of sparse cubature rules, in which as many nodes as possible are removed iteratively. The new quadrature and cubature rules are applied to test problems from mathematics and fluid dynamics. Their performance is compared with that of the tensor-product and standard Clenshaw-Curtis Smolyak cubature rules.

  12. Quantification of hydroxyacetone and glycolaldehyde using chemical ionization mass spectrometry

    NASA Astrophysics Data System (ADS)

    St. Clair, J. M.; Spencer, K. M.; Beaver, M. R.; Crounse, J. D.; Paulot, F.; Wennberg, P. O.

    2014-04-01

    Chemical ionization mass spectrometry (CIMS) enables online, rapid, in situ detection and quantification of hydroxyacetone and glycolaldehyde. Two different CIMS approaches are demonstrated employing the strengths of single quadrupole mass spectrometry and triple quadrupole (tandem) mass spectrometry. Both methods are generally capable of the measurement of hydroxyacetone, an analyte with known but minimal isobaric interferences. Tandem mass spectrometry provides direct separation of the isobaric compounds glycolaldehyde and acetic acid using distinct, collision-induced dissociation daughter ions. The single quadrupole CIMS measurement of glycolaldehyde was demonstrated during the ARCTAS-CARB (Arctic Research of the Composition of the Troposphere from Aircraft and Satellites - California Air Resources Board) 2008 campaign, while triple quadrupole CIMS measurements of glycolaldehyde and hydroxyacetone were demonstrated during the BEARPEX (Biosphere Effects on Aerosols and Photochemistry Experiment) 2009 campaign. Enhancement ratios of glycolaldehyde in ambient biomass-burning plumes are reported for the ARCTAS-CARB campaign. BEARPEX observations are compared to simple photochemical box model predictions of biogenic volatile organic compound oxidation at the site.

  13. An Uncertainty Quantification System for Tabular Equations of State

    NASA Astrophysics Data System (ADS)

    Carpenter, John; Robinson, Allen; Debusschere, Bert; Mattsson, Ann; Drake, Richard; Rider, William

    2013-06-01

    Providing analysts with information regarding the accuracy of computational models is key to enabling predictive design and engineering. Uncertainty in material models can make significant contributions to the overall uncertainty in calculations. As a first step toward tackling this large problem, we present an uncertainty quantification system for tabular equations of state (EOS). First, a posterior distribution of EOS model parameters is inferred using Bayes' rule and a set of experimental and computational data. EOS tables are generated for parameter states sampled from the posterior distribution. A new unstructured triangular table format allows for capturing multi-phase model behavior. A principal component analysis then reduces this set of tables to a mean table and the most significant perturbations. This final set of tables is provided to hydrocodes for performing simulations using standard non-intrusive uncertainty propagation methods. A multi-phase aluminum model is used to demonstrate the system. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
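
    The table-reduction step is a standard principal component analysis of the sampled ensemble: subtract the mean table, take an SVD, and keep the leading modes as "perturbation tables". A generic numpy sketch on stand-in data (the grid sizes, the stand-in ensemble, and the 99% variance cutoff are all invented):

      import numpy as np

      # Assumed ensemble: each row is one EOS table (e.g., pressure on a
      # fixed density-temperature grid), generated from a posterior sample.
      rng = np.random.default_rng(5)
      n_tables, n_cells = 200, 1024
      ensemble = rng.normal(0, 1, (n_tables, n_cells)).cumsum(axis=1)

      mean_table = ensemble.mean(axis=0)
      U, s, Vt = np.linalg.svd(ensemble - mean_table, full_matrices=False)

      # Keep the smallest number of modes capturing 99% of the variance.
      k = np.searchsorted(np.cumsum(s**2) / np.sum(s**2), 0.99) + 1
      modes = Vt[:k]    # most significant perturbation tables
      print(f"{k} modes capture 99% of the ensemble variance")
      # Any sampled table is then ~ mean_table + coefficients @ modes.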

  14. Micelle mediated trace level sulfide quantification through cloud point extraction.

    PubMed

    Devaramani, Samrat; Malingappa, Pandurangappa

    2012-01-01

    A simple cloud point extraction protocol has been proposed for the quantification of sulfide at trace level. The method is based on the reduction of iron (III) to iron (II) by the sulfide and the subsequent complexation of metal ion with nitroso-R salt in alkaline medium. The resulting green-colored complex was extracted through cloud point formation using cationic surfactant, that is, cetylpyridinium chloride, and the obtained surfactant phase was homogenized by ethanol before its absorbance measurement at 710 nm. The reaction variables like metal ion, ligand, surfactant concentration, and medium pH on the cloud point extraction of the metal-ligand complex have been optimized. The interference effect of the common anions and cations was studied. The proposed method has been successfully applied to quantify the trace level sulfide in the leachate samples of the landfill and water samples from bore wells and ponds. The validity of the proposed method has been studied by spiking the samples with known quantities of sulfide as well as comparing with the results obtained by the standard method.

  15. Quantification of intracellular payload release from polymersome nanoparticles

    NASA Astrophysics Data System (ADS)

    Scarpa, Edoardo; Bailey, Joanne L.; Janeczek, Agnieszka A.; Stumpf, Patrick S.; Johnston, Alexander H.; Oreffo, Richard O. C.; Woo, Yin L.; Cheong, Ying C.; Evans, Nicholas D.; Newman, Tracey A.

    2016-07-01

    Polymersome nanoparticles (PMs) are attractive candidates for spatio-temporally controlled delivery of therapeutic agents. Although many studies have addressed cellular uptake of solid nanoparticles, there are very little data available on intracellular release of molecules encapsulated in membranous carriers, such as polymersomes. Here, we addressed this by developing a quantitative assay based on the hydrophilic dye fluorescein. Fluorescein was encapsulated stably in PMs of mean diameter 85 nm, with minimal leakage after sustained dialysis. No fluorescence was detectable from fluorescein PMs, indicating quenching. Following incubation of L929 cells with fluorescein PMs, there was a gradual increase in intracellular fluorescence, indicating PM disruption and cytosolic release of fluorescein. By combining absorbance measurements with flow cytometry, we quantified the real-time intracellular release of fluorescein at single-cell resolution. We found that 173 ± 38 polymersomes released their payload per cell, with significant heterogeneity in uptake, despite controlled synchronisation of the cell cycle. This novel method for quantification of the release of compounds from nanoparticles provides fundamental information on cellular uptake of nanoparticle-encapsulated compounds. It also illustrates the stochastic nature of population distribution in homogeneous cell populations, a factor that must be taken into account in clinical use of this technology.

  16. Simulation Credibility: Advances in Verification, Validation, and Uncertainty Quantification

    NASA Technical Reports Server (NTRS)

    Mehta, Unmeel B. (Editor); Eklund, Dean R.; Romero, Vicente J.; Pearce, Jeffrey A.; Keim, Nicholas S.

    2016-01-01

    Decision makers and other users of simulations need to know quantified simulation credibility to make simulation-based critical decisions and effectively use simulations, respectively. The credibility of a simulation is quantified by its accuracy in terms of uncertainty, and the responsibility of establishing credibility lies with the creator of the simulation. In this volume, we present some state-of-the-art philosophies, principles, and frameworks. The contributing authors involved in this publication have been dedicated to advancing simulation credibility. They detail and provide examples of key advances over the last 10 years in the processes used to quantify simulation credibility: verification, validation, and uncertainty quantification. The philosophies and assessment methods presented here are anticipated to be useful to other technical communities conducting continuum physics-based simulations; for example, issues related to the establishment of simulation credibility in the discipline of propulsion are discussed. We envision that simulation creators will find this volume very useful to guide and assist them in quantitatively conveying the credibility of their simulations.

  17. Quantification of thyroid volume using 3-D ultrasound imaging.

    PubMed

    Kollorz, E K; Hahn, D A; Linke, R; Goecke, T W; Hornegger, J; Kuwert, T

    2008-04-01

    Ultrasound (US) is among the most popular diagnostic techniques today. It is non-invasive, fast, comparatively cheap, and does not require ionizing radiation. US is commonly used to examine the size and structure of the thyroid gland. In clinical routine, thyroid imaging is usually performed by means of 2-D US. Conventional approaches for measuring the volume of the thyroid gland or its nodules may therefore be inaccurate due to the lack of 3-D information. This work reports a semi-automatic segmentation approach for the classification and analysis of the thyroid gland based on 3-D US data. The images are scanned in 3-D, pre-processed, and segmented. Several pre-processing methods and an extension of a commonly used geodesic active contour level-set formulation are discussed in detail. The results obtained by this approach are compared to manual interactive segmentations by a medical expert in five representative patients. Our work proposes a novel framework for the volumetric quantification of thyroid gland lobes, which may also be extended to other parenchymatous organs.

  18. Quantification of acidic compounds in complex biomass-derived streams

    SciTech Connect

    Karp, Eric M.; Nimlos, Claire T.; Deutch, Steve; Salvachúa, Davinia; Cywar, Robin M.; Beckham, Gregg T.

    2016-01-01

    Biomass-derived streams that contain acidic compounds from the degradation of lignin and polysaccharides (e.g. black liquor, pyrolysis oil, pyrolytic lignin, etc.) are chemically complex solutions prone to instability and degradation during analysis, making quantification of compounds within them challenging. Here we present a robust analytical method to quantify acidic compounds in complex biomass-derived mixtures using ion exchange, sample reconstitution in pyridine and derivatization with BSTFA. The procedure is based on an earlier method originally reported for kraft black liquors and, in this work, is applied to identify and quantify a large slate of acidic compounds in corn stover derived alkaline pretreatment liquor (APL) as a function of pretreatment severity. Analysis of the samples is conducted with GCxGC-TOFMS to achieve good resolution of the components within the complex mixture. The results reveal the dominant low molecular weight components and their concentrations as a function of pretreatment severity. Application of this method is also demonstrated in the context of lignin conversion technologies by applying it to track the microbial conversion of an APL substrate. Here too excellent results are achieved, and the appearance and disappearance of compounds is observed in agreement with the known metabolic pathways of two bacteria, indicating the sample integrity was maintained throughout analysis. Finally, it is shown that this method applies more generally to lignin-rich materials by demonstrating its usefulness in analysis of pyrolysis oil and pyrolytic lignin.

  19. Accuracy and precision of radioactivity quantification in nuclear medicine images.

    PubMed

    Frey, Eric C; Humm, John L; Ljungberg, Michael

    2012-05-01

    The ability to reliably quantify activity in nuclear medicine has a number of increasingly important applications. Dosimetry for targeted therapy treatment planning or for approval of new imaging agents requires accurate estimation of the activity in organs, tumors, or voxels at several imaging time points. Another important application is the use of quantitative metrics derived from images, such as the standard uptake value commonly used in positron emission tomography (PET), to diagnose and follow treatment of tumors. These measures require quantification of organ or tumor activities in nuclear medicine images. However, there are a number of physical, patient, and technical factors that limit the quantitative reliability of nuclear medicine images. There have been a large number of improvements in instrumentation, including the development of hybrid single-photon emission computed tomography/computed tomography and PET/computed tomography systems, and reconstruction methods, including the use of statistical iterative reconstruction methods, which have substantially improved the ability to obtain reliable quantitative information from planar, single-photon emission computed tomography, and PET images.

  20. Low order models for uncertainty quantification in acoustic propagation problems

    NASA Astrophysics Data System (ADS)

    Millet, Christophe

    2016-11-01

    Long-range sound propagation problems are characterized by both a large number of length scales and a large number of normal modes. In the atmosphere, these modes are confined within waveguides, causing the sound to propagate through multiple paths to the receiver. For uncertain atmospheres, the modes are described as random variables. Concise mathematical models and analysis reveal fundamental limitations in classical projection techniques, which stem from the fact that modes carrying small variance can still have important effects on the large-variance modes. In the present study, we propose a systematic strategy for obtaining statistically accurate low-order models. The normal modes are sorted in decreasing order of their Sobol indices using asymptotic expansions, and the relevant modes are extracted using a modified iterative Krylov-based method. The statistics of acoustic signals are computed by decomposing the original pulse into a truncated sum of modal pulses that can be described by a stationary phase method. Because the low-order acoustic model preserves the overall structure of waveforms under perturbations of the atmosphere, it can be applied to uncertainty quantification. The result of this study is a new algorithm that applies to the entire phase space of acoustic fields.

  1. Uncertainty Quantification and Statistical Convergence Guidelines for PIV Data

    NASA Astrophysics Data System (ADS)

    Stegmeir, Matthew; Kassen, Dan

    2016-11-01

    As Particle Image Velocimetry (PIV) has matured, it has developed into a robust and flexible velocimetry technique used by expert and non-expert users alike. While historical estimates of PIV accuracy have typically relied heavily on "rules of thumb" and analysis of idealized synthetic images, increased emphasis has recently been placed on better quantifying real-world PIV measurement uncertainty. Multiple techniques have been developed to provide per-vector instantaneous uncertainty estimates for PIV measurements. Real-world experimental conditions often complicate the collection of "optimal" data, and the effect of these conditions is important to consider when planning an experimental campaign. The current work builds on PIV uncertainty quantification techniques to develop a framework in which estimated PIV confidence intervals yield reliable convergence criteria for optimal sampling of flow statistics. Results are compared using experimental and synthetic data, and recommended guidelines and procedures for efficient sampling toward converged statistics are provided.
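
    For illustration only (not from the paper): a minimal Python sketch of a confidence-interval-based convergence check of the kind described, combining turbulent scatter of the mean with propagated per-sample PIV uncertainty. The function name, the combination rule, and the 95% coverage factor are assumptions.

        import numpy as np

        def mean_converged(samples, sigmas, tol, z=1.96):
            # samples: instantaneous velocity values at one vector location
            # sigmas:  per-sample PIV uncertainty estimates (same length)
            # tol:     acceptable half-width of the confidence interval on the mean
            n = len(samples)
            random_err = np.std(samples, ddof=1) / np.sqrt(n)        # turbulent scatter
            meas_err = np.sqrt(np.sum(np.asarray(sigmas) ** 2)) / n  # propagated PIV uncertainty
            return z * np.hypot(random_err, meas_err) <= tol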

  2. Quantification of Rossolimo reflexes: a sensitive marker for spondylotic myelopathy.

    PubMed

    Chang, C-W; Chang, K-Y; Lin, S-M

    2011-02-01

    Prospective study. To assess and quantify Rossolimo reflexes using an electrophysiological test, and to correlate the findings with the severity of spinal cord dysfunction in cervical and thoracic spondylotic myelopathy (CTSM). A university neurorehabilitation center. We enlisted 42 patients with CTSM between the fifth cervical and the ninth thoracic cord levels. Using electrophysiological assessments, Rossolimo reflexes were evaluated in all patients. Conduction latencies and amplitudes of muscle action potentials (MAPs) of the reflexes were measured, analyzed and compared with the grading of spinal cord dysfunction and the cord compression ratios. We found a high diagnostic sensitivity of the quantified Rossolimo reflex in patients with CTSM. A positive correlation exists between the MAP amplitude of Rossolimo reflexes and the different grades of spinal cord dysfunction, and a negative linear relationship was found between the MAP amplitude and the cord compression ratios in CTSM patients. Rossolimo reflexes can be measured by electrophysiological assessments, and we demonstrate a quantification method for an established neurological sign. Not only is the Rossolimo reflex a highly sensitive test in clinical neurological examination, but its electrophysiological assessment can also serve as an objective marker for evaluating the severity of spinal cord dysfunction in CTSM.

  3. Stochastic approach to the quantification of buried flexible flowline upheaval

    SciTech Connect

    Colquhoun, R.S.

    1994-12-31

    Previous papers have presented a simple method of predicting the cover depth required to prevent the upheaval of flexibles of given design subjected to given loadings. The present paper gives a brief resume of the method and then goes on to quantify the maximum upheaval that can occur if the cover is inadequate or if the vertical misalignment of the pipe is greater than expected. This quantification is governed by the axial coefficient of friction between the pipe and the soil, and it assumes that there is no other imperfection of equivalent magnitude within the length of sliding pipe associated with the primary imperfection at which the upheaval is taking place, i.e., that the primary upheaval is not alleviated by any secondary upheaval elsewhere. The paper then addresses the probability of a second imperfection close enough to alleviate the primary upheaval. It is concluded that in general this probability is not high enough to justify reliance on any alleviating effect; it may only be relied upon if the as-trenched vertical alignment of the pipe is consistently bad. Design guidelines are indicated.

  4. Immobilized Particle Imaging for Quantification of Nano- and Microparticles.

    PubMed

    Cui, Jiwei; Hibbs, Benjamin; Gunawan, Sylvia T; Braunger, Julia A; Chen, Xi; Richardson, Joseph J; Hanssen, Eric; Caruso, Frank

    2016-04-12

    The quantification of nano- and microparticles is critical for diverse applications that rely on exact knowledge of the particle concentration. Although many techniques are available for counting particles, they have limitations with regard to low-scattering materials and to facile counting in harsh organic solvents. Herein, we introduce an easy and rapid particle counting technique, termed "immobilized particle imaging" (IPI), to quantify fluorescent particles with different compositions (i.e., inorganic or organic), structures (i.e., solid, porous, or hollow), and sizes (50-1000 nm) dispersed in either aqueous or organic solutions. IPI is achieved by immobilizing particles of interest in a cell matrix-like scaffold (e.g., agarose) and imaging them using standard microscopy techniques. Imaging a defined volume of the immobilized particles allows the particle concentration to be calculated from the count numbers in that fixed volume. IPI provides a general and facile approach to quantifying advanced nano- and microparticles, which may help researchers obtain new insights for different applications (e.g., nanomedicine).
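
    The count-to-concentration conversion itself is simple arithmetic; a minimal sketch follows (the function name and the imaged-volume parameterization are assumptions, not from the paper).

        def ipi_concentration(counts, width_um, height_um, depth_um, dilution=1.0):
            # Particle concentration (particles/mL) from counts inside a known
            # imaged volume; 1 mL = 1 cm^3 = 1e12 cubic micrometres.
            volume_ml = width_um * height_um * depth_um * 1e-12
            return dilution * counts / volume_ml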

  5. Quantification of sea ice production in Weddell Sea polynyas

    NASA Astrophysics Data System (ADS)

    Zentek, Rolf; Heinemann, Günther; Paul, Stephan; Stulic, Lukrecia; Timmermann, Ralph

    2017-04-01

    The regional climate model COSMO-CLM was used to perform simulations of the Weddell Sea region in Antarctica for the period 2002-2015, with a focus on atmosphere-ocean-sea ice interactions. The original model was adapted to polar regions by the use of a thermodynamic sea ice module with snow cover and a temperature-dependent albedo scheme for sea ice. The recently published topography RTopo2 was used. The model was run nested in ERA-Interim data in forecast mode. Sea ice concentrations were taken from satellite measurements (AMSR-E, SSMI/S, AMSR2) and were updated daily to allow for a close-to-reality hindcast. Simulations were done at 15 km resolution for the whole period 2002-2015 with the goal of forcing the sea ice-ocean model FESOM. In a second step, a 5 km simulation was one-way nested for the winter periods (April-September) of 2002-2015 to allow for a better quantification of sea ice production in the Weddell Sea. Estimates of sea ice production and comparisons of the results to remote sensing data will be presented.

  6. Improved HPLC method for total plasma homocysteine detection and quantification.

    PubMed

    Sawuła, Wojciech; Banecka-Majkutewicz, Zyta; Kadziński, Leszek; Jakóbkiewicz-Banecka, Joanna; Wegrzyn, Grzegorz; Nyka, Walenty; Banecki, Bogdan

    2008-01-01

    Recent clinical research has pointed at hyperhomocysteinemia as an independent risk factor in a number of cardiovascular and neurological diseases. We have improved a chromatographic method of total plasma homocysteine measurement in order to obtain higher sensitivity, reliability and reproducibility. The method demonstrates excellent linearity (R=0.999), range (<2-100 microM), precision (instrumental RSD 0.06 and method RSD 1.17), accuracy (recovery of 99.92 and RSD 1.27), reproducibility, quantification limit and ruggedness (e.g. pH from 2.0 to 2.5). Because even a small increase in homocysteine level can be a significant risk factor for cardiovascular diseases, such a precise method is required. The constructed method allows the measurement of plasma pyridoxal phosphate, PLP, the co-enzyme form of vitamin B(6), on the same column and with similar reagents. The developed method has been successfully applied to measure both total plasma and serum homocysteine in a group of acute stroke patients.

  7. Methodological Issues in the Quantification of Respiratory Sinus Arrhythmia

    PubMed Central

    Denver, John W.; Reed, Shawn F.; Porges, Stephen W.

    2007-01-01

    Although respiratory sinus arrhythmia (RSA) is a commonly quantified physiological variable, the methods for quantification are not consistent. This manuscript questions the assumption that respiration frequency needs to be manipulated or monitored to generate an accurate measure of RSA amplitude. A review of recent papers is presented that contrasts RSA amplitude with measures that use respiratory parameters to adjust RSA amplitude. In addition, data from two studies are presented to evaluate empirically both the relation between RSA amplitude and respiration frequency and the covariation between RSA frequency and respiration frequency. The literature review demonstrates similar findings for both classes of measures. The first study demonstrates, during spontaneous breathing without task demands, that there is no relation between respiration frequency and RSA amplitude and that respiration frequency can be accurately derived from the heart period spectrum (i.e., the frequency of RSA). The second study demonstrates that respiration frequency is unaffected by atropine dose, a manipulation that systematically mediates the amplitude of RSA, and that the tight linkage between RSA frequency and respiration frequency is unaffected by atropine. The research shows that the amplitude of RSA is not affected by respiration frequency under either baseline conditions or vagal manipulation via atropine injection. Respiration frequency is therefore unlikely to be a concern under these conditions. Research examining conditions that produce (causal) deviations from the intrinsic relation between respiratory parameters and the amplitude of RSA, combined with appropriate statistical procedures for understanding these deviations, is necessary. PMID:17067734

  8. RANS turbulence model form uncertainty quantification for wind engineering flows

    NASA Astrophysics Data System (ADS)

    Gorle, Catherine; Zeoli, Stephanie; Bricteux, Laurent

    2016-11-01

    Reynolds-averaged Navier-Stokes simulations with linear eddy-viscosity turbulence models are commonly used for modeling wind engineering flows, but the use of the results for critical design decisions is hindered by the limited capability of the models to correctly predict bluff body flows. A turbulence model form uncertainty quantification (UQ) method to define confidence intervals for the results could remove this limitation, and promising results were obtained in a previous study of the flow in downtown Oklahoma City. The objective of the present study is to further investigate the validity of these results by considering the simplified test case of the flow around a wall-mounted cube. DNS data is used to determine: 1. whether the marker, which identifies regions that deviate from parallel shear flow, is a good indicator for the regions where the turbulence model fails, and 2. which Reynolds stress perturbations, in terms of the tensor magnitude and the eigenvalues and eigenvectors of the normalized anisotropy tensor, can capture the uncertainty in the flow field. A comparison of confidence intervals obtained with the UQ method and the DNS solution indicates that the uncertainty in the velocity field can be captured correctly in a large portion of the flow field.

  9. Quantification of light attenuation in optically cleared mouse brains

    PubMed Central

    d’Esposito, Angela; Nikitichev, Daniil; Desjardins, Adrien; Walker-Samuel, Simon; Lythgoe, Mark F.

    2015-01-01

    Optical clearing, in combination with recently developed optical imaging techniques, enables visualization and acquisition of high-resolution, three-dimensional images of biological structures deep within tissue. Many different approaches can be used to reduce light absorption and scattering within the tissue, but there is a paucity of research on the quantification of clearing efficacy. With the use of a custom-made spectroscopy system, we developed a way to quantify the quality of clearing in biological tissue, and applied it to the mouse brain. Three clearing techniques were compared: BABB (Murray's clear), pBABB (a modification of BABB that includes the use of hydrogen peroxide) and passive CLARITY. Although the approach is limited to autofluorescence studies, we found that pBABB produced the highest degree of optical clearing. Furthermore, the approach allows regional measurement of light attenuation, and our results show that light is most attenuated in regions with high lipid content. This work provides a way to choose between the multiple clearing protocols available, and it could prove useful for evaluating images acquired from cleared tissues. PMID:26277988

  10. Novel Parachlamydia acanthamoebae quantification method based on coculture with amoebae.

    PubMed

    Matsuo, Junji; Hayashi, Yasuhiro; Nakamura, Shinji; Sato, Marie; Mizutani, Yoshihiko; Asaka, Masahiro; Yamaguchi, Hiroyuki

    2008-10-01

    Parachlamydia acanthamoebae, belonging to the order Chlamydiales, is an obligate intracellular bacterium that infects free-living amoebae and is a potential human pathogen. However, no method exists to accurately quantify viable bacterial numbers. We present a novel quantification method for P. acanthamoebae based on coculture with amoebae. P. acanthamoebae was cultured either with Acanthamoeba spp. or with mammalian epithelial HEp-2 or Vero cells. The infection rate of P. acanthamoebae (amoeba-infectious dose [AID]) was determined by DAPI (4',6-diamidino-2-phenylindole) staining and was confirmed by fluorescent in situ hybridization. AIDs were plotted as logistic sigmoid dilution curves, and P. acanthamoebae numbers, defined as amoeba-infectious units (AIU), were calculated. During culture, amoeba numbers and viabilities did not change, and amoebae did not change from trophozoites to cysts. Eight amoeba strains showed similar levels of P. acanthamoebae growth, and bacterial numbers reached ca. 1,000-fold the preculture level (10^9 AIU) after 4 days. In contrast, no increase was observed for P. acanthamoebae in either mammalian cell line. However, aberrant structures in epithelial cells, implying possible persistent infection, were seen by transmission electron microscopy. Thus, our method can monitor the numbers of P. acanthamoebae in host cells and may be useful for understanding chlamydiae present in the natural environment as human pathogens.
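
    For orientation, the logistic sigmoid fit to a dilution curve could look like the following Python sketch (a generic stand-in, not the authors' code; the parameterization and initial guesses are assumptions).

        import numpy as np
        from scipy.optimize import curve_fit

        def fit_aid(log_dilution, infection_rate):
            # Logistic sigmoid: the rate rises from 0 to 1 around the midpoint x0;
            # x0 locates the dilution giving 50% infection, k sets the steepness.
            f = lambda x, x0, k: 1.0 / (1.0 + np.exp(-k * (x - x0)))
            (x0, k), _ = curve_fit(f, log_dilution, infection_rate, p0=[0.0, 1.0])
            return x0, k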

  11. Methane Leak Detection and Emissions Quantification with UAVs

    NASA Astrophysics Data System (ADS)

    Barchyn, T.; Fox, T. A.; Hugenholtz, C.

    2016-12-01

    Robust leak detection and emissions quantification algorithms are required to accurately monitor greenhouse gas emissions. Unmanned aerial vehicles (UAVs, 'drones') could both reduce the cost and increase the accuracy of monitoring programs. However, aspects of the platform create unique challenges. UAVs typically collect large volumes of data that are close to source (due to limited range) and often of lower quality (due to weight restrictions on sensors). Here we discuss algorithm development for (i) finding sources of unknown position ('leak detection') and (ii) quantifying emissions from a source of known position. We use data from a simulated leak and field study in Alberta, Canada. First, we detail a method for localizing a leak of unknown spatial location using iterative fits against a forward Gaussian plume model. We explore sources of uncertainty, both inherent to the method and operational. Results suggest this method is primarily constrained by accurate wind direction data, distance downwind from the source, and the non-Gaussian shape of close-range plumes. Second, we examine sources of uncertainty in quantifying emissions with the mass balance method. Results suggest precision is constrained by flux-plane interpolation errors and time offsets between spatially adjacent measurements. Drones can provide data closer to the ground than piloted aircraft, but large portions of the plume remain unquantified. Together, we find that despite larger volumes of data, working with close-range plumes as measured with UAVs is inherently difficult. We describe future efforts to mitigate these challenges and work towards more robust benchmarking for application in industrial and regulatory settings.
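
    A minimal sketch of fitting a forward Gaussian plume model to UAV samples, in the spirit of the method described (not the authors' implementation; the dispersion-coefficient growth rates, parameter names, and initial guesses are assumptions).

        import numpy as np
        from scipy.optimize import least_squares

        def plume(dx, dy, z, q, h, u, sy, sz):
            # Gaussian plume from a point source, wind along +x, with ground reflection.
            c = (q / (2 * np.pi * u * sy * sz) * np.exp(-0.5 * (dy / sy) ** 2)
                 * (np.exp(-0.5 * ((z - h) / sz) ** 2) + np.exp(-0.5 * ((z + h) / sz) ** 2)))
            return np.where(dx > 0, c, 0.0)

        def localize(xyz, conc, u, h=0.0):
            # Fit source position (x0, y0) and log emission rate to UAV observations.
            def residuals(p):
                x0, y0, log_q = p
                dx = xyz[:, 0] - x0
                sy = 0.08 * np.clip(dx, 1.0, None)   # assumed dispersion growth laws
                sz = 0.06 * np.clip(dx, 1.0, None)
                return plume(dx, xyz[:, 1] - y0, xyz[:, 2], np.exp(log_q), h, u, sy, sz) - conc
            return least_squares(residuals, x0=[0.0, 0.0, 0.0]).x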

  12. Micelle Mediated Trace Level Sulfide Quantification through Cloud Point Extraction

    PubMed Central

    Devaramani, Samrat; Malingappa, Pandurangappa

    2012-01-01

    A simple cloud point extraction protocol has been proposed for the quantification of sulfide at trace level. The method is based on the reduction of iron(III) to iron(II) by the sulfide and the subsequent complexation of the metal ion with nitroso-R salt in alkaline medium. The resulting green-colored complex was extracted through cloud point formation using a cationic surfactant, cetylpyridinium chloride, and the obtained surfactant phase was homogenized with ethanol before its absorbance measurement at 710 nm. The reaction variables affecting the cloud point extraction of the metal-ligand complex, namely the metal ion, ligand, and surfactant concentrations and the medium pH, have been optimized. The interference effects of common anions and cations were studied. The proposed method has been successfully applied to quantify trace-level sulfide in leachate samples from a landfill and in water samples from bore wells and ponds. The validity of the proposed method has been studied by spiking the samples with known quantities of sulfide as well as by comparing with results obtained by the standard method. PMID:22619597

  13. Toward automated quantification of biological microstructures using unbiased stereology

    NASA Astrophysics Data System (ADS)

    Bonam, Om P.; Elozory, Daniel; Kramer, Kurt; Goldgof, Dmitry; Hall, Lawrence O.; Mangual, Osvaldo; Mouton, Peter R.

    2011-03-01

    Quantitative analysis of biological microstructures using unbiased stereology plays a large and growing role in bioscience research. Our aim is to add a fully automatic, high-throughput mode to a commercially available, computerized stereology device (Stereologer). The current method for estimating first- and second-order parameters of biological microstructures requires a trained user to manually select objects of interest (cells, fibers, etc.) while stepping through the depth of a stained tissue section at fixed intervals. The proposed approach uses a combination of color and gray-level processing. Color processing is used to identify the objects of interest, by training on the images to obtain the threshold range for those objects. In gray-level processing, a region-growing approach is used to assign a unique identity to the objects of interest and enumerate them. This automatic approach achieved an overall object detection rate of 93.27%. These results support the view that automatic color and gray-level processing, combined with unbiased sampling and assumption- and model-free geometric probes, can provide accurate and efficient quantification of biological objects.

  14. Stochastic modeling for magnetic resonance quantification of myocardial blood flow

    NASA Astrophysics Data System (ADS)

    Seethamraju, Ravi T.; Muehling, Olaf; Panse, Prasad M.; Wilke, Norbert M.; Jerosch-Herold, Michael

    2000-10-01

    Quantification of myocardial blood flow is useful for determining the functional severity of coronary artery lesions. With advances in MR imaging, it has become possible to assess myocardial perfusion and blood flow non-invasively by rapid serial imaging following injection of a contrast agent. To date, most approaches reported in the literature have relied on deriving relative indices of myocardial perfusion directly from the measured signal intensity curves. The central volume principle, on the other hand, states that it is possible to derive absolute myocardial blood flow from the tissue impulse response. Because deconvolution is sensitive to noise in the measured data, conventional methods are sub-optimal; hence, we propose to use stochastic time series modeling techniques such as ARMA to obtain a robust impulse response estimate. It is shown that these methods, when applied to the optimal estimation of the transfer function, give accurate estimates of myocardial blood flow. The most significant advantage of this approach, compared with compartmental tracer kinetic models, is the use of a minimal set of prior assumptions on the data. The bottleneck in assessing myocardial blood flow does not lie in the MRI acquisition, but rather in the effort and time required for post-processing. It is anticipated that the very limited requirements for user input and interaction will be a significant advantage for the clinical application of these methods. The proposed methods are validated by comparison with mean blood flow measurements obtained from radio-isotope-labeled microspheres.
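
    The ARMA estimator itself is more involved than can be shown here; as a simpler stand-in that illustrates the underlying deconvolution problem, the following Python sketch computes a Tikhonov-regularized impulse response (all names and the regularization weight are assumptions, not from the paper).

        import numpy as np

        def impulse_response(aif, tissue, dt, lam=0.1):
            # Solve tissue = A @ h, where A is the lower-triangular convolution
            # matrix built from the arterial input function (AIF).
            aif = np.asarray(aif, float)
            n = len(aif)
            A = np.zeros((n, n))
            for i in range(n):
                A[i, : i + 1] = aif[i::-1] * dt
            # Tikhonov-regularized normal equations stabilize the noisy inversion.
            h = np.linalg.solve(A.T @ A + lam ** 2 * np.eye(n), A.T @ tissue)
            return h  # by the central volume principle, flow is related to h.max()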

  15. Comprehensive quantification of ceramide species in human stratum corneum.

    PubMed

    Masukawa, Yoshinori; Narita, Hirofumi; Sato, Hirayuki; Naoe, Ayano; Kondo, Naoki; Sugai, Yoshiya; Oba, Tsuyoshi; Homma, Rika; Ishikawa, Junko; Takagi, Yutaka; Kitahara, Takashi

    2009-08-01

    One of the key challenges in lipidomics is to quantify lipidomes of interest, as it is practically impossible to collect all authentic materials covering the targeted lipidomes. For diverse ceramides (CER) in human stratum corneum (SC) that play important physicochemical roles in the skin, we developed a novel method for quantification of the overall CER species by improving our previously reported profiling technique using normal-phase liquid chromatography-electrospray ionization-mass spectrometry (NPLC-ESI-MS). The use of simultaneous selected ion monitoring measurement of as many as 182 kinds of molecular-related ions enables the highly sensitive detection of the overall CER species, as they can be analyzed in only one SC-stripped tape as small as 5 mm x 10 mm. To comprehensively quantify CERs, including those not available as authentic species, we designed a procedure to estimate their levels using relative responses of representative authentic species covering the species targeted, considering the systematic error based on intra-/inter-day analyses. The CER levels obtained by this method were comparable to those determined by conventional thin-layer chromatography (TLC), which guarantees the validity of this method. This method opens lipidomics approaches for CERs in the SC.

  16. Uncertainty quantification for personalized analyses of human proximal femurs.

    PubMed

    Wille, Hagen; Ruess, Martin; Rank, Ernst; Yosibash, Zohar

    2016-02-29

    Computational models for the personalized analysis of human femurs contain uncertainties in bone material properties and loads, which affect the simulation results. To quantify their influence, we developed a probabilistic framework based on polynomial chaos (PC) that propagates stochastic input variables through any computational model. We considered a stochastic E-ρ relationship and a stochastic hip contact force, representing realistic variability in experimental data. Their influence on the prediction of principal strains (ϵ1 and ϵ3) was quantified for one human proximal femur, including sensitivity and reliability analysis. Large variabilities in the principal strain predictions were found in the cortical shell of the femoral neck, with coefficients of variation of ≈40%. Between 60 and 80% of the variance in ϵ1 and ϵ3 is attributable to the uncertainty in the E-ρ relationship, while ≈10% is caused by the load magnitude and 5-30% by the load direction. Principal strain directions were unaffected by material and loading uncertainties. The antero-superior and medial-inferior sides of the neck exhibited the largest probabilities of tensile and compressive failure, although all probabilities were very small (pf<0.001). In summary, uncertainty quantification with PC has been demonstrated to describe efficiently and accurately the influence of very different stochastic inputs, which increases the credibility and explanatory power of personalized analyses of human proximal femurs.
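
    For a flavor of the non-intrusive PC projection step, here is a minimal sketch for a single standard-normal input; any scalar function stands in for the femur model, and the function names, orders, and quadrature size are assumptions.

        import math
        import numpy as np
        from numpy.polynomial.hermite_e import hermegauss, hermeval

        def pce_coeffs(model, order=4, nquad=8):
            # Project model(xi), xi ~ N(0, 1), onto probabilists' Hermite polynomials;
            # model must accept a NumPy array of sample points.
            nodes, weights = hermegauss(nquad)
            weights = weights / np.sqrt(2.0 * np.pi)   # normalize to the Gaussian pdf
            vals = model(nodes)
            return np.array([
                np.sum(weights * vals * hermeval(nodes, [0.0] * k + [1.0]))
                / math.factorial(k)                    # E[He_k^2] = k!
                for k in range(order + 1)
            ])

        # The mean is c[0]; the variance is sum over k >= 1 of c[k]^2 * k!.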

  17. Quantification of Nociceptive Escape Response in C. elegans

    NASA Astrophysics Data System (ADS)

    Leung, Kawai; Mohammadi, Aylia; Ryu, William; Nemenman, Ilya

    2013-03-01

    Animals cannot rank and communicate their pain consciously. Thus, in pain studies on animal models, one must infer the pain level from high-precision experimental characterization of behavior. This is not trivial, since behaviors are very complex and multidimensional. Here we explore the feasibility of C. elegans as a model for pain transduction. The nematode has a robust, neurally mediated noxious escape response, which we show to be partially decoupled from other sensory behaviors. We develop a nociceptive behavioral response assay that allows us to apply controlled levels of pain by locally heating worms with an IR laser. The worms' motions are captured by machine vision software with high spatiotemporal resolution. The resulting behavioral quantification allows us to build a statistical model for inferring the experienced pain level from the behavioral response. Based on the measured nociceptive escape of over 400 worms, we conclude that none of the simple characteristics of the response are reliable indicators of the laser pulse strength. Nonetheless, a more reliable statistical inference of the pain stimulus level from the measured behavior is possible using a complexity-controlled regression model that takes into account the entire worm behavioral output. This work was partially supported by NSF grant No. IOS/1208126 and HFSP grant No. RGY0084/2011.

  18. Uncertainty Quantification for Monitoring of Civil Structures from Vibration Measurements

    NASA Astrophysics Data System (ADS)

    Döhler, Michael; Mevel, Laurent

    2014-05-01

    Health monitoring of civil structures can be performed by detecting changes in the modal parameters of a structure, or more directly in the measured vibration signals. For continuous monitoring, the excitation of a structure is usually ambient, thus unknown and assumed to be noise. Hence, all estimates from the vibration measurements are realizations of random variables with inherent uncertainty due to (unknown) process and measurement noise and finite data length. In this talk, a strategy for quantifying the uncertainties of modal parameter estimates from a subspace-based system identification approach is presented, and the importance of uncertainty quantification in monitoring approaches is shown. Furthermore, a damage detection method is presented that is based on the direct comparison of the measured vibration signals without estimating modal parameters, while correctly taking the statistical uncertainty in the signals into account. The usefulness of both strategies is illustrated on data from a progressive damage action on a prestressed concrete bridge. References: E. Carden and P. Fanning. Vibration based condition monitoring: a review. Structural Health Monitoring, 3(4):355-377, 2004. M. Döhler and L. Mevel. Efficient multi-order uncertainty computation for stochastic subspace identification. Mechanical Systems and Signal Processing, 38(2):346-366, 2013. M. Döhler, L. Mevel, and F. Hille. Subspace-based damage detection under changes in the ambient excitation statistics. Mechanical Systems and Signal Processing, 45(1):207-224, 2014.

  19. Evaluation of SPECT quantification of radiopharmaceutical distribution in canine myocardium

    SciTech Connect

    Li, Jianying; Jaszczak, R.L.; Greer, K.L.

    1995-02-01

    This study evaluates the quantitative accuracy of SPECT for in vivo distributions of {sup 99m}Tc radiopharmaceuticals using fanbeam (FB) and parallel-beam (PB) collimators, and compares uniform and nonuniform attenuation correction methods in terms of quantitative accuracy. SPECT quantification of canine myocardial radioactivity was performed, followed by well counter measurements of extracted myocardial tissue samples. Transmission scans using a line source and an FB collimator were performed to generate nonuniform attenuation maps of the canine thorax. Emission scans with two energy windows were acquired. Images were reconstructed using a filtered backprojection algorithm, with dual-window scatter subtraction combined with either no attenuation compensation or single-iteration Chang attenuation compensation based on a uniform attenuation map ({mu}=0.152 cm{sup -1}) or the nonuniform transmission map. The measured mean counts from the SPECT images were converted using the well counter calibration. The experimental results demonstrate that, compared with well counter values, the in vivo distributions of {sup 99m}Tc were most accurately determined in FB and PB SPECT reconstructions with nonuniform attenuation compensation, underestimated without attenuation compensation, and overestimated with uniform attenuation compensation. 37 refs., 9 figs., 10 tabs.

  20. Quantification of Covariance in Tropical Cyclone Activity across Teleconnected Basins

    NASA Astrophysics Data System (ADS)

    Tolwinski-Ward, S. E.; Wang, D.

    2015-12-01

    Rigorous statistical quantification of natural hazard covariance across regions has important implications for risk management and is also of fundamental scientific interest. We present a multivariate Bayesian Poisson regression model for inferring the covariance in tropical cyclone (TC) counts across multiple ocean basins and across Saffir-Simpson intensity categories. Such covariability results from the influence of large-scale modes of climate variability on local environments that can alternately suppress or enhance TC genesis and intensification, and our model simultaneously quantifies the covariance of TC counts with various climatic modes in order to deduce the source of inter-basin TC covariability. The model explicitly treats the time-dependent uncertainty in observed maximum sustained wind data, and hence in the nominal intensity category of each TC. Differences in annual TC counts as measured by different agencies are also formally addressed. The output of the model can be probed for probabilistic answers to such questions as: - Does the relationship between different categories of TCs differ statistically by basin? - Which climatic predictors have significant relationships with TC activity in each basin? - Are the relationships between counts in different basins conditionally independent given the climatic predictors, or are there other factors at play affecting inter-basin covariability? - How can a portfolio of insured property be optimized across space to minimize risk? Although we present results of our model applied to TCs, the framework generalizes to covariance estimation between multivariate counts of natural hazards across regions and/or peril types.

  1. Uncertainty quantification in the catalytic partial oxidation of methane

    NASA Astrophysics Data System (ADS)

    Navalho, Jorge E. P.; Pereira, José M. C.; Ervilha, Ana R.; Pereira, José C. F.

    2013-12-01

    This work focuses on the uncertainty quantification of eight random parameters required as input for 1D modelling of methane catalytic partial oxidation within a highly dense foam reactor. Parameters related to geometrical properties, reactor thermophysics and catalyst loading are taken as uncertain. A widely applied 1D heterogeneous mathematical model that accounts for the relevant transport and surface chemistry steps is used to evaluate the deterministic samples. The non-intrusive spectral projection approach based on polynomial chaos expansion is applied to determine the stochastic temperature and species profiles along the reactor axial direction, as well as their ensemble means and error bars with a 95% confidence interval. Probability density functions of relevant variables in specific reactor sections are also analysed. Each random input contributes differently to the total uncertainty range. Porosity, specific surface area and catalyst loading appear as the major sources of uncertainty in the bulk gas and surface temperatures and species molar profiles, while porosity and the mean pore diameter have, as expected, an important impact on the pressure drop along the whole reactor. It is also concluded that, for a long-enough catalyst, any trace of uncertainty in the eight input random variables is almost dissipated near the catalyst outlet section, mainly due to the approach to thermodynamic equilibrium.

  2. Elemental quantification using multiple-energy x-ray absorptiometry

    NASA Astrophysics Data System (ADS)

    Kozul, N.; Davis, G. R.; Anderson, P.; Elliott, J. C.

    1999-03-01

    A novel implementation of multiple-energy x-ray absorptiometry (MEXA) for elemental quantification has been developed. Species are resolved on the basis of their differential attenuation spectra across a wide energy range, ideally including absorption edges. By measuring the incident and exiting x-ray spectra and using known values of mass attenuation coefficients over selected energy bands, the density line integral of the species along the x-ray path can be calculated from all the selected energy channels simultaneously by non-linear least squares methods. Effects of 'escape' peak phenomena are modelled, and corrections for them are included in the MEXA software. The applications of MEXA are illustrated by single measurements on aluminium and zirconium foils, quantitation of aqueous KI diffusing into a porous solid, simultaneous measurement of an acidic diffusant and of the porous solid with which it reacts and which it dissolves, and microtomographic reconstructions of liquid and solid specimens containing caesium and/or iodine.
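
    A log-linearized sketch of the core inversion (the paper uses non-linear least squares over measured spectra; this simplified version, its names, and its units are assumptions).

        import numpy as np

        def density_line_integrals(I0, I, mu):
            # Beer-Lambert over m energy channels: -ln(I/I0) = mu @ d, with mu an
            # (m_channels, k_species) matrix of mass attenuation coefficients;
            # solve for the density line integrals d (g/cm^2 along the ray).
            y = -np.log(np.asarray(I, float) / np.asarray(I0, float))
            d, *_ = np.linalg.lstsq(mu, y, rcond=None)
            return d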

  3. Dynamic control and quantification of bacterial population dynamics in droplets.

    PubMed

    Huang, Shuqiang; Srimani, Jaydeep K; Lee, Anna J; Zhang, Ying; Lopatkin, Allison J; Leong, Kam W; You, Lingchong

    2015-08-01

    Culturing and measuring bacterial population dynamics are critical to developing insights into gene regulation and bacterial physiology. Traditional methods, based on bulk culture, have limitations: higher cost and volume of reagents, unsuitability for small populations, and more laborious manipulation. To this end, droplet-based microfluidics represents a promising alternative that is cost-effective and high-throughput. However, difficulties in manipulating the droplet environment and in monitoring encapsulated bacterial populations over long-term experiments limit its utilization. To overcome these limitations, we used an electrode-free injection technology to modulate the chemical environment in droplets. This ability is critical for precise control of bacterial dynamics in droplets. Moreover, we developed a trapping device for long-term monitoring of population dynamics in individual droplets for at least 240 h. We demonstrated the utility of this new microfluidic system by quantifying the population dynamics of natural and engineered bacteria. Our approach can further improve analysis for systems and synthetic biology in terms of manipulability and high temporal resolution.

  4. Fourier transform infrared quantification of sugars in pretreated biomass liquors.

    PubMed

    Tucker, M P; Mitri, R K; Eddy, F P; Nguyen, Q A; Gedvilas, L M; Webb, J D

    2000-01-01

    The process of converting renewable lignocellulosic biomass to ethanol requires a number of steps, and pretreatment is one of the most important. Pretreatment usually involves hydrolysis of the easily hydrolyzed hemicellulosic component of biomass using some form of thermal/chemical/mechanical action, yielding a product whose cellulosic portion can be further hydrolyzed by cellulase enzymes. The sugars produced can then be fermented to ethanol by fermentative microorganisms. If the pretreatment step is not severe enough, the resultant residue is not as easily hydrolyzed by the cellulase enzyme; more severe pretreatment conditions result in degradation products that are toxic to the fermentative microorganism. In this article, we report the quantitative analysis of glucose, mannose, xylose, and acetic acid using Fourier transform infrared (FTIR) spectroscopy on liquors from dilute-acid-pretreated softwood and hardwood slurries. A comparison of FTIR and high-performance liquid chromatography quantitative analyses of these liquors is reported. Recent developments in infrared probe technology have enabled the rapid quantification of these sugars by FTIR spectroscopy in the batch reactor during optimization of the pretreatment conditions, or with the probe interfaced to the computer controlling a continuous reactor for on-line monitoring and control.

  5. A surrogate-based uncertainty quantification with quantifiable errors

    SciTech Connect

    Bang, Y.; Abdel-Khalik, H. S.

    2012-07-01

    Surrogate models are often employed to reduce the computational cost of uncertainty quantification, where one is interested in propagating input parameter uncertainties through a complex engineering model to estimate response uncertainties. An improved surrogate construction approach is introduced here that places a premium on reducing the associated computational cost. Unlike existing methods, where the surrogate is constructed first and then employed to propagate uncertainties, the new approach combines both sensitivity and uncertainty information to achieve a further reduction in computational cost. Mathematically, the reduction is described by a range-finding algorithm that identifies a subspace in the parameter space such that parameter uncertainties orthogonal to the subspace contribute a negligible amount to the propagated uncertainties. Moreover, the error resulting from the reduction can be upper-bounded. The new approach is demonstrated using a realistic nuclear assembly model and compared to existing methods in terms of computational cost and accuracy of the uncertainties. Although we believe the algorithm is general, it is applied here to linear-based surrogates and Gaussian parameter uncertainties. The generalization to nonlinear models will be detailed in a separate article. (authors)
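
    The range-finding idea can be sketched as a rank-revealing decomposition of sensitivity samples; the following is a generic stand-in, not the authors' algorithm, and the tolerance rule is an assumption.

        import numpy as np

        def reduced_subspace(G, tol=1e-3):
            # G: (n_samples, n_params) matrix of model sensitivity (gradient) samples.
            # Keep the directions whose singular values exceed tol * s_max; the
            # discarded singular values upper-bound the truncation error.
            _, s, Vt = np.linalg.svd(G, full_matrices=False)
            r = int(np.sum(s > tol * s[0]))
            return Vt[:r].T, s[r:]   # active-subspace basis, discarded spectrum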

  6. Interactive image quantification tools in nuclear material forensics

    SciTech Connect

    Porter, Reid B; Ruggiero, Christy; Hush, Don; Harvey, Neal; Kelly, Pat; Scoggins, Wayne; Tandon, Lav

    2011-01-03

    Morphological and microstructural features visible in microscopy images of nuclear materials can give information about the processing history of a nuclear material. Extraction of these attributes currently requires a subject matter expert in both microscopy and nuclear material production processes, and is a time-consuming, at least partially manual task, often involving multiple software applications. One of the primary goals of computer vision is to find ways to extract and encode domain knowledge associated with imagery so that parts of this process can be automated. In this paper we describe a user-in-the-loop approach to the problem that attempts both to improve the efficiency of domain experts during image quantification and to capture their domain knowledge over time. This is accomplished through a sophisticated user-monitoring system that accumulates user-computer interactions as users exploit their imagery. We provide a detailed discussion of the interactive feature extraction and segmentation tools we have developed and describe our initial results in exploiting the recorded user-computer interactions to improve user productivity over time.

  7. Electrochemical Quantification of Single Nucleotide Polymorphisms Using Nanoparticle Probes

    SciTech Connect

    Liu, Guodong; Lin, Yuehe

    2007-08-29

    We report a new approach for electrochemical quantification of single-nucleotide polymorphisms (SNPs) using nanoparticle probes. The principle is based on DNA polymerase I (Klenow fragment)-induced coupling of the nucleotide-modified nanoparticle probe to the mutant sites of duplex DNA under the Watson-Crick base-pairing rule. After liquid hybridization events occurred among biotinylated DNA probes, mutant DNA, and complementary DNA, the resulting duplex DNA helixes were captured on the surface of magnetic beads through a biotin-avidin affinity reaction and magnetic separation. A cadmium phosphate-loaded apoferritin nanoparticle probe, which is modified with nucleotides and is complementary to the mutant site, is coupled to the mutant sites of the formed duplex DNA in the presence of DNA polymerase. Subsequent electrochemical stripping analysis of the cadmium component of the coupled nanoparticle probes provides a means to quantify the concentration of mutant DNA. The method is sensitive enough to detect 21.5 attomol of mutant DNA, which will enable the quantitative analysis of nucleic acids without polymerase chain reaction pre-amplification. The approach was challenged with constructed samples containing mutant and complementary DNA. The results indicated that it was possible to accurately determine SNPs with frequencies as low as 0.01. The proposed approach has great potential for realizing an accurate, sensitive, rapid, and low-cost method of SNP detection.

  8. Quantification of nortriptyline in plasma by HPLC and fluorescence detection.

    PubMed

    Almudever, Patricia; Peris, José-Esteban; Garrigues, Teresa; Diez, Octavio; Melero, Ana; Alós, Manuel

    2010-03-15

    A simple, sensitive and specific high-performance liquid chromatography method has been developed for the determination of nortriptyline (NT) in plasma samples. The assay involved derivatization with 9H-fluoren-9-ylmethyl chloroformate (Fmoc-Cl) and isocratic reversed-phase (C(18)) chromatography with fluorescence detection. The developed method required only 100 microl of plasma sample, deproteinized and derivatized in one step. Calibration curves were linear over the concentration range of 5-5000 ng/ml. The derivatization reaction was performed at room temperature in 20 min, and the obtained NT derivative was stable for at least 48 h at room temperature. The within-day and between-day relative standard deviations were below 8%. The limit of detection (LOD) was 2 ng/ml, and the lower limit of quantification (LLOQ) was established at 10 ng/ml. The method was applied to plasma collected from rats, at different time intervals, after intravenous administration of 0.5 mg of NT.
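
    For context, detection and quantification limits of this kind are commonly derived from the calibration fit; a sketch using the ICH 3.3σ/10σ convention, which is not necessarily the authors' procedure.

        import numpy as np

        def calibration_limits(conc, signal):
            # Linear calibration; the residual standard deviation drives LOD and
            # LLOQ via LOD = 3.3 * s_res / slope and LLOQ = 10 * s_res / slope.
            conc, signal = np.asarray(conc, float), np.asarray(signal, float)
            slope, intercept = np.polyfit(conc, signal, 1)
            s_res = np.std(signal - (slope * conc + intercept), ddof=2)
            return 3.3 * s_res / slope, 10.0 * s_res / slope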

  9. Accurate quantification of astaxanthin from Haematococcus crude extract spectrophotometrically

    NASA Astrophysics Data System (ADS)

    Li, Yeguang; Miao, Fengping; Geng, Yahong; Lu, Dayan; Zhang, Chengwu; Zeng, Mingtao

    2012-07-01

    The influence of alkali on astaxanthin and the optimal working wavelength for measurement of astaxanthin from Haematococcus crude extract were investigated, and a spectrophotometric method for precise quantification of the astaxanthin, based on the method of Boussiba et al., was established. According to Boussiba's method, alkali treatment destroys chlorophyll. However, we found that: 1) carotenoid content declined by about 25% in Haematococcus fresh cysts and by up to 30% in dry powder of broken Haematococcus cysts after alkali treatment; and 2) dimethyl sulfoxide (DMSO)-extracted chlorophyll of green Haematococcus shows little absorption at 520-550 nm. Interestingly, a good linear relationship existed between absorbance at 530 nm and astaxanthin content, while an unknown interference at 540-550 nm was detected in our study. Therefore, with 530 nm as the working wavelength, the alkali treatment to destroy chlorophyll was not necessary, and the influence of chlorophyll, other carotenoids, and the unknown interference could be avoided. The astaxanthin contents of two samples were measured at 492 nm and 530 nm; the measured values at 530 nm were 2.617 g/100 g and 1.811 g/100 g. Compared with the measured values at 492 nm, the values at 530 nm were lower by 6.93% and 11.96%, respectively, and are closer to the true astaxanthin contents of the samples. The data show that 530 nm is the most suitable wavelength for spectrophotometric determination of astaxanthin in Haematococcus crude extract.

  10. Quantification of HBsAg: basic virology for clinical practice.

    PubMed

    Lee, Jung Min; Ahn, Sang Hoon

    2011-01-21

    Hepatitis B surface antigen (HBsAg) is produced and secreted through a complex mechanism that is still not fully understood. In clinical fields, HBsAg has long served as a qualitative diagnostic marker for hepatitis B virus infection. Notably, advances have been made in the development of quantitative HBsAg assays, which have allowed monitoring of viral replication, and there is an opportunity to make maximal use of quantitative HBsAg to elucidate its role in clinical fields. Yet, it needs to be underscored that a further understanding of HBsAg, not only from a clinical point of view but also from a virologic one, would enable us to deepen our insights so that we could expand and apply its utility more widely. It is also important to be familiar with HBsAg variants and their clinical consequences in terms of immune escape mutants, issues resulting from overlap with the corresponding mutation in the P gene, and detection problems for the HBsAg variants. In this article, we review current concepts and issues on the quantification of HBsAg titers with respect to their biologic nature, method principles, and clinically relevant topics.

  11. Impact Induced Delamination Detection and Quantification With Guided Wavefield Analysis

    NASA Technical Reports Server (NTRS)

    Tian, Zhenhua; Leckey, Cara A. C.; Yu, Lingyu; Seebo, Jeffrey P.

    2015-01-01

    This paper studies impact-induced delamination detection and quantification using guided wavefield data and spatial wavenumber imaging. A complex-geometry, impact-like delamination is created through quasi-static indentation on a CFRP plate. To detect and quantify the impact delamination in the CFRP plate, PZT-SLDV sensing and spatial wavenumber imaging are performed. In the PZT-SLDV sensing, guided waves are generated by the PZT, and high-spatial-resolution guided wavefields are measured by the SLDV. The guided wavefield data acquired from the PZT-SLDV sensing represent guided wave propagation in the composite laminate, including guided wave interaction with the delamination damage. The measured guided wavefields are analyzed with the spatial wavenumber imaging method, which generates an image containing the dominant local wavenumber at each spatial location. The spatial wavenumber imaging result for the simple single-layer Teflon-insert delamination provided quantitative information on delamination size and location, the latter indicated by the area of larger wavenumbers in the spatial wavenumber image. For the impact-like delamination, the results only partially agreed with the damage size and shape, and they also demonstrated a dependence on excitation frequency. Future work will further investigate the accuracy of the wavenumber imaging method for real composite damage and its dependence on the excitation frequency.
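
    The wavenumber-mapping step can be approximated by a windowed spatial FFT at each location; a coarse Python sketch follows (the window size, weighting, and names are assumptions, and the published method is more elaborate).

        import numpy as np

        def local_wavenumber(wavefield, dx, win=32):
            # For each pixel, FFT a Hann-windowed patch and record the wavenumber
            # magnitude of the strongest spectral component; delaminations show
            # up as regions of locally increased wavenumber.
            kx = 2 * np.pi * np.fft.fftfreq(win, d=dx)
            kmag = np.hypot(*np.meshgrid(kx, kx))
            hann = np.outer(np.hanning(win), np.hanning(win))
            ny, nx = wavefield.shape
            out = np.zeros((ny - win, nx - win))
            for i in range(ny - win):
                for j in range(nx - win):
                    spec = np.abs(np.fft.fft2(wavefield[i:i + win, j:j + win] * hann))
                    out[i, j] = kmag.flat[np.argmax(spec)]
            return out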

  12. Quantification of motility of carabid beetles in farmland.

    PubMed

    Allema, A B; van der Werf, W; Groot, J C J; Hemerik, L; Gort, G; Rossing, W A H; van Lenteren, J C

    2015-04-01

    Quantification of the movement of insects at field and landscape levels helps us to understand their ecology and ecological functions. We conducted a meta-analysis on movement of carabid beetles (Coleoptera: Carabidae) to identify key factors affecting movement and population redistribution. We characterize the rate of redistribution using motility μ (L^2 T^-1), a measure of the diffusion of a population in space and time that is consistent with ecological diffusion theory and that can be used for upscaling short-term data to longer time frames. Formulas are provided to calculate motility from literature data on movement distances. A field experiment was conducted to measure the redistribution of the mass-released carabid Pterostichus melanarius in a crop field, and motility was derived by fitting a Fokker-Planck diffusion model using inverse modelling. Bias in estimates of motility from literature data is elucidated using the data from the field experiment as a case study. The meta-analysis showed that motility is 5.6 times as high in farmland as in woody habitat. Species associated with forested habitats had greater motility than species associated with open-field habitats, both in arable land and in woody habitat. The meta-analysis did not identify consistent differences in motility at the species level, or between clusters of larger and smaller beetles. The results presented here provide a basis for calculating time-varying distribution patterns of carabids in farmland and woody habitat. The formulas for calculating motility can be used for other taxa.
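
    The paper's own formulas are not reproduced in the abstract; for orientation, the standard two-dimensional diffusion relation MSD = 4μt gives a one-line estimator (the names are assumptions).

        import numpy as np

        def motility(distances, t):
            # distances: net displacements observed over a common time interval t.
            # For 2-D diffusion, mean squared displacement = 4 * mu * t.
            return np.mean(np.asarray(distances, float) ** 2) / (4.0 * t)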

  13. Pesticide residue quantification analysis by hyperspectral imaging sensors

    NASA Astrophysics Data System (ADS)

    Liao, Yuan-Hsun; Lo, Wei-Sheng; Guo, Horng-Yuh; Kao, Ching-Hua; Chou, Tau-Meu; Chen, Junne-Jih; Wen, Chia-Hsien; Lin, Chinsu; Chen, Hsian-Min; Ouyang, Yen-Chieh; Wu, Chao-Cheng; Chen, Shih-Yu; Chang, Chein-I.

    2015-05-01

    Pesticide residue detection in agricultural crops is a challenging issue, and it is even more difficult to quantify pesticide residues in agricultural produce and fruits. This paper conducts a series of baseline experiments designed specifically for three pesticides commonly used in Taiwan. The materials used for the experiments are single leaves of vegetable produce contaminated with various concentrations of pesticides. Two sensors are used to collect data. One is Fourier transform infrared (FTIR) spectroscopy. The other is a hyperspectral sensor, the Geophysical and Environmental Research (GER) 2600 spectroradiometer, a battery-operated, field-portable spectroradiometer with full real-time data acquisition from 350 nm to 2500 nm. In order to quantify data with different levels of pesticide residue concentration, several measures of spectral discrimination are developed. More specifically, new measures for calculating the relative power of the two sensors are designed to evaluate the effectiveness of each sensor in quantifying the pesticide residues used. The experimental results show that the GER is a better sensor than FTIR for pesticide residue quantification.
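
    The abstract does not name its discrimination measures; as one standard example of such a measure, the spectral angle between two reflectance spectra can be computed as follows.

        import numpy as np

        def spectral_angle(a, b):
            # Angle (radians) between spectra a and b; insensitive to overall
            # illumination scaling, so it isolates differences in spectral shape.
            cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
            return np.arccos(np.clip(cos, -1.0, 1.0))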

  14. Uncertainty quantification applied to the mode coupling phenomenon

    NASA Astrophysics Data System (ADS)

    Treimer, Martin; Allert, Baldur; Dylla, Katrin; Müller, Gerhard

    2017-02-01

    In this study, a method for the uncertainty quantification of friction-induced vibrations based on the mode coupling phenomenon is shown. The main focus is the assessment of the phenomenon under uncertain input parameters for robustness evaluation. Stability assessments of the system under parameter scatter are given, and it is pointed out how this is implemented within the scope of the finite element method. Using the Euler-Bernoulli beam as a proof-of-concept model, a procedure for assessing the system's robustness is shown. An objective function is proposed and used to evaluate a design of experiments. By means of a regression analysis, an indicator of the robustness of the system is given. Numerical results are presented for the Euler-Bernoulli beam and a finite element brake model. The procedure is universal and can be used for robustness assessments in different fields of interest. The algorithm with optimal efficiency is validated by comparison with an algorithm with optimal prediction quality. The procedure is applied to the robustness assessment of brake squeal.

  15. Automated gold particle quantification of immunogold labeled micrographs.

    PubMed

    Enger, Rune

    2017-07-15

    Immunogold cytochemistry is the method of choice for precise localization of antigens on a subcellular scale. The process of immunogold quantification in electron micrographs is laborious, especially for proteins with a dense distribution pattern. Here I present a MATLAB-based toolbox that is optimized for a typical immunogold analysis workflow. It combines automatic detection of gold particles through a multi-threshold algorithm with manual segmentation of cell membranes and regions of interest. The automated particle detection algorithm was applied to a typical immunogold dataset of neural tissue and was able to detect particles with a high degree of precision. Without manual correction, the algorithm detected 97% of all gold particles, with merely a 0.1% false-positive rate. To my knowledge, this is the first free and publicly available software custom made for immunogold analyses. The proposed particle detection method compares favorably to previously published algorithms. The software presented here will be a valuable tool for researchers in neuroscience working with immunogold cytochemistry.
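
    A minimal Python stand-in for the multi-threshold detection idea (the toolbox itself is MATLAB; the thresholds, size limits, and merge radius here are assumptions, not the published algorithm).

        import numpy as np
        from skimage.measure import label, regionprops

        def detect_particles(img, thresholds=(60, 90, 120), min_px=4, max_px=80):
            # Gold particles appear as small dark blobs; detect candidates at
            # several thresholds, then merge detections that coincide.
            centers = []
            for t in thresholds:
                for region in regionprops(label(img < t)):
                    if min_px <= region.area <= max_px:
                        centers.append(region.centroid)
            merged = []
            for c in centers:
                if all(np.hypot(c[0] - m[0], c[1] - m[1]) > 3 for m in merged):
                    merged.append(c)
            return merged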

  16. Quantification of intracellular payload release from polymersome nanoparticles

    PubMed Central

    Scarpa, Edoardo; Bailey, Joanne L.; Janeczek, Agnieszka A.; Stumpf, Patrick S.; Johnston, Alexander H.; Oreffo, Richard O. C.; Woo, Yin L.; Cheong, Ying C.; Evans, Nicholas D.; Newman, Tracey A.

    2016-01-01

    Polymersome nanoparticles (PMs) are attractive candidates for spatio-temporal controlled delivery of therapeutic agents. Although many studies have addressed cellular uptake of solid nanoparticles, very little data is available on the intracellular release of molecules encapsulated in membranous carriers such as polymersomes. Here, we addressed this by developing a quantitative assay based on the hydrophilic dye fluorescein. Fluorescein was stably encapsulated in PMs of mean diameter 85 nm, with minimal leakage after sustained dialysis. No fluorescence was detectable from fluorescein PMs, indicating quenching. Following incubation of L929 cells with fluorescein PMs, there was a gradual increase in intracellular fluorescence, indicating PM disruption and cytosolic release of fluorescein. By combining absorbance measurements with flow cytometry, we quantified the real-time intracellular release of fluorescein at single-cell resolution. We found that 173 ± 38 polymersomes released their payload per cell, with significant heterogeneity in uptake despite controlled synchronisation of the cell cycle. This novel method for quantification of the release of compounds from nanoparticles provides fundamental information on cellular uptake of nanoparticle-encapsulated compounds. It also illustrates the stochastic nature of population distribution in homogeneous cell populations, a factor that must be taken into account in clinical use of this technology. PMID:27404770
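
    A back-of-the-envelope sketch of how bulk absorbance and calibrated flow cytometry can be combined into a per-cell release count; every number below is hypothetical, not a value reported in the paper.

        AVOGADRO = 6.022e23

        # 1) Dye load per polymersome from Beer-Lambert (A = eps * c * l) plus a
        #    particle concentration from nanoparticle tracking (assumed value):
        A, eps, path = 0.45, 7.5e4, 1.0          # absorbance, M^-1 cm^-1, cm
        dye_molar = A / (eps * path)             # mol/L of encapsulated fluorescein
        pm_per_litre = 1.5e16                    # polymersomes per litre (assumed)
        dye_per_pm = dye_molar * AVOGADRO / pm_per_litre

        # 2) Released dye per cell from calibrated flow-cytometry fluorescence
        #    (assumed calibration), giving polymersomes released per cell:
        released_dye_per_cell = 4.2e4            # molecules per cell (assumed)
        print("PMs released per cell:", round(released_dye_per_cell / dye_per_pm))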

  17. SAKE: a new quantification tool for positron emission tomography studies.

    PubMed

    Veronese, Mattia; Rizzo, Gaia; Turkheimer, Federico E; Bertoldo, Alessandra

    2013-07-01

    In dynamic positron emission tomography (PET) studies, spectral analysis (SA) refers to a data-driven quantification method based on a single-input single-output model whose transfer function is described by a sum of exponential terms. SA estimates the number, amplitudes and eigenvalues of these terms, thereby separating the kinetic components of the tissue tracer activity with minimal model assumptions. The SA model can be solved with a linear estimator alone or with numerical filters, resulting in different types of SA approaches. Once the number, amplitudes and eigenvalues of the transfer function have been estimated, one can detect the presence of irreversible and/or reversible components in the system and derive parameters of physiological significance. These characteristics make SA an appealing alternative to the compartmental models that are widely used for the quantitative analysis of dynamic PET studies. However, despite its applicability to a large number of PET tracers, its implementation is not straightforward, and its uptake in the nuclear medicine community has been limited, especially by the lack of a user-friendly software application. In this paper we propose SAKE, a computer program for the quantitative analysis of PET data through the main SA methods. SAKE offers a unified analysis pipeline that is accessible even to users with limited computing experience. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
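
    In the standard SA formulation the tissue curve is modelled as C_T(t) = sum_j alpha_j * (C_p ⊗ exp(-beta_j t))(t) with alpha_j >= 0, which can be fitted by non-negative least squares over a grid of candidate eigenvalues. The sketch below implements that textbook estimator, not SAKE's exact pipeline; the eigenvalue grid and the assumption of uniformly sampled time frames are illustrative.

        import numpy as np
        from scipy.optimize import nnls

        def spectral_analysis(t, c_tissue, c_plasma, betas=None):
            """Basic SA sketch: fit the tissue activity as a non-negative sum of
            the plasma input convolved with exponential basis functions.
            Assumes uniform sampling of t."""
            if betas is None:
                betas = np.logspace(-4, 0, 100)   # grid of candidate eigenvalues
            dt = t[1] - t[0]
            basis = np.column_stack([
                np.convolve(c_plasma, np.exp(-b * t))[:t.size] * dt for b in betas
            ])
            alphas, residual = nnls(basis, c_tissue)
            kept = alphas > 0                     # kinetic components retained
            return betas[kept], alphas[kept], residual

    A retained component with a very small eigenvalue behaves as irreversible trapping, while larger eigenvalues correspond to reversible kinetics, which is how the estimated spectrum separates the two.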

  18. Quantification of acidic compounds in complex biomass-derived streams

    SciTech Connect

    Karp, Eric M.; Nimlos, Claire T.; Deutch, Steve; Salvachúa, Davinia; Cywar, Robin M.; Beckham, Gregg T.

    2016-05-10

    Biomass-derived streams that contain acidic compounds from the degradation of lignin and polysaccharides (e.g. black liquor, pyrolysis oil, pyrolytic lignin) are chemically complex solutions prone to instability and degradation during analysis, making quantification of the compounds within them challenging. Here we present a robust analytical method to quantify acidic compounds in complex biomass-derived mixtures using ion exchange, sample reconstitution in pyridine and derivatization with BSTFA. The procedure is based on an earlier method originally reported for kraft black liquors and, in this work, is applied to identify and quantify a large slate of acidic compounds in corn stover derived alkaline pretreatment liquor (APL) as a function of pretreatment severity. The samples are analyzed by GCxGC-TOFMS to achieve good resolution of the components within the complex mixture. The results reveal the dominant low-molecular-weight components and their concentrations as a function of pretreatment severity. The method is also demonstrated in the context of lignin conversion technologies by using it to track the microbial conversion of an APL substrate. Here, too, excellent results are achieved: compounds appear and disappear in agreement with the known metabolic pathways of two bacteria, indicating that sample integrity was maintained throughout the analysis. Finally, the method is shown to apply more generally to lignin-rich materials through its successful use in the analysis of pyrolysis oil and pyrolytic lignin.
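
    The final quantification step in such GC-MS workflows is typically done against an internal standard; a minimal sketch of that arithmetic follows, with a hypothetical analyte, peak areas and relative response factor (the abstract does not report the calibration details).

        def concentration(area_analyte, area_istd, conc_istd, rrf):
            """Analyte concentration via internal-standard calibration:
            C_a = (A_a / A_istd) * C_istd / RRF."""
            return (area_analyte / area_istd) * conc_istd / rrf

        # e.g. an acidic lignin monomer quantified against an internal standard
        # added at 20 mg/L (all values hypothetical):
        print(concentration(area_analyte=8.4e5, area_istd=5.0e5,
                            conc_istd=20.0, rrf=1.12))
        # -> 30.0 mg/L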

  19. Uranium quantification in semen by inductively coupled plasma mass spectrometry

    USGS Publications Warehouse

    Todorov, Todor; Ejnik, John W.; Guandalini, Gustavo S.; Xu, Hanna; Hoover, Dennis; Anderson, Larry W.; Squibb, Katherine; McDiarmid, Melissa A.; Centeno, Jose A.

    2013-01-01

    In this study we report uranium analysis of human semen samples. Uranium quantification was performed by inductively coupled plasma mass spectrometry. No additives, such as chymotrypsin or bovine serum albumin, were used for semen liquefaction, as they showed significant uranium content. For method validation we spiked 2 g aliquots of pooled control semen at three different levels of uranium: low at 5 pg/g, medium at 50 pg/g, and high at 1000 pg/g. The detection limit was determined to be 0.8 pg/g uranium in human semen. Replicate measurements reproduced within 1.4–7% RSD, and spike recoveries were 97–100%. The uranium level of the unspiked, pooled control semen was 2.9 pg/g of semen (n = 10). In addition, six semen samples from a cohort of Veterans exposed to depleted uranium (DU) in the 1991 Gulf War were analyzed without knowledge of their exposure history. Uranium levels in the Veterans' semen samples ranged from undetectable (<0.8 pg/g) to 3350 pg/g. This wide concentration range is consistent with known differences in the current DU body burdens of these individuals, some of whom have retained embedded DU fragments.
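
    The validation arithmetic (spike recovery and relative standard deviation) is straightforward; the sketch below uses hypothetical replicate values chosen to fall inside the reported 97–100% recovery and 1.4–7% RSD ranges.

        import numpy as np

        spike_added = 50.0                       # pg/g, medium-level spike
        background = 2.9                         # pg/g, unspiked pooled semen
        replicates = np.array([51.2, 52.8, 50.9, 53.4, 51.7])  # measured, pg/g

        recovery = (replicates.mean() - background) / spike_added * 100
        rsd = replicates.std(ddof=1) / replicates.mean() * 100
        print(f"spike recovery: {recovery:.1f}%  RSD: {rsd:.1f}%")
        # -> spike recovery: 98.2%  RSD: 2.0%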

  20. Quantification of the degree of reaction of fly ash

    SciTech Connect

    Ben Haha, M.; De Weerdt, K.; Lothenbach, B.

    2010-11-15

    Quantifying the reaction of fly ash (FA) in blended cements is important for understanding the effect of the FA on the hydration of OPC and on the microstructural development. The FA reaction in two different blended OPC-FA systems was studied using a selective dissolution technique based on EDTA/NaOH, dissolution in diluted NaOH solution, the portlandite content, and backscattered electron image analysis. The amount of reacted FA determined by selective dissolution using EDTA/NaOH is found to carry a significant possible error, as different assumptions lead to large differences in the estimate of the FA reacted. In addition, at longer hydration times this method underestimates the reaction of the FA due to the presence of non-dissolved hydrates and MgO-rich particles. The dissolution of FA in diluted NaOH solution agreed well during the first days with the dissolution observed by image analysis; at 28 days and beyond, the formation of hydrates in the diluted solutions leads to an underestimation. Image analysis appears to give consistent results and to be the most reliable of the techniques studied.
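
    Once the unreacted FA has been segmented in backscattered electron images, the degree of reaction reduces to simple arithmetic; a minimal sketch, assuming the measured area fraction is an unbiased proxy for volume fraction and using illustrative numbers.

        def degree_of_reaction(unreacted_fraction, initial_fa_fraction):
            """DoR = 1 - (unreacted FA fraction at time t) / (initial FA fraction)."""
            return 1.0 - unreacted_fraction / initial_fa_fraction

        # e.g. 12% of the paste cross-section is still unreacted FA, against an
        # initial FA fraction of 20% (hypothetical segmentation results):
        print(degree_of_reaction(unreacted_fraction=0.12, initial_fa_fraction=0.20))
        # -> 0.4, i.e. 40% of the fly ash has reacted at this age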