Science.gov

Sample records for quantification spatialisation vulnerabilite

  1. Race, space, place: notes on the racialisation and spatialisation of commercial sex work in Dubai, UAE.

    PubMed

    Mahdavi, Pardis

    2010-11-01

    This paper focuses on the perceived racialisation and resultant spatialisation of commercial sex in Dubai. In recent years, the sex industry in Dubai has grown to include women from the Middle East, Eastern Europe, East Asia and Africa. With the increase in sex workers of different nationalities has come a form of localised racism that is embedded in structures and desires seen within specific locations. The physical spatialisation of sex work hinges on perceived race and produces distinct income-generating potential for women engaged in the sex industry in Dubai. The social and physical topography of Dubai is important in marginalising or privileging these various groups of sex workers, and it correlates race, space and place with rights and assistance. I begin with a description of the multidirectional flows of causality between race, space, place and demand. I then discuss how these various groups are inversely spatialised within the discourse on assistance, protection and rights. The findings presented here are based on ethnographic research conducted with transnational migrants in the UAE in 2004, 2008 and 2009.

  2. RASOR Project: Rapid Analysis and Spatialisation of Risk, from Hazard to Risk using EO data

    NASA Astrophysics Data System (ADS)

    Rossi, Lauro; Rudari, Roberto

    2016-04-01

    Over recent decades, there has been a dramatic rise in disasters and in their impact on human populations. Escalating complexity in our societies is making risks increasingly difficult to understand and is changing the ways in which hazards interact with each other. The Rapid Analysis and Spatialisation Of Risk (RASOR) project developed a multi-hazard risk analysis platform to support the full cycle of disaster management. RASOR provides up-to-date hazard information across floods and geohazards, up-to-date exposure data from known sources and newly-generated EO-based data, and quantitatively characterised vulnerabilities. RASOR also adapts the newly-developed 12 m resolution global TanDEM-X Digital Elevation Model (DEM) to risk management applications, using it as a base layer to develop specific disaster scenarios. RASOR overlays archived and near real-time very high resolution optical and radar satellite data, combined with in situ data, for both global and local applications. A scenario-driven query system allows users to project situations into the future and model multi-hazard risk both before and during an event. Applications at different case-study sites are presented to illustrate the platform's potential.

  3. Calculation of Agricultural Nitrogen Quantity for EU15, spatialisation of the results to river basins using CORINE Land Cover

    NASA Astrophysics Data System (ADS)

    Campling, P.; Terres, J. M.; Vandewalle, S.; Crouzet, P.

    2003-04-01

    The objective of the study was the implementation of the OECD/Eurostat soil surface balance method to calculate nitrogen balances from agricultural sources for the whole European Union (EU) at administrative or river-basin level. This methodology combines statistics on crop areas and animal numbers with agronomic technical coefficients and Corine Land Cover data to spatialise agricultural nitrogen quantities through spatial modelling. Results for catchments show an EU average surplus of 60 kg N/ha. The distribution of the balances shows high surpluses in regions of intensive livestock farming (Flanders (B), the Netherlands, Brittany (FR)) and low or deficit values in the central areas of Spain, France and Italy. The effect of Corine Land Cover on the nitrogen balance calculations was also examined through scenario analysis. These simulations indicated a slight improvement in estimation when Corine Land Cover was used to spatialise the results of the soil surface balance model. A sensitivity analysis of the technical coefficients was also carried out and showed the model to be more sensitive to crop-related coefficients than to manure coefficients. The overall sensitivity analysis revealed the need to improve the quality of the technical coefficients, which require more consistency and harmonisation and should better reflect regional differences.
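
    The soil surface balance behind these figures is simple bookkeeping: nitrogen inputs to the agricultural soil minus the nitrogen removed with the harvest, normalised per hectare. A minimal sketch of that calculation, with hypothetical coefficient values (the study's actual technical coefficients are crop- and region-specific):

      # Minimal soil-surface nitrogen balance (OECD/Eurostat style); illustrative only.
      # All coefficients and quantities below are placeholders, not the study's values.

      def n_surplus_kg_per_ha(fertiliser_n, manure_n, deposition_n, fixation_n,
                              crop_offtake_n, area_ha):
          """Return the nitrogen surplus in kg N/ha for one region or basin."""
          inputs = fertiliser_n + manure_n + deposition_n + fixation_n  # kg N
          outputs = crop_offtake_n                                      # kg N in harvest
          return (inputs - outputs) / area_ha

      # Animal numbers become manure nitrogen via excretion coefficients,
      # e.g. ~100 kg N/head/year for dairy cattle (placeholder value).
      manure_n = 120_000 * 100.0             # head * kg N/head/year
      crop_offtake_n = 550_000 * 1.5 * 25.0  # area (ha) * yield (t/ha) * kg N/t

      print(n_surplus_kg_per_ha(40e6, manure_n, 2e6, 1e6, crop_offtake_n, 550_000))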

  4. A regional estimate of soil organic carbon content linking the RothC model to spatialised climate and soil database

    NASA Astrophysics Data System (ADS)

    Sirca, Costantino; Salis, Michele; Spano, Donatella

    2014-05-01

    Soil organic carbon (SOC) represents the largest pool of organic carbon in the biosphere and plays a vital role in ecosystem function, determining soil fertility, water-holding capacity, and susceptibility to land degradation. The SOC amount is mainly driven by soil type, land use, and climate. In this work an assessment of SOC pools in Mediterranean soils is presented. The SOC content was estimated at regional scale in Sardinia, the second largest island of the Mediterranean Basin, by linking the RothC (Rothamsted Carbon) model to a highly detailed spatialised climate, land-use and soil database. More than 300 soil analyses, covering different land-use typologies, were used for the calibration and validation of the model. Good agreement was found between the soil C estimated by the model and ground data. The methodology yielded current SOC pool estimates for the different land-use categories at regional scale.

  5. Quantification of Endogenous Retinoids

    PubMed Central

    Kane, Maureen A.; Napoli, Joseph L.

    2014-01-01

    Numerous physiological processes require retinoids, including development, nervous system function, immune responsiveness, proliferation, differentiation, and all aspects of reproduction. Reliable retinoid quantification requires suitable handling and, in some cases, resolution of geometric isomers that have different biological activities. Here we describe procedures for reliable and accurate quantification of retinoids, including detailed descriptions for handling retinoids, preparing standard solutions, collecting samples and harvesting tissues, extracting samples, resolving isomers, and detecting with high sensitivity. Sample-specific strategies are provided for optimizing quantification. Approaches to evaluate assay performance also are provided. Retinoid assays described here for mice also are applicable to other organisms including zebrafish, rat, rabbit, and human and for cells in culture. Retinoid quantification, especially that of retinoic acid, should provide insight into many diseases, including Alzheimer’s disease, type 2 diabetes, obesity, and cancer. PMID:20552420

  6. Quantification of Secondary Metabolites.

    PubMed

    2016-01-01

    Plants are a rich source of secondary metabolites that have medicinal and aromatic properties. Secondary metabolites such as alkaloids, iridoids and phenolics, generally produced by plants for their defence mechanisms, have been implicated in the therapeutic properties of most medicinal plants. Hence, quantification of these metabolites will aid the discovery of new and effective drugs from plant sources and also scientifically validate existing traditional practices. The quantification of large groups of phytochemicals, such as phenolics and flavonoids, is described in this context. PMID:26939265

  7. Quantification of nonclassicality

    NASA Astrophysics Data System (ADS)

    Gehrke, C.; Sperling, J.; Vogel, W.

    2012-11-01

    To quantify single-mode nonclassicality, we start from an operational approach. A positive semidefinite observable is introduced to describe a measurement setup. The quantification is based on the negativity of the normally ordered version of this observable. Perfect operational quantumness corresponds to the quantum-noise-free measurement of the chosen observable. Surprisingly, even moderately squeezed states may exhibit perfect quantumness for a properly designed measurement. The quantification is also considered from an axiomatic viewpoint, based on the algebraic structure of the quantum states and the quantum superposition principle. Basic conclusions from both approaches are consistent with this fundamental principle of the quantum world.
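
    In the notation suggested by the abstract (a sketch with assumed symbols, not the paper's exact formulas): for a measurement described by a positive semidefinite observable A = f†f, every classical (coherent-state diagonal) field keeps the normally ordered expectation value of A nonnegative, so

      \langle {:}\hat{f}^{\dagger}\hat{f}{:} \rangle \;<\; 0

    witnesses quantumness, and the degree of this negativity provides the operational measure referred to above.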

  8. Quantificational logic of context

    SciTech Connect

    Buvac, Sasa

    1996-12-31

    In this paper we extend the Propositional Logic of Context to the quantificational (predicate calculus) case. This extension is important in the declarative representation of knowledge for two reasons. Firstly, since contexts are objects in the semantics which can be denoted by terms in the language and which can be quantified over, the extension enables us to express arbitrary first-order properties of contexts. Secondly, since the extended language is no longer only propositional, we can express that an arbitrary predicate calculus formula is true in a context. The paper describes the syntax and the semantics of a quantificational language of context, gives a Hilbert-style formal system, and outlines a proof of the system's completeness.
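
    For illustration, the kind of statement the quantificational extension makes expressible, written with the ist(c, p) notation of the context-logic literature ("p is true in context c"); the predicate names here are hypothetical and the paper's exact syntax may differ:

      \exists c\, \bigl(\, \mathrm{specializes}(c, c_0) \;\wedge\;
          \mathrm{ist}\bigl(c,\ \forall x\, (\mathrm{penguin}(x) \rightarrow \neg\mathrm{flies}(x))\bigr) \bigr)

    The context variable c is both quantified over and described by a first-order property, while the formula asserted true in c is itself an arbitrary predicate-calculus sentence: the two capabilities the abstract highlights.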

  9. The proteomics quantification dilemma.

    PubMed

    Jungblut, Peter R

    2014-07-31

    Proteomics today is dominated by the protein expression discourse, which favors the bottom-up approach because of its high throughput and high sensitivity. For quantification this approach is misleading if a protein is present as more than one protein species in the sample to be analyzed. The protein speciation discourse considers this more realistic situation and requires top-down procedures, or at least a separation of the protein species prior to identification and quantification. Today all top-down procedures are one order of magnitude less sensitive than bottom-up ones. Increasing sensitivity and throughput are major challenges for proteomics in the coming years. This article is part of a Special Issue entitled: 20 years of Proteomics in memory of Vitaliano Pallini. Guest Editors: Luca Bini, Juan J. Calvete, Natacha Turck, Denis Hochstrasser and Jean-Charles Sanchez. PMID:24681132

  10. SERS Quantification of Entacapone Isomers

    NASA Astrophysics Data System (ADS)

    Marković, Marina; Biljan, Tomislav

    2010-08-01

    Raman spectroscopy, due to its non-destructive character and speed, has found widespread use in pharmaceutical applications [1]. It is also used for quantifying various isomer mixtures, the best known being the quantification of xylene isomers [2-3]. Solid-state isomer quantification of entacapone was reported earlier [4]. Here, we report the quantification of an isomer mixture of an active pharmaceutical substance, in solution, by SERS.

  11. Quantification of human responses

    NASA Technical Reports Server (NTRS)

    Steinlage, R. C.; Gantner, T. E.; Lim, P. Y. W.

    1992-01-01

    Human perception is a complex phenomenon which is difficult to quantify with instruments. For this reason, large panels of people are often used to elicit and aggregate subjective judgments. Print quality, taste, smell, sound quality of a stereo system, softness, and the grading of Olympic divers and skaters are some examples of situations where subjective measurements or judgments are paramount. We usually express what is in our mind through the medium of language, but languages are limited in their available vocabularies, and as a result our verbalizations are only approximate expressions of what we really have in mind. For lack of better methods to quantify subjective judgments, it is customary to set up a numerical scale such as 1, 2, 3, 4, 5 or 1, 2, 3, ..., 9, 10 for characterizing human responses and subjective judgments, with no valid justification except that these scales are easy to understand and convenient to use. But these numerical scales are arbitrary simplifications of the complex human mind; the human mind is not restricted to such simple numerical variations. In fact, human responses and subjective judgments are psychophysical phenomena that are fuzzy entities and therefore difficult to handle with conventional mathematics and probability theory. The fuzzy mathematical approach provides a more realistic insight into understanding and quantifying human responses. This paper presents a method for quantifying human responses and subjective judgments without assuming a pattern of linear or numerical variation for human responses. In particular, the quantification and evaluation of linguistic judgments was investigated.
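
    As one deliberately simplified illustration of the fuzzy-mathematical idea, linguistic judgments can be encoded as triangular fuzzy numbers and aggregated without forcing a crisp numerical scale. The term anchors below are hypothetical, not taken from the paper:

      # Aggregate linguistic judgments as triangular fuzzy numbers (a, b, c),
      # then defuzzify by centroid. Term definitions are illustrative only.
      TERMS = {
          "poor":      (0.0, 0.0, 0.3),
          "fair":      (0.2, 0.4, 0.6),
          "good":      (0.5, 0.7, 0.9),
          "excellent": (0.8, 1.0, 1.0),
      }

      def aggregate(judgments):
          """Average the fuzzy numbers elicited from a panel."""
          n = len(judgments)
          return tuple(sum(TERMS[j][k] for j in judgments) / n for k in range(3))

      def centroid(tri):
          """Centroid of a triangular membership function."""
          return sum(tri) / 3.0

      panel = ["good", "excellent", "fair", "good", "good"]
      fuzzy = aggregate(panel)
      print(fuzzy, centroid(fuzzy))

    The panel's judgment stays a fuzzy quantity until the final step, rather than being collapsed to an arbitrary 1-to-10 score at elicitation time.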

  12. MAMA Software Features: Quantification Verification Documentation-1

    SciTech Connect

    Ruggiero, Christy E.; Porter, Reid B.

    2014-05-21

    This document reviews the verification of the basic shape quantification attributes in the MAMA software against hand calculations in order to show that the calculations are implemented mathematically correctly and give the expected quantification results.

  13. Detection and Quantification of Neurotransmitters in Dialysates

    PubMed Central

    Zapata, Agustin; Chefer, Vladimir I.; Shippenberg, Toni S.; Denoroy, Luc

    2010-01-01

    Sensitive analytical methods are needed for the separation and quantification of neurotransmitters obtained in microdialysate studies. This unit describes methods that permit quantification of nanomolar concentrations of monoamines and their metabolites (high-pressure liquid chromatography with electrochemical detection), acetylcholine (HPLC coupled to an enzyme reactor), and amino acids (HPLC with fluorescence detection; capillary electrophoresis with laser-induced fluorescence detection). PMID:19575473

  14. Processing and domain selection: Quantificational variability effects

    PubMed Central

    Harris, Jesse A.; Clifton, Charles; Frazier, Lyn

    2014-01-01

    Three studies investigated how readers interpret sentences with variable quantificational domains, e.g., The army was mostly in the capital, where mostly may quantify over individuals or parts (Most of the army was in the capital) or over times (The army was in the capital most of the time). It is proposed that a general conceptual economy principle, No Extra Times (Majewski 2006, in preparation), discourages the postulation of potentially unnecessary times, and thus favors the interpretation quantifying over parts. Disambiguating an ambiguously quantified sentence to a quantification over times interpretation was rated as less natural than disambiguating it to a quantification over parts interpretation (Experiment 1). In an interpretation questionnaire, sentences with similar quantificational variability were constructed so that both interpretations of the sentence would require postulating multiple times; this resulted in the elimination of the preference for a quantification over parts interpretation, suggesting the parts preference observed in Experiment 1 is not reducible to a lexical bias of the adverb mostly (Experiment 2). An eye movement recording study showed that, in the absence of prior evidence for multiple times, readers exhibit greater difficulty when reading material that forces a quantification over times interpretation than when reading material that allows a quantification over parts interpretation (Experiment 3). These experiments contribute to understanding readers’ default assumptions about the temporal properties of sentences, which is essential for understanding the selection of a domain for adverbial quantifiers and, more generally, for understanding how situational constraints influence sentence processing. PMID:25328262

  15. Advancing agricultural greenhouse gas quantification

    NASA Astrophysics Data System (ADS)

    Olander, Lydia; Wollenberg, Eva; Tubiello, Francesco; Herold, Martin

    2013-03-01

    1. Introduction Better information on greenhouse gas (GHG) emissions and mitigation potential in the agricultural sector is necessary to manage these emissions and identify responses that are consistent with the food security and economic development priorities of countries. Critical activity data (what crops or livestock are managed in what way) are poor or lacking for many agricultural systems, especially in developing countries. In addition, the currently available methods for quantifying emissions and mitigation are often too expensive or complex or not sufficiently user friendly for widespread use. The purpose of this focus issue is to capture the state of the art in quantifying greenhouse gases from agricultural systems, with the goal of better understanding our current capabilities and near-term potential for improvement, with particular attention to quantification issues relevant to smallholders in developing countries. This work is timely in light of international discussions and negotiations around how agriculture should be included in efforts to reduce and adapt to climate change impacts, and considering that significant climate financing to developing countries in post-2012 agreements may be linked to their increased ability to identify and report GHG emissions (Murphy et al 2010, CCAFS 2011, FAO 2011). 2. Agriculture and climate change mitigation The main agricultural GHGs—methane and nitrous oxide—account for 10%-12% of anthropogenic emissions globally (Smith et al 2008), or around 50% and 60% of total anthropogenic methane and nitrous oxide emissions, respectively, in 2005. Net carbon dioxide fluxes between agricultural land and the atmosphere linked to food production are relatively small, although significant carbon emissions are associated with degradation of organic soils for plantations in tropical regions (Smith et al 2007, FAO 2012). Population growth and shifts in dietary patterns toward more meat and dairy consumption will lead to

  16. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

  17. Direct qPCR quantification using the Quantifiler® Trio DNA quantification kit.

    PubMed

    Liu, Jason Yingjie

    2014-11-01

    The effectiveness of a direct quantification assay is essential to the adoption of the combined direct quantification/direct STR workflow. In this paper, the feasibility of using the Quantifiler® Trio DNA quantification kit for the direct quantification of forensic casework samples was investigated. Both low-level touch DNA samples and blood samples were collected on PE swabs and quantified directly. The increased sensitivity of the Quantifiler® Trio kit enabled the detection of less than 10 pg of DNA in unprocessed touch samples and also minimized the stochastic effects experienced by different targets in the same sample. The DNA quantity information obtained from a direct quantification assay using the Quantifiler® Trio kit can also be used to accurately estimate the optimal input DNA quantity for a direct STR amplification reaction. The correlation between the direct quantification results (Quantifiler® Trio kit) and the direct STR results (GlobalFiler™ PCR amplification kit) for low-level touch DNA samples indicates that direct quantification using the Quantifiler® Trio DNA quantification kit is more reliable than the Quantifiler® Duo DNA quantification kit for predicting the STR results of unprocessed touch DNA samples containing less than 10 pg of DNA.

  18. MAMA Software Features: Visual Examples of Quantification

    SciTech Connect

    Ruggiero, Christy E.; Porter, Reid B.

    2014-05-20

    This document shows examples of the results from quantifying objects of certain sizes and types in the software. It is intended to give users a better feel for some of the quantification calculations, and, more importantly, to help users understand the challenges with using a small set of ‘shape’ quantification calculations for objects that can vary widely in shapes and features. We will add more examples to this in the coming year.

  19. Quantification of gap junction selectivity.

    PubMed

    Ek-Vitorín, Jose F; Burt, Janis M

    2005-12-01

    Gap junctions, which are essential for functional coordination and homeostasis within tissues, permit the direct intercellular exchange of small molecules. The abundance and diversity of this exchange depend on the number and selectivity of the constituent channels and on the transjunctional gradient for, and chemical character of, the permeant molecules. Limited knowledge of functionally significant permeants and poor detectability of those few that are known have made it difficult to define channel selectivity. Presented herein is a multifaceted approach to the quantification of gap junction selectivity that includes determination of the rate constant for intercellular diffusion of a fluorescent probe (k2-DYE) and the junctional conductance (gj) for each junction studied, such that the selective permeability (k2-DYE/gj) can be compared across dyes with differing chemical characteristics or junctions with differing connexin (Cx) compositions (or treatment conditions). In addition, selective permeability can be correlated with single-channel conductance when this parameter is also measured. Our measurement strategy is capable of detecting 1) rate constants and selective permeabilities that differ across three orders of magnitude and 2) acute changes in that rate constant. Using this strategy, we have shown that 1) the selective permeability of Cx43 junctions to a small cationic dye varied across two orders of magnitude, consistent with the hypothesis that the various channel configurations adopted by Cx43 display different selective permeabilities; and 2) the selective permeability of Cx37 junctions was consistently and significantly lower than that of Cx43 junctions. PMID:16093281
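
    The key quantity reduces to the ratio defined in the abstract; in symbols:

      P_{\mathrm{sel}} \;=\; \frac{k_{2\text{-}\mathrm{DYE}}}{g_j}

    where k2-DYE is the rate constant for intercellular diffusion of the dye and gj the junctional conductance, so dye transfer is normalised to the degree of electrical coupling before dyes or connexin compositions are compared.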

  1. Stirling Convertor Fasteners Reliability Quantification

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Korovaichuk, Igor; Kovacevich, Tiodor; Schreiber, Jeffrey G.

    2006-01-01

    Onboard Radioisotope Power Systems (RPS) being developed for NASA's deep-space science and exploration missions require reliable operation for up to 14 years and beyond. Stirling power conversion is a candidate for use in an RPS because it offers a multifold increase in the conversion efficiency of heat to electric power and a reduced inventory of radioactive material. Structural fasteners are responsible for maintaining the structural integrity of the Stirling power convertor, which is critical to ensure reliable performance during the entire mission. Fastener design involves variables related to fabrication and manufacturing, the material behavior of the fasteners and joined parts, the structural geometry of the joined components, the size and spacing of fasteners, mission loads, boundary conditions, etc. These variables have inherent uncertainties, which need to be accounted for in the reliability assessment. This paper describes these uncertainties along with a methodology to quantify reliability, and provides results of the analysis in terms of quantified reliability and the sensitivity of Stirling power conversion reliability to the design variables. Quantification of the reliability includes both structural and functional aspects of the joined components. Based on the results, the paper also describes guidelines to improve reliability and verification testing.
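
    This record does not spell out the probabilistic method used, so the following is a generic Monte Carlo illustration of the underlying idea: propagate uncertain design variables through a limit state to a quantified reliability and variable sensitivities. The limit state (preload retained after relaxation must exceed the applied load) and all distributions are hypothetical:

      import numpy as np

      rng = np.random.default_rng(1)
      N = 100_000

      # Hypothetical uncertain design variables (all distributions are placeholders).
      preload  = rng.normal(20.0, 2.0, N)   # fastener preload, kN
      ext_load = rng.normal(14.0, 3.0, N)   # mission load per fastener, kN
      relax    = rng.uniform(0.0, 0.15, N)  # fractional preload relaxation

      margin = preload * (1.0 - relax) - ext_load  # g > 0 means the joint holds
      reliability = np.mean(margin > 0.0)

      # Crude sensitivity measure: correlation of each variable with the margin.
      for name, x in [("preload", preload), ("ext_load", ext_load), ("relax", relax)]:
          print(name, round(float(np.corrcoef(x, margin)[0, 1]), 2))
      print("reliability ~", reliability)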

  2. Separation and quantification of microalgal carbohydrates.

    PubMed

    Templeton, David W; Quinn, Matthew; Van Wychen, Stefanie; Hyman, Deborah; Laurens, Lieve M L

    2012-12-28

    Structural carbohydrates can constitute a large fraction of the dry weight of algal biomass, and thus accurate identification and quantification are important for summative mass closure. Two limitations to the accurate characterization of microalgal carbohydrates are the lack of a robust analytical procedure to hydrolyze polymeric carbohydrates to their respective monomers and the subsequent identification and quantification of those monosaccharides. We address the second limitation, chromatographic separation of monosaccharides, here by identifying optimum conditions for the resolution of a synthetic mixture of 13 microalgae-specific monosaccharides, comprising 8 neutral sugars, 2 amino sugars, 2 uronic acids and 1 alditol (myo-inositol as an internal standard). The synthetic 13-carbohydrate mix showed incomplete resolution across 11 traditional high performance liquid chromatography (HPLC) methods, but showed improved resolution and accurate quantification using anion exchange chromatography (HPAEC), as well as alditol acetate derivatization followed by gas chromatography (for the neutral and amino sugars only). We demonstrate the application of monosaccharide quantification using optimized chromatography conditions after sulfuric acid analytical hydrolysis for three model algae strains, and compare the quantification and complexity of monosaccharides in analytical hydrolysates relative to a typical terrestrial feedstock, sugarcane bagasse.

  3. Carotid intraplaque neovascularization quantification software (CINQS).

    PubMed

    Akkus, Zeynettin; van Burken, Gerard; van den Oord, Stijn C H; Schinkel, Arend F L; de Jong, Nico; van der Steen, Antonius F W; Bosch, Johan G

    2015-01-01

    Intraplaque neovascularization (IPN) is an important biomarker of atherosclerotic plaque vulnerability. As IPN can be detected by contrast-enhanced ultrasound (CEUS), imaging biomarkers derived from CEUS may allow early prediction of plaque vulnerability. To select the best quantitative imaging biomarkers for prediction of plaque vulnerability, a systematic analysis of IPN with existing and new analysis algorithms is necessary. Currently available commercial contrast quantification tools are not applicable to quantitative analysis of carotid IPN due to substantial motion of the carotid artery, artifacts, and intermittent perfusion of plaques. We therefore developed a specialized software package called Carotid Intraplaque Neovascularization Quantification Software (CINQS), designed for effective and systematic comparison of sets of quantitative imaging biomarkers. CINQS includes several analysis algorithms for carotid IPN quantification and overcomes the limitations of current contrast quantification tools and existing carotid IPN quantification approaches. CINQS has a modular design which allows integration of new analysis tools. Wizard-like analysis tools and its graphical user interface facilitate its usage. In this paper, we describe the concept, analysis tools, and performance of CINQS and present analysis results for 45 plaques of 23 patients. The results in 45 plaques showed excellent agreement with visual IPN scores for two quantitative imaging biomarkers (areas under the receiver operating characteristic curve of 0.92 and 0.93). PMID:25561454

  4. Quantification of Cannabinoid Content in Cannabis

    NASA Astrophysics Data System (ADS)

    Tian, Y.; Zhang, F.; Jia, K.; Wen, M.; Yuan, Ch.

    2015-09-01

    Cannabis is an economically important plant that is used in many fields, in addition to being the most commonly consumed illicit drug worldwide. Monitoring the spatial distribution of cannabis cultivation and judging whether it is drug- or fiber-type cannabis is critical for governments and international communities seeking to understand the scale of the illegal drug trade. The aim of this study was to investigate whether the cannabinoid content in cannabis could be spectrally quantified using a spectrometer and to identify the optimal wavebands for quantifying cannabinoid content. Spectral reflectance data of dried cannabis leaf samples and of the cannabis canopy were measured in the laboratory and in the field, respectively. Correlation analysis and the stepwise multivariate regression method were used to select the optimal wavebands for cannabinoid content quantification based on the laboratory-measured spectral data. The results indicated that the delta-9-tetrahydrocannabinol (THC) content in cannabis leaves could be quantified using laboratory-measured spectral reflectance data and that the 695 nm band is the optimal band for THC content quantification. This study provides prerequisite information for designing spectral equipment to enable immediate quantification of THC content in cannabis and to discriminate drug- from fiber-type cannabis in the field based on THC content.
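
    The quantification step is a regression of measured THC content on reflectance in the selected band(s); the abstract identifies 695 nm as the optimal single band. A single-band sketch with placeholder calibration values (the study's stepwise multivariate regression would extend this to several bands):

      import numpy as np

      # Placeholder calibration set: reflectance at 695 nm vs. measured THC (%).
      # The numbers are illustrative only, not the study's data.
      refl_695 = np.array([0.12, 0.15, 0.18, 0.22, 0.26, 0.31])
      thc_pct  = np.array([0.9,  1.4,  1.8,  2.5,  3.1,  3.8])

      slope, intercept = np.polyfit(refl_695, thc_pct, 1)  # least-squares line

      def predict_thc(r695):
          return slope * r695 + intercept

      print(f"THC ~ {slope:.1f} * R695 + {intercept:.2f}")
      print("predicted THC at R695 = 0.20:", round(predict_thc(0.20), 2), "%")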

  5. Colour thresholding and objective quantification in bioimaging

    NASA Technical Reports Server (NTRS)

    Fermin, C. D.; Gerber, M. A.; Torre-Bueno, J. R.

    1992-01-01

    Computer imaging is rapidly becoming an indispensable tool for the quantification of variables in research and medicine. Whilst its use in medicine has largely been limited to qualitative observations, imaging in applied basic sciences, medical research and biotechnology demands objective quantification of the variables in question. In black and white densitometry (0-256 levels of intensity) the separation of subtle differences between closely related hues from stains is sometimes very difficult. True-colour and real-time video microscopy analysis offer choices not previously available with monochrome systems. In this paper we demonstrate the usefulness of colour thresholding, which has so far proven indispensable for proper objective quantification of the products of histochemical reactions and/or subtle differences in tissue and cells. In addition, we provide interested, but untrained readers with basic information that may assist decisions regarding the most suitable set-up for a project under consideration. Data from projects in progress at Tulane are shown to illustrate the advantage of colour thresholding over monochrome densitometry and for objective quantification of subtle colour differences between experimental and control samples.
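
    At its core, colour thresholding is a per-channel range test that monochrome densitometry cannot express. A minimal numpy sketch, with hypothetical bounds standing in for a calibrated stain colour:

      import numpy as np

      def colour_threshold(img, lo, hi):
          """Boolean mask of pixels whose R, G and B all fall inside [lo, hi].

          img: H x W x 3 uint8 array; lo, hi: length-3 channel bounds.
          """
          lo = np.asarray(lo)
          hi = np.asarray(hi)
          return np.all((img >= lo) & (img <= hi), axis=-1)

      # Hypothetical example: quantify a reddish-brown reaction product.
      img = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)  # stand-in image
      mask = colour_threshold(img, lo=(120, 40, 20), hi=(200, 110, 80))
      print("positive area fraction:", round(float(mask.mean()), 4))

    The same mask can then feed the usual densitometric measurements (area, counts, mean intensity) restricted to the colour-positive pixels.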

  6. HPC Analytics Support. Requirements for Uncertainty Quantification Benchmarks

    SciTech Connect

    Paulson, Patrick R.; Purohit, Sumit; Rodriguez, Luke R.

    2015-05-01

    This report outlines techniques for extending benchmark generation products so they support uncertainty quantification by benchmarked systems. We describe how uncertainty quantification requirements can be presented to candidate analytical tools supporting SPARQL. We describe benchmark data sets for evaluating uncertainty quantification, as well as an approach for using our benchmark generator to produce such data sets.

  7. Sulphite quantification on damaged stones and mortars

    NASA Astrophysics Data System (ADS)

    Gobbi, G.; Zappia, G.; Sabbioni, C.

    An analytical procedure was developed for the simultaneous identification and quantification of sulphite and the main anions found in degradation patinas on historic buildings and monuments, as well as on stones and mortars exposed in simulation-chamber and field tests. The quantification of anions was performed by ion chromatography (IC) after stabilisation of the sulphite with a D(-) fructose solution. The use of two different chromatographic columns connected in series allowed the simultaneous determination of fluoride, acetate, formate, chloride, nitrite, bromide, iodide, oxyhalides, nitrate, phosphate, sulphite, sulphate and oxalate in approximately 25 min, without interference and with high reproducibility. Finally, the results show that in the majority of cases the formation of sulphite is an intermediate stage in the sulphation process affecting building materials exposed to the environment, and that it needs to be measured together with sulphate in order to obtain a correct interpretation of the degradation mechanisms of such materials.

  8. Whitepaper on Uncertainty Quantification for MPACT

    SciTech Connect

    Williams, Mark L.

    2015-12-17

    The MPACT code provides the ability to perform high-fidelity deterministic calculations to obtain a wide variety of detailed results for very complex reactor core models. However MPACT currently does not have the capability to propagate the effects of input data uncertainties to provide uncertainties in the calculated results. This white paper discusses a potential method for MPACT uncertainty quantification (UQ) based on stochastic sampling.

  9. Automated quantification of synapses by fluorescence microscopy.

    PubMed

    Schätzle, Philipp; Wuttke, René; Ziegler, Urs; Sonderegger, Peter

    2012-02-15

    The quantification of synapses in neuronal cultures is essential in studies of the molecular mechanisms underlying synaptogenesis and synaptic plasticity. Conventional counting of synapses based on morphological or immunocytochemical criteria is extremely work-intensive. We developed a fully automated method which quantifies synaptic elements and complete synapses based on immunocytochemistry. Pre- and postsynaptic elements are detected by their corresponding fluorescence signals and their proximity to dendrites. Synapses are defined as the combination of a pre- and postsynaptic element within a given distance. The analysis is performed in three dimensions and all parameters required for quantification can be easily adjusted by a graphical user interface. The integrated batch processing enables the analysis of large datasets without any further user interaction and is therefore efficient and timesaving. The potential of this method was demonstrated by an extensive quantification of synapses in neuronal cultures from DIV 7 to DIV 21. The method can be applied to all datasets containing a pre- and postsynaptic labeling plus a dendritic or cell surface marker.
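
    The pairing rule described above (a pre- and a postsynaptic element within a given distance constitute a synapse) translates directly into code. A 3-D sketch using a k-d tree; the coordinates and distance threshold are placeholders:

      import numpy as np
      from scipy.spatial import cKDTree

      rng = np.random.default_rng(0)
      pre  = rng.uniform(0, 50, (300, 3))  # x, y, z centroids of presynaptic puncta (um)
      post = rng.uniform(0, 50, (280, 3))  # postsynaptic puncta

      MAX_DIST = 0.5  # pairing distance in um (placeholder; a tunable parameter)

      # For each presynaptic punctum, find the nearest postsynaptic punctum.
      dist, idx = cKDTree(post).query(pre, k=1, distance_upper_bound=MAX_DIST)
      n_synapses = int(np.sum(np.isfinite(dist)))  # finite distance => partner found

      print("complete synapses:", n_synapses)

    This counts presynaptic puncta with at least one partner; the published pipeline additionally restricts both element types to the vicinity of dendrites before pairing.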

  10. Protein quantification using a cleavable reporter peptide.

    PubMed

    Duriez, Elodie; Trevisiol, Stephane; Domon, Bruno

    2015-02-01

    Peptide and protein quantification based on isotope dilution and mass spectrometry analysis is widely employed for the measurement of biomarkers and in systems biology applications. The accuracy and reliability of such quantitative assays depend on the quality of the stable-isotope-labeled standards. Although quantification using stable-isotope-labeled peptides is precise, the accuracy of the results can be severely biased by the purity of the internal standards, their stability and formulation, and the determination of their concentration. Here we describe a rapid and cost-efficient method to recalibrate stable-isotope-labeled peptides in a single LC-MS analysis. The method is based on the equimolar release of a protein reference peptide (used as a surrogate for the protein of interest) and a universal reporter peptide during trypsinization of a concatenated polypeptide standard. The quality and accuracy of data generated with such concatenated polypeptide standards are highlighted by the quantification of two clinically important proteins in urine samples and compared with results obtained with conventional stable-isotope-labeled reference peptides. Furthermore, the application of the UCRP standards in complex samples is described. PMID:25411902

  11. Numerical approach for quantification of epistemic uncertainty

    NASA Astrophysics Data System (ADS)

    Jakeman, John; Eldred, Michael; Xiu, Dongbin

    2010-06-01

    In the field of uncertainty quantification, uncertainty in the governing equations may assume two forms: aleatory uncertainty and epistemic uncertainty. Aleatory uncertainty can be characterised by known probability distributions whilst epistemic uncertainty arises from a lack of knowledge of probabilistic information. While extensive research efforts have been devoted to the numerical treatment of aleatory uncertainty, little attention has been given to the quantification of epistemic uncertainty. In this paper, we propose a numerical framework for quantification of epistemic uncertainty. The proposed methodology does not require any probabilistic information on uncertain input parameters. The method only necessitates an estimate of the range of the uncertain variables that encapsulates the true range of the input variables with overwhelming probability. To quantify the epistemic uncertainty, we solve an encapsulation problem, which is a solution to the original governing equations defined on the estimated range of the input variables. We discuss solution strategies for solving the encapsulation problem and the sufficient conditions under which the numerical solution can serve as a good estimator for capturing the effects of the epistemic uncertainty. In the case where probability distributions of the epistemic variables become known a posteriori, we can use the information to post-process the solution and evaluate solution statistics. Convergence results are also established for such cases, along with strategies for dealing with mixed aleatory and epistemic uncertainty. Several numerical examples are presented to demonstrate the procedure and properties of the proposed methodology.
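
    Schematically (notation assumed here, not quoted from the paper): with epistemic inputs z known only to lie within some range, the encapsulation problem solves the governing equations over an enclosing box,

      u_t = \mathcal{L}(u;\, z), \qquad
      z \in I_Z = [a_1, b_1] \times \cdots \times [a_d, b_d] \supseteq Z_{\mathrm{true}},

    for all z in I_Z. If a probability distribution on z later becomes known, solution statistics are obtained by post-processing the stored solutions over I_Z, which is the a posteriori evaluation the abstract describes.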

  12. Automated Template Quantification for DNA Sequencing Facilities

    PubMed Central

    Ivanetich, Kathryn M.; Yan, Wilson; Wunderlich, Kathleen M.; Weston, Jennifer; Walkup, Ward G.; Simeon, Christian

    2005-01-01

    The quantification of plasmid DNA by the PicoGreen dye-binding assay has been automated, and the effect of quantification of user-submitted templates on DNA sequence quality in a core laboratory has been assessed. The protocol pipets, mixes and reads standards, blanks and up to 88 unknowns, generates a standard curve, and calculates template concentrations. For pUC19 replicates at five concentrations, coefficients of variation were < 0.1, and percent errors ranged from 1% to 7% (n = 198). Standard curves with pUC19 DNA were nonlinear over the 1 to 1733 ng/μL concentration range required to assay the majority (98.7%) of user-submitted templates. Over 35,000 templates have been quantified using the protocol. For 1350 user-submitted plasmids, 87% deviated by ≥ 20% from the requested concentration (500 ng/μL). Based on data from 418 sequencing reactions, quantification of user-submitted templates was shown to significantly improve DNA sequence quality. The protocol is applicable to all types of double-stranded DNA, is unaffected by primer (1 pmol/μL), and is user-modifiable. The protocol takes 30 min, saves 1 h of technical time, and costs approximately $0.20 per unknown. PMID:16461949
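
    The quantification step (read standards and blanks, build a standard curve, convert unknown fluorescence to concentration) can be sketched in a few lines. Because the abstract notes the curve is nonlinear over the full 1-1733 ng/μL range, the sketch uses monotone interpolation rather than a straight-line fit; all readings are placeholders:

      import numpy as np

      # Blank-corrected fluorescence of pUC19 standards (placeholder readings).
      conc_std  = np.array([1, 10, 50, 200, 600, 1200, 1733])    # ng/uL
      fluor_std = np.array([5, 48, 230, 820, 2100, 3400, 4200])  # arbitrary units

      def quantify(fluor_unknowns):
          """Map unknown fluorescence onto the standard curve by interpolation."""
          return np.interp(fluor_unknowns, fluor_std, conc_std)

      unknowns = np.array([150, 900, 3900])
      print(quantify(unknowns))  # estimated template concentrations, ng/uL

    Readings outside the standards' fluorescence range clamp to the curve ends, which is best treated as a flag for re-dilution rather than a number to report.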

  13. Quantification of Tissue Properties in Small Volumes

    SciTech Connect

    Mourant, J.; et al.

    2000-12-01

    The quantification of tissue properties by optical measurements will facilitate the development of noninvasive methods of cancer diagnosis and detection. Optical measurements are sensitive to tissue structure which is known to change during tumorigenesis. The goals of the work presented in this paper were to verify that the primary scatterers of light in cells are structures much smaller than the nucleus and then to develop an optical technique that can quantify parameters of structures the same size as the scattering features in cells. Polarized, elastic back-scattering was found to be able to quantify changes in scattering properties for turbid media consisting of scatterers of the size found in tissue.

  14. Tutorial examples for uncertainty quantification methods.

    SciTech Connect

    De Bord, Sarah

    2015-08-01

    This report details the work accomplished during my 2015 SULI summer internship at Sandia National Laboratories in Livermore, CA. During this internship, I worked on multiple tasks with the common goal of making uncertainty quantification (UQ) methods more accessible to the general scientific community. As part of my work, I created a comprehensive numerical integration example to incorporate into the user manual of a UQ software package. Further, I developed examples involving heat transfer through a window to incorporate into tutorial lectures that serve as an introduction to UQ methods.

  15. Adjoint-Based Uncertainty Quantification with MCNP

    SciTech Connect

    Seifried, Jeffrey E.

    2011-09-01

    This work serves to quantify the instantaneous uncertainties in neutron transport simulations born from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.
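
    In the standard notation of this approach (a sketch; the report's own derivation may differ in detail), the adjoint solution yields relative sensitivity coefficients, and the "sandwich rule" then propagates nuclear data covariances to a figure of merit R:

      S_i = \frac{\alpha_i}{R} \frac{\partial R}{\partial \alpha_i}, \qquad
      \left(\frac{\delta R}{R}\right)^{2} = \sum_{i,j} S_i\, C_{ij}\, S_j

    where C_{ij} is the relative covariance matrix of the data parameters α. The point of the adjoint formulation is that the S_i come from one forward and one adjoint calculation instead of one perturbed run per parameter.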

  16. Virus detection and quantification using electrical parameters

    PubMed Central

    Ahmad, Mahmoud Al; Mustafa, Farah; Ali, Lizna M.; Rizvi, Tahir A.

    2014-01-01

    Here we identify and quantitate two similar viruses, human and feline immunodeficiency viruses (HIV and FIV), suspended in a liquid medium without labeling, using a semiconductor technique. The virus count was estimated by calculating the impurities inside a defined volume by observing the change in electrical parameters. Empirically, the virus count was similar to the absolute value of the ratio of the change of the virus suspension dopant concentration relative to the mock dopant over the change in virus suspension Debye volume relative to mock Debye volume. The virus type was identified by constructing a concentration-mobility relationship which is unique for each kind of virus, allowing for a fast (within minutes) and label-free virus quantification and identification. For validation, the HIV and FIV virus preparations were further quantified by a biochemical technique and the results obtained by both approaches corroborated well. We further demonstrate that the electrical technique could be applied to accurately measure and characterize silica nanoparticles that resemble the virus particles in size. Based on these results, we anticipate our present approach to be a starting point towards establishing the foundation for label-free electrical-based identification and quantification of an unlimited number of viruses and other nano-sized particles. PMID:25355078
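
    Written out, the empirical counting rule quoted above is the ratio of two mock-referenced changes (symbols are chosen here for clarity, not taken from the paper):

      N_{\mathrm{virus}} \;\approx\; \left| \frac{n_{\mathrm{virus}} - n_{\mathrm{mock}}}{V_{D,\mathrm{virus}} - V_{D,\mathrm{mock}}} \right|

    where n is the apparent dopant concentration extracted from the electrical measurement and V_D the corresponding Debye volume, each referenced against the virus-free mock suspension.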

  17. Quantification of prebiotics in commercial infant formulas.

    PubMed

    Sabater, Carlos; Prodanov, Marin; Olano, Agustín; Corzo, Nieves; Montilla, Antonia

    2016-03-01

    Since breastfeeding is not always possible, infant formulas (IFs) are supplemented with prebiotic oligosaccharides, such as galactooligosaccharides (GOS) and/or fructooligosaccharides (FOS), to exert effects similar to those of breast milk. Nowadays a great number of infant formulas enriched with prebiotics are available on the market, yet data about their composition are scarce. In this study, two chromatographic methods (GC-FID and HPLC-RID) were used in combination to quantify the carbohydrates present in commercial infant formulas. According to the results obtained by GC-FID for products containing prebiotics, the content of FOS, GOS and GOS/FOS was in the ranges of 1.6-5.0, 1.7-3.2, and 0.08-0.25/2.3-3.8 g/100 g of product, respectively. HPLC-RID analysis allowed quantification of maltodextrins with degrees of polymerization (DP) up to 19. The methodology proposed here may be used for routine quality control of infant formulas and other food ingredients containing prebiotics.

  18. Simple quantification of in planta fungal biomass.

    PubMed

    Ayliffe, Michael; Periyannan, Sambasivam K; Feechan, Angela; Dry, Ian; Schumann, Ulrike; Lagudah, Evans; Pryor, Anthony

    2014-01-01

    An accurate assessment of the disease resistance status of plants to fungal pathogens is an essential requirement for the development of resistant crop plants. Many disease resistance phenotypes are partial rather than obvious immunity and are frequently scored using subjective, qualitative estimates of pathogen development or plant disease symptoms. Here we report a method for the accurate comparison of total fungal biomass in plant tissues. This method, called the WAC assay, is based upon the specific binding of the plant lectin wheat germ agglutinin to fungal chitin. The assay is simple, high-throughput, and sensitive enough to discriminate between single Puccinia graminis f. sp. tritici infection sites on a wheat leaf segment. It lends itself well to replication, as large volumes of tissue can be pooled from independent experiments and assayed to provide truly representative quantification; alternatively, fungal growth on a single small leaf segment can be quantified. In addition, as the assay is based upon a microscopic technique, pathogen infection sites can be examined at high magnification prior to quantification if desired, and average infection-site areas determined. Previously, we demonstrated the application of the WAC assay for quantifying the growth of several different pathogen species in both glasshouse-grown material and large-scale field plots. Details of the method are provided within.

  1. Quantification of ontogenetic allometry in ammonoids.

    PubMed

    Korn, Dieter

    2012-01-01

    Ammonoids are well-known objects used for studies of ontogeny and phylogeny, but a quantification of ontogenetic change has not yet been carried out. Their planispirally coiled conchs allow for the study of "longitudinal" ontogenetic data, that is, data on ontogenetic trajectories that can be obtained from a single specimen. They therefore provide a good model for ontogenetic studies of geometry in other shelled organisms. Using modifications of three cardinal conch dimensions, computer simulations can model artificial conchs. The trajectories of ontogenetic allometry of these simulations can be analyzed in great detail in a theoretical morphospace. A method for the classification of conch ontogeny and quantification of the degree of allometry is proposed. Using high-precision cross-sections, the allometric conch growth of real ammonoids can be documented and compared. The members of the Ammonoidea show a wide variety of allometric growth, ranging from near isometry to monophasic, biphasic, or polyphasic allometry. Selected examples of Palaeozoic and Mesozoic ammonoids are shown with respect to their degree of change during ontogeny of the conch. PMID:23134208

  2. Quantification of abdominal aortic deformation after EVAR

    NASA Astrophysics Data System (ADS)

    Demirci, Stefanie; Manstad-Hulaas, Frode; Navab, Nassir

    2009-02-01

    Quantification of abdominal aortic deformation is an important requirement for the evaluation of endovascular stenting procedures and the further refinement of stent graft design. During endovascular aortic repair (EVAR), the aortic shape is subject to severe deformation imposed by medical instruments such as guide wires, catheters, and the stent graft. This deformation can affect the flow characteristics and morphology of the aorta, which have been shown to trigger stent graft failure and the reappearance of aneurysms. We present a method for quantifying the deformation of an aneurysmatic aorta imposed by an inserted stent graft device. The procedure includes initial rigid alignment of the two abdominal scans, segmentation of the abdominal vessel trees, and automatic reduction of their centerline structures to one specified region of interest around the aorta. This is accomplished by preprocessing and remodeling of the pre- and postoperative aortic shapes before performing a non-rigid registration. We further narrow the resulting displacement fields to include only local non-rigid deformation, thereby eliminating all remaining global rigid transformations. Finally, deformations at specified locations can be calculated from the resulting displacement fields. To evaluate our method, experiments for the extraction of aortic deformation fields were conducted on 15 patient datasets from EVAR treatment. A visual assessment of the registration results and an evaluation of the use of deformation quantification were performed by two vascular surgeons and one interventional radiologist, all experts in EVAR procedures.
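
    Once the non-rigid registration has produced a displacement field, quantifying deformation at specified locations reduces to reading off vector magnitudes. A numpy sketch of that final step, with a placeholder field standing in for real registration output:

      import numpy as np

      # Displacement field from non-rigid registration: shape (Z, Y, X, 3),
      # one 3-D displacement vector (mm) per voxel. Placeholder random field here.
      rng = np.random.default_rng(3)
      disp = rng.normal(0.0, 1.5, (40, 128, 128, 3))

      def deformation_at(disp, voxel_zyx):
          """Displacement magnitude (mm) at one voxel location."""
          z, y, x = voxel_zyx
          return float(np.linalg.norm(disp[z, y, x]))

      # e.g. deformation at a hypothetical landmark voxel near the aneurysm neck
      print(deformation_at(disp, (20, 64, 60)))
      print("max deformation in volume:", float(np.linalg.norm(disp, axis=-1).max()))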

  4. Quantification of ontogenetic allometry in ammonoids.

    PubMed

    Korn, Dieter

    2012-01-01

    Ammonoids are well-known objects used for studies on ontogeny and phylogeny, but a quantification of ontogenetic change has not yet been carried out. Their planispirally coiled conchs allow for a study of "longitudinal" ontogenetic data, that is data of ontogenetic trajectories that can be obtained from a single specimen. Therefore, they provide a good model for ontogenetic studies of geometry in other shelled organisms. Using modifications of three cardinal conch dimensions, computer simulations can model artificial conchs. The trajectories of ontogenetic allometry of these simulations can be analyzed in great detail in a theoretical morphospace. A method for the classification of conch ontogeny and quantification of the degree of allometry is proposed. Using high-precision cross-sections, the allometric conch growth of real ammonoids can be documented and compared. The members of the Ammonoidea show a wide variety of allometric growth, ranging from near isometry to monophasic, biphasic, or polyphasic allometry. Selected examples of Palaeozoic and Mesozoic ammonoids are shown with respect to their degree of change during ontogeny of the conch. PMID:23134208

  3. An improved competitive inhibition enzymatic immunoassay method for tetrodotoxin quantification.

    PubMed

    Stokes, Amber N; Williams, Becky L; French, Susannah S

    2012-01-01

    Quantifying tetrodotoxin (TTX) has been a challenge in both ecological and medical research due to the cost, time and training required by most quantification techniques. Here we present a modified competitive inhibition enzymatic immunoassay for the quantification of TTX, intended to aid researchers in optimizing this technique for widespread use with a high degree of accuracy and repeatability.
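
    Competitive immunoassays are normally read off a sigmoid standard curve in which signal falls as toxin concentration rises. The sketch below fits a four-parameter logistic and inverts it for unknowns; it shows the generic calculation, not necessarily the authors' exact procedure, and the standards are placeholder values:

      import numpy as np
      from scipy.optimize import curve_fit

      def four_pl(x, a, d, c, b):
          """4-parameter logistic: a = zero-dose signal, d = saturating-dose signal."""
          return d + (a - d) / (1.0 + (x / c) ** b)

      # Placeholder TTX standards (ng/mL) and absorbances (signal falls with dose).
      std_conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
      std_abs  = np.array([1.85, 1.70, 1.32, 0.85, 0.42, 0.21])

      popt, _ = curve_fit(four_pl, std_conc, std_abs, p0=[1.9, 0.1, 2.0, 1.0])

      def invert(y, a, d, c, b):
          """Solve the 4PL for concentration at absorbance y."""
          return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

      print("sample at A = 0.90 ->", round(float(invert(0.90, *popt)), 2), "ng/mL TTX")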

  4. Survey and Evaluate Uncertainty Quantification Methodologies

    SciTech Connect

    Lin, Guang; Engel, David W.; Eslinger, Paul W.

    2012-02-01

    The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that will develop and deploy state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately the widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models with uncertainty quantification, optimization, risk analysis and decision making capabilities. The CCSI Toolset will incorporate commercial and open-source software currently in use by industry and will also develop new software tools as necessary to fill technology gaps identified during execution of the project. The CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. The goal of CCSI is to deliver a toolset that can simulate the scale-up of a broad set of new carbon capture technologies from laboratory scale to full commercial scale. To provide a framework around which the toolset can be developed and demonstrated, we will focus on three Industrial Challenge Problems (ICPs) related to carbon capture technologies relevant to U.S. pulverized coal (PC) power plants. Post-combustion capture by solid sorbents is the technology focus of the initial ICP (referred to as ICP A). The goal of the uncertainty quantification (UQ) task (Task 6) is to provide a set of capabilities to the user community for the quantification of uncertainties associated with the carbon capture simulation tools.

  7. Feature isolation and quantification of evolving datasets

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Identifying and isolating features is an important part of visualization and a crucial step for the analysis and understanding of large time-dependent data sets (either from observation or simulation). In this proposal we address these concerns, namely the investigation and implementation of basic 2D and 3D feature-based methods to enhance current visualization techniques and provide the building blocks for automatic feature recognition, tracking, and correlation. These methods incorporate ideas from scientific visualization, computer vision, image processing, and mathematical morphology. Our focus is in the area of fluid dynamics, and we show the applicability of these methods to the quantification and tracking of three-dimensional vortex and turbulence bursts.

  8. Poliovirus: Generation, Quantification, Propagation, Purification, and Storage

    PubMed Central

    Burrill, Cecily P.; Strings, Vanessa R.; Andino, Raul

    2016-01-01

    Poliovirus (PV) is the prototypical picornavirus. It is a non-enveloped RNA virus with a small (~7.5 kb) genome of positive polarity. It has long served as a model to study RNA virus biology, pathogenesis, and evolution. cDNA clones of several strains are available, and infectious virus can be produced by the transfection of in vitro transcribed viral genomes into an appropriate host cell. PV infects many human and non-human primate cell lines including HeLa and HeLa S3 cells, and can grow to high titer in culture. Protocols for the production, propagation, quantification, and purification of PV are presented. A separate chapter concerning the generation and characterization of PV mutants will also be presented. PMID:23686830

  9. Uncertainty quantification in DIC with Kriging regression

    NASA Astrophysics Data System (ADS)

    Wang, Dezhi; DiazDelaO, F. A.; Wang, Weizhuo; Lin, Xiaoshan; Patterson, Eann A.; Mottershead, John E.

    2016-03-01

    A Kriging regression model is developed as a post-processing technique for the treatment of measurement uncertainty in classical subset-based Digital Image Correlation (DIC). Regression is achieved by regularising the sample-point correlation matrix using a local, subset-based, assessment of the measurement error with assumed statistical normality and based on the Sum of Squared Differences (SSD) criterion. This leads to a Kriging-regression model in the form of a Gaussian process representing uncertainty on the Kriging estimate of the measured displacement field. The method is demonstrated using numerical and experimental examples. Kriging estimates of displacement fields are shown to be in excellent agreement with 'true' values for the numerical cases and in the experimental example uncertainty quantification is carried out using the Gaussian random process that forms part of the Kriging model. The root mean square error (RMSE) on the estimated displacements is produced and standard deviations on local strain estimates are determined.
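
    As a rough one-dimensional illustration of the idea, the sketch below regularises a Kriging (Gaussian-process) fit of noisy subset displacements with a diagonal noise term playing the role of the subset-based measurement-error estimate; the squared-exponential kernel and all numerical values are illustrative assumptions, not the authors' exact formulation:

        import numpy as np

        def kriging(x_obs, u_obs, noise_var, x_new, length=5.0, sigma2=1.0):
            # posterior mean and variance of the displacement at x_new
            def k(a, b):  # squared-exponential covariance
                d = a[:, None] - b[None, :]
                return sigma2 * np.exp(-0.5 * (d / length) ** 2)
            K = k(x_obs, x_obs) + np.diag(noise_var)  # noise regularises K
            Ks = k(x_new, x_obs)
            mean = Ks @ np.linalg.solve(K, u_obs)
            var = sigma2 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
            return mean, var

        x = np.linspace(0.0, 50.0, 26)                      # subset centres (px)
        u = 0.02 * x + np.random.normal(0.0, 0.05, x.size)  # noisy displacements
        mean, var = kriging(x, u, np.full(x.size, 0.05**2),
                            np.linspace(0.0, 50.0, 101))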

  10. Quantification of Condylar Resorption in TMJ Osteoarthritis

    PubMed Central

    Cevidanes, LHS; Hajati, A-K; Paniagua, B; Lim, PF; Walker, DG; Palconet, G; Nackley, AG; Styner, M; Ludlow, JB; Zhu, H; Phillips, C

    2010-01-01

    OBJECTIVE This study was performed to determine the condylar morphological variation of osteoarthritic (OA) and asymptomatic temporomandibular joints (TMJ) and to determine its correlation with pain intensity and duration. STUDY DESIGN Three-dimensional surface models of mandibular condyles were constructed from Cone-Beam CT images of 29 female patients with TMJ OA (Research Diagnostic Criteria for Temporomandibular Disorders Group III) and 36 female asymptomatic subjects. Shape Correspondence was used to localize and quantify the condylar morphology. Statistical analysis was performed with MANCOVA using the Hotelling T2 metric based on covariance matrices, and Pearson correlation. RESULTS OA condylar morphology was statistically significantly different from the asymptomatic condyles (p<0.05). 3D morphological variation of the OA condyles was significantly correlated with pain intensity and duration. CONCLUSION 3D quantification of condylar morphology revealed profound differences between OA and asymptomatic condyles, and the extent of the resorptive changes paralleled pain severity and duration. PMID:20382043

  11. Quantification of adipose tissue insulin sensitivity.

    PubMed

    Søndergaard, Esben; Jensen, Michael D

    2016-06-01

    In metabolically healthy humans, adipose tissue is exquisitely sensitive to insulin. Similar to muscle and liver, adipose tissue lipolysis is insulin resistant in adults with central obesity and type 2 diabetes. Perhaps uniquely, however, insulin resistance in adipose tissue may directly contribute to the development of insulin resistance in muscle and liver because of the increased delivery of free fatty acids to those tissues. It has been hypothesized that adipose tissue insulin resistance may precede other metabolic defects in obesity and type 2 diabetes. Therefore, precise and reproducible quantification of adipose tissue insulin sensitivity, in vivo, in humans, is an important measure. Unfortunately, no consensus exists on how to determine adipose tissue insulin sensitivity. We review the methods available to quantitate adipose tissue insulin sensitivity and discuss their strengths and weaknesses.

  12. Recurrence quantification analysis of global stock markets

    NASA Astrophysics Data System (ADS)

    Bastos, João A.; Caiado, Jorge

    2011-04-01

    This study investigates the presence of deterministic dependencies in international stock markets using recurrence plots and recurrence quantification analysis (RQA). The results are based on a large set of free float-adjusted market capitalization stock indices, covering a period of 15 years. The statistical tests suggest that the dynamics of stock prices in emerging markets is characterized by higher values of RQA measures when compared to their developed counterparts. The behavior of stock markets during critical financial events, such as the burst of the technology bubble, the Asian currency crisis, and the recent subprime mortgage crisis, is analyzed by performing RQA in sliding windows. It is shown that during these events stock markets exhibit a distinctive behavior that is characterized by temporary decreases in the fraction of recurrence points contained in diagonal and vertical structures.
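
    A minimal sketch of windowed RQA on a return series, assuming a simple threshold recurrence rule without embedding; the threshold heuristic, window length, and minimum line length are illustrative choices, not the study's settings:

        import numpy as np

        def rqa_measures(x, eps=None, lmin=2):
            # recurrence rate (RR) and determinism (DET) of a 1-D series
            if eps is None:
                eps = 0.1 * np.std(x)             # simple threshold heuristic
            R = (np.abs(x[:, None] - x[None, :]) < eps).astype(int)
            n = len(x)
            rr = (R.sum() - n) / (n * n - n)      # exclude the main diagonal
            diag_pts = 0
            for k in range(1, n):                 # scan upper off-diagonals
                run = 0
                for v in list(np.diagonal(R, k)) + [0]:
                    if v:
                        run += 1
                    else:
                        if run >= lmin:
                            diag_pts += run       # points on lines >= lmin
                        run = 0
            det = 2 * diag_pts / max(R.sum() - n, 1)  # matrix is symmetric
            return rr, det

        returns = np.random.normal(0.0, 0.01, 500)  # stand-in for index returns
        window, step = 250, 50
        for s in range(0, len(returns) - window + 1, step):
            print(rqa_measures(returns[s:s + window]))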

  13. Quantification of Glutathione in Caenorhabditis elegans

    PubMed Central

    Caito, Samuel W.; Aschner, Michael

    2015-01-01

    Glutathione (GSH) is the most abundant intracellular thiol, with diverse functions from redox signaling and xenobiotic detoxification to apoptosis. The quantification of GSH is an important measure of redox capacity and oxidative stress. This protocol quantifies total GSH from Caenorhabditis elegans, an emerging model organism for toxicology studies. GSH is measured using the 5,5′-dithiobis-(2-nitrobenzoic acid) (DTNB) cycling method, originally created for cell and tissue samples but optimized here for whole-worm extracts. DTNB reacts with GSH to form a 5′-thio-2-nitrobenzoic acid (TNB) chromophore with maximum absorbance at 412 nm. This method is both rapid and sensitive, making it ideal for studies involving a large number of transgenic nematode strains. PMID:26309452
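
    As an illustration of the rate-based calculation typical of DTNB cycling assays, the sketch below fits a linear standard curve of TNB formation rate against GSH concentration and inverts it for samples; all numbers are invented, and the dilution factor is a hypothetical example rather than a value from the protocol:

        import numpy as np

        # rates of TNB formation (dA412/min) for GSH standards;
        # all values invented for illustration
        std_conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0])        # microM GSH
        std_rate = np.array([0.002, 0.011, 0.021, 0.040, 0.079])

        slope, intercept = np.polyfit(std_conc, std_rate, 1)  # linear std curve

        sample_rate = np.array([0.018, 0.033])                # worm extracts
        sample_conc = (sample_rate - intercept) / slope       # microM total GSH
        print(sample_conc * 2.0)   # hypothetical 2-fold dilution correction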

  14. Carotenoid Extraction and Quantification from Capsicum annuum

    PubMed Central

    Richins, Richard D.; Kilcrease, James; Rodriguez-Uribe, Laura; O'Connell, Mary A.

    2016-01-01

    Carotenoids are ubiquitous pigments that play key roles in photosynthesis and also accumulate to high levels in fruit and flowers. Specific carotenoids play essential roles in human health as these compounds are precursors for Vitamin A; other specific carotenoids are important sources of macular pigments and all carotenoids are important anti-oxidants. Accurate determination of the composition and concentration of this complex set of natural products is therefore important in many different scientific areas. One of the richest sources of these compounds is the fruit of Capsicum; these red, yellow and orange fruit accumulate multiple carotenes and xanthophylls. This report describes the detailed method for the extraction and quantification of specific carotenes and xanthophylls. PMID:27570797

  15. [Quantification of motor activity in biomedicine].

    PubMed

    Giannazzo, E

    1993-01-01

    A computer-assisted analysis of motor activity was carried out using ultrasound waves, which are non-invasive and, because of their specific characteristics, free from interference. We exploited the Doppler effect, by which a wave reflected from any body in motion undergoes a frequency shift proportional to the velocity of the moving body; the superimposition of the emitted wave on the reflected waves produces beats whose frequency is proportional to the velocity of the motor activity. Our research group designed and built an electronic quantification apparatus that can be interfaced with a personal computer by means of an analog-to-digital acquisition card. Tests performed on the apparatus confirmed the theory that the number of antinodes detected is proportional to the distance covered by the moving body. The equipment was also tested on several types of animals.

  16. Quantification of uncertainty in geochemical reactions

    NASA Astrophysics Data System (ADS)

    Srinivasan, Gowri; Tartakovsky, Daniel M.; Robinson, Bruce A.; Aceves, Alejandro B.

    2007-12-01

    Predictions of reactive transport in the subsurface are routinely compromised by both model (structural) and parametric uncertainties. We present a set of computational tools for quantifying these two types of uncertainties. The model uncertainty is resolved at the molecular scale where epistemic uncertainty incorporates aleatory uncertainty. The parametric uncertainty is resolved at both molecular and continuum (Darcy) scales. We use the proposed approach to quantify uncertainty in modeling the sorption of neptunium through a competitive ion exchange. This radionuclide is of major concern for various high-level waste storage projects because of its relatively long half-life and its high-solubility and low-sorption properties. We demonstrate how parametric and model uncertainties affect one's ability to estimate the distribution coefficient. The uncertainty quantification tools yield complete probabilistic descriptions of key parameters affecting the fate and migration of neptunium in the subsurface rather than the lower statistical moments. This is important, since these distributions are highly skewed.

  17. Kinetic quantification of plyometric exercise intensity.

    PubMed

    Ebben, William P; Fauth, McKenzie L; Garceau, Luke R; Petushek, Erich J

    2011-12-01

    Ebben, WP, Fauth, ML, Garceau, LR, and Petushek, EJ. Kinetic quantification of plyometric exercise intensity. J Strength Cond Res 25(12): 3288-3298, 2011-Quantification of plyometric exercise intensity is necessary to understand the characteristics of these exercises and the proper progression of this mode of exercise. The purpose of this study was to assess the kinetic characteristics of a variety of plyometric exercises. This study also sought to assess gender differences in these variables. Twenty-six men and 23 women with previous experience in performing plyometric training served as subjects. The subjects performed a variety of plyometric exercises including line hops, 15.24-cm cone hops, squat jumps, tuck jumps, countermovement jumps (CMJs), loaded CMJs equal to 30% of 1 repetition maximum squat, depth jumps normalized to the subject's jump height (JH), and single leg jumps. All plyometric exercises were assessed with a force platform. Outcome variables associated with the takeoff, airborne, and landing phase of each plyometric exercise were evaluated. These variables included the peak vertical ground reaction force (GRF) during takeoff, the time to takeoff, flight time, JH, peak power, landing rate of force development, and peak vertical GRF during landing. A 2-way mixed analysis of variance with repeated measures for plyometric exercise type demonstrated main effects for exercise type and all outcome variables (p ≤ 0.05) and for the interaction between gender and peak vertical GRF during takeoff (p ≤ 0.05). Bonferroni-adjusted pairwise comparisons identified a number of differences between the plyometric exercises for the outcome variables assessed (p ≤ 0.05). These findings can be used to guide the progression of plyometric training by incorporating exercises of increasing intensity over the course of a program. PMID:22080319

  18. Quantification of heterogeneity observed in medical images

    PubMed Central

    2013-01-01

    Background: There has been much recent interest in the quantification of visually evident heterogeneity within functional grayscale medical images, such as those obtained via magnetic resonance or positron emission tomography. In the case of images of cancerous tumors, variations in grayscale intensity imply variations in crucial tumor biology. Despite these considerable clinical implications, there is as yet no standardized method for measuring the heterogeneity observed via these imaging modalities. Methods: In this work, we motivate and derive a statistical measure of image heterogeneity. This statistic measures the distance-dependent average deviation from the smoothest intensity gradation feasible. We show how this statistic may be used to automatically rank images of in vivo human tumors in order of increasing heterogeneity. We test this method against the current practice of ranking images via expert visual inspection. Results: We find that this statistic provides a means of heterogeneity quantification beyond that given by other statistics traditionally used for the same purpose. We demonstrate the effect of tumor shape upon our ranking method and find the method applicable to a wide variety of clinically relevant tumor images. We find that the automated heterogeneity rankings agree very closely with those performed visually by experts. Conclusions: These results indicate that our automated method may be used reliably to rank, in order of increasing heterogeneity, tumor images whether or not object shape is considered to contribute to that heterogeneity. Automated heterogeneity ranking yields objective results which are more consistent than visual rankings. Reducing variability in image interpretation will enable more researchers to better study potential clinical implications of observed tumor heterogeneity. PMID:23453000

  19. Uncertainty Quantification of Equilibrium Climate Sensitivity

    NASA Astrophysics Data System (ADS)

    Lucas, D. D.; Brandon, S. T.; Covey, C. C.; Domyancic, D. M.; Johannesson, G.; Klein, R.; Tannahill, J.; Zhang, Y.

    2011-12-01

    Significant uncertainties exist in the temperature response of the climate system to changes in the levels of atmospheric carbon dioxide. We report progress to quantify the uncertainties of equilibrium climate sensitivity using perturbed parameter ensembles of the Community Earth System Model (CESM). Through a strategic initiative at the Lawrence Livermore National Laboratory, we have been developing uncertainty quantification (UQ) methods and incorporating them into a software framework called the UQ Pipeline. We have applied this framework to generate a large number of ensemble simulations using Latin Hypercube and other schemes to sample up to three dozen uncertain parameters in the atmospheric (CAM) and sea ice (CICE) model components of CESM. The parameters sampled are related to many highly uncertain processes, including deep and shallow convection, boundary layer turbulence, cloud optical and microphysical properties, and sea ice albedo. An extensive ensemble database comprised of more than 46,000 simulated climate-model-years of recent climate conditions has been assembled. This database is being used to train surrogate models of CESM responses and to perform statistical calibrations of the CAM and CICE models given observational data constraints. The calibrated models serve as a basis for propagating uncertainties forward through climate change simulations using a slab ocean model configuration of CESM. This procedure is being used to quantify the probability density function of equilibrium climate sensitivity accounting for uncertainties in climate model processes. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and was funded by the Uncertainty Quantification Strategic Initiative Laboratory Directed Research and Development Project at LLNL under project tracking code 10-SI-013. (LLNL-ABS-491765)
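
    A minimal sketch of the Latin hypercube sampling step that underlies such perturbed-parameter ensembles; the parameter names and ranges below are placeholders, not actual CAM or CICE settings:

        import numpy as np

        def latin_hypercube(n_samples, bounds, seed=0):
            # one stratified draw per equal-probability bin per parameter
            rng = np.random.default_rng(seed)
            d = len(bounds)
            bins = np.tile(np.arange(n_samples), (d, 1))
            u = (rng.permuted(bins, axis=1).T
                 + rng.random((n_samples, d))) / n_samples
            lo, hi = np.array(bounds, dtype=float).T
            return lo + u * (hi - lo)

        bounds = [(50.0, 3600.0),    # hypothetical convective timescale (s)
                  (0.1, 1.0),        # hypothetical cloud microphysics factor
                  (0.4, 0.9)]        # hypothetical sea-ice albedo
        samples = latin_hypercube(20, bounds)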

  20. Quantification of uncertainties for application in detonation simulation

    NASA Astrophysics Data System (ADS)

    Zheng, Miao; Ma, Zhibo

    2016-06-01

    Numerical simulation has become an important means of designing detonation systems, and quantification of its uncertainty is necessary for reliability certification. In quantifying this uncertainty, the most important steps are to analyze how the uncertainties arise and develop, and how the simulations develop from benchmark models to new models. Based on the practical needs of engineering and on verification & validation technology, a framework for the quantification of uncertainty (QU) is put forward for the case in which simulation of a detonation system is used for scientific prediction. An example is offered to illustrate the general idea of quantifying simulation uncertainties.

  1. [An overview of methods and applications for the validation of vulnerability assessments]

    NASA Astrophysics Data System (ADS)

    Neukum, Christoph

    2013-03-01

    Groundwater vulnerability maps have been applied over the past several decades for assessing groundwater sensitivity to pollution. Many different methods, with various approaches and associated information content, have been developed over the years. However, the application of different methods to the same area may lead to different or even contradictory results, which may render vulnerability mapping unreliable. This manuscript presents a selection of methods that have been applied to validate vulnerability mapping approaches under different boundary conditions and at various scales. The validation approaches are explained and their advantages and disadvantages are discussed. A key result is that validation is an important part of vulnerability mapping and contributes to a sound interpretation.

  2. [Integration of the soil filter and buffer function into hydrogeological vulnerability assessment]

    NASA Astrophysics Data System (ADS)

    Wirsing, Tobias; Neukum, Christoph; Goldscheider, Nico; Maier, Matthias

    2015-06-01

    Vulnerability maps are standard tools for the assessment of groundwater sensitivity to contamination. Due to their increased use in technical guidelines, vulnerability maps have become state-of-the-art tools in resource management. However, most approaches have been developed by hydrogeologists and soil scientists who incorporate the understanding of processes from their specific disciplines very well but have limitations in considering processes in other disciplines. A soil-specific database for vulnerability assessment has been significantly improved by soil scientists over the past several years and includes quality, spatial extension and availability. Hence, it is time to integrate this database into hydrogeological concepts. This work presents a vulnerability mapping approach that considers a new soil database that has been available since 2014 for the entire Baden-Württemberg region at a scale of 1:50,000, adapting the well-established GLA and PI methods. Due to the newly-developed classification scheme for the protective function, this approach provides a more balanced and meaningful classification. This leads to a distinct image of the study area and a better interpretation of vulnerability.

  3. Erratum to: [Integration of the soil filter and buffer function into hydrogeological vulnerability assessment]

    NASA Astrophysics Data System (ADS)

    Wirsing, Tobias; Neukum, Christoph; Goldscheider, Nico; Maier, Matthias

    2015-09-01

    Vulnerability maps are standard tools for the assessment of groundwater sensitivity to contamination. Due to their increased use in technical guidelines, vulnerability maps have become state-of-the-art tools in resource management. However, most approaches have been developed by hydrogeologists and soil scientists who incorporate the understanding of processes from their specific disciplines very well but have limitations in considering processes in other disciplines. A soil-specific database for vulnerability assessment has been significantly improved by soil scientists over the past several years and includes quality, spatial extension and availability. Hence, it is time to integrate this database into hydrogeological concepts. This work presents a vulnerability mapping approach that considers a new soil database that has been available since 2014 for the entire Baden-Württemberg region at a scale of 1:50,000, adapting the well-established GLA and PI methods. Due to the newly-developed classification scheme for the protective function, this approach provides a more balanced and meaningful classification. This leads to a distinct image of the study area and a better interpretation of vulnerability.

  4. Software-assisted serum metabolite quantification using NMR.

    PubMed

    Jung, Young-Sang; Hyeon, Jin-Seong; Hwang, Geum-Sook

    2016-08-31

    The goal of metabolomics is to analyze a whole metabolome under a given set of conditions, so accurate and reliable quantitation of metabolites is crucial. Absolute concentration is more valuable than relative concentration; however, the most commonly used methods in NMR-based serum metabolic profiling, bin-based and full-data-point peak quantification, provide relative concentration levels of metabolites and are not reliable when metabolite peaks overlap in a spectrum. In this study, we present the software-assisted serum metabolite quantification (SASMeQ) method, which allows us to identify and quantify metabolites in NMR spectra using Chenomx software. This software uses the ERETIC2 utility from TopSpin to add a digitally synthesized peak to a spectrum. The SASMeQ method will advance NMR-based serum metabolic profiling by providing an accurate and reliable method for absolute quantification that is superior to bin-based quantification. PMID:27506360

  5. Uncertainty quantification in reacting flow modeling.

    SciTech Connect

    Le Maître, Olivier P.; Reagan, Matthew T.; Knio, Omar M.; Ghanem, Roger Georges; Najm, Habib N.

    2003-10-01

    Uncertainty quantification (UQ) in the computational modeling of physical systems is important for scientific investigation, engineering design, and model validation. In this work we develop techniques for UQ based on spectral and pseudo-spectral polynomial chaos (PC) expansions, and we apply these constructions in computations of reacting flow. We develop and compare both intrusive and non-intrusive spectral PC techniques. In the intrusive construction, the deterministic model equations are reformulated using Galerkin projection into a set of equations for the time evolution of the field variable PC expansion mode strengths. The mode strengths relate specific parametric uncertainties to their effects on model outputs. The non-intrusive construction uses sampling of many realizations of the original deterministic model, and projects the resulting statistics onto the PC modes, arriving at the PC expansions of the model outputs. We investigate and discuss the strengths and weaknesses of each approach, and identify their utility under different conditions. We also outline areas where ongoing and future research are needed to address challenges with both approaches.
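
    A minimal non-intrusive example of the projection idea, assuming a single standard-normal uncertain input and a toy scalar model standing in for a reacting-flow code: the output is sampled at quadrature nodes and projected onto probabilists' Hermite polynomials to obtain the PC mode strengths:

        import math
        import numpy as np
        from numpy.polynomial.hermite_e import hermegauss, hermeval

        def pce_coefficients(model, order, n_quad=16):
            # project model(xi), xi ~ N(0,1), onto probabilists' Hermite polys
            x, w = hermegauss(n_quad)
            w = w / np.sqrt(2.0 * np.pi)          # normalise to N(0,1) weights
            y = model(x)
            return np.array([np.sum(w * y * hermeval(x, [0.0] * k + [1.0]))
                             / math.factorial(k)  # <He_k, He_k> = k!
                             for k in range(order + 1)])

        model = lambda xi: np.exp(0.3 * xi)       # toy stand-in for a flow output
        c = pce_coefficients(model, order=4)
        mean = c[0]
        variance = sum(c[k] ** 2 * math.factorial(k) for k in range(1, len(c)))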

  6. Shape regression for vertebra fracture quantification

    NASA Astrophysics Data System (ADS)

    Lund, Michael Tillge; de Bruijne, Marleen; Tanko, Laszlo B.; Nielsen, Mads

    2005-04-01

    Accurate and reliable identification and quantification of vertebral fractures constitute a challenge both in clinical trials and in the diagnosis of osteoporosis. Various efforts have been made to develop reliable, objective, and reproducible methods for assessing vertebral fractures, but at present there is no consensus concerning a universally accepted diagnostic definition of vertebral fractures. In this project we investigate whether it is possible to accurately reconstruct the shape of a normal vertebra, using a neighbouring vertebra as prior information. The reconstructed shape can then be used to develop a novel vertebral fracture measure, by comparing the segmented vertebra shape with its reconstructed normal shape. The vertebrae in lateral x-rays of the lumbar spine were manually annotated by a medical expert. With this dataset we built a shape model, with equidistant point distribution between the four corner points. Based on the shape model, a multiple linear regression model of a normal vertebra shape was developed for each dataset using leave-one-out cross-validation. The reconstructed shape was calculated for each dataset using these regression models. The prediction error relative to the annotated shape was on average 3%.

  7. Classification and quantification of leaf curvature

    PubMed Central

    Liu, Zhongyuan; Jia, Liguo; Mao, Yanfei; He, Yuke

    2010-01-01

    Various mutants of Arabidopsis thaliana deficient in polarity, cell division, and auxin response are characterized by certain types of leaf curvature. However, comparison of curvature for the clarification of gene function can be difficult without a quantitative measurement of curvature. Here, a novel method for the classification and quantification of leaf curvature is reported. Twenty-two mutant alleles from Arabidopsis mutants and transgenic lines deficient in leaf flatness were selected. The mutants were classified according to the direction, axis, position, and extent of leaf curvature. Based on a global measure of whole leaves and a local measure of four regions in the leaves, the curvature index (CI) was proposed to quantify the leaf curvature. The CI values accounted for the direction, axis, position, and extent of leaf curvature in all of the Arabidopsis mutants grown in growth chambers. Comparison of CI values between mutants reveals the spatial and temporal variations of leaf curvature, indicating the strength of the mutant alleles and the activities of the corresponding genes. Using the curvature indices, the extent of curvature in a complicated genetic background becomes quantitative and comparable, thus providing a useful tool for defining the genetic components of leaf development and for breeding new varieties with leaf curvature desirable for the efficient capture of sunlight for photosynthesis and high yields. PMID:20400533

  8. Quantification of moving target cyber defenses

    NASA Astrophysics Data System (ADS)

    Farris, Katheryn A.; Cybenko, George

    2015-05-01

    Current network and information systems are static, making it simple for attackers to maintain an advantage. Adaptive defenses, such as Moving Target Defenses (MTD), have been developed as potential "game-changers" in an effort to increase the attacker's workload. With many new methods being developed, it is difficult to accurately quantify and compare their overall costs and effectiveness. This paper compares the tradeoffs between current approaches to the quantification of MTDs. We present results from an expert opinion survey on quantifying the overall effectiveness and the upfront and operating costs of a select set of MTD techniques. We find that gathering informed scientific opinions can be advantageous for evaluating such new technologies, as it offers a more comprehensive assessment. We end by presenting a coarse ordering of a set of MTD techniques from most to least dominant. We found that seven out of 23 methods rank as the more dominant techniques, five of which are techniques of either address space layout randomization or instruction set randomization. The remaining two techniques are applicable to software and computer platforms. Among the techniques that performed the worst are those primarily aimed at network randomization.

  9. Detection of aneuploidies by paralogous sequence quantification

    PubMed Central

    Deutsch, S; Choudhury, U; Merla, G; Howald, C; Sylvan, A; Antonarakis, S

    2004-01-01

    Background: Chromosomal aneuploidies are a common cause of congenital disorders associated with cognitive impairment and multiple dysmorphic features. Pre-natal diagnosis of aneuploidies is most commonly performed by the karyotyping of fetal cells obtained by amniocentesis or chorionic villus sampling, but this method is labour intensive and requires about 14 days to complete. Methods: We have developed a PCR based method for the detection of targeted chromosome number abnormalities termed paralogous sequence quantification (PSQ), based on the use of paralogous genes. Paralogous sequences have a high degree of sequence identity, but accumulate nucleotide substitutions in a locus specific manner. These sequence differences, which we term paralogous sequence mismatches (PSMs), can be quantified using pyrosequencing technology, to estimate the relative dosage between different chromosomes. We designed 10 assays for the detection of trisomies of chromosomes 13, 18, and 21 and sex chromosome aneuploidies. Results: We evaluated the performance of this method on 175 DNAs, highly enriched for abnormal samples. A correct and unambiguous diagnosis was given for 119 out of 120 aneuploid samples as well as for all the controls. One sample which gave an intermediate value for the chromosome 13 assays could not be diagnosed. Conclusions: Our data suggests that PSQ is a robust, easy to interpret, and easy to set up method for the diagnosis of common aneuploidies, and can be performed in less than 48 h, representing a competitive alternative for widespread use in diagnostic laboratories. PMID:15591276
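
    The dosage logic can be sketched as below: the relative pyrosequencing peak heights at a paralogous sequence mismatch estimate chromosome dosage, and the expected fractions follow directly from copy counts. The peak values and tolerance here are hypothetical, for illustration only:

        def psq_dosage(target_peak, reference_peak):
            # fraction of pyrosequencing signal from the target chromosome
            return target_peak / (target_peak + reference_peak)

        def classify(dosage, tol=0.03):
            # expected fractions follow from copy number: 2/(2+2) = 0.50 for
            # disomy, 3/(3+2) = 0.60 for trisomy; the tolerance is hypothetical
            if abs(dosage - 0.50) <= tol:
                return "normal dosage"
            if abs(dosage - 0.60) <= tol:
                return "consistent with trisomy"
            return "indeterminate"

        print(classify(psq_dosage(58.0, 39.0)))   # ~0.598 -> trisomy-like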

  10. Quantification of the vocal folds’ dynamic displacements

    NASA Astrophysics Data System (ADS)

    del Socorro Hernández-Montes, María; Muñoz, Silvino; De La Torre, Manuel; Flores, Mauricio; Pérez, Carlos; Mendoza-Santoyo, Fernando

    2016-05-01

    Fast dynamic data acquisition techniques are required to investigate the motional behavior of the vocal folds (VFs) when they are subjected to a steady air-flow through the trachea. High-speed digital holographic interferometry (DHI) is a non-invasive full-field-of-view technique that has proved its usefulness to study rapid and non-repetitive object movements. Hence it is an ideal technique used here to measure VF displacements and vibration patterns at 2000 fps. Analyses from a set of 200 displacement images showed that VFs’ vibration cycles are established along their width (y) and length (x). Furthermore, the maximum deformation for the right and left VFs’ area may be quantified from these images, which in itself represents an important result in the characterization of this structure. At a controlled air pressure, VF displacements fall within the range ~100-1740 nm, with a calculated precision and accuracy that yields a variation coefficient of 1.91%. High-speed acquisition of full-field images of VFs and their displacement quantification are on their own significant data in the study of their functional and physiological behavior since voice quality and production depend on how they vibrate, i.e. their displacement amplitude and frequency. Additionally, the use of high speed DHI avoids prolonged examinations and represents a significant scientific and technological alternative contribution in advancing the knowledge and working mechanisms of these tissues.

  11. Quantification of biological aging in young adults

    PubMed Central

    Belsky, Daniel W.; Caspi, Avshalom; Houts, Renate; Cohen, Harvey J.; Corcoran, David L.; Danese, Andrea; Harrington, HonaLee; Israel, Salomon; Levine, Morgan E.; Schaefer, Jonathan D.; Sugden, Karen; Williams, Ben; Yashin, Anatoli I.; Poulton, Richie; Moffitt, Terrie E.

    2015-01-01

    Antiaging therapies show promise in model organism research. Translation to humans is needed to address the challenges of an aging global population. Interventions to slow human aging will need to be applied to still-young individuals. However, most human aging research examines older adults, many with chronic disease. As a result, little is known about aging in young humans. We studied aging in 954 young humans, the Dunedin Study birth cohort, tracking multiple biomarkers across three time points spanning their third and fourth decades of life. We developed and validated two methods by which aging can be measured in young adults, one cross-sectional and one longitudinal. Our longitudinal measure allows quantification of the pace of coordinated physiological deterioration across multiple organ systems (e.g., pulmonary, periodontal, cardiovascular, renal, hepatic, and immune function). We applied these methods to assess biological aging in young humans who had not yet developed age-related diseases. Young individuals of the same chronological age varied in their “biological aging” (declining integrity of multiple organ systems). Already, before midlife, individuals who were aging more rapidly were less physically able, showed cognitive decline and brain aging, self-reported worse health, and looked older. Measured biological aging in young adults can be used to identify causes of aging and evaluate rejuvenation therapies. PMID:26150497

  12. Comparison of analysis methods for airway quantification

    NASA Astrophysics Data System (ADS)

    Odry, Benjamin L.; Kiraly, Atilla P.; Novak, Carol L.; Naidich, David P.

    2012-03-01

    Diseased airways have been known for several years as a possible contributing factor to airflow limitation in Chronic Obstructive Pulmonary Diseases (COPD). Quantification of disease severity through the evaluation of airway dimensions - wall thickness and lumen diameter - has gained increased attention, thanks to the availability of multi-slice computed tomography (CT). Novel approaches have focused on automated methods of measurement as a faster and more objective means than the visual assessment routinely employed in the clinic. Since the Full-Width Half-Maximum (FWHM) method of airway measurement was introduced two decades ago [1], several new techniques for quantifying airways have been detailed in the literature, but no approach has truly become a standard for such analysis. Our own research group has presented two alternative approaches for determining airway dimensions, one involving a minimum path and the other active contours [2, 3]. With an increasing number of techniques dedicated to the same goal, we decided to take a step back and analyze the differences between these methods. We consequently put our two methods of analysis and the FWHM approach to the test. We first measured a set of 5 airways from a phantom of known dimensions. Then we compared measurements from the three methods to those of two independent readers, performed on 35 airways in 5 patients. We elaborate on the differences of each approach and suggest conclusions as to which could be defined as the best one.

  13. Numerical Uncertainty Quantification for Radiation Analysis Tools

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha

    2007-01-01

    Recently a new emphasis has been placed on engineering applications of space radiation analyses, and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used for radiation analysis in vehicle design and mission planning has begun. There are two sources of uncertainty in geometric discretization addressed in this paper that need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. One source of uncertainty is in ray tracing: as the number of rays increases the associated uncertainty decreases, but the computational expense increases. Thus, a cost-benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose vs. depth curves that is needed to determine the radiation exposure. The question, then, is how many shield thicknesses are needed to obtain an accurate result. Convergence testing is therefore performed to quantify the uncertainty associated with interpolating over different shield-thickness spatial grids.

  14. Quality Quantification of Evaluated Cross Section Covariances

    SciTech Connect

    Varet, S.; Dossantos-Uzarralde, P.

    2015-01-15

    Presently, several methods are used to estimate the covariance matrix of evaluated nuclear cross sections. Because the resulting covariance matrices can differ according to the method used and to the assumptions of the method, we propose a general and objective approach to quantify the quality of the covariance estimation for evaluated cross sections. The first step consists of defining an objective criterion. The second step is the computation of the criterion. In this paper the Kullback-Leibler distance is proposed for the quality quantification of a covariance matrix estimation and its inverse. It is based on the distance to the true covariance matrix. A method based on the bootstrap is presented for the estimation of this criterion, which can be applied with most methods for covariance matrix estimation and without knowledge of the true covariance matrix. The full approach is illustrated on the (85)Rb nucleus evaluations, and the results are then used for a discussion on scoring and Monte Carlo approaches for covariance matrix estimation of the cross section evaluations.
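
    For concreteness, the Kullback-Leibler distance between two covariance estimates (viewed as zero-mean Gaussians) takes the closed form sketched below; this shows only the criterion itself, not the paper's bootstrap estimation procedure:

        import numpy as np

        def kl_gauss(S1, S2):
            # KL( N(0,S1) || N(0,S2) ) for symmetric positive-definite inputs
            k = S1.shape[0]
            tr = np.trace(np.linalg.inv(S2) @ S1)
            logdet = np.linalg.slogdet(S2)[1] - np.linalg.slogdet(S1)[1]
            return 0.5 * (tr - k + logdet)

        def kl_distance(S1, S2):
            # symmetrised, since KL itself is not symmetric
            return 0.5 * (kl_gauss(S1, S2) + kl_gauss(S2, S1))

        S_a = np.array([[1.0, 0.3], [0.3, 2.0]])   # estimate from method A
        S_b = np.array([[1.2, 0.1], [0.1, 1.8]])   # estimate from method B
        print(kl_distance(S_a, S_b))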

  15. Quantification of perceived macro-uniformity

    NASA Astrophysics Data System (ADS)

    Lee, Ki-Youn; Bang, Yousun; Choh, Heui-Keun

    2011-01-01

    Macro-uniformity refers to the subjective impression of overall uniformity in a print sample. Through the efforts of the INCITS W1.1 team, macro-uniformity has been categorized into five types of attributes: banding, streaks, mottle, gradients, and moiré patterns, and ruler samples have been generated with perceptual scales. The W1.1 macro-uniformity ruler is useful for judging the level of print defects, but it is not an easy task to reproduce samples having the same perceptual scales at different times in different places. An objective quantification method is more helpful and convenient for developers to analyze print quality and design printing system components. In this paper, we propose a method for measuring perceived macro-uniformity for a given print using a flat-bed scanner. First, banding, 2D noise, and gradients are separately measured, and they are converted to perceptual scales based on the subjective results for each attribute. The correlation coefficients between the measured values of the attributes and the perceptual scales are 0.92, 0.97, and 0.86, respectively. Another subjective test was performed to find the relationship between overall macro-uniformity and the three attributes. The weighting factors are obtained from the experimental result, and the final macro-uniformity grade is determined by the weighted sum of the attributes.

  16. Quantification of bromophenols in Islay whiskies.

    PubMed

    Bendig, Paul; Lehnert, Katja; Vetter, Walter

    2014-04-01

    Two single malt whiskies from the Scottish island of Islay, Laphroaig and Lagavulin, are characterized by an iodine-like flavor associated with marine environments. We investigated whether this flavor impression could be due to bromophenols, which are character impact compounds of marine fish and shrimps. For this purpose we developed a method suited to the determination of dibromo- and tribromophenols in whisky. Aliquots were O-acetylated, and quantification was carried out by gas chromatography with electron-capture negative ion mass spectrometry (GC/ECNI-MS). Both Islay whiskies contained more than 400 ng/L bromophenols, with 2,6-dibromophenol being the most relevant homologue (>300 ng/L in each). These concentrations are at least 1 order of magnitude higher than the taste threshold of 2,6-dibromophenol in water. A third Islay whisky, Bowmore, contained ∼100 ng/L bromophenols, while seventeen other whiskies from other regions of Scotland as well as from the USA, Ireland, and Germany contained at least 1 order of magnitude less than the two whiskies with the marine taste. Accordingly, bromophenols may contribute to the marine flavor and taste of Laphroaig and Lagavulin.

  17. Legionella spp. isolation and quantification from greywater

    PubMed Central

    Rodríguez-Martínez, Sara; Blanky, Marina; Friedler, Eran; Halpern, Malka

    2015-01-01

    Legionella, an opportunistic human pathogen whose natural environment is water, is transmitted to humans through inhalation of contaminated aerosols. Legionella has been isolated from a high diversity of water types. Due to its importance as a pathogen, two ISO protocols have been developed for its monitoring. However, these two protocols are not suitable for analyzing Legionella in greywater (GW). GW is domestic wastewater excluding the inputs from toilets and kitchen. It can serve as an alternative water source, mainly for toilet flushing and garden irrigation, both of which produce aerosols that can pose a risk of Legionella infection. Hence, before reuse, GW has to be treated and its quality needs to be monitored. The difficulty of isolating Legionella from GW lies in the very high load of contaminant bacteria. Here we describe a modification of ISO protocol 11731:1998 that enables the isolation and quantification of Legionella from GW samples. The following modifications were made:
    • To enable isolation of Legionella from greywater, a pre-filtration step that removes coarse matter is recommended.
    • Legionella can be isolated after a combined acid-thermic treatment that eliminates the high load of contaminant bacteria in the sample. PMID:26740925

  19. Uncertainty quantification for systems of conservation laws

    SciTech Connect

    Poette, Gael; Despres, Bruno; Lucor, Didier

    2009-04-20

    Uncertainty quantification through stochastic spectral methods has been recently applied to several kinds of non-linear stochastic PDEs. In this paper, we introduce a formalism based on kinetic theory to tackle uncertain hyperbolic systems of conservation laws with Polynomial Chaos (PC) methods. The idea is to introduce a new variable, the entropic variable, in bijection with our vector of unknowns, which we develop on the polynomial basis: by performing a Galerkin projection, we obtain a deterministic system of conservation laws. We state several properties of this deterministic system in the case of a general uncertain system of conservation laws. We then apply the method to the case of the inviscid Burgers' equation with random initial conditions, and we present some preliminary results for the Euler system. We systematically compare results from our new approach to results from the stochastic Galerkin method. In the vicinity of discontinuities, the new method bounds the oscillations due to the Gibbs phenomenon to a certain range through the entropy of the system, without the use of any adaptive random-space discretizations. It is found to be more precise than the stochastic Galerkin method for smooth cases, but above all for discontinuous cases.
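
    The Gibbs phenomenon mentioned above can be reproduced in a few lines: projecting a step function of a uniform random variable onto a truncated Legendre (PC) basis yields oscillations near the discontinuity. This is a generic illustration of the difficulty, not the entropic reformulation itself:

        import numpy as np
        from numpy.polynomial.legendre import leggauss, legval

        order = 15
        x, w = leggauss(64)                       # nodes/weights on [-1, 1]
        u = np.where(x < 0.0, -1.0, 1.0)          # discontinuous "solution"

        # Galerkin projection: c_k = (2k+1)/2 * integral of u * P_k
        coeffs = np.array([(2 * k + 1) / 2.0
                           * np.sum(w * u * legval(x, [0.0] * k + [1.0]))
                           for k in range(order + 1)])

        xi = np.linspace(-1.0, 1.0, 201)
        u_pc = legval(xi, coeffs)                 # truncated reconstruction
        print(u_pc.max())                         # > 1: Gibbs overshoot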

  20. Uncertainty Quantification of Modelling of Equiaxed Solidification

    NASA Astrophysics Data System (ADS)

    Fezi, K.; Krane, M. J. M.

    2016-07-01

    Numerical simulations of metal alloy solidification are used to gain insight into physical phenomena that cannot be observed experimentally. Often, validation of such models has been done through comparison to sparse experimental data, and agreement with such data can be misinterpreted due to both model and experimental uncertainty. Uncertainty quantification (UQ) and sensitivity analysis are performed on a transient model of solidification of Al-4.5 wt.% Cu in a rectangular cavity, with equiaxed (grain-refined) solidification morphology. This model solves equations for momentum, temperature, and species conservation; UQ and sensitivity analysis are performed for the degree of macrosegregation. A Smolyak sparse grid algorithm is used to select input values to construct a response surface fit to model outputs. The response surface is then used as a surrogate for the solidification model to determine the sensitivities and probability density functions of the model outputs. Uncertain model inputs of interest include the secondary dendrite arm spacing, equiaxed particle size, and fraction solid at which the rigid mushy zone forms. Similar analysis was also performed on a transient model of direct chill casting of the same alloy.

  1. Quantification and Propagation of Nuclear Data Uncertainties

    NASA Astrophysics Data System (ADS)

    Rising, Michael E.

    The use of several uncertainty quantification and propagation methodologies is investigated in the context of the prompt fission neutron spectrum (PFNS) uncertainties and its impact on critical reactor assemblies. First, the first-order, linear Kalman filter is used as a nuclear data evaluation and uncertainty quantification tool combining available PFNS experimental data and a modified version of the Los Alamos (LA) model. The experimental covariance matrices, not generally given in the EXFOR database, are computed using the GMA methodology used by the IAEA to establish more appropriate correlations within each experiment. Then, using systematics relating the LA model parameters across a suite of isotopes, the PFNS for both the uranium and plutonium actinides are evaluated leading to a new evaluation including cross-isotope correlations. Next, an alternative evaluation approach, the unified Monte Carlo (UMC) method, is studied for the evaluation of the PFNS for the n(0.5 MeV)+Pu-239 fission reaction and compared to the Kalman filter. The UMC approach to nuclear data evaluation is implemented in a variety of ways to test convergence toward the Kalman filter results and to determine the nonlinearities present in the LA model. Ultimately, the UMC approach is shown to be comparable to the Kalman filter for a realistic data evaluation of the PFNS and is capable of capturing the nonlinearities present in the LA model. Next, the impact that the PFNS uncertainties have on important critical assemblies is investigated. Using the PFNS covariance matrices in the ENDF/B-VII.1 nuclear data library, the uncertainties of the effective multiplication factor, leakage, and spectral indices of the Lady Godiva and Jezebel critical assemblies are quantified. Using principal component analysis on the PFNS covariance matrices results in needing only 2-3 principal components to retain the PFNS uncertainties. Then, using the polynomial chaos expansion (PCE) on the uncertain output
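
    The first-order Kalman filter step used in such evaluations amounts to a generalised-least-squares update of the model parameters against experimental data; the sketch below uses toy matrices in place of the LA-model sensitivities and EXFOR-style experimental covariances:

        import numpy as np

        def kalman_update(x0, P0, y, Cy, G):
            # prior (x0, P0), data (y, Cy), sensitivity G = dy/dx at x0
            S = G @ P0 @ G.T + Cy                  # innovation covariance
            K = P0 @ G.T @ np.linalg.inv(S)        # Kalman gain
            x1 = x0 + K @ (y - G @ x0)             # updated parameters
            P1 = P0 - K @ G @ P0                   # updated covariance
            return x1, P1

        x0 = np.array([1.0, 0.5])                  # prior model parameters
        P0 = np.diag([0.04, 0.01])                 # prior covariance
        G = np.array([[1.0, 2.0], [0.5, 1.0], [2.0, 0.0]])
        y = np.array([2.1, 1.05, 2.2])             # measurements
        Cy = 0.02 * np.eye(3)                      # experimental covariance
        x1, P1 = kalman_update(x0, P0, y, Cy, G)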

  2. Quantification of isotopic turnover in agricultural systems

    NASA Astrophysics Data System (ADS)

    Braun, A.; Auerswald, K.; Schnyder, H.

    2012-04-01

    The isotopic turnover, which is a proxy for the metabolic rate, is gaining scientific importance. It is quantified for an increasing range of organisms, from microorganisms through plants to animals, including agricultural livestock. Additionally, the isotopic turnover is analyzed on different scales, from organs to organisms to ecosystems and even to the biosphere. In particular, the quantification of the isotopic turnover of specific tissues within the same organism, e.g. organs like liver and muscle and products like milk and faeces, has brought new insights that improve the understanding of nutrient cycles and fluxes. Thus, knowledge of isotopic turnover is important in many areas, including physiology (e.g. milk synthesis), ecology (e.g. soil retention time of water), and medical science (e.g. cancer diagnosis). So far, the isotopic turnover is quantified by applying time-, cost- and expertise-intensive tracer experiments. Usually, this comprises two isotopic equilibration periods. A first equilibration period with a constant isotopic input signal is followed by a second equilibration period with a distinct constant isotopic input signal. This yields a smooth signal change from the first to the second signal in the object under consideration. This approach presents at least three major problems. (i) The input signals must be controlled isotopically, which is almost impossible in many realistic cases like free-ranging animals. (ii) Both equilibration periods may be very long, especially when the turnover rate of the object under consideration is very slow, which aggravates the first problem. (iii) The detection of small or slow pools is improved by large isotopic signal changes, but large isotopic changes also involve a considerable change in the input material; e.g. animal studies are usually carried out as diet-switch experiments, where the diet is switched between C3 and C4 plants, since C3 and C4 plants differ strongly in their isotopic signal.
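
    The classical diet-switch analysis that such experiments rely on reduces to fitting an exponential approach toward the new equilibrium signal, as sketched below with invented δ13C values:

        import numpy as np
        from scipy.optimize import curve_fit

        def turnover(t, delta_new, delta_old, k):
            # isotopic signal during equilibration after a diet switch
            return delta_new + (delta_old - delta_new) * np.exp(-k * t)

        t_days = np.array([0, 2, 5, 10, 20, 40, 80])
        delta13C = np.array([-27.0, -24.8, -22.1, -18.4, -14.9, -13.2, -12.6])

        popt, pcov = curve_fit(turnover, t_days, delta13C,
                               p0=(-12.5, -27.0, 0.1))
        delta_new, delta_old, k = popt
        half_life = np.log(2) / k     # days to cover half the signal change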

  3. Quantification of microvessels in canine lymph nodes.

    PubMed

    Tonar, Zbynĕk; Egger, Gunter F; Witter, Kirsti; Wolfesberger, Birgitt

    2008-10-01

    Quantification of microvessels in tumors is mostly based on counts of vessel profiles in tumor hot spots. Drawbacks of this method include low reproducibility and large interobserver variance, mainly as a result of individual differences in sampling of image fields for analysis. Our aim was to test an unbiased method for quantifying microvessels in healthy and tumorous lymph nodes of dogs. The endothelium of blood vessels was detected in paraffin sections by a combination of immunohistochemistry (von Willebrand factor) and lectin histochemistry (wheat germ agglutinin) in comparison with detection of basal laminae by laminin immunohistochemistry or silver impregnation. Systematic uniform random sampling of 50 image fields was performed during photo-documentation. An unbiased counting frame (area 113,600 microm(2)) was applied to each micrograph. The total area sampled from each node was 5.68 mm(2). Vessel profiles were counted according to stereological counting rules. Inter- and intraobserver variabilities were tested. The application of systematic uniform random sampling was compared with the counting of vessel profiles in hot spots. The unbiased estimate of the number of vessel profiles per unit area ranged from 100.5 +/- 44.0/mm(2) to 442.6 +/- 102.5/mm(2) in contrast to 264 +/- 72.2/mm(2) to 771.0 +/- 108.2/mm(2) in hot spots. The advantage of using systematic uniform random sampling is its reproducibility, with reasonable interobserver and low intraobserver variance. This method also allows for the possibility of using archival material, because staining quality is not limiting as it is for image analysis, and artifacts can easily be excluded. However, this method is comparatively time-consuming.
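
    The unbiased density estimate behind this sampling scheme is simply the pooled count divided by the total sampled area. The sketch below uses the frame area and field number given above; the per-frame counts are invented for illustration:

        # frame area and number of fields are taken from the abstract above;
        # the per-frame counts are invented
        frame_area_um2 = 113_600
        n_fields = 50
        counts = [3, 0, 5, 2, 4] * 10                 # profiles per frame

        total_area_mm2 = n_fields * frame_area_um2 / 1e6   # = 5.68 mm(2)
        density_per_mm2 = sum(counts) / total_area_mm2     # profiles per mm(2)
        print(round(density_per_mm2, 1))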

  4. Rapid digital quantification of microfracture populations

    NASA Astrophysics Data System (ADS)

    Gomez, Leonel A.; Laubach, Stephen E.

    2006-03-01

    Populations of microfractures are a structural fabric in many rocks deformed at upper crustal conditions. In some cases these fractures are visible in transmitted-light microscopy as fluid-inclusion planes or cement-filled microfractures, but because SEM-based cathodoluminescence (CL) reveals more fractures and delineates their shapes, sizes, and crosscutting relations, it is a more effective structural tool. Yet at magnifications of 150-300×, at which many microfractures are visible, SEM-CL detectors image only small sample areas (0.5-0.1 mm(2)) relative to fracture population patterns. The substantial effort required to image and measure centimeter-size areas at high magnification has impeded quantitative study of microfractures. We present a method for efficient collection of mosaics of high-resolution CL imagery, a preparation method that allows samples to be any size while retaining continuous imagery of rock (no gaps), and software that facilitates fracture mapping and data reduction. Although the method introduced here was developed for CL imagery, it can be used with any other kind of images, including mosaics from petrographic microscopes. Compared with manual measurements, the new method increases several-fold the number of microfractures imaged without a proportional increase in level of effort, increases the accuracy and repeatability of fracture measurements, and speeds the quantification and display of fracture population attributes. We illustrate the method on microfracture arrays in dolostone from NE Mexico and sandstone from NW Scotland. We show that key aspects of microfracture population attributes are only fully manifest at scales larger than a single thin section.

  5. Fluorometric quantification of natural inorganic polyphosphate.

    PubMed

    Diaz, Julia M; Ingall, Ellery D

    2010-06-15

    Polyphosphate, a linear polymer of orthophosphate, is abundant in the environment and a key component in wastewater treatment and many bioremediation processes. Despite the broad relevance of polyphosphate, current methods to quantify it possess significant disadvantages. Here, we describe a new approach for the direct quantification of inorganic polyphosphate in complex natural samples. The protocol relies on the interaction between the fluorochrome 4',6-diamidino-2-phenylindole (DAPI) and dissolved polyphosphate. With the DAPI-based approach we describe, polyphosphate can be quantified at concentrations ranging from 0.5 to 3 µM P in a neutral-buffered freshwater matrix with an accuracy of ±0.03 µM P. The patterns of polyphosphate concentration versus fluorescence yielded by standards exhibit no chain-length dependence across polyphosphates ranging from 15 to 130 phosphorus units in size. Shorter polyphosphate molecules (e.g., polyphosphate of three and five phosphorus units in length) contribute little to no signal in this approach, as these molecules react only slightly or not at all with DAPI in the concentration range tested. The presence of salt suppresses fluorescence from intermediate polyphosphate chain lengths (e.g., 15 phosphorus units) at polyphosphate concentrations ranging from 0.5 to 3 µM P. For longer chain lengths (e.g., 45-130 phosphorus units), this salt interference is not evident at conductivities up to approximately 10 mS/cm. Our results indicate that standard polyphosphates should be stored frozen for no longer than 10-15 days to avoid inconsistent results associated with standard degradation. We have applied the fluorometric protocol to the analysis of five well-characterized natural samples to demonstrate the use of the method. PMID:20507063
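
    Quantification against such standards reduces to inverting a linear calibration; a short sketch with invented fluorescence readings (a real curve must of course be built from the DAPI protocol itself):

        import numpy as np

        conc = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])        # standards, µM P
        fluor = np.array([110., 205., 318., 402., 511., 598.])  # hypothetical signal

        slope, intercept = np.polyfit(conc, fluor, 1)  # linear over 0.5-3 µM P

        def quantify(sample_fluorescence):
            """Estimate µM P in an unknown from the fitted standard curve."""
            return (sample_fluorescence - intercept) / slope

        print(f"{quantify(350.0):.2f} µM P")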

  6. Designing a simple physically-based bucket SVAT model for spatialisation of water needs

    NASA Astrophysics Data System (ADS)

    Lakhal, A.; Boulet, G.; Lakhal, L.; Er-Raki, S.; Duchemin, B.; Chehbouni, G.; Timouk, F.

    2003-04-01

    Within the frame of both the IRRIMED and SUDMED projects, a robust and simple tool is needed to provide space-time estimates of water requirements in flat semi-arid agricultural zones. This is the task of the simplest water balance equations, which can be seen as simple SVAT schemes. Most of the simplest SVAT schemes use the classical bucket representation of soil moisture exchange through the soil-canopy-air continuum. They usually rely on empirical relationships, such as the “beta function”, that are not well suited to all climate, soil and vegetation conditions. Some of them, for instance, greatly simplify the deep drainage parameterization, or overlook the first- to second-stage evaporation processes. Several authors have proposed physically based simple expressions, such as the desorptive approach, which gives accurate integrated capillary flows under constant boundary conditions. We propose here a simple SVAT scheme that uses the same approach but reduces as much as possible the number of empirical relationships. It is tested against 1) the physically based complex SVAT scheme SiSPAT and 2) experimental data acquired during the SALSA and SUDMED field experiments in Mexico and Morocco (respectively) for a large range of vegetation types (olive trees, wheat crop, grassland). This simple SVAT is well suited to simulating long time series of soil moisture evolution, and proves to give accurate predictions of first- to second-stage evaporation time series for bare-soil and fully vegetated cover conditions. An insight into model adjustment for sparse vegetation (which usually prevails under semi-arid conditions) is proposed and partially evaluated against SiSPAT outputs.
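
    A minimal bucket of the kind described — stage-one evaporation at the potential rate, then a desorptive square-root-of-time decline — might look like the following; the stage-I threshold and desorptivity are placeholders, not the calibrated model:

        def bucket_evaporation(e_pot, U=9.0, Sd=4.0):
            """Two-stage bare-soil evaporation (mm/day).

            Stage I proceeds at the potential rate until cumulative
            evaporation reaches U (mm); stage II follows the desorptive
            t**0.5 law with desorptivity Sd (mm/day**0.5). All parameter
            values here are illustrative only.
            """
            cumulative, t2, series = 0.0, 0, []
            for ep in e_pot:
                if cumulative < U:          # first stage: energy-limited
                    e = ep
                else:                       # second stage: soil-limited
                    t2 += 1
                    e = min(ep, Sd * (t2 ** 0.5 - (t2 - 1) ** 0.5))
                cumulative += e
                series.append(e)
            return series

        print(bucket_evaporation([5.0] * 10))   # 10 days at 5 mm/day potential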

  7. Quantification of chemical gaseous plumes on hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Niu, Sidi

    The passive remote chemical plume quantification problem may be approached from multiple aspects, corresponding to a variety of physical effects that may be exploited. Accordingly, a diversity of statistical quantification algorithms has been proposed in the literature. The ultimate performance and algorithmic complexity of each is influenced by the assumptions made about the scene, which may include the presence of ancillary measurements or particular background/plume features that may or may not be present. In this work, we evaluate and investigate the advantages and limitations of a number of quantification algorithms that span a variety of such assumptions. With the in-depth insights gained, a new algorithm is proposed for single-gas quantification that is superior to state-of-the-art algorithms in almost every aspect, including applicability, accuracy, and efficiency. The new method, called the selected-band algorithm, achieves its superior performance through an accurate estimation of the unobservable off-plume radiance. The off-plume radiance is recoverable because of a common observation: most chemical gases exhibit strong absorptive behavior only in certain spectral bands. The spectral bands where the gas absorption is almost zero or small are ideal for carrying out background estimation. In this thesis, the new selected-band algorithm is first derived for the favorable case of a gas with narrow-band, sharp spectral features and then extended to an iterative algorithm that suits all kinds of gases. The performance improvement is verified on simulated data for a variety of experimental settings.
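
    The core idea — recover the background from channels where the gas barely absorbs, then invert the strongly absorbing channels — can be caricatured in a few lines; the spectral model here is a deliberately crude linearized Beer's law, not the thesis algorithm:

        import numpy as np

        def selected_band_quantify(radiance, absorb_coef, threshold=0.05):
            """Estimate a gas column density from one pixel spectrum."""
            weak = absorb_coef < threshold * absorb_coef.max()
            x = np.arange(radiance.size)
            # off-plume radiance fitted on weak-absorption channels only
            bg = np.polyval(np.polyfit(x[weak], radiance[weak], 2), x)
            strong = ~weak
            y = 1.0 - radiance[strong] / bg[strong]   # relative plume depletion
            cl, *_ = np.linalg.lstsq(absorb_coef[strong, None], y, rcond=None)
            return float(cl[0])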

  8. GMO quantification: valuable experience and insights for the future.

    PubMed

    Milavec, Mojca; Dobnik, David; Yang, Litao; Zhang, Dabing; Gruden, Kristina; Zel, Jana

    2014-10-01

    Cultivation and marketing of genetically modified organisms (GMOs) have been unevenly adopted worldwide. To facilitate international trade and to provide information to consumers, labelling requirements have been set up in many countries. Quantitative real-time polymerase chain reaction (qPCR) is currently the method of choice for detection, identification and quantification of GMOs. This has been critically assessed and the requirements for the method performance have been set. Nevertheless, there are challenges that should still be highlighted, such as measuring the quantity and quality of DNA, and determining the qPCR efficiency, possible sequence mismatches, characteristics of taxon-specific genes and appropriate units of measurement, as these remain potential sources of measurement uncertainty. To overcome these problems and to cope with the continuous increase in the number and variety of GMOs, new approaches are needed. Statistical strategies of quantification have already been proposed and expanded with the development of digital PCR. The first attempts have been made to use new generation sequencing also for quantitative purposes, although accurate quantification of the contents of GMOs using this technology is still a challenge for the future, and especially for mixed samples. New approaches are needed also for the quantification of stacks, and for potential quantification of organisms produced by new plant breeding techniques. PMID:25182968
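
    In the simplest qPCR reading of GM content, the transgene is expressed as a copy-number ratio to a taxon-specific reference gene; the sketch below assumes equal amplification efficiencies for both assays, exactly the kind of simplification the authors flag as a source of measurement uncertainty:

        def gmo_percent(cq_transgene, cq_reference, efficiency=2.0):
            """GM content (%) from quantification-cycle (Cq) values,
            assuming the same amplification efficiency for both assays."""
            return 100.0 * efficiency ** (cq_reference - cq_transgene)

        print(f"{gmo_percent(31.2, 24.9):.2f}% GM")   # hypothetical Cq values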

  10. Accurate Peptide Fragment Mass Analysis: Multiplexed Peptide Identification and Quantification

    PubMed Central

    Weisbrod, Chad R.; Eng, Jimmy K.; Hoopmann, Michael R.; Baker, Tahmina; Bruce, James E.

    2012-01-01

    FT All Reaction Monitoring (FT-ARM) is a novel approach for the identification and quantification of peptides that relies upon the selectivity of high mass accuracy data and the specificity of peptide fragmentation patterns. An FT-ARM experiment involves continuous, data-independent, high mass accuracy MS/MS acquisition spanning a defined m/z range. Custom software was developed to search peptides against the multiplexed fragmentation spectra by comparing theoretical or empirical fragment ions against every fragmentation spectrum across the entire acquisition. A dot product score is calculated against each spectrum in order to generate a score chromatogram used for both identification and quantification. Chromatographic elution profile characteristics are not used to cluster precursor peptide signals to their respective fragment ions. FT-ARM identifications are demonstrated to be complementary to conventional data-dependent shotgun analysis, especially in cases where the data-dependent method fails due to fragmenting multiple overlapping precursors. The sensitivity, robustness and specificity of FT-ARM quantification are shown to be analogous to selected reaction monitoring-based peptide quantification with the added benefit of minimal assay development. Thus, FT-ARM is demonstrated to be a novel and complementary data acquisition, identification, and quantification method for the large scale analysis of peptides. PMID:22288382
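
    The score chromatogram is conceptually a normalized dot product of a peptide's fragment list against every acquired spectrum; a compact sketch follows (the tolerance, data structures and unit theoretical intensities are assumptions for illustration, not the FT-ARM implementation):

        import numpy as np

        def score_chromatogram(spectra, theo_mz, tol=0.01):
            """One score per MS/MS spectrum; `spectra` is a list of
            (mz_array, intensity_array) pairs from the acquisition."""
            theo = np.ones(len(theo_mz))   # unit theoretical intensities
            scores = []
            for mz, inten in spectra:
                matched = np.zeros(len(theo_mz))
                for i, t in enumerate(theo_mz):
                    hits = np.abs(mz - t) < tol
                    if hits.any():
                        matched[i] = inten[hits].max()
                denom = np.linalg.norm(matched) * np.linalg.norm(theo)
                scores.append(float(matched @ theo / denom) if denom else 0.0)
            return scores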

  11. Rapid and portable electrochemical quantification of phosphorus.

    PubMed

    Kolliopoulos, Athanasios V; Kampouris, Dimitrios K; Banks, Craig E

    2015-04-21

    Phosphorus is one of the key indicators of eutrophication levels in natural waters, where it exists mainly as dissolved phosphorus. Various analytical protocols exist to provide off-site analysis, but a point-of-site analysis is required. The current standard method recommended by the Environmental Protection Agency (EPA) for the detection of total phosphorus is colorimetric and based upon the color of a phosphomolybdate complex formed as a result of the reaction between orthophosphates and molybdate ions, where ascorbic acid and antimony potassium tartrate are added and serve as reducing agents. Prior to the measurements, all forms of phosphorus are converted into orthophosphates via sample digestion (heating and acidifying). The work presented here details an electrochemical adaptation of this EPA-recommended colorimetric approach for the measurement of dissolved phosphorus in water samples using screen-printed graphite macroelectrodes for the first time. This novel indirect electrochemical sensing protocol allows the determination of orthophosphates over the range from 0.5 to 20 µg L⁻¹ in ideal pH 1 solutions utilizing cyclic voltammetry, with a limit of detection (3σ) found to correspond to 0.3 µg L⁻¹ of phosphorus. The reaction time and the influence of foreign ions (potential interferents) upon this electroanalytical protocol were also investigated, and it was found that a reaction time of 5 min, which is essential in the standard colorimetric approach, is not required in the new electrochemically adapted protocol. The proposed electrochemical method was independently validated through the quantification of orthophosphates and total dissolved phosphorus in polluted water samples (canal water samples) with ion chromatography and ICP-OES, respectively. This novel electrochemical protocol exhibits advantages over the established EPA-recommended colorimetric determination of total phosphorus, with lower detection limits and shorter experimental times.
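
    The 3σ limit of detection quoted above follows from the calibration slope and the blank noise; a worked sketch with invented calibration data, chosen only to land in the same order of magnitude:

        import numpy as np

        conc = np.array([0.5, 2.0, 5.0, 10.0, 20.0])         # µg/L of P
        current = np.array([0.21, 0.84, 2.10, 4.20, 8.40])   # hypothetical µA

        slope, intercept = np.polyfit(conc, current, 1)
        sigma_blank = 0.042                 # SD of blank responses (hypothetical)
        print(f"LOD = {3 * sigma_blank / slope:.2f} µg/L")   # the 3-sigma criterion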

  12. Quantification of carbon nanomaterials in vivo.

    PubMed

    Wang, Haifang; Yang, Sheng-Tao; Cao, Aoneng; Liu, Yuanfang

    2013-03-19

    In this Account, we review the in vivo quantification methods for carbon NMs, focusing on isotopic labeling and tracing methods, and summarize the related labeling, purification, bio-sampling, and detection of carbon NMs. We also address the advantages, applicable situations, and limits of various labeling and tracing methods and propose guidelines for choosing suitable labeling methods. A collective analysis of the ADME information on various carbon NMs in vivo would provide general principles for understanding the fate of carbon NMs and the effects of chemical functionalization and aggregation of carbon NMs on their ADME/T in vivo and their implications in nanotoxicology and biosafety evaluations.

  13. Uncertainty Quantification Techniques of SCALE/TSUNAMI

    SciTech Connect

    Rearden, Bradley T; Mueller, Don

    2011-01-01

    The Standardized Computer Analysis for Licensing Evaluation (SCALE) code system developed at Oak Ridge National Laboratory (ORNL) includes Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI). The TSUNAMI code suite can quantify the predicted change in system responses, such as k_eff, reactivity differences, or ratios of fluxes or reaction rates, due to changes in the energy-dependent, nuclide-reaction-specific cross-section data. Where uncertainties in the neutron cross-section data are available, the sensitivity of the system to the cross-section data can be applied to propagate the uncertainties in the cross-section data to an uncertainty in the system response. Uncertainty quantification is useful for identifying potential sources of computational biases and highlighting parameters important to code validation. Traditional validation techniques often examine one or more average physical parameters to characterize a system and identify applicable benchmark experiments. However, with TSUNAMI, correlation coefficients are developed by propagating the uncertainties in neutron cross-section data to uncertainties in the computed responses for experiments and safety applications through sensitivity coefficients. The bias in the experiments, as a function of their correlation coefficient with the intended application, is extrapolated to predict the bias and bias uncertainty in the application through trending analysis or generalized linear least squares techniques, often referred to as 'data adjustment.' Even with advanced tools to identify benchmark experiments, analysts occasionally find that the application models include some feature or material for which adequately similar benchmark experiments do not exist to support validation. For example, a criticality safety analyst may want to take credit for the presence of fission products in spent nuclear fuel. In such cases, analysts sometimes rely on 'expert judgment' to select an…
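
    The propagation step described here is the classical "sandwich rule": a response uncertainty is obtained by folding a sensitivity vector through a cross-section covariance matrix. A toy sketch with illustrative numbers (not evaluated nuclear data):

        import numpy as np

        def response_uncertainty(sensitivities, covariance):
            """Relative uncertainty on a response such as k_eff:
            sqrt(S^T C S), with S in (dR/R)/(dσ/σ) units."""
            s = np.asarray(sensitivities)
            return float(np.sqrt(s @ np.asarray(covariance) @ s))

        S = np.array([0.45, -0.12, 0.08])          # sensitivity coefficients
        C = np.array([[4.0, 0.5, 0.0],
                      [0.5, 9.0, 0.1],
                      [0.0, 0.1, 1.0]]) * 1e-4     # relative covariance matrix
        print(f"{100 * response_uncertainty(S, C):.2f}% on k_eff")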

  15. Symmetry quantification and mapping using convergent beam electron diffraction.

    PubMed

    Kim, Kyou-Hyun; Zuo, Jian-Min

    2013-01-01

    We propose a new algorithm to quantify symmetry recorded in convergent beam electron diffraction (CBED) patterns and use it for symmetry mapping in materials applications. We evaluate the effectiveness of the profile R-factor (R(p)) and the normalized cross-correlation coefficient (γ) for quantifying the amount of symmetry in a CBED pattern. The symmetry quantification procedures are automated and the algorithm is implemented as a DM (Digital Micrograph(©)) script. Experimental and simulated CBED patterns recorded from a Si single crystal are used to calibrate the proposed algorithm for the symmetry quantification. The proposed algorithm is then applied to a Si sample with defects to test the sensitivity of symmetry quantification to defects. Using the mirror symmetry as an example, we demonstrate that the normalized cross-correlation coefficient provides an effective and robust measurement of the symmetry recorded in experimental CBED patterns. PMID:23142747
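
    For the mirror case, the normalized cross-correlation amounts to comparing the pattern with its reflection; a minimal sketch of that statistic (the published procedure additionally handles centering, masking and other symmetry elements):

        import numpy as np

        def mirror_gamma(pattern):
            """Normalized cross-correlation of a 2-D CBED intensity array
            with its mirror about the vertical axis; 1.0 = perfect mirror."""
            a = pattern - pattern.mean()
            b = np.fliplr(pattern) - pattern.mean()
            return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))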

  17. Real-Time PCR for Gene Expression Quantification in Asthma.

    PubMed

    Segundo-Val, Ignacio San; García-Solaesa, Virginia; García-Sánchez, Asunción

    2016-01-01

    The quantitative real-time PCR (qPCR) has become the reference technique for studying gene expression in recent years. The application of qPCR to the study of asthma provides very useful information regarding the gene expression mechanisms. The quantification of RNA from cDNA can be performed by using fluorescent dyes or specific sequence probes. Here, we describe the protocol to quantify gene expression levels using SYBR Green as fluorescent dye. The protocol starts with the RNA extraction, followed by reverse transcription to obtain cDNA, quantification and finally data analysis. PMID:27300530
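
    Once the quantification-cycle values are in hand, a common analysis step (not spelled out in this abstract) is Livak's 2^-ΔΔCt relative quantification; Ct values below are hypothetical:

        def relative_expression(ct_target, ct_ref, ct_target_ctrl, ct_ref_ctrl):
            """2^-ΔΔCt fold change, assuming ~100% amplification efficiency
            for both the target and the reference amplicon."""
            ddct = (ct_target - ct_ref) - (ct_target_ctrl - ct_ref_ctrl)
            return 2.0 ** (-ddct)

        # asthma sample vs. control, one reference gene
        print(f"{relative_expression(24.1, 18.0, 26.3, 18.2):.1f}-fold")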

  18. Quantification of Line Tracking Solutions for Automotive Applications

    NASA Astrophysics Data System (ADS)

    Shi, Jane; Rourke, Rick F.; Groll, Dave; Tavora, Peter W.

    Unlike line tracking in automotive painting applications, line tracking for automotive general assembly applications requires position tracking in order to perform assembly operations to a required assembly tolerance. Line tracking quantification experiments have been designed and conducted for a total of 16 test cases for two line tracking scenarios with three types of line tracking solutions: encoder based tracking, encoder plus static vision based tracking, and the analog sensor-based tracking for general assembly robotic automation. This chapter presents the quantification results, identifies key performance drivers, and illustrates their implications for automotive assembly applications.

  19. Quantification is Neither Necessary Nor Sufficient for Measurement

    NASA Astrophysics Data System (ADS)

    Mari, Luca; Maul, Andrew; Torres Irribarra, David; Wilson, Mark

    2013-09-01

    Being an infrastructural, widespread activity, measurement is laden with stereotypes. Some of these concern the role of measurement in the relation between quality and quantity. In particular, it is sometimes argued or assumed that quantification is necessary for measurement; it is also sometimes argued or assumed that quantification is sufficient for or synonymous with measurement. To assess the validity of these positions the concepts of measurement and quantitative evaluation should be independently defined and their relationship analyzed. We contend that the defining characteristic of measurement should be the structure of the process, not a feature of its results. Under this perspective, quantitative evaluation is neither sufficient nor necessary for measurement.

  20. Clinical PET Myocardial Perfusion Imaging and Flow Quantification.

    PubMed

    Juneau, Daniel; Erthal, Fernanda; Ohira, Hiroshi; Mc Ardle, Brian; Hessian, Renée; deKemp, Robert A; Beanlands, Rob S B

    2016-02-01

    Cardiac PET imaging is a powerful tool for the assessment of coronary artery disease. Many tracers with different advantages and disadvantages are available. It has several advantages over single photon emission computed tomography, including superior accuracy and lower radiation exposure. It provides powerful prognostic information, which can help to stratify patients and guide clinicians. The addition of flow quantification enables better detection of multivessel disease while providing incremental prognostic information. Flow quantification provides important physiologic information, which may be useful to individualize patient therapy. This approach is being applied in some centers, but requires standardization before it is more widely applied. PMID:26590781

  1. Detection and quantification of chimerism by droplet digital PCR.

    PubMed

    George, David; Czech, Juliann; John, Bobby; Yu, Min; Jennings, Lawrence J

    2013-01-01

    Accurate quantification of chimerism and microchimerism is proving to be increasingly valuable for hematopoietic cell transplantation as well as non-transplant conditions. However, methods that are available to quantify low-level chimerism lack accuracy. Therefore, we developed and validated a method for quantifying chimerism based on digital PCR technology. We demonstrate accurate quantification that far exceeds what is possible with analog qPCR down to 0.01% with the potential to go even lower. Also, this method is inherently more informative than qPCR. We expect the advantages of digital PCR will make it the preferred method for chimerism analysis.
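
    Digital PCR quantification rests on Poisson statistics: the mean copies per partition follow from the fraction of positive droplets. A small sketch of that calculation applied to chimerism (the droplet counts are hypothetical):

        import math

        def copies_per_droplet(n_positive, n_total):
            """Poisson-corrected mean target copies per droplet."""
            return -math.log(1.0 - n_positive / n_total)

        def percent_chimerism(pos_recipient, pos_reference, n_droplets=20000):
            """Recipient-specific targets as a fraction of a reference assay."""
            lam_r = copies_per_droplet(pos_recipient, n_droplets)
            lam_ref = copies_per_droplet(pos_reference, n_droplets)
            return 100.0 * lam_r / lam_ref

        print(f"{percent_chimerism(25, 15000):.3f}% recipient chimerism")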

  2. Brief review of uncertainty quantification for particle image velocimetry

    NASA Astrophysics Data System (ADS)

    Farias, M. H.; Teixeira, R. S.; Koiller, J.; Santos, A. M.

    2016-07-01

    Metrological studies for particle image velocimetry (PIV) are recent in the literature, and attempts to evaluate the uncertainty quantification (UQ) of PIV velocity fields are now in evidence. A short review of the main sources of uncertainty in PIV and of the available methodologies for their quantification is therefore presented. In addition, the potential of some mathematical techniques coming from the area of geometric mechanics and control that could interest the fluids UQ community is highlighted. “We must measure what is measurable and make measurable what cannot be measured” (Galileo)

  3. A quick colorimetric method for total lipid quantification in microalgae.

    PubMed

    Byreddy, Avinesh R; Gupta, Adarsha; Barrow, Colin J; Puri, Munish

    2016-06-01

    Discovering microalgae with high lipid productivity is among the key milestones for achieving sustainable biodiesel production. Current methods of lipid quantification are time-intensive and costly. A rapid colorimetric method based on the sulfo-phospho-vanillin (SPV) reaction was developed for the quantification of microbial lipids to facilitate screening for lipid-producing microalgae. This method was successfully tested on marine thraustochytrid strains and vegetable oils. The colorimetric method results correlated well with gravimetric method estimates. The new method was less time-consuming than gravimetric analysis and is quantitative for lipid determination, even in the presence of carbohydrates, proteins and glycerol. PMID:27050419

  4. Quantification of toxicological effects for dichloromethane. Draft report (Final)

    SciTech Connect

    Not Available

    1990-04-01

    The source documents for background information used to develop the report on the quantification of toxicological effects for dichloromethane are the health assessment document (HAD) for dichloromethane and a subsequent addendum to the HAD (U.S. EPA, 1985b). In addition, some references published since 1985 are discussed. To summarize the results of the quantification of toxicological effects, a One-day Health Advisory of 10,000 ug/L for a 10-kg child was calculated, based on an acute oral study in rats reported by Kimura et al. (1971). No suitable data for the derivation of a Ten-day Health Advisory were found in the available literature.
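
    For context, one-day Health Advisories of this kind follow the standard formula HA = (NOAEL × BW) / (UF × DWI), where BW is the body weight (10 kg here), DWI the assumed daily drinking-water intake (conventionally 1 L/day for a child), and UF the composite uncertainty factor; the specific NOAEL and UF applied to the Kimura et al. (1971) data are not restated in this summary.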

  5. Practical quantification of necrosis in histological whole-slide images.

    PubMed

    Homeyer, André; Schenk, Andrea; Arlt, Janine; Dahmen, Uta; Dirsch, Olaf; Hahn, Horst K

    2013-06-01

    Since the histological quantification of necrosis is a common task in medical research and practice, we evaluate different image analysis methods for quantifying necrosis in whole-slide images. In a practical usage scenario, we assess the impact of different classification algorithms and feature sets on both accuracy and computation time. We show how a well-chosen combination of multiresolution features and an efficient postprocessing step enables the accurate quantification of necrosis in gigapixel images in less than a minute. The results are general enough to be applied to other areas of histological image analysis as well. PMID:23796718

  6. Reliability quantification and visualization for electric microgrids

    NASA Astrophysics Data System (ADS)

    Panwar, Mayank

    and parallel with the area Electric Power Systems (EPS), (3) includes the local EPS and may include portions of the area EPS, and (4) is intentionally planned. A more reliable electric power grid requires microgrids to operate in tandem with the EPS. The reliability can be quantified through various metrics for performance measure. This is done through North American Electric Reliability Corporation (NERC) metrics in North America. The microgrid differs significantly from the traditional EPS, especially at asset level due to heterogeneity in assets. Thus, the performance cannot be quantified by the same metrics as used for EPS. Some of the NERC metrics are calculated and interpreted in this work to quantify performance for a single asset and group of assets in a microgrid. Two more metrics are introduced for system level performance quantification. The next step is a better representation of the large amount of data generated by the microgrid. Visualization is one such form of representation which is explored in detail and a graphical user interface (GUI) is developed as a deliverable tool to the operator for informative decision making and planning. Electronic appendices-I and II contain data and MATLAB© program codes for analysis and visualization for this work.

  7. Remote sensing for quantification of agronomic properties

    NASA Astrophysics Data System (ADS)

    Sullivan, Dana Grace

    Remote sensing (RS) may be used to rapidly assess surface features and facilitate natural resource management, precision agriculture and soil survey. Information obtained in such a way would streamline data collection and improve diagnostic capabilities. Current RS technology has had limited testing, particularly within the Southeast. Our study was designed to evaluate RS as a rapid assessment tool in three different natural resource applications: nitrogen (N) management in a corn crop (Zea mays L.), assessment of in situ crop residue cover, and quantification of near-surface soil properties. In 2000, study sites were established in four physiographic provinces of Alabama: Tennessee Valley, Ridge and Valley, Appalachian Plateau, and Coastal Plain. Spectral measurements were acquired via spectroradiometer (350-1050 nm), airborne ATLAS multispectral scanner (400-12,500 nm), and IKONOS satellite (450-900 nm). Corn plots were established from fresh-tilled ground in a completely randomized design at the Appalachian Plateau and Coastal Plain study sites in 2000. Plots received four N rates (0, 56, 112, and 168 kg N ha-1) and were maintained for three consecutive growing seasons. Spectroradiometer data were acquired biweekly from V6-R2, and ATLAS and IKONOS data were acquired per availability. Results showed vegetation indices derived from hand-held spectroradiometer measurements as early as V6-V8 were linearly related to yield and tissue N. ATLAS imagery showed promise at the AP site during the V6 stage (r2 = 0.66), but no significant relationships between plant N and IKONOS imagery were observed. Residue plots (15 m × 15 m) were established at the Appalachian Plateau and Coastal Plain in 2000 and 200. Residue treatments consisted of hand-applied wheat straw cover (0, 10, 20, 50, or 80%) arranged in a completely randomized design. Spectroradiometer data were acquired monthly, and ATLAS and IKONOS data were acquired per availability. Residue cover estimates were best with ATLAS
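
    The vegetation indices referred to are ratios of narrow-band reflectances; the familiar NDVI, computable directly from the spectroradiometer's red and near-infrared bands, is a representative example (whether NDVI itself was among the indices used is an assumption here, and the reflectance values below are hypothetical):

        def ndvi(nir, red):
            """Normalized difference vegetation index from near-infrared and
            red reflectance, both on a 0-1 scale."""
            return (nir - red) / (nir + red)

        print(f"NDVI = {ndvi(0.45, 0.08):.2f}")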

  8. Improved Strategies and Optimization of Calibration Models for Real-time PCR Absolute Quantification

    EPA Science Inventory

    Real-time PCR absolute quantification applications rely on the use of standard curves to make estimates of DNA target concentrations in unknown samples. Traditional absolute quantification approaches dictate that a standard curve must accompany each experimental run. However, t...

  9. Rapid Quantification and Validation of Lipid Concentrations within Liposomes.

    PubMed

    Roces, Carla B; Kastner, Elisabeth; Stone, Peter; Lowry, Deborah; Perrie, Yvonne

    2016-01-01

    Quantification of the lipid content in liposomal adjuvants for subunit vaccine formulation is of extreme importance, since this concentration impacts both efficacy and stability. In this paper, we outline a high performance liquid chromatography-evaporative light scattering detector (HPLC-ELSD) method that allows for the rapid and simultaneous quantification of lipid concentrations within liposomal systems prepared by three liposomal manufacturing techniques (lipid film hydration, high shear mixing, and microfluidics). The ELSD system was used to quantify four lipids: 1,2-dimyristoyl-sn-glycero-3-phosphocholine (DMPC), cholesterol, dimethyldioctadecylammonium (DDA) bromide, and ᴅ-(+)-trehalose 6,6'-dibehenate (TDB). The developed method offers rapidity, high sensitivity, direct linearity, and a good consistency on the responses (R² > 0.993 for the four lipids tested). The corresponding limit of detection (LOD) and limit of quantification (LOQ) were 0.11 and 0.36 mg/mL (DMPC), 0.02 and 0.80 mg/mL (cholesterol), 0.06 and 0.20 mg/mL (DDA), and 0.05 and 0.16 mg/mL (TDB), respectively. HPLC-ELSD was shown to be a rapid and effective method for the quantification of lipids within liposome formulations without the need for lipid extraction processes. PMID:27649231

  10. Detection and Quantification of Magnetically Labeled Cells by Cellular MRI

    PubMed Central

    Liu, Wei; Frank, Joseph A.

    2008-01-01

    Labeling cells with superparamagnetic iron oxide (SPIO) nanoparticles, paramagnetic contrast agent (gadolinium) or perfluorocarbons allows for the possibility of tracking single or clusters of labeled cells within target tissues following either direct implantation or intravenous injection. This review summarizes the practical issues regarding detection and quantification of magnetically labeled cells with various MRI contrast agents with a focus on SPIO nanoparticles. PMID:18995978

  11. Colorimetric Quantification and in Situ Detection of Collagen

    ERIC Educational Resources Information Center

    Esteban, Francisco J.; del Moral, Maria L.; Sanchez-Lopez, Ana M.; Blanco, Santos; Jimenez, Ana; Hernandez, Raquel; Pedrosa, Juan A.; Peinado, Maria A.

    2005-01-01

    A simple multidisciplinary and inexpensive laboratory exercise is proposed, in which the undergraduate student may correlate biochemical and anatomical findings. The entire practical session can be completed in one 2.5-3 hour laboratory period, and consists of the quantification of collagen and total protein content from tissue sections--without…

  12. Literacy and Language Education: The Quantification of Learning

    ERIC Educational Resources Information Center

    Gibb, Tara

    2015-01-01

    This chapter describes international policy contexts of adult literacy and language assessment and the shift toward standardization through measurement tools. It considers the implications the quantification of learning outcomes has for pedagogy and practice and for the social inclusion of transnational migrants.

  13. The Role of Uncertainty Quantification for Reactor Physics

    SciTech Connect

    Salvatores, Massimo; Palmiotti, Giuseppe; Aliberti, G.

    2015-01-01

    The quantification of uncertainties is a crucial step in design. The comparison of a priori uncertainties with the target accuracies allows needs and priorities for uncertainty reduction to be defined. In view of their impact, the uncertainty analysis requires a reliability assessment of the uncertainty data used. The choice of the appropriate approach and the consistency of different approaches are discussed.

  15. 15 CFR 990.52 - Injury assessment-quantification.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... incident. (c) Natural recovery. To quantify injury, trustees must estimate, quantitatively or qualitatively, the time for natural recovery without restoration, but including any response actions. The analysis of...—quantification. (a) General. In addition to determining whether injuries have resulted from the...

  16. 15 CFR 990.52 - Injury assessment-quantification.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... incident. (c) Natural recovery. To quantify injury, trustees must estimate, quantitatively or qualitatively, the time for natural recovery without restoration, but including any response actions. The analysis of...—quantification. (a) General. In addition to determining whether injuries have resulted from the...

  17. 15 CFR 990.52 - Injury assessment-quantification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... incident. (c) Natural recovery. To quantify injury, trustees must estimate, quantitatively or qualitatively, the time for natural recovery without restoration, but including any response actions. The analysis of...—quantification. (a) General. In addition to determining whether injuries have resulted from the...

  18. Infectious Viral Quantification of Chikungunya Virus-Virus Plaque Assay.

    PubMed

    Kaur, Parveen; Lee, Regina Ching Hua; Chu, Justin Jang Hann

    2016-01-01

    The plaque assay is an essential method for quantification of infectious virus titer. Cells infected with virus particles are overlaid with a viscous substrate. A suitable incubation period results in the formation of plaques, which can be fixed and stained for visualization. Here, we describe a method for measuring Chikungunya virus (CHIKV) titers via virus plaque assays.
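
    The titer calculation itself is a one-liner: plaques counted on a countable plate, corrected for dilution and inoculum volume (the numbers below are hypothetical):

        def titer_pfu_per_ml(plaques, dilution_factor, inoculum_ml):
            """Infectious titer in plaque-forming units per mL."""
            return plaques * dilution_factor / inoculum_ml

        # 42 plaques at a 1e5-fold dilution from a 0.1 mL inoculum
        print(f"{titer_pfu_per_ml(42, 1e5, 0.1):.2e} PFU/mL")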

  20. Quantification and Single-Spore Detection of Phakopsora pachyrhizi

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The microscopic identification and quantification of Phakopsora pachyrhizi spores from environmental samples, spore traps, and laboratory specimens can represent a challenge. Such reports, especially from passive spore traps, commonly describe the number of “rust-like” spores; for other forensic sa...

  1. Identification and Quantification Soil Redoximorphic Features by Digital Image Processing

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Soil redoximorphic features (SRFs) have provided scientists and land managers with insight into relative soil moisture for approximately 60 years. The overall objective of this study was to develop a new method of SRF identification and quantification from soil cores using a digital camera and imag...

  2. Evaluation of Digital PCR for Absolute RNA Quantification

    PubMed Central

    Sanders, Rebecca; Mason, Deborah J.; Foy, Carole A.; Huggett, Jim F.

    2013-01-01

    Gene expression measurements detailing mRNA quantities are widely employed in molecular biology and are increasingly important in diagnostic fields. Reverse transcription (RT), necessary for generating complementary DNA, can be both inefficient and imprecise, but remains a quintessential RNA analysis tool using qPCR. This study developed a Transcriptomic Calibration Material and assessed the RT reaction using digital (d)PCR for RNA measurement. While many studies characterise dPCR capabilities for DNA quantification, less work has been performed investigating similar parameters using RT-dPCR for RNA analysis. RT-dPCR measurement using three, one-step RT-qPCR kits was evaluated using single and multiplex formats when measuring endogenous and synthetic RNAs. The best performing kit was compared to UV quantification and sensitivity and technical reproducibility investigated. Our results demonstrate assay and kit dependent RT-dPCR measurements differed significantly compared to UV quantification. Different values were reported by different kits for each target, despite evaluation of identical samples using the same instrument. RT-dPCR did not display the strong inter-assay agreement previously described when analysing DNA. This study demonstrates that, as with DNA measurement, RT-dPCR is capable of accurate quantification of low copy RNA targets, but the results are both kit and target dependent supporting the need for calibration controls. PMID:24073259

  3. Comparison of DNA Quantification Methods for Next Generation Sequencing

    PubMed Central

    Robin, Jérôme D.; Ludlow, Andrew T.; LaRanger, Ryan; Wright, Woodring E.; Shay, Jerry W.

    2016-01-01

    Next Generation Sequencing (NGS) is a powerful tool that depends on loading a precise amount of DNA onto a flowcell. NGS strategies have expanded our ability to investigate genomic phenomena by referencing mutations in cancer and diseases through large-scale genotyping, developing methods to map rare chromatin interactions (4C; 5C and Hi-C) and identifying chromatin features associated with regulatory elements (ChIP-seq, Bis-Seq, ChiA-PET). While many methods are available for DNA library quantification, there is no unambiguous gold standard. Most techniques use PCR to amplify DNA libraries to obtain sufficient quantities for optical density measurement. However, increased PCR cycles can distort the library’s heterogeneity and prevent the detection of rare variants. In this analysis, we compared new digital PCR technologies (droplet digital PCR; ddPCR, ddPCR-Tail) with standard methods for the titration of NGS libraries. ddPCR-Tail is comparable to qPCR and fluorometry (QuBit) and allows sensitive quantification by analysis of barcode repartition after sequencing of multiplexed samples. This study provides a direct comparison between quantification methods throughout a complete sequencing experiment and provides the impetus to use ddPCR-based quantification for improvement of NGS quality. PMID:27048884

  4. Quantification of Wheat Grain Arabinoxylans Using a Phloroglucinol Colorimetric Assay

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Arabinoxylans (AX) play a critical role in end-use quality and nutrition of wheat (Triticum aestivum L.). An efficient, accurate method of AX quantification is desirable as AX plays an important role in processing, end use quality and human health. The objective of this work was to evaluate a stand...

  5. A Quantification Approach to Popular American Theatre: Outline.

    ERIC Educational Resources Information Center

    Woods, Alan

    A previously relatively unexplored area of theater history studies is the quantification of titles, authors, and locations of productions of plays in Canada and the United States. Little is known, for example, about the number of times any one play was staged, especially in the earlier days of American drama. A project which counts productions on…

  6. Proteomics of Microparticles with SILAC Quantification (PROMIS-Quan): A Novel Proteomic Method for Plasma Biomarker Quantification*

    PubMed Central

    Harel, Michal; Oren-Giladi, Pazit; Kaidar-Person, Orit; Shaked, Yuval; Geiger, Tamar

    2015-01-01

    Unbiased proteomic analysis of plasma samples holds the promise to reveal clinically invaluable disease biomarkers. However, the tremendous dynamic range of the plasma proteome has so far hampered the identification of such low abundant markers. To overcome this challenge we analyzed the plasma microparticle proteome, and reached an unprecedented depth of over 3000 plasma proteins in single runs. To add a quantitative dimension, we developed PROMIS-Quan—PROteomics of MIcroparticles with Super-Stable Isotope Labeling with Amino Acids in Cell Culture (SILAC) Quantification, a novel mass spectrometry-based technology for plasma microparticle proteome quantification. PROMIS-Quan enables a two-step relative and absolute SILAC quantification. First, plasma microparticle proteomes are quantified relative to a super-SILAC mix composed of cell lines from distinct origins. Next, the absolute amounts of selected proteins of interest are quantified relative to the super-SILAC mix. We applied PROMIS-Quan to prostate cancer and compared plasma microparticle samples of healthy individuals and prostate cancer patients. We identified in total 5374 plasma-microparticle proteins, and revealed a predictive signature of three proteins that were elevated in the patient-derived plasma microparticles. Finally, PROMIS-Quan enabled determination of the absolute quantitative changes in prostate specific antigen (PSA) upon treatment. We propose PROMIS-Quan as an innovative platform for biomarker discovery, validation, and quantification in both the biomedical research and clinical worlds. PMID:25624350

  7. Compositional Solution Space Quantification for Probabilistic Software Analysis

    NASA Technical Reports Server (NTRS)

    Borges, Mateus; Pasareanu, Corina S.; Filieri, Antonio; d'Amorim, Marcelo; Visser, Willem

    2014-01-01

    Probabilistic software analysis aims at quantifying how likely a target event is to occur during program execution. Current approaches rely on symbolic execution to identify the conditions to reach the target event and try to quantify the fraction of the input domain satisfying these conditions. Precise quantification is usually limited to linear constraints, while only approximate solutions can be provided in general through statistical approaches. However, statistical approaches may fail to converge to an acceptable accuracy within a reasonable time. We present a compositional statistical approach for the efficient quantification of solution spaces for arbitrarily complex constraints over bounded floating-point domains. The approach leverages interval constraint propagation to improve the accuracy of the estimation by focusing the sampling on the regions of the input domain containing the sought solutions. Preliminary experiments show significant improvement on previous approaches both in results accuracy and analysis time.
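
    The statistical baseline such work improves on is plain Monte Carlo estimation of the satisfying fraction of a bounded input domain; a compact sketch (the constraint and bounds are invented for illustration):

        import random

        def estimate_probability(constraint, bounds, n=100_000, seed=7):
            """Fraction of uniform samples from the box `bounds` that
            satisfy `constraint`."""
            rng = random.Random(seed)
            hits = sum(
                constraint([rng.uniform(lo, hi) for lo, hi in bounds])
                for _ in range(n))
            return hits / n

        p = estimate_probability(lambda v: v[0]**2 + v[1]**2 < 1.0,
                                 [(0.0, 1.0), (0.0, 1.0)])
        print(p)   # ≈ pi/4 for this quarter-circle constraint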

  8. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    SciTech Connect

    Langenbrunner, James R.; Booker, Jane M; Hemez, Francois M; Salazar, Issac F; Ross, Timothy J

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. To realistically assess uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore PRAs lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with the words 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step for the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.

  9. Good quantification practices of flavours and fragrances by mass spectrometry

    PubMed Central

    Begnaud, Frédéric

    2016-01-01

    Over the past 15 years, chromatographic techniques with mass spectrometric detection have been increasingly used to monitor the rapidly expanded list of regulated flavour and fragrance ingredients. This trend entails a need for good quantification practices suitable for complex media, especially for multi-analytes. In this article, we present experimental precautions needed to perform the analyses and ways to process the data according to the most recent approaches. This notably includes the identification of analytes during their quantification and method validation, when applied to real matrices, based on accuracy profiles. A brief survey of application studies based on such practices is given. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644977

  10. Quantification of viable helminth eggs in samples of sewage sludge.

    PubMed

    Rocha, Maria Carolina Vieira da; Barés, Monica Eboly; Braga, Maria Cristina Borba

    2016-10-15

    For the application of sewage sludge as fertilizer, the absence of pathogenic organisms, such as viable helminth eggs, is of fundamental importance. Thus, the quantification of these organisms has to be carried out by means of reliable and accurate methodologies. Nevertheless, to date there is no consensus with regard to the adoption of a universal methodology for the detection and quantification of viable helminth eggs. It is therefore necessary to instigate a debate on the different protocols currently in use, as well as to assemble relevant information in order to assist in the development of a more comprehensive and accurate method to quantify viable helminth eggs in samples of sewage sludge and its derivatives. PMID:27470467

  11. Quantification techniques and biodistribution of semiconductor quantum dots.

    PubMed

    Pic, Emilie; Bezdetnaya, Lina; Guillemin, François; Marchal, Frédéric

    2009-03-01

    Quantum dots (QDs) are fluorescent inorganic nanocrystals with advantageous optical properties, which have been applied for biomedical purposes including imaging, diagnostics, drug delivery and therapy. Potential toxicity of QDs remains the major barrier to clinical translation, and as such the precise analysis of in vivo QD distribution and pharmacokinetics is of major importance. Biodistribution studies in animal models are, however, sparse. The present review first provides a summary of the different techniques currently used for the relative quantification of QDs in vivo or their absolute quantification ex vivo. Fluorescence- and radioactivity-based techniques, along with mass-spectrometry detection at the elementary level, are addressed in this review. We then introduce biodistribution studies in animal models and discuss the possibilities of modifying quantum dot biodistribution as a function of the route of injection.

  12. Frequency feature based quantification of defect depth and thickness

    NASA Astrophysics Data System (ADS)

    Tian, Shulin; Chen, Kai; Bai, Libing; Cheng, Yuhua; Tian, Lulu; Zhang, Hong

    2014-06-01

    This study develops a frequency feature based pulsed eddy current method. A frequency feature, termed frequency to zero, is proposed for subsurface defects and metal loss quantification in metallic specimens. A curve fitting method is also employed to generate extra frequency components and improve the accuracy of the proposed method. Experimental validation is carried out. Conclusions and further work are derived on the basis of the studies.

  14. Visualization and Quantification of Rotor Tip Vortices in Helicopter Flows

    NASA Technical Reports Server (NTRS)

    Kao, David L.; Ahmad, Jasim U.; Holst, Terry L.

    2015-01-01

    This paper presents an automated approach for effective extraction, visualization, and quantification of vortex core radii from the Navier-Stokes simulations of a UH-60A rotor in forward flight. We adopt a scaled Q-criterion to determine vortex regions and then perform vortex core profiling in these regions to calculate vortex core radii. This method provides an efficient way of visualizing and quantifying the blade tip vortices. Moreover, the vortex core radii are displayed graphically in a plane.
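
    The Q-criterion used for vortex flagging compares rotation with strain in the velocity-gradient tensor; a pointwise sketch follows (the paper applies a scaled variant over the full CFD solution):

        import numpy as np

        def q_criterion(grad_u):
            """Q = 0.5 (||Omega||^2 - ||S||^2) from a 3x3 velocity-gradient
            tensor; positive where rotation dominates strain."""
            g = np.asarray(grad_u, dtype=float)
            s = 0.5 * (g + g.T)        # strain-rate tensor
            omega = 0.5 * (g - g.T)    # rotation tensor
            return 0.5 * (np.sum(omega**2) - np.sum(s**2))

        print(q_criterion([[0.0, -1.0, 0.0],
                           [1.0,  0.0, 0.0],
                           [0.0,  0.0, 0.0]]))   # solid-body rotation: Q > 0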

  15. Quantification of the Electrophilicity of Benzyne and Related Intermediates.

    PubMed

    Fine Nathel, Noah F; Morrill, Lucas A; Mayr, Herbert; Garg, Neil K

    2016-08-24

    The determination of reactivity parameters for short-lived intermediates provides an indispensable tool for synthetic design. Although electrophilicity parameters have now been established for more than 250 reactive species, the corresponding parameters for benzyne and related intermediates have not been uncovered. We report a study that has allowed the quantification of benzyne's electrophilicity parameter. Our approach relies on the strategic use of the diffusion-clock method and also provides electrophilicity parameters E for other substituted arynes.
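
    The parameter E enters rate predictions through the Mayr relationship log k = sN(N + E); a one-line sketch (the nucleophile parameters below are placeholders, not data from this paper):

        def mayr_rate_constant(s_n, n_param, e_param):
            """Second-order rate constant (M^-1 s^-1, 20 °C) from the Mayr
            relationship log k = sN (N + E)."""
            return 10.0 ** (s_n * (n_param + e_param))

        print(f"k ≈ {mayr_rate_constant(0.9, 13.4, -5.0):.2e} M^-1 s^-1")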

  16. The Challenges of Credible Thermal Protection System Reliability Quantification

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.

    2013-01-01

    The paper discusses several of the challenges associated with developing a credible reliability estimate for a human-rated crew capsule thermal protection system. The process of developing such a credible estimate is subject to the quantification, modeling and propagation of numerous uncertainties within a probabilistic analysis. The development of specific investment recommendations among various potential testing and programmatic options, to improve the reliability prediction, is then accomplished through Bayesian analysis.

  17. Universal Quantification in a Constraint-Based Planner

    NASA Technical Reports Server (NTRS)

    Golden, Keith; Frank, Jeremy; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Constraints and universal quantification are both useful in planning, but handling universally quantified constraints presents some novel challenges. We present a general approach to proving the validity of universally quantified constraints. The approach essentially consists of checking that the constraint is not violated for all members of the universe. We show that this approach can sometimes be applied even when variable domains are infinite, and we present some useful special cases where this can be done efficiently.
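
    The validity check the authors describe is, at bottom, exhaustive verification over the (finite) universe; a tiny sketch of that core idea (the planner's actual constraint representation is far richer, and the file-quota example is invented):

        def holds_universally(constraint, universe):
            """True iff no member of the universe violates the constraint."""
            return all(constraint(member) for member in universe)

        # hypothetical: every file touched by a plan step fits the quota
        files = [("a.log", 120), ("b.log", 80), ("core", 512)]
        print(holds_universally(lambda f: f[1] <= 1024, files))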

  18. Quantification Of Margins And Uncertainties: A Bayesian Approach (full Paper)

    SciTech Connect

    Wallstrom, Timothy C

    2008-01-01

    Quantification of Margins and Uncertainties (QMU) is 'a formalism for dealing with the reliability of complex technical systems, and the confidence which can be placed in estimates of that reliability' (Eardley et al., 2005). In this paper, we show how QMU may be interpreted in the framework of Bayesian statistical inference, using a probabilistic network. The Bayesian approach clarifies the probabilistic underpinnings of the formalism and shows how the formalism can be used for decision-making.

  19. Diagnostic utility of droplet digital PCR for HIV reservoir quantification.

    PubMed

    Trypsteen, Wim; Kiselinova, Maja; Vandekerckhove, Linos; De Spiegelaere, Ward

    2016-01-01

    Quantitative real-time PCR (qPCR) is implemented in many molecular laboratories worldwide for the quantification of viral nucleic acids. However, over the last two decades, there has been renewed interest in the concept of digital PCR (dPCR) as this platform offers direct quantification without the need for standard curves, a simplified workflow and the possibility to extend the current detection limit. These benefits are of great interest in terms of the quantification of low viral levels in HIV reservoir research because changes in the dynamics of residual HIV reservoirs will be important to monitor HIV cure efforts. Here, we have implemented a systematic literature screening and text mining approach to map the use of droplet dPCR (ddPCR) in the context of HIV quantification. In addition, several technical aspects of ddPCR were compared with qPCR: accuracy, sensitivity, precision and reproducibility, to determine its diagnostic utility. We have observed that ddPCR was used in different body compartments in multiple HIV-1 and HIV-2 assays, with the majority of reported assays focusing on HIV-1 DNA-based applications (i.e. total HIV DNA). Furthermore, ddPCR showed a higher accuracy, precision and reproducibility, but similar sensitivity when compared to qPCR due to reported false positive droplets in the negative template controls with a need for standardised data analysis (i.e. threshold determination). In the context of a low level of detection and HIV reservoir diagnostics, ddPCR can offer a valid alternative to qPCR-based assays but before this platform can be clinically accredited, some remaining issues need to be resolved. PMID:27482456
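
    The standard-curve-free quantification mentioned above rests on Poisson statistics over partitions; a minimal sketch (the droplet count and 0.85 nL droplet volume are typical ddPCR values assumed for illustration, not figures from this review):

    ```python
    import math

    def ddpcr_copies_per_ul(n_positive, n_total=20000, droplet_nl=0.85):
        """Absolute target concentration from droplet counts via the
        Poisson correction lambda = -ln(1 - p), copies per droplet."""
        p = n_positive / n_total
        lam = -math.log(1.0 - p)           # mean copies per droplet
        return lam / (droplet_nl * 1e-3)   # copies per microliter

    print(ddpcr_copies_per_ul(1500))       # ~92 copies/uL
    ```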

  20. Neutron-Encoded Protein Quantification by Peptide Carbamylation

    NASA Astrophysics Data System (ADS)

    Ulbrich, Arne; Merrill, Anna E.; Hebert, Alexander S.; Westphall, Michael S.; Keller, Mark P.; Attie, Alan D.; Coon, Joshua J.

    2014-01-01

    We describe a chemical tag for duplex proteome quantification using neutron encoding (NeuCode). The method utilizes the straightforward, efficient, and inexpensive carbamylation reaction. We demonstrate the utility of NeuCode carbamylation by accurately measuring quantitative ratios from tagged yeast lysates mixed in known ratios and by applying this method to quantify differential protein expression in mice fed either a control or a high-fat diet.

  1. Quantification of Efficiency of Beneficiation of Lunar Regolith

    NASA Technical Reports Server (NTRS)

    Trigwell, Steve; Lane, John; Captain, James; Weis, Kyle; Quinn, Jacqueline; Watanabe, Fumiya

    2011-01-01

    Electrostatic beneficiation of lunar regolith is being researched at Kennedy Space Center to enhance the ilmenite concentration of the regolith for the production of oxygen in in-situ resource utilization on the lunar surface. Ilmenite enrichment of up to 200% was achieved using lunar simulants. For the most accurate quantification of the regolith particles, standard petrographic methods are typically followed, but in order to optimize the process, many hundreds of samples were generated in this study, which made the standard analysis methods time-prohibitive. In the current studies, X-ray photoelectron spectroscopy (XPS) and scanning electron microscopy/energy dispersive spectroscopy (SEM/EDS) were used, which could automatically and quickly analyze many separated fractions of lunar simulant. In order to test the accuracy of the quantification, test mixture samples of known quantities of ilmenite (2, 5, 10, and 20 wt%) in silica (pure quartz powder) were analyzed by XPS and EDS. The results showed that quantification of low concentrations of ilmenite in silica could be accurately achieved by both XPS and EDS, knowing the limitations of the techniques.

  2. Initial water quantification results using neutron computed tomography

    NASA Astrophysics Data System (ADS)

    Heller, A. K.; Shi, L.; Brenizer, J. S.; Mench, M. M.

    2009-06-01

    Neutron computed tomography is an important imaging tool in the field of non-destructive testing and in fundamental research for many engineering applications. Contrary to X-rays, neutrons can be attenuated by some light materials, such as hydrogen, but can penetrate many heavy materials. Thus, neutron computed tomography is useful in obtaining important three-dimensional information about a sample's interior structure and material properties that other traditional methods cannot provide. The neutron computed tomography system at the Pennsylvania State University's Radiation Science and Engineering Center is being utilized to develop a water quantification technique for investigation of water distribution in fuel cells under normal conditions. A hollow aluminum cylinder test sample filled with a known volume of water was constructed for purposes of testing the quantification technique. Transmission images of the test sample at different angles were easily acquired through the synthesis of a dedicated image acquisition computer driving a rotary table controller and an in-house developed synchronization software package. After data acquisition, Octopus (version 8.2) and VGStudio Max (version 1.2) were used to perform cross-sectional and three-dimensional reconstructions of the sample, respectively. The initial reconstructions and water quantification results are presented.
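
    Conceptually, the water quantification rests on the Beer-Lambert law for neutron attenuation; a minimal sketch, with an assumed illustrative attenuation coefficient for thermal neutrons in water rather than a value calibrated in this study:

    ```python
    import numpy as np

    MU_WATER = 3.5  # cm^-1, assumed thermal-neutron attenuation coefficient

    def water_thickness_cm(I, I0, mu=MU_WATER):
        """Water path length from transmitted (I) and open-beam (I0)
        intensities: I = I0 * exp(-mu * t)  =>  t = -ln(I / I0) / mu."""
        return -np.log(np.asarray(I) / I0) / mu

    print(water_thickness_cm(I=700.0, I0=1000.0))  # ~0.10 cm
    ```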

  3. Radio-frequency energy quantification in magnetic resonance imaging

    NASA Astrophysics Data System (ADS)

    Alon, Leeor

    Mapping of radio frequency (RF) energy deposition has been challenging for 50+ years, especially when scanning patients in the magnetic resonance imaging (MRI) environment. As a result, electromagnetic simulation software is often used for estimating the specific absorption rate (SAR), the rate of RF energy deposition in tissue. The thesis work presents challenges associated with aligning information provided by electromagnetic simulation and MRI experiments. As a result of the limitations of simulations, experimental methods for the quantification of SAR were established. A system for quantification of the total RF energy deposition was developed for parallel transmit MRI (a system that uses multiple antennas to excite and image the body). The system is capable of monitoring and predicting channel-by-channel RF energy deposition and whole-body SAR, and of tracking potential hardware failures that occur in the transmit chain and may cause the deposition of excessive energy into patients. Similarly, we demonstrated that local RF power deposition can be mapped and predicted for parallel transmit systems based on a series of MRI temperature mapping acquisitions. Resulting from this work, we developed tools for optimal reconstruction of temperature maps from MRI acquisitions. The tools developed for temperature mapping paved the way for utilizing MRI as a diagnostic tool for evaluating the safety of RF/microwave-emitting devices. Quantification of the RF energy was demonstrated for both MRI-compatible and non-MRI-compatible devices (such as cell phones), while having the advantage of being noninvasive and of providing millimeter resolution and high accuracy.
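
    For reference, the quantity being mapped is conventionally defined from the electric field, tissue conductivity and density (a textbook definition, not a formula quoted from the thesis):

    ```latex
    \mathrm{SAR} = \frac{\sigma\,\lvert E \rvert^{2}}{\rho}
    ```

    where |E| is the rms electric field strength, sigma the tissue conductivity (S/m), and rho the tissue mass density (kg/m^3).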

  4. Zonated quantification of steatosis in an entire mouse liver.

    PubMed

    Schwen, Lars Ole; Homeyer, André; Schwier, Michael; Dahmen, Uta; Dirsch, Olaf; Schenk, Arne; Kuepfer, Lars; Preusser, Tobias; Schenk, Andrea

    2016-06-01

    Many physiological processes and pathological conditions in livers are spatially heterogeneous, forming patterns at the lobular length scale or varying across the organ. Steatosis, a common liver disease characterized by lipids accumulating in hepatocytes, exhibits heterogeneity at both these spatial scales. The main goal of the present study was to provide a method for zonated quantification of the steatosis patterns found in an entire mouse liver. As an example application, the results were employed in a pharmacokinetics simulation. For the analysis, an automatic detection of the lipid vacuoles was used in multiple slides of histological serial sections covering an entire mouse liver. Lobuli were determined semi-automatically and zones were defined within the lobuli. Subsequently, the lipid content of each zone was computed. The steatosis patterns were found to be predominantly periportal, with a notable organ-scale heterogeneity. The analysis provides a quantitative description of the extent of steatosis in unprecedented detail. The resulting steatosis patterns were successfully used as a perturbation to the liver as part of an exemplary whole-body pharmacokinetics simulation for the antitussive drug dextromethorphan. The zonated quantification is also applicable to other pathological conditions that can be detected in histological images. Besides being a descriptive research tool, this quantification could perspectively complement diagnosis based on visual assessment of histological images. PMID:27104496

  5. Gas plume quantification in downlooking hyperspectral longwave infrared images

    NASA Astrophysics Data System (ADS)

    Turcotte, Caroline S.; Davenport, Michael R.

    2010-10-01

    Algorithms have been developed to support quantitative analysis of a gas plume using down-looking airborne hyperspectral long-wave infrared (LWIR) imagery. The resulting gas quantification "GQ" tool estimates the quantity of one or more gases at each pixel, and estimates uncertainty based on factors such as atmospheric transmittance, background clutter, and plume temperature contrast. GQ uses gas-insensitive segmentation algorithms to classify the background very precisely so that it can infer gas quantities from the differences between plume-bearing pixels and similar non-plume pixels. It also includes MODTRAN-based algorithms to iteratively assess various profiles of air temperature, water vapour, and ozone, and select the one that implies smooth emissivity curves for the (unknown) materials on the ground. GQ then uses a generalized least-squares (GLS) algorithm to simultaneously estimate the most likely mixture of background (terrain) material and foreground plume gases. Cross-linking of plume temperature to the estimated gas quantity is very non-linear, so the GLS solution was iteratively assessed over a range of plume temperatures to find the best fit to the observed spectrum. Quantification errors due to local variations in the camera-to-pixel distance were suppressed using a subspace projection operator. Lacking detailed depth-maps for real plumes, the GQ algorithm was tested on synthetic scenes generated by the Digital Imaging and Remote Sensing Image Generation (DIRSIG) software. Initial results showed pixel-by-pixel gas quantification errors of less than 15% for a Freon 134a plume.
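
    The GLS step described above can be written compactly; a minimal sketch assuming a known noise/clutter covariance (variable names are hypothetical, and the iteration over plume temperatures is omitted):

    ```python
    import numpy as np

    def gls_abundances(X, y, Sigma):
        """Generalized least squares: beta = (X' S^-1 X)^-1 X' S^-1 y.
        X: (bands, endmembers) background and gas spectra;
        y: (bands,) observed pixel spectrum;
        Sigma: (bands, bands) noise/clutter covariance."""
        Si = np.linalg.inv(Sigma)
        return np.linalg.solve(X.T @ Si @ X, X.T @ Si @ y)
    ```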

  6. Automatic quantification of neo-vasculature from micro-CT

    NASA Astrophysics Data System (ADS)

    Mallya, Yogish; Narayanan, A. K.; Zagorchev, Lyubomir

    2009-02-01

    Angiogenesis is the process of formation of new blood vessels as outgrowths of pre-existing ones. It occurs naturally during development, tissue repair, and abnormally in pathologic diseases such as cancer. It is associated with proliferation of blood vessels/tubular sprouts that penetrate deep into tissues to supply nutrients and remove waste products. The process starts with migration of endothelial cells. As the cells move towards the target area they form small tubular sprouts recruited from the parent vessel. The sprouts grow in length due to migration, proliferation, and recruitment of new endothelial cells and the process continues until the target area becomes fully vascular. Accurate quantification of sprout formation is very important for evaluation of treatments for ischemia as well as angiogenesis inhibitors and plays a key role in the battle against cancer. This paper presents a technique for automatic quantification of newly formed blood vessels from Micro-CT volumes of tumor samples. A semiautomatic technique based on interpolation of Bezier curves was used to segment out the cancerous growths. Small vessels as determined by their diameter within the segmented tumors were enhanced and quantified with a multi-scale 3-D line detection filter. The same technique can be easily extended for quantification of tubular structures in other 3-D medical imaging modalities. Experimental results are presented and discussed.
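
    The paper uses its own multi-scale 3-D line-detection filter; a widely available alternative with the same intent is Frangi vesselness, sketched here with scikit-image (parameter values are illustrative assumptions, not the authors' settings):

    ```python
    from skimage.filters import frangi

    def vessel_mask(volume, sigmas=(1, 2, 3), threshold=0.05):
        """Enhance bright tubular structures in a 3-D volume over several
        scales, then threshold the vesselness response."""
        response = frangi(volume.astype(float), sigmas=sigmas,
                          black_ridges=False)
        return response > threshold
    ```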

  7. Label-Free Identification and Quantification of SUMO Target Proteins.

    PubMed

    Hendriks, Ivo A; Vertegaal, Alfred C O

    2016-01-01

    Mass spectrometry-based approaches are utilized with increasing frequency to facilitate identification of novel SUMO target proteins and to elucidate the dynamics of SUMOylation in response to cellular stresses. Here, we describe a robust method for the identification of SUMO target proteins, and the relative quantification of SUMOylation dynamics, using a label-free approach. The method relies on a decahistidine (His10)-tagged SUMO, which is expressed at a low level in a mammalian cell line or model organism. The His10-tag allows for a single-step, high-yield, and high-purity enrichment of SUMOylated proteins, which are then digested and analyzed by high-resolution mass spectrometry. Matching between runs and label-free quantification integrated in the freely available MaxQuant software allow for a high rate and accuracy of quantification, providing a strong alternative to laborious sample or cell labeling techniques. The method described here allows for identification of >1000 SUMO target proteins, and characterization of their SUMOylation dynamics, without requiring sample fractionation. The purification procedure, starting from total lysate, can be performed in ~4 days. PMID:27631806

  8. Assessment methods for angiogenesis and current approaches for its quantification

    PubMed Central

    AlMalki, Waleed Hassan; Shahid, Imran; Mehdi, Abeer Yousaf; Hafeez, Muhammad Hassan

    2014-01-01

    Angiogenesis is a physiological process which describes the development of new blood vessels from existing vessels. It is a common and most important process in the formation and development of blood vessels, and it supports the healing of wounds and the granulation of tissues. The different assays for the evaluation of angiogenesis have been described, with distinct advantages and some limitations. In order to develop angiogenic and antiangiogenic techniques, continuous efforts have resulted in animal models for more quantitative analysis of angiogenesis. Most of the studies on angiogenic inducers and inhibitors rely on various models, in vitro, in vivo and in ovo, as indicators of efficacy. Angiogenesis assays are very helpful for testing the efficacy of both pro- and anti-angiogenic agents. The development of non-invasive procedures for the quantification of angiogenesis will facilitate this process significantly. The main objective of this review article is to focus on the novel and existing methods of angiogenesis and their quantification techniques. These findings will be helpful for establishing the most convenient methods for the detection and quantification of angiogenesis, and for developing a novel, well tolerated and cost-effective anti-angiogenic treatment in the near future. PMID:24987169

  9. In vivo behavior of NTBI revealed by automated quantification system.

    PubMed

    Ito, Satoshi; Ikuta, Katsuya; Kato, Daisuke; Lynda, Addo; Shibusa, Kotoe; Niizeki, Noriyasu; Toki, Yasumichi; Hatayama, Mayumi; Yamamoto, Masayo; Shindo, Motohiro; Iizuka, Naomi; Kohgo, Yutaka; Fujiya, Mikihiro

    2016-08-01

    Non-Tf-bound iron (NTBI), which appears in serum in iron overload, is thought to contribute to organ damage; the monitoring of serum NTBI levels may therefore be clinically useful in iron-overloaded patients. However, NTBI quantification methods remain complex, limiting their use in clinical practice. To overcome the technical difficulties often encountered, we recently developed a novel automated NTBI quantification system capable of measuring large numbers of samples. In the present study, we investigated the in vivo behavior of NTBI in human and animal serum using this newly established automated system. Average NTBI in healthy volunteers was 0.44 ± 0.076 μM (median 0.45 μM, range 0.28-0.66 μM), with no significant difference between sexes. Additionally, serum NTBI rapidly increased after iron loading, followed by a sudden disappearance. NTBI levels also decreased in inflammation. The results indicate that NTBI is a unique marker of iron metabolism, unlike other markers of iron metabolism, such as serum ferritin. Our new automated NTBI quantification method may help to reveal the clinical significance of NTBI and contribute to our understanding of iron overload. PMID:27086349

  10. Quantification of splice variants using molecular beacon or scorpion primers.

    PubMed

    Taveau, Mathieu; Stockholm, Daniel; Spencer, Melissa; Richard, Isabelle

    2002-06-15

    Uncovering the relationship between the generation of alternative transcripts and cellular processes is of crucial importance in the exploration of a gene's biology. The description and quantification of the spatiotemporal splicing pattern can be one method to select the most interesting transcripts for future studies. Fluorescence-based real-time quantitative RT-PCR has recently revolutionized the possibilities for transcriptional quantification studies. In this report, Molecular Beacon and Scorpion probes have been tested as new possibilities for determining the expression level of alternative transcripts. We validated these systems by analyzing alternative splicing of exons 6, 15, and 16 of the calpain 3 gene with tissues containing large variation in the ratio of the different transcripts. We determined conditions that demonstrated that boundary probes are useful tools and good alternatives to boundary primers, when developing a system to quantify specific transcripts. We suggest that the choice of a quantification system should depend in part on the structure and base composition of the gene and may have to be determined experimentally.

  11. Simultaneous quantification of sialyloligosaccharides from human milk by capillary electrophoresis

    PubMed Central

    Bao, Yuanwu; Zhu, Libin; Newburg, David S.

    2007-01-01

    The acidic oligosaccharides of human milk are predominantly sialyloligosaccharides. Pathogens that bind sialic acid-containing glycans on their host mucosal surfaces may be inhibited by human milk sialyloligosaccharides, but testing this hypothesis requires their reliable quantification in milk. Sialyloligosaccharides have been quantified by anion exchange HPLC, reverse or normal phase HPLC, and capillary electrophoresis (CE) of fluorescent derivatives; in milk, these oligosaccharides have been analyzed by high pH anion exchange chromatography with pulsed amperometric detection, and, in our laboratory, by CE with detection at 205 nm. The novel method described herein uses a running buffer of aqueous 200 mM NaH2PO4 at pH 7.05 containing 100 mM SDS made 45% (v/v) with methanol to baseline resolve five oligosaccharides, and separate all 12. This allows automated simultaneous quantification of the 12 major sialyloligosaccharides of human milk in a single 35-minute run. This method revealed differences in sialyloligosaccharide concentrations between less and more mature milk from the same donors. Individual donors also varied in expression of sialyloligosaccharides in their milk. Thus, the facile quantification of sialyloligosaccharides by this method is suitable for measuring variation in expression of specific sialyloligosaccharides in milk and their relationship to decreased risk of specific diseases in infants. PMID:17761135

  12. The role of PET quantification in cardiovascular imaging

    PubMed Central

    Slomka, Piotr; Berman, Daniel S.; Alexanderson, Erick; Germano, Guido

    2014-01-01

    Positron Emission Tomography (PET) has several clinical and research applications in cardiovascular imaging. Myocardial perfusion imaging with PET allows accurate global and regional measurements of myocardial perfusion, myocardial blood flow and function at stress and rest in one exam. Simultaneous assessment of function and perfusion by PET with quantitative software is currently the routine practice. Combination of ejection fraction reserve with perfusion information may improve the identification of severe disease. Myocardial viability can be estimated by quantitative comparison of fluorodeoxyglucose (18FDG) and rest perfusion imaging. Myocardial blood flow and coronary flow reserve measurements are becoming routinely included in the clinical assessment due to the enhanced dynamic imaging capabilities of the latest PET/CT scanners. Absolute flow measurements allow evaluation of coronary microvascular dysfunction and provide additional prognostic and diagnostic information for coronary disease. Standard quantitative approaches to compute myocardial blood flow from kinetic PET data in an automated and rapid fashion have been developed for 13N-ammonia, 15O-water and 82Rb radiotracers. The agreement between software methods available for such analysis is excellent. Relative quantification of 82Rb PET myocardial perfusion, based on comparisons to normal databases, demonstrates high performance for the detection of obstructive coronary disease. New tracers, such as 18F-flurpiridaz, may allow further improvements in disease detection. Computerized analysis of perfusion at stress and rest reduces the variability of the assessment as compared to visual analysis. PET quantification can be enhanced by precise coregistration with CT angiography. In emerging clinical applications, the potential to identify vulnerable plaques by quantification of atherosclerotic plaque uptake of 18FDG and 18F-sodium fluoride tracers in carotids, aorta and coronary arteries has been demonstrated.

  13. A flexible numerical approach for quantification of epistemic uncertainty

    NASA Astrophysics Data System (ADS)

    Chen, Xiaoxiao; Park, Eun-Jae; Xiu, Dongbin

    2013-05-01

    In the field of uncertainty quantification (UQ), epistemic uncertainty often refers to the kind of uncertainty whose complete probabilistic description is not available, largely due to our lack of knowledge about the uncertainty. Quantification of the impacts of epistemic uncertainty is naturally difficult, because most of the existing stochastic tools rely on the specification of the probability distributions and thus do not readily apply to epistemic uncertainty. And there have been few studies and methods to deal with epistemic uncertainty. A recent work can be found in [J. Jakeman, M. Eldred, D. Xiu, Numerical approach for quantification of epistemic uncertainty, J. Comput. Phys. 229 (2010) 4648-4663], where a framework for numerical treatment of epistemic uncertainty was proposed. The method is based on solving an encapsulation problem, without using any probability information, in a hypercube that encapsulates the unknown epistemic probability space. If more probabilistic information about the epistemic variables is known a posteriori, the solution statistics can then be evaluated at post-process steps. In this paper, we present a new method, similar to that of Jakeman et al. but significantly extending its capabilities. Most notably, the new method (1) does not require the encapsulation problem to be in a bounded domain such as a hypercube; (2) does not require the solution of the encapsulation problem to converge point-wise. In the current formulation, the encapsulation problem could reside in an unbounded domain, and more importantly, its numerical approximation could be sought in Lp norm. These features thus make the new approach more flexible and amicable to practical implementation. Both the mathematical framework and numerical analysis are presented to demonstrate the effectiveness of the new approach.

  14. Quantification of brain endocannabinoid levels: methods, interpretations and pitfalls

    PubMed Central

    Buczynski, Matthew W; Parsons, Loren H

    2010-01-01

    Endocannabinoids play an important role in a diverse range of neurophysiological processes including neural development, neuroimmune function, synaptic plasticity, pain, reward and affective state. This breadth of influence and evidence for altered endocannabinoid signalling in a variety of neuropathologies has fuelled interest in the accurate quantification of these lipids in brain tissue. Established methods for endocannabinoid quantification primarily employ solvent-based lipid extraction with further sample purification by solid phase extraction. In recent years in vivo microdialysis methods have also been developed for endocannabinoid sampling from the brain interstitial space. However, considerable variability in estimates of endocannabinoid content has led to debate regarding the physiological range of concentrations present in various brain regions. This paper provides a critical review of factors that influence the quantification of brain endocannabinoid content as determined by lipid extraction from bulk tissue and by in vivo microdialysis. A variety of methodological issues are discussed including analytical approaches, endocannabinoid extraction and purification, post-mortem changes in brain endocannabinoid content, cellular reactions to microdialysis probe implantation and caveats related to lipid sampling from the extracellular space. The application of these methods for estimating brain endocannabinoid content and the effects of endocannabinoid clearance inhibition are discussed. The benefits, limitations and pitfalls associated with each approach are emphasized, with an eye toward the appropriate interpretation of data gathered by each method. This article is part of a themed issue on Cannabinoids. To view the editorial for this themed issue visit http://dx.doi.org/10.1111/j.1476-5381.2010.00831.x PMID:20590555

  15. A flexible numerical approach for quantification of epistemic uncertainty

    SciTech Connect

    Chen, Xiaoxiao; Park, Eun-Jae; Xiu, Dongbin

    2013-05-01

    In the field of uncertainty quantification (UQ), epistemic uncertainty often refers to the kind of uncertainty whose complete probabilistic description is not available, largely due to our lack of knowledge about the uncertainty. Quantification of the impacts of epistemic uncertainty is naturally difficult, because most of the existing stochastic tools rely on the specification of the probability distributions and thus do not readily apply to epistemic uncertainty. And there have been few studies and methods to deal with epistemic uncertainty. A recent work can be found in [J. Jakeman, M. Eldred, D. Xiu, Numerical approach for quantification of epistemic uncertainty, J. Comput. Phys. 229 (2010) 4648-4663], where a framework for numerical treatment of epistemic uncertainty was proposed. The method is based on solving an encapsulation problem, without using any probability information, in a hypercube that encapsulates the unknown epistemic probability space. If more probabilistic information about the epistemic variables is known a posteriori, the solution statistics can then be evaluated at post-process steps. In this paper, we present a new method, similar to that of Jakeman et al. but significantly extending its capabilities. Most notably, the new method (1) does not require the encapsulation problem to be in a bounded domain such as a hypercube; (2) does not require the solution of the encapsulation problem to converge point-wise. In the current formulation, the encapsulation problem could reside in an unbounded domain, and more importantly, its numerical approximation could be sought in L^p norm. These features thus make the new approach more flexible and amicable to practical implementation. Both the mathematical framework and numerical analysis are presented to demonstrate the effectiveness of the new approach.

  16. Quantification in histopathology—Can magnetic particles help?

    NASA Astrophysics Data System (ADS)

    Mitchels, John; Hawkins, Peter; Luxton, Richard; Rhodes, Anthony

    2007-04-01

    Every year, more than 270,000 people are diagnosed with cancer in the UK alone, and one in three people worldwide contract cancer within their lifetime. Histopathology is the principal method for confirming cancer and directing treatment. In this paper, a novel application of magnetic particles is proposed to help address the problem of subjectivity in histopathology. Preliminary results indicate that magnetic nanoparticles can not only be used to assist diagnosis through improving quantification but can also potentially increase throughput, hence offering a way of dramatically reducing costs within the routine histopathology laboratory.

  17. Quantification of protein concentration using UV absorbance and Coomassie dyes.

    PubMed

    Noble, James E

    2014-01-01

    The measurement of a solubilized protein concentration in solution is an important assay in biochemistry research and development labs for applications ranging from enzymatic studies to providing data for biopharmaceutical lot release. Spectrophotometric protein quantification assays are methods that use UV and visible spectroscopy to rapidly determine the concentration of protein, relative to a standard, or using an assigned extinction coefficient. Where multiple samples need measurement, and/or the sample volume and concentration is limited, preparations of the Coomassie dye commonly known as the Bradford assay can be used. PMID:24423263
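
    For the direct UV route, absorbance converts to concentration through the Beer-Lambert law A = epsilon * c * l; a minimal sketch (the extinction coefficient and molecular weight are roughly BSA-like placeholder values, not figures from this chapter):

    ```python
    def protein_conc_mg_per_ml(a280, ext_coeff=43824, mw_da=66430, path_cm=1.0):
        """Concentration from UV absorbance: c = A / (epsilon * l).
        ext_coeff: molar extinction coefficient (M^-1 cm^-1), assumed;
        mw_da: molecular weight in daltons, assumed."""
        molar = a280 / (ext_coeff * path_cm)  # mol/L
        return molar * mw_da                  # g/L == mg/mL

    print(protein_conc_mg_per_ml(0.66))       # ~1.0 mg/mL
    ```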

  18. Source-Code Instrumentation and Quantification of Events

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.; Havelund, Klaus; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Aspect Oriented Programming (AOP) is making quantified programmatic assertions over programs that otherwise are not annotated to receive these assertions. Varieties of AOP systems are characterized by which quantified assertions they allow, what they permit in the actions of the assertions (including how the actions interact with the base code), and what mechanisms they use to achieve the overall effect. Here, we argue that all quantification is over dynamic events, and describe our preliminary work in developing a system that maps dynamic events to transformations over source code. We discuss possible applications of this system, particularly with respect to debugging concurrent systems.

  19. Quantification of risks from technology for improved plant reliability

    SciTech Connect

    Rode, D.M.

    1996-12-31

    One of the least understood, and therefore least appreciated, threats to profitability is the risk arising from power plant technologies such as steam generators, turbines, and electrical systems. To effectively manage technological risks, business decisions need to be based on knowledge. The paper describes a risk quantification process that combines technical knowledge and judgments with commercial consequences. The three principal alternatives for managing risk, as well as risk mitigation techniques for significant equipment within a power plant, are reported. The result is to equip the decision maker with a comprehensive picture of the risk exposures, enabling cost-effective activities to be undertaken to improve a plant's reliability.

  20. Recurrence plots and recurrence quantification analysis of human motion data

    NASA Astrophysics Data System (ADS)

    Josiński, Henryk; Michalczuk, Agnieszka; Świtoński, Adam; Szczesna, Agnieszka; Wojciechowski, Konrad

    2016-06-01

    The authors present exemplary application of recurrence plots, cross recurrence plots and recurrence quantification analysis for the purpose of exploration of experimental time series describing selected aspects of human motion. Time series were extracted from treadmill gait sequences which were recorded in the Human Motion Laboratory (HML) of the Polish-Japanese Academy of Information Technology in Bytom, Poland by means of the Vicon system. Analysis was focused on the time series representing movements of hip, knee, ankle and wrist joints in the sagittal plane.
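
    A recurrence plot marks the time pairs at which a trajectory returns close to a former state; a minimal sketch of the recurrence matrix and the simplest RQA measure, the recurrence rate (the threshold and test signal are illustrative assumptions):

    ```python
    import numpy as np

    def recurrence_matrix(x, eps):
        """R[i, j] = 1 where states x[i] and x[j] are closer than eps."""
        x = np.asarray(x, dtype=float)
        if x.ndim == 1:              # scalar series -> (time, 1) states
            x = x[:, None]
        d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
        return (d < eps).astype(int)

    signal = np.sin(np.linspace(0, 8 * np.pi, 200))
    R = recurrence_matrix(signal, eps=0.1)
    print(R.mean())                  # recurrence rate
    ```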

  1. Development of magnetic resonance technology for noninvasive boron quantification

    SciTech Connect

    Bradshaw, K.M.

    1990-11-01

    Boron magnetic resonance imaging (MRI) and spectroscopy (MRS) were developed in support of the noninvasive boron quantification task of the Idaho National Engineering Laboratory (INEL) Power Burst Facility/Boron Neutron Capture Therapy (PBF/BNCT) program. The hardware and software described in this report are modifications specific to a GE Signa™ MRI system, release 3.X, and are necessary for boron magnetic resonance operation. The technology developed in this task has been applied to obtaining animal pharmacokinetic data of boron compounds (drug time response) and the in-vivo localization of boron in animal tissue noninvasively. 9 refs., 21 figs.

  2. Uncertainty Quantification in Fission Cross Section Measurements at LANSCE

    SciTech Connect

    Tovesson, F.

    2015-01-15

    Neutron-induced fission cross sections have been measured for several isotopes of uranium and plutonium at the Los Alamos Neutron Science Center (LANSCE) over a wide range of incident neutron energies. The total uncertainties in these measurements are in the range 3–5% above 100 keV of incident neutron energy; these arise from uncertainties in the target, neutron source, and detector system. The individual sources of uncertainty are assumed to be uncorrelated; however, correlations in the cross section across neutron energy bins are considered. The quantification of the uncertainty contributions is described here.

  3. Preliminary Results on Uncertainty Quantification for Pattern Analytics

    SciTech Connect

    Stracuzzi, David John; Brost, Randolph; Chen, Maximillian Gene; Malinas, Rebecca; Peterson, Matthew Gregor; Phillips, Cynthia A.; Robinson, David G.; Woodbridge, Diane

    2015-09-01

    This report summarizes preliminary research into uncertainty quantification for pattern analytics within the context of the Pattern Analytics to Support High-Performance Exploitation and Reasoning (PANTHER) project. The primary focus of PANTHER was to make large quantities of remote sensing data searchable by analysts. The work described in this report adds nuance to both the initial data preparation steps and the search process. Search queries are transformed from "does the specified pattern exist in the data?" to "how certain is the system that the returned results match the query?" We show example results for both data processing and search, and discuss a number of possible improvements for each.

  4. Automatic quantification of neurite outgrowth by means of image analysis

    NASA Astrophysics Data System (ADS)

    Van de Wouwer, Gert; Nuydens, Rony; Meert, Theo; Weyn, Barbara

    2004-07-01

    A system for quantification of neurite outgrowth in in-vitro experiments is described. The system is developed for routine use in a high-throughput setting and therefore needs to be fast, cheap, and robust. It relies on automated digital microscopical imaging of microtiter plates. Image analysis is applied to extract features for the characterisation of neurite outgrowth. The system is tested in a dose-response experiment on PC12 cells + Taxol. The performance of the system and its ability to measure changes in neuronal morphology are studied.

  5. Trace cancer biomarker quantification using polystyrene-functionalized gold nanorods

    PubMed Central

    Wu, Jian; Li, Wei; Hajisalem, Ghazal; Lukach, Ariella; Kumacheva, Eugenia; Hof, Fraser; Gordon, Reuven

    2014-01-01

    We demonstrate the application of polystyrene-functionalized gold nanorods (AuNRs) as a platform for surface enhanced Raman scattering (SERS) quantification of the exogenous cancer biomarker Acetyl Amantadine (AcAm). We utilize the hydrophobicity of the polystyrene attached to the AuNR surface to capture the hydrophobic AcAm from solution, followed by drying and detection using SERS. We achieve a detection limit of 16 ng/mL using this platform. This result shows clinical potential for low-cost early cancer detection. PMID:25574423

  6. Uncertainty quantification in fission cross section measurements at LANSCE

    DOE PAGES

    Tovesson, F.

    2015-01-09

    Neutron-induced fission cross sections have been measured for several isotopes of uranium and plutonium at the Los Alamos Neutron Science Center (LANSCE) over a wide range of incident neutron energies. The total uncertainties in these measurements are in the range 3–5% above 100 keV of incident neutron energy; these arise from uncertainties in the target, neutron source, and detector system. The individual sources of uncertainty are assumed to be uncorrelated; however, correlations in the cross section across neutron energy bins are considered. The quantification of the uncertainty contributions is described here.

  7. Stereometric Measurement System For Quantification Of Object Forms

    NASA Astrophysics Data System (ADS)

    Fischer, P.; Mesqui, F.; Kaeser, F.

    1986-07-01

    A three-dimensional camera unit has been developed at the Dental School of the University of Zurich for jaw motion recording. Since the system is designed to detect and reconstruct the spatial position of a light source, it can easily be used to digitize an object surface being scanned by a laser beam. This is performed by an additional device which has been developed in order to enlarge the application field of the camera unit as well as to prove the feasibility and find out the limits of such a simple and low-cost quantification system.

  8. Selected methods for quantification of community exposure to aircraft noise

    NASA Technical Reports Server (NTRS)

    Edge, P. M., Jr.; Cawthorn, J. M.

    1976-01-01

    A review of the state of the art for the quantification of community exposure to aircraft noise is presented. Physical aspects, people-response considerations, and practicalities of useful application of scales of measure are included. Historical background up through the current technology is briefly presented. The developments of both single-event and multiple-event scales are covered. A selective choice is made of scales currently in the forefront of interest, and recommended methodology is presented for use in computer programming to translate aircraft noise data into predictions of community noise exposure. Brief consideration is given to future programming developments and to supportive research needs.

  9. Nuclear Data Uncertainty Quantification: Past, Present and Future

    SciTech Connect

    Smith, D.L.

    2015-01-15

    An historical overview is provided of the mathematical foundations of uncertainty quantification and the roles played in the more recent past by nuclear data uncertainties in nuclear data evaluations and nuclear applications. Significant advances that have established the mathematical framework for contemporary nuclear data evaluation methods, as well as the use of uncertainty information in nuclear data evaluation and nuclear applications, are described. This is followed by a brief examination of the current status concerning nuclear data evaluation methodology, covariance data generation, and the application of evaluated nuclear data uncertainties in contemporary nuclear technology. A few possible areas for future investigation of this subject are also suggested.

  10. Progressive damage state evolution and quantification in composites

    NASA Astrophysics Data System (ADS)

    Patra, Subir; Banerjee, Sourav

    2016-04-01

    Precursor damage state quantification can be helpful for the safety and operation of aircraft and defense equipment. Damage develops in composite materials in the form of matrix cracking, fiber breakage, debonding, etc. However, detection and quantification of the damage modes at their very early stage is not possible unless modifications of the existing indispensable techniques are conceived, particularly for the quantification of multiscale damage at an early stage. Here, we present a novel nonlocal-mechanics-based damage detection technique for precursor damage state quantification. Micro-continuum physics is used by modifying the Christoffel equation. American Society for Testing and Materials (ASTM) standard woven carbon fiber (CFRP) specimens were tested under tension-tension fatigue loading at intervals of 25,000 cycles until 500,000 cycles. Scanning Acoustic Microscopy (SAM) and Optical Microscopy (OM) were used to examine the damage development at the same intervals. Surface Acoustic Wave (SAW) velocity profiles on a representative volume element (RVE) of the specimen were calculated at regular intervals of 50,000 cycles. Nonlocal parameters were calculated from the micromorphic wave dispersion curve at a frequency of 50 MHz. We used a previously formulated parameter called "damage entropy," a measure of damage growth in the material calculated with the loading cycle. Damage entropy (DE) was calculated at every pixel of the RVE and its mean was plotted at loading intervals of 25,000 cycles. Growth of DE with fatigue loading cycles was observed. Optical imaging was also performed at intervals of 25,000 cycles to investigate the development of damage inside the materials. We also calculated the mean value of the SAW velocity and plotted it against fatigue cycles, correlating it further with damage entropy (DE). Statistical analysis of the SAW profiles obtained at different

  11. Quantification of skin wrinkles using low coherence interferometry

    NASA Astrophysics Data System (ADS)

    Oh, Jung-Taek; Kim, Beop-Min; Son, Sang-Ryoon; Lee, Sang-Won; Kim, Dong-Yoon; Kim, Youn-Soo

    2004-07-01

    We measure the skin wrinkle topology by means of low coherence interferometry (LCI), which forms the basis of optical coherence tomography (OCT). The skin topology obtained using LCI and a corresponding 2-D fast Fourier transform allow quantification of skin wrinkles. It took approximately 2 minutes to obtain a 2.1 mm x 2.1 mm topological image with 4 μm and 16 μm resolutions in the axial and transverse directions, respectively. Measurement examples show the particular cases of skin contour change after anti-wrinkle cosmeceutical treatment and in atopic dermatitis.

  12. Quantification of toxicological effects for dichloromethane. Final report

    SciTech Connect

    Not Available

    1992-01-01

    The document discusses the quantification of non-carcinogenic and carcinogenic effects of dichloromethane. The evaluation of non-carcinogenic effects includes a study of short- and long-term effects in animals and humans, as well as the development of one-day, ten-day, and long-term health advisories. The evaluation of carcinogenic effects includes a categorization of carcinogenic potential and risk estimates. There is a brief discussion of existing guidelines or standards and of special considerations such as high-risk groups.

  13. Aspect-Oriented Programming is Quantification and Implicit Invocation

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.; Friedman, Daniel P.; Koga, Dennis (Technical Monitor)

    2001-01-01

    We propose that the distinguishing characteristic of Aspect-Oriented Programming (AOP) languages is that they allow programming by making quantified programmatic assertions over programs that lack local notation indicating the invocation of these assertions. This suggests that AOP systems can be analyzed with respect to three critical dimensions: the kinds of quantifications allowed, the nature of the interactions that can be asserted, and the mechanism for combining base-level actions with asserted actions. Consequences of this perspective are the recognition that certain systems are not AOP and that some mechanisms are meta-AOP: they are sufficiently expressive to allow straightforwardly programming an AOP system within them.

  14. Nuclear magnetic resonance-based quantification of organic diphosphates.

    PubMed

    Lenevich, Stepan; Distefano, Mark D

    2011-01-15

    Phosphorylated compounds are ubiquitous in life. Given their central role, many such substrates and analogs have been prepared for subsequent evaluation. Prior to biological experiments, it is typically necessary to determine the concentration of the target molecule in solution. Here we describe a method where concentrations of stock solutions of organic diphosphates and bisphosphonates are quantified using (31)P nuclear magnetic resonance (NMR) spectroscopy with standard instrumentation using a capillary tube with a secondary standard. The method is specific and is applicable down to a concentration of 200 μM. The capillary tube provides the reference peak for quantification and deuterated solvent for locking. PMID:20833124
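
    Quantification against a capillary reference rests on peak integrals scaling with the number of contributing nuclei; schematically (a standard qNMR relation, not an equation quoted from the paper):

    ```latex
    c_{\mathrm{analyte}} = c_{\mathrm{ref}}\,
    \frac{I_{\mathrm{analyte}}}{I_{\mathrm{ref}}}\,
    \frac{n_{\mathrm{ref}}}{n_{\mathrm{analyte}}}
    ```

    where I denotes integrated peak areas and n the number of phosphorus nuclei per molecule.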

  15. Quantification of quantum discord in an antiferromagnetic Heisenberg compound

    SciTech Connect

    Singh, H.; Chakraborty, T.; Mitra, C.

    2014-04-24

    An experimental quantification of concurrence and quantum discord from heat capacity (Cp) measurements performed on a solid-state system is reported. In this work, thermodynamic measurements were performed on copper nitrate (CN, Cu(NO3)2·2.5H2O) single crystals, an alternating antiferromagnetic Heisenberg spin-1/2 system. CN, being a weakly dimerized antiferromagnet, is an ideal system for investigating correlations between spins. Theoretical expressions were used to obtain concurrence and quantum discord curves as a function of temperature from the heat capacity data of a real macroscopic system, CN.

  16. Prospective Comparison of Liver Stiffness Measurements between Two Point Shear Wave Elastography Methods: Virtual Touch Quantification and Elastography Point Quantification

    PubMed Central

    Yoo, Hyunsuk; Yoon, Jeong Hee; Lee, Dong Ho; Chang, Won; Han, Joon Koo

    2016-01-01

    Objective To prospectively compare technical success rate and reliable measurements of virtual touch quantification (VTQ) elastography and elastography point quantification (ElastPQ), and to correlate liver stiffness (LS) measurements obtained by the two elastography techniques. Materials and Methods Our study included 85 patients, 80 of whom were previously diagnosed with chronic liver disease. The technical success rate and reliable measurements of the two kinds of point shear wave elastography (pSWE) techniques were compared by χ2 analysis. LS values measured using the two techniques were compared and correlated via Wilcoxon signed-rank test, Spearman correlation coefficient, and 95% Bland-Altman limit of agreement. The intraobserver reproducibility of ElastPQ was determined by 95% Bland-Altman limit of agreement and intraclass correlation coefficient (ICC). Results The two pSWE techniques showed similar technical success rate (98.8% for VTQ vs. 95.3% for ElastPQ, p = 0.823) and reliable LS measurements (95.3% for VTQ vs. 90.6% for ElastPQ, p = 0.509). The mean LS measurements obtained by VTQ (1.71 ± 0.47 m/s) and ElastPQ (1.66 ± 0.41 m/s) were not significantly different (p = 0.209). The LS measurements obtained by the two techniques showed strong correlation (r = 0.820); in addition, the 95% limit of agreement of the two methods was 27.5% of the mean. Finally, the ICC of repeat ElastPQ measurements was 0.991. Conclusion Virtual touch quantification and ElastPQ showed similar technical success rate and reliable measurements, with strongly correlated LS measurements. However, the two methods are not interchangeable due to the large limit of agreement. PMID:27587964

  17. Investigations of Some Liquid Matrixes for Analyte Quantification by MALDI

    NASA Astrophysics Data System (ADS)

    Moon, Jeong Hee; Park, Kyung Man; Ahn, Sung Hee; Lee, Seong Hoon; Kim, Myung Soo

    2015-06-01

    Sample inhomogeneity is one of the obstacles preventing the generation of reproducible mass spectra by MALDI and to their use for the purpose of analyte quantification. As a potential solution to this problem, we investigated MALDI with some liquid matrixes prepared by nonstoichiometric mixing of acids and bases. Out of 27 combinations of acids and bases, liquid matrixes could be produced from seven. When the overall spectral features were considered, two liquid matrixes using α-cyano-4-hydroxycinnamic acid as the acid and 3-aminoquinoline and N,N-diethylaniline as bases were the best choices. In our previous study of MALDI with solid matrixes, we found that three requirements had to be met for the generation of reproducible spectra and for analyte quantification: (1) controlling the temperature by fixing the total ion count, (2) plotting the analyte-to-matrix ion ratio versus the analyte concentration as the calibration curve, and (3) keeping the matrix suppression below a critical value. We found that the same requirements had to be met in MALDI with liquid matrixes as well. In particular, although the liquid matrixes tested here were homogeneous, they failed to display spot-to-spot spectral reproducibility unless the first requirement above was met. We also found that analyte-derived ions could not be produced efficiently by MALDI with the above liquid matrixes unless the analyte was sufficiently basic. In this sense, MALDI processes with solid and liquid matrixes should be regarded as complementary techniques rather than as competing ones.

  18. Quantification Model for Estimating Temperature Field Distributions of Apple Fruit

    NASA Astrophysics Data System (ADS)

    Zhang, Min; Yang, Le; Zhao, Huizhong; Zhang, Leijie; Zhong, Zhiyou; Liu, Yanling; Chen, Jianhua

    A quantification model of transient heat conduction is provided to simulate the temperature distribution in apple fruit during cooling. The model is based on the energy variation at different points of the fruit and takes into account the heat exchange of a representative elemental volume, metabolic heat and external heat. The following conclusions can be drawn. First, the quantification model satisfactorily describes the tendency of the apple fruit temperature distribution during cooling. Second, there is an obvious difference between the fruit temperature and the environment temperature: compared with the change in environment temperature, the temperature of the fruit body shows a long hysteresis, that is, the fruit body temperature continues to change significantly for a period after the environment temperature drops, and the change then becomes slower and slower. This can explain the time-delay phenomenon in biology. Third, the temperature differences across the layers increase gradually from the centre to the surface of the fruit; the minimum temperature differences occur close to the centre of the fruit body and the maximum close to the surface. Finally, the temperature of every part of the fruit body tends towards a consistent value near the environment temperature during cooling, which is related to the metabolic heat of the plant body at all times.
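
    The governing equation behind such a model is the transient heat-conduction equation with a volumetric source term for metabolic heat; schematically (a standard form with assumed symbols, not notation quoted from the paper):

    ```latex
    \rho c\,\frac{\partial T}{\partial t} = k\,\nabla^{2} T + q_{\mathrm{met}}
    ```

    where rho is density, c specific heat, k thermal conductivity, and q_met the metabolic heat generation rate.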

  19. High-throughput quantification of early stages of phagocytosis.

    PubMed

    Yeo, Jeremy Changyu; Wall, Adam Alexander; Stow, Jennifer Lea; Hamilton, Nicholas Ahti

    2013-09-01

    Phagocytosis--the engulfment of cells and foreign bodies--is an important cellular process in innate immunity, development, and disease. Quantification of various stages of phagocytosis, especially in a rapid screening fashion, is an invaluable tool for elucidating protein function during this process. However, current methods for assessing phagocytosis are largely limited to flow cytometry and manual image-based assays, providing limited information. Here, we present an image-based, semi-automated phagocytosis assay to rapidly quantitate three distinct stages during the early engulfment of opsonized beads. Captured images are analyzed using the image-processing software ImageJ and quantified using a macro. Modifications to this method allowed quantification of phagocytosis only in fluorescently labeled transfected cells. Additionally, the time course of bead internalization could be measured using this approach. The assay could discriminate perturbations to stages of phagocytosis induced by known pharmacological inhibitors of filamentous actin and phosphoinositol-3-kinase. Our methodology offers the ability to automatically categorize large amounts of image data into the three early stages of phagocytosis within minutes, clearly demonstrating its potential value in investigating aberrant phagocytosis when manipulating proteins of interest in drug screens and disease.

  20. Quantification of Methylated Selenium, Sulfur, and Arsenic in the Environment

    PubMed Central

    Vriens, Bas; Ammann, Adrian A.; Hagendorfer, Harald; Lenz, Markus; Berg, Michael; Winkel, Lenny H. E.

    2014-01-01

    Biomethylation and volatilization of trace elements may contribute to their redistribution in the environment. However, quantification of volatile, methylated species in the environment is complicated by a lack of straightforward and field-deployable air sampling methods that preserve element speciation. This paper presents a robust and versatile gas trapping method for the simultaneous preconcentration of volatile selenium (Se), sulfur (S), and arsenic (As) species. Using HPLC-HR-ICP-MS and ESI-MS/MS analyses, we demonstrate that volatile Se and S species efficiently transform into specific non-volatile compounds during trapping, which enables the deduction of the original gaseous speciation. With minor adaptations, the presented HPLC-HR-ICP-MS method also allows for the quantification of 13 non-volatile methylated species and oxyanions of Se, S, and As in natural waters. Application of these methods in a peatland indicated that, at the selected sites, fluxes varied between 190–210 ng Se·m−2·d−1, 90–270 ng As·m−2·d−1, and 4–14 µg S·m−2·d−1, and contained at least 70% methylated Se and S species. In the surface water, methylated species were particularly abundant for As (>50% of total As). Our results indicate that methylation plays a significant role in the biogeochemical cycles of these elements. PMID:25047128

  1. A Spanish model for quantification and management of construction waste

    SciTech Connect

    Solis-Guzman, Jaime; Marrero, Madelyn; Montes-Delgado, Maria Victoria; Ramirez-de-Arellano, Antonio

    2009-09-15

    Currently, construction and demolition waste (C and D waste) is a worldwide issue that concerns not only governments but also the building actors involved in construction activity. In Spain, a new national decree has been regulating the production and management of C and D waste since February 2008. The present work describes the waste management model that has inspired this decree: the Alcores model implemented with good results in Los Alcores Community (Seville, Spain). A detailed model is also provided to estimate the volume of waste that is expected to be generated on the building site. The quantification of C and D waste volume, from the project stage, is essential for the building actors to properly plan and control its disposal. This quantification model has been developed by studying 100 dwelling projects, especially their bill of quantities, and defining three coefficients to estimate the demolished volume (CT), the wreckage volume (CR) and the packaging volume (CE). Finally, two case studies are included to illustrate the usefulness of the model to estimate C and D waste volume in both new construction and demolition projects.

  2. Cell-based quantification of molecular biomarkers in histopathology specimens

    PubMed Central

    Al-Kofahi, Yousef; Lassoued, Wiem; Grama, Kedar; Nath, Sumit K; Zhu, Jianliang; Oueslati, Ridha; Feldman, Michael; Lee, William M F; Roysam, Badrinath

    2011-01-01

    Aims To investigate the use of a computer-assisted technology for objective, cell-based quantification of molecular biomarkers in specified cell types in histopathology specimens, with the aim of advancing current visual estimation or pixel-level (rather than cell-based) quantification methods. Methods and results Tissue specimens were multiplex-immunostained to reveal cell structures, cell type markers, and analytes, and imaged with multispectral microscopy. The image data were processed with novel software that automatically delineates and types each cell in the field, measures morphological features, and quantifies analytes in different subcellular compartments of specified cells. The methodology was validated with the use of cell blocks composed of differentially labelled cultured cells mixed in known proportions, and evaluated on human breast carcinoma specimens for quantifying human epidermal growth factor receptor 2, oestrogen receptor, progesterone receptor, Ki67, phospho-extracellular signal-related kinase, and phospho-S6. Automated cell-level analyses closely matched human assessments, but, predictably, differed from pixel-level analyses of the same images. Conclusions Our method reveals the type, distribution, morphology and biomarker state of each cell in the field, and allows multiple biomarkers to be quantified over specified cell types, regardless of abundance. It is ideal for studying specimens from patients in clinical trials of targeted therapeutic agents, for investigating minority stromal cell subpopulations, and for phenotypic characterization to personalize therapy and prognosis. PMID:21771025

  3. Quantification of HEV RNA by Droplet Digital PCR

    PubMed Central

    Nicot, Florence; Cazabat, Michelle; Lhomme, Sébastien; Marion, Olivier; Sauné, Karine; Chiabrando, Julie; Dubois, Martine; Kamar, Nassim; Abravanel, Florence; Izopet, Jacques

    2016-01-01

    The sensitivity of real-time PCR for hepatitis E virus (HEV) RNA quantification differs greatly among techniques. Standardized tools that measure the real quantity of virus are needed. We assessed the performance of a reverse transcription droplet digital PCR (RT-ddPCR) assay that gives absolute quantities of HEV RNA. Analytical and clinical validation was done on HEV genotypes 1, 3 and 4, and was based on open reading frame (ORF)3 amplification. The within-run and between-run reproducibilities were very good, the analytical sensitivity was 80 HEV RNA international units (IU)/mL and linearities of HEV genotype 1, 3 and 4 were very similar. Clinical validation based on 45 samples of genotype 1, 3 or 4 gave results that correlated well with a validated reverse transcription quantitative PCR (RT-qPCR) assay (Spearman rs = 0.89, p < 0.0001). The RT-ddPCR assay is a sensitive method and could be a promising tool for standardizing HEV RNA quantification in various sample types. PMID:27548205
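    The absolute quantification that ddPCR provides rests on simple Poisson statistics over droplet counts. The following minimal Python sketch shows that arithmetic; the droplet counts and the droplet volume are illustrative assumptions, not values from the study.

    import math

    def ddpcr_copies_per_ul(positive: int, total: int,
                            droplet_volume_ul: float = 0.00085) -> float:
        """Estimate target concentration (copies/uL) from droplet counts.

        Poisson correction: the mean copies per droplet is lam = -ln(1 - p),
        where p is the fraction of positive droplets. The default droplet
        volume (~0.85 nL) is an assumed, typical value.
        """
        p = positive / total
        lam = -math.log(1.0 - p)        # mean copies per droplet
        return lam / droplet_volume_ul  # copies per microlitre of reaction

    # Example: 4500 positive droplets out of 18000 accepted droplets.
    print(f"{ddpcr_copies_per_ul(4500, 18000):.0f} copies/uL")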

  4. Thallium lung-to-heart quantification: three methods of evaluation

    SciTech Connect

    Harler, M.B.; Mahoney, M.; Bartlett, B.; Patel, K.; Turbiner, E.

    1986-12-01

    Lung-to-heart quantification, when used in conjunction with visual assessment of ²⁰¹Tl stress test images, has been found useful in diagnosing cardiac dysfunction. The authors evaluated three methods of quantification in terms of inter- and intraobserver variability and reproducibility. Fifty anterior ²⁰¹Tl stress images were quantified by each of the following methods: Method A (sum region), which involved one region of interest (ROI) in the measurement of pulmonary activity relative to that of the myocardium; Method B (count density), which required two ROIs, the lung-to-heart ratio being dependent on count density; and Method C (maximum pixel), which used the gray scale of the computer to determine the most intense pixels in the lung field and myocardium. Statistical evaluation has shown that the three methods assess clinical data equally well. Method C was found to be most reproducible in terms of inter- and intraobserver variability, followed by Methods A and B. Although nearly equivalent in terms of statistics, the three methods possess inherent differences and therefore should not be used interchangeably without conversion factors.
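    The three ROI strategies compared above reduce to a few lines of arithmetic. The Python sketch below mirrors the sum-region, count-density, and maximum-pixel ideas as described in the abstract; the synthetic count arrays and their shapes are illustrative assumptions.

    import numpy as np

    def sum_region_ratio(lung_roi: np.ndarray, heart_roi: np.ndarray) -> float:
        """Method A style: ratio of the total counts in the two ROIs."""
        return lung_roi.sum() / heart_roi.sum()

    def count_density_ratio(lung_roi: np.ndarray, heart_roi: np.ndarray) -> float:
        """Method B style: counts per pixel, so ROI size cancels out."""
        return lung_roi.mean() / heart_roi.mean()

    def max_pixel_ratio(lung_roi: np.ndarray, heart_roi: np.ndarray) -> float:
        """Method C style: ratio of the most intense pixels in each field."""
        return lung_roi.max() / heart_roi.max()

    rng = np.random.default_rng(0)
    lung = rng.poisson(40, size=(20, 20)).astype(float)    # synthetic counts
    heart = rng.poisson(120, size=(15, 15)).astype(float)
    for f in (sum_region_ratio, count_density_ratio, max_pixel_ratio):
        print(f.__name__, round(f(lung, heart), 3))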

  5. Generation, Quantification, and Tracing of Metabolically Labeled Fluorescent Exosomes.

    PubMed

    Coscia, Carolina; Parolini, Isabella; Sanchez, Massimo; Biffoni, Mauro; Boussadia, Zaira; Zanetti, Cristiana; Fiani, Maria Luisa; Sargiacomo, Massimo

    2016-01-01

    Over the last 10 years, the constant progression in exosome (Exo)-related studies has highlighted the importance of these cell-derived nano-sized vesicles in cell biology and pathophysiology. Functional studies on Exo uptake and intracellular trafficking require accurate quantification to determine the quantity of Exo particles sufficient and/or necessary to elicit measurable effects on target cells. We used commercially available BODIPY® fatty acid analogues to label a primary melanoma cell line (Me501) that highly and spontaneously secretes nanovesicles. Upon addition to cell culture, BODIPY fatty acids are rapidly incorporated into major phospholipid classes, ultimately producing fluorescent Exo as a direct result of biogenesis. Our metabolic labeling protocol produced bright fluorescent Exo that can be examined and quantified with conventional, non-customized flow cytometry (FC) instruments by exploiting their fluorescent emission rather than light-scattering detection. Furthermore, our methodology permits the measurement of single Exo-associated fluorescence transfer to cells, making quantitative the correlation between Exo uptake and activation of cellular processes. The protocol presented here is thus an appropriate tool for those who want to investigate mechanisms of Exo function, in that it allows for direct and rapid characterization and quantification of fluorescent Exo number, intensity, and size, and eventually evaluation of their kinetics of uptake/secretion in target cells. PMID:27317184

  6. Applying uncertainty quantification to multiphase flow computational fluid dynamics

    SciTech Connect

    Gel, A; Garg, R; Tong, C; Shahnam, M; Guenther, C

    2013-07-01

    Multiphase computational fluid dynamics plays a major role in the design and optimization of fossil-fuel-based reactors. There is a growing interest in accounting for the influence of uncertainties associated with physical systems to increase the reliability of computational-simulation-based engineering analysis. The U.S. Department of Energy's National Energy Technology Laboratory (NETL) has recently undertaken an initiative to characterize uncertainties associated with computer simulation of reacting multiphase flows encountered in energy-producing systems such as a coal gasifier. The current work presents preliminary results in applying non-intrusive parametric uncertainty quantification and propagation techniques with NETL's open-source multiphase computational fluid dynamics software MFIX. For this purpose, an open-source uncertainty quantification toolkit, PSUADE, developed at Lawrence Livermore National Laboratory (LLNL), has been interfaced with MFIX. In this study, the sources of uncertainty associated with numerical approximation and model form were neglected, and only model input parametric uncertainty with forward propagation was investigated by constructing a surrogate model based on a data-fitted response surface for a multiphase flow demonstration problem. Monte Carlo simulation was employed for forward propagation of the aleatory input uncertainties. Several insights gained from these simulations are presented, such as how inadequate characterization of uncertainties can affect the reliability of the prediction results. A global sensitivity study using Sobol' indices was also performed to better understand the contribution of input parameters to the variability observed in the response variable.
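    A minimal sketch of the non-intrusive workflow described above: a handful of "expensive" runs fit a data-fitted response surface, and Monte Carlo forward propagation of assumed input distributions is then performed on the cheap surrogate. The analytic stand-in model and the input distributions are illustrative assumptions, not MFIX physics.

    import numpy as np

    rng = np.random.default_rng(1)

    def expensive_model(x1, x2):
        """Cheap analytic stand-in for an expensive multiphase CFD run."""
        return 2.0 * x1 + 0.5 * x2 ** 2 + 0.3 * x1 * x2

    # 1) Fit a quadratic response surface to a small design of "simulation" runs.
    X = rng.uniform(-1.0, 1.0, size=(30, 2))
    y = expensive_model(X[:, 0], X[:, 1])
    A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                         X[:, 0] ** 2, X[:, 1] ** 2, X[:, 0] * X[:, 1]])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)

    def surrogate(x1, x2):
        return (coef[0] + coef[1] * x1 + coef[2] * x2
                + coef[3] * x1 ** 2 + coef[4] * x2 ** 2 + coef[5] * x1 * x2)

    # 2) Monte Carlo forward propagation of assumed aleatory input distributions.
    n = 100_000
    x1 = rng.normal(0.0, 0.3, n)
    x2 = rng.uniform(-1.0, 1.0, n)
    out = surrogate(x1, x2)
    print(f"response mean = {out.mean():.3f}, std = {out.std():.3f}")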

  7. Simple and inexpensive quantification of ammonia in whole blood.

    PubMed

    Ayyub, Omar B; Behrens, Adam M; Heligman, Brian T; Natoli, Mary E; Ayoub, Joseph J; Cunningham, Gary; Summar, Marshall; Kofinas, Peter

    2015-01-01

    Quantification of ammonia in whole blood has applications in the diagnosis and management of many hepatic diseases, including cirrhosis and rare urea cycle disorders, amounting to more than 5 million patients in the United States. Current techniques for ammonia measurement suffer from limited range, poor resolution, false positives or large, complex sensor set-ups. Here we demonstrate a technique utilizing inexpensive reagents and simple methods for quantifying ammonia in 100 μL of whole blood. The sensor comprises a modified form of the indophenol reaction, which resists sources of destructive interference in blood, in conjunction with a cation-exchange membrane. The presented sensing scheme is selective against other amine-containing molecules such as amino acids and has a shelf life of at least 50 days. Additionally, the resulting system has high sensitivity and allows for accurate, reliable quantification of ammonia in whole human blood samples over a range of at least 25 to 500 μM, which is clinically relevant for rare hyperammonemic disorders and liver disease. Furthermore, concentrations of 50 and 100 μM ammonia could be reliably discerned with p = 0.0001.

  8. Instantaneous Wavenumber Estimation for Damage Quantification in Layered Plate Structures

    NASA Technical Reports Server (NTRS)

    Mesnil, Olivier; Leckey, Cara A. C.; Ruzzene, Massimo

    2014-01-01

    This paper illustrates the application of instantaneous and local wavenumber damage quantification techniques for high-frequency guided wave interrogation. The proposed methodologies can be considered as first steps towards a hybrid structural health monitoring/nondestructive evaluation (SHM/NDE) approach for damage assessment in composites. The challenges and opportunities related to the considered type of interrogation and signal processing are explored through the analysis of numerical data obtained via EFIT simulations of damage in CFRP plates. Realistic damage configurations are modeled from x-ray CT scan data of plates subjected to actual impacts, in order to accurately predict wave-damage interactions in terms of scattering and mode conversions. Simulation data are utilized to enhance the information provided by instantaneous and local wavenumbers and mitigate the complexity related to the multi-modal content of the plate response. Signal processing strategies considered for this purpose include modal decoupling through filtering in the frequency/wavenumber domain, the combination of displacement components, and the exploitation of polarization information for the various modes as evaluated through the dispersion analysis of the considered laminate lay-up sequence. The results presented assess the effectiveness of the proposed wavefield processing techniques as a hybrid SHM/NDE technique for damage detection and quantification in composite, plate-like structures.

  9. Unilateral Condylar Hyperplasia: A 3-Dimensional Quantification of Asymmetry

    PubMed Central

    Maal, Thomas J. J.; Bergé, Stefaan J.; Becking, Alfred G.

    2013-01-01

    Purpose: Objective quantifications of facial asymmetry in patients with Unilateral Condylar Hyperplasia (UCH) have not yet been described in the literature. The aim of this study was to objectively quantify soft-tissue asymmetry in patients with UCH and to compare the findings with a control group using a new method. Material and Methods: Thirty 3D photographs of patients diagnosed with UCH were compared with 30 3D photographs of healthy controls. As UCH presents particularly in the mandible, a new method was used to isolate the lower part of the face and evaluate the asymmetry of this part separately. The new method was validated by two observers using 3D photographs of five patients and five controls. Results: A significant difference (0.79 mm) in whole-face asymmetry was found between patients and controls. Intra- and inter-observer differences of 0.011 mm (−0.034–0.011) and 0.017 mm (−0.007–0.042), respectively, were found; these differences are irrelevant in clinical practice. Conclusion: After objective quantification, a significant difference was identified in soft-tissue asymmetry between patients with UCH and controls. The method used to isolate mandibular asymmetry was found to be valid and a suitable tool to evaluate facial asymmetry. PMID:23544063

  10. Quantification of liver fibrosis in chronic hepatitis B virus infection

    PubMed Central

    Jieanu, CF; Ungureanu, BS; Săndulescu, DL; Gheonea, IA; Tudorașcu, DR; Ciurea, ME; Purcărea, VL

    2015-01-01

    Chronic hepatitis B virus (HBV) infection is considered a global public health issue, with more than 78,000 people per year dying of its evolution. With liver transplantation the only viable therapeutic option, and only in end-stage disease, the progression of hepatitis B may generally be influenced by various factors. Assessing the fibrosis stage plays an important part in future decisions on the patient's health, with available antiviral agents capable of preventing fibrosis from progressing to end-stage liver disease. Several methods have been considered as alternatives for assessing HBV status, such as imaging techniques and serum-based biomarkers. Magnetic resonance imaging, ultrasound, and elastography are non-invasive imaging techniques frequently used to quantify disease progression as well as patients' future prognosis. Both direct and indirect biomarkers have also been studied for differentiating between fibrosis stages. This paper reviews the current standing of non-invasive liver fibrosis quantification in HBV, presenting the prognostic factors and available assessment procedures that might eventually replace liver biopsy. PMID:26351528

  11. Functional error modeling for uncertainty quantification in hydrogeology

    NASA Astrophysics Data System (ADS)

    Josset, L.; Ginsbourger, D.; Lunati, I.

    2015-02-01

    Approximate models (proxies) can be employed to reduce the computational costs of estimating uncertainty. The price to pay is that the approximations introduced by the proxy model can lead to a biased estimation. To avoid this problem and ensure a reliable uncertainty quantification, we propose to combine functional data analysis and machine learning to build error models that allow us to obtain an accurate prediction of the exact response without solving the exact model for all realizations. We build the relationship between proxy and exact model on a learning set of geostatistical realizations for which both exact and approximate solvers are run. Functional principal components analysis (FPCA) is used to investigate the variability in the two sets of curves and reduce the dimensionality of the problem while maximizing the retained information. Once obtained, the error model can be used to predict the exact response of any realization on the basis of the sole proxy response. This methodology is purpose-oriented as the error model is constructed directly for the quantity of interest, rather than for the state of the system. Also, the dimensionality reduction performed by FPCA allows a diagnostic of the quality of the error model to assess the informativeness of the learning set and the fidelity of the proxy to the exact model. The possibility of obtaining a prediction of the exact response for any newly generated realization suggests that the methodology can be effectively used beyond the context of uncertainty quantification, in particular for Bayesian inference and optimization.
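    The construction can be sketched compactly: run both solvers on a learning set, reduce both families of curves with (functional) principal components, and learn a map from proxy scores to exact scores. The synthetic solver pair and the linear score map below are illustrative assumptions standing in for the paper's FPCA/machine-learning machinery.

    import numpy as np

    rng = np.random.default_rng(2)
    t = np.linspace(0, 1, 50)                  # support of the response curves

    # Synthetic learning set: exact curves and systematically biased proxies.
    n_learn = 40
    params = rng.uniform(0.5, 2.0, n_learn)
    exact = np.array([np.exp(-p * t) for p in params])
    proxy = np.array([np.exp(-1.15 * p * t) + 0.02 for p in params])

    def pca_fit(curves, k=3):
        """Mean and first k principal components of a curve family."""
        mean = curves.mean(axis=0)
        _, _, vt = np.linalg.svd(curves - mean, full_matrices=False)
        return mean, vt[:k]

    def pca_scores(curves, mean, comps):
        return (curves - mean) @ comps.T

    mp, cp = pca_fit(proxy)
    me, ce = pca_fit(exact)
    Sp = pca_scores(proxy, mp, cp)
    Se = pca_scores(exact, me, ce)

    # Error model: affine map from proxy scores to exact scores.
    W, *_ = np.linalg.lstsq(np.column_stack([Sp, np.ones(n_learn)]), Se,
                            rcond=None)

    # Predict the exact response of a new realization from its proxy run alone.
    p_new = 1.3
    proxy_new = np.exp(-1.15 * p_new * t) + 0.02
    s_new = pca_scores(proxy_new[None, :], mp, cp)
    pred = me + np.column_stack([s_new, [1.0]]) @ W @ ce
    err = np.abs(pred[0] - np.exp(-p_new * t)).max()
    print(f"max abs error of corrected prediction: {err:.4f}")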

  12. Ultrasound strain imaging for quantification of tissue function: cardiovascular applications

    NASA Astrophysics Data System (ADS)

    de Korte, Chris L.; Lopata, Richard G. P.; Hansen, Hendrik H. G.

    2013-03-01

    With ultrasound imaging, the motion and deformation of tissue can be measured. Tissue can be deformed by applying a force to it, and the resulting deformation is a function of its mechanical properties. Quantification of this resulting tissue deformation to assess the mechanical properties of tissue is called elastography. If the tissue under interrogation is actively deforming, the deformation is directly related to its function, and quantification of this deformation is normally referred to as 'strain imaging'. Elastography can be used for atherosclerotic plaque characterization, while the contractility of the heart or skeletal muscles can be assessed with strain imaging. We developed radio frequency (RF)-based ultrasound methods to assess the deformation at higher resolution and with higher accuracy than commercial methods using conventional image data (Tissue Doppler Imaging and 2D speckle tracking methods). However, the improvement in accuracy is mainly achieved when measuring strain along the ultrasound beam direction, i.e., in one dimension. We further extended this method to multiple directions and further improved precision by compounding data acquired at multiple beam-steered angles. In arteries, the presence of vulnerable plaques may lead to acute events like stroke and myocardial infarction. Consequently, timely detection of these plaques is of great diagnostic value. Non-invasive ultrasound strain compounding is currently being evaluated as a diagnostic tool to identify the vulnerability of plaques. In the heart, we determined the strain locally and at high resolution, resulting in a local assessment, in contrast to conventional global functional parameters like cardiac output or shortening fraction.

  13. Quantification of cumulated physical fatigue at the workplace.

    PubMed

    Pichot, Vincent; Bourin, Emmanuelle; Roche, Frédéric; Garet, Martin; Gaspoz, Jean-Michel; Duverney, David; Antoniadis, Anestis; Lacour, Jean-René; Barthélémy, Jean-Claude

    2002-11-01

    Quantification of physical fatigue remains a challenge. We hypothesized that its effects on central autonomic nervous system activity could be explored for such a quantification. To test this relationship, we prospectively measured central autonomic nervous activity through nocturnal heart rate variability (HRV) in six French garbage collectors, aged 32.1 ± 4.3 years, twice a week during 3 consecutive weeks of work, and during the following week of rest. Eight healthy sedentary males formed a control group. HRV indices were calculated by applying standard time-domain and wavelet transform analyses to standard ECG recordings. During the 3 consecutive weeks of work, there was a significantly progressive decrease in HRV indices, particularly pNN50 (−34.2%, P<0.05), as well as the high (−33.3%, P<0.05) and low (−22.2%, P<0.01) frequency components of the wavelet transform, while there was an increase, although non-significant, of the ratio of low to high frequencies (9.1%). During the resting period, there was a significant recovery of HRV indices, notably of the high (50.0%, P<0.05) and low (28.6%, P<0.05) frequency components. No such changes occurred in the control group. A central signature of cumulated physical fatigue can thus be detected and quantified through nocturnal autonomic nervous system activity. Its characteristics are those of a progressive parasympathetic withdrawal.
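    For readers unfamiliar with the HRV indices used here, pNN50 is straightforward to compute from an RR-interval series, as the following Python sketch with a synthetic series shows (the series itself is an illustrative assumption, not study data).

    import numpy as np

    def pnn50(rr_ms: np.ndarray) -> float:
        """pNN50: percentage of successive RR-interval differences > 50 ms."""
        return 100.0 * np.mean(np.abs(np.diff(rr_ms)) > 50.0)

    rng = np.random.default_rng(3)
    # Synthetic nocturnal RR series (ms): slow drift plus beat-to-beat
    # variability; a real analysis would use ECG-derived intervals.
    rr = (900.0 + 0.1 * np.cumsum(rng.normal(0.0, 25.0, 2000))
          + rng.normal(0.0, 35.0, 2000))
    print(f"pNN50 = {pnn50(rr):.1f}%")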

  14. Uncertainty Quantification in State Estimation using the Probabilistic Collocation Method

    SciTech Connect

    Lin, Guang; Zhou, Ning; Ferryman, Thomas A.; Tuffner, Francis K.

    2011-03-23

    In this study, a new efficient uncertainty quantification technique, the probabilistic collocation method (PCM) on sparse grid points, is employed to enable the evaluation of uncertainty in state estimation. The PCM allows us to use just a small number of ensembles to quantify the uncertainty in estimating the state variables of power systems. With sparse grid points, the PCM approach can handle a large number of uncertain parameters in power systems at relatively low computational cost compared with classic Monte Carlo (MC) simulations. The algorithm and procedure are outlined, and we demonstrate the capability of the approach and illustrate its application to uncertainty quantification in state estimation of the IEEE 14-bus model as an example. MC simulations have also been conducted to verify the accuracy of the PCM approach. Comparing the results obtained from MC simulations with PCM results for the mean and standard deviation of uncertain parameters shows that the PCM approach is computationally more efficient than MC simulations.
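    The idea behind collocation-based uncertainty propagation can be illustrated in one dimension: evaluate the model only at Gaussian quadrature points and recover the output moments from the quadrature weights. The toy model below is an illustrative assumption, not a power-system state estimator; the sparse-grid extension to many parameters follows the same principle.

    import numpy as np

    def model(x):
        """Toy response standing in for a state-estimation output."""
        return np.sin(x) + 0.1 * x ** 2

    # Probabilistic collocation for one standard-normal input: evaluate the
    # model only at Gauss-Hermite points (5 runs) and use quadrature weights.
    pts, wts = np.polynomial.hermite_e.hermegauss(5)
    wts = wts / wts.sum()                  # normalize to a probability measure
    vals = model(pts)
    mean_pcm = wts @ vals
    std_pcm = float(np.sqrt(wts @ (vals - mean_pcm) ** 2))

    # Classic Monte Carlo needs orders of magnitude more model evaluations.
    rng = np.random.default_rng(4)
    mc = model(rng.normal(size=200_000))
    print(f"PCM: mean={mean_pcm:.4f}, std={std_pcm:.4f} (5 model runs)")
    print(f"MC : mean={mc.mean():.4f}, std={mc.std():.4f} (200000 model runs)")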

  15. An on-bacterium flow cytometric immunoassay for protein quantification.

    PubMed

    Lan, Wen-Jun; Lan, Wei; Wang, Hai-Yan; Yan, Lei; Wang, Zhe-Li

    2013-09-01

    The polystyrene bead-based flow cytometric immunoassay has been widely reported. However, the preparation of functional polystyrene beads is still inconvenient. This study describes a simple and easy on-bacterium flow cytometric immunoassay for protein quantification, in which Staphylococcus aureus (SAC) is used as an antibody-antigen carrier to replace the polystyrene bead. The SAC beads were prepared by carboxyfluorescein diacetate succinimidyl ester (CFSE) labeling, paraformaldehyde fixation and antibody binding. Carcinoembryonic antigen (CEA) and cytokeratin-19 fragment (CYFRA 21-1) proteins were used as models in the test system. Using the prepared SAC beads, biotinylated proteins, and streptavidin-phycoerythrin (SA-PE), the on-bacterium flow cytometric immunoassay was validated by quantifying CEA and CYFRA 21-1 in samples. The data demonstrated a concordant relationship between the logarithm of the protein concentration and the logarithm of the PE mean fluorescence intensity (MFI). The limit of detection (LOD) of this immunoassay was at least 0.25 ng/ml. Precision and accuracy assessments showed that both the relative standard deviation (R.S.D.) and the relative error (R.E.) were <10%. A comparison between this immunoassay and a polystyrene bead-based flow cytometric immunoassay showed a correlation coefficient of 0.998 for serum CEA and 0.996 for serum CYFRA 21-1. In conclusion, the on-bacterium flow cytometric immunoassay may be of use in the quantification of serum proteins. PMID:23739299

  16. A critical view on microplastic quantification in aquatic organisms.

    PubMed

    Vandermeersch, Griet; Van Cauwenberghe, Lisbeth; Janssen, Colin R; Marques, Antonio; Granby, Kit; Fait, Gabriella; Kotterman, Michiel J J; Diogène, Jorge; Bekaert, Karen; Robbens, Johan; Devriese, Lisa

    2015-11-01

    Microplastics, plastic particles and fragments smaller than 5 mm, are ubiquitous in the marine environment. Ingestion and accumulation of microplastics have previously been demonstrated for diverse marine species ranging from zooplankton to bivalves and fish, implying the potential for microplastics to accumulate in the marine food web. In this way, microplastics can potentially impact food safety and human health. Although a few methods to quantify microplastics in biota have been described, no comparison and/or intercalibration of these techniques has been performed. Here we conducted a literature review of all available extraction and quantification methods. Two of these methods, involving wet acid destruction, were used to evaluate the presence of microplastics in field-collected mussels (Mytilus galloprovincialis) from three different "hotspot" locations in Europe (Po estuary, Italy; Tagus estuary, Portugal; Ebro estuary, Spain). An average of 0.18 ± 0.14 total microplastics g⁻¹ w.w. for the Acid mix Method and 0.12 ± 0.04 total microplastics g⁻¹ w.w. for the Nitric acid Method was established. Additionally, in a pilot study, an average load of 0.13 ± 0.14 total microplastics g⁻¹ w.w. was recorded in commercial mussels (Mytilus edulis and M. galloprovincialis) from five European countries (France, Italy, Denmark, Spain and The Netherlands). A detailed analysis and comparison of methods indicated the need for further research to develop a standardised operating protocol for microplastic quantification and monitoring.

  17. Guided Wave Delamination Detection and Quantification With Wavefield Data Analysis

    NASA Technical Reports Server (NTRS)

    Tian, Zhenhua; Campbell Leckey, Cara A.; Seebo, Jeffrey P.; Yu, Lingyu

    2014-01-01

    Unexpected damage can occur in aerospace composites due to impact events or material stress during off-nominal loading events. In particular, laminated composites are susceptible to delamination damage due to weak transverse tensile and inter-laminar shear strengths. Development of reliable and quantitative techniques to detect delamination damage in laminated composites is imperative for safe and functional optimally-designed next-generation composite structures. In this paper, we investigate guided wave interactions with delamination damage and develop quantification algorithms by using wavefield data analysis. The trapped guided waves in the delamination region are observed from the wavefield data and further quantitatively interpreted by using different wavenumber analysis methods. The frequency-wavenumber representation of the wavefield shows that new wavenumbers are present and correlate to trapped waves in the damage region. These new wavenumbers are used to detect and quantify the delamination damage through wavenumber analysis, which can show how the wavenumber changes as a function of wave propagation distance. The location and spatial duration of the new wavenumbers can be identified, providing a useful means not only for detecting the presence of delamination damage but also for estimating the delamination size. Our method has been applied to detect and quantify real delamination damage with complex geometry (grown using a quasi-static indentation technique). The detection and quantification results show the location, size, and shape of the delamination damage.
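    The core signal-processing step, estimating a local wavenumber along the propagation path and watching it change inside the damage region, can be sketched as follows. The synthetic one-dimensional wavefield and all numeric parameters are illustrative assumptions, not the simulation data used in the paper.

    import numpy as np

    # Synthetic wavefield snapshot u(x): a single guided mode whose wavenumber
    # increases inside a simulated delamination zone.
    nx, dx = 1024, 5e-4                          # 1024 points, 0.5 mm pitch
    x = np.arange(nx) * dx                       # 0 .. 0.512 m
    k_nominal, k_damage = 600.0, 1400.0          # rad/m
    k_map = np.where((x > 0.25) & (x < 0.35), k_damage, k_nominal)
    u = np.sin(np.cumsum(k_map) * dx)            # phase = integral of local k

    # Local wavenumber via a sliding-window spatial FFT.
    win = 128                                    # 64 mm analysis window
    k_axis = 2 * np.pi * np.fft.rfftfreq(win, dx)
    for start in range(0, nx - win + 1, 256):
        seg = u[start:start + win] * np.hanning(win)
        k_peak = k_axis[np.argmax(np.abs(np.fft.rfft(seg)))]
        print(f"x = {start * dx:.3f} m  ->  k ~ {k_peak:6.1f} rad/m")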

  18. Concurrent quantification of tryptophan and its major metabolites.

    PubMed

    Lesniak, Wojciech G; Jyoti, Amar; Mishra, Manoj K; Louissaint, Nicolette; Romero, Roberto; Chugani, Diane C; Kannan, Sujatha; Kannan, Rangaramanujam M

    2013-12-15

    An imbalance in tryptophan (TRP) metabolites is associated with several neurological and inflammatory disorders. Therefore, analytical methods allowing for simultaneous quantification of TRP and its major metabolites would be highly desirable, and may be valuable as potential biomarkers. We have developed an HPLC method for concurrent quantitative determination of tryptophan, serotonin, 5-hydroxyindoleacetic acid, kynurenine, and kynurenic acid in tissue and fluids. The method utilizes the intrinsic spectroscopic properties of TRP and its metabolites that enable UV absorbance and fluorescence detection by HPLC, without additional labeling. The origin of the peaks related to analytes of interest was confirmed by UV-Vis spectral patterns using a PDA detector and by mass spectrometry. The developed methods were validated in rabbit fetal brain and amniotic fluid at gestational day 29. Results are in excellent agreement with those reported in the literature for the same regions. This method allows for rapid quantification of tryptophan and four of its major metabolites concurrently. A change in the relative ratios of these metabolites can provide important insights in predicting the presence and progression of neuroinflammation in disorders such as cerebral palsy, autism, multiple sclerosis, Alzheimer's disease, and schizophrenia. PMID:24036037

  19. Quantification of 5-methyl-2'-deoxycytidine in the DNA.

    PubMed

    Giel-Pietraszuk, Małgorzata; Insińska-Rak, Małgorzata; Golczak, Anna; Sikorski, Marek; Barciszewska, Mirosława; Barciszewski, Jan

    2015-01-01

    Methylation at position 5 of cytosine (Cyt) at CpG sequences, leading to the formation of 5-methylcytosine (m⁵Cyt), is an important element of epigenetic regulation of gene expression. Modification of the normal methylation pattern, unique to each organism, leads to the development of pathological processes and diseases, including cancer. Therefore, quantification of DNA methylation and analysis of changes in the methylation pattern are very important from a practical point of view and can be used for diagnostic purposes, as well as for monitoring treatment progress. In this paper we present a new method for quantification of 5-methyl-2′-deoxycytidine (m⁵C) in DNA. The technique is based on conversion of m⁵C into fluorescent 3,N⁴-etheno-5-methyl-2′-deoxycytidine (εm⁵C) and its identification by reversed-phase high-performance liquid chromatography (RP-HPLC). The assay was used to evaluate m⁵C concentration in DNA of calf thymus and of peripheral blood of cows bred under different conditions. This approach can be applied to measuring 5-methylcytosine in cellular DNA from different cells and tissues. PMID:26098716

  20. Structural quantification of cartilage changes using statistical parametric mapping

    NASA Astrophysics Data System (ADS)

    Tamez-Peña, José Gerardo; Barbu-McInnis, Monica; Totterman, Saara

    2007-03-01

    The early detection of osteoarthritis (OA) treatment efficacy requires monitoring of small changes in cartilage morphology. Current approaches rely on carefully monitoring global cartilage parameters; however, these are not very sensitive to focal morphological changes in cartilage structure. This work presents the use of statistical parametric mapping (SPM) for the detection and quantification of changes in cartilage morphology. The SPM is computed by first registering the baseline and follow-up three-dimensional (3D) reconstructions of the cartilage tissue. Once the registration is complete, the thickness change for every cartilage point is computed, followed by a model-based estimation of the variance of the thickness error. The cartilage thickness changes and the variance estimates are used to compute the z-score map, which is used to visualize and quantify significant changes in cartilage thickness. The z-map quantification provides the area of significant changes, the associated volume of changes, as well as the average thickness of cartilage loss. Furthermore, thickness-change distribution functions are normalized to provide probability distribution functions (PDF). The PDF can be used to understand and quantify the differences among treatment groups. The performance of the approach on simulated data and real subject data will be presented.
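    A minimal numerical sketch of the z-score map: given registered baseline and follow-up thickness maps and a modeled error variance, the map and the derived area summaries follow directly. All arrays, the noise level, and the pixel size below are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(5)
    shape = (128, 128)                               # registered thickness maps
    baseline = 2.5 + 0.1 * rng.standard_normal(shape)          # mm
    followup = baseline + 0.02 * rng.standard_normal(shape)    # repeat-scan noise
    followup[40:60, 50:90] -= 0.30                   # simulated focal cartilage loss

    change = followup - baseline
    sigma = 0.05                 # modeled std of thickness measurement error (mm)
    z = change / sigma           # statistical parametric (z-score) map

    significant = np.abs(z) > 3.0                    # threshold for "real" change
    pixel_area_mm2 = 0.31 ** 2                       # assumed in-plane resolution
    print(f"area of significant change: {significant.sum() * pixel_area_mm2:.1f} mm^2")
    print(f"mean thickness change there: {change[significant].mean():+.2f} mm")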

  1. 21 CFR 530.24 - Procedure for announcing analytical methods for drug residue quantification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    21 CFR 530.24 (Title 21, Food and Drugs; Food-Producing Animals): Procedure for announcing analytical methods for drug residue quantification. (a) FDA may issue an order announcing a specific analytical method or methods for the quantification...

  2. The quantification of hydrogen and methane in contaminated groundwater: validation of robust procedures for sampling and quantification.

    PubMed

    Dorgerloh, Ute; Becker, Roland; Theissen, Hubert; Nehls, Irene

    2010-10-01

    A number of currently recommended sampling techniques for the determination of hydrogen in contaminated groundwater were compared with regard to their practical performance in field campaigns. Key characteristics of appropriate sampling procedures are reproducibility of results and robustness against varying field conditions such as hydrostatic pressure, aquifer flow, and biological activity. Laboratory set-ups were used to investigate the most promising techniques. Bubble stripping with gas sampling bulbs yielded reproducible recovery of hydrogen and methane, which was verified for groundwater sampled in two field campaigns. The methane content of the groundwater was confirmed by analysis of directly pumped samples, thus supporting the trueness of the stripping results. Laboratory set-ups and field campaigns revealed that bubble stripping of hydrogen may depend on the type of pump used: concentrations of dissolved hydrogen after bubble stripping with an electrically driven submersible pump were about one order of magnitude higher than those obtained from diffusion sampling. The gas chromatographic determination of hydrogen and methane requires manual injection of gas samples and detection by a pulsed discharge detector (PDD), and allows limits of quantification of 3 nM dissolved hydrogen and 1 µg L⁻¹ dissolved methane in groundwater. The combined standard uncertainty of the bubble stripping and GC/PDD quantification of hydrogen in field samples was 7% at 7.8 nM and 18% at 78 nM. PMID:20730246

  3. Optical methods for the quantification of the fibrillation degree of bleached MFC materials.

    PubMed

    Chinga-Carrasco, Gary

    2013-05-01

    In this study, the suitability of optical devices for quantification of the fibrillation degree of bleached microfibrillated cellulose (MFC) materials was assessed. The techniques for optical assessment included an optical scanner, UV-vis spectrophotometry, turbidity, quantification of the fiber fraction, and a camera system for dynamic measurements. The results show that the assessed optical devices are well suited for quantification of the light transmittance of bleached MFC materials, which yields an estimate of the fibrillation degree. Films made of poorly fibrillated materials are opaque, while films made of highly fibrillated materials containing a major fraction of nanofibrils are translucent, with light transmittance larger than 90%. Finally, the concept of using images acquired with a CCD camera system to estimate the fibrillation degree under dynamic conditions was exemplified. Such systems are of particular interest because they widen the applicability of optical methods to online quantification of fibrillation degree in production lines, which is expected to appear in the years to come.

  4. PCR amplification of repetitive sequences as a possible approach in relative species quantification.

    PubMed

    Ballin, N Z; Vogensen, F K; Karlsson, A H

    2012-02-01

    Both relative and absolute quantification are possible in species quantification when single-copy genomic DNA is used. However, amplification of single-copy genomic DNA does not allow a limit of detection as low as that obtained from amplification of repetitive sequences. Amplification of repetitive sequences is therefore frequently used in absolute quantification, but problems occur in relative quantification because the number of repetitive sequences is unknown. A promising approach was developed in which data from amplification of repetitive sequences were used for relative quantification of species in binary mixtures. LUX PCR primers were designed that amplify repetitive and single-copy sequences to establish the species-dependent constants (SDC), i.e., the number of amplified repetitive sequences per genome. The SDCs and data from amplification of repetitive sequences were tested for their applicability to relative quantification of the amount of chicken DNA in a binary mixture of chicken and pig DNA. However, the designed PCR primers lacked the specificity required for regulatory species control.

  5. Quantification in MALDI-TOF mass spectrometry of modified polymers.

    PubMed

    Walterová, Zuzana; Horský, Jiří

    2011-05-01

    MALDI-TOF mass spectrometry quantification is hampered by the poor reproducibility of the signal intensity and by molecular-mass and compositional discrimination. The addition of a suitable compound as an internal standard increases reproducibility and allows a calibration curve to be constructed. The concept was also verified with synthetic polymers, but no instructions for practical implementation were given [H. Chen, M. He, J. Pei, H. He, Anal. Chem. 75 (2003) 6531-6535], even though synthetic polymers are generally non-uniform with respect to molecular mass and composition, and access to a polymer with the same molecular mass distribution and composition as the quantified one is thus the exception rather than the rule. On the other hand, relative quantification of polymers, e.g., the content of the precursor polymer in a batch of a modified polymer, is usually sought. In this particular case, the pure precursor is usually available and the modified polymer can serve as an internal standard. However, a calibration curve still cannot be constructed, and the use of the internal standard has to be combined with the method of standard addition, in which the precursor polymer is added directly to the analyzed sample. Experiments with simulated modified polymers, mixtures of poly(ethylene glycol) (PEG) and poly(ethylene glycol) monomethyl ether (MPEG) of similar molecular-mass distribution, revealed a power dependence of the PEG/MPEG signal-intensity ratio (MS ratio) on the PEG/MPEG concentration ratio in the mixture (gravimetric ratio). The result was obtained using standard procedures and instrumentation, which means that the basic assumption of the standard-addition method, i.e., the proportionality of the MS and gravimetric ratios, cannot in general be taken for granted. Therefore, a multi-point combined internal-standard standard-addition method was developed and experimentally verified for the quantification of the precursor in modified polymers.
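    Under the power-law behaviour reported above, a multi-point combined internal-standard standard-addition analysis reduces to a three-parameter curve fit. The sketch below, with synthetic data, shows one plausible formulation; the model form and all numbers are illustrative assumptions rather than the authors' exact procedure.

    import numpy as np
    from scipy.optimize import curve_fit

    m_std = 1.0                                       # mg modified polymer (internal standard)
    m_add = np.array([0.0, 0.05, 0.10, 0.20, 0.40])   # mg precursor spiked in
    a_true, b_true, m0_true = 1.8, 1.3, 0.08          # hidden "truth" for the demo
    rng = np.random.default_rng(6)
    ratio = a_true * ((m0_true + m_add) / m_std) ** b_true
    ratio = ratio * (1 + 0.02 * rng.standard_normal(ratio.size))  # 2% noise

    def model(m, a, b, m0):
        """Power-law MS ratio vs. total precursor mass (m0 unknown)."""
        return a * ((m0 + m) / m_std) ** b

    (a, b, m0), _ = curve_fit(model, m_add, ratio, p0=(1.0, 1.0, 0.05),
                              bounds=(0.0, np.inf))
    print(f"estimated precursor content: {m0:.3f} mg per aliquot "
          f"({100 * m0 / m_std:.1f}% of the modified polymer)")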

  6. Improved semiquantitative Western blot technique with increased quantification range.

    PubMed

    Heidebrecht, F; Heidebrecht, A; Schulz, I; Behrens, S-E; Bader, A

    2009-06-30

    With the development of new interdisciplinary fields such as systems biology, the quantitative analysis of protein expression in biological samples is gaining more and more importance. Although the most common method for this is ELISA, Western blot also has advantages: the separation of proteins by size allows the evaluation of only specifically bound protein. This work examines the Western blot signal chain, determines some of the parameters relevant for quantitative analysis, and proposes a mathematical model of the reaction kinetics. Using this model, a semiquantitative Western blot method for simultaneous quantification of different proteins using a hyperbolic calibration curve was developed. A program was written for hyperbolic regression that allows quick determination of the calibration curve coefficients; it can also be used to approximate calibration curves in other applications such as ELISA, BCA or Bradford assays. PMID:19351538
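    A hyperbolic calibration of this kind can be fitted and inverted in a few lines, as in the following sketch; the saturating model y = a*x/(b + x) and the synthetic densitometry values are illustrative assumptions, not the paper's exact formulation.

    import numpy as np
    from scipy.optimize import curve_fit

    def hyperbola(x, a, b):
        """Saturating band response: signal = a * x / (b + x)."""
        return a * x / (b + x)

    loaded = np.array([5.0, 10.0, 25.0, 50.0, 100.0, 200.0])   # ng per lane
    rng = np.random.default_rng(7)
    signal = hyperbola(loaded, 1200.0, 60.0) * (1 + 0.03 * rng.standard_normal(6))

    (a, b), _ = curve_fit(hyperbola, loaded, signal, p0=(1000.0, 50.0))

    def quantify(band_intensity: float) -> float:
        """Invert the calibration curve: x = b * y / (a - y)."""
        return b * band_intensity / (a - band_intensity)

    print(f"fitted a={a:.0f}, b={b:.1f}; unknown band at 700 -> {quantify(700.0):.1f} ng")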

  7. Dielectrophoretic immobilization of proteins: Quantification by atomic force microscopy.

    PubMed

    Laux, Eva-Maria; Knigge, Xenia; Bier, Frank F; Wenger, Christian; Hölzel, Ralph

    2015-09-01

    The combination of alternating electric fields with nanometer-sized electrodes allows the permanent immobilization of proteins by dielectrophoretic force. Here, atomic force microscopy is introduced as a quantification method, and results are compared with fluorescence microscopy. Experimental parameters, for example the applied voltage and the duration of field application, were varied systematically, and their influence on the amount of immobilized protein was investigated. A linear correlation with the duration of field application was found by atomic force microscopy, and both microscopic methods yield a square dependence of the amount of immobilized protein on the applied voltage. While fluorescence microscopy allows real-time imaging, atomic force microscopy reveals immobilized proteins obscured in fluorescence images due to low S/N. Furthermore, the higher spatial resolution of the atomic force microscope enables the visualization of the protein distribution on single nanoelectrodes. The electric field distribution is calculated and compared with the experimental results, showing very good agreement with the atomic force microscopy measurements.

  8. Quantification of chromatin condensation level by image processing.

    PubMed

    Irianto, Jerome; Lee, David A; Knight, Martin M

    2014-03-01

    The level of chromatin condensation is related to the silencing/activation of chromosomal territories and therefore impacts gene expression. Chromatin condensation changes during cell cycle progression and differentiation, and is influenced by various physicochemical and epigenetic factors. This study describes a validated experimental technique to quantify chromatin condensation. A novel image processing procedure is developed using Sobel edge detection to quantify the level of chromatin condensation from nuclei images taken by confocal microscopy. The algorithm was developed in MATLAB and used to quantify different levels of chromatin condensation in chondrocyte nuclei achieved through alteration in osmotic pressure. The resulting chromatin condensation parameter (CCP) is in good agreement with independent multi-observer qualitative visual assessment. This image processing technique thereby provides a validated, unbiased parameter for rapid and highly reproducible quantification of the level of chromatin condensation.
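    A Sobel-based condensation measure can be sketched briefly. The study's algorithm was developed in MATLAB; the Python version below captures the idea (gradient magnitude inside the nucleus as a proxy for condensation), with a normalization choice and synthetic images that are assumptions of this sketch, not the published CCP definition.

    import numpy as np
    from scipy import ndimage

    def chromatin_condensation_parameter(nucleus: np.ndarray) -> float:
        """Mean Sobel gradient magnitude, normalized by mean intensity.

        Condensed chromatin yields punctate, high-contrast texture and hence
        strong edges; this normalization is an assumption of this sketch.
        """
        gx = ndimage.sobel(nucleus, axis=0)
        gy = ndimage.sobel(nucleus, axis=1)
        return np.hypot(gx, gy).mean() / (nucleus.mean() + 1e-12)

    rng = np.random.default_rng(8)
    diffuse = ndimage.gaussian_filter(rng.random((64, 64)), 4)    # smooth texture
    condensed = ndimage.gaussian_filter(rng.random((64, 64)), 1)  # punctate texture
    print(f"CCP, diffuse-like nucleus:   {chromatin_condensation_parameter(diffuse):.3f}")
    print(f"CCP, condensed-like nucleus: {chromatin_condensation_parameter(condensed):.3f}")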

  9. Quantification of osteolytic bone lesions in a preclinical rat trial

    NASA Astrophysics Data System (ADS)

    Fränzle, Andrea; Bretschi, Maren; Bäuerle, Tobias; Giske, Kristina; Hillengass, Jens; Bendl, Rolf

    2013-10-01

    In breast cancer, most patients who die have developed bone metastases as the disease progressed. Bone metastases in breast cancer are mainly bone-destructive (osteolytic). To understand pathogenesis and to analyse response to different treatments, animal models, in our case rats, are examined. For assessment of treatment response to bone remodelling therapies, exact segmentations of osteolytic lesions are needed. Manual segmentations are not only time-consuming but also lack reproducibility, so computerized segmentation tools are essential. In this paper we present an approach for the computerized quantification of osteolytic lesion volumes using a comparison to a healthy reference model. The presented qualitative and quantitative evaluation of the reconstructed bone volumes shows that the automatically segmented lesion volumes complete missing bone in a reasonable way.

  10. Quantification of tidal parameters from Solar System data

    NASA Astrophysics Data System (ADS)

    Lainey, Valéry

    2016-11-01

    Tidal dissipation is the main driver of the orbital evolution of natural satellites and a key point in understanding exoplanetary system configurations. Despite its importance, its quantification from observations still remains difficult for most objects of our own Solar System. In this work, we review the method that has been used to determine the tidal parameters directly from observations, with emphasis on the Love number k_2 and the tidal quality factor Q. Up-to-date values of these tidal parameters are summarized. Finally, the possible determination of the tidal ratio k_2/Q of Uranus and Neptune is assessed. This may be particularly relevant for coming astrometric campaigns and future space missions focused on these systems.

  11. Quantification of intracerebral steal in patients with arteriovenous malformation

    SciTech Connect

    Homan, R.W.; Devous, M.D. Sr.; Stokely, E.M.; Bonte, F.J.

    1986-08-01

    Eleven patients with angiographically and/or pathologically proved arteriovenous malformations (AVMs) were studied using dynamic, single-photon-emission computed tomography (DSPECT). Quantification of regional cerebral blood flow in structurally normal areas remote from the AVM disclosed areas of decreased flow compared with normal controls in eight of 11 patients examined. Areas of hypoperfusion correlated with altered function as manifested by epileptogenic foci and impaired cognitive function. Dynamic, single-photon-emission computed tomography provides a noninvasive technique to monitor quantitatively hemodynamic changes associated with AVMs. Our findings suggest that such changes are present in the majority of patients with AVMs and that they may be clinically significant. The potential application of regional cerebral blood flow imaging by DSPECT in the management of patients with AVMs is discussed.

  12. UQTools: The Uncertainty Quantification Toolbox - Introduction and Tutorial

    NASA Technical Reports Server (NTRS)

    Kenny, Sean P.; Crespo, Luis G.; Giesy, Daniel P.

    2012-01-01

    UQTools is the short name for the Uncertainty Quantification Toolbox, a software package designed to efficiently quantify the impact of parametric uncertainty on engineering systems. UQTools is a MATLAB-based software package and was designed to be discipline independent, employing very generic representations of the system models and uncertainty. Specifically, UQTools accepts linear and nonlinear system models and permits arbitrary functional dependencies between the system's measures of interest and the probabilistic or non-probabilistic parametric uncertainty. One of the most significant features incorporated into UQTools is the theoretical development centered on homothetic deformations and their application to set bounding and approximating failure probabilities. Beyond the set bounding technique, UQTools provides a wide range of probabilistic and uncertainty-based tools to solve key problems in science and engineering.

  13. Quantification of Diffuse Hydrothermal Flows Using Multibeam Sonar

    NASA Astrophysics Data System (ADS)

    Ivakin, A. N.; Jackson, D. R.; Bemis, K. G.; Xu, G.

    2014-12-01

    The Cabled Observatory Vent Imaging Sonar (COVIS) deployed at the Main Endeavour node of the NEPTUNE Canada observatory has provided acoustic time series extending over 2 years. This includes 3D images of plume scattering strength and Doppler velocity measurements, as well as 2D images showing regions of diffuse flow. The diffuse-flow images display the level of decorrelation between sonar echoes with transmissions separated by 0.2 s. The present work aims to provide further information on the strength of diffuse flows. Two approaches are used: measurement of the dependence of decorrelation on lag, and measurement of the phase shift of sonar echoes, with lags in 3-hour increments up to several days. The phase shifts and decorrelation are linked to variations of temperature above the seabed, which allows quantification of those variations, their magnitudes, spatial and temporal scales, and energy spectra. These techniques are illustrated using COVIS data obtained near the Grotto vent complex.

  14. Quantification of airway deposition of intact and fragmented pollens.

    PubMed

    Horváth, Alpár; Balásházy, Imre; Farkas, Arpád; Sárkány, Zoltán; Hofmann, Werner; Czitrovszky, Aladár; Dobos, Erik

    2011-12-01

    Although pollen is one of the most widespread agents that can cause allergy, its airway transport and deposition are far from being fully explored. The objective of this study was to characterize the airway deposition of pollens and to contribute to the debate over the increasing number of asthma attacks registered after thunderstorms. Computer simulations were performed to quantify the deposition of inhaled pollens in the airways. Our results demonstrated that smaller and fragmented pollens may penetrate into the thoracic airways and deposit there, supporting the theory that fragmented pollen particles are responsible for the increasing incidence of asthma attacks following thunderstorms. Pollen deposition results also suggest that children are the most exposed to the allergic effects of pollens. Finally, pollens between 0.5 and 20 μm deposit more efficiently in the lung of asthmatics than in the healthy lung, especially in the bronchial region. PMID:21563012

  15. Quantification of Posterior Globe Flattening: Methodology Development and Validation

    NASA Technical Reports Server (NTRS)

    Lumpkins, Sarah B.; Garcia, Kathleen M.; Sargsyan, Ashot E.; Hamilton, Douglas R.; Berggren, Michael D.; Ebert, Douglas

    2012-01-01

    Microgravity exposure affects visual acuity in a subset of astronauts and mechanisms may include structural changes in the posterior globe and orbit. In particular, posterior globe flattening has been implicated in the eyes of several astronauts. This phenomenon is known to affect some terrestrial patient populations and has been shown to be associated with intracranial hypertension. It is commonly assessed by magnetic resonance imaging (MRI), computed tomography (CT) or B-mode ultrasound (US), without consistent objective criteria. NASA uses a semiquantitative scale of 0-3 as part of eye/orbit MRI and US analysis for occupational monitoring purposes. The goal of this study was to initiate development of an objective quantification methodology to monitor small changes in posterior globe flattening.

  16. Quantification of Posterior Globe Flattening: Methodology Development and Validation

    NASA Technical Reports Server (NTRS)

    Lumpkins, S. B.; Garcia, K. M.; Sargsyan, A. E.; Hamilton, D. R.; Berggren, M. D.; Antonsen, E.; Ebert, D.

    2011-01-01

    Microgravity exposure affects visual acuity in a subset of astronauts, and mechanisms may include structural changes in the posterior globe and orbit. Particularly, posterior globe flattening has been implicated in several astronauts. This phenomenon is known to affect some terrestrial patient populations, and has been shown to be associated with intracranial hypertension. It is commonly assessed by magnetic resonance imaging (MRI), computed tomography (CT), or B-mode ultrasound (US), without consistent objective criteria. NASA uses a semi-quantitative scale of 0-3 as part of eye/orbit MRI and US analysis for occupational monitoring purposes. The goal of this study was to initiate development of an objective quantification methodology for posterior globe flattening.

  17. [Quantification in psychiatry: from psychometrics to quantitative psychiatry].

    PubMed

    Pichot, P

    1994-01-01

    The development of quantitative techniques for analysing psychopathological states is reviewed from the XVIIIth century to the present. As far back as the XIXth century, Quetelet, Louis and Galton introduced and advocated the use of quantitative methods in the medical and psychological sciences. The advent of psychometry dates back to 1905, when Alfred Binet published his Intelligence Scale. The construction of instruments such as the Wechsler and MMPI scales in the 1940s marked the beginning of psychometry's use in psychiatry. At the end of World War II, historical factors (the selection and guidance of military recruits), in conjunction with technical advances (the beginnings of psychopharmacology, the development of multivariate statistics and the arrival of the first computers), favoured the growth of quantitative psychopathology, which subsequently followed four distinct courses: 1. psychometry proper; 2. symptom-quantifying assessment scales such as the BPRS or Hamilton scales; 3. new nosological models constructed using quantified psychopathological data and mathematical procedures; 4. diagnostic systems relying on operationalized criteria based on psychopathological quantification, such as DSM-III.

  18. A novel definition for quantification of mode shape complexity

    NASA Astrophysics Data System (ADS)

    Koruk, Hasan; Sanliturk, Kenan Y.

    2013-07-01

    Complex mode shapes are quite often encountered in structural dynamics. However, there is no universally accepted parameter for the quantification of mode shape complexity. After reviewing the existing methods, a novel approach is proposed in this paper in order to quantify mode shape complexity for general structures. The new parameter proposed in this paper is based on conservation of energy principle when a structure is vibrating at a specific mode during a period of vibration. The levels of complexity of the individual mode shapes of a sample structure are then quantified using the proposed new parameter and the other parameters available in the literature. The corresponding results are compared, the validity and the generality of the new parameter are demonstrated for various damping scenarios.
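    The abstract does not give the new parameter's formula, but one of the established indicators such a parameter is typically compared against can be stated compactly: the modal phase collinearity (MPC), which equals 1 for a purely real mode shape and decreases as phase scatter grows. The Python sketch below implements a common mean-centered form of MPC; the test mode shapes are synthetic assumptions.

    import numpy as np

    def modal_phase_collinearity(phi: np.ndarray) -> float:
        """MPC in [0, 1]: 1 for a collinear (effectively real) mode shape,
        decreasing as the phase scatter of the components grows."""
        re = phi.real - phi.real.mean()
        im = phi.imag - phi.imag.mean()
        sxx, syy, sxy = re @ re, im @ im, re @ im
        return ((sxx - syy) ** 2 + 4 * sxy ** 2) / (sxx + syy) ** 2

    rng = np.random.default_rng(9)
    near_real = rng.standard_normal(20) * np.exp(1j * 0.02 * rng.standard_normal(20))
    scattered = rng.standard_normal(20) * np.exp(1j * rng.uniform(0, np.pi, 20))
    print(f"MPC, near-real mode: {modal_phase_collinearity(near_real):.3f}")
    print(f"MPC, complex mode:   {modal_phase_collinearity(scattered):.3f}")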

  19. Quantification of chicken anaemia virus by competitive polymerase chain reaction.

    PubMed

    Yamaguchi, S; Kaji, N; Munang'andu, H M; Kojima, C; Mase, M; Tsukamoto, K

    2000-08-01

    A quantitative method for chicken anaemia virus (CAV) was developed using competitive polymerase chain reaction (PCR). A competitive template was constructed by deletion of 33 nucleotides from a wild-type DNA clone of CAV. Quantification of CAV DNA molecules by the competitive PCR was rapid and highly reproducible when compared with conventional infectivity titration methods. The ratios of viral DNA molecules to infectivity titres in MDCC-MSB1 cells varied between 1.3 and 3.55 log10 among several isolates, suggesting that isolates differ in their efficiency of infection of MDCC-MSB1 cells. The competitive PCR will be useful for studying CAV infection in vivo and/or in vitro.

  1. Raman spectroscopy for DNA quantification in cell nucleus.

    PubMed

    Okotrub, K A; Surovtsev, N V; Semeshin, V F; Omelyanchuk, L V

    2015-01-01

    Here we demonstrate the feasibility of a novel approach to quantify DNA in cell nuclei. This approach is based on spectroscopic analysis of Raman light scattering and avoids the problem of nonstoichiometric binding of dyes to DNA, as it directly measures the signal from DNA. Quantitative analysis of the nuclear DNA contribution to the Raman spectrum could be reliably performed using the intensity of the phosphate mode at 1096 cm⁻¹. When compared with known DNA standards from cells of different animals, our results matched those values within an error of 10%. We therefore suggest that this approach will be useful for expanding the list of DNA standards, properly adjusting the duration of hydrolysis in Feulgen staining, assaying the applicability of fuchsines for DNA quantification, and measuring DNA content in cells with complex hydrolysis patterns, when Feulgen densitometry is inappropriate.

  2. Sensitive quantification of somatic mutations using molecular inversion probes.

    PubMed

    Hirani, Rena; Connolly, Ashley R; Putral, Lisa; Dobrovic, Alexander; Trau, Matt

    2011-11-01

    Somatic mutations in DNA can serve as cancer specific biomarkers and are increasingly being used to direct treatment. However, they can be difficult to detect in tissue biopsies because there is often only a minimal amount of sample and the mutations are often masked by the presence of wild type alleles from nontumor material in the sample. To facilitate the sensitive and specific analysis of DNA mutations in tissues, a multiplex assay capable of detecting nucleotide changes in less than 150 cells was developed. The assay extends the application of molecular inversion probes to enable sensitive discrimination and quantification of nucleotide mutations that are present in less than 0.1% of a cell population. The assay was characterized by detecting selected mutations in the KRAS gene, which has been implicated in up to 25% of all cancers. These mutations were detected in a single multiplex assay by incorporating the rapid flow cytometric readout of multiplexable DNA biosensors.

  3. Cytofluorometric Quantification of Cell Death Elicited by NLR Proteins.

    PubMed

    Sica, Valentina; Manic, Gwenola; Kroemer, Guido; Vitale, Ilio; Galluzzi, Lorenzo

    2016-01-01

    Nucleotide-binding domain and leucine-rich repeat containing (NLR) proteins, also known as NOD-like receptors, are critical components of the molecular machinery that senses intracellular danger signals to initiate an innate immune response against invading pathogens or endogenous sources of hazard. The best characterized effect of NLR signaling is the secretion of various cytokines with immunostimulatory effects, including interleukin (IL)-1β and IL-18. Moreover, at least under specific circumstances, NLRs can promote regulated variants of cell death. Here, we detail two protocols for the cytofluorometric quantification of cell death-associated parameters that can be conveniently employed to assess the lethal activity of specific NLRs or their ligands.

  5. A surrogate accelerated multicanonical Monte Carlo method for uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Wu, Keyi; Li, Jinglai

    2016-09-01

In this work we consider a class of uncertainty quantification problems where the system performance or reliability is characterized by a scalar parameter y. The performance parameter y is random due to the presence of various sources of uncertainty in the system, and our goal is to estimate the probability density function (PDF) of y. We propose to use the multicanonical Monte Carlo (MMC) method, a special type of adaptive importance sampling algorithm, to compute the PDF of interest. Moreover, we develop an adaptive algorithm to construct local Gaussian process surrogates to further accelerate the MMC iterations. With numerical examples we demonstrate that the proposed method can achieve several orders of magnitude of speedup over standard Monte Carlo methods.
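
To make the setting concrete, here is a minimal sketch (not the authors' algorithm) of the plain Monte Carlo baseline that MMC accelerates: draw uncertain inputs, evaluate a scalar performance function, and estimate the PDF of y with a kernel density estimate. The performance function and sample sizes are illustrative assumptions; MMC improves on this by adaptively biasing samples toward poorly resolved (e.g., tail) values of y.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Hypothetical scalar performance y = f(x); stands in for the system model.
def performance(x):
    return np.sum(x**2, axis=1)

rng = np.random.default_rng(0)
x = rng.normal(size=(100_000, 3))   # uncertain inputs
y = performance(x)                  # scalar performance parameter

pdf = gaussian_kde(y)               # kernel density estimate of the PDF of y
grid = np.linspace(y.min(), y.max(), 200)
print("estimated density at the mode:", pdf(grid).max())
```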

  6. [Demographic and epidemiological quantification in Balearic hygienism, 1850-1930].

    PubMed

    Pujadas-Mora, Joana-Maria

    2012-01-01

At the end of the 19th century, social medicine promoted the use of quantification as a means to evaluate the health status of populations. In Majorca, hygienists such as the physicians Enric Fajarnés, Bernat Riera, Antoni Mayol and Emili Darder and the civil engineer Eusebi Estada sought a better understanding of health status by considering population growth, the demographic and epidemiological profile and the influence of weather on mortality. These calculations showed that the Balearic population had good health status in comparison with the population of mainland Spain, although less so in the international context. These results were attributed to the benign insular climate, a factor that would also guarantee the success of the proposed public health reforms.

  7. Quantification of hydroxyacetone and glycolaldehyde using chemical ionization mass spectrometry

    NASA Astrophysics Data System (ADS)

    Spencer, K. M.; Beaver, M. R.; St. Clair, J. M.; Crounse, J. D.; Paulot, F.; Wennberg, P. O.

    2011-08-01

    Chemical ionization mass spectrometry (CIMS) enables online, fast, in situ detection and quantification of hydroxyacetone and glycolaldehyde. Two different CIMS approaches are demonstrated employing the strengths of single quadrupole mass spectrometry and triple quadrupole (tandem) mass spectrometry. Both methods are capable of the measurement of hydroxyacetone, an analyte with minimal isobaric interferences. Tandem mass spectrometry provides direct separation of the isobaric compounds glycolaldehyde and acetic acid using distinct, collision-induced dissociation daughter ions. Measurement of hydroxyacetone and glycolaldehyde by these methods was demonstrated during the ARCTAS-CARB 2008 campaign and the BEARPEX 2009 campaign. Enhancement ratios of these compounds in ambient biomass burning plumes are reported for the ARCTAS-CARB campaign. BEARPEX observations are compared to simple photochemical box model predictions of biogenic volatile organic compound oxidation at the site.

  8. Thermostability of Biological Systems: Fundamentals, Challenges, and Quantification

    PubMed Central

    He, Xiaoming

    2011-01-01

This review examines the fundamentals and challenges in engineering and understanding the thermostability of biological systems over a wide temperature range (from the cryogenic to the hyperthermic regime). Applications of bio-thermostability engineering to either destroy unwanted or stabilize useful biologicals for the treatment of diseases in modern medicine are first introduced. Studies on the biological responses to cryogenic and hyperthermic temperatures for the various applications are reviewed to understand the mechanism of thermal (both cryogenic and hyperthermic) injury and its quantification at the molecular, cellular and tissue/organ levels. Methods for quantifying the thermophysical processes of the various applications are then summarized, accounting for the effects of blood perfusion, metabolism, water transport across the cell plasma membrane, and phase transitions (both equilibrium and non-equilibrium, such as ice formation and glass transition) of water. The review concludes with a summary of the status quo and future perspectives in engineering the thermostability of biological systems. PMID:21769301

  9. Graphene wrinkling induced by monodisperse nanoparticles: facile control and quantification

    PubMed Central

    Vejpravova, Jana; Pacakova, Barbara; Endres, Jan; Mantlikova, Alice; Verhagen, Tim; Vales, Vaclav; Frank, Otakar; Kalbac, Martin

    2015-01-01

Controlled wrinkling of single-layer graphene (1-LG) at the nanometer scale was achieved by introducing monodisperse nanoparticles (NPs), with size comparable to the strain coherence length, underneath the 1-LG. A typical fingerprint of the delaminated fraction is identified as a substantial contribution to the principal Raman modes of the 1-LG (G and G'). Correlation analysis of the Raman shifts of the G and G' modes clearly resolved the 1-LG in contact with, and delaminated from, the substrate. The intensity of the Raman features of the delaminated 1-LG increases linearly with the amount of wrinkling, as determined by advanced processing of atomic force microscopy data. Our study thus offers a universal approach for both fine tuning and facile quantification of graphene topography up to ~60% wrinkling. PMID:26530787

  10. Experimental investigations for uncertainty quantification in brake squeal analysis

    NASA Astrophysics Data System (ADS)

    Renault, A.; Massa, F.; Lallemand, B.; Tison, T.

    2016-04-01

The aim of this paper is to improve the correlation between experimental and numerical predictions of unstable frequencies for automotive brake systems under uncertainty. First, an experimental quantification of uncertainty and a discussion analysing the contributions of uncertainty to a numerical squeal simulation are proposed. Frequency and transient simulations are performed considering nominal values of model parameters, determined experimentally. The obtained results are compared with those derived from experimental tests to highlight the limitations of deterministic simulations. The effects of the different kinds of uncertainty detected in the working conditions of a brake system (the pad boundary condition, the brake system material properties and the pad surface topography) are discussed by defining different classes of unstable modes. Finally, a correlation between experimental and numerical results considering uncertainty is successfully proposed for an industrial brake system. Results from the different comparisons also reveal a major influence of the pad topography and, consequently, of the contact distribution.

  11. Expert judgement and uncertainty quantification for climate change

    NASA Astrophysics Data System (ADS)

    Oppenheimer, Michael; Little, Christopher M.; Cooke, Roger M.

    2016-05-01

    Expert judgement is an unavoidable element of the process-based numerical models used for climate change projections, and the statistical approaches used to characterize uncertainty across model ensembles. Here, we highlight the need for formalized approaches to unifying numerical modelling with expert judgement in order to facilitate characterization of uncertainty in a reproducible, consistent and transparent fashion. As an example, we use probabilistic inversion, a well-established technique used in many other applications outside of climate change, to fuse two recent analyses of twenty-first century Antarctic ice loss. Probabilistic inversion is but one of many possible approaches to formalizing the role of expert judgement, and the Antarctic ice sheet is only one possible climate-related application. We recommend indicators or signposts that characterize successful science-based uncertainty quantification.

  12. From Quantification to Visualization: A Taxonomy of Uncertainty Visualization Approaches

    PubMed Central

    Potter, Kristin; Rosen, Paul; Johnson, Chris R.

    2014-01-01

    Quantifying uncertainty is an increasingly important topic across many domains. The uncertainties present in data come with many diverse representations having originated from a wide variety of disciplines. Communicating these uncertainties is a task often left to visualization without clear connection between the quantification and visualization. In this paper, we first identify frequently occurring types of uncertainty. Second, we connect those uncertainty representations to ones commonly used in visualization. We then look at various approaches to visualizing this uncertainty by partitioning the work based on the dimensionality of the data and the dimensionality of the uncertainty. We also discuss noteworthy exceptions to our taxonomy along with future research directions for the uncertainty visualization community. PMID:25663949

  13. Aspect-Oriented Programming is Quantification and Obliviousness

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.; Friedman, Daniel P.; Norvig, Peter (Technical Monitor)

    2000-01-01

    This paper proposes that the distinguishing characteristic of Aspect-Oriented Programming (AOP) systems is that they allow programming by making quantified programmatic assertions over programs written by programmers oblivious to such assertions. Thus, AOP systems can be analyzed with respect to three critical dimensions: the kinds of quantifications allowed, the nature of the actions that can be asserted, and the mechanism for combining base-level actions with asserted actions. Consequences of this perspective are the recognition that certain systems are not AOP and that some mechanisms are expressive enough to allow programming an AOP system within them. A corollary is that while AOP can be applied to Object-Oriented Programming, it is an independent concept applicable to other programming styles.
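
A toy sketch of "quantification plus obliviousness" in Python (not any specific AOP system): the base functions are written with no knowledge of the aspect, and a quantified assertion ("before every call to any base function, log it") is woven in afterwards. All names are illustrative.

```python
import functools

# "Oblivious" base program: written with no knowledge of the aspect.
def transfer(src, dst, amount):
    return f"moved {amount} from {src} to {dst}"

def balance(account):
    return 42

# Asserted action (advice), applied by a quantified assertion over
# every function in the base-level table.
def logging_advice(fn):
    @functools.wraps(fn)
    def wrapped(*args, **kwargs):
        print(f"[aspect] calling {fn.__name__}{args}")
        return fn(*args, **kwargs)
    return wrapped

base_functions = {"transfer": transfer, "balance": balance}
woven = {name: logging_advice(fn) for name, fn in base_functions.items()}

woven["transfer"]("A", "B", 10)   # the aspect fires; base code is unchanged
```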

  14. [Comparison of methods for quantification of MVOC in indoor environments].

    PubMed

    Fischer, G; Möller, M; Gabrio, T; Palmgren, U; Keller, R; Richter, H; Dott, W; Paul, R

    2005-01-01

For several years now, MVOC have been regarded as indicators of microbial growth in indoor environments. Until now, a direct correlation between the occurrence of microfungi and MVOC could not be shown in scientific investigations. One reason may be that different analytical methods were applied and, moreover, were not sufficiently validated. The present investigation aimed to test the comparability of two methods (Tenax adsorption/thermal desorption and charcoal adsorption/elution). Both methods achieve comparable results if the technical handling of the calibration is standardized to a large extent; thus, the highest demands must be placed on quality assurance. It is therefore necessary to formulate technical regulations for the quantification of MVOC.

  15. Performance and Limitations of Phosphate Quantification: Guidelines for Plant Biologists.

    PubMed

    Kanno, Satomi; Cuyas, Laura; Javot, Hélène; Bligny, Richard; Gout, Elisabeth; Dartevelle, Thibault; Hanchi, Mohamed; Nakanishi, Tomoko M; Thibaud, Marie-Christine; Nussaume, Laurent

    2016-04-01

Phosphate (Pi) is a macronutrient that is essential for plant life. Several regulatory components involved in Pi homeostasis have been identified, revealing a very high complexity at the cellular and subcellular levels. Determining the Pi content in plants is crucial to understanding this regulation, and short real-time (33)Pi uptake imaging experiments have shown Pi movement to be highly dynamic. Furthermore, gene modulation by Pi is finely controlled by localization of this ion at the tissue as well as the cellular and subcellular levels. Deciphering these regulations requires access to and quantification of the Pi pool in the various plant compartments. This review presents the different techniques available to measure, visualize and trace Pi in plants, with a discussion of the future prospects. PMID:26865660

  16. Quantification of surface emissions: An historical perspective from GEIA

    NASA Astrophysics Data System (ADS)

    Granier, C.; Denier Van Der Gon, H.; Doumbia, E. H. T.; Frost, G. J.; Guenther, A. B.; Hassler, B.; Janssens-Maenhout, G. G. A.; Lasslop, G.; Melamed, M. L.; Middleton, P.; Sindelarova, K.; Tarrason, L.; van Marle, M.; W Kaiser, J.; van der Werf, G.

    2015-12-01

    Assessments of the composition of the atmosphere and its evolution require accurate knowledge of the surface emissions of atmospheric compounds. The first community development of global surface emissions started in 1990, when GEIA was established as a component of the International Global Atmospheric Chemistry (IGAC) project. At that time, GEIA meant "Global Emissions Inventory Activity". Since its inception, GEIA has brought together people to understand emissions from anthropogenic, biomass burning and natural sources. The first goal of GEIA was to establish a "best" inventory for the base year 1985 at 1x1 degree resolution. Since then many inventories have been developed by various groups at the global and regional scale at different temporal and spatial resolutions. GEIA, which now means the "Global Emissions Initiative", has evolved into assessing, harmonizing and distributing emissions datasets. We will review the main achievements of GEIA, and show how the development and evaluation of surface emissions has evolved during the last 25 years. We will discuss the use of surface, in-situ and remote sensing observations to evaluate and improve the quantification of emissions. We will highlight the main uncertainties currently limiting emissions datasets, such as the spatial and temporal evolution of emissions at different resolutions, the quantification of emerging emission sources (such as oil/gas extraction and distribution, biofuels, etc.), the speciation of the emissions of volatile organic compounds and of particulate matter, the capacity building necessary for organizing the development of regional emissions across the world, emissions from shipping, etc. We will present the ECCAD (Emissions of Atmospheric Compounds and Compilation of Ancillary Data) database, developed as part of GEIA to facilitate the access and evaluation of emission inventories.

  17. Rapid quantification method for Legionella pneumophila in surface water.

    PubMed

    Wunderlich, Anika; Torggler, Carmen; Elsässer, Dennis; Lück, Christian; Niessner, Reinhard; Seidel, Michael

    2016-03-01

    World-wide legionellosis outbreaks caused by evaporative cooling systems have shown that there is a need for rapid screening methods for Legionella pneumophila in water. Antibody-based methods for the quantification of L. pneumophila are rapid, non-laborious, and relatively cheap but not sensitive enough for establishment as a screening method for surface and drinking water. Therefore, preconcentration methods have to be applied in advance to reach the needed sensitivity. In a basic test, monolithic adsorption filtration (MAF) was used as primary preconcentration method that adsorbs L. pneumophila with high efficiency. Ten-liter water samples were concentrated in 10 min and further reduced to 1 mL by centrifugal ultrafiltration (CeUF). The quantification of L. pneumophila strains belonging to the monoclonal subtype Bellingham was performed via flow-based chemiluminescence sandwich microarray immunoassays (CL-SMIA) in 36 min. The whole analysis process takes 90 min. A polyclonal antibody (pAb) against L. pneumophila serogroup 1-12 and a monoclonal antibody (mAb) against L. pneumophila SG 1 strain Bellingham were immobilized on a microarray chip. Without preconcentration, the detection limit was 4.0 × 10(3) and 2.8 × 10(3) CFU/mL determined by pAb and mAb 10/6, respectively. For samples processed by MAF-CeUF prior to SMIA detection, the limit of detection (LOD) could be decreased to 8.7 CFU/mL and 0.39 CFU/mL, respectively. A recovery of 99.8 ± 15.9% was achieved for concentrations between 1-1000 CFU/mL. The established combined analytical method is sensitive for rapid screening of surface and drinking water to allow fast hygiene control of L. pneumophila. PMID:26873217

  18. Quantification of the genetic risk of environmental mutagens

    SciTech Connect

    Ehling, U.H.

    1988-03-01

Screening methods are used for hazard identification. Assays for heritable mutations in mammals are used for the confirmation of short-term test results and for the quantification of the genetic risk. There are two main approaches to making genetic risk estimates. One of these, termed the direct method, expresses risk in terms of the expected frequency of genetic changes induced per unit dose. The other, referred to as the doubling dose method or the indirect method, expresses risk in relation to the observed incidence of genetic disorders now present in man. The indirect method uses experimental data only for the calculation of the doubling dose. The quality of the risk estimate depends on the assumed persistence of the induced mutations and the ability to determine the current incidence of genetic diseases. The difficulty of improving the estimates of the current incidence of genetic diseases, or of the persistence of the genes in the population, led to the development of an alternative method, the direct estimation of the genetic risk. The direct estimation uses experimental data on the induced frequency of dominant mutations in mice. For verification of these quantifications one can use the data of Hiroshima and Nagasaki. According to the estimate obtained with the direct method, one would expect fewer than 1 radiation-induced dominant cataract among 19,000 children with one or both parents exposed. The expected overall frequency of dominant mutations in the first generation would be 20-25, based on radiation-induced dominant cataract mutations. It is estimated that 10 times more recessive than dominant mutations are induced. The same approaches can be used to determine the impact of chemical mutagens.
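
The two estimation routes described above reduce to simple arithmetic; the sketch below, with purely illustrative numbers (not the paper's values), contrasts the indirect (doubling dose) and direct calculations.

```python
# Indirect (doubling-dose) method: scale the current incidence of genetic
# disorders by dose / doubling_dose. All numbers are illustrative.
current_incidence = 0.03     # assumed background incidence of dominant disorders
doubling_dose = 1.0          # assumed dose that doubles the mutation rate (Sv)
dose = 0.01                  # population exposure under assessment (Sv)

first_gen_increase = current_incidence * dose / doubling_dose
print(f"indirect estimate: +{first_gen_increase:.5f} incidence in generation 1")

# Direct method: expected frequency of induced dominant mutations per unit
# dose, taken from mouse data (the rate below is an assumed placeholder).
induced_rate_per_sv = 20e-6  # dominant mutations per gamete per Sv (assumed)
print(f"direct estimate: {induced_rate_per_sv * dose:.1e} per gamete")
```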

  19. Bayesian Proteoform Modeling Improves Protein Quantification of Global Proteomic Measurements

    SciTech Connect

    Webb-Robertson, Bobbie-Jo M.; Matzke, Melissa M.; Datta, Susmita; Payne, Samuel H.; Kang, Jiyun; Bramer, Lisa M.; Nicora, Carrie D.; Shukla, Anil K.; Metz, Thomas O.; Rodland, Karin D.; Smith, Richard D.; Tardiff, Mark F.; McDermott, Jason E.; Pounds, Joel G.; Waters, Katrina M.

    2014-12-01

As the capability of mass spectrometry-based proteomics has matured, tens of thousands of peptides can be measured simultaneously, which has the benefit of offering a systems view of protein expression. However, a major challenge is that with an increase in throughput, protein quantification estimation from the native measured peptides has become a computational task. A limitation of existing computationally-driven protein quantification methods is that most ignore protein variation, such as alternate splicing of the RNA transcript and post-translational modifications or other possible proteoforms, which will affect a significant fraction of the proteome. The consequence of this assumption is that statistical inference at the protein level, and consequently downstream analyses such as network and pathway modeling, have only limited power for biomarker discovery. Here, we describe a Bayesian model (BP-Quant) that uses statistically derived peptide signatures to identify peptides that are outside the dominant pattern, or the existence of multiple over-expressed patterns, to improve relative protein abundance estimates. It is a research-driven approach that utilizes the objectives of the experiment, defined in the context of a standard statistical hypothesis, to identify a set of peptides exhibiting similar statistical behavior relating to a protein. This approach infers that changes in relative protein abundance can be used as a surrogate for changes in function, without necessarily taking into account the effect of differential post-translational modifications, processing, or splicing in altering protein function. We verify the approach using a dilution study from mouse plasma samples and demonstrate that BP-Quant achieves similar accuracy as the current state-of-the-art methods at proteoform identification with significantly better specificity. BP-Quant is available as MATLAB® and R packages at https://github.com/PNNL-Comp-Mass-Spec/BP-Quant.

  20. Stochastic methods for uncertainty quantification in radiation transport

    SciTech Connect

    Fichtl, Erin D; Prinja, Anil K; Warsa, James S

    2009-01-01

The use of generalized polynomial chaos (gPC) expansions is investigated for uncertainty quantification in radiation transport. The gPC represents second-order random processes in terms of an expansion of orthogonal polynomials of random variables and is used to represent the uncertain input(s) and unknown(s). We assume a single uncertain input, the total macroscopic cross section, although this does not represent a limitation of the approaches considered here. Two solution methods are examined: the Stochastic Finite Element Method (SFEM) and the Stochastic Collocation Method (SCM). The SFEM entails taking Galerkin projections onto the orthogonal basis, which, for fixed source problems, yields a linear system of fully coupled equations for the PC coefficients of the unknown. For k-eigenvalue calculations, the SFEM system is non-linear and a Newton-Krylov method is employed to solve it. The SCM utilizes a suitable quadrature rule to compute the moments or PC coefficients of the unknown(s); thus the SCM solution involves a series of independent deterministic transport solutions. The accuracy and efficiency of the two methods are compared and contrasted. The PC coefficients are used to compute the moments and probability density functions of the unknown(s), which are shown to be accurate by comparison with Monte Carlo results. Our work demonstrates that stochastic spectral expansions are a viable alternative to sampling-based uncertainty quantification techniques since both provide a complete characterization of the distribution of the flux and the k-eigenvalue. Furthermore, it is demonstrated that, unlike perturbation methods, SFEM and SCM can handle large parameter uncertainty.
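
As a concrete illustration of the SCM side, the sketch below computes gPC coefficients of a toy attenuation model with an uncertain cross section by Gauss-Hermite quadrature; each quadrature node is an independent deterministic "solve". The model and truncation order are assumptions, not the paper's transport problem.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval
from math import factorial, sqrt, pi

# Toy "transport solve": attenuation through a unit slab with an uncertain
# total cross section sigma(xi), xi ~ N(0,1). Purely illustrative.
def solve(xi):
    return np.exp(-np.exp(0.3 * xi))

order = 6                            # gPC truncation order
pts, wts = hermegauss(20)            # Gauss-Hermite (probabilists') rule
wts = wts / sqrt(2.0 * pi)           # normalize to the N(0,1) density

u = solve(pts)                       # independent deterministic solves (SCM)

# Project onto Hermite polynomials He_k: c_k = E[u He_k] / k!
coeffs = [np.sum(wts * u * hermeval(pts, [0.0] * k + [1.0])) / factorial(k)
          for k in range(order + 1)]

mean = coeffs[0]
var = sum(factorial(k) * coeffs[k] ** 2 for k in range(1, order + 1))
print(f"mean = {mean:.6f}, variance = {var:.3e}")
```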

  1. Neurostereology protocol for unbiased quantification of neuronal injury and neurodegeneration

    PubMed Central

    Golub, Victoria M.; Brewer, Jonathan; Wu, Xin; Kuruba, Ramkumar; Short, Jenessa; Manchi, Maunica; Swonke, Megan; Younus, Iyan; Reddy, Doodipala Samba

    2015-01-01

Neuronal injury and neurodegeneration are the hallmark pathologies in a variety of neurological conditions such as epilepsy, stroke, traumatic brain injury, Parkinson's disease and Alzheimer's disease. Quantification of absolute neuron and interneuron counts in various brain regions is essential to understand the impact of neurological insults or neurodegenerative disease progression in animal models. However, conventional qualitative scoring-based protocols are superficial and less reliable for use in studies of neuroprotection evaluations. Here, we describe an optimized stereology protocol for quantification of neuronal injury and neurodegeneration by unbiased counting of neurons and interneurons. Every 20th section in each series of 20 sections was processed for NeuN(+) total neuron and parvalbumin(+) interneuron immunostaining. The sections containing the hippocampus were then delineated into five reliably predefined subregions. Each region was separately analyzed with a microscope driven by the stereology software. Regional tissue volume was determined using the Cavalieri estimator, while cell density and cell number were determined using the optical disector and optical fractionator. This protocol yielded an estimate of 1.5 million total neurons and 0.05 million PV(+) interneurons within the rat hippocampus. The protocol has greater predictive power for absolute counts as it is based on 3D features rather than 2D images. The total neuron counts were consistent with literature values from sophisticated systems, which are more expensive than our stereology system. This unbiased stereology protocol allows for sensitive, medium-throughput counting of total neurons in any brain region, and thus provides a quantitative tool for studies of neuronal injury and neurodegeneration in a variety of acute brain injury and chronic neurological models. PMID:26582988
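
The optical fractionator arithmetic underlying such estimates is compact enough to show directly; in the sketch below the section sampling fraction matches the every-20th-section scheme described above, while the counts and other fractions are illustrative.

```python
# Optical fractionator: total number = raw disector count scaled by the
# inverse sampling fractions. Values other than ssf are illustrative.
counted_Q = 950        # neurons counted in disectors (sum of Q-)
ssf = 1 / 20           # section sampling fraction (every 20th section)
asf = 0.05             # area sampling fraction (disector area / grid step area)
tsf = 0.7              # thickness sampling fraction (disector height / thickness)

N_total = counted_Q * (1 / ssf) * (1 / asf) * (1 / tsf)
print(f"estimated total neurons: {N_total:,.0f}")   # about 543,000 here
```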

  2. A General Uncertainty Quantification Methodology for Cloud Microphysical Property Retrievals

    NASA Astrophysics Data System (ADS)

    Tang, Q.; Xie, S.; Chen, X.; Zhao, C.

    2014-12-01

The US Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) program provides long-term (~20 years) ground-based cloud remote sensing observations. However, there are large uncertainties in the retrieval products of cloud microphysical properties based on the active and/or passive remote-sensing measurements. To address this uncertainty issue, a DOE Atmospheric System Research scientific focus study, Quantification of Uncertainties in Cloud Retrievals (QUICR), has been formed. In addition to an overview of recent progress of QUICR, we will demonstrate the capacity of an observation-based general uncertainty quantification (UQ) methodology via the ARM Climate Research Facility baseline cloud microphysical properties (MICROBASE) product. This UQ method utilizes the Karhunen-Loève expansion (KLE) and the Central Limit Theorem (CLT) to quantify the retrieval uncertainties from observations and algorithm parameters. The input perturbations are imposed on major modes to take into account the cross correlations between input data, which greatly reduces the dimension of random variables (up to a factor of 50) and quantifies vertically resolved full probability distribution functions of retrieved quantities. Moreover, this KLE/CLT approach has the capability of attributing the uncertainties in the retrieval output to individual uncertainty sources and thus sheds light on improving the retrieval algorithm and observations. We will present the results of a case study for the ice water content at the Southern Great Plains during an intensive observing period on March 9, 2000. This work is performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
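
A minimal sketch of the dimension-reduction step described above, under an assumed exponential covariance: decompose the input covariance, keep the major modes, and perturb only their amplitudes. It is illustrative, not the MICROBASE implementation.

```python
import numpy as np

# Minimal KLE sketch: expand a correlated input profile in its leading
# eigenmodes and perturb only those. The covariance model is assumed.
n = 100
z = np.linspace(0.0, 1.0, n)
C = np.exp(-np.abs(z[:, None] - z[None, :]) / 0.2)   # exponential covariance

vals, vecs = np.linalg.eigh(C)              # eigenvalues in ascending order
vals, vecs = vals[::-1], vecs[:, ::-1]      # sort descending

k = int(np.searchsorted(np.cumsum(vals) / vals.sum(), 0.99)) + 1
print(f"{k} modes capture 99% of the variance (vs {n} raw variables)")

rng = np.random.default_rng(1)
xi = rng.standard_normal(k)                 # independent N(0,1) amplitudes
sample = vecs[:, :k] @ (np.sqrt(vals[:k]) * xi)   # one correlated perturbation
```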

  3. Quantification of Carbohydrates in Grape Tissues Using Capillary Zone Electrophoresis

    PubMed Central

    Zhao, Lu; Chanon, Ann M.; Chattopadhyay, Nabanita; Dami, Imed E.; Blakeslee, Joshua J.

    2016-01-01

    Soluble sugars play an important role in freezing tolerance in both herbaceous and woody plants, functioning in both the reduction of freezing-induced dehydration and the cryoprotection of cellular constituents. The quantification of soluble sugars in plant tissues is, therefore, essential in understanding freezing tolerance. While a number of analytical techniques and methods have been used to quantify sugars, most of these are expensive and time-consuming due to complex sample preparation procedures which require the derivatization of the carbohydrates being analyzed. Analysis of soluble sugars using capillary zone electrophoresis (CZE) under alkaline conditions with direct UV detection has previously been used to quantify simple sugars in fruit juices. However, it was unclear whether CZE-based methods could be successfully used to quantify the broader range of sugars present in complex plant extracts. Here, we present the development of an optimized CZE method capable of separating and quantifying mono-, di-, and tri-saccharides isolated from plant tissues. This optimized CZE method employs a column electrolyte buffer containing 130 mM NaOH, pH 13.0, creating a current of 185 μA when a separation voltage of 10 kV is employed. The optimized CZE method provides limits-of-detection (an average of 1.5 ng/μL) for individual carbohydrates comparable or superior to those obtained using gas chromatography–mass spectrometry, and allows resolution of non-structural sugars and cell wall components (structural sugars). The optimized CZE method was successfully used to quantify sugars from grape leaves and buds, and is a robust tool for the quantification of plant sugars found in vegetative and woody tissues. The increased analytical efficiency of this CZE method makes it ideal for use in high-throughput metabolomics studies designed to quantify plant sugars. PMID:27379118

  4. Quantification of Carbohydrates in Grape Tissues Using Capillary Zone Electrophoresis.

    PubMed

    Zhao, Lu; Chanon, Ann M; Chattopadhyay, Nabanita; Dami, Imed E; Blakeslee, Joshua J

    2016-01-01

    Soluble sugars play an important role in freezing tolerance in both herbaceous and woody plants, functioning in both the reduction of freezing-induced dehydration and the cryoprotection of cellular constituents. The quantification of soluble sugars in plant tissues is, therefore, essential in understanding freezing tolerance. While a number of analytical techniques and methods have been used to quantify sugars, most of these are expensive and time-consuming due to complex sample preparation procedures which require the derivatization of the carbohydrates being analyzed. Analysis of soluble sugars using capillary zone electrophoresis (CZE) under alkaline conditions with direct UV detection has previously been used to quantify simple sugars in fruit juices. However, it was unclear whether CZE-based methods could be successfully used to quantify the broader range of sugars present in complex plant extracts. Here, we present the development of an optimized CZE method capable of separating and quantifying mono-, di-, and tri-saccharides isolated from plant tissues. This optimized CZE method employs a column electrolyte buffer containing 130 mM NaOH, pH 13.0, creating a current of 185 μA when a separation voltage of 10 kV is employed. The optimized CZE method provides limits-of-detection (an average of 1.5 ng/μL) for individual carbohydrates comparable or superior to those obtained using gas chromatography-mass spectrometry, and allows resolution of non-structural sugars and cell wall components (structural sugars). The optimized CZE method was successfully used to quantify sugars from grape leaves and buds, and is a robust tool for the quantification of plant sugars found in vegetative and woody tissues. The increased analytical efficiency of this CZE method makes it ideal for use in high-throughput metabolomics studies designed to quantify plant sugars.

  5. Quantification and Localization of Mast Cells in Periapical Lesions

    PubMed Central

    Mahita, VN; Manjunatha, BS; Shah, R; Astekar, M; Purohit, S; Kovvuru, S

    2015-01-01

Background: Periapical lesions occur in response to chronic irritation of periapical tissue, generally resulting from an infected root canal. The specific etiological agents of induction, the participating cell populations and the growth factors associated with maintenance and resolution of periapical lesions are incompletely understood. Among the cells found in periapical lesions, mast cells have been implicated in the inflammatory mechanism. Aim: To localize and quantify mast cells in periapical granuloma and radicular cyst and to assess the possible role they play in these lesions. Materials and Methods: A total of 30 previously diagnosed cases, 15 of periapical granuloma and 15 of radicular cyst, were selected for the study together with their case details from the department of oral pathology. The gender distribution was 8 males (53.3%) and 7 females (46.7%) among the periapical granuloma cases, and 10 males (66.7%) and 5 females (33.3%) among the radicular cyst cases. The unpaired t-test was used for statistical analysis. Results: Mean mast cell counts in periapical granuloma were 12.40 (0.99%) in subepithelial and 7.13 (0.83%) in deeper connective tissue. The mean mast cell counts in subepithelial and deeper connective tissue of radicular cyst were 17.64 (1.59%) and 12.06 (1.33%), respectively; the difference was statistically significant. No statistically significant difference was noted between males and females. Conclusion: Mast cells were more numerous in radicular cysts. Based on the concept that mast cells play a critical role in the induction of inflammation, it is logical to use therapeutic agents that alter mast cell function and secretion to thwart inflammation at its earliest phases. These findings suggest a possible role of mast cells in the pathogenesis of periapical lesions. PMID:25861530

  6. Quantification of regional fat volume in rat MRI

    NASA Astrophysics Data System (ADS)

    Sacha, Jaroslaw P.; Cockman, Michael D.; Dufresne, Thomas E.; Trokhan, Darren

    2003-05-01

    Multiple initiatives in the pharmaceutical and beauty care industries are directed at identifying therapies for weight management. Body composition measurements are critical for such initiatives. Imaging technologies that can be used to measure body composition noninvasively include DXA (dual energy x-ray absorptiometry) and MRI (magnetic resonance imaging). Unlike other approaches, MRI provides the ability to perform localized measurements of fat distribution. Several factors complicate the automatic delineation of fat regions and quantification of fat volumes. These include motion artifacts, field non-uniformity, brightness and contrast variations, chemical shift misregistration, and ambiguity in delineating anatomical structures. We have developed an approach to deal practically with those challenges. The approach is implemented in a package, the Fat Volume Tool, for automatic detection of fat tissue in MR images of the rat abdomen, including automatic discrimination between abdominal and subcutaneous regions. We suppress motion artifacts using masking based on detection of implicit landmarks in the images. Adaptive object extraction is used to compensate for intensity variations. This approach enables us to perform fat tissue detection and quantification in a fully automated manner. The package can also operate in manual mode, which can be used for verification of the automatic analysis or for performing supervised segmentation. In supervised segmentation, the operator has the ability to interact with the automatic segmentation procedures to touch-up or completely overwrite intermediate segmentation steps. The operator's interventions steer the automatic segmentation steps that follow. This improves the efficiency and quality of the final segmentation. Semi-automatic segmentation tools (interactive region growing, live-wire, etc.) improve both the accuracy and throughput of the operator when working in manual mode. The quality of automatic segmentation has been

  7. Respiratory Mucosal Proteome Quantification in Human Influenza Infections

    PubMed Central

    Marion, Tony; Elbahesh, Husni; Thomas, Paul G.; DeVincenzo, John P.; Webby, Richard; Schughart, Klaus

    2016-01-01

Respiratory influenza virus infections represent a serious threat to human health. Underlying medical conditions and genetic make-up predispose some influenza patients to more severe forms of disease. To date, only a few studies have been performed in patients to correlate a selected group of cytokines and chemokines with influenza infection. Therefore, we evaluated the potential of a novel multiplex micro-proteomics technology, SOMAscan, to quantify proteins in the respiratory mucosa of influenza A and B infected individuals. The analysis included but was not limited to quantification of cytokines and chemokines detected in previous studies. SOMAscan quantified more than 1,000 secreted proteins in small nasal wash volumes from infected and healthy individuals. Our results illustrate the utility of micro-proteomic technology for analysis of proteins in small volumes of respiratory mucosal samples. Furthermore, when we compared nasal wash samples from influenza-infected patients with viral load ≥ 2(8) and increased IL-6 and CXCL10 to healthy controls, we identified 162 differentially-expressed proteins between the two groups. This number greatly exceeds the number of DEPs identified in previous studies in human influenza patients. Most of the identified proteins were associated with the host immune response to infection, and changes in protein levels of 151 of the DEPs were significantly correlated with viral load. Most important, SOMAscan identified differentially expressed proteins heretofore not associated with respiratory influenza infection in humans. Our study is the first report for the use of SOMAscan to screen nasal secretions. It establishes a precedent for micro-proteomic quantification of proteins that reflect ongoing response to respiratory infection. PMID:27088501

  8. Respiratory Mucosal Proteome Quantification in Human Influenza Infections.

    PubMed

    Marion, Tony; Elbahesh, Husni; Thomas, Paul G; DeVincenzo, John P; Webby, Richard; Schughart, Klaus

    2016-01-01

    Respiratory influenza virus infections represent a serious threat to human health. Underlying medical conditions and genetic make-up predispose some influenza patients to more severe forms of disease. To date, only a few studies have been performed in patients to correlate a selected group of cytokines and chemokines with influenza infection. Therefore, we evaluated the potential of a novel multiplex micro-proteomics technology, SOMAscan, to quantify proteins in the respiratory mucosa of influenza A and B infected individuals. The analysis included but was not limited to quantification of cytokines and chemokines detected in previous studies. SOMAscan quantified more than 1,000 secreted proteins in small nasal wash volumes from infected and healthy individuals. Our results illustrate the utility of micro-proteomic technology for analysis of proteins in small volumes of respiratory mucosal samples. Furthermore, when we compared nasal wash samples from influenza-infected patients with viral load ≥ 2(8) and increased IL-6 and CXCL10 to healthy controls, we identified 162 differentially-expressed proteins between the two groups. This number greatly exceeds the number of DEPs identified in previous studies in human influenza patients. Most of the identified proteins were associated with the host immune response to infection, and changes in protein levels of 151 of the DEPs were significantly correlated with viral load. Most important, SOMAscan identified differentially expressed proteins heretofore not associated with respiratory influenza infection in humans. Our study is the first report for the use of SOMAscan to screen nasal secretions. It establishes a precedent for micro-proteomic quantification of proteins that reflect ongoing response to respiratory infection.

  9. Quantification of blood flow and topology in developing vascular networks.

    PubMed

    Kloosterman, Astrid; Hierck, Beerend; Westerweel, Jerry; Poelma, Christian

    2014-01-01

    Since fluid dynamics plays a critical role in vascular remodeling, quantification of the hemodynamics is crucial to gain more insight into this complex process. Better understanding of vascular development can improve prediction of the process, and may eventually even be used to influence the vascular structure. In this study, a methodology to quantify hemodynamics and network structure of developing vascular networks is described. The hemodynamic parameters and topology are derived from detailed local blood flow velocities, obtained by in vivo micro-PIV measurements. The use of such detailed flow measurements is shown to be essential, as blood vessels with a similar diameter can have a large variation in flow rate. Measurements are performed in the yolk sacs of seven chicken embryos at two developmental stages between HH 13+ and 17+. A large range of flow velocities (1 µm/s to 1 mm/s) is measured in blood vessels with diameters in the range of 25-500 µm. The quality of the data sets is investigated by verifying the flow balances in the branching points. This shows that the quality of the data sets of the seven embryos is comparable for all stages observed, and the data is suitable for further analysis with known accuracy. When comparing two subsequently characterized networks of the same embryo, vascular remodeling is observed in all seven networks. However, the character of remodeling in the seven embryos differs and can be non-intuitive, which confirms the necessity of quantification. To illustrate the potential of the data, we present a preliminary quantitative study of key network topology parameters and we compare these with theoretical design rules.
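
Verifying flow balance at a branching point, as done above for quality control, amounts to checking that Q = v_mean * pi * r^2 is conserved across the branch; the velocities and diameters below are illustrative, not measured values.

```python
import numpy as np

def flow_rate(v_mean_um_s, diameter_um):
    """Volumetric flow rate in µm^3/s from mean velocity and vessel diameter."""
    r = diameter_um / 2.0
    return v_mean_um_s * np.pi * r**2

q_parent = flow_rate(500.0, 100.0)
q_daughters = flow_rate(500.0, 70.0) + flow_rate(500.0, 70.0)
imbalance = abs(q_parent - q_daughters) / q_parent
print(f"flow imbalance at the branch: {imbalance:.1%}")   # about 2% here
```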

  10. Rapid quantification method for Legionella pneumophila in surface water.

    PubMed

    Wunderlich, Anika; Torggler, Carmen; Elsässer, Dennis; Lück, Christian; Niessner, Reinhard; Seidel, Michael

    2016-03-01

    World-wide legionellosis outbreaks caused by evaporative cooling systems have shown that there is a need for rapid screening methods for Legionella pneumophila in water. Antibody-based methods for the quantification of L. pneumophila are rapid, non-laborious, and relatively cheap but not sensitive enough for establishment as a screening method for surface and drinking water. Therefore, preconcentration methods have to be applied in advance to reach the needed sensitivity. In a basic test, monolithic adsorption filtration (MAF) was used as primary preconcentration method that adsorbs L. pneumophila with high efficiency. Ten-liter water samples were concentrated in 10 min and further reduced to 1 mL by centrifugal ultrafiltration (CeUF). The quantification of L. pneumophila strains belonging to the monoclonal subtype Bellingham was performed via flow-based chemiluminescence sandwich microarray immunoassays (CL-SMIA) in 36 min. The whole analysis process takes 90 min. A polyclonal antibody (pAb) against L. pneumophila serogroup 1-12 and a monoclonal antibody (mAb) against L. pneumophila SG 1 strain Bellingham were immobilized on a microarray chip. Without preconcentration, the detection limit was 4.0 × 10(3) and 2.8 × 10(3) CFU/mL determined by pAb and mAb 10/6, respectively. For samples processed by MAF-CeUF prior to SMIA detection, the limit of detection (LOD) could be decreased to 8.7 CFU/mL and 0.39 CFU/mL, respectively. A recovery of 99.8 ± 15.9% was achieved for concentrations between 1-1000 CFU/mL. The established combined analytical method is sensitive for rapid screening of surface and drinking water to allow fast hygiene control of L. pneumophila.

  11. Absolute protein quantification of the yeast chaperome under conditions of heat shock

    PubMed Central

    Mackenzie, Rebecca J.; Lawless, Craig; Holman, Stephen W.; Lanthaler, Karin; Beynon, Robert J.; Grant, Chris M.; Hubbard, Simon J.

    2016-01-01

Chaperones are fundamental to regulating the heat shock response, mediating protein recovery from thermally induced misfolding and aggregation. Using the QconCAT strategy and selected reaction monitoring (SRM) for absolute protein quantification, we have determined copy-per-cell values for 49 key chaperones in Saccharomyces cerevisiae under conditions of normal growth and heat shock. This work extends a previous chemostat quantification study by including up to five Q-peptides per protein to improve confidence in protein quantification. In contrast to the global proteome profile of S. cerevisiae in response to heat shock, which remains largely unchanged as determined by label-free quantification, many of the chaperones are upregulated, with an average two-fold increase in protein abundance. Interestingly, eight of the significantly upregulated chaperones are direct gene targets of heat shock transcription factor-1. By performing absolute quantification of chaperones under heat stress for the first time, we were able to evaluate the individual protein-level response. Furthermore, these SRM data were used to calibrate label-free quantification values for the proteome in absolute terms, thus improving relative quantification between the two conditions. This study significantly enhances the largely transcriptomic data available in the field and illustrates a more nuanced response at the protein level. PMID:27252046
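
One plausible form of the calibration step mentioned above, sketched with synthetic numbers: fit a log-log regression between label-free intensity and SRM copies per cell on the 49 anchor proteins, then apply it proteome-wide. This is an assumed illustration, not the authors' exact procedure.

```python
import numpy as np

# Fit a log-log calibration on the SRM-anchored proteins, then convert
# any label-free intensity to copies per cell. Data are synthetic.
rng = np.random.default_rng(2)
srm_copies = 10 ** rng.uniform(3, 6, 49)                     # absolute anchors
lf_intensity = 2e-3 * srm_copies ** 0.95 * 10 ** rng.normal(0, 0.1, 49)

slope, intercept = np.polyfit(np.log10(lf_intensity), np.log10(srm_copies), 1)

def copies_per_cell(intensity):
    return 10 ** (intercept + slope * np.log10(intensity))

print(f"intensity 1e2 -> {copies_per_cell(1e2):.3g} copies/cell")
```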

  12. A new objective method for acquisition and quantification of reflex receptive fields.

    PubMed

    Jensen, Michael Brun; Manresa, José Biurrun; Andersen, Ole Kæseler

    2015-03-01

The nociceptive withdrawal reflex (NWR) is a polysynaptic spinal reflex correlated with pain perception. Assessment of this objective physiological measure constitutes the core of existing methods for quantification of reflex receptive fields (RRFs), which however still suffer from a certain degree of subjective involvement. This article proposes a strictly objective methodology for RRF quantification based on automated identification of NWR thresholds (NWR-Ts). Nociceptive withdrawal reflex thresholds were determined for 10 individual stimulation sites using an interleaved up-down staircase method. Reflexes were detected from electromyography by evaluation of interval peak z scores and application of conduction velocity analysis. Reflex receptive field areas were quantified from interpolated mappings of NWR-Ts and compared with existing RRF quantifications. A total of 3 repeated measures were performed in 2 different sessions to evaluate the test-retest reliability of the various quantifications, using coefficients of repeatability (CRs) and hypothetical sample sizes. The novel quantifications based on identification of NWR-Ts showed a similar level of reliability within and between sessions, whereas existing quantifications all demonstrated worse between-session than within-session reliability. The NWR-T-based quantifications required a smaller sample size than any of the existing RRF measures to detect a clinically relevant effect in a crossover study design involving more than 1 session. Of all measures, quantification from mapping of inversed NWR-Ts demonstrated superior reliability both within (CR, 0.25) and between sessions (CR, 0.28). The study presents a more reliable and robust quantification of the RRF to be used as a biomarker of pain hypersensitivity in clinical and experimental research.
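
The mapping step can be sketched as follows: scatter the per-site thresholds over the stimulated surface, interpolate their inverse onto a grid, and integrate the area where sensitivity exceeds a cutoff. Site coordinates, thresholds and the cutoff are illustrative assumptions.

```python
import numpy as np
from scipy.interpolate import griddata

# Interpolate inverse NWR thresholds from 10 stimulation sites onto a grid
# and integrate the highly sensitive area. All values are illustrative.
rng = np.random.default_rng(3)
sites = rng.uniform(0, 1, size=(10, 2))      # normalized site coordinates
nwr_t = rng.uniform(5, 25, size=10)          # thresholds (mA), assumed

gx, gy = np.mgrid[0:1:100j, 0:1:100j]
inv_map = griddata(sites, 1.0 / nwr_t, (gx, gy), method="cubic")

cell_area = (1 / 99) ** 2                    # one grid cell, normalized units
rrf_area = np.nansum(inv_map > 0.5 * np.nanmax(inv_map)) * cell_area
print(f"RRF area (fraction of mapped surface): {rrf_area:.2f}")
```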

  13. [Remarks on application of acupuncture instruments in acupuncture quantification and normalization studies].

    PubMed

    Liu, Jian; Fan, Xiao-Nong; Wang, Shu; Shi, Xue-Min

    2009-01-01

Quantification of acupuncture manipulation is an important link in acupuncture normalization studies. Because traditional acupuncture manipulations are complex and difficult to quantify, acupuncture instruments provide a new way to quantify manipulations and to normalize acupuncture. It is necessary to increase understanding of the importance of developing acupuncture instruments, to strengthen the study of the related theory and the development of acupuncture measurement instruments and acupuncture imitation instruments, to enlarge their serviceable range, to verify their efficacy, to develop the theory of acupuncture manipulation, and to enrich the study methods of acupuncture normalization, so as to promote the internationalization of acupuncture.

  14. UV-Vis as quantification tool for solubilized lignin following a single-shot steam process.

    PubMed

    Lee, Roland A; Bédard, Charles; Berberi, Véronique; Beauchet, Romain; Lavoie, Jean-Michel

    2013-09-01

In this short communication, UV/Vis spectroscopy was used as an analytical tool for the quantification of lignin concentrations in aqueous media. A significant correlation was found between absorbance and the concentration of lignin in solution. For this study, lignin was produced from different types of biomass (willow, aspen, softwood, canary grass and hemp) using steam processes. Quantification was performed at 212, 225, 237, 270, 280 and 287 nm. UV-Vis quantification of lignin was found suitable for different types of biomass, making it a time-saving analytical method that could serve as a Process Analytical Tool (PAT) in biorefineries using steam processes or comparable approaches.
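
In practice such UV quantification rests on a calibration curve; here is a minimal sketch with synthetic standards at one of the wavelengths listed above (the absorbance readings are assumed, not the paper's data).

```python
import numpy as np

# Calibration-curve sketch: fit absorbance vs known lignin concentrations
# at one wavelength, then read unknowns off the fitted line.
conc_mg_L = np.array([0.0, 10.0, 20.0, 40.0, 80.0])        # standards (assumed)
abs_280nm = np.array([0.002, 0.091, 0.180, 0.362, 0.718])  # synthetic readings

slope, intercept = np.polyfit(conc_mg_L, abs_280nm, 1)

def lignin_mg_L(absorbance):
    return (absorbance - intercept) / slope

print(f"sample at A280 = 0.25 -> {lignin_mg_L(0.25):.1f} mg/L")
```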

  15. Coral Pigments: Quantification Using HPLC and Detection by Remote Sensing

    NASA Technical Reports Server (NTRS)

    Cottone, Mary C.

    1995-01-01

Widespread coral bleaching (loss of pigments of symbiotic dinoflagellates), and the corresponding decline in coral reef health worldwide, mandates the monitoring of coral pigmentation. Samples of the corals Porites compressa and P. lobata were collected from a healthy reef at Puako, Hawaii, and chlorophyll (chl) a, peridinin, and Beta-carotene (Beta-car) were quantified using reverse-phase high performance liquid chromatography (HPLC). Detailed procedures are presented for the extraction of the coral pigments in 90% acetone, and the separation, identification, and quantification of the major zooxanthellar pigments using spectrophotometry and a modification of the HPLC system described by Mantoura and Llewellyn (1983). Beta-apo-8-carotenal was found to be inadequate as an internal standard, due to coelution with chl b and/or chl a allomer in the sample extracts. Improvements are suggested, which may result in better resolution of the major pigments and greater accuracy in quantification. Average concentrations of peridinin, chl a, and Beta-car in corals on the reef were 5.01, 8.59, and 0.29 µg/cm², respectively. Average concentrations of peridinin and Beta-car did not differ significantly between the two coral species sampled; however, the mean chl a concentration in P. compressa specimens (7.81 µg/cm²) was significantly lower than that in P. lobata specimens (9.96 µg/cm²). Chl a concentrations determined spectrophotometrically were significantly higher than those generated through HPLC, suggesting that spectrophotometry overestimates chl a concentrations. The average ratio of chl a-to-peridinin concentrations was 1.90, with a large (53%) coefficient of variation and a significant difference between the two species sampled. Additional data are needed before conclusions can be drawn regarding average pigment concentrations in healthy corals and the consistency of the chl a/peridinin ratio. The HPLC pigment concentration values

  16. Collaborative framework for PIV uncertainty quantification: the experimental database

    NASA Astrophysics Data System (ADS)

    Neal, Douglas R.; Sciacchitano, Andrea; Smith, Barton L.; Scarano, Fulvio

    2015-07-01

    The uncertainty quantification of particle image velocimetry (PIV) measurements has recently become a topic of great interest as shown by the recent appearance of several different methods within the past few years. These approaches have different working principles, merits and limitations, which have been speculated upon in subsequent studies. This paper reports a unique experiment that has been performed specifically to test the efficacy of PIV uncertainty methods. The case of a rectangular jet, as previously studied by Timmins et al (2012) and Wilson and Smith (2013b), is used. The novel aspect of the experiment is simultaneous velocity measurements using two different time-resolved PIV systems and a hot-wire anemometry (HWA) system. The first PIV system, called the PIV measurement system (‘PIV-MS’), is intended for nominal measurements of which the uncertainty is to be evaluated. It is based on a single camera and features a dynamic velocity range (DVR) representative of typical PIV experiments. The second PIV system, called the ‘PIV-HDR’ (high dynamic range) system, features a significantly higher DVR obtained with a higher digital imaging resolution. The hot-wire is placed in close proximity to the PIV measurement domain. The three measurement systems were carefully set to simultaneously measure the flow velocity at the same time and location. The comparison between the PIV-HDR system and the HWA provides an estimate of the measurement precision of the reference velocity for evaluation of the instantaneous error in the measurement system. The discrepancy between the PIV-MS and the reference data provides the measurement error, which is later used to assess the different uncertainty quantification methods proposed in the literature. A detailed comparison of the uncertainty estimation methods based on the present datasets is presented in a second paper from Sciacchitano et al (2015). Furthermore, this database offers the potential to be used for

  17. Multimodality medical image fusion: probabilistic quantification, segmentation, and registration

    NASA Astrophysics Data System (ADS)

    Wang, Yue J.; Freedman, Matthew T.; Xuan, Jian Hua; Zheng, Qinfen; Mun, Seong K.

    1998-06-01

Multimodality medical image fusion is becoming increasingly important in clinical applications; it involves information processing, registration and visualization of interventional and/or diagnostic images obtained from different modalities. This work develops a multimodality medical image fusion technique through probabilistic quantification, segmentation, and registration, based on statistical data mapping, multiple feature correlation, and probabilistic mean ergodic theorems. The goal of image fusion is to geometrically align two or more image areas/volumes so that pixels/voxels representing the same underlying anatomical structure can be superimposed meaningfully. Three steps are involved. To accurately extract the regions of interest, we developed model-supported Bayesian relaxation labeling and integrated edge detection and region growing algorithms to segment the images into objects. After identifying the shift-invariant features (i.e., edge and region information), we provided an accurate and robust registration technique based on matching multiple binary feature images through a site-model-based image re-projection. The image was initially segmented into a specified number of regions. A rough contour can be obtained by delineating and merging some of the segmented regions. We applied region growing and morphological filtering to extract the contour and remove disconnected residual pixels after segmentation. The matching algorithm is implemented as follows: (1) the centroids of PET/CT and MR images are computed and then translated to the center of both images; (2) preliminary registration is performed first to determine an initial range of scaling factors and rotations, and the MR image is then resampled according to the specified parameters; (3) the total binary difference of the corresponding binary maps in both images is calculated for the selected registration parameters, and the final registration is achieved when the
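
Steps (1)-(3) can be sketched compactly: align centroids, then search rotation angles for the one minimizing the binary difference between feature maps. This toy version (translation and rotation only, synthetic masks) illustrates the matching criterion, not the full scaling and re-projection pipeline.

```python
import numpy as np
from scipy import ndimage

def centroid_shift(mask):
    """Translate a binary feature map so its centroid sits at the image center."""
    cy, cx = ndimage.center_of_mass(mask)
    h, w = mask.shape
    return ndimage.shift(mask.astype(float), (h / 2 - cy, w / 2 - cx), order=0)

def register(fixed, moving, angles=np.arange(-20.0, 21.0)):
    """Return the rotation minimizing the total binary difference."""
    f = centroid_shift(fixed) > 0.5
    m = centroid_shift(moving)
    return min(angles, key=lambda a: np.sum(
        (ndimage.rotate(m, a, reshape=False, order=0) > 0.5) ^ f))

# Synthetic demo: a rectangle versus its 7-degree rotation.
img = np.zeros((128, 128))
img[40:90, 50:80] = 1
rot = ndimage.rotate(img, 7, reshape=False, order=0)
print(f"recovered rotation: {register(img, rot):.0f} deg")   # about -7
```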

  18. Development of hydrate risk quantification in oil and gas production

    NASA Astrophysics Data System (ADS)

    Chaudhari, Piyush N.

order to reduce the parametric study that may require a long duration of time using The Colorado School of Mines Hydrate Kinetic Model (CSMHyK). The evolution of the hydrate plugging risk along flowline-riser systems is modeled for steady-state and transient operations, considering the effect of several critical parameters, such as oil-hydrate slip, duration of shut-in, and water droplet size, on a subsea tieback system. This research presents a novel platform for quantification of the hydrate plugging risk, which in turn will play an important role in improving and optimizing current hydrate management strategies. The predictive strength of the hydrate risk quantification and hydrate prediction models will have a significant impact on flow assurance engineering and design with respect to building safe and efficient hydrate management techniques for future deep-water developments.

  19. Optimization of an automatic counting system for the quantification of Staphylococcus epidermidis cells in biofilms.

    PubMed

    Freitas, Ana Isabel; Vasconcelos, Carlos; Vilanova, Manuel; Cerca, Nuno

    2014-07-01

    Biofilm formation is recognized as the main virulence factor in a variety of chronic infections. In vitro evaluation of biofilm formation is often achieved by quantification of viable or total cells. However, these methods depend on biofilm disruption, which is often achieved by vortexing or sonication. In this study, we investigated the effects of sonication on the elimination of Staphylococcus epidermidis cell clusters from biofilms grown over time, and quantification was performed by three distinct analytical techniques. Even when a higher number of sonication cycles was used, some stable cell clusters remained in the samples obtained from 48- and 72-h-old biofilms, interfering with the quantification of sessile bacteria by plate counting. On the other hand, the fluorescence microscopy automatic counting system allowed proper quantification of biofilm samples that had undergone any of the described sonication cycles, suggesting that this is a more accurate method for assessing the cell concentration in S. epidermidis biofilms, especially in mature biofilms.

  20. Best practices for metabolite quantification in drug development: updated recommendation from the European Bioanalysis Forum.

    PubMed

    Timmerman, Philip; Blech, Stefan; White, Stephen; Green, Martha; Delatour, Claude; McDougall, Stuart; Mannens, Geert; Smeraglia, John; Williams, Stephen; Young, Graeme

    2016-06-01

    Metabolite quantification and profiling continues to grow in importance in today's drug development. The guidance provided by the 2008 FDA Metabolites in Safety Testing Guidance and the subsequent ICH M3(R2) Guidance (2009) has led to a more streamlined process to assess metabolite exposures in preclinical and clinical studies in industry. In addition, the European Bioanalysis Forum (EBF) identified an opportunity to refine the strategies on metabolite quantification considering the experience to date with their recommendation paper on the subject dating from 2010 and integrating the recent discussions on the tiered approach to bioanalytical method validation with focus on metabolite quantification. The current manuscript summarizes the discussion and recommendations from a recent EBF Focus Workshop into an updated recommendation for metabolite quantification in drug development.

  1. Quantification of hesperidin in citrus-based foods using a fungal diglycosidase.

    PubMed

    Mazzaferro, Laura S; Breccia, Javier D

    2012-10-15

    A simple enzymatic-spectrophotometric method for hesperidin quantification was developed using a specific fungal enzyme. The method utilises the diglycosidase α-rhamnosyl-β-glucosidase (EC 3.2.1.168) to quantitatively hydrolyse hesperidin to hesperetin, and the latter is measured by its intrinsic absorbance in the UV range at 323 nm. The application of this method to quantify hesperidin in orange (Citrus sinensis) juices was shown to be reliable in comparison with the standard method for flavonoid quantification (high-performance liquid chromatography, HPLC). The enzymatic method has a limit of quantification of 1.8 μM (1.1 mg/L) hesperidin, similar to the limit usually achieved by HPLC. Moreover, it can be applied directly to raw juice, without sample extraction. This eliminates the sample pre-treatment that is mandatory for HPLC, with a consequent reduction in the time required for quantification.
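
    Once hesperidin is hydrolysed to hesperetin, quantification reduces to a Beer-Lambert calculation on the 323 nm absorbance. A minimal sketch; the molar absorptivity below is a placeholder that would in practice come from a hesperetin calibration curve:

```python
# Placeholder molar absorptivity of hesperetin at 323 nm; in practice this
# is calibrated from hesperetin standards (the value below is an assumption).
EPSILON_323 = 2.5e4   # L mol^-1 cm^-1
PATH_CM = 1.0         # cuvette path length

def hesperidin_uM(a323, blank=0.0, dilution=1.0):
    """Beer-Lambert estimate of hesperidin (via released hesperetin), in µM."""
    mol_per_l = (a323 - blank) / (EPSILON_323 * PATH_CM)
    return mol_per_l * 1e6 * dilution
```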

  2. Monte Carlo Simulation for Quantification of Light Transport Features in Apples

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Light interaction with turbid biological materials involves absorption and scattering. Quantitative understanding of light propagation features in the fruit is critical to designing better optical systems for inspection of food quality. This article reports on the quantification of light propagation...
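
    A minimal 1D random-walk sketch of the absorption/scattering sampling that underlies Monte Carlo light transport codes of this kind; real fruit-tissue simulations are 3D and use an anisotropic (e.g., Henyey-Greenstein) phase function, and all parameters here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def photon_depth_and_weight(mu_a, mu_s, max_events=10_000):
    """Trace one photon in a 1D slab: exponential free paths sampled from
    mu_t = mu_a + mu_s; at each event the weight is multiplied by the
    single-scattering albedo (absorption) and a new direction is drawn
    (isotropic here, as a simplification)."""
    mu_t = mu_a + mu_s
    albedo = mu_s / mu_t
    depth, weight, direction = 0.0, 1.0, 1.0
    for _ in range(max_events):
        depth += direction * rng.exponential(1.0 / mu_t)
        if depth < 0.0:          # photon re-emerged at the surface
            break
        weight *= albedo         # deposit the absorbed fraction
        direction = rng.choice([-1.0, 1.0])
        if weight < 1e-4:        # termination threshold (simplified roulette)
            break
    return depth, weight
```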

  3. Detection and quantification of delamination in laminated plates from the phase of appropriate guided wave modes

    NASA Astrophysics Data System (ADS)

    Amjad, Umar; Yadav, Susheel Kumar; Kundu, Tribikram

    2016-01-01

    Applicability of specific Lamb wave modes for delamination detection and quantification in a laminated aluminum plate is investigated. The Lamb modes were generated in the plate using a broadband piezoelectric transducer structured with a rigid electrode. Appropriate excitation frequencies and modes for inspection were selected from theoretical dispersion curves. The sensitivity of antisymmetric and symmetric modes for delamination detection and quantification was investigated using the Hilbert-Huang transform. The mode conversion phenomenon of Lamb waves during progressive delamination is observed. The antisymmetric mode is found to be more reliable for delamination detection and quantification. In this investigation, changes in the phase of guided Lamb wave modes are related to the degree of delamination, unlike most other studies, where the attenuation of the propagating waves is related to the extent of internal damage such as cracks and corrosion. Appropriate features for delamination detection and quantification are extracted from the experimental data.
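
    A sketch of the phase-extraction idea, using the Hilbert transform route to instantaneous phase (the Hilbert-Huang transform additionally decomposes the signal into intrinsic mode functions first, which is omitted here); the feature definition is illustrative, not the paper's exact one:

```python
import numpy as np
from scipy.signal import hilbert

def instantaneous_phase(signal):
    """Unwrapped instantaneous phase of a guided-wave record,
    via the analytic signal."""
    return np.unwrap(np.angle(hilbert(signal)))

def phase_shift_feature(baseline, damaged):
    """Phase difference between a pristine-state record and a record from
    the (possibly delaminated) state; its growth tracks damage extent."""
    return instantaneous_phase(damaged) - instantaneous_phase(baseline)
```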

  4. The Effect of AOP on Software Engineering, with Particular Attention to OIF and Event Quantification

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Filman, Robert; Korsmeyer, David (Technical Monitor)

    2003-01-01

    We consider the impact of Aspect-Oriented Programming on Software Engineering, and, in particular, analyze two AOP systems, one of which does component wrapping and the other, quantification over events, for their software engineering effects.

  5. Relative quantification of biomarkers using mixed-isotope labeling coupled with MS

    PubMed Central

    Chapman, Heidi M; Schutt, Katherine L; Dieter, Emily M; Lamos, Shane M

    2013-01-01

    The identification and quantification of important biomarkers is a critical first step in the elucidation of biological systems. Biomarkers take many forms as cellular responses to stimuli and can be manifested during transcription, translation, and/or metabolic processing. Increasingly, researchers have relied upon mixed-isotope labeling (MIL) coupled with MS to perform relative quantification of biomarkers between two or more biological samples. MIL effectively tags biomarkers of interest for ease of identification and quantification within the mass spectrometer by using isotopic labels that introduce a heavy and light form of the tag. In addition to MIL coupled with MS, a number of other approaches have been used to quantify biomarkers including protein gel staining, enzymatic labeling, metabolic labeling, and several label-free approaches that generate quantitative data from the MS signal response. This review focuses on MIL techniques coupled with MS for the quantification of protein and small-molecule biomarkers. PMID:23157360

  6. In vivo cell tracking and quantification method in adult zebrafish

    NASA Astrophysics Data System (ADS)

    Zhang, Li; Alt, Clemens; Li, Pulin; White, Richard M.; Zon, Leonard I.; Wei, Xunbin; Lin, Charles P.

    2012-03-01

    Zebrafish have become a powerful vertebrate model organism for drug discovery, cancer and stem cell research. A recently developed transparent adult zebrafish, a double pigmentation mutant called casper, provides unparalleled imaging power for in vivo longitudinal analysis of biological processes at an anatomic resolution not readily achievable in murine or other systems. In this paper we introduce an optical method for simultaneous visualization and cell quantification that combines laser scanning confocal microscopy (LSCM) and in vivo flow cytometry (IVFC). The system is designed specifically for non-invasive tracking of both stationary and circulating cells in adult casper zebrafish, under physiological conditions in the same fish over time. The confocal imaging part of the system serves the dual purposes of imaging fish tissue microstructure and acting as a 3D navigation tool to locate a suitable vessel for circulating cell counting. The multi-color, multi-channel instrument allows the detection of multiple cell populations or different tissues or organs simultaneously. We demonstrate initial testing of this novel instrument by imaging vasculature and tracking circulating cells in CD41:GFP/Gata1:DsRed transgenic casper fish, whose thrombocytes/erythrocytes express the green and red fluorescent proteins. Circulating fluorescent cell incidents were recorded and counted repeatedly over time and in different types of vessels. Promising applications in cancer and stem cell research are discussed.
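
    A sketch of the circulating-cell counting step on a fluorescence time trace, assuming bursts are detected as peaks above a robust noise threshold; the threshold and minimum separation defaults are illustrative, not the instrument's actual settings:

```python
import numpy as np
from scipy.signal import find_peaks

def count_cell_events(trace, fs_hz, min_snr=5.0, min_sep_s=0.005):
    """Count fluorescence bursts (cells transiting the probed vessel) as
    peaks above baseline + min_snr * noise sigma (robust MAD estimate)."""
    baseline = np.median(trace)
    sigma = 1.4826 * np.median(np.abs(trace - baseline))  # MAD noise estimate
    peaks, _ = find_peaks(trace,
                          height=baseline + min_snr * sigma,
                          distance=max(1, int(min_sep_s * fs_hz)))
    return len(peaks)
```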

  7. Immobilized Particle Imaging for Quantification of Nano- and Microparticles.

    PubMed

    Cui, Jiwei; Hibbs, Benjamin; Gunawan, Sylvia T; Braunger, Julia A; Chen, Xi; Richardson, Joseph J; Hanssen, Eric; Caruso, Frank

    2016-04-12

    The quantification of nano- and microparticles is critical for diverse applications relying on exact knowledge of the particle concentration. Although many techniques are available for counting particles, there are limitations with regard to counting low-scattering materials and to facile counting in harsh organic solvents. Herein, we introduce an easy and rapid particle counting technique, termed "immobilized particle imaging" (IPI), to quantify fluorescent particles with different compositions (i.e., inorganic or organic), structures (i.e., solid, porous, or hollow), and sizes (50-1000 nm) dispersed in either aqueous or organic solutions. IPI is achieved by immobilizing the particles of interest in a cell matrix-like scaffold (e.g., agarose) and imaging them using standard microscopy techniques. Imaging a defined volume of the immobilized particles allows the particle concentration to be calculated from the count numbers in a fixed volume. IPI provides a general and facile approach to quantifying advanced nano- and microparticles, which may help researchers obtain new insights for different applications (e.g., nanomedicine).
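
    The concentration calculation at the heart of IPI is simple arithmetic: counts in a known imaged volume. A sketch, with the field of view and optical-section depth as assumed inputs:

```python
import numpy as np

def ipi_concentration_per_ml(counts, fov_um, depth_um, dilution=1.0):
    """Particles per mL from IPI-style counts in a fixed imaged volume:
    mean count / (field-of-view area x optical-section depth)."""
    volume_ml = (fov_um ** 2) * depth_um * 1e-12   # 1 mL = 1e12 µm^3
    return np.mean(counts) / volume_ml * dilution

# e.g. five fields of 100 x 100 x 10 µm with ~40 particles each:
# ipi_concentration_per_ml([38, 42, 41, 39, 40], fov_um=100, depth_um=10)
```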

  8. Dynamic control and quantification of bacterial population dynamics in droplets

    PubMed Central

    Huang, Shuqiang; Srimani, Jaydeep K.; Lee, Anna J.; Zhang, Ying; Lopatkin, Allison J.; Leong, Kam W.; You, Lingchong

    2015-01-01

    Culturing and measuring bacterial population dynamics are critical for developing insights into gene regulation and bacterial physiology. Traditional methods, based on bulk culture, have the limitations of a higher cost/volume of reagents, unsuitability for small populations, and more laborious manipulation. To this end, droplet-based microfluidics represents a promising alternative that is cost-effective and high-throughput. However, difficulties in manipulating the droplet environment and monitoring the encapsulated bacterial population in long-term experiments limit its utilization. To overcome these limitations, we used an electrode-free injection technology to modulate the chemical environment in droplets. This ability is critical for precise control of bacterial dynamics in droplets. Moreover, we developed a trapping device for long-term monitoring of population dynamics in individual droplets for at least 240 h. We demonstrated the utility of this new microfluidic system by quantifying the population dynamics of natural and engineered bacteria. Our approach can further improve analysis for systems and synthetic biology in terms of manipulability and temporal resolution. PMID:26005763

  9. Uncertainty quantification in the catalytic partial oxidation of methane

    NASA Astrophysics Data System (ADS)

    Navalho, Jorge E. P.; Pereira, José M. C.; Ervilha, Ana R.; Pereira, José C. F.

    2013-12-01

    This work focuses on uncertainty quantification of eight random parameters required as input for 1D modelling of methane catalytic partial oxidation within a highly dense foam reactor. Parameters related to geometrical properties, reactor thermophysics and catalyst loading are taken as uncertain. A widely applied 1D heterogeneous mathematical model that accounts for proper transport and surface chemistry steps is considered for the evaluation of deterministic samples. The non-intrusive spectral projection approach based on polynomial chaos expansion is applied to determine the stochastic temperature and species profiles along the reactor axial direction as well as their ensemble mean and error bars with a confidence interval of 95%. Probability density functions of relevant variables in specific reactor sections are also analysed. A different contribution is noticed from each random input to the total uncertainty range. Porosity, specific surface area and catalyst loading appear as the major sources of uncertainty to bulk gas and surface temperature and species molar profiles. Porosity and the mean pore diameter have an important impact on the pressure drop along the whole reactor as expected. It is also concluded that any trace of uncertainty in the eight input random variables can be almost dissipated near the catalyst outlet section for a long-enough catalyst, mainly due to the approximation to thermodynamic equilibrium.
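
    As an illustration of non-intrusive spectral projection in its simplest setting, the sketch below computes 1D polynomial chaos coefficients for a model with a single standard-normal input by Gauss-Hermite quadrature; the reactor model itself is of course far richer, and all names here are illustrative:

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval

def nisp_coefficients(model, order=4, n_quad=8):
    """Non-intrusive spectral projection for one standard-normal input xi:
    c_k = E[model(xi) He_k(xi)] / k!, by Gauss-Hermite quadrature."""
    nodes, weights = hermegauss(n_quad)
    weights = weights / np.sqrt(2.0 * np.pi)   # normalize to the N(0,1) measure
    f = model(nodes)
    coeffs = []
    for k in range(order + 1):
        basis = hermeval(nodes, [0.0] * k + [1.0])   # He_k at the nodes
        coeffs.append(np.sum(weights * f * basis) / math.factorial(k))
    return np.array(coeffs)

# Toy model: c[0] is the PC mean; the variance follows from orthogonality.
c = nisp_coefficients(lambda x: np.exp(0.3 * x))
var = sum(math.factorial(k) * c[k] ** 2 for k in range(1, len(c)))
```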

  10. Integrated Assessment Modeling for Carbon Storage Risk and Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Bromhal, G. S.; Dilmore, R.; Pawar, R.; Stauffer, P. H.; Gastelum, J.; Oldenburg, C. M.; Zhang, Y.; Chu, S.

    2013-12-01

    The National Risk Assessment Partnership (NRAP) has developed tools to perform quantitative risk assessment at site-specific locations for long-term carbon storage. The approach is to divide the storage and containment system into components (e.g., reservoirs, seals, wells, groundwater aquifers), to develop detailed models for each component, to generate reduced order models (ROMs) based on the detailed models, and to reconnect the reduced order models within an integrated assessment model (IAM). CO2-PENS, developed at Los Alamos National Lab, is used as the IAM for the simulations in this study. The benefit of this approach is that simulations of the complete system can be generated rapidly enough for Monte Carlo simulation to be performed. In this study, hundreds of thousands of runs of the IAMs have been generated to estimate likelihoods of the quantity of CO2 released to the atmosphere, the size of the aquifer impacted by pH, the size impacted by TDS, and the size with different metal concentrations. Correlations of the output variables with different reservoir, seal, wellbore, and aquifer parameters have been generated. Importance measures have been identified, and inputs have been ranked in order of their impact on the output quantities. The presentation will describe the approach used, representative results, and implications for how the Monte Carlo analysis is implemented for uncertainty quantification.
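
    A minimal sketch of this kind of input ranking, assuming a matrix of sampled input parameters and one risk output per Monte Carlo run; the function and variable names are illustrative, and NRAP's actual importance measures may differ:

```python
import numpy as np
from scipy.stats import spearmanr

def rank_inputs(samples, output, names):
    """Rank uncertain inputs by |Spearman rho| against one risk output
    (e.g., CO2 mass released) across Monte Carlo IAM runs.
    samples: (n_runs, n_inputs) array; output: (n_runs,) array."""
    rhos = [spearmanr(samples[:, j], output).correlation
            for j in range(samples.shape[1])]
    order = np.argsort([-abs(r) for r in rhos])
    return [(names[j], rhos[j]) for j in order]
```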

  11. Uncertainty quantification in nanomechanical measurements using the atomic force microscope.

    PubMed

    Wagner, Ryan; Moon, Robert; Pratt, Jon; Shaw, Gordon; Raman, Arvind

    2011-11-11

    Quantifying uncertainty in measured properties of nanomaterials is a prerequisite for the manufacture of reliable nanoengineered materials and products. Yet, rigorous uncertainty quantification (UQ) is rarely applied for material property measurements with the atomic force microscope (AFM), a widely used instrument that can measure properties at nanometer scale resolution of both inorganic and biological surfaces and nanomaterials. We present a framework to ascribe uncertainty to local nanomechanical properties of any nanoparticle or surface measured with the AFM by taking into account the main uncertainty sources inherent in such measurements. We demonstrate the framework by quantifying uncertainty in AFM-based measurements of the transverse elastic modulus of cellulose nanocrystals (CNCs), an abundant, plant-derived nanomaterial whose mechanical properties are comparable to Kevlar fibers. For a single, isolated CNC the transverse elastic modulus was found to have a mean of 8.1 GPa and a 95% confidence interval of 2.7-20 GPa. A key result is that multiple replicates of force-distance curves do not sample the important sources of uncertainty, which are systematic in nature. The dominant source of uncertainty is the nondimensional photodiode sensitivity calibration rather than the cantilever stiffness or Z-piezo calibrations. The results underscore the great need for, and open a path towards, quantifying and minimizing uncertainty in AFM-based material property measurements of nanoparticles, nanostructured surfaces, thin films, polymers and biomaterials. PMID:21992899

  12. Quantification of osmotic water transport in vivo using fluorescent albumin.

    PubMed

    Morelle, Johann; Sow, Amadou; Vertommen, Didier; Jamar, François; Rippe, Bengt; Devuyst, Olivier

    2014-10-15

    Osmotic water transport across the peritoneal membrane is applied during peritoneal dialysis to remove the excess water accumulated in patients with end-stage renal disease. The discovery of aquaporin water channels and the generation of transgenic animals have stressed the need for novel and accurate methods to unravel molecular mechanisms of water permeability in vivo. Here, we describe the use of fluorescently labeled albumin as a reliable indicator of osmotic water transport across the peritoneal membrane in a well-established mouse model of peritoneal dialysis. After detailed evaluation of intraperitoneal tracer mass kinetics, the technique was validated against direct volumetry, considered as the gold standard. The pH-insensitive dye Alexa Fluor 555-albumin was applied to quantify osmotic water transport across the mouse peritoneal membrane resulting from modulating dialysate osmolality and genetic silencing of the water channel aquaporin-1 (AQP1). Quantification of osmotic water transport using Alexa Fluor 555-albumin closely correlated with direct volumetry and with estimations based on radioiodinated ((125)I) serum albumin (RISA). The low intraperitoneal pressure probably accounts for the negligible disappearance of the tracer from the peritoneal cavity in this model. Taken together, these data demonstrate the appropriateness of pH-insensitive Alexa Fluor 555-albumin as a practical and reliable intraperitoneal volume tracer to quantify osmotic water transport in vivo.

  13. Quantification of Covariance in Tropical Cyclone Activity across Teleconnected Basins

    NASA Astrophysics Data System (ADS)

    Tolwinski-Ward, S. E.; Wang, D.

    2015-12-01

    Rigorous statistical quantification of natural hazard covariance across regions has important implications for risk management, and is also of fundamental scientific interest. We present a multivariate Bayesian Poisson regression model for inferring the covariance in tropical cyclone (TC) counts across multiple ocean basins and across Saffir-Simpson intensity categories. Such covariability results from the influence of large-scale modes of climate variability on local environments that can alternately suppress or enhance TC genesis and intensification, and our model also simultaneously quantifies the covariance of TC counts with various climatic modes in order to deduce the source of inter-basin TC covariability. The model explicitly treats the time-dependent uncertainty in observed maximum sustained wind data, and hence the nominal intensity category of each TC. Differences in annual TC counts as measured by different agencies are also formally addressed. The probabilistic output of the model can be probed for probabilistic answers to questions such as:
    - Does the relationship between different categories of TCs differ statistically by basin?
    - Which climatic predictors have significant relationships with TC activity in each basin?
    - Are the relationships between counts in different basins conditionally independent given the climatic predictors, or are there other factors at play affecting inter-basin covariability?
    - How can a portfolio of insured property be optimized across space to minimize risk?
    Although we present results of our model applied to TCs, the framework is generalizable to covariance estimation between multivariate counts of natural hazards across regions and/or across peril types.
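
    A toy point-estimate version of a Poisson count regression with Gaussian priors, to make the basic model structure concrete; the paper's model is multivariate, fully Bayesian, and handles intensity-category uncertainty, none of which is attempted here:

```python
import numpy as np
from scipy.optimize import minimize

def fit_poisson_counts(X, y, prior_sd=2.0):
    """MAP estimate for log lambda_t = X_t . beta, with independent
    N(0, prior_sd^2) priors on the coefficients. X: (n_years, n_predictors)
    climate-mode covariates (including an intercept column); y: annual TC counts."""
    def neg_log_posterior(beta):
        eta = X @ beta
        # Poisson negative log-likelihood (up to constants) plus Gaussian prior.
        return np.sum(np.exp(eta) - y * eta) + 0.5 * np.sum((beta / prior_sd) ** 2)
    return minimize(neg_log_posterior, np.zeros(X.shape[1]), method="BFGS").x
```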

  14. Relative Quantification of Several Plasma Proteins during Liver Transplantation Surgery

    PubMed Central

    Parviainen, Ville; Joenväärä, Sakari; Tukiainen, Eija; Ilmakunnas, Minna; Isoniemi, Helena; Renkonen, Risto

    2011-01-01

    Plasma proteome is widely used in studying changes occurring in human body during disease or other disturbances. Immunological methods are commonly used in such studies. In recent years, mass spectrometry has gained popularity in high-throughput analysis of plasma proteins. In this study, we tested whether mass spectrometry and iTRAQ-based protein quantification might be used in proteomic analysis of human plasma during liver transplantation surgery to characterize changes in protein abundances occurring during early graft reperfusion. We sampled blood from systemic circulation as well as blood entering and exiting the liver. After immunodepletion of six high-abundant plasma proteins, trypsin digestion, iTRAQ labeling, and cation-exchange fractionation, the peptides were analyzed by reverse phase nano-LC-MS/MS. In total, 72 proteins were identified of which 31 could be quantified in all patient specimens collected. Of these 31 proteins, ten, mostly medium-to-high abundance plasma proteins with a concentration range of 50–2000 mg/L, displayed relative abundance change of more than 10%. The changes in protein abundance observed in this study allow further research on the role of several proteins in ischemia-reperfusion injury during liver transplantation and possibly in other surgery. PMID:22187521

  16. Uranium quantification in semen by inductively coupled plasma mass spectrometry.

    PubMed

    Todorov, Todor I; Ejnik, John W; Guandalini, Gustavo; Xu, Hanna; Hoover, Dennis; Anderson, Larry; Squibb, Katherine; McDiarmid, Melissa A; Centeno, Jose A

    2013-01-01

    In this study we report uranium analysis of human semen samples. Uranium quantification was performed by inductively coupled plasma mass spectrometry. No additives, such as chymotrypsin or bovine serum albumin, were used for semen liquefaction, as they showed significant uranium content. For method validation we spiked 2 g aliquots of pooled control semen at three different levels of uranium: low at 5 pg/g, medium at 50 pg/g, and high at 1000 pg/g. The detection limit was determined to be 0.8 pg/g uranium in human semen. The data were reproducible within 1.4-7% RSD, and spike recoveries were 97-100%. The uranium level of the unspiked, pooled control semen was 2.9 pg/g of semen (n=10). In addition, six semen samples from a cohort of Veterans exposed to depleted uranium (DU) in the 1991 Gulf War were analyzed without knowledge of their exposure history. Uranium levels in the Veterans' semen samples ranged from undetectable (<0.8 pg/g) to 3350 pg/g. This wide concentration range for uranium in semen is consistent with known differences in current DU body burdens in these individuals, some of whom have retained embedded DU fragments.

  17. Uncertainty Quantification for Polynomial Systems via Bernstein Expansions

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2012-01-01

    This paper presents a unifying framework for uncertainty quantification for systems having polynomial response metrics that depend on both aleatory and epistemic uncertainties. The proposed approach, which is based on the Bernstein expansions of polynomials, enables bounding the range of moments and failure probabilities of response metrics as well as finding supersets of the extreme epistemic realizations where the limits of such ranges occur. These bounds and supersets, whose analytical structure renders them free of approximation error, can be made arbitrarily tight with additional computational effort. Furthermore, this framework enables determining the importance of particular uncertain parameters according to the extent to which they affect the first two moments of response metrics and failure probabilities. This analysis enables determining the parameters that should be considered uncertain as well as those that can be assumed to be constants without incurring significant error. The analytical nature of the approach eliminates the numerical error that characterizes the sampling-based techniques commonly used to propagate aleatory uncertainties, as well as the possibility of underpredicting the range of the statistic of interest that may result from searching for the best- and worst-case epistemic values via nonlinear optimization or sampling.
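
    The range-bounding idea can be illustrated in one variable: the Bernstein coefficients of a polynomial on [0,1] bracket its values, so their min/max give guaranteed range bounds that tighten under degree elevation or subdivision. A minimal sketch (not the paper's multivariate machinery):

```python
from math import comb

def bernstein_range_bounds(a):
    """Guaranteed enclosure of p(x) = sum_i a[i] * x**i on [0, 1]:
    the Bernstein coefficients b_k = sum_{i<=k} C(k,i)/C(n,i) * a[i]
    bracket the polynomial, so (min b, max b) encloses its range."""
    n = len(a) - 1
    b = [sum(comb(k, i) / comb(n, i) * a[i] for i in range(k + 1))
         for k in range(n + 1)]
    return min(b), max(b)

# p(x) = 1 - 3x + 2x^2 has true range [-0.125, 1] on [0, 1]:
print(bernstein_range_bounds([1.0, -3.0, 2.0]))  # (-0.5, 1.0), a valid enclosure
```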

  18. Quantification of intracellular payload release from polymersome nanoparticles

    PubMed Central

    Scarpa, Edoardo; Bailey, Joanne L.; Janeczek, Agnieszka A.; Stumpf, Patrick S.; Johnston, Alexander H.; Oreffo, Richard O. C.; Woo, Yin L.; Cheong, Ying C.; Evans, Nicholas D.; Newman, Tracey A.

    2016-01-01

    Polymersome nanoparticles (PMs) are attractive candidates for spatio-temporally controlled delivery of therapeutic agents. Although many studies have addressed cellular uptake of solid nanoparticles, there is very little data available on intracellular release of molecules encapsulated in membranous carriers such as polymersomes. Here, we addressed this by developing a quantitative assay based on the hydrophilic dye fluorescein. Fluorescein was encapsulated stably in PMs of mean diameter 85 nm, with minimal leakage after sustained dialysis. No fluorescence was detectable from fluorescein PMs, indicating quenching. Following incubation of L929 cells with fluorescein PMs, there was a gradual increase in intracellular fluorescence, indicating PM disruption and cytosolic release of fluorescein. By combining absorbance measurements with flow cytometry, we quantified the real-time intracellular release of fluorescein at single-cell resolution. We found that 173 ± 38 polymersomes released their payload per cell, with significant heterogeneity in uptake, despite controlled synchronisation of the cell cycle. This novel method for quantifying the release of compounds from nanoparticles provides fundamental information on cellular uptake of nanoparticle-encapsulated compounds. It also illustrates the stochastic nature of population distribution in homogeneous cell populations, a factor that must be taken into account in clinical use of this technology. PMID:27404770

  19. Spectroscopic detection and quantification of heme and heme degradation products.

    PubMed

    Neugebauer, U; März, A; Henkel, T; Schmitt, M; Popp, J

    2012-12-01

    Heme and heme degradation products play critical roles in numerous biological phenomena which until now have only been partially understood. One reason for this is the very low concentrations at which free heme, its complexes and the partly unstable degradation products occur in living cells. Therefore, powerful and specific detection methods are needed. In this contribution, the potential of nondestructive Raman spectroscopy for the detection, quantification and discrimination of heme and heme degradation products is investigated. Resonance Raman spectroscopy using different excitation wavelengths (413, 476, 532, and 752 nm) is employed to estimate the limit of detection for hemin, myoglobin, biliverdin, and bilirubin. Concentrations in the low micromolar range (down to 3 μmol/L) could be reliably detected when utilizing the resonance enhancement effect. Furthermore, a systematic study on the surface-enhanced Raman spectroscopy (SERS) detection of hemin in the presence of other cellular components, such as the highly similar cytochrome c, DNA, and the important antioxidant glutathione, is presented. A microfluidic device was used to reproducibly create a segmented flow of aqueous droplets and oil compartments. Those aqueous droplets acted as model chambers where the analytes have to compete for the colloid. With the help of statistical analysis, it was possible to detect and differentiate the pure substances as well as the binary mixtures and gain insights into their interaction.

  20. Quantification of HBsAg: basic virology for clinical practice.

    PubMed

    Lee, Jung Min; Ahn, Sang Hoon

    2011-01-21

    Hepatitis B surface antigen (HBsAg) is produced and secreted through a complex mechanism that is still not fully understood. In clinical fields, HBsAg has long served as a qualitative diagnostic marker for hepatitis B virus infection. Notably, advances have been made in the development of quantitative HBsAg assays, which have allowed viral replication monitoring, and there is an opportunity to make maximal use of quantitative HBsAg to elucidate its role in clinical fields. Yet a further understanding of HBsAg, from a virologic as well as a clinical point of view, would deepen our insights and allow its utility to be expanded and applied more widely. It is also important to be familiar with HBsAg variants and their clinical consequences in terms of immune escape mutants, issues resulting from overlap with corresponding mutations in the P gene, and detection problems for the HBsAg variants. In this article, we review current concepts and issues in the quantification of HBsAg titers with respect to their biologic nature, method principles, and clinically relevant topics.

  1. Automated quantification of one-dimensional nanostructure alignment on surfaces

    NASA Astrophysics Data System (ADS)

    Dong, Jianjin; Goldthorpe, Irene A.; Mohieddin Abukhdeir, Nasser

    2016-06-01

    A method for automated quantification of the alignment of one-dimensional (1D) nanostructures from microscopy imaging is presented. Nanostructure alignment metrics are formulated and shown to rigorously quantify the orientational order of nanostructures within a two-dimensional domain (surface). A complementary image processing method is also presented which enables robust processing of microscopy images where overlapping nanostructures might be present. Scanning electron microscopy (SEM) images of nanowire-covered surfaces are analyzed using the presented methods, and it is shown that past single-parameter alignment metrics are insufficient for highly aligned domains. Through the use of multiple-parameter alignment metrics, automated quantitative analysis of SEM images is shown to be possible, and the alignment characteristics of different samples can be quantitatively compared using a similarity metric. The results of this work provide researchers in nanoscience and nanotechnology with a rigorous method for the determination of structure/property relationships, where alignment of 1D nanostructures is significant.
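
    A sketch of orientational order metrics of the kind discussed in this record, computed from a set of measured nanowire angles; the exact metric definitions in the paper may differ:

```python
import numpy as np

def alignment_metrics(theta):
    """Orientational order of 1D nanostructures from in-plane angles theta
    (radians). S2 = |<exp(2i theta)>| is 1 for perfect alignment and near 0
    for random orientations; the higher-order S4 helps resolve differences
    between highly aligned samples, where a single parameter saturates."""
    s2 = np.abs(np.mean(np.exp(2j * theta)))
    s4 = np.abs(np.mean(np.exp(4j * theta)))
    director = 0.5 * np.angle(np.mean(np.exp(2j * theta)))  # mean orientation
    return s2, s4, director
```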

  2. A novel automated image analysis method for accurate adipocyte quantification

    PubMed Central

    Osman, Osman S; Selway, Joanne L; Kępczyńska, Małgorzata A; Stocker, Claire J; O’Dowd, Jacqueline F; Cawthorne, Michael A; Arch, Jonathan RS; Jassim, Sabah; Langlands, Kenneth

    2013-01-01

    Increased adipocyte size and number are associated with many of the adverse effects observed in metabolic disease states. While methods to quantify such changes in the adipocyte are of scientific and clinical interest, manual methods to determine adipocyte size are both laborious and intractable for large-scale investigations. Moreover, existing computational methods are not fully automated. We therefore developed a novel automatic method to provide accurate measurements of the cross-sectional area of adipocytes in histological sections, allowing rapid high-throughput quantification of fat cell size and number. Photomicrographs of H&E-stained paraffin sections of murine gonadal adipose tissue were transformed using standard image processing/analysis algorithms to reduce background and enhance edge detection. This allowed the isolation of individual adipocytes, from which their area could be calculated. Performance was compared with manual measurements made from the same images, in which adipocyte area was calculated from estimates of the major and minor axes of individual adipocytes. Both methods identified an increase in mean adipocyte size in a murine model of obesity, with good concordance, although the calculation used to identify cell area from manual measurements was found to consistently overestimate cell size. Here we report an accurate method to determine adipocyte area in histological sections that provides a considerable time saving over manual methods. PMID:23991362
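
    A compact sketch of such a segmentation pipeline using scikit-image, with illustrative parameter choices; the thresholding and morphological cleanup must be tuned to the actual staining and magnification, and this is not the paper's exact algorithm:

```python
import numpy as np
from skimage import io, color, filters, morphology, measure

def adipocyte_areas_um2(path, um_per_px=1.0, min_area_px=100):
    """Segment adipocytes in an H&E photomicrograph and return their
    cross-sectional areas in µm^2. Lipid-cleared adipocyte interiors
    appear bright, so Otsu thresholding isolates them."""
    gray = color.rgb2gray(io.imread(path))
    mask = gray > filters.threshold_otsu(gray)          # bright interiors
    mask = morphology.binary_opening(mask, morphology.disk(3))
    mask = morphology.remove_small_objects(mask, min_size=min_area_px)
    labels = measure.label(mask)
    areas_px = np.array([r.area for r in measure.regionprops(labels)])
    return areas_px * um_per_px ** 2
```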

  3. Interactive image quantification tools in nuclear material forensics

    SciTech Connect

    Porter, Reid B; Ruggiero, Christy; Hush, Don; Harvey, Neal; Kelly, Pat; Scoggins, Wayne; Tandon, Lav

    2011-01-03

    Morphological and microstructural features visible in microscopy images of nuclear materials can give information about the processing history of a nuclear material. Extraction of these attributes currently requires a subject matter expert in both microscopy and nuclear material production processes, and is a time consuming, and at least partially manual task, often involving multiple software applications. One of the primary goals of computer vision is to find ways to extract and encode domain knowledge associated with imagery so that parts of this process can be automated. In this paper we describe a user-in-the-loop approach to the problem which attempts to both improve the efficiency of domain experts during image quantification as well as capture their domain knowledge over time. This is accomplished through a sophisticated user-monitoring system that accumulates user-computer interactions as users exploit their imagery. We provide a detailed discussion of the interactive feature extraction and segmentation tools we have developed and describe our initial results in exploiting the recorded user-computer interactions to improve user productivity over time.

  4. Mesh refinement for uncertainty quantification through model reduction

    SciTech Connect

    Li, Jing; Stinis, Panos

    2015-01-01

    We present a novel way of deciding when and where to refine a mesh in probability space in order to facilitate uncertainty quantification in the presence of discontinuities in random space. A discontinuity in random space makes the application of generalized polynomial chaos expansion techniques prohibitively expensive because, for discontinuous problems, the expansion converges very slowly. An alternative to using higher-order terms in the expansion is to divide the random space into smaller elements where a lower-degree polynomial is adequate to describe the randomness. In general, the partition of the random space is a dynamic process, since some areas of the random space, particularly around the discontinuity, need more refinement than others as time evolves. In the current work we propose a way to decide when and where to refine the random-space mesh based on the use of a reduced model. The idea is that a good reduced model can monitor accurately, within a random-space element, the cascade of activity to higher-degree terms in the chaos expansion. In turn, this facilitates the efficient allocation of computational resources to the areas of random space where they are most needed. For the Kraichnan–Orszag system, the prototypical system for studying discontinuities in random space, we present theoretical results which show why the proposed method is sound and numerical results which corroborate the theory.

  5. Experimental model for civilian ballistic brain injury biomechanics quantification.

    PubMed

    Zhang, Jiangyue; Yoganandan, Narayan; Pintar, Frank A; Guan, Yabo; Gennarelli, Thomas A

    2007-01-01

    Biomechanical quantification of projectile penetration using experimental head models can enhance the understanding of civilian ballistic brain injury and advance treatment. Two of the most commonly used handgun projectiles (25-cal, 275 m/s and 9 mm, 395 m/s) were discharged into spherical head models with gelatin and Sylgard simulants. Four ballistic pressure transducers recorded temporal pressure distributions at 308 kHz, and temporal cavity dynamics were captured at 20,000 frames per second using high-speed digital video. Pressures ranged from 644.6 to -92.8 kPa. Entry pressures in gelatin models were higher than exit pressures, whereas in Sylgard models entry pressures were lower than or equivalent to exit pressures. Gelatin responded with brittle-type failure, while Sylgard demonstrated a ductile pattern through the formation of micro-bubbles along the projectile path. Temporary cavities in Sylgard models were 1.5-2 times larger than in gelatin models. Pressures in Sylgard models were more sensitive to increases in projectile velocity and diameter, indicating that Sylgard was more rate-sensitive than gelatin. Based on failure patterns and the rate-sensitive characteristics of brain tissue, Sylgard was found to be an appropriate simulant. Compared with spherical projectile data, full-metal jacket (FMJ) projectiles produced different temporary cavities and pressures, demonstrating shape effects. Models using Sylgard gel and FMJ projectiles are appropriate for enhancing understanding of the mechanisms of ballistic brain injury. PMID:17166502

  6. Subspace-based Inverse Uncertainty Quantification for Nuclear Data Assessment

    SciTech Connect

    Khuwaileh, B.A.; Abdel-Khalik, H.S.

    2015-01-15

    Safety analysis and design optimization depend on the accurate prediction of various reactor attributes. Predictions can be enhanced by reducing the uncertainty associated with the attributes of interest. An inverse problem can be defined and solved to assess the sources of uncertainty, and experimental effort can subsequently be directed to further improve the uncertainty associated with these sources. In this work a subspace-based algorithm for inverse sensitivity/uncertainty quantification (IS/UQ) has been developed to enable analysts to account for all sources of nuclear data uncertainties in support of target-accuracy-assessment-type analysis. An approximate analytical solution of the optimization problem is used to guide the search for the dominant uncertainty subspace. By limiting the search to a subspace, the degrees of freedom available to the optimization search are significantly reduced. A quarter PWR fuel assembly is modeled, and the accuracies of the multiplication factor and the fission reaction rate are used as reactor attributes whose uncertainties are to be reduced. Numerical experiments are used to demonstrate the computational efficiency of the proposed algorithm. Our ongoing work focuses on extending the proposed algorithm to account for various forms of feedback, e.g., thermal-hydraulics and depletion effects.

  7. Quantification of infectious duck hepatitis B virus by radioimmunofocus assay.

    PubMed

    Anderson, D A; Grgacic, E V; Luscombe, C A; Gu, X; Dixon, R

    1997-08-01

    A simple method is described for the precise quantification of infectious duck hepatitis B virus (DHBV) in cell culture, using a radioimmunofocus assay (RIFA). Primary duck hepatocyte cell cultures were infected with serial dilutions of viral samples as for a plaque assay, but then maintained with liquid overlay medium. After incubation for up to 14 days, cell monolayers were fixed with acetone, then stained with a monoclonal antibody to DHBV L protein followed by secondary antibody labelled with 125I. Foci of infection (representing individual infectious particles in the inoculum) were detected by autoradiography. The number of foci recovered was increased by addition of dimethyl sulphoxide to culture medium, but was not appreciably altered by the use of semi-solid medium. The titre of virus suspensions determined by RIFA correlated well with titration in ducklings. The RIFA is a useful method for titration of DHBV, as it has a wide dynamic range and is well suited to parallel titration of large numbers of samples. This assay will have wide use for the analysis of DHBV growth kinetics, antiviral efficacy, and virus inactivation procedures.

  8. Towards drug quantification in human skin with confocal Raman microscopy.

    PubMed

    Franzen, Lutz; Selzer, Dominik; Fluhr, Joachim W; Schaefer, Ulrich F; Windbergs, Maike

    2013-06-01

    Understanding the penetration behaviour of drugs into human skin is a prerequisite for the rational development and evaluation of effective dermal drug delivery. The general procedure for the acquisition of quantitative drug penetration profiles in human skin is sequential segmentation and extraction. Unfortunately, this technique is destructive, laborious and lacks spatial resolution. Confocal Raman microscopy bears the potential for chemically selective, label-free and nondestructive analysis. However, the acquisition of quantitative drug depth profiles within skin by Raman microscopy is impeded by hard-to-predict signal attenuation inside the tissue. In this study, we present a chemical semi-solid matrix system simulating the optical properties of human skin. This system serves as a skin surrogate for investigating Raman signal attenuation under controlled conditions. Caffeine was homogeneously incorporated within the skin surrogate, and Raman intensity depth profiles were acquired. A mathematical algorithm describing the Raman signal attenuation within the surrogate was derived from these profiles. Human skin samples were incubated with caffeine, and Raman intensity depth profiles were similarly acquired. The surrogate algorithm was successfully applied to correct the drug profiles in human skin for signal attenuation. For the first time, a mathematical algorithm was established that allows correction of Raman signal attenuation in human skin, thus facilitating reliable drug quantification in human skin by confocal Raman spectroscopy.
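
    If the surrogate-derived attenuation is well approximated by an exponential decay (an assumption made here for illustration; the paper's fitted algorithm may take another functional form), the correction reduces to a one-liner:

```python
import numpy as np

def correct_depth_profile(depths_um, intensities, decay_length_um):
    """Undo depth-dependent Raman signal attenuation, assuming an
    exponential attenuation law whose decay length was fitted from the
    homogeneous caffeine-in-surrogate intensity depth profile."""
    return np.asarray(intensities) * np.exp(np.asarray(depths_um) / decay_length_um)
```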

  9. Quantification of oxidative metabolism in masseter muscle of denture wearers.

    PubMed

    Fujii, A; Shinogaya, T; Toda, S; Hayakawa, I

    2005-09-01

    This study aimed to quantify oxidative metabolism in the masseter muscle using near-infrared spectroscopy, in particular for denture wearers. Fourteen normal dentate subjects without malocclusion (ND group, 25-50 years) participated in the quantification of oxidative metabolism. Eleven partially edentulous patients without occlusal stops (PD group, 64-80 years) and ten edentulous patients (CD group, 57-84 years) also participated, after prosthodontic treatment. Oxidative metabolism was recorded during gum chewing, maximum clenching and regulated clenching at 5 kgf. The oxygenated hemoglobin at the 5 kgf clenching level was normalized to the oxygenated hemoglobin at the lowest blood flow and expressed as the oxygen consumption rate (OCR). The relationship of the OCR to the maximum clenching force was analyzed using Pearson's correlation coefficient, and differences between the PD and CD groups were tested by unpaired Student's t-test. The OCR showed a significant negative correlation with maximum clenching force in the ND group. The OCR of the PD group was significantly greater than that of the CD group, although the difference in maximum clenching force between the two groups was not significant. These results suggest that the aerobic ability of the masseter muscle in complete denture wearers is relatively greater than in partial denture wearers of the same age.

  11. Uncertainty Quantification for Monitoring of Civil Structures from Vibration Measurements

    NASA Astrophysics Data System (ADS)

    Döhler, Michael; Mevel, Laurent

    2014-05-01

    Health monitoring of civil structures can be performed by detecting changes in the modal parameters of a structure, or more directly in the measured vibration signals. For continuous monitoring the excitation of a structure is usually ambient, thus unknown and assumed to be noise. Hence, all estimates from the vibration measurements are realizations of random variables with inherent uncertainty due to (unknown) process and measurement noise and finite data length. In this talk, a strategy for quantifying the uncertainties of modal parameter estimates from a subspace-based system identification approach is presented, and the importance of uncertainty quantification in monitoring approaches is shown. Furthermore, a damage detection method is presented, which is based on the direct comparison of the measured vibration signals without estimating modal parameters, while taking the statistical uncertainty in the signals correctly into account. The usefulness of both strategies is illustrated on data from a progressive damage action on a prestressed concrete bridge. References:
    E. Carden and P. Fanning. Vibration based condition monitoring: a review. Structural Health Monitoring, 3(4):355-377, 2004.
    M. Döhler and L. Mevel. Efficient multi-order uncertainty computation for stochastic subspace identification. Mechanical Systems and Signal Processing, 38(2):346-366, 2013.
    M. Döhler, L. Mevel, and F. Hille. Subspace-based damage detection under changes in the ambient excitation statistics. Mechanical Systems and Signal Processing, 45(1):207-224, 2014.

  12. Quantification of acidic compounds in complex biomass-derived streams

    SciTech Connect

    Karp, Eric M.; Nimlos, Claire T.; Deutch, Steve; Salvachúa, Davinia; Cywar, Robin M.; Beckham, Gregg T.

    2016-01-01

    Biomass-derived streams that contain acidic compounds from the degradation of lignin and polysaccharides (e.g. black liquor, pyrolysis oil, pyrolytic lignin, etc.) are chemically complex solutions prone to instability and degradation during analysis, making quantification of compounds within them challenging. Here we present a robust analytical method to quantify acidic compounds in complex biomass-derived mixtures using ion exchange, sample reconstitution in pyridine and derivatization with BSTFA. The procedure is based on an earlier method originally reported for kraft black liquors and, in this work, is applied to identify and quantify a large slate of acidic compounds in corn stover derived alkaline pretreatment liquor (APL) as a function of pretreatment severity. Analysis of the samples is conducted with GCxGC-TOFMS to achieve good resolution of the components within the complex mixture. The results reveal the dominant low molecular weight components and their concentrations as a function of pretreatment severity. Application of this method is also demonstrated in the context of lignin conversion technologies by applying it to track the microbial conversion of an APL substrate. Here too excellent results are achieved, and the appearance and disappearance of compounds is observed in agreement with the known metabolic pathways of two bacteria, indicating the sample integrity was maintained throughout analysis. Finally, it is shown that this method applies more generally to lignin-rich materials by demonstrating its usefulness in analysis of pyrolysis oil and pyrolytic lignin.

  13. Quantification and prediction of rare events in nonlinear waves

    NASA Astrophysics Data System (ADS)

    Sapsis, Themistoklis; Cousins, Will; Mohamad, Mustafa

    2014-11-01

    The scope of this work is the quantification and prediction of rare events characterized by extreme intensity in nonlinear dispersive models that simulate water waves. In particular, we are interested in the understanding and short-term prediction of rogue waves in the ocean, and to this end we consider one-dimensional nonlinear models of the NLS type. To understand the energy transfers that occur during the development of an extreme event, we perform a spatially localized analysis of the energy distribution along different wavenumbers by means of the Gabor transform. A stochastic analysis of the Gabor coefficients reveals (i) the low dimensionality of the intermittent structures, (ii) the interplay between non-Gaussian statistical properties and nonlinear energy transfers between modes, as well as (iii) the critical scales (or Gabor coefficients) at which a critical energy can trigger the formation of an extreme event. The unstable character of these critical localized modes is analysed directly through the system equation, and is shown to result from the system nonlinearity and the wave dissipation (which mimics wave breaking). These unstable modes are randomly triggered through the dispersive "heat bath" of random waves that propagate in the nonlinear medium. Using these properties, we formulate low-dimensional functionals of the Gabor coefficients that allow for the prediction of extreme events well before the strongly nonlinear interactions begin to occur. The prediction window is further enhanced by combining the developed scheme with traditional filtering schemes.
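
    A windowed-Fourier sketch of the localized spectral analysis described in this record; scipy's STFT with a hann window stands in for the Gabor transform proper (which uses a Gaussian window), and all names are illustrative:

```python
import numpy as np
from scipy.signal import stft

def localized_spectrum(u, dx, nperseg=256):
    """Spatially localized wavenumber content of a 1D wave field u(x):
    |STFT|^2 over a sliding window approximates the Gabor energy density
    used to monitor energy entering critical localized modes."""
    f, x, U = stft(u, fs=1.0 / dx, nperseg=nperseg, window="hann")
    return 2.0 * np.pi * f, x, np.abs(U) ** 2   # wavenumbers, positions, energy
```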

  14. Uncertainty quantification for large-scale ocean circulation predictions.

    SciTech Connect

    Safta, Cosmin; Debusschere, Bert J.; Najm, Habib N.; Sargsyan, Khachik

    2010-09-01

    Uncertainty quantification in climate models is challenged by the sparsity of the available climate data due to the high computational cost of the model runs. Another feature that prevents classical uncertainty analyses from being easily applicable is the bifurcative behavior in the climate data with respect to certain parameters. A typical example is the Meridional Overturning Circulation in the Atlantic Ocean. The maximum overturning stream function exhibits a discontinuity across a curve in the space of two uncertain parameters, namely climate sensitivity and CO2 forcing. We develop a methodology that performs uncertainty quantification in the presence of limited data that have discontinuous character. Our approach is two-fold. First, we detect the discontinuity location with Bayesian inference, thus obtaining a probabilistic representation of the discontinuity curve location for arbitrarily distributed input parameter values. Second, we develop a spectral approach that relies on Polynomial Chaos (PC) expansions on each side of the discontinuity curve, leading to an averaged-PC representation of the forward model that allows efficient uncertainty quantification and propagation. The methodology is tested on synthetic examples of discontinuous data with adjustable sharpness and structure.

  15. Quantification of Nociceptive Escape Response in C. elegans

    NASA Astrophysics Data System (ADS)

    Leung, Kawai; Mohammadi, Aylia; Ryu, William; Nemenman, Ilya

    2013-03-01

    Animals cannot rank and communicate their pain consciously. Thus, in pain studies on animal models, one must infer the pain level from high-precision experimental characterization of behavior. This is not trivial, since behaviors are complex and multidimensional. Here we explore the feasibility of C. elegans as a model for pain transduction. The nematode has a robust neurally mediated noxious escape response, which we show to be partially decoupled from other sensory behaviors. We developed a nociceptive behavioral response assay that allows us to apply controlled levels of pain by locally heating worms with an IR laser. The worms' motions are captured by machine vision software with high spatiotemporal resolution. The resulting behavioral quantification allows us to build a statistical model for inferring the experienced pain level from the behavioral response. Based on the measured nociceptive escape of over 400 worms, we conclude that none of the simple characteristics of the response are reliable indicators of the laser pulse strength. Nonetheless, a more reliable statistical inference of the pain stimulus level from the measured behavior is possible based on a complexity-controlled regression model that takes into account the entire worm behavioral output. This work was partially supported by NSF grant No. IOS/1208126 and HFSP grant No. RGY0084/2011.

  16. Bioluminescence regenerative cycle (BRC) system for nucleic acid quantification assays

    NASA Astrophysics Data System (ADS)

    Hassibi, Arjang; Lee, Thomas H.; Davis, Ronald W.; Pourmand, Nader

    2003-07-01

    A new label-free methodology for nucleic acid quantification has been developed in which the number of pyrophosphate molecules (PPi) released during polymerization of the target nucleic acid is counted and correlated to DNA copy number. The technique uses the enzymatic complex of ATP-sulfurylase and firefly luciferase to generate photons from PPi. An enzymatic unity-gain positive feedback is also implemented to regenerate the photon generation process and compensate for any decay in light intensity by self-regulation. Due to this positive feedback, the total number of photons generated by the bioluminescence regenerative cycle (BRC) can potentially be orders of magnitude higher than in typical chemiluminescent processes. A system-level kinetic model that incorporates the effects of contamination and detector noise was used to show that the photon generation process is in fact steady and proportional to the nucleic acid quantity. Here we show that BRC is capable of detecting quantities of DNA as low as 1 amol (10^-18 mol) in 40 μl aqueous solutions, and this enzymatic assay has a controllable dynamic range of 5 orders of magnitude. The sensitivity of this technology, due to the excess number of photons generated by the regenerative cycle, is not constrained by detector performance, but rather by possible PPi or ATP (adenosine triphosphate) contamination, or the background bioluminescence of the enzymatic complex.

  17. A posteriori uncertainty quantification of PIV-based pressure data

    NASA Astrophysics Data System (ADS)

    Azijli, Iliass; Sciacchitano, Andrea; Ragni, Daniele; Palha, Artur; Dwight, Richard P.

    2016-05-01

    A methodology for a posteriori uncertainty quantification of pressure data retrieved from particle image velocimetry (PIV) is proposed. It relies upon the Bayesian framework, where the posterior distribution (probability distribution of the true velocity, given the PIV measurements) is obtained from the prior distribution (prior knowledge of properties of the velocity field, e.g., divergence-free) and the statistical model of PIV measurement uncertainty. Once the posterior covariance matrix of the velocity is known, it is propagated through the discretized Poisson equation for pressure. Numerical assessment of the proposed method on a steady Lamb-Oseen vortex shows excellent agreement with Monte Carlo simulations, while linear uncertainty propagation underestimates the uncertainty in the pressure by up to 30 %. The method is finally applied to an experimental test case of a turbulent boundary layer in air, obtained using time-resolved tomographic PIV. Simultaneously with the PIV measurements, microphone measurements were carried out at the wall. The pressure reconstructed from the tomographic PIV data is compared to the microphone measurements. Realizing that the uncertainty of the latter is significantly smaller than the PIV-based pressure, this allows us to obtain an estimate for the true error of the former. The comparison between true error and estimated uncertainty demonstrates the accuracy of the uncertainty estimates on the pressure. In addition, enforcing the divergence-free constraint is found to result in a significantly more accurate reconstructed pressure field. The estimated uncertainty confirms this result.
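
    The propagation step lends itself to a small linear-algebra illustration. The sketch below propagates a source-term covariance through a 1D discretized Poisson operator, a deliberately simplified analogue of the paper's velocity-to-pressure covariance propagation (grid spacing, boundary treatment, and the full PIV operator are glossed over):

```python
import numpy as np

def pressure_covariance(cov_f, n):
    """Propagate a source-term covariance through a 1D Poisson problem
    A p = f (Dirichlet ends, unit grid spacing): Cov[p] = A^{-1} Cov[f] A^{-T}.
    Pointwise pressure uncertainty is then sqrt(diag(Cov[p]))."""
    A = (np.diag(np.full(n, -2.0))
         + np.diag(np.ones(n - 1), 1)
         + np.diag(np.ones(n - 1), -1))
    A_inv = np.linalg.inv(A)
    return A_inv @ cov_f @ A_inv.T
```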

  18. Micelle Mediated Trace Level Sulfide Quantification through Cloud Point Extraction

    PubMed Central

    Devaramani, Samrat; Malingappa, Pandurangappa

    2012-01-01

    A simple cloud point extraction protocol has been proposed for the quantification of sulfide at trace level. The method is based on the reduction of iron(III) to iron(II) by the sulfide and the subsequent complexation of the metal ion with nitroso-R salt in alkaline medium. The resulting green-colored complex was extracted through cloud point formation using a cationic surfactant, cetylpyridinium chloride, and the obtained surfactant phase was homogenized with ethanol before its absorbance measurement at 710 nm. The effects of the reaction variables, namely metal ion, ligand, and surfactant concentration and medium pH, on the cloud point extraction of the metal-ligand complex have been optimized. The interference effect of common anions and cations was studied. The proposed method has been successfully applied to quantify trace-level sulfide in leachate samples from a landfill and in water samples from bore wells and ponds. The validity of the proposed method has been studied by spiking the samples with known quantities of sulfide as well as by comparison with the results obtained by the standard method. PMID:22619597
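
    Quantification against standards of known sulfide concentration is implied by the spiking and validation steps; the sketch below shows the generic linear (Beer-Lambert) calibration at 710 nm that such a protocol would typically rely on. The absorbance values are invented for illustration.

```python
import numpy as np

conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0])         # sulfide standards (ug/mL)
absorb = np.array([0.01, 0.11, 0.20, 0.41, 0.83])  # measured A(710 nm)

slope, intercept = np.polyfit(conc, absorb, 1)     # Beer-Lambert: A = m*C + b
a_sample = 0.33                                    # absorbance of the unknown
c_sample = (a_sample - intercept) / slope
print(f"unknown sample: {c_sample:.2f} ug/mL sulfide")
```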

  19. Direct field method for root biomass quantification in agroecosystems.

    PubMed

    Frasier, Ileana; Noellemeyer, Elke; Fernández, Romina; Quiroga, Alberto

    2016-01-01

    The present article describes a field auger sampling method for row-crop root measurements. In agroecosystems where crops are planted in a specific design (row crops), sampling procedures for root biomass quantification need to account for the spatial variability of the root system. This article explains in detail how to sample and how to calculate root biomass, considering the sampling position in the field and the differential weight of the root biomass in the inter-row compared with the crop row when expressing data per unit area (a sketch of this weighting follows below). This method is highly reproducible in the field and requires no expensive equipment or special skills. It uses a narrow auger, reducing field labor through less destructive sampling, and decreases laboratory time because the samples are smaller. The small sample size also facilitates washing and root separation with tweezers. This method is suitable for either winter- or summer-crop roots.
    • Description of a direct field method for row-crop root measurements.
    • Description of data calculation for total root-biomass estimation per unit area.
    • The proposed method is simple and less labor- and time-consuming. PMID:27630821
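
    A sketch of the area-weighted combination referred to above, under the assumption that a row core represents a band of width row_band_cm centered on the row and an inter-row core represents the remaining ground area; the article's exact weighting scheme may differ, and all numbers are illustrative.

```python
import math

def root_biomass_per_area(mass_row_g, mass_inter_g, auger_diam_cm,
                          row_spacing_cm, row_band_cm):
    """Root biomass in g m^-2 from one row core and one inter-row core."""
    core_area_cm2 = math.pi * (auger_diam_cm / 2.0) ** 2
    row_density = mass_row_g / core_area_cm2        # g cm^-2 on the row
    inter_density = mass_inter_g / core_area_cm2    # g cm^-2 between rows
    w_row = row_band_cm / row_spacing_cm            # area fraction of the row band
    mean_density = w_row * row_density + (1.0 - w_row) * inter_density
    return mean_density * 1e4                       # g cm^-2 -> g m^-2

print(root_biomass_per_area(0.35, 0.08, auger_diam_cm=2.5,
                            row_spacing_cm=52.5, row_band_cm=10.0))
```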

  20. Current analytical methods for plant auxin quantification--A review.

    PubMed

    Porfírio, Sara; Gomes da Silva, Marco D R; Peixe, Augusto; Cabrita, Maria J; Azadi, Parastoo

    2016-01-01

    Plant hormones, and especially auxins, are low-molecular-weight compounds highly involved in the control of plant growth and development. Auxins are also broadly used in horticulture as part of vegetative plant propagation protocols, allowing the cloning of genotypes of interest. Over the years, large efforts have been put into the development of more sensitive and precise methods for the analysis and quantification of plant hormone levels in plant tissues. Although analytical techniques have evolved and new methods have been implemented, sample preparation is still the limiting step of auxin analysis. In this review, the current methods of auxin analysis are discussed. Sample preparation procedures, including extraction, purification, and derivatization, are reviewed and compared. The different analytical techniques, ranging from chromatographic and mass spectrometry methods to immunoassays and electrokinetic methods, as well as other types of detection, are also discussed. Considering that auxin analysis mirrors the evolution of analytical chemistry, the number of publications describing new and/or improved methods is always increasing, and we considered it appropriate to update the available information. For that reason, this article aims to review the current advances in auxin analysis; only reports from the past 15 years are covered.

  1. Quantification of collagen contraction in three-dimensional cell culture.

    PubMed

    Kopanska, Katarzyna S; Bussonnier, Matthias; Geraldo, Sara; Simon, Anthony; Vignjevic, Danijela; Betz, Timo

    2015-01-01

    Many different cell types, including fibroblasts, smooth muscle cells, endothelial cells, and cancer cells, exert traction forces on the fibrous components of the extracellular matrix. This can be observed as matrix contraction, both macro- and microscopically, in three-dimensional (3D) tissue models such as collagen type I gels. The quantification of local contraction at the micron scale, including its directionality and speed, in correlation with other parameters such as cell invasion or local protein or gene expression, can provide useful information for studying wound healing, organism development, and cancer metastasis. In this article, we present a set of tools to quantify the flow dynamics of collagen contraction induced by cells migrating out of a multicellular cancer spheroid into a 3D collagen matrix. We adapted a pseudo-speckle technique that can be applied to bright-field and fluorescent microscopy time series. The image analysis presented here is based on in-house software developed in the Matlab (Mathworks) programming environment. The analysis program is freely available from GitHub following the link: http://dx.doi.org/10.5281/zenodo.10116. This tool provides an automated technique to measure collagen contraction that can be utilized in different 3D cellular systems.

  2. Quantification of sugars in breakfast cereals using capillary electrophoresis.

    PubMed

    Toutounji, Michelle R; Van Leeuwen, Matthew P; Oliver, James D; Shrestha, Ashok K; Castignolles, Patrice; Gaborieau, Marianne

    2015-05-18

    About 80% of the Australian population consumes breakfast cereal (BC) at least five days a week. With high prevalence rates of obesity and other diet-related diseases, improved methods for monitoring sugar levels in breakfast cereals would be useful in nutrition research. The heterogeneity of the complex matrix of BCs can make carbohydrate analysis challenging or necessitate tedious sample preparation, leading to potential sugar loss or starch degradation into sugars. A recently established, simple and robust free-solution capillary electrophoresis (CE) method was applied, for the first time, to 13 BCs (in Australia) and compared with several established methods for the quantification of carbohydrates. Carbohydrates identified in BCs by CE included sucrose, maltose, glucose, and fructose. The CE method is simple, requiring no sample preparation or derivatization, and carbohydrates are detected by direct UV detection. CE was shown to be a more robust and accurate method for measuring carbohydrates than the Fehling method, the DNS (3,5-dinitrosalicylic acid) assay, and HPLC (high-performance liquid chromatography).

  3. An experimental database for evaluating PIV uncertainty quantification methods

    NASA Astrophysics Data System (ADS)

    Warner, Scott; Neal, Douglas; Sciacchitano, Andrea

    2014-11-01

    Uncertainty quantification for particle image velocimetry (PIV) data has recently become a topic of great interest as shown by the publishing of several different methods within the past few years. A unique experiment has been designed to test the efficacy of PIV uncertainty methods, using a rectangular jet as the flow field. The novel aspect of the experimental setup consists of simultaneous measurements by means of two different time-resolved PIV systems and a hot-wire anemometer (HWA). The first PIV system, called the "PIV-Measurement" system, collects the data for which uncertainty is to be evaluated. It is based on a single camera and features a dynamic velocity range (DVR) representative of many PIV experiments. The second PIV system, called the "PIV-HDR" (high dynamic range) system, has a significantly higher DVR obtained with a higher digital imaging resolution. The hot-wire was placed in close proximity to the PIV measurement domain. All three of the measurement systems were carefully set to simultaneously collect time-resolved data on a point-by-point basis. The HWA validates the PIV-HDR system as the reference velocity so that it can be used to evaluate the instantaneous error in the PIV-measurement system.

  4. Quantification of hydroxyacetone and glycolaldehyde using chemical ionization mass spectrometry

    NASA Astrophysics Data System (ADS)

    St. Clair, J. M.; Spencer, K. M.; Beaver, M. R.; Crounse, J. D.; Paulot, F.; Wennberg, P. O.

    2014-04-01

    Chemical ionization mass spectrometry (CIMS) enables online, rapid, in situ detection and quantification of hydroxyacetone and glycolaldehyde. Two different CIMS approaches are demonstrated employing the strengths of single quadrupole mass spectrometry and triple quadrupole (tandem) mass spectrometry. Both methods are generally capable of the measurement of hydroxyacetone, an analyte with known but minimal isobaric interferences. Tandem mass spectrometry provides direct separation of the isobaric compounds glycolaldehyde and acetic acid using distinct, collision-induced dissociation daughter ions. The single quadrupole CIMS measurement of glycolaldehyde was demonstrated during the ARCTAS-CARB (Arctic Research of the Composition of the Troposphere from Aircraft and Satellites - California Air Resources Board) 2008 campaign, while triple quadrupole CIMS measurements of glycolaldehyde and hydroxyacetone were demonstrated during the BEARPEX (Biosphere Effects on Aerosols and Photochemistry Experiment) 2009 campaign. Enhancement ratios of glycolaldehyde in ambient biomass-burning plumes are reported for the ARCTAS-CARB campaign. BEARPEX observations are compared to simple photochemical box model predictions of biogenic volatile organic compound oxidation at the site.

  5. Quantification of brodifacoum in plasma and liver tissue by HPLC.

    PubMed

    O'Bryan, S M; Constable, D J

    1991-01-01

    A simple high-performance liquid chromatographic method has been developed for the detection and quantification of brodifacoum in plasma and liver tissue. After adding difenacoum as the internal standard, brodifacoum and difenacoum are extracted from 2 mL of plasma with two sequential 10-mL volumes of acetonitrile-ethyl ether (9:1), and from 2 g of liver tissue by grinding the tissue with 10 mL acetonitrile. The extracts are evaporated to dryness under nitrogen, 2 mL of acetonitrile is added to reconstitute the residues, and the resulting solution is analyzed using reversed-phase chromatography and fluorescence detection. The limits of detection for plasma and tissue are 2 micrograms/L and 5 ng/g, respectively. Using internal standardization, the mean intra-assay recovery from plasma is 92% and the mean inter-assay recovery is 109%. The mean intra-assay and inter-assay recoveries from tissue are 96%. No interferences were observed with any of the following related compounds: brodifacoum, bromadiolone, coumarin, difenacoum, diphacinone, warfarin, and vitamin K1. PMID:1943058

  6. VESGEN Software for Mapping and Quantification of Vascular Regulators

    NASA Technical Reports Server (NTRS)

    Parsons-Wingerter, Patricia A.; Vickerman, Mary B.; Keith, Patricia A.

    2012-01-01

    VESsel GENeration (VESGEN) Analysis is automated software that maps and quantifies the effects of vascular regulators on vascular morphology by analyzing important vessel parameters. Quantification parameters include vessel diameter, length, branch points, density, and fractal dimension. For vascular trees, measurements are reported as dependent functions of vessel branching generation. VESGEN maps and quantifies vascular morphological events according to fractal-based vascular branching generation. It also relies on careful imaging of branching and networked vascular form. It was developed as a plug-in for ImageJ (National Institutes of Health, USA). VESGEN uses the image-processing concepts of 8-neighbor pixel connectivity, skeleton, and distance map to analyze 2D, black-and-white (binary) images of vascular trees, networks, and tree-network composites. VESGEN maps typically 5 to 12 (or more) generations of vascular branching, starting from a single parent vessel. These generations are tracked and measured for critical vascular parameters that include vessel diameter, length, density and number, and tortuosity per branching generation. The effects of vascular therapeutics and regulators on vascular morphology and branching tested in human clinical or laboratory animal experimental studies are quantified by comparing vascular parameters with control groups. VESGEN provides a user interface to both guide and allow control over the user's vascular analysis process. An option is provided to select a morphological tissue type of vascular trees, networks, or tree-network composites, which determines the general collections of algorithms, intermediate images, and output images and measurements that will be produced.
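
    VESGEN itself is an ImageJ plug-in, but the skeleton and distance-map concepts it builds on are easy to illustrate in Python: on a binary vessel image, the Euclidean distance map sampled along the skeleton gives the local vessel radius (approximately, up to pixel discretization). The toy image below is invented.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt
from skimage.morphology import skeletonize

binary = np.zeros((64, 64), dtype=bool)
binary[30:35, 5:60] = True                 # toy "vessel", 5 px wide

skeleton = skeletonize(binary)             # 1-px-wide centerline
dist = distance_transform_edt(binary)      # distance to nearest background px
diameters = 2.0 * dist[skeleton]           # local diameter along the centerline
print(f"mean vessel diameter: {diameters.mean():.1f} px, "
      f"skeleton length: {skeleton.sum()} px")
```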

  7. Effect of heat processing on DNA quantification of meat species.

    PubMed

    Şakalar, Ergün; Abasiyanik, M Fatih; Bektik, Emre; Tayyrov, Annageldi

    2012-09-01

    In this study, real-time polymerase chain reaction (PCR) was used to identify the effects of different temperatures and durations of heat treatment on the DNA of meat products. For this purpose, beef, pork, and chicken were baked at 200 °C for 10, 20, 30, 40, and 50 min, and for 30 min at 30, 60, 90, 120, 150, 180, and 210 °C, and were also cooked by boiling at 99 °C for 10, 30, 60, 90, 120, 150, 180, 210, and 240 min. DNA was then extracted from all samples after the heat treatment. Regions of 374, 290, and 183 bp of mitochondrial DNA from beef, pork, and chicken, respectively, were amplified by real-time PCR. It was found that baking and boiling of the beef, pork, and chicken decreased the detectable copy numbers of the specific genes, varying with the heating time and temperature. The results indicate that species determination and quantification using real-time PCR are affected by the temperature and duration of the heat treatment and by the size of the DNA fragment to be amplified.

  8. Detection and quantification of MS lesions using fuzzy topological principles

    NASA Astrophysics Data System (ADS)

    Udupa, Jayaram K.; Wei, Luogang; Samarasekera, Supun; Miki, Yukio; van Buchem, M. A.; Grossman, Robert I.

    1996-04-01

    Quantification of the severity of the multiple sclerosis (MS) disease through estimation of lesion volume via MR imaging is vital for understanding and monitoring the disease and its treatment. This paper presents a novel methodology and a system that can be routinely used for segmenting and estimating the volume of MS lesions via dual-echo spin-echo MR imagery. An operator indicates a few points in the images by pointing to the white matter, the gray matter, and the CSF. Each of these objects is then detected as a fuzzy connected set. The holes in the union of these objects correspond to potential lesion sites which are utilized to detect each potential lesion as a fuzzy connected object. These 3D objects are presented to the operator who indicates acceptance/rejection through the click of a mouse button. The volume of accepted lesions is then computed and output. Based on several evaluation studies and over 300 3D data sets that were processed, we conclude that the methodology is highly reliable and consistent, with a coefficient of variation (due to subjective operator actions) of less than 1.0% for volume.

  9. A simplified diphenylamine colorimetric method for growth quantification.

    PubMed

    Zhao, Youbao; Xiang, Sihai; Dai, Xida; Yang, Keqian

    2013-06-01

    Cell growth needs to be monitored in biological studies and bioprocess optimization. In special circumstances, such as microbial fermentations in media containing insoluble particles, accurate cell growth quantification is a challenge with current methods. Only the Burton method is applicable in such circumstances. The original Burton method was previously simplified by adopting a two-step sample pretreatment in perchloric acid procedure to eliminate the need for DNA extraction. Here, we further simplified the Burton method by replacing the previous two-step perchloric acid pretreatment with a new and one-step diphenylamine reagent pretreatment. The reliability and accuracy of this simplified method were assessed by measuring the biomass of four model microorganisms: Escherichia coli, Streptomyces clavuligerus, Saccharomyces cerevisiae, and Trichoderma reesei grown in normal media or those containing solid particles. The results demonstrate that this new simplified method performs comparably to the conventional methods, such as OD600 or the previously modified Burton method, and is much more sensitive than the dry weight method. Overall, the new method is simple, reliable, easy to perform, and generally applicable in most circumstances, and it reduces the operation time from more than 12 h (for the previously simplified Burton method) to about 2 h.

  10. Quantification of Internalized Silica Nanoparticles via STED Microscopy

    PubMed Central

    Peuschel, Henrike; Ruckelshausen, Thomas; Cavelius, Christian; Kraegeloh, Annette

    2015-01-01

    The development of safe engineered nanoparticles (NPs) requires a detailed understanding of their interaction mechanisms on a cellular level. Therefore, quantification of NP internalization is crucial to predict the potential impact of intracellular NP doses, providing essential information for risk assessment as well as for drug delivery applications. In this study, the internalization of 25 nm and 85 nm silica nanoparticles (SNPs) in alveolar type II cells (A549) was quantified by application of super-resolution STED (stimulated emission depletion) microscopy. Cells were exposed to equal particle number concentrations (9.2 × 10¹⁰ particles mL⁻¹) of each particle size, and the sedimentation of particles during exposure was taken into account. Microscopy images revealed that particles of both sizes entered the cells after 5 h incubation in serum-supplemented and serum-free medium. According to the in vitro sedimentation, diffusion, and dosimetry (ISDD) model, 20-27% of the particles sedimented. In comparison, 10²-10³ NPs per cell were detected intracellularly in serum-containing medium. Furthermore, in the presence of serum, no cytotoxicity was induced by the SNPs. In serum-free medium, large agglomerates of both particle sizes covered the cells, whereas only high concentrations (≥ 3.8 × 10¹² particles mL⁻¹) of the smaller particles induced cytotoxicity. PMID:26125028

  11. Preparation, imaging, and quantification of bacterial surface motility assays.

    PubMed

    Morales-Soto, Nydia; Anyan, Morgen E; Mattingly, Anne E; Madukoma, Chinedu S; Harvey, Cameron W; Alber, Mark; Déziel, Eric; Kearns, Daniel B; Shrout, Joshua D

    2015-01-01

    Bacterial surface motility, such as swarming, is commonly examined in the laboratory using plate assays that necessitate specific concentrations of agar and sometimes inclusion of specific nutrients in the growth medium. The preparation of such explicit media and surface growth conditions serves to provide the favorable conditions that allow not just bacterial growth but coordinated motility of bacteria over these surfaces within thin liquid films. Reproducibility of swarm plate and other surface motility plate assays can be a major challenge. Especially for more "temperate swarmers" that exhibit motility only within agar ranges of 0.4%-0.8% (wt/vol), minor changes in protocol or laboratory environment can greatly influence swarm assay results. "Wettability", or water content at the liquid-solid-air interface of these plate assays, is often a key variable to be controlled. An additional challenge in assessing swarming is how to quantify observed differences between any two (or more) experiments. Here we detail a versatile two-phase protocol to prepare and image swarm assays. We include guidelines to circumvent the challenges commonly associated with swarm assay media preparation and quantification of data from these assays. We specifically demonstrate our method using bacteria that express fluorescent or bioluminescent genetic reporters like green fluorescent protein (GFP), luciferase (lux operon), or cellular stains to enable time-lapse optical imaging. We further demonstrate the ability of our method to track competing swarming species in the same experiment. PMID:25938934

  12. Automated quantification of pancreatic β-cell mass.

    PubMed

    Golson, Maria L; Bush, William S; Brissova, Marcela

    2014-06-15

    β-Cell mass is a parameter commonly measured in studies of islet biology and diabetes. However, the rigorous quantification of pancreatic β-cell mass using conventional histological methods is a time-consuming process. Rapidly evolving virtual slide technology with high-resolution slide scanners and newly developed image analysis tools has the potential to transform β-cell mass measurement. To test the effectiveness and accuracy of this new approach, we assessed pancreata from normal C57Bl/6J mice and from mouse models of β-cell ablation (streptozotocin-treated mice) and β-cell hyperplasia (leptin-deficient mice), using a standardized systematic sampling of pancreatic specimens. Our data indicate that automated analysis of virtual pancreatic slides is highly reliable and yields results consistent with those obtained by conventional morphometric analysis. This new methodology will allow investigators to dramatically reduce the time required for β-cell mass measurement by automating high-resolution image capture and analysis of entire pancreatic sections. PMID:24760991

  13. Interactive image quantification tools in nuclear material forensics

    NASA Astrophysics Data System (ADS)

    Porter, Reid; Ruggiero, Christy; Hush, Don; Harvey, Neal; Kelly, Patrick; Scoggins, Wayne; Tandon, Lav

    2011-03-01

    Morphological and microstructural features visible in microscopy images of nuclear materials can give information about the processing history of a nuclear material. Extraction of these attributes currently requires a subject matter expert in both microscopy and nuclear material production processes, and is a time-consuming, at least partially manual task, often involving multiple software applications. One of the primary goals of computer vision is to find ways to extract and encode domain knowledge associated with imagery so that parts of this process can be automated. In this paper we describe a user-in-the-loop approach to the problem which attempts both to improve the efficiency of domain experts during image quantification and to capture their domain knowledge over time. This is accomplished through a sophisticated user-monitoring system that accumulates user-computer interactions as users exploit their imagery. We provide a detailed discussion of the interactive feature extraction and segmentation tools we have developed and describe our initial results in exploiting the recorded user-computer interactions to improve user productivity over time.

  14. Quantification of the degree of reaction of fly ash

    SciTech Connect

    Ben Haha, M.; De Weerdt, K.; Lothenbach, B.

    2010-11-15

    The quantification of the fly ash (FA) in FA blended cements is an important parameter for understanding the effect of the fly ash on the hydration of OPC and on the microstructural development. The FA reaction in two different blended OPC-FA systems was studied using a selective dissolution technique based on EDTA/NaOH, dissolution in diluted NaOH solution, the portlandite content, and backscattered electron image analysis. The amount of FA reacted determined by selective dissolution using EDTA/NaOH is found to be associated with a significant possible error, as different assumptions lead to large differences in the estimate of FA reacted. In addition, at longer hydration times, the reaction of the FA is underestimated by this method due to the presence of non-dissolved hydrates and MgO-rich particles. The dissolution of FA in diluted NaOH solution agreed well with the dissolution observed by image analysis during the first days. At 28 days and longer, the formation of hydrates in the diluted solutions leads to an underestimation. Image analysis appears to give consistent results and to be the most reliable technique studied.

  15. Impact Induced Delamination Detection and Quantification With Guided Wavefield Analysis

    NASA Technical Reports Server (NTRS)

    Tian, Zhenhua; Leckey, Cara A. C.; Yu, Lingyu; Seebo, Jeffrey P.

    2015-01-01

    This paper studies impact-induced delamination detection and quantification using guided wavefield data and spatial wavenumber imaging. The complex-geometry, impact-like delamination is created through quasi-static indentation of a CFRP plate. To detect and quantify the impact delamination in the CFRP plate, PZT-SLDV sensing and spatial wavenumber imaging are performed. In the PZT-SLDV sensing, the guided waves are generated by the PZT, and high-spatial-resolution guided wavefields are measured by the SLDV. The guided wavefield data acquired from the PZT-SLDV sensing represent guided wave propagation in the composite laminate, including guided wave interaction with the delamination damage. The measured wavefields are analyzed through the spatial wavenumber imaging method, which generates an image containing the dominant local wavenumber at each spatial location. The spatial wavenumber imaging result for the simple single-layer Teflon-insert delamination provided quantitative information on damage size and location. The location of delamination damage is indicated by the area with larger wavenumbers in the spatial wavenumber image. For the impact-like delamination, the results only partially agreed with the actual damage size and shape, and they also demonstrated a dependence on excitation frequency. Future work will further investigate the accuracy of the wavenumber imaging method for real composite damage and the dependence on excitation frequency.
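
    Reduced to one spatial dimension, the core of spatial wavenumber imaging can be sketched as a sliding-window FFT whose peak gives the dominant local wavenumber; the published method operates on 2-D wavefields with a wavenumber filter bank, so this is only a schematic analogue with an invented wavefield.

```python
import numpy as np

dx = 1e-3                                    # scan grid spacing, m
x = np.arange(0.0, 0.4, dx)
k1, k2 = 2 * np.pi * 200, 2 * np.pi * 400    # rad/m; k doubles over the "damage"
field = np.where((x > 0.2) & (x < 0.3), np.sin(k2 * x), np.sin(k1 * x))

win = 64                                     # spatial window, samples
k_axis = 2 * np.pi * np.fft.rfftfreq(win, d=dx)
dominant_k = []
for i in range(0, len(field) - win, win // 4):
    spec = np.abs(np.fft.rfft(np.hanning(win) * field[i:i + win]))
    dominant_k.append(k_axis[spec[1:].argmax() + 1])   # skip the DC bin
print(np.round(dominant_k))                  # larger k flags the delaminated zone
```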

  16. Toward automated quantification of biological microstructures using unbiased stereology

    NASA Astrophysics Data System (ADS)

    Bonam, Om P.; Elozory, Daniel; Kramer, Kurt; Goldgof, Dmitry; Hall, Lawrence O.; Mangual, Osvaldo; Mouton, Peter R.

    2011-03-01

    Quantitative analysis of biological microstructures using unbiased stereology plays a large and growing role in bioscience research. Our aim is to add a fully automatic, high-throughput mode to a commercially available, computerized stereology device (Stereologer). The current method for estimating first- and second-order parameters of biological microstructures requires a trained user to manually select objects of interest (cells, fibers, etc.) while stepping through the depth of a stained tissue section at fixed intervals. The proposed approach uses a combination of color and gray-level processing. Color processing is used to identify the objects of interest by training on the images to obtain the appropriate threshold range. In gray-level processing, a region-growing approach was used to assign a unique identity to the objects of interest and enumerate them. This automatic approach achieved an overall object detection rate of 93.27%. These results support the view that automatic color and gray-level processing, combined with unbiased sampling and assumption- and model-free geometric probes, can provide accurate and efficient quantification of biological objects.
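
    A compact stand-in for the two-stage pipeline described above: a threshold range (as if learned from training images) isolates candidate pixels, and connected-component labeling (in place of the region-growing step) assigns unique identities and enumerates objects. The image and thresholds are synthetic.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(1)
hue = rng.uniform(40, 180, size=(128, 128))      # toy hue channel, background
hue[20:30, 20:30] = 25.0                         # "stained object" 1
hue[60:75, 80:95] = 25.0                         # "stained object" 2

lo, hi = 20.0, 30.0                              # threshold range from training
mask = (hue >= lo) & (hue <= hi)
labels, n_objects = ndimage.label(mask)          # 4-connectivity by default
sizes = ndimage.sum(mask, labels, index=range(1, n_objects + 1))
print(f"objects of interest: {n_objects}, sizes (px): {list(sizes.astype(int))}")
```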

  17. Rapid method for the quantification of hydroquinone concentration: chemiluminescent analysis.

    PubMed

    Chen, Tung-Sheng; Liou, Show-Yih; Kuo, Wei-Wen; Wu, Hsi-Chin; Jong, Gwo-Ping; Wang, Hsueh-Fang; Shen, Chia-Yao; Padma, V Vijaya; Huang, Chih-Yang; Chang, Yen-Lin

    2015-11-01

    Topical hydroquinone serves as a skin whitener and is usually available in cosmetics or on prescription based on the hydroquinone concentration. Quantification of hydroquinone content therefore becomes an important issue in topical agents. High-performance liquid chromatography (HPLC) is the commonest method for determining hydroquinone content in topical agents, but this method is time-consuming and uses many solvents that can become an environmental issue. We report a rapid method for quantifying hydroquinone content by chemiluminescent analysis. Hydroquinone induces the production of hydrogen peroxide in the presence of basic compounds. Hydrogen peroxide induced by hydroquinone oxidized light-emitting materials such as lucigenin, resulted in the production of ultra-weak chemiluminescence that was detected by a chemiluminescence analyzer. The intensity of the chemiluminescence was found to be proportional to the hydroquinone concentration. We suggest that the rapid (measurement time, 60 s) and virtually solvent-free (solvent volume, <2 mL) chemiluminescent method described here for quantifying hydroquinone content may be an alternative to HPLC analysis. PMID:25693839

  18. Uncertainty Quantification applied to flow simulations in thoracic aortic aneurysms

    NASA Astrophysics Data System (ADS)

    Boccadifuoco, Alessandro; Mariotti, Alessandro; Celi, Simona; Martini, Nicola; Salvetti, Maria Vittoria

    2015-11-01

    The thoracic aortic aneurysm is a progressive dilatation of the thoracic aorta that weakens the aortic wall and may eventually lead to life-threatening events. Clinical decisions on treatment strategies are currently based on empiric criteria, like the aortic diameter value or its growth rate. Numerical simulations can quantify important indexes that are impossible to obtain through in-vivo measurements and can provide supplementary information. Hemodynamic simulations are carried out by using the open-source tool SimVascular and considering patient-specific geometries. One of the main issues in these simulations is the choice of suitable boundary conditions, modeling the organs and vessels not included in the computational domain. The current practice is to use outflow conditions based on resistance and capacitance, whose values are tuned to obtain a physiological behavior of the patient pressure. However, it is not known a priori how this choice affects the results of the simulation. The impact of the uncertainties in these outflow parameters is investigated here by using the generalized Polynomial Chaos approach. This analysis also permits calibration of the outflow-boundary parameters when patient-specific in-vivo data are available.
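
    The non-intrusive flavor of generalized Polynomial Chaos can be illustrated with a toy 0-D model: an uncertain outflow resistance, assumed Gaussian, is propagated through the model by Gauss-Hermite quadrature to obtain output statistics. The one-line pressure model and all parameter values below are invented stand-ins for the patient-specific 3-D simulations.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss

def peak_pressure(R):
    """Toy windkessel-like output: pressure for a given outflow resistance."""
    Q = 80.0                         # assumed flow, mL/s
    return Q * R                     # purely illustrative

mu_R, sigma_R = 1.2, 0.15            # uncertain resistance parameter
nodes, weights = hermegauss(12)      # quadrature for weight exp(-x^2/2)
weights = weights / np.sqrt(2 * np.pi)   # normalize to a standard normal pdf

vals = peak_pressure(mu_R + sigma_R * nodes)
mean = np.sum(weights * vals)
var = np.sum(weights * (vals - mean) ** 2)
print(f"pressure mean = {mean:.1f}, std = {np.sqrt(var):.1f} (toy units)")
```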

  19. Automated angiogenesis quantification through advanced image processing techniques.

    PubMed

    Doukas, Charlampos N; Maglogiannis, Ilias; Chatziioannou, Aristotle; Papapetropoulos, Andreas

    2006-01-01

    Angiogenesis, the formation of blood vessels in tumors, is an interactive process between tumor, endothelial, and stromal cells that creates a network for the oxygen and nutrient supply necessary for tumor growth. Accordingly, angiogenic activity is considered a suitable indicator for detecting both tumor growth and inhibition. The angiogenic potential is usually estimated by counting the number of blood vessels in particular sections. One of the most popular assay tissues for studying the angiogenesis phenomenon is the developing chick embryo and its chorioallantoic membrane (CAM), a highly vascular structure lining the inner surface of the egg shell. The aim of this study was to develop and validate an automated image analysis method that would give an unbiased quantification of the micro-vessel density and growth in angiogenic CAM images. The presented method has been validated by comparing automated results to manual counts over a series of digital chick embryo photos. The results indicate the high accuracy of the tool, which has thus been extensively used for tumor growth detection at different stages of embryonic development. PMID:17946107

  20. Electromagnetomechanical elastodynamic model for Lamb wave damage quantification in composites

    NASA Astrophysics Data System (ADS)

    Borkowski, Luke; Chattopadhyay, Aditi

    2014-03-01

    Physics-based wave propagation computational models play a key role in structural health monitoring (SHM) and the development of improved damage quantification methodologies. Guided waves (GWs), such as Lamb waves, provide the capability to monitor large plate-like aerospace structures with limited actuators and sensors and are sensitive to small-scale damage; however, due to the complex nature of GWs, accurate and efficient computational tools are necessary to investigate the mechanisms responsible for dispersion, coupling, and interaction with damage. In this paper, the local interaction simulation approach (LISA) coupled with the sharp interface model (SIM) solution methodology is used to solve the fully coupled electro-magneto-mechanical elastodynamic equations for the piezoelectric and piezomagnetic actuation and sensing of GWs in fiber reinforced composite material systems. The final framework provides the full three-dimensional displacement as well as electrical and magnetic potential fields for arbitrary plate and transducer geometries and excitation waveform and frequency. The model is validated experimentally and proven computationally efficient for a laminated composite plate. Studies are performed with surface-bonded piezoelectric and embedded piezomagnetic sensors to gain insight into the physics of experimental techniques used for SHM. The symmetric collocation of piezoelectric actuators is modeled to demonstrate mode suppression in laminated composites for the purpose of damage detection. The effect of delamination and damage (i.e., matrix cracking) on the GW propagation is demonstrated and quantified. The developed model provides a valuable tool for the improvement of SHM techniques due to its proven accuracy and computational efficiency.

  1. Fluorescent quantification of size and lamellarity of membrane nanotubes.

    PubMed

    Baroji, Younes F; Oddershede, Lene B; Seyed Reihani, Seyed Nader; Bendix, Poul M

    2014-12-01

    Membrane nanotubes, ubiquitous in cellular systems, adopt a spectrum of curvatures and shapes that are dictated by their intrinsic physical characteristics as well as their interactions with the local cellular environment. A high bending flexibility is needed in the crowded cytoplasm, where tubes often need to bend significantly in the axial direction at sub-micron length scales. We determine the stiffness of spontaneously formed membrane nanotubes by measuring the persistence length of reconstituted tubes freely suspended in solution and imaged by fluorescence microscopy. By quantifying the tube diameter, we demonstrate for the first time that the persistence length scales linearly with radius. Although most tubes are uni-lamellar, the predicted linear scaling between tube radius and persistence length allows us to identify tubes that spontaneously form as multilamellar structures upon hydration. We provide the first experimental evidence that illumination of lipid fluorophores can have a profound effect on the lipid bilayer, which we sensitively detect as a continuous change in the tube persistence length with time. The novel assay and methodology presented here have potential for quantification of the structural reinforcement of membrane tubes by scaffolding proteins.
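
    A standard estimator consistent with the measurement described: for a contour imaged in 2-D, tangent-angle correlations decay as ⟨cos Δθ(s)⟩ = exp(−s/2l_p). The sketch below fits l_p on a synthetic worm-like contour; the authors' exact estimator may differ, and the fit range is an assumption.

```python
import numpy as np

def persistence_length(xy, ds):
    """xy: (N,2) contour points sampled every ds; returns Lp in units of ds."""
    t = np.diff(xy, axis=0)
    theta = np.arctan2(t[:, 1], t[:, 0])          # tangent angles
    max_lag = min(len(theta) // 4, 150)           # fit range (assumed)
    lags = np.arange(1, max_lag)
    corr = np.array([np.mean(np.cos(theta[k:] - theta[:-k])) for k in lags])
    # linear fit of ln<cos> vs s; slope = -1/(2 Lp) for a 2-D projection
    slope = np.polyfit(lags * ds, np.log(np.clip(corr, 1e-9, None)), 1)[0]
    return -1.0 / (2.0 * slope)

# Synthetic 2-D worm-like chain with Lp = 5 um, sampled every 0.1 um
rng = np.random.default_rng(2)
ds, lp_true = 0.1, 5.0
angles = rng.normal(0, np.sqrt(ds / lp_true), 2000).cumsum()
xy = np.cumsum(np.column_stack([np.cos(angles), np.sin(angles)]) * ds, axis=0)
print(f"estimated Lp = {persistence_length(xy, ds):.1f} um (true 5.0)")
```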

  2. [Real time PCR methodology for quantification of nucleic acids].

    PubMed

    Tse, C; Capeau, J

    2003-01-01

    The polymerase chain reaction (PCR) has become an essential tool for molecular biologists, and its introduction into nucleic acid detection systems has revolutionized the quantitative analysis of DNA and RNA. The technique has rapidly evolved over the last few years, and the growing interest in quantitative applications of PCR has favoured the development of real-time quantitative PCR. In this paper, after presenting the theoretical aspects of PCR, we review the basic principles of real-time PCR, introducing the concept of the threshold cycle. More precisely, we describe the novel assay formats that greatly simplify the protocols used for the detection of specific nucleic acids. We focus on the four current technologies that enable sequence detection in a closed tube: SYBR Green I, TaqMan probes, Hybridization probes, and Molecular Beacon probes. We then discuss the different quantification strategies in real-time PCR and compare the competing instruments on the market. The most important real-time PCR applications in clinical biology are also described.
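
    One classic threshold-cycle quantification strategy covered by such reviews is the comparative 2^(−ΔΔCt) method: relative quantification against a reference gene and a calibrator sample, assuming near-100% amplification efficiency for both genes. The Ct values below are invented.

```python
def fold_change(ct_target_sample, ct_ref_sample, ct_target_ctrl, ct_ref_ctrl):
    """Relative expression via the comparative 2^(-ddCt) method."""
    d_ct_sample = ct_target_sample - ct_ref_sample   # normalize to reference gene
    d_ct_ctrl = ct_target_ctrl - ct_ref_ctrl
    dd_ct = d_ct_sample - d_ct_ctrl                  # compare to calibrator
    return 2.0 ** (-dd_ct)

# Target crosses threshold 3 cycles earlier in the treated sample -> ~8-fold more
print(fold_change(ct_target_sample=22.0, ct_ref_sample=18.0,
                  ct_target_ctrl=25.0, ct_ref_ctrl=18.0))   # 8.0
```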

  3. Atomic Resolution Imaging and Quantification of Chemical Functionality of Surfaces

    SciTech Connect

    Schwarz, Udo

    2014-12-10

    The work carried out from 2006-2014 under DoE support was targeted at developing new approaches to the atomic-scale characterization of surfaces that include species-selective imaging and an ability to quantify chemical surface interactions with site-specific accuracy. The newly established methods were subsequently applied to gain insight into the local chemical interactions that govern the catalytic properties of model catalysts of interest to DoE. The foundation of our work was the development of three-dimensional atomic force microscopy (3D-AFM), a new measurement mode that allows the mapping of the complete surface force and energy fields with picometer resolution in space (x, y, and z) and piconewton/millielectron volts in force/energy. From this experimental platform, we further expanded by adding the simultaneous recording of tunneling current (3D-AFM/STM) using chemically well-defined tips. Through comparison with simulations, we were able to achieve precise quantification and assignment of local chemical interactions to exact positions within the lattice. During the course of the project, the novel techniques were applied to surface-oxidized copper, titanium dioxide, and silicon oxide. On these materials, defect-induced changes to the chemical surface reactivity and electronic charge density were characterized with site-specific accuracy.

  4. An Uncertainty Quantification System for Tabular Equations of State

    NASA Astrophysics Data System (ADS)

    Carpenter, John; Robinson, Allen; Debusschere, Bert; Mattsson, Ann; Drake, Richard; Rider, William

    2013-06-01

    Providing analysts with information regarding the accuracy of computational models is key to enabling predictive design and engineering. Uncertainty in material models can make significant contributions to the overall uncertainty in calculations. As a first step toward tackling this large problem, we present an uncertainty quantification system for tabular equations of state (EOS). First, a posterior distribution of EOS model parameters is inferred using Bayes' rule and a set of experimental and computational data. EOS tables are generated for parameter states sampled from the posterior distribution. A new unstructured triangular table format allows for capturing multi-phase model behavior. A principal component analysis then reduces this set of tables to a mean table and the most significant perturbations. This final set of tables is provided to hydrocodes for performing simulations using standard non-intrusive uncertainty propagation methods. A multi-phase aluminum model is used to demonstrate the system. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
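
    The reduction step maps naturally onto an SVD: posterior-sampled tables are flattened into rows, the mean table is subtracted, and the leading right-singular vectors serve as the most significant perturbation modes. The dimensions, random data, and the 99% variance cut-off below are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
n_samples, n_nodes = 200, 1024        # posterior samples x flattened table size
signal = rng.normal(size=(n_samples, 3)) @ rng.normal(size=(3, n_nodes))
tables = signal + rng.normal(scale=0.01, size=(n_samples, n_nodes))

mean_table = tables.mean(axis=0)
U, S, Vt = np.linalg.svd(tables - mean_table, full_matrices=False)
energy = np.cumsum(S**2) / np.sum(S**2)
n_keep = int(np.searchsorted(energy, 0.99) + 1)   # modes covering 99% variance
modes = Vt[:n_keep]                               # principal perturbation tables
print(f"kept {n_keep} perturbation modes out of {len(S)}")
```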

  5. Impact induced delamination detection and quantification with guided wavefield analysis

    NASA Astrophysics Data System (ADS)

    Tian, Zhenhua; Leckey, Cara A. C.; Yu, Lingyu; Seebo, Jeffrey P.

    2015-04-01

    This paper studies impact-induced delamination detection and quantification methods based on guided wavefield data and spatial wavenumber imaging. In this study, the complex-geometry, impact-like delamination damage in a composite laminate is created through the quasi-static indentation technique. To detect and quantify the delamination damage, guided ultrasonic waves are excited through a piezoelectric actuator, and the guided wavefields are measured by a scanning laser Doppler vibrometer. The acquired guided wavefields contain a wealth of information regarding wave propagation in the composite plate and the complex wave interaction at the delamination region. To process the wavefield data and evaluate the delamination damage, the measured wavefields are analyzed through the spatial wavenumber imaging method, which generates an image containing the dominant local wavenumber at each spatial location. As a proof of concept, the approach is first applied to a single Teflon insert simulating a delamination, and then to the complex-geometry, impact-like delamination damage. The results show that spatial wavenumber imaging can not only determine the delamination location but also provide quantitative information on the delamination size and shape. The detection results for the impact-induced delamination are compared to an ultrasonic C-scan image, and wavenumber images are studied for two different excitation frequencies. Fairly good agreement is observed for portions of the delamination, and differences in wavenumber are observed at the two frequencies. The results demonstrate that spatial wavenumber imaging is a promising technique for yielding delamination location and size information.

  6. Automated core-penumbra quantification in neonatal ischemic brain injury.

    PubMed

    Ghosh, Nirmalya; Yuan, Xiangpeng; Turenius, Christine I; Tone, Beatriz; Ambadipudi, Kamalakar; Snyder, Evan Y; Obenaus, Andre; Ashwal, Stephen

    2012-12-01

    Neonatal hypoxic-ischemic brain injury (HII) and arterial ischemic stroke (AIS) result in irreversibly injured (core) and salvageable (penumbral) tissue regions. Identification and reliable quantification of salvageable tissue is pivotal to any effective and safe intervention. Magnetic resonance imaging (MRI) is the current standard to distinguish core from penumbra using diffusion-perfusion mismatch (DPM). However, subtle MR signal variations between core-penumbral regions make their visual delineation difficult. We hypothesized that computational analysis of MRI data provides a more accurate assessment of core and penumbral tissue evolution in HII/AIS. We used two neonatal rat-pup models of HII/AIS (unilateral and global hypoxic-ischemia) and clinical data sets from neonates with AIS to test our noninvasive, automated computational approach, Hierarchical Region Splitting (HRS), to detect and quantify ischemic core-penumbra using only a single MRI modality (T2- or diffusion-weighted imaging, T2WI/DWI). We also validated our approach by comparing core-penumbral images (from HRS) to DPM with immunohistochemical validation of HII tissues. Our translational and clinical data results showed that HRS could accurately and reliably distinguish the ischemic core from penumbra and their spatiotemporal evolution, which may aid in the vetting and execution of effective therapeutic interventions as well as patient selection.

  7. Quantification of invertase activity in ants under field conditions.

    PubMed

    Heil, Martin; Büchler, Rita; Boland, Wilhelm

    2005-02-01

    Invertases (EC 3.2.1.26) are hydrolases that cleave sucrose into the monosaccharides glucose and fructose. They play a central role in the carbohydrate metabolism of plants and animals. Methods presented so far to quantify invertase activity in ants or other animals have been hampered by the variability in both substrates and products of the enzymatic reaction in animals whose carbohydrate metabolism is highly active. Our method is based on a spectrophotometric quantification of the kinetics of glucose release. We first obtained an equilibrium state summarizing the reactions of any carbohydrates and enzymes present in the extract. Sucrose was then added to quantify invertase activity as newly released glucose. Invertase activities differed significantly among species of ants. Variances were lowest among individuals from the same colony and highest among different species. When preparations were made from ants of the same species, invertase activity was linearly related to the number of ants used for extraction. Our method does not require ants to be kept on specific substrates prior to the experiment, or expensive or large equipment. It thus appears suitable for addressing a broad range of physiological, ecological, and evolutionary questions.

  8. Quantification of phospholipids in excised tissues by NMR

    SciTech Connect

    Barany, M.; Venkatasubramanian, P.N.

    1986-05-01

    A fraction of the total phospholipids is visible in natural-abundance ¹³C NMR spectra of diseased human muscle biopsies which have been extracted with isopentane to remove neutral fats. The authors have quantified the visible phospholipids by inserting into the muscle biopsies a dioxane capillary which was calibrated against phospholipid vesicles (with known phosphate concentration) prepared from rat muscle and liver, and against pure palmitic and linolenic acid. The phospholipid content of the human muscles was calculated from the integrated peak area of the dioxane capillary, from the area of the 30.5 and 128.5 ppm peaks in the ¹³C spectrum of the muscle, and from the dry weight of the muscle, determined on the same sample which was used for ¹³C spectroscopy. The same experiments were carried out with rat muscle, brain, liver, and kidney. Furthermore, after the ¹³C spectrum of the tissue was recorded, excess halothane was added into the NMR tube, the sample incubated, and the ¹³C spectrum re-recorded for quantification of the total phospholipids. Thus, their procedure quantitates both the NMR-visible and the total phospholipids of the tissue. The total phospholipid content determined by NMR was in good agreement with that determined by chemical analysis.

  9. Numerical Continuation Methods for Intrusive Uncertainty Quantification Studies

    SciTech Connect

    Safta, Cosmin; Najm, Habib N.; Phipps, Eric Todd

    2014-09-01

    Rigorous modeling of engineering systems relies on efficient propagation of uncertainty from input parameters to model outputs. In recent years, there has been substantial development of probabilistic polynomial chaos (PC) Uncertainty Quantification (UQ) methods, enabling studies in expensive computational models. One approach, termed "intrusive", involving reformulation of the governing equations, has been found to have superior computational performance compared to non-intrusive sampling-based methods in relevant large-scale problems, particularly in the context of emerging architectures. However, the utility of intrusive methods has been severely limited due to detrimental numerical instabilities associated with strong nonlinear physics. Previous methods for stabilizing these constructions tend to add unacceptably high computational costs, particularly in problems with many uncertain parameters. In order to address these challenges, we propose to adapt and improve numerical continuation methods for the robust time integration of intrusive PC system dynamics. We propose adaptive methods, starting with a small uncertainty for which the model has stable behavior and gradually moving to larger uncertainty where the instabilities are rampant, in a manner that provides a suitable solution.

  10. Automated quantification of thyrotropin by radial partition immunoassay.

    PubMed

    Rugg, J A; Flaa, C W; Dawson, S R; Rigl, C T; Leung, K S; Evans, S A

    1988-01-01

    We describe a radial partition enzyme immunoassay in which fully automated quantification of human thyrotropin (hTSH) takes less than 11 min. This "sandwich"-type assay involves two monoclonal antibodies, both specific for the intact hTSH molecule. The solid phase consists of tabs of glass-fiber filter paper containing a pre-immobilized monoclonal anti-hTSH antibody complexed with a goat antibody specific for the Fc region of mouse IgG. The patient's sample is first applied to the central "reaction zone" of the tab, wherein hTSH binds to the immobilized antibody. Application of a buffered solution containing enzyme-labeled Fab' fragments of the second monoclonal anti-hTSH antibody initiates "sandwich" formation. A wash buffer containing a fluorogenic substrate elutes unbound conjugate to the tab periphery. The bound enzyme conjugate is quantified by measuring the rate of increase in fluorescence in the reaction zone of the tab, then converting the rate to clinical units by comparison with a stored calibration curve. The clinical utility and performance of the present assay compare favorably with those of other sensitive assays for hTSH.

  11. Experimental model for civilian ballistic brain injury biomechanics quantification.

    PubMed

    Zhang, Jiangyue; Yoganandan, Narayan; Pintar, Frank A; Guan, Yabo; Gennarelli, Thomas A

    2007-01-01

    Biomechanical quantification of projectile penetration using experimental head models can enhance the understanding of civilian ballistic brain injury and advance treatment. Two of the most commonly used handgun projectiles (25-cal, 275 m/s and 9 mm, 395 m/s) were discharged into spherical head models with gelatin and Sylgard simulants. Four ballistic pressure transducers recorded temporal pressure distributions at 308 kHz, and temporal cavity dynamics were captured at 20,000 frames/second (fps) using high-speed digital video images. Pressures ranged from 644.6 to -92.8 kPa. Entry pressures in gelatin models were higher than exit pressures, whereas in Sylgard models entry pressures were lower than or equivalent to exit pressures. Gelatin responded with brittle-type failure, while Sylgard demonstrated a ductile pattern through the formation of micro-bubbles along the projectile path. Temporary cavities in Sylgard models were 1.5-2× larger than in gelatin models. Pressures in Sylgard models were more sensitive to increases in projectile velocity and diameter, indicating Sylgard was more rate-sensitive than gelatin. Based on failure patterns and brain-tissue rate-sensitive characteristics, Sylgard was found to be an appropriate simulant. Compared with spherical-projectile data, full-metal jacket (FMJ) projectiles produced different temporary cavities and pressures, demonstrating shape effects. Models using Sylgard gel and FMJ projectiles are appropriate for enhancing understanding of the mechanisms of ballistic brain injury.

  12. Investigation of nonlinear pupil dynamics by recurrence quantification analysis.

    PubMed

    Mesin, L; Monaco, A; Cattaneo, R

    2013-01-01

    The pupil is controlled by the autonomic nervous system (ANS). It shows complex movements and changes of size even under constant stimulation. We study the possibility of extracting information on the ANS by processing data recorded during a short experiment using a low-cost system for pupil investigation, and we investigate the significance of the nonlinear information contained in the pupillogram. We examined 13 healthy subjects in different stationary conditions, considering habitual dental occlusion (HDO) as a weak stimulation of the ANS with respect to maintenance of the rest position (RP) of the jaw. Images of the pupil captured by infrared cameras were processed to estimate its position and size in each frame. From such time series, we extracted linear indexes (e.g., average size, average displacement, and spectral parameters) and nonlinear information using recurrence quantification analysis (RQA). Data were classified using multilayer perceptrons and support vector machines trained on different sets of input indexes: the best classification performance was obtained when nonlinear indexes were included in the input features. These results indicate that RQA nonlinear indexes provide additional information on pupil dynamics with respect to linear descriptors, allowing the discrimination of even a slight stimulation of the ANS. Their use in the investigation of pathology is suggested. PMID:24187665
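
    Two of the most common RQA measures can be sketched compactly: the recurrence rate (density of the thresholded distance matrix of the time-delay-embedded signal) and determinism (fraction of recurrent points lying on diagonal lines). The embedding parameters, threshold, and test signal below are illustrative, not those used in the study.

```python
import numpy as np

def rqa(x, dim=3, tau=2, eps_frac=0.2, lmin=2):
    """Recurrence rate and determinism of a time-delay-embedded series."""
    n = len(x) - (dim - 1) * tau
    emb = np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])
    d = np.linalg.norm(emb[:, None] - emb[None, :], axis=-1)
    R = (d < eps_frac * d.std()).astype(int)
    recurrences = R.sum() - n                # exclude the line of identity
    rr = recurrences / (n * n - n)
    diag_pts = 0                             # points on diagonals of length >= lmin
    for k in range(-(n - 1), n):
        if k == 0:
            continue
        line = np.r_[0, np.diagonal(R, k), 0]
        runs = np.diff(np.flatnonzero(np.diff(line)))[::2]   # run lengths of 1s
        diag_pts += runs[runs >= lmin].sum()
    return rr, diag_pts / max(recurrences, 1)

t = np.linspace(0, 20 * np.pi, 600)
x = np.sin(t) + 0.1 * np.random.default_rng(4).normal(size=600)
rr, det = rqa(x)
print(f"RR = {rr:.3f}, DET = {det:.3f}")     # periodic signals give high DET
```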

  13. An improved method for the quantification of SOA bound peroxides

    NASA Astrophysics Data System (ADS)

    Mutzel, Anke; Rodigast, Maria; Iinuma, Yoshiteru; Böge, Olaf; Herrmann, Hartmut

    2013-03-01

    An improvement is made to a method for the quantification of SOA-bound peroxides. The procedure is based on an iodometric-spectrophotometric method that has been commonly used for the determination of peroxides in a wide range of biological and environmental samples. The improved method was applied to determine the peroxide content of laboratory-generated SOA from α-pinene ozonolysis. Besides the main improvements to the detection conditions, more environmentally sound solvents are used in place of carcinogenic ones. In addition to the improved method for peroxide determination, the present study provides evidence for artefact formation caused by ultrasonic agitation during the extraction of organic compounds from SOA filter samples. The concentrations of SOA-bound peroxides in the extracts from ultrasonic agitation were up to three times higher than those from a laboratory orbital shaker under the same extraction conditions, indicating peroxide formation caused by acoustic cavitation during extraction. In contrast, pinic acid, terebic acid, and terpenylic acid showed significantly lower concentrations in the sample extract prepared using ultrasonic agitation, indicating that these compounds react with OH radicals formed by acoustic cavitation. Great care should be taken when extracting SOA samples, and the use of ultrasound should be avoided.

  14. Classification and quantification of bacteriophage taxa in human gut metagenomes

    PubMed Central

    Waller, Alison S; Yamada, Takuji; Kristensen, David M; Kultima, Jens Roat; Sunagawa, Shinichi; Koonin, Eugene V; Bork, Peer

    2014-01-01

    Bacteriophages have key roles in microbial communities, to a large extent shaping the taxonomic and functional composition of the microbiome, but data on the connections between phage diversity and the composition of communities are scarce. Using taxon-specific marker genes, we identified and monitored 20 viral taxa in 252 human gut metagenomic samples, mostly at the level of genera. On average, five phage taxa were identified in each sample, with up to three of these being highly abundant. The abundances of most phage taxa vary by up to four orders of magnitude between the samples, and several taxa that are highly abundant in some samples are absent in others. Significant correlations exist between the abundances of some phage taxa and human host metadata: for example, 'Group 936 lactococcal phages' are more prevalent and abundant in Danish samples than in samples from Spain or the United States of America. Quantification of phages that exist as integrated prophages revealed that the abundance profiles of prophages are highly individual-specific and remain unique to an individual over a 1-year time period, and prediction of prophage lysis across the samples identified hundreds of prophages that are apparently active in the gut and vary across the samples, in terms of presence and lytic state. Finally, a prophage-host network of the human gut was established and includes numerous novel host-phage associations. PMID:24621522

  15. freeQuant: A Mass Spectrometry Label-Free Quantification Software Tool for Complex Proteome Analysis.

    PubMed

    Deng, Ning; Li, Zhenye; Pan, Chao; Duan, Huilong

    2015-01-01

    The study of complex proteomes places higher demands on quantification methods using mass spectrometry. In this paper, we present a mass spectrometry label-free quantification tool for complex proteomes, called freeQuant, which effectively integrates quantification with functional analysis. freeQuant consists of two well-integrated modules: label-free quantification and functional analysis with biomedical knowledge. freeQuant supports label-free quantitative analysis which makes full use of tandem mass spectrometry (MS/MS) spectral counts, protein sequence length, shared peptides, and ion intensity. It adopts spectral counting for quantitative analysis and introduces a new treatment of shared peptides to accurately evaluate the abundance of isoforms. For proteins with low abundance, MS/MS total ion count coupled with spectral count is included to ensure accurate protein quantification. Furthermore, freeQuant supports large-scale functional annotation for complex proteomes. Mitochondrial proteomes from the mouse heart, the mouse liver, and the human heart were used to evaluate the usability and performance of freeQuant. The evaluation showed that the quantitative algorithms implemented in freeQuant can improve the accuracy of quantification and provide a better dynamic range.
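
    freeQuant layers ion intensity and shared-peptide handling on top of length-aware spectral counting; a common baseline underlying such schemes, the normalized spectral abundance factor NSAF_i = (SpC_i/L_i) / Σ_j (SpC_j/L_j), is easy to state. The protein names below are real mitochondrial proteins, but the counts are invented.

```python
# Length-normalized spectral-count baseline (NSAF), not freeQuant's full algorithm
spectral_counts = {"ATP5A1": 148, "NDUFS1": 67, "CS": 23}   # MS/MS spectra per protein
seq_lengths = {"ATP5A1": 553, "NDUFS1": 727, "CS": 466}     # residues

saf = {p: spectral_counts[p] / seq_lengths[p] for p in spectral_counts}
total = sum(saf.values())
nsaf = {p: v / total for p, v in saf.items()}               # normalize across sample
for protein, value in nsaf.items():
    print(f"{protein}: NSAF = {value:.3f}")
```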

  16. Method for Indirect Quantification of CH4 Production via H2O Production Using Hydrogenotrophic Methanogens.

    PubMed

    Taubner, Ruth-Sophie; Rittmann, Simon K-M R

    2016-01-01

    Hydrogenotrophic methanogens are an intriguing group of microorganisms from the domain Archaea. Methanogens exhibit extraordinary ecological, biochemical, and physiological characteristics and possess a huge biotechnological potential. Until now, the only way to assess the methane (CH4) production potential of hydrogenotrophic methanogens has been gas chromatographic quantification of CH4. In order to effectively screen pure cultures of hydrogenotrophic methanogens for their CH4 production potential, we developed a novel method for indirect quantification of the volumetric CH4 production rate by measuring the volumetric water production rate. The method was established in serum bottles for cultivation of methanogens in closed batch cultivation mode. Water production was estimated by determining the difference in mass increase in a quasi-isobaric setting. This novel CH4 quantification method is an accurate and precise analytical technique that can be used to rapidly screen pure cultures of methanogens for their volumetric CH4 evolution rate. It is a cost-effective alternative to gas chromatographic CH4 quantification for determining the CH4 production of methanogens, especially when applied as a high-throughput quantification method. Eventually, the method can be universally applied for quantification of CH4 production by psychrophilic, thermophilic and hyperthermophilic hydrogenotrophic methanogens. PMID:27199898
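
    The indirect quantification rests on the stoichiometry of hydrogenotrophic methanogenesis, 4 H2 + CO2 -> CH4 + 2 H2O: every mole of CH4 formed is accompanied by two moles of water, so a water mass gain measured in a quasi-isobaric setting fixes the amount of CH4 produced. A minimal sketch of that conversion; the example mass gain and the ideal-gas molar volume at 25 degrees C are illustrative assumptions, not values from the record.

      M_H2O = 18.015          # g/mol
      V_M = 24.465            # L/mol, ideal gas molar volume at 25 degC, 1 atm

      def ch4_volume_from_water_mass(delta_m_grams):
          """CH4 produced (L) inferred from water mass gain, via 1 CH4 : 2 H2O."""
          n_h2o = delta_m_grams / M_H2O
          n_ch4 = n_h2o / 2.0
          return n_ch4 * V_M

      # Example: 0.36 g mass increase over 48 h in closed batch cultivation
      v_ch4 = ch4_volume_from_water_mass(0.36)
      rate = v_ch4 / 48.0     # L/h, volumetric CH4 production rate
      print(f"CH4 produced: {v_ch4*1000:.1f} mL; rate: {rate*1000:.2f} mL/h")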

  17. Method for Indirect Quantification of CH4 Production via H2O Production Using Hydrogenotrophic Methanogens

    PubMed Central

    Taubner, Ruth-Sophie; Rittmann, Simon K.-M. R.

    2016-01-01

    Hydrogenotrophic methanogens are an intriguing group of microorganisms from the domain Archaea. Methanogens exhibit extraordinary ecological, biochemical, and physiological characteristics and possess a huge biotechnological potential. Until now, the only way to assess the methane (CH4) production potential of hydrogenotrophic methanogens has been gas chromatographic quantification of CH4. In order to effectively screen pure cultures of hydrogenotrophic methanogens for their CH4 production potential, we developed a novel method for indirect quantification of the volumetric CH4 production rate by measuring the volumetric water production rate. The method was established in serum bottles for cultivation of methanogens in closed batch cultivation mode. Water production was estimated by determining the difference in mass increase in a quasi-isobaric setting. This novel CH4 quantification method is an accurate and precise analytical technique that can be used to rapidly screen pure cultures of methanogens for their volumetric CH4 evolution rate. It is a cost-effective alternative to gas chromatographic CH4 quantification for determining the CH4 production of methanogens, especially when applied as a high-throughput quantification method. Eventually, the method can be universally applied for quantification of CH4 production by psychrophilic, thermophilic and hyperthermophilic hydrogenotrophic methanogens. PMID:27199898

  18. The Effect of Peptide Identification Search Algorithms on MS2-Based Label-Free Protein Quantification

    PubMed Central

    Degroeve, Sven; Staes, An; De Bock, Pieter-Jan

    2012-01-01

    Several approaches exist for the quantification of proteins in complex samples processed by liquid chromatography-mass spectrometry followed by fragmentation analysis (MS2). One of these approaches is label-free MS2-based quantification, which takes advantage of the information computed from MS2 spectrum observations to estimate the abundance of a protein in a sample. As a first step in this approach, fragmentation spectra are typically matched to the peptides that generated them by a search algorithm. Because different search algorithms identify overlapping but non-identical sets of peptides, here we investigate whether these differences in peptide identification have an impact on the quantification of the proteins in the sample. We therefore evaluated the effect of using different search algorithms by examining the reproducibility of protein quantification in technical repeat measurements of the same sample. From our results, it is clear that a search engine effect does exist for MS2-based label-free protein quantification methods. As a general conclusion, it is recommended to address the overall possibility of search engine-induced bias in the protein quantification results of label-free MS2-based methods by performing the analysis with two or more distinct search engines. PMID:22804230
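
    The reproducibility criterion used here can be computed directly once each search engine's per-protein quantification values are tabulated per technical repeat: a lower median coefficient of variation across repeats means more reproducible quantification. A minimal sketch with invented values:

      import numpy as np

      # Per-protein quantification values from three technical repeats,
      # as produced by two different search engines (invented numbers).
      engine_a = np.array([[1.00, 0.95, 1.08],
                           [0.20, 0.26, 0.18],
                           [0.05, 0.04, 0.07]])
      engine_b = np.array([[1.02, 0.99, 1.01],
                           [0.22, 0.21, 0.23],
                           [0.03, 0.08, 0.05]])

      def median_cv(values):
          """Median coefficient of variation across technical repeats."""
          cv = values.std(axis=1, ddof=1) / values.mean(axis=1)
          return np.median(cv)

      print(f"engine A median CV: {median_cv(engine_a):.2%}")
      print(f"engine B median CV: {median_cv(engine_b):.2%}")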

  19. Quantification Bias Caused by Plasmid DNA Conformation in Quantitative Real-Time PCR Assay

    PubMed Central

    Lin, Chih-Hui; Chen, Yu-Chieh; Pan, Tzu-Ming

    2011-01-01

    Quantitative real-time PCR (qPCR) is the gold standard for the quantification of specific nucleic acid sequences. However, a serious concern has been revealed in a recent report: supercoiled plasmid standards cause significant over-estimation in qPCR quantification. In this study, we investigated the effect of plasmid DNA conformation on the quantification of DNA and the efficiency of qPCR. Our results suggest that plasmid DNA conformation has significant impact on the accuracy of absolute quantification by qPCR. DNA standard curves shifted significantly among plasmid standards with different DNA conformations. Moreover, the choice of DNA measurement method and plasmid DNA conformation may also contribute to the measurement error of DNA standard curves. Due to the multiple effects of plasmid DNA conformation on the accuracy of qPCR, efforts should be made to assure the highest consistency of plasmid standards for qPCR. Thus, we suggest that the conformation, preparation, quantification, purification, handling, and storage of standard plasmid DNA should be described and defined in the Minimum Information for Publication of Quantitative Real-Time PCR Experiments (MIQE) to assure the reproducibility and accuracy of qPCR absolute quantification. PMID:22194997
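
    Absolute qPCR quantification rests on a standard curve of Cq against log10 copy number: the slope gives the amplification efficiency via E = 10^(-1/slope) - 1, and a conformation-dependent shift of the curve propagates directly into the copy-number estimate. The sketch below shows the two standard calculations; the 650 g/mol per base pair average mass and all numbers are illustrative conventions, not values from the study.

      import numpy as np

      AVOGADRO = 6.022e23
      BP_MASS = 650.0                          # average g/mol per base pair (dsDNA)

      def plasmid_copies(mass_ng, length_bp):
          """Copy number of a dsDNA plasmid standard from its mass and length."""
          return mass_ng * 1e-9 * AVOGADRO / (length_bp * BP_MASS)

      # Illustrative standard curve: Cq values for a 10-fold dilution series
      log10_copies = np.array([7, 6, 5, 4, 3], dtype=float)
      cq = np.array([14.1, 17.5, 20.9, 24.3, 27.7])

      slope, intercept = np.polyfit(log10_copies, cq, 1)
      efficiency = 10.0 ** (-1.0 / slope) - 1.0    # 1.0 corresponds to 100%

      def quantify(cq_sample):
          """Absolute copy number of an unknown from its Cq."""
          return 10.0 ** ((cq_sample - intercept) / slope)

      print(f"slope = {slope:.2f}, efficiency = {efficiency*100:.0f}%")
      print(f"unknown at Cq 19.2: {quantify(19.2):.2e} copies")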

  20. Bacterial adhesion force quantification by fluidic force microscopy

    NASA Astrophysics Data System (ADS)

    Potthoff, Eva; Ossola, Dario; Zambelli, Tomaso; Vorholt, Julia A.

    2015-02-01

    Quantification of detachment forces between bacteria and substrates facilitates the understanding of the bacterial adhesion process that affects cell physiology and survival. Here, we present a method that allows for serial, single bacterial cell force spectroscopy by combining the force control of atomic force microscopy with microfluidics. Reversible bacterial cell immobilization under physiological conditions on the pyramidal tip of a microchanneled cantilever is achieved by underpressure. Using the fluidic force microscopy technology (FluidFM), we achieve immobilization forces greater than those of state-of-the-art cell-cantilever binding as demonstrated by the detachment of Escherichia coli from polydopamine with recorded forces between 4 and 8 nN for many cells. The contact time and setpoint dependence of the adhesion forces of E. coli and Streptococcus pyogenes, as well as the sequential detachment of bacteria out of a chain, are shown, revealing distinct force patterns in the detachment curves. This study demonstrates the potential of the FluidFM technology for quantitative bacterial adhesion measurements of cell-substrate and cell-cell interactions that are relevant in biofilms and infection biology.
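
    In single-cell force spectroscopy the detachment force is read from the retract segment of the force-distance curve: cantilever deflection times spring constant, with the adhesion force taken as the largest attractive excursion below the baseline. A generic sketch of that readout, not the FluidFM acquisition software; the synthetic curve and the 0.1 N/m stiffness are assumptions.

      import numpy as np

      def detachment_force(deflection_m, spring_constant_n_per_m):
          """Maximum adhesion force (N) from the retract deflection signal.

          Assumes the deflection is baseline-corrected so that attractive
          (downward) bending is negative; the detachment force is the
          magnitude of the most negative force excursion.
          """
          force = spring_constant_n_per_m * np.asarray(deflection_m)
          return -force.min()

      # Illustrative retract curve: flat baseline with an adhesive dip
      z = np.linspace(0.0, 1e-6, 500)
      deflection = -60e-9 * np.exp(-((z - 0.2e-6) / 0.05e-6) ** 2)  # 60 nm dip
      k = 0.1                                  # N/m, nominal cantilever stiffness
      print(f"adhesion force: {detachment_force(deflection, k)*1e9:.1f} nN")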

  1. Christiansen Revisited: Rethinking Quantification of Uniformity in Rainfall Simulator Studies

    NASA Astrophysics Data System (ADS)

    Green, Daniel; Pattison, Ian

    2016-04-01

    Rainfall simulators, whether based within a laboratory or field setting, are used extensively within a number of fields of research, including plot-scale runoff, infiltration and erosion studies, irrigation and crop management, and scaled investigations into urban flooding. Rainfall simulators offer a number of benefits, including the ability to create regulated and repeatable rainfall characteristics (e.g. intensity, duration, drop size distribution and kinetic energy) without relying on unpredictable natural precipitation regimes. Ensuring and quantifying spatially uniform simulated rainfall across the entirety of the plot area is of particular importance to researchers undertaking rainfall simulation. As a result, numerous studies have focused on the quantification and improvement of uniformity values. Several statistical methods for the assessment of rainfall simulator uniformity have been developed. However, the Christiansen Uniformity Coefficient (CUC) suggested by Christiansen (1942) is most frequently used. Despite this, there is no set methodology, and researchers can adapt or alter factors such as the quantity, spacing, distance and location of the measuring beakers used to derive CUC values. Because CUC values are highly sensitive to the resolution of the data, i.e. the number of observations taken, many densely distributed measuring containers subjected to the same experimental conditions may generate a significantly lower CUC value than fewer, more sparsely distributed measuring containers. Thus, the simulated rainfall under a higher-resolution sampling method could appear less uniform than under a coarser-resolution sampling method, despite being derived from the same initial rainfall conditions. Expressing entire plot uniformity as a single, simplified percentage value disregards valuable qualitative information about plot uniformity, such as the small-scale spatial distribution of rainfall over the plot surface and whether these
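
    Christiansen's coefficient, as commonly applied, is CUC = 100(1 - sum|x_i - xbar| / sum x_i), where x_i are the catch depths in the individual measuring containers. The sketch below computes the formula and scores the same simulated rainfall field at two sampling resolutions; the synthetic field is an assumption for illustration.

      import numpy as np

      def cuc(catches):
          """Christiansen Uniformity Coefficient (%) for container catches."""
          x = np.asarray(catches, dtype=float)
          return 100.0 * (1.0 - np.abs(x - x.mean()).sum() / x.sum())

      rng = np.random.default_rng(0)

      # One simulated plot-scale rainfall field (mm), sampled at two resolutions
      field = 10.0 + rng.normal(0.0, 1.5, size=(32, 32))
      coarse = field[::8, ::8]    # 16 sparsely spaced containers
      fine = field[::2, ::2]      # 256 densely spaced containers

      print(f"coarse sampling: CUC = {cuc(coarse.ravel()):.1f}%")
      print(f"fine sampling:   CUC = {cuc(fine.ravel()):.1f}%")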

  2. Improved Uncertainty Quantification in Groundwater Flux Estimation Using GRACE

    NASA Astrophysics Data System (ADS)

    Reager, J. T., II; Rao, P.; Famiglietti, J. S.; Turmon, M.

    2015-12-01

    Groundwater change is difficult to monitor over large scales. One of the most successful approaches is the remote sensing of time-variable gravity using NASA Gravity Recovery and Climate Experiment (GRACE) mission data, and successful case studies have created the opportunity to move towards a global groundwater monitoring framework for the world's largest aquifers. To achieve these estimates, several approximations are applied, including those in GRACE processing corrections, the formulation of the formal GRACE errors, destriping and signal recovery, and the numerical model estimation of the snow water, surface water and soil moisture storage states used to isolate a groundwater component. A major weakness in these approaches is inconsistency: different studies have used different sources of primary and ancillary data, and may achieve different results based on alternative choices in these approximations. In this study, we present two cases of groundwater change estimation, in California and the Colorado River basin, selected for their good data availability and varied climates. We achieve a robust numerical estimate of post-processing uncertainties resulting from land-surface model structural shortcomings and model resolution errors. Groundwater variations should demonstrate less variability than the overlying soil moisture state does, as groundwater has a longer memory of past events due to buffering by infiltration and drainage rate limits. We apply a model ensemble approach in a Bayesian framework constrained by the assumption of decreasing signal variability with depth in the soil column. We also discuss time-variable vs. time-constant errors, across-scale vs. across-model errors, and error spectral content (across scales and across models). More robust uncertainty quantification for GRACE-based groundwater estimates would take all of these issues into account, allowing for fairer use in management applications and for better integration of GRACE
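
    The groundwater isolation step referred to here is the usual water-balance disaggregation: the groundwater storage anomaly is the GRACE total water storage anomaly minus the modeled snow, surface water and soil moisture stores, and the spread of a land-surface model ensemble gives one handle on post-processing uncertainty. A schematic sketch under those assumptions, with invented numbers:

      import numpy as np

      # Illustrative monthly anomalies (cm equivalent water height)
      tws_grace = np.array([2.0, 1.1, -0.4, -1.8, -2.5])   # GRACE TWS anomaly

      # Ensemble of land-surface models for the non-groundwater stores
      # (soil moisture + snow + surface water), one row per model
      lsm_ensemble = np.array([
          [1.2, 0.6, -0.1, -0.9, -1.2],
          [1.5, 0.8,  0.1, -0.7, -1.0],
          [0.9, 0.4, -0.3, -1.1, -1.5],
      ])

      # Groundwater anomaly per ensemble member, then mean and spread
      gw_members = tws_grace - lsm_ensemble
      gw_mean = gw_members.mean(axis=0)
      gw_sigma = gw_members.std(axis=0, ddof=1)   # model-structure uncertainty proxy

      for t, (m, s) in enumerate(zip(gw_mean, gw_sigma)):
          print(f"month {t}: GW anomaly = {m:+.2f} +/- {s:.2f} cm")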

  3. Uncertainty quantification in nanomechanical measurements using the atomic force microscope

    NASA Astrophysics Data System (ADS)

    Wagner, Ryan; Moon, Robert; Pratt, Jon; Shaw, Gordon; Raman, Arvind

    2011-11-01

    Quantifying uncertainty in measured properties of nanomaterials is a prerequisite for the manufacture of reliable nanoengineered materials and products. Yet, rigorous uncertainty quantification (UQ) is rarely applied for material property measurements with the atomic force microscope (AFM), a widely used instrument that can measure properties at nanometer scale resolution of both inorganic and biological surfaces and nanomaterials. We present a framework to ascribe uncertainty to local nanomechanical properties of any nanoparticle or surface measured with the AFM by taking into account the main uncertainty sources inherent in such measurements. We demonstrate the framework by quantifying uncertainty in AFM-based measurements of the transverse elastic modulus of cellulose nanocrystals (CNCs), an abundant, plant-derived nanomaterial whose mechanical properties are comparable to Kevlar fibers. For a single, isolated CNC the transverse elastic modulus was found to have a mean of 8.1 GPa and a 95% confidence interval of 2.7-20 GPa. A key result is that multiple replicates of force-distance curves do not sample the important sources of uncertainty, which are systematic in nature. The dominant source of uncertainty is the nondimensional photodiode sensitivity calibration rather than the cantilever stiffness or Z-piezo calibrations. The results underscore the great need for, and open a path towards, quantifying and minimizing uncertainty in AFM-based material property measurements of nanoparticles, nanostructured surfaces, thin films, polymers and biomaterials. This work is a partial contribution of the USDA Forest Service and NIST, agencies of the US government, and is not subject to copyright.
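
    The paper's central point, that force-curve replicates cannot average away systematic calibration uncertainty, can be illustrated by Monte Carlo propagation through a toy measurement equation in which the modulus scales with the stiffness and photodiode-sensitivity calibration factors. The equation and all uncertainty magnitudes below are deliberately crude stand-ins, not the authors' contact-mechanics model.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 100_000

      # Relative standard uncertainties (illustrative, not the paper's values)
      k = rng.normal(1.0, 0.10, n)    # cantilever stiffness calibration, 10%
      s = rng.normal(1.0, 0.20, n)    # photodiode sensitivity calibration, 20%
      d = rng.normal(1.0, 0.05, n)    # indentation/geometry term, 5%

      E_nominal = 8.1                 # GPa, nominal transverse modulus
      E = E_nominal * k * s / d       # toy measurement equation

      lo, hi = np.percentile(E, [2.5, 97.5])
      print(f"mean = {E.mean():.1f} GPa, 95% interval = [{lo:.1f}, {hi:.1f}] GPa")

      # The calibration terms are systematic: repeating force curves only
      # averages down random noise and cannot shrink this interval.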

  4. Uncertainty Quantification for CO2-Enhanced Oil Recovery

    NASA Astrophysics Data System (ADS)

    Dai, Z.; Middleton, R.; Bauman, J.; Viswanathan, H.; Fessenden-Rahn, J.; Pawar, R.; Lee, S.

    2013-12-01

    CO2-Enhanced Oil Recovery (EOR) is currently an option for permanently sequestering CO2 in oil reservoirs while increasing oil/gas production economically. In this study we have developed a framework for understanding CO2 storage potential within an EOR-sequestration environment at the Farnsworth Unit of the Anadarko Basin in northern Texas. By coupling an EOR tool, SENSOR (CEI, 2011), with an uncertainty quantification tool, PSUADE (Tong, 2011), we conduct an integrated Monte Carlo simulation of water, oil/gas components and CO2 flow and reactive transport in the heterogeneous Morrow formation to identify the key controlling processes and optimal parameters for CO2 sequestration and EOR. A global sensitivity and response surface analysis is conducted with PSUADE to build numerically the relationship among CO2 injectivity, oil/gas production, reservoir parameters and the distance between injection and production wells. The results indicate that reservoir permeability and porosity are the key parameters controlling the CO2 injection and the oil and gas (CH4) recovery rates. The distance between the injection and production wells has a large impact on oil and gas recovery and net CO2 injection rates. The CO2 injectivity increases with increasing reservoir permeability and porosity. The distance between injection and production wells is the key parameter for designing an EOR pattern (such as a five- or nine-spot pattern). The optimal distance for a five-spot-pattern EOR at this site is estimated from the response surface analysis to be around 400 meters. Next, we are building the machinery into our risk assessment framework CO2-PENS to utilize these response surfaces and evaluate the operational risk for CO2 sequestration and EOR at this site.
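
    In its simplest form, the response-surface step described here amounts to fitting a smooth surrogate of the simulator output against a design variable and locating its optimum. The sketch below fits a quadratic to invented recovery-versus-spacing results; PSUADE's machinery is far more general.

      import numpy as np

      # Invented Monte Carlo summary: mean oil recovery vs. well spacing (m)
      spacing = np.array([150.0, 250.0, 350.0, 450.0, 550.0])
      recovery = np.array([0.21, 0.29, 0.34, 0.33, 0.27])   # fraction of OOIP

      # Quadratic response surface: r(d) = a d^2 + b d + c
      a, b, c = np.polyfit(spacing, recovery, 2)
      d_opt = -b / (2.0 * a)             # vertex of the fitted parabola
      print(f"optimal well spacing ~ {d_opt:.0f} m, "
            f"predicted recovery {np.polyval([a, b, c], d_opt):.2f}")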

  5. Quantification of sparfloxacin in pharmaceutical dosages and biological samples.

    PubMed

    Shah, Jasmin; Jan, Muhammad Rasul; Khan, Inayatullah; Khan, Muhammad Naeem

    2012-10-01

    A simple and fast method for the spectrophotometric determination of sparfloxacin using p-dimethylaminobenzaldehyde (DMAB) has been developed. A yellow coloured product is formed from the reaction between sparfloxacin and DMAB as a result of a condensation reaction at room temperature. The maximum absorbance was found at 392 nm, with a molar absorptivity of 4.9 × 10(3) L mol(-1) cm(-1). All parameters of the reaction, such as the concentration of the DMAB reagent, the molarity of sulphuric acid, and the reaction temperature, were studied. Under the conditions studied, a linear relationship between the absorbance of the condensation product and the concentration of sparfloxacin was found in the range of 2.0-80.0 μg mL(-1), with a good correlation coefficient (0.9997). The limits of detection (LOD) and quantification (LOQ) of the proposed method were found to be 0.22 and 0.75 μg mL(-1), respectively. The repeatability and accuracy of the method were studied at three different concentrations of sparfloxacin, with relative standard deviations of less than 2.0%. The method was found to be selective for the determination of sparfloxacin in the presence of excipients commonly used in dosage forms. The developed method was validated statistically and applied successfully to the analysis of the drug in pure form, pharmaceutical preparations, and spiked blood plasma and urine samples with good accuracy and precision. The percentage recovery ranged from 99.0 to 100.0%, with a relative standard deviation of less than 1%. The results of the proposed method were compared statistically with the results of a literature HPLC method.
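
    The reported LOD and LOQ follow the usual convention for spectrophotometric validation, LOD = 3.3*sigma/S and LOQ = 10*sigma/S, with sigma the standard deviation of blank responses and S the calibration slope. A minimal sketch with invented readings:

      import numpy as np

      # Calibration: absorbance vs. concentration (ug/mL), illustrative values
      conc = np.array([2.0, 10.0, 20.0, 40.0, 80.0])
      absorbance = np.array([0.021, 0.102, 0.205, 0.408, 0.815])
      slope, _ = np.polyfit(conc, absorbance, 1)

      # Standard deviation of repeated blank measurements
      blanks = np.array([0.0007, 0.0011, 0.0009, 0.0012, 0.0008, 0.0010])
      sigma = blanks.std(ddof=1)

      lod = 3.3 * sigma / slope      # limit of detection, ug/mL
      loq = 10.0 * sigma / slope     # limit of quantification, ug/mL
      print(f"LOD = {lod:.3f} ug/mL, LOQ = {loq:.3f} ug/mL")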

  6. Quantification of structural distortions in the transmembrane helices of GPCRs.

    PubMed

    Deupi, Xavier

    2012-01-01

    A substantial part of the structural and much of the functional information about G protein-coupled receptors (GPCRs) comes from studies on rhodopsin. Thus, analysis tools for detailed structure comparison are key to seeing to what extent this information can be extended to other GPCRs. Among the methods to evaluate protein structures and, in particular, helix distortions, HELANAL has the advantage that it provides data (local bend and twist angles) that can be easily translated into structural effects, such as a local opening/tightening of the helix. In this work I show how HELANAL can be used to extract detailed structural information about the transmembrane bundle of GPCRs, and I provide some examples of how these data can be interpreted to study basic principles of protein structure, to compare homologous proteins and to study mechanisms of receptor activation. I also show how, in combination with the sequence analysis tools provided by the program GMoS, distortions in individual receptors can be put in the context of the whole Class A GPCR family. Specifically, quantification of the strong proline-induced distortions in the transmembrane bundle of rhodopsin shows that they are not standard proline kinks. Moreover, the helix distortions in transmembrane helix (TMH) 5 and TMH 6 of rhodopsin are also present in the rest of the GPCR crystal structures obtained so far; thus, rhodopsin-based homology models have modeled these strongly distorted helices correctly. While in some cases the inherent "rhodopsin bias" of many of the GPCR models to date has not been a disadvantage, the availability of more templates will clearly result in better homology models. This type of analysis can, of course, be applied to any protein, and it may be particularly useful for the structural analysis of other membrane proteins. A detailed knowledge of the local structural changes related to ligand binding and how they are translated into larger-scale movements of transmembrane domains is key to
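
    HELANAL-style local bend angles have a simple geometric core: estimate a local helix-axis direction at each residue and measure the angle between successive directions. The sketch below uses a second-difference axis construction on an ideal synthetic helix; it illustrates the geometry only and is not the HELANAL code.

      import numpy as np

      def local_axes(ca):
          """Local helix-axis directions from C-alpha coordinates.

          For points on a regular helix, the second differences of
          consecutive positions point radially toward the axis, so the
          cross product of successive second differences runs along the
          local axis direction.
          """
          d2 = ca[2:] - 2.0 * ca[1:-1] + ca[:-2]
          axes = np.cross(d2[:-1], d2[1:])
          return axes / np.linalg.norm(axes, axis=1, keepdims=True)

      def bend_angles(axes):
          """Angles (degrees) between successive local axis directions."""
          cosang = np.clip((axes[1:] * axes[:-1]).sum(axis=1), -1.0, 1.0)
          return np.degrees(np.arccos(cosang))

      # Ideal alpha-helix: 100 deg twist and 1.5 A rise per residue, r = 2.3 A
      n = 20
      t = np.radians(100.0) * np.arange(n)
      ca = np.stack([2.3 * np.cos(t), 2.3 * np.sin(t), 1.5 * np.arange(n)], axis=1)

      print(bend_angles(local_axes(ca)).round(3))   # ~0 deg: no kink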

  7. Quantification of plant chlorophyll content using Google Glass.

    PubMed

    Cortazar, Bingen; Koydemir, Hatice Ceylan; Tseng, Derek; Feng, Steve; Ozcan, Aydogan

    2015-04-01

    Measuring plant chlorophyll concentration is a well-known and commonly used method in agriculture and environmental applications for monitoring plant health, which also correlates with many other plant parameters including, e.g., carotenoids, nitrogen, maximum green fluorescence, etc. Direct chlorophyll measurement using chemical extraction is destructive, complex and time-consuming, which has led to the development of mobile optical readers, providing non-destructive but at the same time relatively expensive tools for evaluation of plant chlorophyll levels. Here we demonstrate accurate measurement of chlorophyll concentration in plant leaves using Google Glass and a custom-developed software application together with a cost-effective leaf holder and multi-spectral illuminator device. Two images, taken using Google Glass, of a leaf placed in our portable illuminator device under red and white (i.e., broadband) light-emitting-diode (LED) illumination are uploaded to our servers for remote digital processing and chlorophyll quantification, with results returned to the user in less than 10 seconds. Intensity measurements extracted from the uploaded images are mapped against gold-standard colorimetric measurements made through a commercially available reader to generate calibration curves for plant leaf chlorophyll concentration. Using five plant species to calibrate our system, we demonstrate that our approach can accurately and rapidly estimate chlorophyll concentration of fifteen different plant species under both indoor and outdoor lighting conditions. This Google Glass based chlorophyll measurement platform can display the results in spatiotemporal and tabular forms and would be highly useful for monitoring of plant health in environmental and agriculture related applications, including e.g., urban plant monitoring, indirect measurements of the effects of climate change, and as an early indicator for water, soil, and air quality degradation. PMID:25669673
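
    At its core the quantification is a calibration: an intensity feature extracted from the red- and white-LED images is regressed against gold-standard reader values for known leaves, and unknown leaves are read off the fitted curve. A schematic sketch of that mapping; the red/white ratio feature, the quadratic fit and all numbers are assumptions, not the paper's exact pipeline.

      import numpy as np

      # Calibration leaves: invented intensity ratios (red-LED / white-LED image)
      # against gold-standard chlorophyll readings from a commercial meter.
      ratio = np.array([0.82, 0.74, 0.65, 0.55, 0.47])
      chlorophyll = np.array([12.0, 21.0, 30.0, 41.0, 52.0])   # reader units

      # Fit a low-order polynomial calibration curve
      coeffs = np.polyfit(ratio, chlorophyll, 2)

      def estimate_chlorophyll(r):
          """Chlorophyll estimate for a leaf with red/white intensity ratio r."""
          return np.polyval(coeffs, r)

      print(f"leaf with ratio 0.60 -> {estimate_chlorophyll(0.60):.1f} units")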

  8. Automated Drusen Segmentation and Quantification in SD-OCT Images

    PubMed Central

    Chen, Qiang; Leng, Theodore; Zheng, Luoluo; Kutzscher, Lauren; Ma, Jeffrey; de Sisternes, Luis; Rubin, Daniel L.

    2013-01-01

    Spectral domain optical coherence tomography (SD-OCT) is a useful tool for the visualization of drusen, a retinal abnormality seen in patients with age-related macular degeneration (AMD); however, objective assessment of drusen is thwarted by the lack of a method to robustly quantify these lesions on serial OCT images. Here, we describe an automatic drusen segmentation method for SD-OCT retinal images, which leverages a priori knowledge of normal retinal morphology and anatomical features. The highly reflective and locally connected pixels located below the retinal nerve fiber layer (RNFL) are used to generate a segmentation of the retinal pigment epithelium (RPE) layer. The observed and expected contours of the RPE layer are obtained by interpolating and fitting the shape of the segmented RPE layer, respectively. The areas located between the interpolated and fitted RPE shapes (which have nonzero area when drusen occur) are marked as drusen. To enhance drusen quantification, we also developed a novel method of retinal projection to generate an en face retinal image based on the RPE extraction, which improves the quality of drusen visualization over the current approach to producing retinal projections from SD-OCT images based on a summed-voxel projection (SVP), and it provides a means of obtaining quantitative features of drusen in the en face projection. Visualization of the segmented drusen is refined through several post-processing steps: drusen detection to eliminate false-positive detections on consecutive slices, drusen refinement on a projection view of drusen, and drusen smoothing. Experimental evaluation results demonstrate that our method is effective for drusen segmentation. In a preliminary analysis of the potential clinical utility of our methods, quantitative drusen measurements, such as area and volume, can be correlated with drusen progression in non-exudative AMD, suggesting that our approach may produce useful quantitative imaging biomarkers

  9. Concepts and Practice of Verification, Validation, and Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Oberkampf, W. L.

    2014-12-01

    Verification and validation (V&V) are the primary means to assess the numerical and physics modeling accuracy, respectively, in computational simulation. Code verification assesses the reliability of the software coding and the numerical algorithms used in obtaining a solution, while solution verification addresses numerical error estimation of the computational solution of a mathematical model for a specified set of initial and boundary conditions. Validation assesses the accuracy of the mathematical model as compared to experimentally measured response quantities of the system being modeled. As these experimental data are typically available only for simplified subsystems or components of the system, model validation commonly provides limited ability to assess model accuracy directly. Uncertainty quantification (UQ), specifically in regard to predictive capability of a mathematical model, attempts to characterize and estimate the total uncertainty for conditions where no experimental data are available. Specific sources of uncertainty that can impact the total predictive uncertainty are: the assumptions and approximations in the formulation of the mathematical model, the error incurred in the numerical solution of the discretized model, the information available for stochastic input data for the system, and the extrapolation of the mathematical model to conditions where no experimental data are available. This presentation will briefly discuss the principles and practices of VVUQ from both the perspective of computational modeling and simulation, as well as the difficult issue of estimating predictive capability. Contrasts will be drawn between weak and strong code verification testing, and model validation as opposed to model calibration. Closing remarks will address what needs to be done to improve the value of information generated by computational simulation for improved decision-making.

  10. Quantification of Plant Chlorophyll Content Using Google Glass

    PubMed Central

    Cortazar, Bingen; Koydemir, Hatice Ceylan; Tseng, Derek; Feng, Steve; Ozcan, Aydogan

    2015-01-01

    Measuring plant chlorophyll concentration is a well-known and commonly used method in agriculture and environmental applications for monitoring plant health, which also correlates with many other plant parameters including, e.g., carotenoids, nitrogen, maximum green fluorescence, etc. Direct chlorophyll measurement using chemical extraction is destructive, complex and time-consuming, which has led to the development of mobile optical readers, providing non-destructive but at the same time relatively expensive tools for evaluation of plant chlorophyll levels. Here we demonstrate accurate measurement of chlorophyll concentration in plant leaves using Google Glass and a custom-developed software application together with a cost-effective leaf holder and multi-spectral illuminator device. Two images, taken using Google Glass, of a leaf placed in our portable illuminator device under red and white (i.e., broadband) light-emitting-diode (LED) illumination are uploaded to our servers for remote digital processing and chlorophyll quantification, with results returned to the user in less than 10 seconds. Intensity measurements extracted from the uploaded images are mapped against gold-standard colorimetric measurements made through a commercially available reader to generate calibration curves for plant leaf chlorophyll concentration. Using five plant species to calibrate our system, we demonstrate that our approach can accurately and rapidly estimate chlorophyll concentration of fifteen different plant species under both indoor and outdoor lighting conditions. This Google Glass based chlorophyll measurement platform can display the results in spatiotemporal and tabular forms and would be highly useful for monitoring of plant health in environmental and agriculture related applications, including e.g., urban plant monitoring, indirect measurements of the effects of climate change, and as an early indicator for water, soil, and air quality degradation. PMID:25669673

  11. An anatomically realistic brain phantom for quantification with positron tomography

    SciTech Connect

    Wong, D.F.; Links, J.M.; Molliver, M.E.; Hengst, T.C.; Clifford, C.M.; Buhle, L.; Bryan, M.; Stumpf, M.; Wagner, H.N. Jr.

    1984-01-01

    Phantom studies are useful in assessing and maximizing the accuracy and precision of absolute activity quantification, in assessing errors associated with patient positioning, and in dosimetry. Most phantoms are limited by the use of simple shapes, which do not adequately reflect real anatomy. The authors have constructed an anatomically realistic, life-size brain phantom for positron tomography studies. The phantom consists of separately fillable R + L caudates, R + L putamens, R + L globus pallidus and cerebellum. These structures are contained in proper anatomic orientation within a fillable cerebrum. Solid ventricles are also present. The entire clear vinyl cerebrum is placed in a human skull. The internal brain structures were fabricated from polyester resin, with the dimensions, shapes and sizes of the structures obtained from digitized contours of brain slices in the U.C.S.D. computerized brain atlas. The structures were filled with known concentrations of Ga-68 in water and scanned with our NeuroECAT. The phantom was aligned in the scanner for each structure, such that the tomographic slice passed through that structure's center. After calibration of the scanner with a standard phantom for counts/pixel to uCi/cc conversion, the measured activity concentrations were compared with the actual concentrations. The ratio of measured to actual activity concentration ("recovery coefficient") for the caudate was 0.33; for the putamen, 0.42. For comparison, the ratios for spheres of diameters 9.5, 16, 19 and 25.4 mm were 0.23, 0.54, 0.81, and 0.93. This phantom provides a more realistic assessment of performance and allows calculation of correction factors.

  12. Uncertainty Quantification of Calculated Temperatures for the AGR-1 Experiment

    SciTech Connect

    Binh T. Pham; Jeffrey J. Einerson; Grant L. Hawkes

    2012-04-01

    This report documents an effort to quantify the uncertainty of the calculated temperature data for the first Advanced Gas Reactor (AGR-1) fuel irradiation experiment conducted in the INL's Advanced Test Reactor (ATR) in support of the Next Generation Nuclear Plant (NGNP) R&D program. Recognizing uncertainties inherent in physics and thermal simulations of the AGR-1 test, the results of the numerical simulations can be used in combination with statistical analysis methods to improve qualification of measured data. Additionally, the temperature simulation data for AGR tests can be used for validation of the fuel transport and fuel performance simulation models. The crucial roles of the calculated fuel temperatures in ensuring achievement of the AGR experimental program objectives require accurate determination of the model temperature uncertainties. The report is organized into three chapters. Chapter 1 introduces the AGR Fuel Development and Qualification program and provides overviews of AGR-1 measured data, AGR-1 test configuration and test procedure, and thermal simulation. Chapter 2 describes the uncertainty quantification procedure for temperature simulation data of the AGR-1 experiment, namely: (i) identify and quantify uncertainty sources; (ii) perform sensitivity analysis for several thermal test conditions; (iii) use uncertainty propagation to quantify the overall response temperature uncertainty. A set of issues associated with modeling uncertainties resulting from the expert assessments is identified. This also includes the experimental design to estimate the main effects and interactions of the important thermal model parameters. Chapter 3 presents the overall uncertainty results for the six AGR-1 capsules. This includes uncertainties for the daily volume-average and peak fuel temperatures, daily average temperatures at TC locations, and time-average volume-average and time-average peak fuel temperatures.

  13. Uncertainty Quantification of Calculated Temperatures for the AGR-1 Experiment

    SciTech Connect

    Binh T. Pham; Jeffrey J. Einerson; Grant L. Hawkes

    2013-03-01

    This report documents an effort to quantify the uncertainty of the calculated temperature data for the first Advanced Gas Reactor (AGR-1) fuel irradiation experiment conducted in the INL's Advanced Test Reactor (ATR) in support of the Next Generation Nuclear Plant (NGNP) R&D program. Recognizing uncertainties inherent in physics and thermal simulations of the AGR-1 test, the results of the numerical simulations can be used in combination with statistical analysis methods to improve qualification of measured data. Additionally, the temperature simulation data for AGR tests can be used for validation of the fuel transport and fuel performance simulation models. The crucial roles of the calculated fuel temperatures in ensuring achievement of the AGR experimental program objectives require accurate determination of the model temperature uncertainties. The report is organized into three chapters. Chapter 1 introduces the AGR Fuel Development and Qualification program and provides overviews of AGR-1 measured data, AGR-1 test configuration and test procedure, and thermal simulation. Chapter 2 describes the uncertainty quantification procedure for temperature simulation data of the AGR-1 experiment, namely: (i) identify and quantify uncertainty sources; (ii) perform sensitivity analysis for several thermal test conditions; (iii) use uncertainty propagation to quantify the overall response temperature uncertainty. A set of issues associated with modeling uncertainties resulting from the expert assessments is identified. This also includes the experimental design to estimate the main effects and interactions of the important thermal model parameters. Chapter 3 presents the overall uncertainty results for the six AGR-1 capsules. This includes uncertainties for the daily volume-average and peak fuel temperatures, daily average temperatures at TC locations, and time-average volume-average and time-average peak fuel temperatures.

  14. Local quantification of numerically-induced mixing and dissipation

    NASA Astrophysics Data System (ADS)

    Klingbeil, Knut; Mohammadi-Aragh, Mahdi; Gräwe, Ulf; Burchard, Hans

    2016-04-01

    The discretisation of the advection terms in transport equations introduces truncation errors in numerical models. These errors are usually associated with spurious diffusion, i.e. numerically-induced mixing of the advected quantities or dissipation of kinetic energy associated with the advection of momentum. Especially the numerically-induced diapycnal mixing part is very problematic for realistic model simulations. Since any diapycnal mixing of temperature and salinity increases the reference potential energy (RPE), numerically-induced mixing is often quantified in terms of RPE. However, this global bulk measure does not provide any information about the local amount of numerically-induced mixing of a single advected quantity. In this talk we will present a recently developed analysis method that quantifies the numerically-induced mixing of a single advected quantity locally (Klingbeil et al., 2014***). The method is based on the local tracer variance decay in terms of variance fluxes associated with the corresponding advective tracer fluxes. Because of its physically sound definition, this analysis method provides a reliable diagnostic tool, e.g., to assess the performance of advection schemes and to identify hotspots of numerically-induced mixing. At these identified positions the model could be adapted in terms of resolution or the applied numerical schemes. In this context we will demonstrate how numerically-induced mixing of temperature and salinity can be substantially reduced by vertical meshes adapting towards stratification. *** Klingbeil, K., M. Mohammadi-Aragh, U. Gräwe, H. Burchard (2014) . Quantification of spurious dissipation and mixing -- Discrete Variance Decay in a Finite-Volume framework. Ocean Modelling. doi:10.1016/j.ocemod.2014.06.001.
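
    The variance-decay idea behind such diagnostics can be seen in one dimension: conservative advection should preserve tracer variance, so any decay in the discrete solution measures the scheme's spurious mixing. The sketch below quantifies this for first-order upwind advection on a periodic grid; it illustrates the principle only, not the discrete-variance-decay framework of Klingbeil et al.

      import numpy as np

      # 1D periodic advection with first-order upwind: spurious mixing shows
      # up as a monotone decay of the tracer variance.
      nx, courant, nsteps = 200, 0.5, 400
      x = np.linspace(0.0, 1.0, nx, endpoint=False)
      c = np.exp(-((x - 0.5) / 0.05) ** 2)        # initial tracer blob

      var0 = c.var()
      for _ in range(nsteps):
          c = c - courant * (c - np.roll(c, 1))   # upwind update (u > 0)

      print(f"initial variance: {var0:.4e}")
      print(f"final variance:   {c.var():.4e}")
      print(f"numerically-induced variance decay: {100*(1 - c.var()/var0):.1f}%")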

  15. Quantification of greenhouse gas emissions from sludge treatment wetlands.

    PubMed

    Uggetti, Enrica; García, Joan; Lind, Saara E; Martikainen, Pertti J; Ferrer, Ivet

    2012-04-15

    Constructed wetlands are nowadays successfully employed as an alternative technology for wastewater and sewage sludge treatment. In these systems organic matter and nutrients are transformed and removed by a variety of microbial reactions, and gaseous compounds such as methane (CH(4)) and nitrous oxide (N(2)O) may be released to the atmosphere. The aim of this work is to introduce a method to determine greenhouse gas emissions from sludge treatment wetlands (STW) and to apply the method in a full-scale system. Sampling and analysis techniques used to determine greenhouse gas emissions from croplands and natural wetlands were successfully adapted to the quantification of CH(4) and N(2)O emissions from an STW. Gas emissions were measured using the static chamber technique at 9 points of the STW during 13 days. The spatial variation in emissions along the wetland did not follow any specific pattern, in contrast to the temporal variation in the fluxes. Emissions ranged from 10 to 5400 mg CH(4)/m(2)d and from 20 to 950 mg N(2)O/m(2)d, depending on the feeding events. A comparison of the CH(4) and N(2)O emissions of different sludge management options shows that STW have the lowest atmospheric impact in terms of CO(2)-equivalent emissions (global warming potential with a time horizon of 100 years): 17 kg CO(2) eq/PE y for STW, 36 kg CO(2) eq/PE y for centrifugation and 162 kg CO(2) eq/PE y for untreated sludge transport, where PE means population equivalent.
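
    Static-chamber fluxes are computed from the rate of concentration rise in the closed chamber, F = (dC/dt)(V/A), with dC/dt from a linear fit over the deployment and the mixing ratio converted to molar units via the ideal gas law. A generic sketch with invented chamber readings and geometry:

      import numpy as np

      # Chamber headspace CH4 readings during one deployment (illustrative)
      t_min = np.array([0.0, 5.0, 10.0, 15.0, 20.0])
      ch4_ppm = np.array([2.1, 3.4, 4.6, 5.9, 7.1])

      slope_ppm_min, _ = np.polyfit(t_min, ch4_ppm, 1)

      # Chamber geometry and ambient conditions (assumed values)
      V, A = 0.030, 0.12             # m3, m2
      T, P = 293.15, 101325.0        # K, Pa
      R, M_CH4 = 8.314, 16.04        # J/(mol K), g/mol

      # ppm/min -> mol m-3 s-1 -> mg CH4 m-2 d-1
      dc_dt = slope_ppm_min * 1e-6 / 60.0 * P / (R * T)    # mol m-3 s-1
      flux = dc_dt * (V / A) * M_CH4 * 1e3 * 86400.0       # mg m-2 d-1
      print(f"CH4 flux: {flux:.0f} mg CH4 m-2 d-1")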

  16. Automated renal histopathology: digital extraction and quantification of renal pathology

    NASA Astrophysics Data System (ADS)

    Sarder, Pinaki; Ginley, Brandon; Tomaszewski, John E.

    2016-03-01

    The branch of pathology concerned with excess blood serum proteins being excreted in the urine pays particular attention to the glomerulus, a small intertwined bunch of capillaries located at the beginning of the nephron. Normal glomeruli allow a moderate amount of blood protein to be filtered; proteinuric glomeruli allow a large amount of blood protein to be filtered. Diagnosis of proteinuric diseases requires time-intensive manual examination of the structural compartments of the glomerulus from renal biopsies. Pathological examination includes cellularity of individual compartments, Bowman's and luminal space segmentation, cellular morphology, glomerular volume, capillary morphology, and more. Long examination times may increase diagnosis time and/or reduce the precision of the diagnostic process. Automatic quantification holds strong potential to reduce renal diagnostic time. We have developed a computational pipeline capable of automatically segmenting relevant features from renal biopsies. Our method first segments glomerular compartments from renal biopsies by isolating regions with high nuclear density. Gabor texture segmentation is used to accurately define glomerular boundaries. Bowman's and luminal spaces are segmented using morphological operators. Nuclear structures are segmented using color deconvolution, morphological processing, and bottleneck detection. The average computation time of feature extraction for a typical biopsy, comprising ~12 glomeruli, is ~69 s using an Intel(R) Core(TM) i7-4790 CPU, and is ~65X faster than manual processing. Using images from rat renal tissue samples, automatic glomerular structural feature estimation was reproducibly demonstrated for 15 biopsy images, which contained 148 individual glomeruli images. The proposed method holds immense potential to enhance the information available while making clinical diagnoses.

  17. Concurrent Quantification of Cellular and Extracellular Components of Biofilms

    PubMed Central

    Khajotia, Sharukh S.; Smart, Kristin H.; Pilula, Mpala; Thompson, David M.

    2013-01-01

    Confocal laser scanning microscopy (CLSM) is a powerful tool for investigation of biofilms. Very few investigations have successfully quantified concurrent distribution of more than two components within biofilms because: 1) selection of fluorescent dyes having minimal spectral overlap is complicated, and 2) quantification of multiple fluorochromes poses a multifactorial problem. Objectives: Report a methodology to quantify and compare concurrent 3-dimensional distributions of three cellular/extracellular components of biofilms grown on relevant substrates. Methods: The method consists of distinct, interconnected steps involving biofilm growth, staining, CLSM imaging, biofilm structural analysis and visualization, and statistical analysis of structural parameters. Biofilms of Streptococcus mutans (strain UA159) were grown for 48 hr on sterile specimens of Point 4 and TPH3 resin composites. Specimens were subsequently immersed for 60 sec in either Biotène PBF (BIO) or Listerine Total Care (LTO) mouthwashes, or water (control group; n=5/group). Biofilms were stained with fluorochromes for extracellular polymeric substances, proteins and nucleic acids before imaging with CLSM. Biofilm structural parameters calculated using ISA3D image analysis software were biovolume and mean biofilm thickness. Mixed models statistical analyses compared structural parameters between mouthwash and control groups (SAS software; α=0.05). Volocity software permitted visualization of 3D distributions of overlaid biofilm components (fluorochromes). Results: Mouthwash BIO produced biofilm structures that differed significantly from the control (p<0.05) on both resin composites, whereas LTO did not produce differences (p>0.05) on either product. Conclusions: This methodology efficiently and successfully quantified and compared concurrent 3D distributions of three major components within S. mutans biofilms on relevant substrates, thus overcoming two challenges to simultaneous assessment of
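
    The two structural parameters computed here, biovolume and mean biofilm thickness, have simple voxel-based definitions: biovolume is the number of above-threshold voxels times the voxel volume, and mean thickness averages the per-column biomass height over the substratum area. A generic sketch on a synthetic CLSM stack, assuming a global intensity threshold; this is not the ISA3D implementation.

      import numpy as np

      rng = np.random.default_rng(2)

      # Synthetic CLSM stack: (z, y, x) fluorescence intensities
      stack = rng.random((40, 128, 128))
      stack[:25] += 0.5                 # denser biomass near the substratum

      voxel = (0.5, 0.3, 0.3)           # z, y, x voxel size in micrometres
      mask = stack > 0.7                # thresholded biomass voxels

      voxel_volume = voxel[0] * voxel[1] * voxel[2]            # um^3
      biovolume = mask.sum() * voxel_volume                    # um^3

      # Mean biofilm thickness: biomass column height averaged over the area
      column_height = mask.sum(axis=0) * voxel[0]              # um per xy column
      mean_thickness = column_height.mean()

      area = mask.shape[1] * mask.shape[2] * voxel[1] * voxel[2]
      print(f"biovolume: {biovolume:.0f} um^3 over {area:.0f} um^2")
      print(f"mean thickness: {mean_thickness:.2f} um")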

  18. Stochastic simulations of ocean waves: An uncertainty quantification study

    NASA Astrophysics Data System (ADS)

    Yildirim, B.; Karniadakis, George Em

    2015-02-01

    The primary objective of this study is to introduce a stochastic framework based on generalized polynomial chaos (gPC) for uncertainty quantification in numerical ocean wave simulations. The techniques we present can be easily extended to other numerical ocean simulation applications. We perform stochastic simulations using a relatively new numerical method to simulate the HISWA (Hindcasting Shallow Water Waves) laboratory experiment for directional near-shore wave propagation and induced currents in a shallow-water wave basin. We solve the phase-averaged equation with hybrid discretization based on discontinuous Galerkin projections, spectral elements, and Fourier expansions. We first validate the deterministic solver by comparing our simulation results against the HISWA experimental data as well as against the numerical model SWAN (Simulating Waves Nearshore). We then perform sensitivity analysis to assess the effects of the parametrized source terms, current field, and boundary conditions. We employ an efficient sparse-grid stochastic collocation method that can treat many uncertain parameters simultaneously. We find that the depth-induced wave-breaking coefficient is the most important parameter compared to other tunable parameters in the source terms. The current field is modeled as a random process with large variation, but it does not seem to have a significant effect. Uncertainty in the source terms does not influence significantly the region before the submerged breaker, whereas uncertainty in the incoming boundary conditions does. Considering simultaneously the uncertainties from the source terms and boundary conditions, we obtain numerical error bars that contain almost all experimental data, hence identifying the proper range of parameters in the action balance equation.

  19. Quantification of Structural Isomers via Mode-Selective IRMPD

    NASA Astrophysics Data System (ADS)

    Polfer, Nicolas C.

    2016-06-01

    Mixtures of structural isomers can pose a challenge for vibrational ion spectroscopy. In cases where particular structures display diagnostic vibrations, these structures can be selectively "burned away". In ion traps, the ion population can be subjected to multiple laser shots in order to fully deplete a particular structure, in effect allowing a quantification of this structure. Protonated para-aminobenzoic acid (PABA) serves as an illustrative example. PABA is known to preferentially exist in the N-protonated (N-prot) form in solution, but in the gas phase the O-protonated (O-prot) form is energetically favorable. As shown in Figure 1, the N-prot structure can be kinetically trapped in the gas phase when sprayed from a non-protic solvent, whereas the O-prot structure is obtained when sprayed from protic solvents, analogous to results by others [1,2]. By parking the light source on the diagnostic 3440 cm-1 mode, the percentage of the O-prot structure can be determined, and by default the remainder is assumed to adopt the N-prot structure. It will be shown that the relative percentages of O-prot vs. N-prot are highly dependent on the solvent mixture, going from close to 0% O-prot in non-protic solvents to 99% in protic solvents. Surprisingly, water behaves much more like a non-protic solvent than methanol. It is observed that the capillary temperature, which aids droplet desolvation by black-body radiation in the ESI source, is critical to promote the appearance of O-prot structures. These results are consistent with the picture that a protic bridge mechanism is at play to facilitate proton transfer, and thus allow conversion from N-prot to O-prot, but that this mechanism is subject to appreciable kinetic barriers on the timescale of solvent evaporation. 1. J. Phys. Chem. A 2011, 115, 7625. 2. Anal. Chem. 2012, 84, 7857.

  1. Identification and Quantification of Volatile Organic Compounds at a Dairy

    NASA Astrophysics Data System (ADS)

    Filipy, J.; Mount, G.; Westberg, H.; Rumburg, B.

    2003-12-01

    Livestock operations in the United States are an escalating environmental concern. The increasing density of livestock within a farm results in increased emission of odorous gases, which have gained considerable attention from the public in recent years (National Research Council (NRC), 2002). Odorous compounds such as ammonia (NH3), volatile organic compounds (VOCs), and hydrogen sulfide (H2S) were reported to have a major effect on the quality of life of residents living near livestock facilities (NRC, 2002). Little data has been collected on the identification and quantification of gaseous compounds from open-stall dairy operations in the United States. The research presented here identifies and quantifies VOCs produced from a dairy operation that contribute to odor and other air quality problems. Many different VOCs were identified in the air downwind of an open lactating-cow stall area and near a waste lagoon at the Washington State University dairy using gas chromatography-mass spectrometry (GC-MS) analysis techniques. The identified compounds were very diverse and included many alcohols, aldehydes, amines, aromatics, esters, ethers, a fixed gas, halogenated hydrocarbons, hydrocarbons, ketones, other nitrogen-containing compounds, sulfur-containing compounds, and terpenes. The VOCs directly associated with cattle waste were dependent on ambient temperature, with the highest emissions produced during the summer months. Low to moderate wind speeds were ideal for VOC collection. Concentrations of the quantified compounds were mostly below odor detection thresholds reported in the literature; however, the combined odor magnitude of the large number of compounds detected was most likely above any minimum detection threshold.

  2. Quantification of adipose tissue in a rodent model of obesity

    NASA Astrophysics Data System (ADS)

    Johnson, David H.; Flask, Chris; Wan, Dinah; Ernsberger, Paul; Wilson, David L.

    2006-03-01

    Obesity is a global epidemic and a comorbidity for many diseases. We are using MRI to characterize obesity in rodents, especially with regard to visceral fat. Rats were scanned on a 1.5T clinical scanner, and a T1W, water-spoiled image (fat only) was divided by a matched T1W image (fat + water) to yield a ratio image related to the lipid content in each voxel. The ratio eliminated coil sensitivity inhomogeneity and gave flat values across a fat pad, except for outlier voxels (> 1.0) due to motion. Following sacrifice, fat pads were dissected and their volumes measured by displacement in canola oil. In our study of 6 lean (SHR), 6 dietary obese (SHR-DO), and 9 genetically obese rats (SHROB), significant differences in visceral fat volume were observed, with an average increase of 29+/-16 ml due to diet and 84+/-44 ml due to genetics relative to the lean control volume of 11+/-4 ml. Subcutaneous fat increased 14+/-8 ml due to diet and 198+/-105 ml due to genetics relative to the lean control value of 7+/-3 ml. Visceral fat volumes correlated strongly between MRI and dissection (R2 = 0.94), but MRI detected over five times the subcutaneous fat found with error-prone dissection. Using a semi-automated image segmentation method on the ratio images, intra-subject variation was very low. Fat pad composition as estimated from the ratio images consistently differentiated the strains, with SHROB having a greater lipid concentration in adipose tissues. Future work will include in vivo studies of diet versus genetics, identification of new phenotypes, and corrective measures for obesity; technical efforts will focus on correction for motion and automation in quantification.
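
    The quantification core is the voxelwise ratio of the fat-only image to the fat-plus-water image, masking of motion outliers (ratio > 1), and conversion of classified voxels to a volume. A minimal sketch on synthetic images; the voxel size and the 0.5 classification threshold are assumptions, not the study's values.

      import numpy as np

      rng = np.random.default_rng(3)

      # Synthetic co-registered images: fat-only (water-spoiled) and fat+water
      fat_water = rng.uniform(50.0, 200.0, size=(64, 64, 24))
      lipid_frac_true = rng.uniform(0.0, 1.0, size=fat_water.shape)
      fat_only = fat_water * lipid_frac_true

      ratio = fat_only / fat_water            # coil inhomogeneity cancels out
      ratio[ratio > 1.0] = np.nan             # motion outliers, per the abstract

      voxel_ml = 0.1 * 0.1 * 0.3              # voxel volume in mL (assumed)
      fat_mask = ratio > 0.5                  # simple adipose classification
      fat_volume_ml = np.count_nonzero(fat_mask) * voxel_ml
      print(f"adipose volume: {fat_volume_ml:.1f} mL")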

  3. Quantification of leukocyte migration: improvement of a method.

    PubMed

    Sunder-Plassmann, G; Hofbauer, R; Sengoelge, G; Hörl, W H

    1996-01-01

    Eighteen different permeable membrane supports, with and without confluent endothelial cell monolayers, were incubated with normal donor-derived neutrophils in the upper chambers of a 24-multiwell double chamber system. In order to study transmembrane or transendothelial leukocyte migration, leukocytes were stimulated by chemoattractants, or endothelial cells were activated by IL-1. After coincubation, the membrane supports forming the upper chambers were discarded. Using this technique, leukocytes that had migrated into the lower chamber were exposed to the fluorescent dye calcein AM without additional washing or transfer steps. Absolute cell counts were determined computer-assisted, using dilution series of calcein AM-labeled leukocytes as standards. Serial dilutions of neutrophils exposed to calcein AM showed reproducible linear fluorescence intensity, and relative fluorescence intensity correlated significantly with cell counts (r2 = 0.974, p < 0.0001). Of the 18 membrane supports, only one was suitable for our assay set-up. The best technical and optical performance was achieved with a membrane made of polyethylene terephthalate with a pore size of 3 µm at a pore density of 0.8 x 10(6)/cm2. Stimulation of leukocytes or endothelium by FMLP or IL-1 increased transendothelial migration to 7.2 +/- 1.8 x 10(5) PMN and 5.1 +/- 0.7 x 10(5) PMN, respectively, compared with medium (0.6 +/- 0.2 x 10(5) PMN). IL-1-induced migration of neutrophils was inhibited by anti-IL-1 autoantibodies derived from chronic renal failure patients (IL-1: 100% of PMN migrated; anti-IL-1 antibody: 39% of PMN migrated; control antibody: 84% of PMN migrated). In summary, a simple fluorimetric assay was established for the quantification of transmembrane and transendothelial leukocyte migration. PMID:8675234

  4. Quantification of NSW Ambulance Record Linkages with Multiple External Datasets.

    PubMed

    Carroll, Therese; Muecke, Sandy; Simpson, Judy; Irvine, Katie; Jenkins, André

    2015-01-01

    This study has two aims: 1) to describe linkage rates between ambulance data and external datasets for "episodes of care" and "patient only" linkages in New South Wales (NSW), Australia; and 2) to detect and report any systematic issues with linkage relating to patients and operational or clinical variables that may introduce bias in subsequent studies if not adequately addressed. During 2010-11, the Centre for Health Record Linkage (CHeReL) in NSW linked the records for patients attended by NSW Ambulance paramedics for the period July 2006 to June 2009 with four external datasets: the Emergency Department Data Collection; the Admitted Patient Data Collection; NSW Registry of Births, Deaths and Marriages death registration data; and Australian Bureau of Statistics mortality data. This study reports linkage rates in terms of those "expected" to link and those "not expected" to link with external databases within 24 hours of paramedic attendance. Following thorough data preparation, 2,041,728 NSW Ambulance care episodes for 1,116,509 patients fulfilled the inclusion criteria. The overall episode-specific hospital linkage rate was 97.2%. Where a patient was not transported to hospital following paramedic care, 8.6% of these episodes resulted in an emergency department attendance within 24 hours. Of all care episodes, 5.2% linked to a death record at some time within the 3-year period, with 2.4% of all death episodes occurring within 7 days of a paramedic encounter. For NSW Ambulance episodes of care that were expected to link to an external dataset but did not, nonlinkage to hospital admission records tended to decrease with age. For all other variables, issues relating to rates of linkage and nonlinkage were more indiscriminate. This quantification of the limitations of this large linked dataset will underpin the interpretation and results of ensuing studies that will inform future clinical and operational policies and practices at NSW Ambulance.

  5. Quantification of the Balance Error Scoring System with Mobile Technology

    PubMed Central

    Alberts, Jay L.; Thota, Anil; Hirsch, Joshua; Ozinga, Sarah; Dey, Tanujit; Schindler, David D.; Koop, Mandy Miller; Burke, Daniel; Linder, Susan M.

    2015-01-01

    Purpose The aim of this project was to develop a biomechanically based quantification of the Balance Error Scoring System (BESS) using data derived from the accelerometer and gyroscope of a mobile tablet device. Methods Thirty-two healthy youth and adults completed the BESS while an iPad was positioned at the sacrum. Data from the iPad were compared with position data gathered from a 3D motion capture system. Peak-to-peak (P2P), normalized path length (NPL), and root mean square (RMS) were calculated for each system and compared. Additionally, a 95% ellipsoid volume, the iBESS volume, was calculated using center of mass (COM) movements in the anterior-posterior (AP), mediolateral (ML), and trunk rotation planes of movement to provide a comprehensive, 3-dimensional metric of postural stability. Results Across all kinematic outcomes, data from the iPad were significantly correlated with the same outcomes derived from the motion capture system (Rho range: 0.37-0.94, p < 0.05). The iBESS volume metric was able to detect differences in postural stability across stance and surface, showing a significant increase in volume in increasingly difficult conditions, whereas traditional error scoring was not as sensitive to these factors. Conclusions The kinematic data provided by the iPad are of sufficient quality relative to motion capture data to accurately quantify postural stability in healthy young adults. The iBESS volume provides a more sensitive measure of postural stability than error scoring alone, particularly in conditions 1 and 4, which often suffer from floor effects, and condition 5, which can experience ceiling effects. The iBESS metric is ideally suited to clinical and in-the-field applications in which characterizing postural stability is of interest. PMID:26378948
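
    The sway metrics named above are standard and easy to state in code. A sketch under our own naming conventions (the authors' exact normalizations may differ):

        import numpy as np

        def sway_metrics(x, fs):
            """x: 1-D sway signal (e.g., AP acceleration); fs: sampling rate in Hz."""
            p2p = x.max() - x.min()                          # peak-to-peak
            npl = np.abs(np.diff(x)).sum() / (len(x) / fs)   # path length per second
            rms = np.sqrt(np.mean((x - x.mean()) ** 2))      # root mean square
            return p2p, npl, rms

        def ellipsoid_volume_95(X):
            """X: (n, 3) array of AP, ML and trunk-rotation sway components."""
            eigvals = np.linalg.eigvalsh(np.cov(X, rowvar=False))
            radii = np.sqrt(eigvals * 7.815)                 # 95% chi-square quantile, 3 dof
            return 4.0 / 3.0 * np.pi * np.prod(radii)        # cf. the iBESS volume idea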

  6. Dating and quantification of erosion processes based on exposed roots

    NASA Astrophysics Data System (ADS)

    Stoffel, Markus; Corona, Christophe; Ballesteros-Cánovas, Juan Antonio; Bodoque, José Maria

    2013-08-01

    Soil erosion is a key driver of land degradation and heavily affects sustainable land management in various environments worldwide. An appropriate quantification of rates of soil erosion and a localization of hotspots are therefore critical, as sediment loss has been demonstrated to have drastic consequences on soil productivity and fertility. A consistent body of evidence also exists for a causal linkage between global changes and the temporal frequency and magnitude of erosion, and thus calls for an improved understanding of the dynamics and rates of soil erosion for the appropriate management of landscapes and the planning of preventive measures and countermeasures. Conventional measurement techniques to infer erosion rates are limited in their temporal resolution or extent. Long-term erosion rates in larger basins have been analyzed with cosmogenic nuclides, but with lower spatial and limited temporal resolution, restricting the possibility of inferring micro-geomorphic and climatic controls on the timing, amount, and localization of erosion. If based on exposed tree roots, rates of erosion can be inferred with up to seasonal resolution, over decades to centuries of the past, and for larger surfaces with homogeneous hydrological response units. Root-based erosion rates thus constitute a valuable alternative to empirical or physically based approaches, especially in ungauged basins, but they can be controlled by a single or a few extreme events, so that average annual rates of erosion might be highly skewed. In this contribution, we review the contribution made by this biomarker to the understanding of erosion processes and related landform evolution. We report on recent progress in root-based erosion research, illustrate the possibilities, caveats, and limitations of reconstructed rates, and conclude with a call for further research on various aspects of root-erosion research and for work in new geographic regions.
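
    The core dendrogeomorphic relation behind root-based rates is simple enough to state in a few lines: the mean erosion rate at a root is the height of exposed root above the present soil surface divided by the number of rings formed since the anatomical exposure signal. A toy Python sketch with invented field measurements:

        # Invented field measurements for three exposed roots.
        exposed_height_mm = [84.0, 120.0, 61.0]    # eroded soil column above each root
        rings_since_exposure = [28, 35, 22]        # ring count since the exposure signal

        rates = [h / n for h, n in zip(exposed_height_mm, rings_since_exposure)]
        mean_rate = sum(rates) / len(rates)
        print([round(r, 2) for r in rates], f"mm/yr; mean = {mean_rate:.2f} mm/yr")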

  7. Towards the quantification of rockfall risk assessment for urban areas

    NASA Astrophysics Data System (ADS)

    Mavrouli, Olga; Corominas, Jordi

    2010-05-01

    In many mountainous inhabited areas rockfalls are a major threat to structures and population. The quantification of risk gives an estimate of the potential consequences, allowing the analysis of different scenarios and minimizing the subjectivity and uncertainties that derive from judgmental and qualitative approaches. Four main phases of the rockfall phenomenon have to be determined: (a) the frequency of the rock block volumes falling down the slope; (b) the probability of the rock blocks reaching a reference section with a certain level of kinetic energy; (c) the spatio-temporal probability of the exposed elements; and (d) the probability that an exposed element will suffer a certain degree of damage. Here, a step-by-step methodology for the quantification of risk is presented, focusing on steps (b) to (d). An example of an urban area situated at the toe of a talus cone below a rocky slope is considered. Three different rock diameters are considered with their respective frequencies (step a). For the calculation of the spatial probability of a given rock size reaching a location, a probabilistic 3D trajectory analysis is performed using the software ROTOMAP. The inputs are the topographic relief, the rockfall source and velocity, and the soil parameters (restitution and friction coefficients); the latter are evaluated by back analysis of historical events. The probability of a given rock magnitude reaching a critical section of the talus cone with a certain level of kinetic energy is evaluated. For step (c), the spatio-temporal probability of the element at risk is calculated taking into account both the trajectographic analysis of the rock blocks and the location of the elements at risk on the talus cone. For step (d), the probability of a certain degree of structural damage in the buildings is calculated. To this purpose
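
    Steps (a) to (d) compose multiplicatively into an expected annual loss. A hedged Python sketch of that product over volume classes; every frequency, probability, and vulnerability below is an illustrative placeholder, not a value from the study:

        # Illustrative inputs per rock-volume class (not values from the study).
        annual_frequency = [0.50, 0.10, 0.01]   # step (a): falls per year per class
        p_reach_energy   = [0.20, 0.35, 0.60]   # step (b): reach section w/ given energy
        p_spatiotemporal = 0.8                  # step (c): element at risk is present
        vulnerability    = [0.1, 0.4, 0.9]      # step (d): expected degree of damage

        annual_risk = sum(f * pr * p_spatiotemporal * v
                          for f, pr, v in zip(annual_frequency, p_reach_energy, vulnerability))
        print(f"expected annual damage (fraction of element value): {annual_risk:.3f}")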

  8. A statistical modeling approach to computer-aided quantification of dental biofilm.

    PubMed

    Mansoor, Awais; Patsekin, Valery; Scherl, Dale; Robinson, J Paul; Rajwa, Bartlomiej

    2015-01-01

    Biofilm is a formation of microbial material on tooth substrata. Several methods to quantify dental biofilm coverage have recently been reported in the literature, but at best they provide a semiautomated approach to quantification, with significant input from a human grader that carries the grader's bias about what is foreground, background, biofilm, and tooth. Additionally, human assessment indices limit the resolution of the quantification scale; most commercial scales use five levels of quantification for biofilm coverage (0%, 25%, 50%, 75%, and 100%). On the other hand, current state-of-the-art techniques in automatic plaque quantification fail to make their way into practical applications owing to their inability to incorporate human input to handle misclassifications. This paper proposes a new interactive method for biofilm quantification in quantitative light-induced fluorescence (QLF) images of canine teeth that is independent of the perceptual bias of the grader. The method partitions a QLF image into segments of uniform texture and intensity called superpixels; every superpixel is statistically modeled as a realization of a single 2-D Gaussian Markov random field (GMRF) whose parameters are estimated; the superpixel is then assigned to one of three classes (background, biofilm, tooth substratum) based on a training set of data. The quantification results show a high degree of consistency and precision. At the same time, the proposed method gives pathologists full control to postprocess the automatic quantification by flipping misclassified superpixels to a different state (background, tooth, biofilm) with a single click, providing greater usability than simply marking the boundaries of biofilm and tooth as done by current state-of-the-art methods.
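
    A simplified stand-in for the pipeline, using SLIC superpixels from scikit-image (>= 0.19 assumed for the channel_axis argument) and a nearest-centroid intensity rule in place of the paper's 2-D GMRF model, which is not reproduced here; the class centroids are hypothetical training-set values:

        import numpy as np
        from skimage.segmentation import slic

        def classify_superpixels(image, centroids):
            """image: (H, W) float array in [0, 1]; centroids: label -> mean intensity."""
            segments = slic(image, n_segments=300, compactness=0.1, channel_axis=None)
            labels = {}
            for s in np.unique(segments):
                mean_val = image[segments == s].mean()
                labels[s] = min(centroids, key=lambda c: abs(centroids[c] - mean_val))
            return segments, labels

        # e.g. classify_superpixels(qlf_image, {"background": 0.05, "biofilm": 0.35, "tooth": 0.6})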

  9. Quantification of low levels of amorphous content in crystalline celecoxib using dynamic vapor sorption (DVS).

    PubMed

    Sheokand, Sneha; Modi, Sameer R; Bansal, Arvind K

    2016-05-01

    A minor amount of amorphous phase, especially when present on the surface of crystalline pharmaceutical actives, can have a significant impact on their processing and performance. Despite the availability of sophisticated analytical tools, detection and quantification of low levels of amorphous content pose significant challenges owing to issues of sensitivity, suitability, limit of detection, and limit of quantitation. The current study encompasses the quantification of amorphous content in the crystalline form of celecoxib (CLB) using a dynamic vapor sorption (DVS) based method. Water, used as the solvent probe, achieved equilibration within a very short period of time (6 h) owing to the hydrophobic nature of CLB, thus allowing the development of a rapid quantification method. The study included optimization of instrument- and sample-related parameters for the development of an analytical method. The calibration curve for amorphous CLB in crystalline CLB was prepared in the concentration range of 0-10% w/w. The analytical method was validated for linearity, range, accuracy, and precision. The method was found to be linear, with an R^2 value of 0.999, as well as rapid and sensitive for the quantification of low levels of amorphous CLB. It was able to detect the presence of amorphous phase in a predominantly crystalline phase at concentrations as low as 0.3% w/w, and the limit of quantitation was 0.9% w/w. Moreover, the influence of mechanical processing on the amorphous content in crystalline CLB was also investigated. PMID:26948976
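
    The calibration-curve treatment described above follows the usual ICH pattern (LOD = 3.3*sigma/S and LOQ = 10*sigma/S, with sigma the residual standard deviation and S the slope). A Python sketch with made-up standards, not the paper's data:

        import numpy as np

        amorphous_pct = np.array([0.0, 1.0, 2.5, 5.0, 7.5, 10.0])       # % w/w standards
        dvs_response = np.array([0.02, 0.13, 0.29, 0.55, 0.83, 1.10])   # e.g. % mass gain

        slope, intercept = np.polyfit(amorphous_pct, dvs_response, 1)
        residuals = dvs_response - (slope * amorphous_pct + intercept)
        sigma = residuals.std(ddof=2)          # residual SD; 2 fitted parameters

        lod = 3.3 * sigma / slope
        loq = 10.0 * sigma / slope
        print(f"LOD ~ {lod:.2f}% w/w, LOQ ~ {loq:.2f}% w/w")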

  10. Localized 2D COSY sequences: Method and experimental evaluation for a whole metabolite quantification approach

    NASA Astrophysics Data System (ADS)

    Martel, Dimitri; Tse Ve Koon, K.; Le Fur, Yann; Ratiney, Hélène

    2015-11-01

    Two-dimensional spectroscopy offers the possibility of unambiguously distinguishing metabolites by spreading out the multiplet structure of J-coupled spin systems into a second dimension. Quantification methods that perform parametric fitting of the 2D MRS signal have recently been proposed for J-resolved PRESS (JPRESS) but not explicitly for Localized Correlation Spectroscopy (LCOSY). Here, through a whole-metabolite quantification approach, the quantification performance of correlation spectroscopy is studied. The ability to quantify metabolite relaxation time constants is studied for three localized 2D MRS sequences (LCOSY, LCTCOSY, and JPRESS) in vitro on preclinical MR systems. The issues encountered during implementation and the quantification strategies are discussed with the help of the Fisher matrix formalism. The described parameterized models enable the computation of the lower bound for error variance on the parameters estimated from these 2D MRS signal fittings, generally known as the Cramér-Rao bounds (CRBs), a standard of precision. LCOSY has a theoretical net signal loss of a factor of two per unit of acquisition time compared with JPRESS. A quick analysis might suggest that the relative CRBs of LCOSY compared with JPRESS (expressed as a percentage of the concentration values) should therefore be doubled, but we show that this is not necessarily true. Finally, the LCOSY quantification procedure has been applied to data acquired in vivo on a mouse brain.
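
    The Cramér-Rao machinery referred to above is compact: for a model with i.i.d. Gaussian noise of variance sigma^2, the Fisher matrix is J^T J / sigma^2 (J being the Jacobian of the model with respect to its parameters), and the bound on each parameter's standard deviation is the square root of the corresponding diagonal entry of its inverse. A toy single-exponential example, not one of the paper's sequence models:

        import numpy as np

        def cramer_rao_bounds(jacobian, sigma):
            """Lower bounds on parameter standard deviations (Gaussian noise)."""
            fisher = jacobian.T @ jacobian / sigma**2
            return np.sqrt(np.diag(np.linalg.inv(fisher)))

        # Toy model s(t) = a * exp(-t / T2); columns of J are ds/da and ds/dT2.
        t = np.linspace(0.0, 0.5, 256)
        a, T2, sigma = 1.0, 0.08, 0.05
        J = np.column_stack([np.exp(-t / T2),
                             a * t / T2**2 * np.exp(-t / T2)])
        print(cramer_rao_bounds(J, sigma))   # CRBs on [a, T2]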

  11. Influence of absorption and scattering on the quantification of fluorescence diffuse optical tomography using normalized data.

    PubMed

    Abascal, Juan Felipe Perez-Juste; Aguirre, Juan; Chamorro-Servent, Judit; Schweiger, Martin; Arridge, Simon; Ripoll, Jorge; Vaquero, Juan J; Desco, Manuel

    2012-03-01

    Reconstruction algorithms for imaging fluorescence in near infrared ranges usually normalize fluorescence light with respect to excitation light. Using this approach, we investigated the influence of absorption and scattering heterogeneities on quantification accuracy when assuming a homogeneous model and explored possible reconstruction improvements by using a heterogeneous model. To do so, we created several computer-simulated phantoms: a homogeneous slab phantom (P1), slab phantoms including a region with a two- to six-fold increase in scattering (P2) and in absorption (P3), and an atlas-based mouse phantom that modeled different liver and lung scattering (P4). For P1, reconstruction with the wrong optical properties yielded quantification errors that increased almost linearly with the scattering coefficient while they were mostly negligible regarding the absorption coefficient. This observation agreed with the theoretical results. Taking the quantification of a homogeneous phantom as a reference, relative quantification errors obtained when wrongly assuming homogeneous media were in the range +41 to +94% (P2), 0.1 to -7% (P3), and -39 to +44% (P4). Using a heterogeneous model, the overall error ranged from -7 to 7%. In conclusion, this work demonstrates that assuming homogeneous media leads to noticeable quantification errors that can be improved by adopting heterogeneous models.

  12. Sulfur-based absolute quantification of proteins using isotope dilution inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Lee, Hyun-Seok; Heun Kim, Sook; Jeong, Ji-Seon; Lee, Yong-Moon; Yim, Yong-Hyeon

    2015-10-01

    An element-based reductive approach provides an effective means of realizing International System of Units (SI) traceability for high-purity biological standards. Here, we develop for the first time an absolute protein quantification method using double isotope dilution (ID) inductively coupled plasma mass spectrometry (ICP-MS) combined with microwave-assisted acid digestion. We validated the method and applied it to certify the candidate protein certified reference material (CRM) of human growth hormone (hGH). The concentration of hGH was determined by analysing the total amount of sulfur in hGH. Next, size-exclusion chromatography was used with ICP-MS to characterize and quantify sulfur-containing impurities. By subtracting the contribution of sulfur-containing impurities from the total sulfur content in the hGH CRM, we obtained an SI-traceable certification value. The quantification result obtained with the present method based on sulfur analysis was in excellent agreement with the result determined via a well-established protein quantification method based on amino acid analysis using conventional acid hydrolysis combined with ID liquid chromatography-tandem mass spectrometry. The element-based protein quantification method developed here can be used generally for SI-traceable absolute quantification of proteins, especially pure-protein standards.
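
    The arithmetic of the sulfur-to-protein conversion is worth making explicit. A back-of-envelope Python sketch assuming hGH's usual stoichiometry of seven sulfur atoms per molecule (3 Met + 4 Cys) and a molar mass near 22.1 kDa; the measured inputs are invented:

        M_HGH = 22125.0   # g/mol, molar mass of hGH (~22 kDa); assumption
        S_PER_HGH = 7     # sulfur atoms per molecule (3 Met + 4 Cys); assumption

        def hgh_mass_fraction(total_s_nmol_per_g, impurity_s_nmol_per_g):
            """Convert sulfur amounts (nmol S per g of material) to ug hGH per g."""
            hgh_nmol_per_g = (total_s_nmol_per_g - impurity_s_nmol_per_g) / S_PER_HGH
            return hgh_nmol_per_g * M_HGH * 1e-3   # nmol/g * g/mol -> ug/g

        print(f"{hgh_mass_fraction(350.0, 14.0):.0f} ug hGH per g (illustrative inputs)")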

  13. Quantification of soil production and erosion using isotopic techniques

    NASA Astrophysics Data System (ADS)

    Dosseto, Anthony; Suresh, P. O.

    2010-05-01

    Soil is a critical resource, especially in the context of a rapidly growing world population. Thus, it is crucial to be able to quantify how soil resources evolve with time and how fast they become depleted. Over the past few years, the application of cosmogenic isotopes has made it possible to constrain rates of soil denudation. By assuming constant soil thickness, it is also possible to use these denudation rates to infer soil production rates (Heimsath et al. 1997). However, in this case it is not possible to discuss any imbalance between erosion and production, which is the core question when one is interested in soil resource sustainability. Recently, the measurement of uranium-series isotopes in soils has been used to quantify the residence time of soil material in the weathering profile and to infer soil production rates (Dequincey et al. 2002; Dosseto et al. 2008). Thus, the combination of U-series and cosmogenic isotopes can be used to discuss how soil resources evolve with time: whether they are depleting, increasing, or in steady state. Recent work has been undertaken in temperate southeastern Australia, where a saprolite several meters thick has developed over granodioritic bedrock and underlies a meter or less of soil (Dosseto et al. 2008), and in tropical Puerto Rico, also in a granitic catchment. Results show that in an environment where human activity is minimal, soil and saprolite are renewed as fast as they are destroyed through denudation. Further work is investigating these processes at other sites in southeastern Australia (Frogs Hollow; Heimsath et al. 2001) and Puerto Rico (Rio Mameyes catchment; andesitic bedrock). Results will be presented, and a review of the quantification of rates of soil evolution using isotopic techniques will be given. Dequincey, O., F. Chabaux, et al. (2002). Chemical mobilizations in laterites: Evidence from trace elements and 238U-234U-230Th disequilibria. Geochim. Cosmochim. Acta 66(7): 1197-1210. Dosseto, A., S. P

  14. Quantification of Emphysema: A Bullae Distribution Based Approach

    NASA Astrophysics Data System (ADS)

    Tan, Kok Liang; Tanaka, Toshiyuki; Nakamura, Hidetoshi; Shirahata, Toru; Sugiura, Hiroaki

    Computed tomography (CT)-based quantifications of emphysema encompass, but are not limited to, the ratio of low-attenuation area, the bullae size, and the distribution of bullae in the lung. The standard CT-based emphysema-describing indices include the mean lung density, the percentage of area of low attenuation [the pixel index (PI)], and the bullae index (BI). These standard indices are not expressive for describing the distribution of bullae in the lung. Consequently, the goal of this paper is to present a new emphysema-describing index, the bullae congregation index (BCI), which describes whether bullae gather in a specific area of the lung and form what is nearly a single mass, and if so, how dense the mass of bullae in the lung is. BCI ranges from zero (sparsely distributed bullae) to ten (densely distributed bullae). BCI is calculated from the relative distance between every pair of bullae in the lung. The bullae pair distances are sorted into 200 distance classes, where a smaller distance class corresponds to closer proximity between bullae. BCI is derived by calculating the percentage of the area of bullae in the lung that are separated by a certain distance class. Four bullae congregation classes are defined based on BCI. We evaluate BCI using 114 CT images hand-annotated by a radiologist into four bullae congregation classes. The average four-class classification accuracy of BCI is 88.21%. BCI correlates better than PI, BI, and other standard statistical-dispersion-based methods with the radiological consensus-classified bullae congregation class. While BCI is not a specific index for indicating emphysema severity, it complements the existing set of emphysema-describing indices to facilitate a more thorough knowledge of the emphysematous conditions in the lung. BCI is especially useful when it comes to comparing the distribution of bullae for cases with approximately the same PI, BI, or PI and BI. BCI is easy
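
    The exact weighting behind BCI is not spelled out in the abstract, but the construction it describes (pairwise bullae distances binned into 200 classes, with the index growing as more bullae area sits in the short-distance classes) can be sketched as follows; treat this as a loose reconstruction, not the authors' algorithm:

        import numpy as np

        def congregation_score(centroids, areas, lung_diameter,
                               n_classes=200, cutoff_class=20):
            """centroids: (n, 2) bullae positions; areas: (n,) bullae areas."""
            d = np.linalg.norm(centroids[:, None, :] - centroids[None, :, :], axis=-1)
            i, j = np.triu_indices(len(centroids), k=1)
            classes = np.minimum((d[i, j] / lung_diameter * n_classes).astype(int),
                                 n_classes - 1)
            pair_area = areas[i] + areas[j]
            near_fraction = pair_area[classes < cutoff_class].sum() / pair_area.sum()
            return 10.0 * near_fraction   # 0 (sparse) .. 10 (densely congregated)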

  15. Quantification and propagation of disciplinary uncertainty via Bayesian statistics

    NASA Astrophysics Data System (ADS)

    Mantis, George Constantine

    2002-08-01

    Several needs exist in the military, commercial, and civil sectors for new hypersonic systems. These needs remain unfulfilled, due in part to the uncertainty encountered in designing these systems. This uncertainty takes a number of forms, including disciplinary uncertainty: that which is inherent in the analytical tools utilized during the design process. Yet few efforts to date empower the designer with the means to account for this uncertainty within the disciplinary analyses. In the current state of the art in design, the effects of this unquantified uncertainty significantly increase the risks associated with new design efforts; typically, the risk proves too great to allow a given design to proceed beyond the conceptual stage. To that end, this research encompasses the formulation and validation of a new design method: a systematic process for probabilistically assessing the impact of disciplinary uncertainty. The method implements Bayesian statistics to quantify this source of uncertainty and propagate its effects to the vehicle system level. Comparison of analytical and physical data for existing systems, modeled a priori in the given analysis tools, leads to quantification of the uncertainty in those tools' calculation of discipline-level metrics. Then, after exploration of the new vehicle's design space, the quantified uncertainty is propagated probabilistically through the design space. This ultimately results in an assessment of the impact of disciplinary uncertainty on the confidence in the design solution: the final shape and variability of the probability functions defining the vehicle's system-level metrics. Although motivated by the hypersonic regime, the proposed treatment of uncertainty applies to any class of aerospace vehicle, just as the problem itself affects the design process of any vehicle. A number of computer programs comprise the environment constructed for the implementation of this work. Application to a single
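
    A toy version of the quantification step: treat a tool's bias in one discipline-level metric as Normal with unknown mean, update it from analytic-versus-measured discrepancies on existing systems (a conjugate Normal-Normal model), then sample the posterior for propagation. All numbers are illustrative:

        import numpy as np

        discrepancies = np.array([0.04, 0.07, 0.02, 0.05])  # tool minus truth, known systems
        sigma_obs = 0.03                                    # assumed observation noise
        mu0, tau0 = 0.0, 0.10                               # vague prior on the bias

        n = len(discrepancies)
        tau_post = 1.0 / np.sqrt(1.0 / tau0**2 + n / sigma_obs**2)
        mu_post = tau_post**2 * (mu0 / tau0**2 + discrepancies.sum() / sigma_obs**2)
        print(f"posterior bias: {mu_post:.3f} +/- {tau_post:.3f}")

        # Propagation: sample the bias posterior and push it through the design code.
        bias_samples = np.random.default_rng(0).normal(mu_post, tau_post, 10_000)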

  16. Robust Radiomics feature quantification using semiautomatic volumetric segmentation.

    PubMed

    Parmar, Chintan; Rios Velazquez, Emmanuel; Leijenaar, Ralph; Jermoumi, Mohammed; Carvalho, Sara; Mak, Raymond H; Mitra, Sushmita; Shankar, B Uma; Kikinis, Ron; Haibe-Kains, Benjamin; Lambin, Philippe; Aerts, Hugo J W L

    2014-01-01

    Due to advances in the acquisition and analysis of medical imaging, it is currently possible to quantify the tumor phenotype. The emerging field of Radiomics does so by converting medical images into minable data and extracting a large number of quantitative imaging features. One of the main challenges of Radiomics is tumor segmentation: where manual delineation is time-consuming and prone to inter-observer variability, semi-automated approaches have been shown to be fast and to reduce inter-observer variability. In this study, a semiautomatic region-growing volumetric segmentation algorithm, implemented in the free and publicly available 3D-Slicer platform, was investigated in terms of its robustness for quantitative imaging feature extraction. Fifty-six 3D radiomic features, quantifying phenotypic differences based on tumor intensity, shape, and texture, were extracted from the computed tomography images of twenty lung cancer patients. These radiomic features were derived from the 3D tumor volumes defined twice by three independent observers using 3D-Slicer and compared with manual slice-by-slice delineations by five independent physicians in terms of intra-class correlation coefficient (ICC) and feature range. Radiomic features extracted from 3D-Slicer segmentations had significantly higher reproducibility (ICC = 0.85±0.15, p = 0.0009) than features extracted from the manual segmentations (ICC = 0.77±0.17). Furthermore, features extracted from 3D-Slicer segmentations were more robust, as their range was significantly smaller across observers (p = 3.819e-07) and overlapped with the feature ranges extracted from manual contouring (boundary lower: p = 0.007; higher: p = 5.863e-06). Our results show that 3D-Slicer segmented tumor volumes provide a better alternative to manual delineation for feature quantification, as they yield more reproducible imaging descriptors. Therefore, 3D-Slicer can be
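
    Reproducibility here is measured with the intraclass correlation coefficient. A one-way ICC is easy to compute by hand; a Python sketch with fabricated ratings (rows = tumors, columns = observers), noting that the study may use a different ICC variant:

        import numpy as np

        def icc_oneway(ratings):
            """One-way random-effects ICC(1,1) from an (n targets, k raters) array."""
            n, k = ratings.shape
            grand = ratings.mean()
            msb = k * ((ratings.mean(axis=1) - grand) ** 2).sum() / (n - 1)
            msw = ((ratings - ratings.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
            return (msb - msw) / (msb + (k - 1) * msw)

        ratings = np.array([[10.1, 10.3,  9.9],
                            [14.2, 14.0, 14.5],
                            [ 8.7,  8.9,  8.6],
                            [12.0, 11.8, 12.3]])
        print(f"ICC(1,1) = {icc_oneway(ratings):.3f}")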

  17. Perceived training intensity and performance changes quantification in judo.

    PubMed

    Agostinho, Marcus F; Philippe, Antony G; Marcolino, Gilvan S; Pereira, Ewerton R; Busso, Thierry; Candau, Robin B; Franchini, Emerson

    2015-06-01

    The objective of this study was to determine which methods of quantifying training and performance are most appropriate for modeling the responses to long-term training in cadet and junior judo athletes. Ten young male judo athletes (15.9 ± 1.3 years, 64.9 ± 10.3 kg, and 170.8 ± 5.4 cm) competing at a regional/state level volunteered to take part in the study. Data were collected during a 2-year training period (702 days) from January 2011 to December 2012. Mean training volume was 6.52 ± 0.43 hours per week during the preparatory periods and 4.75 ± 0.49 hours per week during the competitive periods, and the athletes followed a training program prescribed by the same coach. The training load (TL) was quantified through the session rating of perceived exertion (RPE) and expressed in arbitrary units (a.u.). Performance was quantified from 5 parameters divided into 2 categories: performance in competition, evaluated from the number of points per level, and performance in training, assessed through 4 different tests in a physical test battery consisting of a standing long jump, 2 judo-specific tests (the maximal number of dynamic chin-ups holding the judogi), and the Special Judo Fitness Test. System modeling for describing training adaptations consisted of mathematically relating the TL of the training sessions (system input) to the change in performance (system output). The quality of the fit between TL and performance was similar whether the TL was computed directly from RPE (R = 0.55 ± 0.18) or from the session RPE (R = 0.56 ± 0.18), and it was significant in 8 of the 10 athletes when the standing jump was excluded from the computation, leading to a simpler method. Thus, this study represents a first attempt to model TL effects on judo-specific performance and has shown that the best relationships between amounts of training and changes in
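
    The systems model referred to is the classic impulse-response (Banister-type) form: modeled performance is a baseline plus a fitness term minus a fatigue term, each an exponentially weighted sum of past training loads (with session TL = session RPE x duration). A Python sketch with illustrative parameters, not the study's estimates:

        import numpy as np

        def modeled_performance(tl, p0=100.0, k1=0.05, k2=0.10, tau1=40.0, tau2=10.0):
            """tl: daily training-load array; returns modeled performance per day."""
            perf = np.empty(len(tl))
            for t in range(len(tl)):
                past = np.arange(t)                      # days strictly before t
                w1 = np.exp(-(t - past) / tau1)          # fitness decay
                w2 = np.exp(-(t - past) / tau2)          # fatigue decay
                perf[t] = p0 + k1 * (tl[past] * w1).sum() - k2 * (tl[past] * w2).sum()
            return perf

        tl = np.random.default_rng(1).gamma(2.0, 150.0, size=120)  # e.g. RPE x minutes
        print(modeled_performance(tl)[-5:])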

  18. Quantification of the proliferation of arbuscular mycorrhizal fungi in soil

    NASA Astrophysics Data System (ADS)

    Zhang, Ning; Lilje, Osu; McGee, Peter

    2013-04-01

    Micro-computed tomography provides three-dimensional images of hyphal ramification through electron-lucent materials and enables the visualization and quantification of hyphae. Starch, and a mixture of starch plus K2HPO4, stimulated hyphal proliferation, while K2HPO4 alone did not change the density of hyphae. The images also indicate that fungal hyphae attach to the surfaces of the particles rather than growing through the spaces between them. The capacity to quantify hyphae in three-dimensional space allows a wide range of questions to be addressed. Apart from studying mechanisms of carbon turnover, more complex processes may now be considered. Soil is commonly thought of as a black box. That black box is now a shade of grey.

  19. Quantification of sediment budgets at an arctic delta

    NASA Astrophysics Data System (ADS)

    Kroon, A.; Bendixen, M.; Sigsgaard, C.

    2012-12-01

    We focus on the quantification of sediment budgets of the Zackenberg delta in North-East Greenland. We use daily observations of river discharges and associated sediment loads, annual observations of delta shorelines, and numerical model results to estimate the coastal evolution and its associated budgets. In addition, we use daily camera images of the coastal waters in front of the delta to determine ice coverage and ice-free periods in the fjords and to estimate the season when traditional wave- and tide-driven processes are active.

  1. Quantification of arterial plaque and lumen density with MDCT

    SciTech Connect

    Paul, Narinder S.; Blobel, Joerg; Kashani, Hany; Rice, Murray; Ursani, Ali

    2010-08-15

    Purpose: This study aimed to derive a mathematical correction function to normalize CT number measurements for small-volume arterial plaque and small-vessel-mimicking objects imaged with multidetector CT (MDCT). Methods: A commercially available calcium plaque phantom (QRM GmbH, Moehrendorf, Germany) and a custom-built cardiovascular phantom were scanned with 320- and 64-detector-row MDCT scanners. The calcium hydroxyapatite plaque phantom contained objects 0.5-5.0 mm in diameter with known nominal CT attenuation values ranging from 50 to 800 HU. The cardiovascular phantom contained vessel-mimicking objects 1.0-5.0 mm in diameter with different contrast media. Both phantoms were scanned using clinical CT angiography protocols, and images were reconstructed with different filter kernels. The measured CT number (HU) and diameter of each object were analyzed on three clinical postprocessing workstations. From the resultant data, a mathematical formula based on the absorption function exp(-µ·d) was derived to describe the relation between measured CT numbers and object diameters. Results: The percentage reduction in measured CT number (HU) for the group of selected filter kernels, apparent during CT angiography, depends only on the object size (plaque or vessel diameter). The derived formula, of the form 1 - c·exp(-a·d^b), showed a reduction in CT number for objects between 0.5 and 5 mm in diameter, with the asymptote reaching background noise for small objects with diameters nearing the CT in-plane resolution (0.35 mm). No reduction was observed for objects with diameters of 5 mm or larger. Conclusions: A clear mathematical relationship exists between object diameter and the reduction in measured CT number in HU. This function is independent of exposure parameters and the inherent attenuation properties of the objects studied. Future developments include the incorporation of this mathematical model function into quantification software in order to
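
    Reading the derived formula as the ratio of measured to nominal HU as a function of object diameter d, it can be fitted and inverted into a correction factor with scipy; the data points below are illustrative, not the published measurements:

        import numpy as np
        from scipy.optimize import curve_fit

        def relative_hu(d, a, b, c):
            """Measured/nominal HU ratio vs diameter d, form 1 - c*exp(-a*d**b)."""
            return 1.0 - c * np.exp(-a * d**b)

        diam_mm = np.array([0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 5.0])
        measured_over_true = np.array([0.18, 0.45, 0.66, 0.80, 0.93, 0.98, 1.00])

        popt, _ = curve_fit(relative_hu, diam_mm, measured_over_true, p0=(1.0, 1.0, 1.0))
        correction = 1.0 / relative_hu(diam_mm, *popt)   # multiply measured HU by this
        print(popt, correction)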

  2. Uncertainty quantification of bacterial aerosol neutralization in shock heated gases

    NASA Astrophysics Data System (ADS)

    Schulz, J. C.; Gottiparthi, K. C.; Menon, S.

    2015-01-01

    A potential method for the neutralization of bacterial endospores is the use of explosive charges, since the high thermal and mechanical stresses in the post-detonation flow are thought to be sufficient to reduce endospore survivability to levels that pose no significant health threat. While several experiments have attempted to quantify endospore survivability by emulating such environments in shock tube configurations, numerical simulations are necessary to provide information in scenarios where experimental data are difficult to obtain. Since such numerical predictions require complex, multi-physics models, significant uncertainties can be present. This work investigates the uncertainty in determining endospore survivability from a reduced-order model based on a critical endospore temperature. Understanding the uncertainty in such a model is necessary for quantifying the variability in predictions from large-scale, realistic simulations of bacterial endospore neutralization by explosive charges. This work extends the analysis of previous large-scale simulations of endospore neutralization [Gottiparthi et al., Shock Waves, 2014, doi:10.1007/s00193-014-0504-9] by focusing on the uncertainty quantification of predicting endospore neutralization. For a given initial mass distribution of the bacterial endospore aerosol, predictions of the intact endospore percentage using nominal values of the input parameters match the experimental data well. The uncertainty in these predictions is then investigated using the Dempster-Shafer theory of evidence and polynomial chaos expansion. The studies show that endospore survivability is governed largely by the endospores' mass distribution and their exposure or residence time at the elevated temperatures and pressures. Deviations from the nominal predictions can be as much as 20-30% in the intermediate temperature ranges. At high temperatures, i.e., strong shocks, which are of the most interest, the
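
    A toy Monte Carlo restatement of the survivability question (the paper itself uses Dempster-Shafer evidence theory and polynomial chaos, which this sketch does not reproduce): an endospore stays intact if its residence time at lethal conditions falls short of a sampled critical exposure time. Both distributions below are assumptions:

        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000
        # Residence time at elevated temperature, seconds (assumed lognormal).
        t_res = rng.lognormal(mean=np.log(2e-3), sigma=0.5, size=n)
        # Critical exposure time for neutralization, seconds (assumed normal).
        t_crit = rng.normal(loc=3e-3, scale=1e-3, size=n).clip(min=1e-6)

        survive = t_res < t_crit
        print(f"intact endospore fraction: {survive.mean():.1%}")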

  3. Uncertainty Quantification in Ocean State Estimation using Hessian Information

    NASA Astrophysics Data System (ADS)

    Kalmikov, A.; Heimbach, P.

    2012-12-01

    We present a second-derivative-based (Hessian) method for Uncertainty Quantification (UQ) in large-scale ocean state estimation. Matrix-free Hessian-times-vector code of the MIT General Circulation Model (MITgcm) is generated by means of algorithmic differentiation (AD). Lanczos-type numerical algebra tools are then applied to extract the leading eigenvectors and eigenvalues used in the UQ algorithm. Computational complexity is reduced by tangent linear (forward-mode) differentiation of the adjoint code, which preserves the efficiency of the checkpointing schemes. The inverse and forward uncertainty propagation algorithm is designed for assimilating observation and control variable uncertainties, and for projecting these uncertainties onto oceanographically relevant target quantities of interest. The algorithm evaluates both the reduction of a priori assumed uncertainty and the prior-independent information gain. The inverse propagation maps prior and data uncertainties onto posterior uncertainties in each component of the high-dimensional control space. The forward propagation of the posteriors provides a measure of the uncertainties of the target quantity. The time-resolving analysis of uncertainty propagation in the ocean model reveals transient and steady-state uncertainty regimes. The system is applied to quantifying uncertainties in Drake Passage transport in a global barotropic configuration of the MITgcm. The model is constrained by synthetic observations of sea surface height and velocities. The control space consists of two-dimensional maps of initial conditions (velocities and sea surface height), surface boundary conditions (wind stress), and model parameters (bottom drag), amounting to a 10^5-dimensional space of uncertain variables. It is demonstrated how the choice of observations and their geographic coverage determines the reduction in uncertainties of the estimated transport. The system also yields information on how well control parameters are

  4. Quantification of Uncertainties in Projections of Hydro-meteorological Extremes

    NASA Astrophysics Data System (ADS)

    Meresa, Hadush; Romanowicz, Renata; Lawrence, Deborah

    2016-04-01

    The impact of climate change on hydrological extremes has been widely studied, particularly after the publication of the IPCC AR4 report in 2007. The methodology most commonly applied to derive hydrological extremes under climate change consists of running a cascade of models, starting from assumed emission scenarios applied to a general circulation model (GCM) and ending with hydrological model simulations. The projected hydro-meteorological extremes are therefore highly uncertain, owing to uncertainties inherent in every link of the modelling chain. In addition, because hydrologic models use a large number of parameters to characterize hydrologic processes, many challenges arise with respect to the quantification of uncertainty. This issue needs to be properly quantified to understand the possible confidence ranges of future extremes. This paper aims to quantify the uncertainty in hydrological projections of future extremes in streamflow and precipitation indices in mountainous and lowland catchments in Poland, using a multi-model approach based on climate projections obtained from the ENSEMBLES and EURO-CORDEX projects, multiple realizations of catchment-scale downscaled rainfall, two hydrological models (HBV and GR4J), and a number of hydrological model parameter sets. The time span of the projections covers the 21st century. The potential sources of uncertainty in the hydrological projections are quantified through a Monte Carlo-based simulation approach. We compare weights based on different goodness-of-fit criteria in their ability to constrain the uncertainty of the extremes. The results of the comparison show a considerable dependence of the uncertainty ranges on the type of extreme (low or high flows) and on the criterion used. The predicted distribution of future streamflows considering all sources of uncertainty (climate model, bias correction, and hydrological model) is used to derive marginal distributions of uncertainty related to

  5. Pitot-tube flowmeter for quantification of airflow during sleep.

    PubMed

    Kirkness, J P; Verma, M; McGinley, B M; Erlacher, M; Schwartz, A R; Smith, P L; Wheatley, J R; Patil, S P; Amis, T C; Schneider, H

    2011-02-01

    validate the pitot flowmeter for the quantification of airflow and the detection of breathing reduction during polysomnographic sleep studies. We speculate that quantifying airflow during sleep can differentiate phenotypic traits related to sleep-disordered breathing. PMID:21178245

  6. Quantification of Iron Oxides and Hydroxides in Desert Aeolian Particles

    NASA Astrophysics Data System (ADS)

    Lafon, S.; Rajot, J.; Alfaro, S.; Gaudichet, A.

    2002-12-01

    Long-range transport of desert dust over the oceans constitutes a source of iron for surface waters. Assessing the iron cycle and its biogeochemical implications in oceanic areas requires determination and quantification of the iron status in aeolian particles. In such aerosols, the iron is either trapped in the silicate structure or present in the form of oxides and hydroxides (free iron). We propose a method to apportion iron between free and entrapped forms in mineral aerosols. It consists of adapting a well-known method of soil characterization to the treatment of aerosol samples, which represent less than 1 mg of material collected by air filtration on polycarbonate filters. The iron oxides and hydroxides are extracted selectively using the combined action of reductive and complexing agents in a buffered solution. The iron content is measured before and after this chemical extraction using X-ray fluorescence spectrometry. We give values for three main desert source areas using aerosol samples collected near Niamey (Niger), either during Harmattan events or during local erosion events, and samples collected downwind of the Gobi desert in China. The results emphasize, first, that iron trapped in the structure of silicate minerals represents an important part of the total iron content; this suggests that, regarding dissolution processes in seawater, the total elemental iron content of aeolian dust cannot be used directly to calculate the flux of available iron. Second, our results show that the free iron content varies according to the origin of the dust. Niger samples have free iron contents of 4.4% (SD = 0.8) for local erosion and 2.8% (SD = 1.0) for Harmattan, and Chinese samples contain 3.7% (SD = 0.5) free iron. These differences could be linked to the parent soil mineralogical composition, which varies with geographical location, but for some of our samples they could also be linked to a size fractionation process occurring first

  7. Direct liquid chromatography method for the simultaneous quantification of hydroxytyrosol and tyrosol in red wines.

    PubMed

    Piñeiro, Zulema; Cantos-Villar, Emma; Palma, Miguel; Puertas, Belen

    2011-11-01

    A validated HPLC method with fluorescence detection for the simultaneous quantification of hydroxytyrosol and tyrosol in red wines is described. Detection conditions for both compounds were optimized (excitation at 279 and 278 nm and emission at 631 and 598 nm for hydroxytyrosol and tyrosol, respectively). The validation of the analytical method was based on selectivity, linearity, robustness, detection and quantification limits, repeatability, and recovery. The detection and quantification limits in red wines were set at 0.023 and 0.076 mg L^-1 for hydroxytyrosol and at 0.007 and 0.024 mg L^-1 for tyrosol, respectively. Precision values, both within-day and between-day (n = 5), remained below 3% for both compounds. In addition, a fractional factorial experimental design was developed to analyze the influence of six different conditions on the analysis. The final optimized HPLC-fluorescence method allowed the analysis of 30 nonpretreated Spanish red wines to evaluate their hydroxytyrosol and tyrosol contents.

  8. Application of synchrotron radiation computed microtomography for quantification of bone microstructure in human and rat bones

    NASA Astrophysics Data System (ADS)

    Nogueira, Liebert Parreiras; Barroso, Regina Cély; de Almeida, André Pereira; Braz, Delson; de Almeida, Carlos Eduardo; de Andrade, Cherley Borba; Tromba, Giuliana

    2012-05-01

    This work evaluates histomorphometric quantification by synchrotron radiation computed microtomography in bones of human and rat specimens. Bone specimens were classified as normal or pathological (human samples) and as irradiated or non-irradiated (rat samples): the human bones either had or had not been affected by injury, and the rat bones either had or had not been irradiated in a simulation of radiotherapy procedures. Images were obtained on the SYRMEP beamline at the Elettra Synchrotron Laboratory in Trieste, Italy; the system generated tomographic images with 14 μm resolution. The quantification of bone structures was performed directly on the 3D rendered images using in-house software. The resolution achieved was excellent, facilitating the quantification of bone microstructures.

  9. Nanotechnology-based strategies for the detection and quantification of microRNA.

    PubMed

    Degliangeli, Federica; Pompa, Pier Paolo; Fiammengo, Roberto

    2014-07-28

    MicroRNAs (miRNAs) are important regulators of gene expression, and many pathological conditions, including cancer, are characterized by altered miRNA expression levels. Accurate and sensitive quantification of miRNAs may therefore enable correct disease diagnosis, establishing these small noncoding RNA transcripts as valuable biomarkers. Aiming to overcome some limitations of conventional quantification strategies, nanotechnology is currently providing numerous significant alternatives for miRNA sensing. In this review, an up-to-date account of nanotechnology-based strategies for miRNA detection and quantification is given. The topics covered are: nanoparticle-based approaches in solution, sensing based on nanostructured surfaces, combined nanoparticle/surface sensing approaches, and single-molecule approaches.

  10. Methods for Quantification of Soil-Transmitted Helminths in Environmental Media: Current Techniques and Recent Advances.

    PubMed

    Collender, Philip A; Kirby, Amy E; Addiss, David G; Freeman, Matthew C; Remais, Justin V

    2015-12-01

    Limiting the environmental transmission of soil-transmitted helminths (STHs), which infect 1.5 billion people worldwide, will require sensitive, reliable, and cost-effective methods to detect and quantify STHs in the environment. We review the state-of-the-art of STH quantification in soil, biosolids, water, produce, and vegetation with regard to four major methodological issues: environmental sampling; recovery of STHs from environmental matrices; quantification of recovered STHs; and viability assessment of STH ova. We conclude that methods for sampling and recovering STHs require substantial advances to provide reliable measurements for STH control. Recent innovations in the use of automated image identification and developments in molecular genetic assays offer considerable promise for improving quantification and viability assessment.

  11. Automatic 3D Shape Severity Quantification and Localization for Deformational Plagiocephaly

    PubMed Central

    Atmosukarto, Indriyati; Shapiro, Linda G.; Cunningham, Michael L.; Speltz, Matthew

    2009-01-01

    Recent studies have shown an increase in the occurrence of deformational plagiocephaly and brachycephaly in children. This increase has coincided with the "Back to Sleep" campaign that was introduced to reduce the risk of Sudden Infant Death Syndrome (SIDS). However, there has yet to be an objective quantification of the degree of severity of these two conditions; most diagnoses are based on subjective factors such as patient history and physician examination. An objective quantification would support research into diagnosis and intervention measures, and would provide a tool for finding correlations between shape severity and cognitive outcome. This paper describes a new shape severity quantification and localization method for deformational plagiocephaly and brachycephaly. Our results show a positive correlation between the new shape severity measure and the scores entered by a human expert. PMID:21103039

  12. Simultaneous digital quantification and fluorescence-based size characterization of massively parallel sequencing libraries.

    PubMed

    Laurie, Matthew T; Bertout, Jessica A; Taylor, Sean D; Burton, Joshua N; Shendure, Jay A; Bielas, Jason H

    2013-08-01

    Due to the high cost of failed runs and suboptimal data yields, quantification and determination of fragment size range are crucial steps in the library preparation process for massively parallel sequencing (or next-generation sequencing). Current library quality control methods commonly involve quantification using real-time quantitative PCR and size determination using gel or capillary electrophoresis. These methods are laborious and subject to a number of significant limitations that can make library calibration unreliable. Herein, we propose and test an alternative method for quality control of sequencing libraries using droplet digital PCR (ddPCR). By exploiting a correlation we have discovered between droplet fluorescence and amplicon size, we achieve the joint quantification and size determination of target DNA with a single ddPCR assay. We demonstrate the accuracy and precision of applying this method to the preparation of sequencing libraries.

  13. Optimization of diclofenac quantification from wastewater treatment plant sludge by ultrasonication assisted extraction.

    PubMed

    Topuz, Emel; Sari, Sevgi; Ozdemir, Gamze; Aydin, Egemen; Pehlivanoglu-Mantas, Elif; Okutman Tas, Didem

    2014-05-01

    A rapid method for the quantification of diclofenac in sludge samples, based on ultrasonication-assisted extraction and solid-phase extraction (SPE), was developed and used to quantify diclofenac concentrations in sludge samples by liquid chromatography/tandem mass spectrometry (LC-MS/MS). Although the concentration of diclofenac in sludge samples taken from different units of wastewater treatment plants in Istanbul was below the limit of quantification (LOQ; 5 ng/g), an optimized method for sludge samples, together with total mass balances in a wastewater treatment plant, can be used to determine the phase with which diclofenac is mostly associated. The results will thus provide information on the fate and transport of diclofenac, as well as on the necessity of alternative removal processes. In addition, since the optimization procedure is described in detail, other researchers can use it as a starting point for the determination of other emerging pollutants in wastewater sludge samples. PMID:24704687

  14. A high-throughput semi-quantification method for screening organic contaminants in river sediments.

    PubMed

    Bu, Qingwei; Wang, Donghong; Liu, Xin; Wang, Zijian

    2014-10-01

    A high-throughput semi-quantification method for screening nearly 900 organic contaminants (OCs) in river sediments has been developed. For most OCs tested, concentrations calculated from the proposed semi-quantification method were within a factor of 4 of actual values. The overall recovery tests indicated that most OCs can be successfully extracted from sediments, with recovery rates from 84.1 to 128.6%. To demonstrate the effectiveness of the method for OC quantification, we screened OCs in sediments collected from the Haihe River basin. Seventy unregulated OCs (including pesticides, flame retardants, PPCPs, etc.) were identified and quantified at concentrations up to 2600 ng/g in 24 sediment samples. These results confirm that the developed method is a useful way to perform a comprehensive analysis of OCs in sediments and is valuable for the identification and prioritization of priority pollutants in watershed management.

  15. Quantification and normalization of noise variance with sparsity regularization to enhance diffuse optical tomography

    PubMed Central

    Yao, Jixing; Tian, Fenghua; Rakvongthai, Yothin; Oraintara, Soontorn; Liu, Hanli

    2015-01-01

    Conventional reconstruction of diffuse optical tomography (DOT) is based on Tikhonov regularization and a white Gaussian noise assumption; consequently, the reconstructed DOT images usually have low spatial resolution. In this work, we derived a novel quantification method for noise variance based on the linear Rytov approximation of the photon diffusion equation. Specifically, we implemented this quantification of noise variance to normalize the measurement signals from all source-detector channels, together with sparsity regularization, to provide high-quality DOT images. Multiple experiments with computer simulations and laboratory phantoms were performed to validate and support the newly developed algorithm. The reconstructed images demonstrate that quantification and normalization of noise variance with sparsity regularization (QNNVSR) is an effective reconstruction approach that greatly enhances the spatial resolution and shape fidelity of DOT images. Since the noise variance can be estimated by our derived expression with relatively limited resources, this approach is practically useful for many DOT applications. PMID:26309760

  16. Quantification of nerolidol in mouse plasma using gas chromatography–mass spectrometry

    PubMed Central

    Saito, Alexandre Yukio; Sussmann, Rodrigo Antonio Ceschini; Kimura, Emilia Akemi; Cassera, Maria Belen; Katzin, Alejandro Miguel

    2015-01-01

    Nerolidol is a naturally occurring sesquiterpene found in the essential oils of many types of flowers and plants. It is frequently used in cosmetics, as a food flavoring agent, and in cleaning products. In addition, nerolidol is used as a skin penetration enhancer for transdermal delivery of therapeutic drugs. However, nerolidol is hemolytic at low concentrations. A simple and fast GC–MS method was developed for preliminary quantification and assessment of biological interferences of nerolidol in mouse plasma after oral dosing. Calibration curves were linear in the concentration range of 0.010–5 μg/mL nerolidol in mouse plasma with correlation coefficients (r) greater than 0.99. Limits of detection and quantification were 0.0017 and 0.0035 μg/mL, respectively. The optimized method was successfully applied to the quantification of nerolidol in mouse plasma. PMID:25880240

  17. Direct potentiometric quantification of histamine using solid-phase imprinted nanoparticles as recognition elements.

    PubMed

    Basozabal, Itsaso; Guerreiro, Antonio; Gomez-Caballero, Alberto; Aranzazu Goicolea, M; Barrio, Ramón J

    2014-08-15

    A new potentiometric sensor based on molecularly imprinted nanoparticles produced via the solid-phase imprinting method was developed. For histamine quantification, the nanoparticles were incorporated within a membrane, which was then used to fabricate an ion-selective electrode. The use of nanoparticles with high affinity and specificity allowed label-free detection and quantification of histamine in real samples with short response times. The sensor could selectively quantify histamine in the presence of other biogenic amines in real wine and fish matrices. The limit of detection achieved was 1.12 × 10^-6 mol L^-1, with a linear range between 10^-6 and 10^-2 mol L^-1 and a response time below 20 s, making the sensor a promising tool for the direct quantification of histamine in the food industry.

  18. Antioxidant Activity and Validation of Quantification Method for Lycopene Extracted from Tomato.

    PubMed

    Cefali, Letícia Caramori; Cazedey, Edith Cristina Laignier; Souza-Moreira, Tatiana Maria; Correa, Marcos Antônio; Salgado, Hérida Regina Nunes; Isaac, Vera Lucia Borges

    2015-01-01

    Lycopene is a carotenoid found in tomatoes with potent antioxidant activity. The aim of the study was to obtain extracts containing lycopene from four types of tomatoes, to validate an HPLC method for quantifying lycopene in the extracts, and to assess their antioxidant activity. Results revealed that the tomatoes analyzed contained lycopene and exhibited antioxidant activity; salad tomato presented the highest concentration of this carotenoid and the highest antioxidant activity. The quantification method exhibited linearity with a correlation coefficient of 0.9992. Tests for the assessment of precision, accuracy, and robustness yielded coefficients of variation below 5%. The LOD and LOQ were 0.0012 and 0.0039 μg/mL, respectively. Salad tomato can be used as a source of lycopene for the development of topical formulations and, based on the tests performed, the chosen method for the identification and quantification of lycopene was considered linear, precise, accurate, selective, and robust. PMID:26525253

  19. Critical review of current and emerging quantification methods for the development of influenza vaccine candidates.

    PubMed

    Manceur, Aziza P; Kamen, Amine A

    2015-11-01

    Significant improvements in production and purification have been achieved since the first approved influenza vaccines were administered 75 years ago. Global surveillance and fast response have limited the impact of the last pandemic in 2009. In the case of another pandemic, vaccines can be generated within three weeks with certain platforms. However, our Achilles heel is at the quantification level. Production of reagents for the quantification of new vaccines using the SRID, the main method formally approved by regulatory bodies, requires two to three months. The impact of such delays can be tragic for vulnerable populations. Therefore, efforts have been directed toward developing alternative quantification methods that are sensitive, accurate, easy to implement and independent of the availability of specific reagents. The use of newly developed antibodies against a conserved region of hemagglutinin (HA), a surface protein of influenza, holds great promise, as they are able to recognize multiple subtypes of influenza; these new antibodies could be used in immunoassays such as ELISA and slot-blot analysis. HA concentration can also be determined using reversed-phase high-performance liquid chromatography (RP-HPLC), which obviates the need for antibodies but still requires a reference standard. The number of viral particles can be evaluated using ion-exchange HPLC and techniques based on flow cytometry principles, but non-viral vesicles have to be taken into account with cellular production platforms. As new production systems are optimized, new quantification methods that are adapted to the type of vaccine produced are required. The nature of these new-generation vaccines might dictate which quantification method to use. In all cases, an alternative method will have to be validated against the current SRID assay. A consensus among the scientific community would have to be reached so that the adoption of new quantification methods would be harmonized between

  1. Application of relative quantification TaqMan real-time polymerase chain reaction technology for the identification and quantification of Thunnus alalunga and Thunnus albacares.

    PubMed

    Lopez, Itziar; Pardo, Miguel Angel

    2005-06-01

    A novel one-step methodology based on real-time polymerase chain reaction (PCR) technology has been developed for the identification of two of the most valuable tuna species. Species identification of seafood products is nowadays a major concern due to the import into Europe of new species from other countries. To achieve this aim, two specific TaqMan systems were devised to identify Thunnus alalunga and Thunnus albacares. Another system specific to Scombroidei species was devised as a consensus system. In addition, a relative quantification methodology was carried out to quantify T. alalunga and T. albacares in mixtures, in which the relative amount of the target was compared with the consensus. This relative quantification methodology does not require a known amount of standard, allowing many more samples to be analysed together and saving costs and time. Real-time PCR requires no post-amplification sample handling, preventing contamination and yielding much faster and higher-throughput results. PMID:15913324
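
    The reference-based relative quantification described above can be illustrated with the usual 2^-ΔCt arithmetic, comparing the species-specific Ct with the consensus-system Ct; the values below are invented and equal PCR efficiencies are assumed:

        # Hypothetical Ct values from one mixture sample
        ct_consensus = 21.0   # consensus (all Scombroidei) system
        ct_alalunga  = 23.3   # Thunnus alalunga-specific system

        # Fraction of target relative to total, assuming equal (100%) PCR efficiencies
        delta_ct = ct_alalunga - ct_consensus
        fraction = 2 ** (-delta_ct)
        print(f"T. alalunga ~ {100 * fraction:.1f}% of total tuna DNA")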

  2. Metal Stable Isotope Tagging: Renaissance of Radioimmunoassay for Multiplex and Absolute Quantification of Biomolecules.

    PubMed

    Liu, Rui; Zhang, Shixi; Wei, Chao; Xing, Zhi; Zhang, Sichun; Zhang, Xinrong

    2016-05-17

    The unambiguous quantification of biomolecules is of great significance in fundamental biological research as well as practical clinical diagnosis. Due to the lack of a detectable moiety, the direct and highly sensitive quantification of biomolecules is often a "mission impossible". Consequently, tagging strategies to introduce detectable moieties for labeling target biomolecules were invented, which had a long and significant impact on studies of biomolecules in the past decades. For instance, immunoassays have been developed with radioisotope tagging by Yalow and Berson in the late 1950s. The later languishment of this technology can be almost exclusively ascribed to the use of radioactive isotopes, which led to the development of nonradioactive tagging strategy-based assays such as enzyme-linked immunosorbent assay, fluorescent immunoassay, and chemiluminescent and electrochemiluminescent immunoassay. Despite great success, these strategies suffered from drawbacks such as limited spectral window capacity for multiplex detection and inability to provide absolute quantification of biomolecules. After recalling the sequences of tagging strategies, an apparent question is why not use stable isotopes from the start? A reasonable explanation is the lack of reliable means for accurate and precise quantification of stable isotopes at that time. The situation has changed greatly at present, since several atomic mass spectrometric methods for metal stable isotopes have been developed. Among the newly developed techniques, inductively coupled plasma mass spectrometry is an ideal technique to determine metal stable isotope-tagged biomolecules, for its high sensitivity, wide dynamic linear range, and more importantly multiplex and absolute quantification ability. Since the first published report by our group, metal stable isotope tagging has become a revolutionary technique and gained great success in biomolecule quantification. An exciting research highlight in this area

  3. Quantification of hydrogen peroxide during the low-temperature oxidation of alkanes

    PubMed Central

    Bahrini, Chiheb; Herbinet, Olivier; Glaude, Pierre-Alexandre; Schoemaecker, Coralie; Fittschen, Christa; Battin-Leclerc, Frédérique

    2013-01-01

    The first reliable quantification of hydrogen peroxide (H2O2) formed during the low-temperature oxidation of an organic compound has been achieved thanks to a new system that couples a jet-stirred reactor to detection by continuous-wave cavity ring-down spectroscopy (cw-CRDS) in the near infrared. The quantification of this key compound of the hydrocarbon low-temperature oxidation regime was obtained under conditions close to those actually observed before autoignition. The studied hydrocarbon was n-butane, the smallest alkane whose oxidation behaviour is close to that of the species present in gasoline and diesel fuels. PMID:22746212

  4. Dakota uncertainty quantification methods applied to the NEK-5000 SAHEX model.

    SciTech Connect

    Weirs, V. Gregory

    2014-03-01

    This report summarizes the results of a NEAMS project focused on the use of uncertainty and sensitivity analysis methods within the NEK-5000 and Dakota software framework for assessing failure probabilities as part of probabilistic risk assessment. NEK-5000 is a software tool under development at Argonne National Laboratory to perform computational fluid dynamics calculations for applications such as thermohydraulics of nuclear reactor cores. Dakota is a software tool developed at Sandia National Laboratories containing optimization, sensitivity analysis, and uncertainty quantification algorithms. The goal of this work is to demonstrate the use of uncertainty quantification methods in Dakota with NEK-5000.
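
    As a toy illustration of the kind of forward uncertainty propagation described here (not the actual Dakota/NEK-5000 coupling), one can sample uncertain inputs and push them through an inexpensive stand-in model to estimate output statistics and a failure probability:

        import numpy as np

        rng = np.random.default_rng(0)

        def model(k, q):
            # Stand-in for an expensive CFD run: peak temperature as a
            # hypothetical function of conductivity k and heat source q
            return 300.0 + q / (4.0 * np.pi * k)

        # Uncertain inputs: conductivity ~ lognormal, source ~ normal (assumed distributions)
        k = rng.lognormal(mean=np.log(2.0), sigma=0.1, size=10_000)
        q = rng.normal(loc=500.0, scale=25.0, size=10_000)

        t_peak = model(k, q)
        print(f"mean = {t_peak.mean():.1f} K, std = {t_peak.std():.1f} K")
        print(f"P(T > 325 K) = {(t_peak > 325.0).mean():.3f}")  # failure probability estimate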

  5. Uncertainty quantification in the presence of limited climate model data with discontinuities.

    SciTech Connect

    Safta, Cosmin; Debusschere, Bert J.; Najm, Habib N.; Sargsyan, Khachik

    2009-12-01

    Uncertainty quantification in climate models is challenged by the sparsity of the available climate data due to the high computational cost of the model runs. Another feature that prevents classical uncertainty analyses from being easily applicable is the bifurcative behavior in the climate data with respect to certain parameters. A typical example is the Meridional Overturning Circulation in the Atlantic Ocean. The maximum overturning stream function exhibits discontinuity across a curve in the space of two uncertain parameters, namely climate sensitivity and CO2 forcing. We develop a methodology that performs uncertainty quantification in this context in the presence of limited data.

  7. Don't forget methylmalonic acid quantification in symptomatic exclusively breast-fed infants.

    PubMed

    Van Noolen, L; Nguyen-Morel, M A; Faure, P; Corne, C

    2014-08-01

    Vitamin B12 deficiency can lead to serious haematological and neurological signs in infants. The reported clinical cases of vitamin B12 deficiency were found in exclusively breast-fed infants whose asymptomatic mothers were diagnosed later with pernicious anaemia. For the infants, the diagnosis required urinary methylmalonic acid quantification (grossly elevated in these two cases) and treatment rapidly improved the clinical signs. These cases underline the serious consequences of vitamin B12 deficiency in infants and the helpful role of early methylmalonic acid quantification for diagnosis.

  8. In vivo quantification of cochlin in glaucomatous DBA/2J mice using optical coherence tomography

    PubMed Central

    Wang, Jianhua; Aljohani, Ayman; Carreon, Teresia; Gregori, Giovanni; Bhattacharya, Sanjoy K.

    2015-01-01

    The expression of cochlin in the trabecular meshwork (TM) precedes the clinical glaucoma symptoms in DBA/2J mice. The ability to quantify cochlin in the local tissue (TM) offers potential diagnostic and prognostic value. We present two (spectroscopic and magnetomotive) optical coherence tomography (OCT) approaches for periodic in vivo cochlin quantification. The cochlin-antibody OCT signal remains stable for up to 24 hours, as seen from 3.5 hours after injection, allowing for repeated quantification in living mouse eyes. PMID:26047051

  9. Nanomagnetic competition assay for low-abundance protein biomarker quantification in unprocessed human sera.

    PubMed

    Li, Yuanpeng; Srinivasan, Balasubramanian; Jing, Ying; Yao, Xiaofeng; Hugger, Marie A; Wang, Jian-Ping; Xing, Chengguo

    2010-03-31

    A novel detection platform, based on a giant magnetoresistive sensor and uniform high-magnetic-moment FeCo nanoparticles (12.8 nm) with minimized detection distance, was developed for rapid biomolecule quantification from body fluids. Such a system demonstrates specific, accurate, and quick detection and quantification of interleukin-6, a low-abundance protein and a potential cancer biomarker, directly in 4 μL of unprocessed human sera. This platform is expected to facilitate the identification and validation of disease biomarkers. It may eventually lead to a low-cost personal medical device for chronic disease early detection, diagnosis, and prognosis.

  10. Quantification of biofilm exopolysaccharides using an in situ assay with periodic acid-Schiff reagent.

    PubMed

    Randrianjatovo-Gbalou, I; Girbal-Neuhauser, E; Marcato-Romain, C-E

    2016-05-01

    A novel approach to the quantification of extracellular polysaccharides in miniaturized biofilms presenting a wide variety of extracellular matrices was developed. The assay used the periodic acid-Schiff reagent and was first calibrated on dextran and alginate solutions. Then it was implemented on 24-h and 48-h biofilms from three strains known to produce different exopolymeric substances (Pseudomonas aeruginosa, Bacillus licheniformis, Weissella confusa). The assay allowed quantification of the total exopolysaccharides, taking into account possible interferences due to cells or other main exopolymers of the matrix (eDNA, proteins).

  11. Synthesis of nanodiamond derivatives carrying amino functions and quantification by a modified Kaiser test

    PubMed Central

    Jarre, Gerald; Heyer, Steffen; Memmel, Elisabeth; Meinhardt, Thomas

    2014-01-01

    Nanodiamonds functionalized with different organic moieties carrying terminal amino groups have been synthesized. These include conjugates generated by Diels–Alder reactions of ortho-quinodimethanes formed in situ from pyrazine and 5,6-dihydrocyclobuta[d]pyrimidine derivatives. For the quantification of primary amino groups a modified photometric assay based on the Kaiser test has been developed and validated for different types of aminated nanodiamond. The results correspond well to values obtained by thermogravimetry. The method represents an alternative wet-chemical quantification method in cases where other techniques like elemental analysis fail due to unfavourable combustion behaviour of the analyte or other impediments. PMID:25550737

  12. 76 FR 29752 - Nomination of In Vitro Test Methods for Detection and Quantification of Botulinum Neurotoxins and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-23

    ... meeting (67 FR 23323), comments and data are requested by June 2, 2011. NICEATM and ICCVAM will accept... for the Detection and Quantification of BoNTs. In 2006, NICEATM and ICCVAM convened a workshop... for the detection and quantification of BoNTs. These tests include the in vitro BoTest™ and...

  13. Secondary Students' Quantification of Ratio and Rate: A Framework for Reasoning about Change in Covarying Quantities

    ERIC Educational Resources Information Center

    Johnson, Heather Lynn

    2015-01-01

    Contributing to a growing body of research addressing secondary students' quantitative and covariational reasoning, the multiple case study reported in this article investigated secondary students' quantification of ratio and rate. This article reports results from a study investigating students' quantification of rate and ratio as…

  14. Quantification of Water Erosion on Subalpine Grassland with Rain Simulators

    NASA Astrophysics Data System (ADS)

    Schindler, Y.; Alewell, Ch.; Burri, K.; Bänninger, D.

    2009-04-01

    Intensive land use and increasing storm events trigger rain erosion, thus its quantification is important. The aim of this study was to assess the influence of the vegetation on runoff and water erosion in an alpine grassland area. Further, we estimated the influence of vegetation on the soil characteristics matrix stability and C/N ratio, and assessed the relationship of those parameters, as well as the grain size distribution, with erosion and runoff rates. To test the above hypotheses, a field spray-nozzle/drop-former hybrid simulator was used, consisting of a full-cone Lechler nozzle and a mesh fixed below it to improve the raindrop distribution. Prior to the field experiment, we compared this simulator with a drop-former simulator in the laboratory at the Swiss Federal Institute for Forest, Snow and Landscape Research (WSL) in terms of drop size distribution and kinetic energy. Thereby, we could estimate the accuracy of the field simulator. The raindrop size distribution and the total kinetic energy of the drops at a rain intensity of 60 mm h⁻¹ were measured with a Joss-Waldvogel disdrometer. To compare the effect of the two rain simulators as well as the influence of the soil texture on erosion and runoff rate, we used 6 silty soil monoliths and 6 clayey monoliths. To obtain comparable initial conditions, every soil monolith was irrigated only once, starting at field capacity. The soil moisture was continuously recorded by TDR probes during the simulation. The comparison of the two rain simulators showed a close similarity in the drop size distributions. For both simulators, the most frequent drop size class is in the range of 1 mm in diameter. Natural rain typically shows a larger mean drop size at an intensity of 60 mm h⁻¹. In comparison to natural rain, the total kinetic energy of the simulated rain of both simulators was also too small. These results lead to the conclusion that a true simulation of natural rain is hardly realizable

  15. Quantification of subsurface pore pressure through IODP drilling

    NASA Astrophysics Data System (ADS)

    Saffer, D. M.; Flemings, P. B.

    2010-12-01

    It is critical to understand the magnitude and distribution of subsurface pore fluid pressure: it controls effective stress and thus mechanical strength, slope stability, and sediment compaction. Elevated pore pressures also drive fluid flows that serve as agents of mass, solute, and heat fluxes. The Ocean Drilling Program (ODP) and Integrated Ocean Drilling Program (IODP) have provided important avenues to quantify pore pressure in a range of geologic and tectonic settings. These approaches include 1) analysis of continuous downhole logs and shipboard physical properties data to infer compaction state and in situ pressure and stress, 2) laboratory consolidation testing of core samples collected by drilling, 3) direct downhole measurements using pore pressure probes, 4) pore pressure and stress measurements using downhole tools that can be deployed in the wide-diameter pipe recently acquired for riser drilling, and 5) long-term monitoring of formation pore pressure in sealed boreholes within hydraulically isolated intervals. Here, we summarize key advances in the quantification of subsurface pore pressure rooted in scientific drilling, highlighting examples from subduction zones, the Gulf of Mexico, and the New Jersey continental shelf. At the Nankai, Costa Rica, and Barbados subduction zones, consolidation testing of core samples, combined with analysis of physical properties data, indicates that even within a few km landward of the trench, pore pressures in and below plate boundary décollement zones reach a significant fraction of the lithostatic load (λ*=0.25-0.91). These results document a viable and quantifiable mechanism to explain the mechanical weakness of subduction décollements, and are corroborated by a small number of direct measurements in sealed boreholes and by inferences from seismic reflection data. Recent downhole measurements conducted during riser drilling using the modular formation dynamics tester wireline tool (MDT) in a forearc basin ~50
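
    The modified pore pressure ratio λ* quoted above normalizes pore pressure between its hydrostatic and lithostatic bounds; the arithmetic, with invented pressures:

        def lambda_star(p_pore: float, p_hydro: float, p_litho: float) -> float:
            """Modified pore pressure ratio: 0 = hydrostatic, 1 = lithostatic."""
            return (p_pore - p_hydro) / (p_litho - p_hydro)

        # Hypothetical values at one depth (MPa)
        print(f"lambda* = {lambda_star(p_pore=42.0, p_hydro=30.0, p_litho=50.0):.2f}")  # 0.60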

  16. High Resolution Quantification of Cellular Forces for Rigidity Sensing

    NASA Astrophysics Data System (ADS)

    Liu, Shuaimin

    This thesis describes a comprehensive study of the mechanism of rigidity sensing by quantitative analysis using submicron pillar array substrates. From a mechanobiology perspective, we explore and study molecular pathways involved in rigidity and force sensing at cell-matrix adhesions with regard to cancer, regeneration, and development by quantification methods. In Chapters 2 and 3, we developed fabrication and imaging techniques to enhance the performance of a submicron pillar device in terms of spatial and temporal measurement ability, and we discovered a correlation between rigidity sensing forces and corresponding proteins involved in the early rigidity sensing events. In Chapter 2, we introduce optical effects arising from submicron structure imaging, and we describe a technique to identify the correct focal plane of the pillar tip by fabricating a substrate with designed-offset pillars. From the calibration result, we identified the correct focal plane that was previously overlooked, and verified our findings by other imaging techniques. In Chapter 3, we describe several techniques to selectively functionalize elastomeric pillar tops and compare these techniques in terms of purposes and fabrication complexity. Techniques introduced in this chapter include direct labeling, such as stamping of fluorescent substances (organic dye, nanodiamond, q-dot) onto pillar tops, as well as indirect labeling that selectively modifies the surface of molds with either metal or fluorescent substances. In Chapter 4, we examined the characteristics of local contractility forces and identified the components forming a sarcomere-like contractile unit (CU) that cells use to sense rigidity. CUs were found to be assembled at the cell edge, contain myosin II, alpha-actinin, tropomodulin and tropomyosin (Tm), and resemble sarcomeres in size (~2 μm) and function. We then performed quantitative analysis of CUs to evaluate rigidity sensing activity over an ~8-hour time course and found that

  17. Greenhouse Gas Source Attribution: Measurements Modeling and Uncertainty Quantification

    SciTech Connect

    Liu, Zhen; Safta, Cosmin; Sargsyan, Khachik; Najm, Habib N.; van Bloemen Waanders, Bart Gustaaf; LaFranchi, Brian W.; Ivey, Mark D.; Schrader, Paul E.; Michelsen, Hope A.; Bambha, Ray P.

    2014-09-01

    In this project we have developed atmospheric measurement capabilities and a suite of atmospheric modeling and analysis tools that are well suited for verifying emissions of greenhouse gases (GHGs) on an urban-through-regional scale. We have for the first time applied the Community Multiscale Air Quality (CMAQ) model to simulate atmospheric CO2. This will allow for the examination of regional-scale transport and distribution of CO2 along with air pollutants traditionally studied using CMAQ at relatively high spatial and temporal resolution, with the goal of leveraging emissions verification efforts for both air quality and climate. We have developed a bias-enhanced Bayesian inference approach that can remedy the well-known problem of transport model errors in atmospheric CO2 inversions. We have tested the approach using data and model outputs from the TransCom3 global CO2 inversion comparison project. We have also performed two prototyping studies on inversion approaches in the generalized convection-diffusion context. One of these studies employed Polynomial Chaos Expansion to accelerate the evaluation of a regional transport model and enable efficient Markov Chain Monte Carlo sampling of the posterior for Bayesian inference. The other approach uses deterministic inversion of a convection-diffusion-reaction system in the presence of uncertainty. These approaches should, in principle, be applicable to realistic atmospheric problems with moderate adaptation. We outline a regional greenhouse gas source inference system that integrates (1) two approaches of atmospheric dispersion simulation and (2) a class of Bayesian inference and uncertainty quantification algorithms. We use two different and complementary approaches to simulate atmospheric dispersion. Specifically, we use a Eulerian chemical transport model, CMAQ, and a Lagrangian particle dispersion model, FLEXPART-WRF. These two models share the same WRF

  18. Photoacoustic sensor system for the quantification of soot aerosols (abstract)

    NASA Astrophysics Data System (ADS)

    Haisch, C.; Beck, H.; Niessner, R.

    2003-01-01

    The influence of soot particles on human health as well as on global and local climate is well established by now. Hence, the need for fast and sensitive soot detection in urban and remote areas is obvious. The state-of-the-art thermochemical detection methods for soot analysis are based on filter sampling and subsequent wet-chemical analysis and combustion, which require laborious and time-consuming sample preparation. Due to the integration on a filter, a time-resolved analysis is not possible. The presented photoacoustic sensor system is optimized for highly sensitive and fast on-line and in situ quantification of soot. Soot particles, as classical "black absorbers," absorb electromagnetic radiation over the whole spectrum. Two similar systems are introduced. The first system is designed for the development and testing of combustion engines, mainly the next generation of diesel engines. In the next decade, legal thresholds for extremely low particle emissions are foreseen. Their implementation will only be possible if time-resolved soot detection with sufficient sensitivity can be realized, as the highest particle emissions from diesel engines are generated only for seconds during load changes. During a load change, the emitted soot concentrations can rise by several orders of magnitude for a period of only a few seconds. The system combines a time resolution of 1 s (sampling rate 1 Hz) with an aerosol mass sensitivity better than 10 μg m⁻³. Up to a maximum dimension of about 800 nm the signal is independent of the particle size. The systems consist of two photoacoustic cells, which are operated in a differential mode to avoid cross sensitivities. The cells are built as acoustic resonators to increase sensitivity. A diode laser with a wavelength of 810 nm and an output power of 1.1 W is employed for excitation. Its collimated beam passes first through the reference cell and then through the measurement cell. To avoid condensation of water, the cells are heated to

  19. Emphysema quantification from CT scans using novel application of diaphragm curvature estimation: comparison with standard quantification methods and pulmonary function data

    NASA Astrophysics Data System (ADS)

    Keller, Brad M.; Reeves, Anthony P.; Yankelevitz, David F.; Henschke, Claudia I.; Barr, R. Graham

    2009-02-01

    Emphysema is a disease of the lungs that destroys the alveolar air sacs and induces long-term respiratory dysfunction. CT scans allow for imaging of the anatomical basis of emphysema and quantification of the underlying disease state. Several measures have been introduced for the quantification of emphysema directly from CT data; most, however, are based on the analysis of density information provided by the CT scans, which varies by scanner and can be hard to standardize across sites and time. Given that one of the anatomical variations associated with the progression of emphysema is the flattening of the diaphragm due to the loss of elasticity in the lung parenchyma, curvature analysis of the diaphragm would provide information about emphysema from CT. Therefore, we propose a new, non-density-based measure of the curvature of the diaphragm that allows for robust quantification. To evaluate the new method, 24 whole-lung scans were analyzed using the ratios of the lung height and diaphragm width to diaphragm height as curvature estimates, with the emphysema index as comparison. Pearson correlation coefficients showed a strong trend for several of the proposed diaphragm curvature measures to have higher correlations, of up to r=0.57, with DLCO% and VA than did the emphysema index. Furthermore, we found the emphysema index to have only a 0.27 correlation with the proposed measures, indicating that the proposed measures evaluate different aspects of the disease.
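
    A sketch of the kind of comparison performed here, computing ratio-based curvature estimates per scan and correlating them with a pulmonary function value (all numbers invented):

        import numpy as np

        # Hypothetical per-scan measurements (cm) and DLCO% values
        lung_height      = np.array([24.0, 26.5, 23.1, 27.9, 25.2])
        diaphragm_width  = np.array([21.0, 22.4, 20.2, 23.8, 21.9])
        diaphragm_height = np.array([4.2, 3.1, 4.8, 2.6, 3.6])
        dlco_pct         = np.array([88.0, 64.0, 95.0, 51.0, 72.0])

        # Flatter diaphragm (smaller dome height) -> larger ratio
        est_width = diaphragm_width / diaphragm_height
        est_lung  = lung_height / diaphragm_height

        for name, est in [("width/height", est_width), ("lung/height", est_lung)]:
            r = np.corrcoef(est, dlco_pct)[0, 1]
            print(f"Pearson r ({name} vs DLCO%) = {r:.2f}")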

  20. The exon quantification pipeline (EQP): a comprehensive approach to the quantification of gene, exon and junction expression from RNA-seq data.

    PubMed

    Schuierer, Sven; Roma, Guglielmo

    2016-09-19

    The quantification of transcriptomic features is the basis of the analysis of RNA-seq data. We present an integrated alignment workflow and a simple counting-based approach to derive estimates for gene, exon and exon-exon junction expression. In contrast to previous counting-based approaches, EQP takes into account only reads whose alignment pattern agrees with the splicing pattern of the features of interest. This leads to improved gene expression estimates as well as to the generation of exon counts that allow disambiguating reads between overlapping exons. Unlike other methods that quantify skipped introns, EQP offers a novel way to compute junction counts based on the agreement of the read alignments with the exons on both sides of the junction, thus providing a uniformly derived set of counts. We evaluated the performance of EQP on both simulated and real Illumina RNA-seq data and compared it with other quantification tools. Our results suggest that EQP provides superior gene expression estimates and we illustrate the advantages of EQP's exon and junction counts. The provision of uniformly derived high-quality counts makes EQP an ideal quantification tool for differential expression and differential splicing studies. EQP is freely available for download at https://github.com/Novartis/EQP-cluster.

  1. The exon quantification pipeline (EQP): a comprehensive approach to the quantification of gene, exon and junction expression from RNA-seq data

    PubMed Central

    Schuierer, Sven; Roma, Guglielmo

    2016-01-01

    The quantification of transcriptomic features is the basis of the analysis of RNA-seq data. We present an integrated alignment workflow and a simple counting-based approach to derive estimates for gene, exon and exon–exon junction expression. In contrast to previous counting-based approaches, EQP takes into account only reads whose alignment pattern agrees with the splicing pattern of the features of interest. This leads to improved gene expression estimates as well as to the generation of exon counts that allow disambiguating reads between overlapping exons. Unlike other methods that quantify skipped introns, EQP offers a novel way to compute junction counts based on the agreement of the read alignments with the exons on both sides of the junction, thus providing a uniformly derived set of counts. We evaluated the performance of EQP on both simulated and real Illumina RNA-seq data and compared it with other quantification tools. Our results suggest that EQP provides superior gene expression estimates and we illustrate the advantages of EQP's exon and junction counts. The provision of uniformly derived high-quality counts makes EQP an ideal quantification tool for differential expression and differential splicing studies. EQP is freely available for download at https://github.com/Novartis/EQP-cluster. PMID:27302131
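
    EQP's junction counts rest on checking that a spliced read's alignment blocks agree with the exons on both sides of the junction. A simplified compatibility test (illustrative only, not EQP's actual implementation):

        def supports_junction(read_blocks, exon_left, exon_right):
            """A spliced read supports a junction if one aligned block ends exactly
            at the 3' end of the left exon and the next block starts exactly at the
            5' start of the right exon (0-based half-open intervals assumed)."""
            for (s1, e1), (s2, e2) in zip(read_blocks, read_blocks[1:]):
                if e1 == exon_left[1] and s2 == exon_right[0] \
                        and s1 >= exon_left[0] and e2 <= exon_right[1]:
                    return True
            return False

        # Read spanning a junction between exon (100, 200) and exon (300, 400)
        print(supports_junction([(150, 200), (300, 350)], (100, 200), (300, 400)))  # True
        print(supports_junction([(150, 205), (300, 350)], (100, 200), (300, 400)))  # False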

  2. AVQS: Attack Route-Based Vulnerability Quantification Scheme for Smart Grid

    PubMed Central

    Lim, Hyunwoo; Lee, Seokjun; Shon, Taeshik

    2014-01-01

    A smart grid is a large, consolidated electrical grid system that includes heterogeneous networks and systems. Based on the data, a smart grid system has a potential security threat in its network connectivity. To solve this problem, we develop and apply a novel scheme to measure the vulnerability in a smart grid domain. Vulnerability quantification can be the first step in security analysis because it can help prioritize the security problems. However, existing vulnerability quantification schemes are not suitable for smart grid because they do not consider network vulnerabilities. We propose a novel attack route-based vulnerability quantification scheme using a network vulnerability score and an end-to-end security score, depending on the specific smart grid network environment to calculate the vulnerability score for a particular attack route. To evaluate the proposed approach, we derive several attack scenarios from the advanced metering infrastructure domain. The experimental results of the proposed approach and the existing common vulnerability scoring system clearly show that we need to consider network connectivity for more optimized vulnerability quantification. PMID:25152923
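
    The paper's exact scoring formulas are not reproduced in this abstract, but the general idea, accumulating per-hop network vulnerability along an attack route and combining it with an end-to-end security score, might be sketched as follows (the weights and the combination rule are assumptions for illustration):

        def route_vulnerability(hop_scores, e2e_security, w_net=0.7, w_e2e=0.3):
            """Toy combination: mean per-hop network vulnerability (0-10) weighted
            against an end-to-end security deficit (0-10)."""
            net = sum(hop_scores) / len(hop_scores)
            return w_net * net + w_e2e * (10.0 - e2e_security)

        # Hypothetical attack route through an AMI domain: three hops
        print(f"route score = {route_vulnerability([6.5, 8.0, 5.0], e2e_security=4.0):.2f}")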

  3. Simple and accurate quantification of quantum dots via single-particle counting.

    PubMed

    Zhang, Chun-yang; Johnson, Lawrence W

    2008-03-26

    Quantification of quantum dots (QDs) is essential to the quality control of QD synthesis, development of QD-based LEDs and lasers, functionalizing of QDs with biomolecules, and engineering of QDs for biological applications. However, simple and accurate quantification of QD concentration in a variety of buffer solutions and in complex mixtures still remains a critical technological challenge. Here, we introduce a new methodology for quantification of QDs via single-particle counting, which is conceptually different from established UV-vis absorption and fluorescence spectrum techniques where large amounts of purified QDs are needed and specific absorption coefficient or quantum yield values are necessary for measurements. We demonstrate that single-particle counting allows us to nondiscriminately quantify different kinds of QDs by their distinct fluorescence burst counts in a variety of buffer solutions regardless of their composition, structure, and surface modifications, and without the necessity of absorption coefficient and quantum yield values. This single-particle counting can also unambiguously quantify individual QDs in a complex mixture, which is practically impossible for both UV-vis absorption and fluorescence spectrum measurements. Importantly, the application of this single-particle counting is not just limited to QDs but also can be extended to fluorescent microspheres, quantum dot-based microbeads, and fluorescent nano rods, some of which currently lack efficient quantification methods.
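
    Single-particle counting of this kind reduces to thresholding a photon-burst trace and counting events; a minimal sketch on synthetic data:

        import numpy as np

        rng = np.random.default_rng(1)

        # Synthetic fluorescence trace: Poisson background with injected bursts
        trace = rng.poisson(lam=2.0, size=5000).astype(float)
        burst_positions = rng.choice(5000, size=40, replace=False)
        trace[burst_positions] += rng.normal(50.0, 10.0, size=40)

        threshold = trace.mean() + 5.0 * trace.std()
        # Count rising edges above threshold so multi-bin bursts count once
        above = trace > threshold
        bursts = int(np.sum(above[1:] & ~above[:-1]) + above[0])
        print(f"detected {bursts} bursts")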

  4. Novel primers and PCR protocols for the specific detection and quantification of Sphingobium suberifaciens in situ

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The pathogen causing corky root on lettuce, Sphingobium suberifaciens, is recalcitrant to standard epidemiological methods. Primers were selected from 16S rDNA sequences useful for the specific detection and quantification of S. suberifaciens. Conventional (PCR) and quantitative (qPCR) PCR protocols...

  5. Rapid quantification of soilborne pathogen communities in wheat-based long-term field experiments

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Traditional isolation and quantification of inoculum density is difficult for most soilborne pathogens. Quantitative PCR methods have been developed to rapidly identify and quantify many of these pathogens using a single DNA extract from soil. Rainfed experiments operated continuously for up to 84 y...

  6. Absolute and direct microRNA quantification using DNA-gold nanoparticle probes.

    PubMed

    Degliangeli, Federica; Kshirsagar, Prakash; Brunetti, Virgilio; Pompa, Pier Paolo; Fiammengo, Roberto

    2014-02-12

    DNA-gold nanoparticle probes are implemented in a simple strategy for direct microRNA (miRNA) quantification. Fluorescently labeled DNA-probe strands are immobilized on PEGylated gold nanoparticles (AuNPs). In the presence of target miRNA, DNA-RNA heteroduplexes are formed and become substrate for the endonuclease DSN (duplex-specific nuclease). Enzymatic hydrolysis of the DNA strands yields a fluorescence signal due to diffusion of the fluorophores away from the gold surface. We show that the molecular design of our DNA-AuNP probes, with the DNA strands immobilized on top of the PEG-based passivation layer, results in nearly unaltered enzymatic activity toward immobilized heteroduplexes compared to substrates free in solution. The assay, developed in a real-time format, allows absolute quantification of as little as 0.2 fmol of miR-203. We also show the application of the assay for direct quantification of cancer-related miR-203 and miR-21 in samples of extracted total RNA from cell cultures. The possibility of direct and absolute quantification may significantly advance the use of microRNAs as biomarkers in clinical practice.

  7. Geometric Foundation and Quantification of the Flow in a Verbal Expression.

    ERIC Educational Resources Information Center

    Bierschenk, Bernhard

    This paper presents the geometric foundation and quantification of Agent-action-Objective (AaO) kinematics. The meaningfulness of studying the flows in verbal expressions through splitting and splicing the strings in a verbal flow is related to the fact that free parameters are not needed, since it is not required that the presented methodological…

  8. Being Something: Prospects for a Property-Based Approach to Predicative Quantification

    ERIC Educational Resources Information Center

    Rieppel, Michael Olivier

    2013-01-01

    Few questions concerning the character of our talk about the world are more basic than how predicates combine with names to form truth-evaluable sentences. One particularly intriguing fact that any account of predication needs to make room for is that natural language allows for quantification into predicate position, through constructions like…

  9. Quantification of Dehalospirillum multivorans in Mixed-Culture Biofilms with an Enzyme-Linked Immunosorbent Assay

    PubMed Central

    Bauer-Kreisel, P.; Eisenbeis, M.; Scholz-Muramatsu, H.

    1996-01-01

    A fast, highly selective and sensitive method to quantify specific biomasses in mixed-culture biofilms is described. It consists of detachment of a biofilm from its support material, resolution of the detached biofilm flocs in order to separate the enclosed cells and antigens, and quantification of specific biomass by an enzyme-linked immunosorbent assay. PMID:16535389

  10. Comparison of biochemical and microscopic methods for quantification of mycorrhizal fungi in soil and roots

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Arbuscular mycorrhizal fungi (AMF) are well-known plant symbionts which provide enhanced phosphorus uptake as well as other benefits to their host plants. Quantification of mycorrhizal biomass and root colonization has traditionally been performed by root staining and microscopic examination methods...

  11. A whole-cell electrochemical biosensing system based on bacterial inward electron flow for fumarate quantification.

    PubMed

    Si, Rong-Wei; Zhai, Dan-Dan; Liao, Zhi-Hong; Gao, Lu; Yong, Yang-Chun

    2015-06-15

    Fumarate is of great importance as it is an oncometabolite as well as a food spoilage indicator. However, a cost-effective and fast quantification method for fumarate is lacking, although urgently required. This work developed an electrochemical whole-cell biosensing system for fumarate quantification. A sensitive inward electric output (electron flow from the electrode into bacteria) in response to fumarate in Shewanella oneidensis MR-1 was characterized, and an electrochemical fumarate biosensing system was developed without genetic engineering. The biosensing system delivered a symmetric current peak immediately upon fumarate addition, and the peak area increased in proportion to the fumarate concentration over a wide range of 2 μM-10 mM (R²=0.9997). The limit of detection (LOD) and the limit of quantification (LOQ) are 0.83 μM and 1.2 μM, respectively. This biosensing system displayed remarkable specificity to fumarate against other possible interferences. It was also successfully applied to samples of apple juice and kidney tissue. This study adds a new dimension to electrochemical biosensor design and provides a simple, cost-effective, fast and robust tool for fumarate quantification.

  12. A SIMPLE METHOD FOR THE EXTRACTION AND QUANTIFICATION OF PHOTOPIGMENTS FROM SYMBIODINIUM SPP.

    EPA Science Inventory

    John E. Rogers and Dragoslav Marcovich. Submitted. Simple Method for the Extraction and Quantification of Photopigments from Symbiodinium spp.. Limnol. Oceanogr. Methods. 19 p. (ERL,GB 1192).

    We have developed a simple, mild extraction procedure using methanol which, when...

  13. The Qiagen Investigator® Quantiplex HYres as an alternative kit for DNA quantification.

    PubMed

    Frégeau, Chantal J; Laurin, Nancy

    2015-05-01

    The Investigator® Quantiplex HYres kit was evaluated as a potential replacement for dual DNA quantification of casework samples. This kit was determined to be highly sensitive, with a limit of quantification and limit of detection of 0.0049 ng/μL and 0.0003 ng/μL, respectively, for both human and male DNA, using full or half reaction volumes. It was also accurate in assessing the amount of male DNA present in 96 mock and actual casework male:female mixtures (various ratios) processed in this exercise. The close correlation between the male/human DNA ratios (expressed as percentages) derived from the Investigator® Quantiplex HYres quantification results and the male DNA proportion calculated in mixed AmpFlSTR® Profiler® Plus or AmpFlSTR® Identifiler® Plus profiles, using the Amelogenin Y peak and STR loci, allowed guidelines to be developed to facilitate decisions regarding when to submit samples to Y-STR rather than autosomal STR profiling. The internal control (IC) target was shown to be more sensitive to inhibitors than the human and male DNA targets included in the Investigator® Quantiplex HYres kit, serving as a good quality assessor of DNA extracts. The new kit met our criteria of enhanced sensitivity, accuracy, consistency, reliability and robustness for casework DNA quantification. PMID:25603128
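
    The decision guideline described, routing samples to Y-STR typing when male DNA is a minor fraction of the total human DNA, can be expressed as a simple rule; the 5% cut-off below is an invented placeholder, not the laboratory's validated threshold:

        def profiling_route(male_ng_per_ul: float, human_ng_per_ul: float,
                            y_str_cutoff: float = 0.05) -> str:
            """Suggest Y-STR typing when male DNA is a minor fraction of total human DNA."""
            if human_ng_per_ul == 0:
                return "no human DNA detected"
            male_fraction = male_ng_per_ul / human_ng_per_ul
            return "Y-STR profiling" if male_fraction < y_str_cutoff else "autosomal STR profiling"

        print(profiling_route(male_ng_per_ul=0.002, human_ng_per_ul=0.20))  # Y-STR profiling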

  14. A Clinical Method for the Detection and Quantification of Quick Respiratory Hyperkinesia

    ERIC Educational Resources Information Center

    Hixon, Thomas J.; Hoit, Jeannette D.

    2006-01-01

    Purpose: Quick respiratory hyperkinesia can be difficult to detect with the naked eye. A clinical method is described for the detection and quantification of quick respiratory hyperkinesia. Method: Flow at the airway opening is sensed during spontaneous apnea (rest), voluntary breath holding (postural fixation), and voluntary volume displacement…

  15. Quantification of infectious bronchitis coronavirus by titration in vitro and in ovo.

    PubMed

    Kint, Joeri; Maier, Helena Jane; Jagt, Erik

    2015-01-01

    Quantification of the number of infectious viruses in a sample is a basic virological technique. In this chapter we provide a detailed description of three techniques to estimate the number of viable infectious avian coronaviruses in a sample. All three techniques are serial dilution assays, better known as titrations.

  16. New approach for the quantification of processed animal proteins in feed using light microscopy.

    PubMed

    Veys, P; Baeten, V

    2010-07-01

    A revision of European Union's total feed ban on animal proteins in feed will need robust quantification methods, especially for control analyses, if tolerance levels are to be introduced, as for fishmeal in ruminant feed. In 2006, a study conducted by the Community Reference Laboratory for Animal Proteins in feedstuffs (CRL-AP) demonstrated the deficiency of the official quantification method based on light microscopy. The study concluded that the method had to be revised. This paper puts forward an improved quantification method based on three elements: (1) the preparation of permanent slides with an optical adhesive preserving all morphological markers of bones necessary for accurate identification and precision counting; (2) the use of a counting grid eyepiece reticle; and (3) new definitions for correction factors for the estimated portions of animal particles in the sediment. This revised quantification method was tested on feeds adulterated at different levels with bovine meat and bone meal (MBM) and fishmeal, and it proved to be effortless to apply. The results obtained were very close to the expected values of contamination levels for both types of adulteration (MBM or fishmeal). Calculated values were not only replicable, but also reproducible. The advantages of the new approach, including the benefits of the optical adhesive used for permanent slide mounting and the experimental conditions that need to be met to implement the new method correctly, are discussed. PMID:20432096

  17. Identification and absolute quantification of enzymes in laundry detergents by liquid chromatography tandem mass spectrometry.

    PubMed

    Gaubert, Alexandra; Jeudy, Jérémy; Rougemont, Blandine; Bordes, Claire; Lemoine, Jérôme; Casabianca, Hervé; Salvador, Arnaud

    2016-07-01

    In a stricter legislative context, greener detergent formulations are being developed. Synthetic surfactants are thus frequently replaced by bio-sourced surfactants and/or used at lower concentrations in combination with enzymes. In this paper, a LC-MS/MS method was developed for the identification and quantification of enzymes in laundry detergents. Prior to the LC-MS/MS analyses, a specific sample preparation protocol was developed due to matrix complexity (high surfactant percentages). Then, for each enzyme family mainly used in detergent formulations (protease, amylase, cellulase, and lipase), specific peptides were identified on a high-resolution platform. A LC-MS/MS method was then developed in selected reaction monitoring (SRM) MS mode for the light and corresponding heavy peptides. The method was linear over the peptide concentration ranges 25-1000 ng/mL for protease and lipase, 50-1000 ng/mL for amylase, and 5-1000 ng/mL for cellulase, in both water and laundry detergent matrices. The application of the developed analytical strategy to real commercial laundry detergents enabled enzyme identification and absolute quantification. For the first time, identification and absolute quantification of enzymes in laundry detergent were realized by LC-MS/MS in a single run.

  18. THE QUANTIFICATION OF AQUEOUS TRACERS IN LABORATORY AQUIFER MODELS USING A LIGHT TRANSMISSION VISUALIZATION METHOD - 2

    EPA Science Inventory

    The quantification of solute concentrations in laboratory aquifer models has been largely limited to the use of sampling ports, from which samples are collected for external analysis. One of the drawbacks to this method is that the act of sampling may disturb plume dynamics and ...

  19. THE QUANTIFICATION OF AQUEOUS TRACERS IN LABORATORY AQUIFER MODELS USING LIGHT TRANSMISSION VISUALIZATION METHOD

    EPA Science Inventory

    The quantification of solute concentrations in laboratory aquifer models has been largely limited to the use of sampling ports, from which samples are collected for external analysis. One of the drawbacks to this method is that the act of sampling may disturb plume dynamics and ...

  20. THE QUANTIFICATION OF AQUEOUS TRACERS IN LABORATORY AQUIFER MODELS USING A LIGHT TRANSMISSION VISUALIZATION METHOD - 3

    EPA Science Inventory

    The quantification of solute concentrations in laboratory aquifer models has been largely limited to the use of sampling ports, from which samples are collected for external analysis. One of the drawbacks to this method is that the act of sampling may disturb plume dynamics and ...

  1. THE QUANTIFICATION OF AQUEOUS TRACERS IN LABORATORY AQUIFER MODELS USING A LIGHT TRANSMISSION VISUALIZATION METHOD - 1

    EPA Science Inventory

    The quantification of solute concentrations in laboratory aquifer models has been largely limited to the use of sampling ports, from which samples are collected for external analysis. One of the drawbacks to this method is that the act of sampling may disturb plume dynamics and ...

  2. Species identification and quantification in meat and meat products using droplet digital PCR (ddPCR).

    PubMed

    Floren, C; Wiedemann, I; Brenig, B; Schütz, E; Beck, J

    2015-04-15

    Species fraud and product mislabelling in processed food, albeit not a direct health issue, often result in consumer distrust. Therefore methods for quantification of undeclared species are needed. Targeting mitochondrial DNA, e.g. the CYTB gene, for species quantification is unsuitable due to a fivefold inter-tissue variation in mtDNA content per cell, resulting in either an underestimation (-70%) or overestimation (+160%) of species DNA contents. Here, we describe a reliable two-step droplet digital PCR (ddPCR) assay targeting the nuclear F2 gene for precise quantification of cattle, horse, and pig in processed meat products. The ddPCR assay is advantageous over qPCR, showing limits of quantification (LOQ) and detection (LOD) in different meat products of 0.01% and 0.001%, respectively. The specificity was verified in 14 different species. Hence, determining F2 in food by ddPCR can be recommended for quality assurance and control in production systems.
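
    Droplet digital PCR turns the fraction of positive partitions into an absolute copy number through Poisson statistics; a minimal sketch (the ~0.85 nL droplet volume is a typical value assumed here, not taken from the paper):

        import math

        def ddpcr_copies_per_ul(positive: int, total: int, droplet_nl: float = 0.85) -> float:
            """Mean copies per droplet from the Poisson zero term, scaled to copies/μL."""
            p = positive / total
            lam = -math.log(1.0 - p)          # copies per droplet
            return lam / (droplet_nl * 1e-3)  # droplet volume converted to μL

        cattle     = ddpcr_copies_per_ul(positive=1500, total=18000)
        total_meat = ddpcr_copies_per_ul(positive=9000, total=18000)
        print(f"cattle fraction ~ {100 * cattle / total_meat:.1f}%")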

  3. Quantification of plasma exosome is a potential prognostic marker for esophageal squamous cell carcinoma

    PubMed Central

    Matsumoto, Yasunori; Kano, Masayuki; Akutsu, Yasunori; Hanari, Naoyuki; Hoshino, Isamu; Murakami, Kentaro; Usui, Akihiro; Suito, Hiroshi; Takahashi, Masahiko; Otsuka, Ryota; Xin, Hu; Komatsu, Aki; Iida, Keiko; Matsubara, Hisahiro

    2016-01-01

    Exosomes play important roles in cancer progression. Although exosome contents (e.g., proteins and microRNAs) have been a focus of cancer research, particularly as potential diagnostic markers, exosome behavior and methods for exosome quantification remain unclear. In the present study, we analyzed tumor-derived exosome behavior and assessed the quantification of exosomes in patient plasma as a biomarker for esophageal squamous cell carcinoma (ESCC). A CD63-GFP-expressing human ESCC cell line (TE2-CD63-GFP) was made by transfection, and mouse subcutaneous tumor models were established. Fluorescence imaging was performed on tumors and plasma exosomes harvested from mice. GFP-positive small vesicles were confirmed in the plasma obtained from TE2-CD63-GFP tumor-bearing mice. Patient plasma was collected in Chiba University Hospital (n=86). Exosomes were extracted from 100 µl of the plasma and quantified by acetylcholinesterase (AChE) activity. The relationship between the exosome quantification and the patients' clinical characteristics was assessed. The quantification of exosomes isolated from the patient plasma revealed that esophageal cancer patients (n=66) expressed higher exosome levels than non-malignant patients (n=20) (P=0.0002). Although there was no correlation between tumor progression and exosome levels, exosome number was an independent prognostic marker, and low exosome levels predicted a poor prognosis (P=0.03). In conclusion, exosome levels may be useful as an independent prognostic factor for ESCC patients. PMID:27599779

  4. AVQS: attack route-based vulnerability quantification scheme for smart grid.

    PubMed

    Ko, Jongbin; Lim, Hyunwoo; Lee, Seokjun; Shon, Taeshik

    2014-01-01

    A smart grid is a large, consolidated electrical grid system that includes heterogeneous networks and systems. Based on the data, a smart grid system has a potential security threat in its network connectivity. To solve this problem, we develop and apply a novel scheme to measure the vulnerability in a smart grid domain. Vulnerability quantification can be the first step in security analysis because it can help prioritize the security problems. However, existing vulnerability quantification schemes are not suitable for smart grid because they do not consider network vulnerabilities. We propose a novel attack route-based vulnerability quantification scheme using a network vulnerability score and an end-to-end security score, depending on the specific smart grid network environment to calculate the vulnerability score for a particular attack route. To evaluate the proposed approach, we derive several attack scenarios from the advanced metering infrastructure domain. The experimental results of the proposed approach and the existing common vulnerability scoring system clearly show that we need to consider network connectivity for more optimized vulnerability quantification.

  5. Carbon Nanotubes Released from an Epoxy-Based Nanocomposite: Quantification and Particle Toxicity.

    PubMed

    Schlagenhauf, Lukas; Buerki-Thurnherr, Tina; Kuo, Yu-Ying; Wichser, Adrian; Nüesch, Frank; Wick, Peter; Wang, Jing

    2015-09-01

    Studies combining both the quantification of free nanoparticle release and the toxicological investigations of the released particles from actual nanoproducts in a real-life exposure scenario are urgently needed, yet very rare. Here, a new measurement method was established to quantify the amount of free-standing and protruding multiwalled carbon nanotubes (MWCNTs) in the respirable fraction of particles abraded from a MWCNT-epoxy nanocomposite. The quantification approach involves the prelabeling of MWCNTs with lead ions, nanocomposite production, abrasion and collection of the inhalable particle fraction, and quantification of free-standing and protruding MWCNTs by measuring the concentration of released lead ions. In vitro toxicity studies for genotoxicity, reactive oxygen species formation, and cell viability were performed using A549 human alveolar epithelial cells and THP-1 monocyte-derived macrophages. The quantification experiment revealed that in the respirable fraction of the abraded particles, approximately 4000 ppm of the MWCNTs were released as exposed MWCNTs (which could contact lung cells upon inhalation) and approximately 40 ppm as free-standing MWCNTs in the worst-case scenario. The release of exposed MWCNTs was lower for nanocomposites containing agglomerated MWCNTs. The toxicity tests revealed that the abraded particles did not induce any acute cytotoxic effects.

  7. En route to traceable reference standards for surface group quantifications by XPS, NMR and fluorescence spectroscopy.

    PubMed

    Hennig, Andreas; Dietrich, Paul M; Hemmann, Felix; Thiele, Thomas; Borcherding, Heike; Hoffmann, Angelika; Schedler, Uwe; Jäger, Christian; Resch-Genger, Ute; Unger, Wolfgang E S

    2015-03-21

    The fluorine content of polymer particles labelled with 2,2,2-trifluoroethylamine was reliably quantified with overlapping sensitivity ranges by XPS and solid-state NMR. This provides a first step towards reference materials for the metrological traceability of surface group quantifications. The extension of this concept to fluorescence spectroscopy is illustrated.

  8. Transcriptome assembly and quantification from Ion Torrent RNA-Seq data

    PubMed Central

    2014-01-01

    Background High-throughput RNA sequencing (RNA-Seq) can generate whole transcriptome information at the single transcript level, providing a powerful tool with multiple interrelated applications, including transcriptome reconstruction and quantification. The sequences of novel transcripts can be reconstructed from deep RNA-Seq data, but this is computationally challenging due to sequencing errors, uneven coverage of expressed transcripts, and the need to distinguish between highly similar transcripts produced by alternative splicing. Another challenge in transcriptomic analysis comes from the ambiguities in mapping reads to transcripts. Results We present MaLTA, a method for simultaneous transcriptome assembly and quantification from Ion Torrent RNA-Seq data. Our approach explores transcriptome structure and incorporates a maximum likelihood model into the assembly and quantification procedure. A new version of the IsoEM algorithm suitable for Ion Torrent RNA-Seq reads is used to accurately estimate transcript expression levels. The MaLTA-IsoEM tool is publicly available at: http://alan.cs.gsu.edu/NGS/?q=malta Conclusions Experimental results on both synthetic and real datasets show that Ion Torrent RNA-Seq data can be successfully used for transcriptome analyses. Experimental results suggest increased transcriptome assembly and quantification accuracy of the MaLTA-IsoEM solution compared to existing state-of-the-art approaches. PMID:25082147
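    As a minimal illustration of the expectation-maximization idea behind IsoEM-style quantification (a sketch only, not the authors' implementation; transcript-length normalization and base-quality weighting are omitted), consider reads that align ambiguously to several transcripts:

    ```python
    import numpy as np

    # Minimal EM sketch for isoform abundance estimation from ambiguous read
    # alignments. compat[r, t] = 1 if read r aligns to transcript t, else 0.

    def em_abundance(compat: np.ndarray, n_iter: int = 200) -> np.ndarray:
        n_reads, n_tx = compat.shape
        theta = np.full(n_tx, 1.0 / n_tx)      # initial uniform abundances
        for _ in range(n_iter):
            # E-step: fractionally assign each read among compatible transcripts
            w = compat * theta                  # unnormalized responsibilities
            w /= w.sum(axis=1, keepdims=True)
            # M-step: abundances proportional to expected read counts
            theta = w.sum(axis=0) / n_reads
        return theta

    compat = np.array([[1, 1, 0],   # read 0 maps to transcripts 0 and 1
                       [1, 0, 0],
                       [0, 1, 1],
                       [0, 0, 1]], dtype=float)
    print(em_abundance(compat))
    ```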

  9. Quantification of taurine in energy drinks using ¹H NMR.

    PubMed

    Hohmann, Monika; Felbinger, Christine; Christoph, Norbert; Wachter, Helmut; Wiest, Johannes; Holzgrabe, Ulrike

    2014-05-01

    The consumption of so-called energy drinks is increasing, especially among adolescents. These beverages commonly contain considerable amounts of the amino sulfonic acid taurine, which is associated with a variety of physiological effects. The customary method to control the legal limit of taurine in energy drinks is LC-UV/vis with postcolumn derivatization using ninhydrin. In this paper we describe the quantification of taurine in energy drinks by (1)H NMR as an alternative to existing methods of quantification. Variation of pH values revealed the separation of a distinct taurine signal in (1)H NMR spectra, which was used for integration and quantification. Quantification was performed using external calibration (R(2)>0.9999; linearity verified by Mandel's fitting test with a 95% confidence level) and PULCON. Taurine concentrations in 20 different energy drinks were analyzed using both (1)H NMR and LC-UV/vis. The deviation between (1)H NMR and LC-UV/vis results was always below the expanded measurement uncertainty of 12.2% for the LC-UV/vis method (95% confidence level) and at worst 10.4%. Given the high agreement with LC-UV/vis data and adequate recovery rates (ranging between 97.1% and 108.2%), (1)H NMR measurement is a suitable method to quantify taurine in energy drinks.
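    The external-calibration step reduces to fitting and inverting a line; a sketch with invented standard concentrations and signal integrals (the real assay additionally verifies linearity with Mandel's fitting test):

    ```python
    import numpy as np

    # Sketch of external-calibration quantification: fit a linear model to the
    # integrals of taurine standards, then invert it for an unknown sample.
    # Concentrations and integrals below are made-up illustrative numbers.

    std_conc = np.array([0.5, 1.0, 2.0, 4.0])          # g/L taurine standards
    std_integral = np.array([0.98, 2.01, 4.03, 7.99])  # normalized signal areas

    slope, intercept = np.polyfit(std_conc, std_integral, 1)
    r2 = np.corrcoef(std_conc, std_integral)[0, 1] ** 2

    sample_integral = 5.1
    sample_conc = (sample_integral - intercept) / slope
    print(f"R^2 = {r2:.5f}, taurine = {sample_conc:.2f} g/L")
    ```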

  10. Detection and Quantification of Human Fecal Pollution with Real-Time PCR

    EPA Science Inventory

    ABSTRACT Assessment of health risk and fecal bacteria loads associated with human fecal pollution requires a reliable host-specific genetic marker and a rapid quantification method. We report the development of quantitative PCR assays for enumeration of two recently described ...

  11. DETECTION AND QUANTIFICATION OF COW FECAL POLLUTION WITH REAL-TIME PCR

    EPA Science Inventory

    Assessment of health risk and fecal bacteria loads associated with cow fecal pollution requires a reliable host-specific genetic marker and a rapid quantification method. We report the development of quantitative PCR assays for enumeration of two recently described cow-specific g...

  12. Quantification of fungicides in snow-melt runoff from turf: A comparison of four extraction methods

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A variety of pesticides are used to control diverse stressors to turf. These pesticides have a wide range in physical and chemical properties. The objective of this project was to develop an extraction and analysis method for quantification of chlorothalonil and PCNB (pentachloronitrobenzene), two p...

  13. Towards Quantification of Functional Breast Images Using Dedicated SPECT With Non-Traditional Acquisition Trajectories

    PubMed Central

    Perez, Kristy L.; Cutler, Spencer J.; Madhav, Priti; Tornai, Martin P.

    2012-01-01

    Quantification of radiotracer uptake in breast lesions can provide valuable information to physicians in deciding patient care or determining treatment efficacy. Physical processes (e.g., scatter, attenuation), detector/collimator characteristics, sampling and acquisition trajectories, and reconstruction artifacts contribute to an incorrect measurement of absolute tracer activity and distribution. For these experiments, a cylinder with three syringes of varying radioactivity concentration, and a fillable 800 mL breast with two lesion phantoms containing aqueous 99mTc pertechnetate were imaged using the SPECT sub-system of the dual-modality SPECT-CT dedicated breast scanner. SPECT images were collected using a compact CZT camera with various 3D acquisitions including vertical axis of rotation, 30° tilted, and complex sinusoidal trajectories. Different energy windows around the photopeak were quantitatively compared, along with appropriate scatter energy windows, to determine the best quantification accuracy after attenuation and dual-window scatter correction. Measured activity concentrations in the reconstructed images for syringes with greater than 10 µCi/mL corresponded to within 10% of the actual dose calibrator measured activity concentration for ±4% and ±8% photopeak energy windows. The same energy windows yielded lesion quantification results within 10% in the breast phantom as well. Results for the more complete complex sinusoidal trajectory are similar to those for the simple vertical-axis acquisition, while additionally allowing anterior chest wall sampling, avoiding image distortion, and retaining reasonably accurate quantification. PMID:22262925
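    The dual-window scatter correction mentioned above has a simple arithmetic core. The sketch below shows the generic dual-energy-window estimate; the scaling factor k is geometry- and system-dependent, and k = 0.5 (the classic literature value) is used here purely for illustration:

    ```python
    # Generic dual-energy-window (DEW) scatter correction: counts in a lower
    # "scatter" window estimate the scatter contamination inside the photopeak
    # window. k = 0.5 is the classic Jaszczak value, an illustrative choice only.

    def dew_corrected_counts(photopeak_counts, scatter_window_counts, k=0.5):
        """Primary (scatter-corrected) counts for a pixel or voxel."""
        primary = photopeak_counts - k * scatter_window_counts
        return max(primary, 0.0)   # clip negative estimates to zero

    print(dew_corrected_counts(1200.0, 400.0))  # -> 1000.0
    ```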

  14. Protein Quantification by Elemental Mass Spectrometry: An Experiment for Graduate Students

    ERIC Educational Resources Information Center

    Schwarz, Gunnar; Ickert, Stefanie; Wegner, Nina; Nehring, Andreas; Beck, Sebastian; Tiemann, Ruediger; Linscheid, Michael W.

    2014-01-01

    A multiday laboratory experiment was designed to integrate inductively coupled plasma-mass spectrometry (ICP-MS) in the context of protein quantification into an advanced practical course in analytical and environmental chemistry. Graduate students were familiar with the analytical methods employed, whereas the combination of bioanalytical assays…

  15. Complex Quantification in Structured Query Language (SQL): A Tutorial Using Relational Calculus

    ERIC Educational Resources Information Center

    Kawash, Jalal

    2004-01-01

    The Structured Query Language (SQL) forms a substantial component of introductory database courses and is supported by almost every commercial database product. One disadvantage of SQL is that it does not provide a universal quantification construct. Queries that have twisted universal and existential quantifiers can be stunning for students,…

  16. Towards Quantification of Functional Breast Images Using Dedicated SPECT With Non-Traditional Acquisition Trajectories.

    PubMed

    Perez, Kristy L; Cutler, Spencer J; Madhav, Priti; Tornai, Martin P

    2011-10-01

    Quantification of radiotracer uptake in breast lesions can provide valuable information to physicians in deciding patient care or determining treatment efficacy. Physical processes (e.g., scatter, attenuation), detector/collimator characteristics, sampling and acquisition trajectories, and reconstruction artifacts contribute to an incorrect measurement of absolute tracer activity and distribution. For these experiments, a cylinder with three syringes of varying radioactivity concentration, and a fillable 800 mL breast with two lesion phantoms containing aqueous (99m)Tc pertechnetate were imaged using the SPECT sub-system of the dual-modality SPECT-CT dedicated breast scanner. SPECT images were collected using a compact CZT camera with various 3D acquisitions including vertical axis of rotation, 30° tilted, and complex sinusoidal trajectories. Different energy windows around the photopeak were quantitatively compared, along with appropriate scatter energy windows, to determine the best quantification accuracy after attenuation and dual-window scatter correction. Measured activity concentrations in the reconstructed images for syringes with greater than 10 µCi/mL corresponded to within 10% of the actual dose calibrator measured activity concentration for ±4% and ±8% photopeak energy windows. The same energy windows yielded lesion quantification results within 10% in the breast phantom as well. Results for the more complete complex sinusoidal trajectory are similar to those for the simple vertical-axis acquisition, while additionally allowing anterior chest wall sampling, avoiding image distortion, and retaining reasonably accurate quantification.

  17. Immobilized Metal Affinity Chromatography Coupled to Multiple Reaction Monitoring Enables Reproducible Quantification of Phospho-signaling.

    PubMed

    Kennedy, Jacob J; Yan, Ping; Zhao, Lei; Ivey, Richard G; Voytovich, Uliana J; Moore, Heather D; Lin, Chenwei; Pogosova-Agadjanyan, Era L; Stirewalt, Derek L; Reding, Kerryn W; Whiteaker, Jeffrey R; Paulovich, Amanda G

    2016-02-01

    A major goal in cell signaling research is the quantification of phosphorylation pharmacodynamics following perturbations. Traditional methods of studying cellular phospho-signaling measure one analyte at a time with poor standardization, rendering them inadequate for interrogating network biology and contributing to the irreproducibility of preclinical research. In this study, we test the feasibility of circumventing these issues by coupling immobilized metal affinity chromatography (IMAC)-based enrichment of phosphopeptides with targeted, multiple reaction monitoring (MRM) mass spectrometry to achieve precise, specific, standardized, multiplex quantification of phospho-signaling responses. A multiplex IMAC-MRM assay targeting phospho-analytes responsive to DNA damage was configured, analytically characterized, and deployed to generate phospho-pharmacodynamic curves from primary and immortalized human cells experiencing genotoxic stress. The multiplexed assays demonstrated linear ranges of ≥3 orders of magnitude, a median lower limit of quantification of 0.64 fmol on column, median intra-assay variability of 9.3%, median inter-assay variability of 12.7%, and median total CV of 16.0%. The multiplex IMAC-MRM assay enabled robust quantification of 107 DNA damage-responsive phosphosites from human cells following DNA damage. The assays have been made publicly available as a resource to the community. The approach is generally applicable, enabling wide interrogation of signaling networks. PMID:26621847

  18. Semi-automated quantification of axonal densities in labeled CNS tissue.

    PubMed

    Grider, Michael H; Chen, Qin; Shine, H David

    2006-09-15

    Current techniques used to quantify axons often rely upon manual quantification or potentially expensive commercial programs for automated quantification. We describe a computerized method for the detection and quantification of axons in the rat CNS using readily available free software. FeatureJ, a Java-based plug-in for the imaging software NIH ImageJ, faithfully detects linear structures such as axons in confocal or bright-field images using a Hessian-based algorithm. We validated the method by comparing values obtained by manual and automated analyses of axons induced to grow in response to neurotrophin over-expression in the rat spinal cord. We also demonstrated that the program can be used to quantify neurotrophin-induced growth of lesioned serotonergic axons in the rat cortex, where manual measurement would be impractical due to dense axonal growth. The use of this software suite provided faster and less biased quantification of labeled axons in comparison to manual measurements, at no cost.
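    For readers who want to prototype the same idea outside ImageJ, the sketch below shows an analogous Hessian-based ridge detection in Python with scikit-image. It illustrates the principle (bright, line-like structures have one strongly negative Hessian eigenvalue) rather than reproducing FeatureJ; the sigma and threshold values are arbitrary choices:

    ```python
    import numpy as np
    from skimage.feature import hessian_matrix, hessian_matrix_eigvals

    # Hessian-based line detection: along a bright ridge, the eigenvalue of
    # the Hessian taken across the ridge is strongly negative.

    def axon_mask(image: np.ndarray, sigma: float = 2.0, thresh: float = 0.02):
        H = hessian_matrix(image, sigma=sigma)
        eigs = hessian_matrix_eigvals(H)    # sorted decreasing per pixel
        ridge_strength = -eigs[1]           # large where bright ridges run
        return ridge_strength > thresh

    rng = np.random.default_rng(0)
    img = rng.random((64, 64))
    img[30:32, :] += 2.0                    # synthetic bright "axon"
    mask = axon_mask(img)
    print("axon pixel density:", mask.mean())
    ```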

  19. Mass Spectrometric Quantification of N-Linked Glycans by Reference to Exogenous Standards.

    PubMed

    Mehta, Nickita; Porterfield, Mindy; Struwe, Weston B; Heiss, Christian; Azadi, Parastoo; Rudd, Pauline M; Tiemeyer, Michael; Aoki, Kazuhiro

    2016-09-01

    Environmental and metabolic processes shape the profile of glycoprotein glycans expressed by cells, whether in culture, developing tissues, or mature organisms. Quantitative characterization of glycomic changes associated with these conditions has been achieved historically by reductive coupling of oligosaccharides to various fluorophores following release from glycoprotein and subsequent HPLC or capillary electrophoretic separation. Such labeling-based approaches provide a robust means of quantifying glycan amount based on fluorescence yield. Mass spectrometry, on the other hand, has generally been limited to relative quantification in which the contribution of the signal intensity for an individual glycan is expressed as a percent of the signal intensity summed over the total profile. Relative quantification has been valuable for highlighting changes in glycan expression between samples; sensitivity is high, and structural information can be derived by fragmentation. We have investigated whether MS-based glycomics is amenable to absolute quantification by referencing signal intensities to well-characterized oligosaccharide standards. We report the qualification of a set of N-linked oligosaccharide standards by NMR, HPLC, and MS. We also demonstrate the dynamic range, sensitivity, and recovery from complex biological matrices for these standards in their permethylated form. Our results indicate that absolute quantification for MS-based glycomic analysis is reproducible and robust utilizing currently available glycan standards. PMID:27432553
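    Referencing to an exogenous standard reduces to ratio arithmetic once the standard is co-analyzed; a minimal sketch, assuming comparable response factors for the permethylated glycans and using invented intensities:

    ```python
    # Sketch of absolute quantification against a co-analyzed, well-characterized
    # standard: the analyte amount scales with the intensity ratio. Assumes
    # comparable MS response factors; all values are illustrative.

    def absolute_amount(analyte_intensity, standard_intensity, standard_pmol):
        """Analyte amount (pmol) by reference to the spiked standard."""
        return standard_pmol * analyte_intensity / standard_intensity

    print(absolute_amount(3.2e6, 1.6e6, standard_pmol=10.0))  # -> 20.0 pmol
    ```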

  20. A tool for pattern information extraction and defect quantification from crystal structures

    NASA Astrophysics Data System (ADS)

    Okuyan, Erhan; Okuyan, Erkan

    2015-02-01

    In this paper, we present a revised version of the BilKristal 2.0 tool. We added defect quantification functionality to assess crystalline defects and improved visualization capabilities by adding transparency support and runtime visibility sorting. We also fixed discovered bugs and made small performance optimizations.

  1. How to Improve Your Impact Factor: Questioning the Quantification of Academic Quality

    ERIC Educational Resources Information Center

    Smeyers, Paul; Burbules, Nicholas C.

    2011-01-01

    A broad-scale quantification of the measure of quality for scholarship is under way. This trend has fundamental implications for the future of academic publishing and employment. In this essay we want to raise questions about these burgeoning practices, particularly how they affect philosophy of education and similar sub-disciplines. First,…

  2. Quantification of event-related desynchronization/synchronization at low frequencies in a semantic memory task.

    PubMed

    Gómez, Juan; Aguilar, Mónica; Horna, Eduardo; Minguez, Javier

    2012-01-01

    Although several techniques have been developed for the visualization of EEG event-related desynchronization/synchronization (ERD/ERS) in both time and frequency domains, none of the quantification methods exploits time and frequency resolution simultaneously. Existing techniques for the quantification of ERD/ERS changes compute the average EEG power increase/decrease relative to a certain reference value, over fixed time intervals and/or frequency bands (either fixed or individualized). Inaccuracy in the computation of these frequency bands (where the process is actually measured), in combination with the averaging process over time, may lead to errors in the computation of any ERD/ERS quantification parameter. In this paper, we present a novel method for the automatic, individual and exact quantification of the most significant ERD/ERS region within a given window of the time-frequency domain. The method is exemplified by quantifying the ERS at low frequencies in 10 subjects performing a semantic memory task, and compared with existing techniques. PMID:23366438
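    All such methods build on the classic band-power ratio (Pfurtscheller's ERD/ERS definition); the sketch below computes it for a toy baseline/event pair, without attempting the paper's automatic localization of the most significant time-frequency region:

    ```python
    import numpy as np

    # Classic ERD/ERS definition: percentage band-power change relative to a
    # reference interval. Negative = desynchronization (ERD), positive =
    # synchronization (ERS). Numbers below are toy values.

    def erd_ers_percent(power_event, power_reference):
        A = np.mean(power_event)        # mean band power in the event window
        R = np.mean(power_reference)    # mean band power in the baseline
        return 100.0 * (A - R) / R

    baseline = np.array([4.0, 4.2, 3.9, 4.1])   # alpha-band power, baseline
    event = np.array([2.1, 2.0, 2.3, 2.2])      # alpha-band power, task window
    print(f"{erd_ers_percent(event, baseline):+.1f} %")   # ERD => negative
    ```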

  3. Sensitive Targeted Quantification of ERK Phosphorylation Dynamics and Stoichiometry in Human Cells without Affinity Enrichment

    SciTech Connect

    Shi, Tujin; Gao, Yuqian; Gaffrey, Matthew J.; Nicora, Carrie D.; Fillmore, Thomas L.; Chrisler, William B.; Gritsenko, Marina A.; Wu, Chaochao; He, Jintang; Bloodsworth, Kent J.; Zhao, Rui; Camp II, David G.; Liu, Tao; Rodland, Karin D.; Smith, Richard D.; Wiley, H. Steven; Qian, Weijun

    2014-12-17

    Mass spectrometry-based targeted quantification is a promising technology for site-specific quantification of posttranslational modifications (PTMs). However, a major constraint of most targeted MS approaches is the limited sensitivity for quantifying low-abundance PTMs, requiring the use of affinity reagents to enrich specific PTMs. Herein, we demonstrate the direct site-specific quantification of ERK phosphorylation isoforms (pT, pY, pTpY) and their relative stoichiometries using a highly sensitive targeted MS approach termed high-pressure, high-resolution separations with intelligent selection and multiplexing (PRISM). PRISM provides effective enrichment of target peptides within a given fraction from a complex biological matrix with minimal sample losses, followed by selected reaction monitoring (SRM) quantification. The PRISM-SRM approach enabled direct quantification of ERK phosphorylation in human mammary epithelial cells (HMEC) from as little as 25 µg of tryptic peptides from whole cell lysates. Compared to immobilized metal-ion affinity chromatography, PRISM provided a >10-fold improvement in signal intensities, presumably due to the better peptide recovery of PRISM when handling small samples. This approach was applied to quantify ERK phosphorylation dynamics in HMEC treated with different doses of EGF at both peak activation (10 min) and steady state (2 h). At 10 min, maximal ERK activation was observed at the 0.3 ng/mL dose, whereas the maximal steady-state level of ERK activation at 2 h was at the 3 ng/mL dose, corresponding to 1200 and 9000 occupied receptors, respectively. At 10 min, the maximally activated pTpY isoform represented ~40% of total ERK, falling to less than 10% at 2 h. The time course and dose-response profiles of the individual phosphorylated ERK isoforms indicated that the singly phosphorylated pT-ERK never increases significantly, while the increase of pY-ERK parallels that of pTpY-ERK. These data support a processive, rather than distributive, mechanism of ERK phosphorylation.
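    Once each isoform has been quantified against its heavy-labeled internal standard, the stoichiometry calculation is straightforward; a sketch with invented fmol values:

    ```python
    # Sketch of the stoichiometry arithmetic implied by the abstract: each ERK
    # phospho-isoform is quantified by SRM, and its occupancy is its share of
    # total ERK. All amounts below are invented for illustration.

    isoform_fmol = {"unmod": 55.0, "pT": 3.0, "pY": 12.0, "pTpY": 30.0}

    total = sum(isoform_fmol.values())
    for name, amount in isoform_fmol.items():
        print(f"{name:>5}: {100.0 * amount / total:5.1f} % of total ERK")
    ```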

  4. Sensitive Targeted Quantification of ERK Phosphorylation Dynamics and Stoichiometry in Human Cells without Affinity Enrichment

    DOE PAGES

    Shi, Tujin; Gao, Yuqian; Gaffrey, Matthew J.; Nicora, Carrie D.; Fillmore, Thomas L.; Chrisler, William B.; Gritsenko, Marina A.; Wu, Chaochao; He, Jintang; Bloodsworth, Kent J.; et al

    2014-12-17

    Mass spectrometry-based targeted quantification is a promising technology for site-specific quantification of posttranslational modifications (PTMs). However, a major constraint of most targeted MS approaches is the limited sensitivity for quantifying low-abundance PTMs, requiring the use of affinity reagents to enrich specific PTMs. Herein, we demonstrate the direct site-specific quantification of ERK phosphorylation isoforms (pT, pY, pTpY) and their relative stoichiometries using a highly sensitive targeted MS approach termed high-pressure, high-resolution separations with intelligent selection and multiplexing (PRISM). PRISM provides effective enrichment of target peptides within a given fraction from a complex biological matrix with minimal sample losses, followed by selected reaction monitoring (SRM) quantification. The PRISM-SRM approach enabled direct quantification of ERK phosphorylation in human mammary epithelial cells (HMEC) from as little as 25 µg of tryptic peptides from whole cell lysates. Compared to immobilized metal-ion affinity chromatography, PRISM provided a >10-fold improvement in signal intensities, presumably due to the better peptide recovery of PRISM when handling small samples. This approach was applied to quantify ERK phosphorylation dynamics in HMEC treated with different doses of EGF at both peak activation (10 min) and steady state (2 h). At 10 min, maximal ERK activation was observed at the 0.3 ng/mL dose, whereas the maximal steady-state level of ERK activation at 2 h was at the 3 ng/mL dose, corresponding to 1200 and 9000 occupied receptors, respectively. At 10 min, the maximally activated pTpY isoform represented ~40% of total ERK, falling to less than 10% at 2 h. The time course and dose-response profiles of the individual phosphorylated ERK isoforms indicated that the singly phosphorylated pT-ERK never increases significantly, while the increase of pY-ERK parallels that of pTpY-ERK. These data support a processive, rather than distributive, mechanism of ERK phosphorylation.

  5. Quantification Of Erosion Rates Of Agriculturally Used Soils By Artificial

    NASA Astrophysics Data System (ADS)

    Jha, Abhinand

    2010-05-01

    [Figure 4: Profiles of sediment calculated for different erosion rates by Cs-137 within the ploughed soil.] Conclusions and outlook: Erosion rates for agricultural soils in the Young Moraine regions of North-East Germany were determined using two radionuclides, 137Cs and 7Be. In combination, the two radionuclides provide a valuable means of investigating soil erosion and assessing erosion risk in the study area. Potentials and limitations of the erosion measurement techniques using radiotracers are discussed in this study. The models used to quantify erosion rates from 137Cs and 7Be were studied. Erosion rates calculated by these models are difficult to validate over a period of 50 years. A validation of these erosion rates for the 50-year time period used in the 137Cs-based models will give a new perspective on the use of soil erosion modeling. Many regions in India suffer from high erosion rates [7]. By applying these new erosion quantification techniques, land management practices can be improved and the erosion risk in India can be reduced.
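    For orientation, one standard 137Cs conversion approach (the profile distribution model for uncultivated soil, after Walling and He) can be sketched as follows; cultivated, ploughed soils require mass-balance models instead, and the relaxation depth h0 and inventories below are illustrative assumptions:

    ```python
    import math

    # Profile distribution model sketch: if the 137Cs concentration decays
    # exponentially with depth (relaxation depth h0), the fraction of the
    # reference inventory remaining after losing soil depth d is exp(-d/h0),
    # so d = h0 * ln(A_ref / A_site). Illustrative values only.

    def erosion_depth_mm(A_site, A_ref, h0_mm=40.0):
        """Cumulative soil-loss depth (mm) implied by the inventory ratio."""
        return h0_mm * math.log(A_ref / A_site)

    depth = erosion_depth_mm(A_site=1800.0, A_ref=2500.0)   # inventories, Bq/m^2
    print(f"{depth:.1f} mm lost, ~{depth / 50.0:.2f} mm/yr over 50 years")
    ```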

  6. Self-Digitization Microfluidic Chip for Absolute Quantification of mRNA in Single Cells

    PubMed Central

    2015-01-01

    Quantification of mRNA in single cells provides direct insight into how intercellular heterogeneity plays a role in disease progression and outcomes. Quantitative polymerase chain reaction (qPCR), the current gold standard for evaluating gene expression, is insufficient for providing absolute measurement of single-cell mRNA transcript abundance. Challenges include difficulties in handling small sample volumes and the high variability in measurements. Microfluidic digital PCR provides far better sensitivity for minute quantities of genetic material, but the typical format of this assay does not allow counting of the absolute number of mRNA transcripts in samples taken from single cells. Furthermore, a large fraction of the sample is often lost during sample handling in microfluidic digital PCR. Here, we report the absolute quantification of single-cell mRNA transcripts by digital, one-step reverse transcription PCR in a simple microfluidic array device called the self-digitization (SD) chip. By performing the reverse transcription step in digitized volumes, we find that the assay exhibits a linear signal across a wide range of total RNA concentrations and agrees well with standard curve qPCR. The SD chip is found to digitize a high percentage (86.7%) of the sample for single-cell experiments. Moreover, quantification of transferrin receptor mRNA in single cells agrees well with single-molecule fluorescence in situ hybridization experiments. The SD platform for absolute quantification of single-cell mRNA can be optimized for other genes and may be useful as an independent control method for the validation of mRNA quantification techniques. PMID:25390242

  7. Self-digitization microfluidic chip for absolute quantification of mRNA in single cells.

    PubMed

    Thompson, Alison M; Gansen, Alexander; Paguirigan, Amy L; Kreutz, Jason E; Radich, Jerald P; Chiu, Daniel T

    2014-12-16

    Quantification of mRNA in single cells provides direct insight into how intercellular heterogeneity plays a role in disease progression and outcomes. Quantitative polymerase chain reaction (qPCR), the current gold standard for evaluating gene expression, is insufficient for providing absolute measurement of single-cell mRNA transcript abundance. Challenges include difficulties in handling small sample volumes and the high variability in measurements. Microfluidic digital PCR provides far better sensitivity for minute quantities of genetic material, but the typical format of this assay does not allow counting of the absolute number of mRNA transcripts in samples taken from single cells. Furthermore, a large fraction of the sample is often lost during sample handling in microfluidic digital PCR. Here, we report the absolute quantification of single-cell mRNA transcripts by digital, one-step reverse transcription PCR in a simple microfluidic array device called the self-digitization (SD) chip. By performing the reverse transcription step in digitized volumes, we find that the assay exhibits a linear signal across a wide range of total RNA concentrations and agrees well with standard curve qPCR. The SD chip is found to digitize a high percentage (86.7%) of the sample for single-cell experiments. Moreover, quantification of transferrin receptor mRNA in single cells agrees well with single-molecule fluorescence in situ hybridization experiments. The SD platform for absolute quantification of single-cell mRNA can be optimized for other genes and may be useful as an independent control method for the validation of mRNA quantification techniques.

  8. Uncertainty quantification metrics for whole product life cycle cost estimates in aerospace innovation

    NASA Astrophysics Data System (ADS)

    Schwabe, O.; Shehab, E.; Erkoyuncu, J.

    2015-08-01

    The lack of defensible methods for quantifying cost estimate uncertainty over the whole product life cycle of aerospace innovations such as propulsion systems or airframes poses a significant challenge to the creation of accurate and defensible cost estimates. Based on the axiomatic definition of uncertainty as the actual prediction error of the cost estimate, this paper provides a comprehensive overview of metrics used for the uncertainty quantification of cost estimates, based on a literature review, an evaluation of publicly funded projects such as those in the CORDIS or Horizon 2020 programs, and an analysis of established approaches used by organizations such as NASA, the U.S. Department of Defense, the ESA, and various commercial companies. The metrics are categorized based on their foundational character (foundations), their use in practice (state-of-practice), their availability for practice (state-of-art) and those suggested for future exploration (state-of-future). Insights gained were that a variety of uncertainty quantification metrics exist whose suitability depends on the volatility of available relevant information, as defined by technical and cost readiness level, and the number of whole product life cycle phases the estimate is intended to be valid for. Information volatility and number of whole product life cycle phases can hereby be considered as defining multi-dimensional probability fields admitting various uncertainty quantification metric families with identifiable thresholds for transitioning between them. The key research gaps identified were the lacking guidance, grounded in theory, for the selection of uncertainty quantification metrics and the lacking practical alternatives to metrics based on the Central Limit Theorem. An innovative uncertainty quantification framework consisting of a set-theory based typology, a data library, a classification system, and a corresponding input-output model is put forward to address this research gap and to serve as the basis for further work.

  9. Network-Based Isoform Quantification with RNA-Seq Data for Cancer Transcriptome Analysis.

    PubMed

    Zhang, Wei; Chang, Jae-Woong; Lin, Lilong; Minn, Kay; Wu, Baolin; Chien, Jeremy; Yong, Jeongsik; Zheng, Hui; Kuang, Rui

    2015-12-01

    High-throughput mRNA sequencing (RNA-Seq) is widely used for transcript quantification of gene isoforms. Since RNA-Seq data alone are often not sufficient to accurately identify the read origins from the isoforms for quantification, we propose to explore protein domain-domain interactions as prior knowledge for integrative analysis with RNA-Seq data. We introduce a Network-based method for RNA-Seq-based Transcript Quantification (Net-RSTQ) to integrate the protein domain-domain interaction network with short read alignments for transcript abundance estimation. Based on our observation that the abundances of neighboring isoforms connected by domain-domain interactions in the network are positively correlated, Net-RSTQ models the expression of the neighboring transcripts as Dirichlet priors on the likelihood of the observed read alignments against the transcripts in one gene. The transcript abundances of all the genes are then jointly estimated with alternating optimization of multiple EM problems. In simulations, Net-RSTQ effectively improved isoform transcript quantification when isoform co-expression correlated with the interactions. qRT-PCR results on 25 multi-isoform genes in a stem cell line, an ovarian cancer cell line, and a breast cancer cell line also showed that Net-RSTQ estimated isoform proportions more consistent with RNA-Seq data. In the experiments on the RNA-Seq data in The Cancer Genome Atlas (TCGA), the transcript abundances estimated by Net-RSTQ were more informative for patient sample classification of ovarian cancer, breast cancer and lung cancer. All experimental results collectively support that Net-RSTQ is a promising approach for isoform quantification. The Net-RSTQ toolbox is available at http://compbio.cs.umn.edu/Net-RSTQ/.

  10. Tissue elasticity quantification by acoustic radiation force impulse for the assessment of renal allograft function.

    PubMed

    He, Wan-Yuan; Jin, Yun-Jie; Wang, Wen-Ping; Li, Chao-Lun; Ji, Zheng-Biao; Yang, Cheng

    2014-02-01

    Acoustic radiation force impulse (ARFI) quantification, a novel ultrasound-based elastography method, has been used to measure liver fibrosis. However, few studies have been performed on the use of ARFI quantification in kidney examinations. We evaluated renal allograft stiffness using ARFI quantification in patients with stable renal function (n = 52) and those with biopsy-proven allograft dysfunction (n = 50). ARFI quantification, given as shear wave velocity (SWV), was performed. The resistance index (RI) was calculated by pulsed-wave Doppler ultrasound, and clinical and laboratory data were collected. Morphologic changes in transplanted kidneys were diagnosed by an independent pathologist. Mean SWV showed a significantly stronger negative correlation with estimated glomerular filtration rate (eGFR) (r = -0.657, p < 0.0001) than did RI (r = -0.429, p = 0.0004) in transplanted kidneys. Receiver operating characteristic curve analysis revealed that the sensitivity and specificity of quantitative ultrasound in the diagnosis of renal allograft dysfunction were 72.0% and 86.5% (cutoff value = 2.625), respectively. The latter values were better than those of RI, which were 62.0% and 69.2% (cutoff value = 0.625), respectively. The coefficient of variation for repeat SWV measurements of the middle part of the transplanted kidney was 8.64%, and inter-observer agreement on SWV was good (Bland-Altman method, ICC = 0.890). In conclusion, tissue elasticity quantification by ARFI is more accurate than the RI in diagnosing renal allograft function.
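    SWV readings map onto tissue stiffness through the standard elastography relations G = ρc² and, for nearly incompressible soft tissue, E ≈ 3G; the sketch below applies them to the abstract's 2.625 m/s cutoff, assuming a typical soft-tissue density of 1000 kg/m³:

    ```python
    # Standard elastography relations: shear modulus G = rho * c^2 for shear
    # wave speed c, and Young's modulus E ~ 3G for nearly incompressible
    # tissue. rho = 1000 kg/m^3 is a common soft-tissue assumption.

    def moduli_from_swv(c_ms, rho=1000.0):
        G = rho * c_ms ** 2        # shear modulus, Pa
        return G, 3.0 * G          # (G, approximate Young's modulus)

    G, E = moduli_from_swv(2.625)  # the abstract's SWV cutoff, m/s
    print(f"G = {G/1000:.1f} kPa, E = {E/1000:.1f} kPa")
    ```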

  11. A real-time PCR assay for detection and quantification of Verticillium dahliae in spinach seed.

    PubMed

    Duressa, Dechassa; Rauscher, Gilda; Koike, Steven T; Mou, Beiquan; Hayes, Ryan J; Maruthachalam, Karunakaran; Subbarao, Krishna V; Klosterman, Steven J

    2012-04-01

    Verticillium dahliae is a soilborne fungus that causes Verticillium wilt on multiple crops in central coastal California. Although spinach crops grown in this region for fresh and processing commercial production do not display Verticillium wilt symptoms, spinach seeds produced in the United States or Europe are commonly infected with V. dahliae. Planting of the infected seed increases the soil inoculum density and may introduce exotic strains that contribute to Verticillium wilt epidemics on lettuce and other crops grown in rotation with spinach. A sensitive, rapid, and reliable method for quantification of V. dahliae in spinach seed may help identify highly infected lots, curtail their planting, and minimize the spread of exotic strains via spinach seed. In this study, a quantitative real-time polymerase chain reaction (qPCR) assay was optimized and employed for detection and quantification of V. dahliae in spinach germplasm and 15 commercial spinach seed lots. The assay used a previously reported V. dahliae-specific primer pair (VertBt-F and VertBt-R) and an analytical mill for grinding tough spinach seed for DNA extraction. The assay enabled reliable quantification of V. dahliae in spinach seed, with a sensitivity limit of ≈1 infected seed per 100 (1.3% infection in a seed lot). The quantification was highly reproducible between replicate samples of a seed lot and in different real-time PCR instruments. When tested on commercial seed lots, a pathogen DNA content corresponding to a quantification cycle value of ≥31 corresponded with a percent seed infection of ≤1.3%. The assay is useful in qualitatively assessing seed lots for V. dahliae infection levels, and the results of the assay can be helpful to guide decisions on whether to apply seed treatments.
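    The Cq-to-quantity conversion underlying such assays follows the standard qPCR calibration model, Cq = slope · log10(quantity) + intercept, with slope near -3.32 at 100% efficiency; the standard-curve values below are invented, not the assay's actual calibration:

    ```python
    import numpy as np

    # Generic qPCR standard-curve arithmetic with illustrative values.

    log10_q = np.array([1, 2, 3, 4, 5])            # log10 template amounts
    cq = np.array([34.1, 30.8, 27.4, 24.1, 20.7])  # measured Cq values

    slope, intercept = np.polyfit(log10_q, cq, 1)
    efficiency = 10 ** (-1.0 / slope) - 1.0        # ~1.0 means 100% efficient

    def quantity_from_cq(c):
        return 10 ** ((c - intercept) / slope)

    print(f"efficiency = {efficiency:.1%}, Cq 31 -> "
          f"{quantity_from_cq(31.0):.1f} units")
    ```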

  12. Network-Based Isoform Quantification with RNA-Seq Data for Cancer Transcriptome Analysis.

    PubMed

    Zhang, Wei; Chang, Jae-Woong; Lin, Lilong; Minn, Kay; Wu, Baolin; Chien, Jeremy; Yong, Jeongsik; Zheng, Hui; Kuang, Rui

    2015-12-01

    High-throughput mRNA sequencing (RNA-Seq) is widely used for transcript quantification of gene isoforms. Since RNA-Seq data alone are often not sufficient to accurately identify the read origins from the isoforms for quantification, we propose to explore protein domain-domain interactions as prior knowledge for integrative analysis with RNA-Seq data. We introduce a Network-based method for RNA-Seq-based Transcript Quantification (Net-RSTQ) to integrate the protein domain-domain interaction network with short read alignments for transcript abundance estimation. Based on our observation that the abundances of neighboring isoforms connected by domain-domain interactions in the network are positively correlated, Net-RSTQ models the expression of the neighboring transcripts as Dirichlet priors on the likelihood of the observed read alignments against the transcripts in one gene. The transcript abundances of all the genes are then jointly estimated with alternating optimization of multiple EM problems. In simulations, Net-RSTQ effectively improved isoform transcript quantification when isoform co-expression correlated with the interactions. qRT-PCR results on 25 multi-isoform genes in a stem cell line, an ovarian cancer cell line, and a breast cancer cell line also showed that Net-RSTQ estimated isoform proportions more consistent with RNA-Seq data. In the experiments on the RNA-Seq data in The Cancer Genome Atlas (TCGA), the transcript abundances estimated by Net-RSTQ were more informative for patient sample classification of ovarian cancer, breast cancer and lung cancer. All experimental results collectively support that Net-RSTQ is a promising approach for isoform quantification. The Net-RSTQ toolbox is available at http://compbio.cs.umn.edu/Net-RSTQ/. PMID:26699225

  13. In-Gel Stable-Isotope Labeling (ISIL): a strategy for mass spectrometry-based relative quantification.

    PubMed

    Asara, John M; Zhang, Xiang; Zheng, Bin; Christofk, Heather H; Wu, Ning; Cantley, Lewis C

    2006-01-01

    Most proteomics approaches for relative quantification of protein expression use a combination of stable-isotope labeling and mass spectrometry. Traditionally, researchers have used difference gel electrophoresis (DIGE) from stained 1D and 2D gels for relative quantification. While differences in protein staining intensity can often be visualized, abundant proteins can obscure less abundant proteins, and quantification of post-translational modifications is difficult. A method is presented for quantifying changes in the abundance of a specific protein or changes in specific modifications of a protein using In-gel Stable-Isotope Labeling (ISIL). Proteins extracted from any source (tissue, cell line, immunoprecipitate, etc.), treated under two experimental conditions, are resolved in separate lanes by gel electrophoresis. The regions of interest (visualized by staining) are reacted separately with light versus heavy isotope-labeled reagents, and the gel slices are then mixed and digested with proteases. The resulting peptides are then analyzed by LC-MS to determine relative abundance of light/heavy isotope pairs and analyzed by LC-MS/MS for identification of sequence and modifications. The strategy compares well with other relative quantification strategies, and in silico calculations reveal its effectiveness as a global relative quantification strategy. An advantage of ISIL is that visualization of gel differences can be used as a first quantification step followed by accurate and sensitive protein level stable-isotope labeling and mass spectrometry-based relative quantification.
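    Protein-level relative quantification from light/heavy pairs reduces to summarizing per-peptide intensity ratios; a minimal sketch with invented peptide sequences and ratios, using the median for robustness against outlier peptides:

    ```python
    import statistics

    # Sketch of the relative-quantification arithmetic behind light/heavy
    # isotope labeling: each observed peptide yields a heavy-to-light
    # intensity ratio; the protein fold change is a robust summary. Toy data.

    peptide_ratios = {          # heavy/light intensity per observed peptide
        "LVEALDK": 2.1,
        "TGFLTEYVATR": 1.8,
        "IADPEHDHTGFLTEYVATR": 2.4,
    }

    fold_change = statistics.median(peptide_ratios.values())
    print(f"protein fold change (heavy vs light condition): {fold_change:.2f}x")
    ```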

  14. The use of self-quantification systems for personal health information: big data management activities and prospects

    PubMed Central

    2015-01-01

    Background Self-quantification is seen as an emerging paradigm for health care self-management. Self-quantification systems (SQS) can be used for tracking, monitoring, and quantifying health aspects including mental, emotional, physical, and social aspects in order to gain self-knowledge. However, there has been a lack of a systematic approach for conceptualising and mapping the essential activities that are undertaken by individuals who are using SQS in order to improve health outcomes. In this paper, we propose a new model of personal health information self-quantification systems (PHI-SQS). The PHI-SQS model describes two types of activities that individuals go through during their journey of health self-managed practice, which are 'self-quantification' and 'self-activation'. Objectives In this paper, we aimed to examine thoroughly the first type of activity in PHI-SQS, namely 'self-quantification'. Our objectives were to review the data management processes currently supported in a representative set of self-quantification tools and ancillary applications, and to provide a systematic approach for conceptualising and mapping these processes with the individuals' activities. Method We reviewed and compared eleven self-quantification tools and applications (Zeo Sleep Manager, Fitbit, Actipressure, MoodPanda, iBGStar, Sensaris Senspod, 23andMe, uBiome, Digifit, BodyTrack, and Wikilife) that collect three key health data types (environmental exposure, physiological patterns, genetic traits). We investigated the interaction taking place at different data flow stages between the individual user and the self-quantification technology used. Findings We found that these eleven self-quantification tools and applications represent two major tool types (primary and secondary self-quantification systems). In each type, the individuals experience different processes and activities which are substantially influenced by the technologies' data management capabilities. Conclusions

  15. Antibody-free PRISM-SRM for multiplexed protein quantification: Is this the new competition for immunoassays in bioanalysis?

    SciTech Connect

    Shi, Tujin; Qian, Weijun

    2013-02-01

    Highly sensitive technologies for multiplexed quantification of a large number of candidate proteins will play an increasingly important role in clinical biomarker discovery, systems biology, and general biomedical research. Herein we introduce the new PRISM-SRM technology, a highly sensitive multiplexed quantification technology capable of simultaneously quantifying many low-abundance proteins without the need for affinity reagents. Its versatility for quantifying various types of targets, including protein isoforms, protein modifications, and metabolites, makes antibody-free PRISM-SRM a new competitor to immunoassays.

  16. High-Efficiency SPECT MPI: Comparison of Automated Quantification, Visual Interpretation, and Coronary Angiography

    PubMed Central

    Duvall, W. Lane; Slomka, Piotr J.; Gerlach, Jim R.; Sweeny, Joseph M.; Baber, Usman; Croft, Lori B.; Guma, Krista A.; George, Titus; Henzlova, Milena J.

    2013-01-01

    Background Recently introduced high-efficiency (HE) SPECT cameras with solid-state CZT detectors have been shown to decrease imaging time and reduce radiation exposure to patients. An automated, computer derived quantification of HE MPI has been shown to correlate well with coronary angiography on one HE SPECT camera system (D-SPECT), but has not been compared to visual interpretation on any of the HE SPECT platforms. Methods Patients undergoing a clinically indicated Tc-99m sestamibi HE SPECT (GE Discovery 530c with supine and prone imaging) study over a one year period followed by a coronary angiogram within 2 months were included. Only patients with a history of CABG surgery were excluded. Both MPI studies and coronary angiograms were reinterpreted by blinded readers. One hundred and twenty two very low (risk of CAD < 5%) or low (risk of CAD < 10%) likelihood subjects with normal myocardial perfusion were used to create normal reference limits. Computer derived quantification of the total perfusion deficit (TPD) at stress and rest was obtained with QPS software. The visual and automated MPI quantification were compared to coronary angiography (≥ 70% luminal stenosis) by receiver operating curve (ROC) analysis. Results Of the 3,111 patients who underwent HE SPECT over a one year period, 160 patients qualified for the correlation study (66% male, 52% with a history of CAD). The ROC area under the curve (AUC) was similar for both the automated and visual interpretations using both supine only and combined supine and prone images (0.69-0.74). Using thresholds determined from sensitivity and specificity curves, the automated reads showed higher specificity (59-67% versus 27-60%) and lower sensitivity (71-72% versus 79-93%) than the visual reads. By including prone images sensitivity decreased slightly but specificity increased for both. By excluding patients with known CAD and cardiomyopathies, AUC and specificity increased for both techniques (0.72-0.82). The use
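    The ROC comparison described above can be reproduced on any scoring output; the sketch below uses synthetic data in place of TPD scores and angiographic labels, with scikit-learn assumed available, and picks a threshold by Youden's J statistic:

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score, roc_curve

    # Generic ROC arithmetic: an automated score plus a binary disease label
    # (here, >= 70% stenosis) yields an AUC, and a threshold trades
    # sensitivity against specificity. Data below are synthetic stand-ins.

    rng = np.random.default_rng(1)
    disease = rng.integers(0, 2, size=200)                 # angiographic label
    tpd = disease * rng.normal(8, 3, 200) + rng.normal(3, 2, 200)  # toy score

    auc = roc_auc_score(disease, tpd)
    fpr, tpr, thr = roc_curve(disease, tpd)
    best = np.argmax(tpr - fpr)                            # Youden's J
    print(f"AUC = {auc:.2f}; threshold {thr[best]:.1f} -> "
          f"sens {tpr[best]:.0%}, spec {1 - fpr[best]:.0%}")
    ```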

  17. How to Make Data a Blessing to Parametric Uncertainty Quantification and Reduction?

    NASA Astrophysics Data System (ADS)

    Ye, M.; Shi, X.; Curtis, G. P.; Kohler, M.; Wu, J.

    2013-12-01

    From a Bayesian point of view, the probabilities of model parameters and predictions are conditioned on the data used for parameter inference and prediction analysis. It is critical to use appropriate data for quantifying parametric uncertainty and its propagation to model predictions. However, data are always limited and imperfect. When a dataset cannot properly constrain the model parameters, it may lead to inaccurate uncertainty quantification. While in this case data appear to be a curse to uncertainty quantification, a comprehensive modeling analysis may help to understand the cause and characteristics of parametric uncertainty and thus turn data into a blessing. In this study, we illustrate the impacts of data on uncertainty quantification and reduction using the example of a surface complexation model (SCM) developed to simulate uranyl (U(VI)) adsorption. The model includes two adsorption sites, referred to as strong and weak sites. The amount of uranium adsorption on these sites determines both the mean arrival time and the long tail of the breakthrough curves. There is one reaction on the weak site but two reactions on the strong site. The unknown parameters include fractions of the total surface site density of the two sites and surface complex formation constants of the three reactions. A total of seven experiments were conducted with different geochemical conditions to estimate these parameters. The experiments with low initial concentration of U(VI) result in a large amount of parametric uncertainty. A modeling analysis shows that this is because the experiments cannot distinguish the relative adsorption affinity of the strong and weak sites for uranium adsorption. Therefore, the experiments with high initial concentration of U(VI) are needed, because in these experiments the strong site is nearly saturated and the weak site can be determined. The experiments with high initial concentration of U(VI) are a blessing to uncertainty quantification, and the experiments with low initial

  18. Black carbon quantification in charcoal-enriched soils by differential scanning calorimetry

    NASA Astrophysics Data System (ADS)

    Hardy, Brieuc; Cornelis, Jean-Thomas; Leifeld, Jens

    2015-04-01

    Black carbon (BC), the solid residue of the incomplete combustion of biomass and fossil fuels, is ubiquitous in soils and sediments, fulfilling several environmental services such as long-term carbon storage. BC is a particularly important terrestrial carbon pool due to its long residence time compared to thermally unaltered organic matter, which is largely attributed to its aromatic structure. However, BC refers to a wide range of pyrogenic products, from partly charred biomass to highly condensed soot, with a degree of aromaticity and aromatic condensation varying to a large extent across the BC continuum. As a result, BC quantification largely depends on operational definitions, with the extraction efficiency of each method varying across the entire BC range. In our study, we investigated the adequacy of differential scanning calorimetry (DSC) for the quantification of BC in charcoal-enriched soils collected in the topsoil of pre-industrial charcoal kilns in forest and cropland of Wallonia, Belgium, where charcoal residues are mixed with uncharred soil organic matter (SOM). We compared the results to the fraction of the total organic carbon (TOC) resisting K2Cr2O7 oxidation, another simple method often used for BC measurement. In our soils, DSC clearly discriminates SOM from chars. SOM is less thermally stable than charcoal and shows a peak maximum around 295°C. In forest and agricultural charcoal-enriched soils, three peaks were attributed to the thermal degradation of BC at 395, 458 and 523°C and at 367, 420 and 502°C, respectively. In cropland, the amount of BC calculated from the DSC peaks is closely related (slope of the linear regression = 0.985, R² = 0.914) to the extra organic carbon content measured at charcoal kiln sites relative to the charcoal-unaffected adjacent soils, which is a positive indicator of the suitability of DSC for charcoal quantification in soil. The first BC peak, which may correspond to highly degraded charcoal, contributes to a

  19. Accurate episomal HIV 2-LTR circles quantification using optimized DNA isolation and droplet digital PCR

    PubMed Central

    Malatinkova, Eva; Kiselinova, Maja; Bonczkowski, Pawel; Trypsteen, Wim; Messiaen, Peter; Vermeire, Jolien; Verhasselt, Bruno; Vervisch, Karen; Vandekerckhove, Linos; De Spiegelaere, Ward

    2014-01-01

    Introduction In HIV-infected patients on combination antiretroviral therapy (cART), the detection of episomal HIV 2-LTR circles is a potential marker for ongoing viral replication. Quantification of 2-LTR circles is based on quantitative PCR or, more recently, on digital PCR assessment, but is hampered by their low abundance. Sample pre-PCR processing is a critical step for 2-LTR circle quantification that has not yet been sufficiently evaluated in patient-derived samples. Materials and Methods We compared two sample processing procedures to more accurately quantify 2-LTR circles using droplet digital PCR (ddPCR). Episomal HIV 2-LTR circles were isolated either by genomic DNA isolation or by a modified plasmid DNA isolation, to separate the small episomal circular DNA from chromosomal DNA. This was performed in a dilution series of HIV-infected cells and HIV-1 infected patient-derived samples (n=59). Samples for the plasmid DNA isolation method were spiked with an internal control plasmid. Results Genomic DNA isolation enables robust 2-LTR circle quantification. However, in the lower ranges of detection, PCR inhibition caused by the high genomic DNA load substantially limits the amount of sample input, and this impacts sensitivity and accuracy. Moreover, total genomic DNA isolation resulted in a lower recovery of 2-LTR templates per isolate, further reducing its sensitivity. The modified plasmid DNA isolation with a spiked reference for normalization was more accurate in these low ranges compared with genomic DNA isolation. A linear correlation of both methods was observed in the dilution series (R2=0.974) and in the patient-derived samples with 2-LTR numbers above 10 copies per million peripheral blood mononuclear cells (PBMCs) (R2=0.671). Furthermore, Bland-Altman analysis revealed an average agreement between the methods within the 27 samples in which 2-LTR circles were detectable with both methods (bias: 0.3875±1.2657 log10). Conclusions 2-LTR circles
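    Digital PCR concentration estimates rest on a standard Poisson correction of the positive-droplet fraction (generic ddPCR arithmetic, not this study's normalization); the droplet volume of ~0.85 nL below is a typical value and an assumption here:

    ```python
    import math

    # Standard digital-PCR Poisson correction: with a fraction p of droplets
    # positive, the mean template copies per droplet is lambda = -ln(1 - p),
    # so concentration = lambda / droplet volume (before dilution scaling).

    def ddpcr_copies_per_ul(n_positive, n_total, droplet_volume_nl=0.85):
        p = n_positive / n_total
        lam = -math.log(1.0 - p)                  # copies per droplet
        return lam / (droplet_volume_nl * 1e-3)   # nL -> uL of reaction

    print(f"{ddpcr_copies_per_ul(130, 15000):.1f} copies/uL")
    ```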

  20. In-situ quantification of solid oxide fuel cell electrode microstructure by electrochemical impedance spectroscopy

    NASA Astrophysics Data System (ADS)

    Zhang, Yanxiang; Chen, Yu; Chen, Fanglin

    2015-03-01

    Three-dimensional (3D) microstructure of solid oxide fuel cell electrodes plays a critical role in determining fuel cell performance. State-of-the-art quantification techniques such as X-ray computed tomography enable direct calculation of geometric factors by 3D microstructure reconstruction. Taking advantage of being in-situ, fast-responding and low cost, electrochemical impedance spectroscopy represented by the distribution of relaxation times (DRT) is a novel technique to estimate geometric properties of fuel cell electrodes. In this study, we employed anode-supported cells with the cell configuration Ni-YSZ || YSZ || LSM-YSZ as an example and compared the tortuosity factor of the pores of the anode substrate layer obtained by X-ray computed tomography and by DRT analysis. Good agreement was found, validating the feasibility of in-situ microstructural quantification using the DRT technique.
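    The DRT idea can be sketched as a Tikhonov-regularized least-squares fit of the impedance spectrum on a fixed grid of relaxation times; the code below is a generic illustration with synthetic two-time-constant data, not the authors' algorithm:

    ```python
    import numpy as np

    # Generic DRT sketch: model Z(w) = R_inf + sum_k gamma_k / (1 + j*w*tau_k)
    # on a log-spaced tau grid and fit gamma by regularized least squares.

    omega = np.logspace(-1, 5, 60)                       # rad/s
    Z = 0.1 + 0.5 / (1 + 1j * omega * 1e-2) + 0.3 / (1 + 1j * omega * 1e-4)

    tau = np.logspace(-6, 1, 80)                         # relaxation-time grid
    K = 1.0 / (1.0 + 1j * np.outer(omega, tau))          # kernel matrix
    A = np.vstack([np.column_stack([K.real, np.ones(len(omega))]),
                   np.column_stack([K.imag, np.zeros(len(omega))])])
    b = np.concatenate([Z.real, Z.imag])

    lam = 1e-3                                           # regularization
    reg = lam * np.eye(A.shape[1]); reg[-1, -1] = 0.0    # leave R_inf free
    x = np.linalg.solve(A.T @ A + reg, A.T @ b)
    gamma, R_inf = x[:-1], x[-1]
    print(f"R_inf ~ {R_inf:.3f}; DRT peaks near tau = "
          f"{tau[np.argsort(gamma)[-2:]]}")
    ```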

  1. Nanoparticle-based detection and quantification of DNA with single nucleotide polymorphism (SNP) discrimination selectivity

    PubMed Central

    Qin, Wei Jie; Yung, Lin Yue Lanry

    2007-01-01

    Sequence-specific DNA detection is important in various biomedical applications such as gene expression profiling, disease diagnosis and treatment, drug discovery and forensic analysis. Here we report a gold nanoparticle-based method that allows DNA detection and quantification and is capable of single nucleotide polymorphism (SNP) discrimination. The precise quantification of single-stranded DNA is due to the formation of defined nanoparticle-DNA conjugate groupings in the presence of target/linker DNA. Conjugate groupings were characterized and quantified by gel electrophoresis. A linear correlation between the amount of target DNA and conjugate groupings was found. For SNP detection, single base mismatch discrimination was achieved for both the end- and center-base mismatch. The method described here may be useful for the development of a simple and quantitative DNA detection assay. PMID:17720714

  2. Applications of acoustic radiation force impulse quantification in chronic kidney disease: a review.

    PubMed

    Wang, Liang

    2016-10-01

    Acoustic radiation force impulse (ARFI) imaging is an emerging technique with great promise in the field of elastography. Previous studies have validated ARFI quantification as a method of estimating fibrosis in chronic liver disease. Similarly, fibrosis is the principal process underlying the progression of chronic kidney disease, which is the major cause of renal failure. However, the quantification of tissue stiffness using ARFI imaging is more complex in the kidney than in the liver. Moreover, not all previous studies are comparable because they employed different procedures. Therefore, subsequent studies are warranted, both in animal models and in clinical patients, in order to better understand the histopathological mechanisms associated with renal elasticity and to further improve this imaging method by developing standardized guidelines for its implementation.

  3. Toward greener analytical techniques for the absolute quantification of peptides in pharmaceutical and biological samples.

    PubMed

    Van Eeckhaut, Ann; Mangelings, Debby

    2015-09-10

    Peptide-based biopharmaceuticals represent one of the fastest growing classes of new drug molecules. New reaction types included in the synthesis strategies to reduce the rapid metabolism of peptides, along with the availability of new formulation and delivery technologies, have resulted in increased marketing of peptide drug products. In this regard, the development of analytical methods for the quantification of peptides in pharmaceutical and biological samples is of utmost importance. From the sample preparation step to their analysis by chromatographic or electrophoretic methods, many difficulties must be tackled to analyze them. Recent developments in analytical techniques place increasing emphasis on green analytical approaches. This review discusses the progress in, and the challenges observed during, green analytical method development for the quantification of peptides in pharmaceutical and biological samples. PMID:25864956

  4. Ferromagnetic resonance for the quantification of superparamagnetic iron oxide nanoparticles in biological materials.

    PubMed

    Gamarra, Lionel F; daCosta-Filho, Antonio J; Mamani, Javier B; de Cassia Ruiz, Rita; Pavon, Lorena F; Sibov, Tatiana T; Vieira, Ernanni D; Silva, André C; Pontuschka, Walter M; Amaro, Edson

    2010-01-01

    The aim of the present work is to present a quantification methodology for controlling the amount of superparamagnetic iron oxide nanoparticles (SPIONs) administered in biological materials by means of the ferromagnetic resonance (FMR) technique, applied to both in vivo and in vitro studies. The in vivo study consisted of the analysis of the elimination and biodistribution kinetics of SPIONs after intravenous administration in Wistar rats. The results were corroborated by X-ray fluorescence. For the in vitro study, a quantitative analysis of the concentration of SPIONs bound to specific AC133 monoclonal antibodies was carried out in order to detect the expression of the antigenic epitopes (CD133) in stem cells from human umbilical cord blood. In both studies FMR proved to be an efficient technique for SPION quantification per volume unit (in vivo) or per labeled cell (in vitro). PMID:20463936

  5. New class of radioenzymatic assay for the quantification of p-tyramine and phenylethylamine

    SciTech Connect

    Henry, D.P.; Van Huysse, J.W.; Bowsher, R.R.

    1986-03-01

    Radioenzymatic assays are widely used for the quantification of a number of biogenic amines. All previous procedures have utilized methyltransferases derived from mammalian tissues. In this assay for the quantification of the trace aralkylamines p-tyramine (p-tym) and phenylethylamine (PEA), an enzyme, tyramine N-methyltransferase, isolated from sprouted barley roots was used. The enzyme was specific for phenylethylamines. Of 26 structurally related compounds, only p-tym, PEA, m-tym and amphetamine were substrates in vitro. Theoretical maximal methylation of substrates occurred at 10-20 °C. When TLC was used to separate the radiolabeled reaction products, a specific method was developed for p-tym and PEA. The assay had a sensitivity of 0.8 and 2.8 pg/tube with a C.V. < 5% and was applicable to human plasma and urine. Assay throughput is similar to that of other TLC-based radioenzymatic assays.

  6. Biomarker Quantification in Unprocessed Human Sera by Using A New Nanomagnetic Protein Assay

    NASA Astrophysics Data System (ADS)

    Li, Yuanpeng; Jing, Ying; Srinivasan, Balasubramanian; Yao, Xiaofeng; Hugger, Marie A.; Xing, Chengguo; Wang, Jian-Ping

    2010-12-01

    Early recognition and prevention of chronic diseases such as lung cancer require fast, accurate detection and longitudinal monitoring of potential biomarkers that can identify molecular changes in the initial stage of disease. Here we report specific and accurate quantification of a low-abundance serum protein in unprocessed human sera using a novel giant magnetoresistive (GMR) biosensing system with uniform high-magnetic-moment nanoparticles and a competition-based detection scheme that requires only one antibody. With this system, interleukin-6 (IL-6, a low-abundance protein and potential cancer biomarker) was quantified at concentrations as low as 125 fM directly in 4 μL of unprocessed human serum within 5 minutes. The results clearly differentiate normal individuals from lung cancer patients. This platform has great potential to facilitate the identification and validation of disease biomarkers.

  7. A simple SPR-based method for the quantification of the effect of potential virus inhibitors.

    PubMed

    Boltovets, Praskoviya M; Polischuk, Olena M; Kovalenko, Oleksiy G; Snopok, Boris A

    2013-01-21

    Here, we describe a highly sensitive method that allows the effect of potential virus inhibitors to be quantified with a high degree of accuracy directly at the molecular level. The protocol involves two stages: serological virus titration, followed by the same procedure applied to the virus-effector mixture for comparison. Owing to the robustness of the analysis, the assay can be performed on crude cellular and plant extracts, and may therefore be suitable for the routine analysis of clinical samples or for use in the field. The approach, which combines serological methods with a label-free surface plasmon resonance technique, was used to quantify the inhibitory effect of the polysaccharide glucuronoxylomannan (GXM) on the infection efficiency of tobacco mosaic virus (TMV). GXM was shown to drastically decrease the efficiency of TMV infection by blocking up to 70% of the virus shell. The results agree with those obtained by the indicator-plant infection method.

  8. Single point quantification of antibody by ELISA without need of a reference curve.

    PubMed

    Dopatka, H D; Giesendorf, B

    1992-01-01

    A new method for the quantification of antibodies in the enzyme-linked immunosorbent assay is described. This procedure replaces titer determinations based on end-point dilution of the antibody under investigation. Here, the sample is tested at a single dilution and the optical density (OD) obtained is used in the equation log10 titer = α · OD^β. The titer can then be calculated by inserting into the formula the values of the constants α and β, which are specified by the manufacturer for each batch of kit reagents. This so-called alpha-method saves time and reagents while providing results equal to the titration method in accuracy and superior to it in precision. The alpha-method is also a simpler and reliable alternative to the use of standard or reference curves for the quantification of antibodies in I.U./ml.
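
    The single-point calculation above is simple enough to illustrate directly. Below is a minimal sketch in Python, assuming the relationship log10 titer = α · OD^β as reconstructed from the abstract (the superscript appears lost in the original record); the numeric constants are hypothetical, since the real α and β are batch-specific values supplied by the kit manufacturer.

      def titer_from_od(od: float, alpha: float, beta: float) -> float:
          """Single-point 'alpha-method' titer estimate.

          Assumes log10(titer) = alpha * OD**beta; alpha and beta are
          batch-specific constants supplied with the kit reagents.
          """
          if od <= 0:
              raise ValueError("OD must be positive")
          return 10 ** (alpha * od ** beta)

      # Hypothetical OD reading and batch constants, for illustration only.
      print(f"titer ~ {titer_from_od(od=1.25, alpha=3.1, beta=0.45):.0f}")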

  9. Comparative assessment of DNA-based approaches for the quantification of food allergens.

    PubMed

    Luber, Florian; Demmel, Anja; Herbert, Denise; Hosken, Anne; Hupfer, Christine; Huber, Ingrid; Busch, Ulrich; Engel, Karl-Heinz

    2014-10-01

    Governments all over the world have implemented regulatory frameworks for food allergen labelling and have established, or are discussing, the implementation of thresholds; quantitative methods are therefore needed for surveillance. DNA-based approaches using matrix-adapted calibration, an internal standard material and a modified standard addition have been developed. To enable a comparative assessment of the available quantification methods, experimental framework conditions and uniform performance criteria were defined. For the evaluation of the experimental results on homogeneous sample material, recovery, repeatability and reproducibility were considered, along with the limit of detection and the limit of quantification. In addition, muffin dough and muffins spiked with sesame were analysed to assess the suitability of the methods for quantifying sesame in model foods. The modified standard addition emerged from the comparative assessment and the analysis of the model foods as the most appropriate method for quantifying traces of allergens in food.
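
    For readers unfamiliar with the technique, classical standard addition quantifies the analyte by spiking aliquots of the sample with known amounts and extrapolating the regression back to zero signal. A minimal sketch in Python with made-up numbers (a linear-response illustration of the principle, not the paper's modified, DNA-specific procedure):

      import numpy as np

      # Hypothetical standard-addition series: known sesame spike (mg/kg)
      # added to aliquots of one sample, with the measured response of each.
      spike = np.array([0.0, 5.0, 10.0, 20.0, 40.0])
      signal = np.array([3.1, 6.9, 11.2, 18.8, 34.6])

      slope, intercept = np.polyfit(spike, signal, 1)
      # Extrapolating the fitted line to zero signal recovers the original
      # analyte content: C_x = intercept / slope.
      c_sample = intercept / slope
      print(f"estimated sesame content ~ {c_sample:.1f} mg/kg")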

  10. In Vitro/In Vivo Toxicity Evaluation and Quantification of Iron Oxide Nanoparticles

    PubMed Central

    Patil, Ujwal S.; Adireddy, Shiva; Jaiswal, Ashvin; Mandava, Sree; Lee, Benjamin R.; Chrisey, Douglas B.

    2015-01-01

    The increasing biomedical applications of iron oxide nanoparticles (IONPs) in academic and commercial settings have raised concerns in the scientific community about the safety of IONPs and the assessment of their toxicity profiles. The great diversity found in cytotoxicity measurements of IONPs points toward the necessity of careful characterization and quantification of IONPs. This review discusses the major developments in the in vitro and in vivo toxicity assessment of IONPs and their relationship with the physicochemical parameters of IONPs. Particular attention is given to the current spectrophotometric and imaging-based techniques used for quantifying IONPs and for studying their clearance and biodistribution. Several invasive and non-invasive quantification techniques, along with their pitfalls, are discussed in detail. Finally, critical guidelines are provided for optimizing the design of IONPs to minimize their toxicity. PMID:26501258

  11. Applications of acoustic radiation force impulse quantification in chronic kidney disease: a review.

    PubMed

    Wang, Liang

    2016-10-01

    Acoustic radiation force impulse (ARFI) imaging is an emerging technique with great promise in the field of elastography. Previous studies have validated ARFI quantification as a method of estimating fibrosis in chronic liver disease. Similarly, fibrosis is the principal process underlying the progression of chronic kidney disease, the major cause of renal failure. However, the quantification of tissue stiffness using ARFI imaging is more complex in the kidney than in the liver. Moreover, not all previous studies are comparable, because they employed different procedures. Therefore, further studies are warranted, both in animal models and in clinical patients, in order to better understand the histopathological mechanisms associated with renal elasticity and to improve this imaging method by developing standardized guidelines for its implementation. PMID:27599890

  12. A framework for optimization and quantification of uncertainty and sensitivity for developing carbon capture systems

    DOE PAGES

    Eslick, John C.; Ng, Brenda; Gao, Qianwen; Tong, Charles H.; Sahinidis, Nikolaos V.; Miller, David C.

    2014-12-31

    Under the auspices of the U.S. Department of Energy’s Carbon Capture Simulation Initiative (CCSI), a Framework for Optimization and Quantification of Uncertainty and Sensitivity (FOQUS) has been developed. This tool enables carbon capture systems to be rapidly synthesized and rigorously optimized, in an environment that accounts for and propagates uncertainties in parameters and models. FOQUS currently enables (1) the development of surrogate algebraic models utilizing the ALAMO algorithm, which can be used for superstructure optimization to identify optimal process configurations, (2) simulation-based optimization utilizing derivative free optimization (DFO) algorithms with detailed black-box process models, and (3) rigorous uncertainty quantification through PSUADE. FOQUS utilizes another CCSI technology, the Turbine Science Gateway, to manage the thousands of simulated runs necessary for optimization and UQ. Thus, this computational framework has been demonstrated for the design and analysis of a solid sorbent based carbon capture system.

  13. A framework for optimization and quantification of uncertainty and sensitivity for developing carbon capture systems

    SciTech Connect

    Eslick, John C.; Ng, Brenda; Gao, Qianwen; Tong, Charles H.; Sahinidis, Nikolaos V.; Miller, David C.

    2014-12-31

    Under the auspices of the U.S. Department of Energy’s Carbon Capture Simulation Initiative (CCSI), a Framework for Optimization and Quantification of Uncertainty and Sensitivity (FOQUS) has been developed. This tool enables carbon capture systems to be rapidly synthesized and rigorously optimized, in an environment that accounts for and propagates uncertainties in parameters and models. FOQUS currently enables (1) the development of surrogate algebraic models utilizing the ALAMO algorithm, which can be used for superstructure optimization to identify optimal process configurations, (2) simulation-based optimization utilizing derivative free optimization (DFO) algorithms with detailed black-box process models, and (3) rigorous uncertainty quantification through PSUADE. FOQUS utilizes another CCSI technology, the Turbine Science Gateway, to manage the thousands of simulated runs necessary for optimization and UQ. Thus, this computational framework has been demonstrated for the design and analysis of a solid sorbent based carbon capture system.
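
    The workflow these two records describe (derivative-free, simulation-based optimization with uncertainty propagation) can be illustrated generically. Below is a minimal sketch in Python using scipy's Nelder-Mead optimizer to minimize the expected cost of a hypothetical black-box capture model under parameter uncertainty; the model, its parameters and the Monte Carlo setup are illustrative assumptions, not the FOQUS API or a CCSI model.

      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(0)
      # Fixed sample of the uncertain rate constant k (common random
      # numbers keep the Monte Carlo objective deterministic for the optimizer).
      ks = rng.normal(loc=0.5, scale=0.05, size=200)

      def capture_cost(design, k):
          """Hypothetical black-box process model: cost of a capture system
          for two design variables and an uncertain rate constant k;
          a stand-in for a detailed process simulation."""
          sorbent_flow, column_height = design
          capture = 1.0 - np.exp(-k * sorbent_flow * column_height)
          penalty = 100.0 * max(0.0, 0.90 - capture)  # require >= 90% capture
          return 2.0 * sorbent_flow + 1.5 * column_height + penalty

      def expected_cost(design):
          """Propagate the uncertainty in k by Monte Carlo averaging
          (a toy stand-in for PSUADE-style uncertainty quantification)."""
          return np.mean([capture_cost(design, k) for k in ks])

      # Derivative-free optimization, as in FOQUS's DFO mode.
      result = minimize(expected_cost, x0=[2.0, 2.0], method="Nelder-Mead")
      print("design:", result.x, "expected cost:", result.fun)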

  14. A probabilistic generative model for quantification of DNA modifications enables analysis of demethylation pathways.

    PubMed

    Äijö, Tarmo; Huang, Yun; Mannerström, Henrik; Chavez, Lukas; Tsagaratou, Ageliki; Rao, Anjana; Lähdesmäki, Harri

    2016-03-14

    We present a generative model, Lux, to quantify DNA methylation modifications from any combination of bisulfite sequencing approaches, including reduced, oxidative, TET-assisted, chemical-modification assisted, and methylase-assisted bisulfite sequencing data. Lux models all cytosine modifications (C, 5mC, 5hmC, 5fC, and 5caC) simultaneously, together with experimental parameters including bisulfite conversion and oxidation efficiencies, as well as various chemical labeling and protection steps. We show that Lux improves the quantification and comparison of cytosine modification levels and that Lux can process any oxidized methylcytosine sequencing data set to quantify all cytosine modifications. Analysis of targeted data from Tet2-knockdown embryonic stem cells and T cells during development demonstrates DNA modification quantification in unprecedented detail, quantifies active demethylation pathways and reveals 5hmC localization in putative regulatory regions.
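
    The core inference problem can be illustrated with a deliberately simplified two-state version: in standard bisulfite sequencing, an unmethylated cytosine reads as T only if conversion succeeded, so the probability of observing a C read is p = m + (1 - m)(1 - e), where m is the methylation level and e the bisulfite conversion efficiency. A minimal maximum-likelihood sketch in Python (a toy model, not the full Lux model, which treats all five cytosine states and several assay types jointly in a Bayesian framework):

      import numpy as np
      from scipy.optimize import minimize_scalar

      def neg_log_lik(m, c_reads, total_reads, conv_eff):
          """Binomial negative log-likelihood of c_reads unconverted C
          calls out of total_reads, given methylation level m and
          bisulfite conversion efficiency conv_eff."""
          p = m + (1.0 - m) * (1.0 - conv_eff)  # P(read shows C)
          return -(c_reads * np.log(p)
                   + (total_reads - c_reads) * np.log(1.0 - p))

      # Toy data: 37 C reads out of 50, conversion efficiency 0.98.
      res = minimize_scalar(neg_log_lik, bounds=(1e-6, 1 - 1e-6),
                            method="bounded", args=(37, 50, 0.98))
      print(f"estimated methylation level: {res.x:.3f}")  # ~0.735 vs naive 0.740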

  15. Identification and quantification of constituents of Gardenia jasminoides Ellis (Zhizi) by HPLC-DAD-ESI-MS.

    PubMed

    Bergonzi, M C; Righeschi, C; Isacchi, B; Bilia, A R

    2012-09-15

    A simple, rapid and specific HPLC method was developed for the analysis of the characteristic constituents of Gardenia jasminoides Ellis (Zhizi), namely iridoids, caffeoylquinic acid derivatives and crocins. Separation was successfully achieved on a C(18) column by gradient elution with methanol-water mobile phases; the detection wavelength was set at 240 nm for iridoid glycosides, 315 nm for quinic acid derivatives and 438 nm for crocins. The analytical method was validated and the active compounds, namely the iridoids, were quantified. Linearity, precision, repeatability, stability, accuracy, limit of detection (LOD) and limit of quantification (LOQ) are also reported. The assay was successfully applied to the qualitative and quantitative analysis of five commercial samples of G. jasminoides Ellis. PMID:23107748
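
    For validation parameters like the LOD and LOQ listed above, a common estimate uses the ICH Q2(R1) formulas LOD = 3.3·σ/S and LOQ = 10·σ/S, where σ is the standard deviation of the calibration residuals and S the calibration slope. A minimal sketch in Python with made-up calibration data (not values from the paper):

      import numpy as np

      # Hypothetical calibration: concentration (ug/mL) vs peak area.
      conc = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0])
      area = np.array([12.1, 24.5, 60.2, 121.8, 242.0, 608.5])

      slope, intercept = np.polyfit(conc, area, 1)
      residuals = area - (slope * conc + intercept)
      sigma = residuals.std(ddof=2)  # ddof=2: two fitted parameters

      lod = 3.3 * sigma / slope      # ICH Q2(R1) limit of detection
      loq = 10.0 * sigma / slope     # ICH Q2(R1) limit of quantification
      print(f"LOD = {lod:.2f} ug/mL, LOQ = {loq:.2f} ug/mL")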

  16. A targeted proteomics toolkit for high-throughput absolute quantification of Escherichia coli proteins.

    PubMed

    Batth, Tanveer S; Singh, Pragya; Ramakrishnan, Vikram R; Sousa, Mirta M L; Chan, Leanne Jade G; Tran, Huu M; Luning, Eric G; Pan, Eva H Y; Vuu, Khanh M; Keasling, Jay D; Adams, Paul D; Petzold, Christopher J

    2014-11-01

    Transformation of engineered Escherichia coli into a robust microbial factory is contingent on precise control of metabolism. Yet, the throughput of omics technologies used to characterize cell components has lagged far behind our ability to engineer novel strains. To expand the utility of quantitative proteomics for metabolic engineering, we validated and optimized targeted proteomics methods for over 400 proteins from more than 20 major pathways in E. coli metabolism. Complementing these methods, we constructed a series of synthetic genes to produce concatenated peptides (QconCAT) for absolute quantification of the proteins and made them available through the Addgene plasmid repository (www.addgene.org). To facilitate high sample throughput, we developed a fast, analytical-flow chromatography method using a 5.5-min gradient (10 min total run time). Overall this toolkit provides an invaluable resource for metabolic engineering by increasing sample throughput, minimizing development time and providing peptide standards for absolute quantification of E. coli proteins.
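
    In QconCAT-style absolute quantification, a known amount of an isotope-labeled ("heavy") standard peptide is spiked into the sample, and the endogenous amount follows from the light-to-heavy peak-area ratio. A minimal sketch in Python with illustrative numbers (not values from the paper):

      def absolute_quant(light_area: float, heavy_area: float,
                         heavy_amount_fmol: float) -> float:
          """Absolute peptide amount from a spiked heavy-labeled standard.

          The light (endogenous) peptide and its heavy counterpart
          co-elute and ionize identically, so
          amount_light = (area_light / area_heavy) * amount_heavy.
          """
          return (light_area / heavy_area) * heavy_amount_fmol

      # Hypothetical peak areas and spike amount, for illustration only.
      print(f"{absolute_quant(8.4e5, 2.1e5, heavy_amount_fmol=50.0):.0f} fmol")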

  17. Quantification and characterization of Si in Pinus Insignis Dougl by TXRF

    NASA Astrophysics Data System (ADS)

    Navarro, Henry; Bennun, Leonardo; Marcó, Lué M.

    2015-03-01

    A simple quantification of silicon in wood, in this case Pinus insignis Dougl. from the Bío-Bío (8th) region of Chile (37°15′ S, 73°19′ W), is described. The samples were prepared by fractional calcination, and the ashes were analyzed directly by the total reflection X-ray fluorescence (TXRF) technique. The analysis of 16 calcined samples is presented. The samples were weighed onto plastic reflectors on a microbalance with a sensitivity of 0.1 µg and then irradiated in a TXRF PICOFOX spectrometer for 350 and 700 s. Cobalt was added to each sample as an internal standard. Silicon concentrations above 1% were observed in each sample, along with a self-absorption effect on the quantification for masses above 100 μg.
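
    With an internal standard such as the cobalt used here, TXRF quantification reduces to a ratio of net fluorescence intensities weighted by relative sensitivities: C_x = C_IS · (N_x/N_IS) · (S_IS/S_x). A minimal sketch in Python with hypothetical intensities and sensitivity factors (not the paper's data):

      def txrf_concentration(n_analyte: float, n_standard: float,
                             c_standard: float,
                             s_analyte: float, s_standard: float) -> float:
          """Internal-standard TXRF quantification:
          C_x = C_IS * (N_x / N_IS) * (S_IS / S_x), where N are net peak
          intensities and S are relative element sensitivities."""
          return c_standard * (n_analyte / n_standard) * (s_standard / s_analyte)

      # Hypothetical Si and Co net intensities and sensitivities.
      c_si = txrf_concentration(n_analyte=1500.0, n_standard=9000.0,
                                c_standard=10.0,  # ug/g of Co added
                                s_analyte=0.012, s_standard=1.0)
      print(f"Si concentration ~ {c_si:.0f} ug/g")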

  18. Interference of salts used on aqueous two-phase systems on the quantification of total proteins.

    PubMed

    Golunski, Simone Maria; Sala, Luisa; Silva, Marceli Fernandes; Dallago, Rogério Marcos; Mulinari, Jéssica; Mossi, Altemir José; Brandelli, Adriano; Kalil, Susana Juliano; Di Luccio, Marco; Treichel, Helen

    2016-02-01

    In this study, the interference of potassium phosphate, sodium citrate, sodium chloride and sodium nitrate salts with protein quantification by Bradford's method was assessed. Potassium phosphate and sodium citrate are commonly used in aqueous two-phase systems for enzyme purification. The results showed that the presence of potassium phosphate or sodium citrate increases the absorbance of the samples compared with salt-free samples. This increase in absorbance introduces errors into the protein quantification, which propagate to the calculation of specific enzyme activity and, consequently, of the purification factor. The presence of sodium chloride or sodium nitrate had practically no effect on the absorbance of inulinase, probably because the metals present in the enzyme extract did not interact with the added salts.

  19. Considerations for quantification of lipids in nerve tissue using MALDI mass spectrometric imaging

    PubMed Central

    Landgraf, Rachelle R.; Garrett, Timothy J.; Prieto Conaway, Maria C.; Calcutt, Nigel A.; Stacpoole, Peter W.; Yost, Richard A.

    2013-01-01

    MALDI mass spectrometric imaging is a technique that can identify and characterize endogenous and exogenous compounds spatially within tissue with relatively little sample preparation. While it is a proven methodology for qualitative analysis, little has been reported on its utility for quantitative measurements. In the current work, inherent challenges in MALDI quantification are addressed. Signal response is monitored over successive analyses of a single tissue section to minimize error due to variability in the laser, matrix application and sample inhomogeneity. Methods for applying an internal standard to tissue sections are evaluated and used to quantify endogenous lipids in nerve tissue. A precision of 5% or less standard error was achieved, illustrating that MALDI imaging offers a reliable means of in situ quantification for microgram-sized samples while requiring minimal sample preparation. PMID:21953974
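
    The precision figure reported above corresponds to the relative standard error across replicate analyses, %SE = 100 · (s/√n)/mean. A minimal sketch in Python with made-up replicate intensities (not the paper's data):

      import numpy as np

      # Hypothetical internal-standard-normalized lipid intensities from
      # successive analyses of the same tissue section.
      replicates = np.array([1.02, 0.97, 1.05, 0.99, 1.01, 0.96])

      mean = replicates.mean()
      sem = replicates.std(ddof=1) / np.sqrt(len(replicates))
      print(f"relative standard error = {100 * sem / mean:.1f}%")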

  20. A method to quantify infectious airborne pathogens at concentrations below the threshold of quantification by culture

    PubMed Central

    Cutler, Timothy D.; Wang, Chong; Hoff, Steven J.; Zimmerman, Jeffrey J.

    2013-01-01

    In aerobiology, dose-response studies are used to estimate the risk of infection that exposure to a specific dose of an airborne pathogen presents to a susceptible host. In the research setting, host- and pathogen-specific factors that affect the dose-response continuum can be accounted for by experimental design, but precisely determining the dose of infectious pathogen to which the host was exposed is often challenging. By definition, quantification of viable airborne pathogens is based on the culture of micro-organisms, but some airborne pathogens are transmissible at concentrations below the threshold of quantification by culture. In this paper we present an approach to calculating exposure dose at microbiologically unquantifiable levels, based on an application of the continuous-stirred tank reactor (CSTR) model, and validate the approach using rhodamine B dye as a surrogate for aerosolized microbial pathogens in a dynamic aerosol toroid (DAT). PMID:24082399
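
    In a well-mixed (CSTR) aerosol chamber, the concentration obeys dC/dt = S/V - (Q/V + λ)·C, where S is the source emission rate, V the chamber volume, Q the ventilation flow and λ a first-order loss rate; the inhaled dose over an exposure is the breathing rate times the time integral of C(t). A minimal sketch in Python with illustrative parameters (the numbers and loss terms are assumptions, not those of the paper):

      import numpy as np

      # CSTR aerosol balance: dC/dt = S/V - (Q/V + lam) * C, with C(0) = 0.
      V = 0.5           # chamber volume, m^3
      Q = 0.01          # ventilation flow, m^3/min
      lam = 0.05        # first-order settling/decay loss, 1/min
      S = 1.0e4         # pathogen emission rate, units/min
      k = Q / V + lam   # total first-order removal rate, 1/min

      t = np.linspace(0.0, 60.0, 601)             # 60-min exposure
      C = (S / (V * k)) * (1.0 - np.exp(-k * t))  # analytic solution

      breathing_rate = 0.008                      # m^3/min at rest
      dose = breathing_rate * np.trapz(C, t)      # inhaled dose, units
      print(f"steady state = {S / (V * k):.0f} units/m^3, dose = {dose:.0f} units")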