Science.gov

Sample records for quantification spatialisation vulnerabilite

  1. RASOR Project: Rapid Analysis and Spatialisation of Risk, from Hazard to Risk using EO data

    NASA Astrophysics Data System (ADS)

    Rossi, Lauro; Rudari, Roberto

    2016-04-01

    Over recent decades there has been a dramatic rise in disasters and in their impact on human populations. Escalating complexity in our societies is making risks increasingly difficult to understand and is changing the ways in which hazards interact with each other. The Rapid Analysis and Spatialisation Of Risk (RASOR) project developed a multi-hazard risk analysis platform to support the full cycle of disaster management. RASOR provides up-to-date hazard information for floods and geohazards, up-to-date exposure data from known sources and newly generated EO-based data, and quantitative characterisation of vulnerabilities. RASOR also adapts the newly developed 12 m resolution global TanDEM-X Digital Elevation Model (DEM) to risk-management applications, using it as a base layer on which specific disaster scenarios are developed. RASOR overlays archived and near-real-time very-high-resolution optical and radar satellite data, combined with in situ data, for both global and local applications. A scenario-driven query system allows users to project situations into the future and to model multi-hazard risk both before and during an event. Applications at several case-study sites are presented to illustrate the platform's potential.

  2. Dystrophin quantification

    PubMed Central

    Anthony, Karen; Arechavala-Gomeza, Virginia; Taylor, Laura E.; Vulin, Adeline; Kaminoh, Yuuki; Torelli, Silvia; Feng, Lucy; Janghra, Narinder; Bonne, Gisèle; Beuvin, Maud; Barresi, Rita; Henderson, Matt; Laval, Steven; Lourbakos, Afrodite; Campion, Giles; Straub, Volker; Voit, Thomas; Sewry, Caroline A.; Morgan, Jennifer E.; Flanigan, Kevin M.

    2014-01-01

    Objective: We formed a multi-institution collaboration in order to compare dystrophin quantification methods, reach a consensus on the most reliable method, and report its biological significance in the context of clinical trials. Methods: Five laboratories with expertise in dystrophin quantification performed a data-driven comparative analysis of a single reference set of normal and dystrophinopathy muscle biopsies using quantitative immunohistochemistry and Western blotting. We developed standardized protocols and assessed inter- and intralaboratory variability over a wide range of dystrophin expression levels. Results: Results from the different laboratories were highly concordant with minimal inter- and intralaboratory variability, particularly with quantitative immunohistochemistry. There was a good level of agreement between data generated by immunohistochemistry and Western blotting, although immunohistochemistry was more sensitive. Furthermore, mean dystrophin levels determined by alternative quantitative immunohistochemistry methods were highly comparable. Conclusions: Considering the biological function of dystrophin at the sarcolemma, our data indicate that quantitative immunohistochemistry and Western blotting together provide reliable biochemical outcome measures for Duchenne muscular dystrophy clinical trials, and that standardized protocols yield comparable results across competent laboratories. The methodology validated in our study will facilitate the development of experimental therapies focused on dystrophin production and their regulatory approval. PMID:25355828

  3. Scoliosis quantification: an overview

    PubMed Central

    Kawchuk, Greg; McArthur, Ross

    1997-01-01

    Scoliotic curvatures have long been a focus of attention for clinicians and research scientists alike. The study, treatment, and, ultimately, the prevention of this prevalent health condition are impeded by the absence of an accurate, reliable, convenient, and safe method of scoliosis quantification. The purpose of this paper is to provide an overview of the current methods of scoliosis quantification for clinicians who address this condition in their practices.

  4. Quantification of nonclassicality

    NASA Astrophysics Data System (ADS)

    Gehrke, C.; Sperling, J.; Vogel, W.

    2012-11-01

    To quantify single-mode nonclassicality, we start from an operational approach. A positive semidefinite observable is introduced to describe a measurement setup. The quantification is based on the negativity of the normally ordered version of this observable. Perfect operational quantumness corresponds to the quantum-noise-free measurement of the chosen observable. Surprisingly, even moderately squeezed states may exhibit perfect quantumness for a properly designed measurement. The quantification is also considered from an axiomatic viewpoint, based on the algebraic structure of the quantum states and the quantum superposition principle. Basic conclusions from both approaches are consistent with this fundamental principle of the quantum world.

  5. Quantificational logic of context

    SciTech Connect

    Buvac, Sasa

    1996-12-31

    In this paper we extend the Propositional Logic of Context to the quantificational (predicate calculus) case. This extension is important in the declarative representation of knowledge for two reasons. Firstly, since contexts are objects in the semantics which can be denoted by terms in the language and which can be quantified over, the extension enables us to express arbitrary first-order properties of contexts. Secondly, since the extended language is no longer only propositional, we can express that an arbitrary predicate calculus formula is true in a context. The paper describes the syntax and the semantics of a quantificational language of context, gives a Hilbert-style formal system, and outlines a proof of the system's completeness.

  6. Wrappers, Aspects, Quantification and Events

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.

    2005-01-01

    Talk overview: the Object Infrastructure Framework (OIF), a system developed to simplify building distributed applications by allowing independent implementation of multiple concerns; the essence and state of AOP; Trinity; quantification over events; and current work on a generalized AOP technology.

  7. Nitrogen quantification with SNMS

    NASA Astrophysics Data System (ADS)

    Goschnick, J.; Natzeck, C.; Sommer, M.

    1999-04-01

    Plasma-based secondary neutral mass spectrometry (plasma SNMS) is a powerful analytical method for determining the elemental concentrations of almost any kind of material at low cost by using a cheap quadrupole mass filter. However, a quadrupole-based mass spectrometer is limited to nominal mass resolution. Atomic signals are sometimes superimposed by molecular signals (two- or three-atom clusters such as CH+ and CH2+, or metal-oxide clusters) and/or by intensities of doubly charged species. In the case of nitrogen especially, several interferences can impede the quantification. This article reports on methods to recognize and deconvolute superpositions of N+ with CH2+, Li2+, and Si2+ at mass 14 Da occurring during analysis of organic and inorganic substances. The recognition is based on the signal patterns of N+, Li+, CH+, and Si+, which serve as indicators for a probable interference of molecular or doubly charged species with N at mass 14 Da. The subsequent deconvolution uses the different shapes of atomic and cluster kinetic-energy distributions (kEDs): the intensity components are determined by a linear fit of the N+ kED and non-atomic kEDs, obtained from several organic and inorganic standards, to the measured kED. The atomic intensity fraction yields a much better nitrogen concentration than the uncorrected total intensity at mass 14 Da.
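
    The deconvolution step described here is, at its core, a linear unmixing of the measured kED into reference component kEDs. A minimal sketch of that step is shown below, assuming reference kEDs have been measured on standards and resampled onto a common energy grid; all curves are invented placeholders, not real calibration data.

      import numpy as np
      from scipy.optimize import nnls

      # Reference kEDs on a common energy grid (columns: N+, CH2+, Li2+, Si2+).
      # The Gaussian shapes below are placeholders standing in for kEDs
      # measured on organic and inorganic standards.
      energy = np.linspace(0.0, 50.0, 200)                 # eV (illustrative)
      ked_N   = np.exp(-(energy - 8.0) ** 2 / 30.0)        # atomic: broad
      ked_CH2 = np.exp(-(energy - 3.0) ** 2 / 4.0)         # cluster: narrow
      ked_Li2 = np.exp(-(energy - 3.5) ** 2 / 5.0)
      ked_Si2 = np.exp(-(energy - 6.0) ** 2 / 12.0)
      A = np.column_stack([ked_N, ked_CH2, ked_Li2, ked_Si2])

      # Measured kED at mass 14 Da for an unknown sample (synthetic mixture).
      measured = 0.7 * ked_N + 0.2 * ked_CH2 + 0.1 * ked_Si2

      # Non-negative least squares keeps all intensity fractions physical.
      weights, _ = nnls(A, measured)
      fractions = weights / weights.sum()
      print(dict(zip(["N+", "CH2+", "Li2+", "Si2+"], fractions.round(3))))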

  8. Quantification of human responses

    NASA Technical Reports Server (NTRS)

    Steinlage, R. C.; Gantner, T. E.; Lim, P. Y. W.

    1992-01-01

    Human perception is a complex phenomenon that is difficult to quantify with instruments. For this reason, large panels of people are often used to elicit and aggregate subjective judgments. Print quality, taste, smell, the sound quality of a stereo system, softness, and the grading of Olympic divers and skaters are examples of situations where subjective measurements or judgments are paramount. We usually express what is in our minds through the medium of language, but languages offer a limited choice of vocabulary, and as a result our verbalizations are only approximate expressions of what we really have in mind. For lack of better methods to quantify subjective judgments, it is customary to set up a numerical scale such as 1-5 or 1-10 for characterizing human responses, with no valid justification except that these scales are easy to understand and convenient to use. But such numerical scales are arbitrary simplifications of the complex human mind; the human mind is not restricted to such simple numerical variations. In fact, human responses and subjective judgments are psychophysical phenomena that are fuzzy entities and therefore difficult to handle with conventional mathematics and probability theory. The fuzzy mathematical approach provides a more realistic insight into understanding and quantifying human responses. This paper presents a method for quantifying human responses and subjective judgments without assuming a pattern of linear or numerical variation. In particular, the quantification and evaluation of linguistic judgments was investigated.
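
    The fuzzy-mathematical idea lends itself to a short illustration: encode linguistic judgments as triangular membership functions, aggregate them across a panel, and defuzzify by centroid. The term definitions and panel ratings below are invented for illustration and are not taken from the paper.

      import numpy as np

      def tri(x, a, b, c):
          """Triangular membership function with feet at a and c, peak at b."""
          return np.maximum(np.minimum((x - a) / (b - a + 1e-12),
                                       (c - x) / (c - b + 1e-12)), 0.0)

      x = np.linspace(0.0, 1.0, 501)          # normalized quality axis
      terms = {                               # linguistic terms (illustrative)
          "poor":      (0.0, 0.0, 0.3),
          "fair":      (0.1, 0.35, 0.6),
          "good":      (0.4, 0.65, 0.9),
          "excellent": (0.7, 1.0, 1.0),
      }

      # A panel gives linguistic ratings instead of forced numerical scores.
      ratings = ["good", "good", "excellent", "fair", "good"]

      # Aggregate by averaging memberships, then defuzzify by centroid.
      agg = np.mean([tri(x, *terms[r]) for r in ratings], axis=0)
      centroid = float((agg * x).sum() / agg.sum())
      print(f"aggregated judgment (centroid on [0, 1]): {centroid:.2f}")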

  9. MAMA Software Features: Quantification Verification Documentation-1

    SciTech Connect

    Ruggiero, Christy E.; Porter, Reid B.

    2014-05-21

    This document reviews the verification of the basic shape quantification attributes in the MAMA software against hand calculations, in order to show that the calculations are mathematically correct and give the expected quantification results.

  10. Statistical Approach to Protein Quantification*

    PubMed Central

    Gerster, Sarah; Kwon, Taejoon; Ludwig, Christina; Matondo, Mariette; Vogel, Christine; Marcotte, Edward M.; Aebersold, Ruedi; Bühlmann, Peter

    2014-01-01

    A major goal in proteomics is the comprehensive and accurate description of a proteome. This task includes not only the identification of proteins in a sample, but also the accurate quantification of their abundance. Although mass spectrometry typically provides information on peptide identity and abundance in a sample, it does not directly measure the concentration of the corresponding proteins. Specifically, most mass-spectrometry-based approaches (e.g. shotgun proteomics or selected reaction monitoring) allow one to quantify peptides using chromatographic peak intensities or spectral counting information. Ultimately, based on these measurements, one wants to infer the concentrations of the corresponding proteins. Inferring properties of the proteins based on experimental peptide evidence is often a complex problem because of the ambiguity of peptide assignments and different chemical properties of the peptides that affect the observed concentrations. We present SCAMPI, a novel generic and statistically sound framework for computing protein abundance scores based on quantified peptides. In contrast to most previous approaches, our model explicitly includes information from shared peptides to improve protein quantitation, especially in eukaryotes with many homologous sequences. The model accounts for uncertainty in the input data, leading to statistical prediction intervals for the protein scores. Furthermore, peptides with extreme abundances can be reassessed and classified as either regular data points or actual outliers. We used the proposed model with several datasets and compared its performance to that of other, previously used approaches for protein quantification in bottom-up mass spectrometry. PMID:24255132

  11. Quantification of wastewater sludge dewatering.

    PubMed

    Skinner, Samuel J; Studer, Lindsay J; Dixon, David R; Hillis, Peter; Rees, Catherine A; Wall, Rachael C; Cavalida, Raul G; Usher, Shane P; Stickland, Anthony D; Scales, Peter J

    2015-10-01

    Quantification and comparison of the dewatering characteristics of fifteen sewage sludges from a range of digestion scenarios are described. The proposed method uses laboratory dewatering measurements and integrity analysis of the extracted material properties. These properties were used as inputs to a model of filtration, whose output provides the dewatering comparison. This approach is shown to be necessary for quantification and comparison of dewaterability, as the permeability and compressibility of the sludges vary by up to ten orders of magnitude over the range of solids concentrations of interest to industry. This causes a high sensitivity of the dewaterability comparison to the starting concentration of the laboratory tests; simple dewaterability comparisons based on parameters such as the specific resistance to filtration are therefore difficult. The new approach is demonstrated to be robust relative to traditional methods such as specific-resistance-to-filtration analysis and has an in-built integrity check. Comparison of the quantified dewaterability of the fifteen sludges with their relative volatile solids content showed a very strong correlation over the volatile solids range from 40 to 80%. The data indicate that volatile solids content is a strong indicator of the dewatering behaviour of sewage sludges. PMID:26003332
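
    For context, the traditional analysis the authors test against is the specific resistance to filtration (SRF), extracted from a constant-pressure filtration test via the classic linearization t/V = (mu*alpha*c / (2*dP*A^2)) * V + mu*Rm / (dP*A). The sketch below computes SRF from the slope of t/V versus V; all numbers are invented for illustration.

      import numpy as np

      t = np.array([30.0, 75.0, 135.0, 210.0, 300.0])   # s, cumulative time
      V = np.array([1.0, 2.0, 3.0, 4.0, 5.0]) * 1e-5    # m^3, cumulative filtrate
      dP = 5.0e4     # Pa, applied pressure difference
      A  = 1.9e-3    # m^2, filter area
      mu = 1.0e-3    # Pa*s, filtrate viscosity
      c  = 25.0      # kg/m^3, deposited solids per unit filtrate volume

      slope, _ = np.polyfit(V, t / V, 1)                # slope of t/V vs V
      srf = 2.0 * slope * dP * A**2 / (mu * c)          # SRF (alpha), m/kg
      print(f"SRF ~ {srf:.2e} m/kg")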

  12. Detection and Quantification of Neurotransmitters in Dialysates

    PubMed Central

    Zapata, Agustin; Chefer, Vladimir I.; Shippenberg, Toni S.; Denoroy, Luc

    2010-01-01

    Sensitive analytical methods are needed for the separation and quantification of neurotransmitters obtained in microdialysate studies. This unit describes methods that permit quantification of nanomolar concentrations of monoamines and their metabolites (high-performance liquid chromatography with electrochemical detection), acetylcholine (HPLC coupled to an enzyme reactor), and amino acids (HPLC with fluorescence detection; capillary electrophoresis with laser-induced fluorescence detection). PMID:19575473

  13. Advancing agricultural greenhouse gas quantification*

    NASA Astrophysics Data System (ADS)

    Olander, Lydia; Wollenberg, Eva; Tubiello, Francesco; Herold, Martin

    2013-03-01

    1. Introduction Better information on greenhouse gas (GHG) emissions and mitigation potential in the agricultural sector is necessary to manage these emissions and identify responses that are consistent with the food security and economic development priorities of countries. Critical activity data (what crops or livestock are managed in what way) are poor or lacking for many agricultural systems, especially in developing countries. In addition, the currently available methods for quantifying emissions and mitigation are often too expensive or complex or not sufficiently user friendly for widespread use. The purpose of this focus issue is to capture the state of the art in quantifying greenhouse gases from agricultural systems, with the goal of better understanding our current capabilities and near-term potential for improvement, with particular attention to quantification issues relevant to smallholders in developing countries. This work is timely in light of international discussions and negotiations around how agriculture should be included in efforts to reduce and adapt to climate change impacts, and considering that significant climate financing to developing countries in post-2012 agreements may be linked to their increased ability to identify and report GHG emissions (Murphy et al 2010, CCAFS 2011, FAO 2011). 2. Agriculture and climate change mitigation The main agricultural GHGs—methane and nitrous oxide—account for 10%-12% of anthropogenic emissions globally (Smith et al 2008), or around 50% and 60% of total anthropogenic methane and nitrous oxide emissions, respectively, in 2005. Net carbon dioxide fluxes between agricultural land and the atmosphere linked to food production are relatively small, although significant carbon emissions are associated with degradation of organic soils for plantations in tropical regions (Smith et al 2007, FAO 2012). Population growth and shifts in dietary patterns toward more meat and dairy consumption will lead to

  14. Protein inference: A protein quantification perspective.

    PubMed

    He, Zengyou; Huang, Ting; Liu, Xiaoqing; Zhu, Peijun; Teng, Ben; Deng, Shengchun

    2016-08-01

    In mass spectrometry-based shotgun proteomics, protein quantification and protein identification are two major computational problems. To quantify protein abundance, a list of proteins must first be inferred from the raw data. Then the relative or absolute protein abundance is estimated with quantification methods, such as spectral counting. Until now, most researchers have treated these two processes separately. In fact, the protein inference problem can be regarded as a special protein quantification problem, in the sense that truly present proteins are those whose abundance values are not zero. Some recently published papers have discussed this possibility conceptually. However, there is still a lack of rigorous experimental studies testing this hypothesis. In this paper, we investigate the feasibility of using protein quantification methods to solve the protein inference problem. Protein inference methods aim to determine whether each candidate protein is present in the sample or not. Protein quantification methods estimate the abundance value of each inferred protein. Naturally, the abundance value of an absent protein should be zero. Thus, we argue that the protein inference problem can be viewed as a special protein quantification problem in which a protein is considered present if its abundance is not zero. Based on this idea, our paper uses three simple protein quantification methods to solve the protein inference problem effectively. The experimental results on six data sets show that these three methods are competitive with previous protein inference algorithms. This demonstrates that it is plausible to model the protein inference problem as a special protein quantification task, which opens the door to devising more effective protein inference algorithms from a quantification perspective. The source code of our methods is available at: http://code.google.com/p/protein-inference/. PMID:26935399
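
    The core idea can be shown with a toy sketch: estimate each protein's abundance with a simple spectral-counting allocation, then declare a protein present exactly when its estimated abundance is nonzero. The peptide-to-protein map and counts are invented, and this allocation rule illustrates the framing only; it is not one of the paper's three methods.

      from collections import defaultdict

      # Toy shotgun data: spectral counts per peptide and the candidate
      # proteins each peptide maps to (shared peptides allowed).
      peptide_counts = {"pepA": 12, "pepB": 3, "pepC": 0, "pepD": 7}
      peptide_to_proteins = {
          "pepA": ["P1"], "pepB": ["P1", "P2"], "pepC": ["P2"], "pepD": ["P3"],
      }

      # Quantification step: split each peptide's count equally among the
      # proteins that share it (one simple allocation rule among many).
      abundance = defaultdict(float)
      for pep, count in peptide_counts.items():
          for prot in peptide_to_proteins[pep]:
              abundance[prot] += count / len(peptide_to_proteins[pep])

      # Inference step: present if and only if the abundance is nonzero.
      present = sorted(p for p, a in abundance.items() if a > 0)
      print(dict(abundance), "->", present)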

  15. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

  16. MAMA Software Features: Visual Examples of Quantification

    SciTech Connect

    Ruggiero, Christy E.; Porter, Reid B.

    2014-05-20

    This document shows examples of the results from quantifying objects of certain sizes and types in the software. It is intended to give users a better feel for some of the quantification calculations, and, more importantly, to help users understand the challenges with using a small set of ‘shape’ quantification calculations for objects that can vary widely in shapes and features. We will add more examples to this in the coming year.

  17. Uncertainty Quantification in Solidification Modelling

    NASA Astrophysics Data System (ADS)

    Fezi, K.; Krane, M. J. M.

    2015-06-01

    Numerical models have been used to simulate solidification processes and to gain insight into physical phenomena that cannot be observed experimentally. Validation of such models has often been done through comparison to a few experiments, or a single one, in which agreement depends on both model and experimental uncertainty. As a first step toward quantifying the uncertainty in the models, sensitivity and uncertainty analyses were performed on a simple steady-state 1D solidification model of the continuous casting of weld filler rod. This model, which includes conduction, advection, and release of latent heat, was developed for use in uncertainty quantification of the calculated positions of the liquidus and solidus and of the solidification time. Using this model, a Smolyak sparse-grid algorithm constructed a response surface that fits model outputs over the range of uncertainty of the model inputs. The response surface was then used to determine the probability density functions (PDFs) of the model outputs and the sensitivities to the inputs. This process was carried out for a linear fraction-solid/temperature relationship, for which there is an analytical solution, and for a Scheil relationship. Similar analysis was also performed on a transient 2D model of solidification in a rectangular domain.
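
    The workflow here (sample the uncertain inputs, fit a response surface to the model outputs, then interrogate the cheap surrogate for output statistics) can be sketched as follows. A closed-form expression stands in for the solidification model, and ordinary least squares on a quadratic basis stands in for the Smolyak sparse-grid construction; all ranges and constants are invented.

      import numpy as np

      rng = np.random.default_rng(0)

      def model(k, L):
          """Stand-in for the solidification model: 'solidification time'
          as a function of thermal conductivity k and latent heat L."""
          return 100.0 / k + 0.004 * L

      # 1) Sample the uncertain inputs over their ranges; run the model.
      k = rng.uniform(20.0, 40.0, 50)       # W/(m K)
      L = rng.uniform(2.0e5, 4.0e5, 50)     # J/kg
      y = model(k, L)

      # 2) Fit a quadratic response surface to the input/output pairs.
      X = np.column_stack([np.ones_like(k), k, L, k * k, k * L, L * L])
      coef, *_ = np.linalg.lstsq(X, y, rcond=None)

      # 3) Monte Carlo on the surrogate to estimate the output distribution.
      ks = rng.uniform(20.0, 40.0, 100_000)
      Ls = rng.uniform(2.0e5, 4.0e5, 100_000)
      Xs = np.column_stack([np.ones_like(ks), ks, Ls, ks * ks, ks * Ls, Ls * Ls])
      ys = Xs @ coef
      print(f"output: mean = {ys.mean():.2f}, std = {ys.std():.2f}")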

  18. Stirling Convertor Fasteners Reliability Quantification

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Korovaichuk, Igor; Kovacevich, Tiodor; Schreiber, Jeffrey G.

    2006-01-01

    Onboard Radioisotope Power Systems (RPS) being developed for NASA's deep-space science and exploration missions require reliable operation for up to 14 years and beyond. Stirling power conversion is a candidate for use in an RPS because it offers a multifold increase in the conversion efficiency of heat to electric power and a reduced inventory of radioactive material. Structural fasteners are responsible for maintaining the structural integrity of the Stirling power convertor, which is critical to reliable performance over the entire mission. The design of fasteners involves variables related to fabrication, manufacturing, the behavior of the fastener and joining-part materials, the structural geometry of the joined components, the size and spacing of fasteners, mission loads, boundary conditions, etc. These variables have inherent uncertainties, which need to be accounted for in the reliability assessment. This paper describes these uncertainties along with a methodology to quantify reliability, and provides results of the analysis in terms of quantified reliability and the sensitivity of Stirling power-conversion reliability to the design variables. Quantification of the reliability includes both structural and functional aspects of the joined components. Based on the results, the paper also gives guidelines to improve reliability and for verification testing.

  19. Uncertainty Quantification in Climate Modeling

    NASA Astrophysics Data System (ADS)

    Sargsyan, K.; Safta, C.; Berry, R.; Debusschere, B.; Najm, H.

    2011-12-01

    We address challenges that sensitivity analysis and uncertainty quantification methods face when dealing with complex computational models. In particular, climate models are computationally expensive and typically depend on a large number of input parameters. We consider the Community Land Model (CLM), which consists of a nested computational grid hierarchy designed to represent the spatial heterogeneity of the land surface. Each computational cell can be composed of multiple land types, and each land type can incorporate one or more sub-models describing the spatial and depth variability. Even for simulations at a regional scale, the computational cost of a single run is quite high and the number of parameters that control the model behavior is very large. Therefore, the parameter sensitivity analysis and uncertainty propagation face significant difficulties for climate models. This work employs several algorithmic avenues to address some of the challenges encountered by classical uncertainty quantification methodologies when dealing with expensive computational models, specifically focusing on the CLM as a primary application. First of all, since the available climate model predictions are extremely sparse due to the high computational cost of model runs, we adopt a Bayesian framework that effectively incorporates this lack-of-knowledge as a source of uncertainty, and produces robust predictions with quantified uncertainty even if the model runs are extremely sparse. In particular, we infer Polynomial Chaos spectral expansions that effectively encode the uncertain input-output relationship and allow efficient propagation of all sources of input uncertainties to outputs of interest. Secondly, the predictability analysis of climate models strongly suffers from the curse of dimensionality, i.e. the large number of input parameters. While single-parameter perturbation studies can be efficiently performed in a parallel fashion, the multivariate uncertainty analysis
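
    For a single standard-normal input, the Polynomial Chaos construction reduces to a few lines: project the model output onto probabilists' Hermite polynomials by Gauss-Hermite quadrature and read moments off the coefficients. The exponential stand-in model is illustrative only; the paper's Bayesian inference of the expansion and the high-dimensional input space are beyond this sketch.

      import numpy as np
      from math import factorial
      from numpy.polynomial.hermite_e import hermegauss, hermeval

      f = lambda xi: np.exp(0.3 * xi)        # stand-in for an expensive model

      # Gauss-Hermite(e) quadrature, normalized to the N(0, 1) measure.
      nodes, weights = hermegauss(20)
      weights = weights / np.sqrt(2.0 * np.pi)

      # Spectral projection: y ~ sum_k c_k He_k(xi), c_k = E[f He_k] / k!.
      order = 5
      coeffs = []
      for k in range(order + 1):
          ek = np.zeros(k + 1)
          ek[k] = 1.0                        # selects He_k inside hermeval
          coeffs.append(np.sum(weights * f(nodes) * hermeval(nodes, ek))
                        / factorial(k))

      # Orthogonality (E[He_j He_k] = k! delta_jk) gives the moments directly.
      mean = coeffs[0]
      var = sum(factorial(k) * c * c for k, c in enumerate(coeffs) if k > 0)
      print(f"PCE mean = {mean:.4f}, variance = {var:.5f}")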

  20. Separation and quantification of microalgal carbohydrates.

    PubMed

    Templeton, David W; Quinn, Matthew; Van Wychen, Stefanie; Hyman, Deborah; Laurens, Lieve M L

    2012-12-28

    Structural carbohydrates can constitute a large fraction of the dry weight of algal biomass, and thus accurate identification and quantification are important for summative mass closure. Two limitations to the accurate characterization of microalgal carbohydrates are the lack of a robust analytical procedure to hydrolyze polymeric carbohydrates to their respective monomers and the subsequent identification and quantification of those monosaccharides. We address the second limitation, chromatographic separation of monosaccharides, by identifying optimum conditions for the resolution of a synthetic mixture of 13 microalgae-specific monosaccharides, comprising 8 neutral sugars, 2 amino sugars, 2 uronic acids, and 1 alditol (myo-inositol, as an internal standard). The synthetic 13-carbohydrate mix showed incomplete resolution across 11 traditional high-performance liquid chromatography (HPLC) methods, but improved resolution and accurate quantification using high-performance anion-exchange chromatography (HPAEC), as well as alditol acetate derivatization followed by gas chromatography (for the neutral and amino sugars only). We demonstrate the application of monosaccharide quantification using optimized chromatography conditions after sulfuric acid analytical hydrolysis for three model algae strains, and compare the quantification and complexity of monosaccharides in analytical hydrolysates relative to a typical terrestrial feedstock, sugarcane bagasse. PMID:23177152

  21. Carotid intraplaque neovascularization quantification software (CINQS).

    PubMed

    Akkus, Zeynettin; van Burken, Gerard; van den Oord, Stijn C H; Schinkel, Arend F L; de Jong, Nico; van der Steen, Antonius F W; Bosch, Johan G

    2015-01-01

    Intraplaque neovascularization (IPN) is an important biomarker of atherosclerotic plaque vulnerability. As IPN can be detected by contrast-enhanced ultrasound (CEUS), imaging biomarkers derived from CEUS may allow early prediction of plaque vulnerability. To select the best quantitative imaging biomarkers for prediction of plaque vulnerability, a systematic analysis of IPN with existing and new analysis algorithms is necessary. Currently available commercial contrast-quantification tools are not applicable to quantitative analysis of carotid IPN because of the substantial motion of the carotid artery, artifacts, and intermittent perfusion of plaques. We therefore developed a specialized software package called Carotid Intraplaque Neovascularization Quantification Software (CINQS). It was designed for effective and systematic comparison of sets of quantitative imaging biomarkers. CINQS includes several analysis algorithms for carotid IPN quantification and overcomes the limitations of current contrast-quantification tools and existing carotid IPN quantification approaches. CINQS has a modular design that allows new analysis tools to be integrated. Wizard-like analysis tools and a graphical user interface facilitate its use. In this paper, we describe the concept, analysis tools, and performance of CINQS and present analysis results for 45 plaques of 23 patients. The results in 45 plaques showed excellent agreement with visual IPN scores for two quantitative imaging biomarkers (areas under the receiver operating characteristic curve of 0.92 and 0.93). PMID:25561454

  22. Quantification of sweat gland innervation

    PubMed Central

    Gibbons, Christopher H.; Illigens, Ben M. W.; Wang, Ningshan; Freeman, Roy

    2009-01-01

    Objective: To evaluate a novel method to quantify the density of nerve fibers innervating sweat glands in healthy control and diabetic subjects, to compare the results to an unbiased stereologic technique, and to identify the relationship to standardized physical examination and patient-reported symptom scores. Methods: Thirty diabetic and 64 healthy subjects had skin biopsies performed at the distal leg and distal and proximal thigh. Nerve fibers innervating sweat glands, stained with PGP 9.5, were imaged by light microscopy. Sweat gland nerve fiber density (SGNFD) was quantified by manual morphometry. As a gold standard, three additional subjects had biopsies analyzed by confocal microscopy using unbiased stereologic quantification. Severity of neuropathy was measured by standardized instruments including the Neuropathy Impairment Score in the Lower Limb (NIS-LL), while symptoms were measured by the Michigan Neuropathy Screening Instrument. Results: SGNFD quantified by manual morphometry correlated strongly with unbiased stereology (r = 0.93, p < 0.01). Diabetic subjects had reduced SGNFD compared to controls at the distal leg (p < 0.001), distal thigh (p < 0.01), and proximal thigh (p < 0.05). The SGNFD at the distal leg of diabetic subjects decreased as the NIS-LL worsened (r = −0.89, p < 0.001) and was concordant with symptoms of reduced sweat production (p < 0.01). Conclusions: We describe a novel method to quantify the density of nerve fibers innervating sweat glands. The technique differentiates groups of patients with mild diabetic neuropathy from healthy control subjects and correlates with both physical examination scores and symptoms relevant to sudomotor dysfunction. This method provides a reliable structural measure of sweat gland innervation that complements the investigation of small fiber neuropathies. GLOSSARY AOI = area of interest; CI = confidence interval; ICC = intraclass correlation coefficient; IENFD = intraepidermal nerve fiber density; IgG = immunoglobulin G; NIS

  23. Tumor Quantification in Clinical Positron Emission Tomography

    PubMed Central

    Bai, Bing; Bading, James; Conti, Peter S

    2013-01-01

    Positron emission tomography (PET) is used extensively in clinical oncology for tumor detection, staging and therapy response assessment. Quantitative measurements of tumor uptake, usually in the form of standardized uptake values (SUVs), have enhanced or replaced qualitative interpretation. In this paper we review the current status of tumor quantification methods and their applications to clinical oncology. Factors that impede quantitative assessment and limit its accuracy and reproducibility are summarized, with special emphasis on SUV analysis. We describe current efforts to improve the accuracy of tumor uptake measurements, characterize overall metabolic tumor burden and heterogeneity of tumor uptake, and account for the effects of image noise. We also summarize recent developments in PET instrumentation and image reconstruction and their impact on tumor quantification. Finally, we offer our assessment of the current development needs in PET tumor quantification, including practical techniques for fully quantitative, pharmacokinetic measurements. PMID:24312151

  24. HPC Analytics Support. Requirements for Uncertainty Quantification Benchmarks

    SciTech Connect

    Paulson, Patrick R.; Purohit, Sumit; Rodriguez, Luke R.

    2015-05-01

    This report outlines techniques for extending benchmark-generation products so that they support uncertainty quantification by benchmarked systems. We describe how uncertainty quantification requirements can be presented to candidate analytical tools supporting SPARQL, describe benchmark data sets for evaluating uncertainty quantification, and outline an approach for using our benchmark generator to produce such data sets.

  25. Colour thresholding and objective quantification in bioimaging

    NASA Technical Reports Server (NTRS)

    Fermin, C. D.; Gerber, M. A.; Torre-Bueno, J. R.

    1992-01-01

    Computer imaging is rapidly becoming an indispensable tool for the quantification of variables in research and medicine. Whilst its use in medicine has largely been limited to qualitative observations, imaging in applied basic sciences, medical research and biotechnology demands objective quantification of the variables in question. In black-and-white densitometry (256 intensity levels), the separation of subtle differences between closely related hues from stains is sometimes very difficult. True-colour and real-time video microscopy analysis offer choices not previously available with monochrome systems. In this paper we demonstrate the usefulness of colour thresholding, which has so far proven indispensable for proper objective quantification of the products of histochemical reactions and/or subtle differences in tissues and cells. In addition, we provide interested but untrained readers with basic information that may assist decisions regarding the most suitable set-up for a project under consideration. Data from projects in progress at Tulane are shown to illustrate the advantage of colour thresholding over monochrome densitometry and for objective quantification of subtle colour differences between experimental and control samples.

  26. Quantification of Cannabinoid Content in Cannabis

    NASA Astrophysics Data System (ADS)

    Tian, Y.; Zhang, F.; Jia, K.; Wen, M.; Yuan, Ch.

    2015-09-01

    Cannabis is an economically important plant that is used in many fields, in addition to being the most commonly consumed illicit drug worldwide. Monitoring the spatial distribution of cannabis cultivation and judging whether plants are drug- or fiber-type cannabis is critical for governments and international communities seeking to understand the scale of the illegal drug trade. The aim of this study was to investigate whether the cannabinoid content of cannabis could be quantified spectrally and to identify the optimal wavebands for doing so. Spectral reflectance data of dried cannabis leaf samples and of the cannabis canopy were measured in the laboratory and in the field, respectively. Correlation analysis and stepwise multivariate regression were used to select the optimal wavebands for cannabinoid content quantification from the laboratory-measured spectral data. The results indicated that the delta-9-tetrahydrocannabinol (THC) content of cannabis leaves could be quantified using laboratory-measured spectral reflectance data and that the 695 nm band is the optimal band for THC quantification. This study provides prerequisite information for designing spectral equipment to enable immediate quantification of THC content in cannabis and to discriminate drug- from fiber-type cannabis in the field based on THC content.

  27. Cues, quantification, and agreement in language comprehension.

    PubMed

    Tanner, Darren; Bulkes, Nyssa Z

    2015-12-01

    We investigated factors that affect the comprehension of subject-verb agreement in English, using quantification as a window into the relationship between morphosyntactic processes in language production and comprehension. Event-related brain potentials (ERPs) were recorded while participants read sentences with grammatical and ungrammatical verbs, in which the plurality of the subject noun phrase was either doubly marked (via overt plural quantification and morphological marking on the noun) or singly marked (via only plural morphology on the noun). Both acceptability judgments and the ERP data showed heightened sensitivity to agreement violations when quantification provided an additional cue to the grammatical number of the subject noun phrase, over and above plural morphology. This is consistent with models of grammatical comprehension that emphasize feature prediction in tandem with cue-based memory retrieval. Our results additionally contrast with those of prior studies that showed no effects of plural quantification on agreement in language production. These findings therefore highlight some nontrivial divergences in the cues and mechanisms supporting morphosyntactic processing in language production and comprehension. PMID:25987192

  28. Adaptive Fourier modeling for quantification of tremor.

    PubMed

    Riviere, C N; Reich, S G; Thakor, N V

    1997-06-01

    A new computational method for quantification of tremor, the weighted frequency Fourier linear combiner (WFLC), is presented. This technique rapidly determines the frequency and amplitude of tremor by adjusting its filter weights according to a gradient search method. It provides continual tracking of frequency and amplitude modulations over the course of a test. By quantifying time-varying characteristics, the WFLC assists in correctly interpreting the results of spectral analysis, particularly for recordings exhibiting multiple spectral peaks. It therefore supplements spectral analysis, providing a more accurate picture of tremor than spectral analysis alone. The method has been incorporated into a desktop tremor measurement system to provide clinically useful analysis of tremor recorded during handwriting and drawing using a digitizing tablet. Simulated data clearly demonstrate tracking of variations in frequency and amplitude. Clinical recordings then show specific examples of quantification of time-varying aspects of tremor. PMID:9210577
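
    The WFLC update rules are compact enough to sketch. Below is a minimal single-harmonic version assuming the commonly published form of the algorithm, an LMS update of the Fourier weights plus a gradient step on the frequency estimate; the gains and synthetic test signal are illustrative and would need tuning for real tremor recordings.

      import numpy as np

      def wflc(signal, fs, M=1, mu=0.02, mu0=0.005, f_init=4.0):
          """Track the dominant frequency and amplitude of a quasi-periodic
          signal, sample by sample (weighted-frequency Fourier linear
          combiner sketch)."""
          w = np.zeros(2 * M)                 # sine/cosine weight banks
          w0 = 2.0 * np.pi * f_init           # angular-frequency estimate, rad/s
          phase = 0.0
          r = np.arange(1, M + 1)
          freq, amp = [], []
          for s in signal:
              phase += w0 / fs                # accumulate phase
              x = np.concatenate([np.sin(r * phase), np.cos(r * phase)])
              err = s - w @ x
              # Gradient step on the squared error w.r.t. the frequency:
              w0 += 2.0 * mu0 * err * np.sum(r * (w[:M] * x[M:] - w[M:] * x[:M]))
              # LMS step on the Fourier weights (amplitude and phase):
              w += 2.0 * mu * err * x
              freq.append(w0 / (2.0 * np.pi))
              amp.append(np.hypot(w[0], w[M]))   # fundamental amplitude
          return np.array(freq), np.array(amp)

      # Synthetic "tremor": 5 Hz oscillation with slowly drifting amplitude.
      fs = 200.0
      t = np.arange(0.0, 10.0, 1.0 / fs)
      rng = np.random.default_rng(0)
      sig = (1.0 + 0.3 * np.sin(2.0 * np.pi * 0.2 * t)) * np.sin(2.0 * np.pi * 5.0 * t)
      sig = sig + 0.05 * rng.standard_normal(t.size)

      f_est, a_est = wflc(sig, fs)
      print(f"frequency estimate after 10 s: {f_est[-1]:.2f} Hz")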

  29. Uncertainty quantification for porous media flows

    NASA Astrophysics Data System (ADS)

    Christie, Mike; Demyanov, Vasily; Erbas, Demet

    2006-09-01

    Uncertainty quantification is an increasingly important aspect of many areas of computational science, where the challenge is to make reliable predictions about the performance of complex physical systems in the absence of complete or reliable data. Predicting flows of oil and water through oil reservoirs is an example of a complex system where accuracy in prediction is needed primarily for financial reasons. Simulation of fluid flow in oil reservoirs is usually carried out using large commercially written finite difference simulators solving conservation equations describing the multi-phase flow through the porous reservoir rocks. This paper examines a Bayesian Framework for uncertainty quantification in porous media flows that uses a stochastic sampling algorithm to generate models that match observed data. Machine learning algorithms are used to speed up the identification of regions in parameter space where good matches to observed data can be found.
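
    A minimal sketch of the stochastic-sampling step: random-walk Metropolis sampling of model parameters conditioned on observed production data, with a cheap closed-form stand-in for the reservoir simulator. Everything below is invented for illustration; the paper's sampler and misfit are more elaborate, and the machine-learning speedup is omitted.

      import numpy as np

      rng = np.random.default_rng(1)

      def simulate(theta, t):
          """Stand-in 'simulator': water-cut curve from two unknowns
          (a permeability multiplier and an aquifer-strength parameter)."""
          kmul, aq = theta
          return 1.0 / (1.0 + np.exp(-(t - 5.0 / kmul) * aq))

      t_obs = np.linspace(0.0, 10.0, 25)
      d_obs = simulate(np.array([1.2, 0.8]), t_obs)
      d_obs = d_obs + 0.02 * rng.standard_normal(t_obs.size)

      def log_post(theta):
          if np.any(theta <= 0.0) or np.any(theta > 5.0):  # uniform prior box
              return -np.inf
          resid = simulate(theta, t_obs) - d_obs
          return -0.5 * np.sum(resid ** 2) / 0.02 ** 2     # Gaussian likelihood

      theta = np.array([1.0, 1.0])
      lp = log_post(theta)
      chain = []
      for _ in range(20_000):
          prop = theta + 0.05 * rng.standard_normal(2)
          lpp = log_post(prop)
          if np.log(rng.random()) < lpp - lp:              # Metropolis accept
              theta, lp = prop, lpp
          chain.append(theta)
      chain = np.array(chain[5_000:])                      # discard burn-in
      print("posterior mean:", chain.mean(axis=0).round(3))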

  30. Uncertainty quantification of effective nuclear interactions

    NASA Astrophysics Data System (ADS)

    Pérez, R. Navarro; Amaro, J. E.; Arriola, E. Ruiz

    2016-03-01

    We give a brief review of the development of phenomenological NN interactions and the corresponding quantification of statistical uncertainties. We look into the uncertainty of effective interactions broadly used in mean-field calculations, through the Skyrme parameters and effective-field-theory counterterms, estimating both the statistical and systematic uncertainties stemming from the NN interaction. We also comment on the role played by different fitting strategies in the light of recent developments.

  31. Whitepaper on Uncertainty Quantification for MPACT

    SciTech Connect

    Williams, Mark L.

    2015-12-17

    The MPACT code provides the ability to perform high-fidelity deterministic calculations to obtain a wide variety of detailed results for very complex reactor core models. However, MPACT currently does not have the capability to propagate the effects of input data uncertainties into uncertainties in the calculated results. This white paper discusses a potential method for MPACT uncertainty quantification (UQ) based on stochastic sampling.

  32. Multiplexed quantification for data-independent acquisition.

    PubMed

    Minogue, Catherine E; Hebert, Alexander S; Rensvold, Jarred W; Westphall, Michael S; Pagliarini, David J; Coon, Joshua J

    2015-03-01

    Data-independent acquisition (DIA) strategies provide a sensitive and reproducible alternative to data-dependent acquisition (DDA) methods for large-scale quantitative proteomic analyses. Unfortunately, DIA methods suffer from incompatibility with common multiplexed quantification methods, specifically stable isotope labeling approaches such as isobaric tags and stable isotope labeling of amino acids in cell culture (SILAC). Here we expand the use of neutron-encoded (NeuCode) SILAC to DIA applications (NeuCoDIA), producing a strategy that enables multiplexing within DIA scans without further convoluting the already complex MS2 spectra. We demonstrate duplex NeuCoDIA analysis of both mixed-ratio (1:1 and 10:1) yeast and mouse embryo myogenesis proteomes. Analysis of the mixed-ratio yeast samples revealed the strong accuracy and precision of our NeuCoDIA method, both of which were comparable to our established MS1-based quantification approach. NeuCoDIA also uncovered the dynamic protein changes that occur during myogenic differentiation, demonstrating the feasibility of this methodology for biological applications. We consequently establish DIA quantification of NeuCode SILAC as a useful and practical alternative to DDA-based approaches. PMID:25621425

  33. Fluorescence-linked Antigen Quantification (FLAQ) Assay for Fast Quantification of HIV-1 p24Gag

    PubMed Central

    Gesner, Marianne; Maiti, Mekhala; Grant, Robert; Cavrois, Marielle

    2016-01-01

    The fluorescence-linked antigen quantification (FLAQ) assay allows fast quantification of the HIV-1 p24Gag antigen. Viral supernatants are lysed and incubated with polystyrene microspheres coated with polyclonal antibodies against HIV-1 p24Gag and with detector antibodies conjugated to fluorochromes (Figure 1). After washing, the fluorescence of the microspheres is measured by flow cytometry and reflects the abundance of the antigen in the lysate. The speed, simplicity, and wide dynamic range of the FLAQ assay make it well suited to many applications performed in HIV-1 research laboratories.

  34. QconCAT: Internal Standard for Protein Quantification.

    PubMed

    Scott, Kerry Bauer; Turko, Illarion V; Phinney, Karen W

    2016-01-01

    Protein quantification based on stable isotope labeling-mass spectrometry involves adding known quantities of stable isotope-labeled internal standards into biological samples. The internal standards are analogous to analyte molecules and quantification is achieved by comparing signals from isotope-labeled and analyte molecules. This methodology is broadly applicable to proteomics research, biomarker discovery and validation, and clinical studies, which require accurate and precise protein abundance measurements. One such internal standard platform for protein quantification is concatenated peptides (QconCAT). This chapter describes a protocol for the design, expression, characterization, and application of the QconCAT strategy for protein quantification. PMID:26791984

  35. Development of a VHH-Based Erythropoietin Quantification Assay.

    PubMed

    Kol, Stefan; Kallehauge, Thomas Beuchert; Adema, Simon; Hermans, Pim

    2015-08-01

    Erythropoietin (EPO) quantification during cell line selection and bioreactor cultivation has traditionally been performed with ELISA or HPLC. As these techniques suffer from several drawbacks, we developed a novel EPO quantification assay. A camelid single-domain antibody fragment directed against human EPO was evaluated as a capturing antibody in a label-free biolayer interferometry-based quantification assay. Human recombinant EPO can be specifically detected in Chinese hamster ovary cell supernatants in a sensitive and pH-dependent manner. This method enables rapid and robust quantification of EPO in a high-throughput setting. PMID:25764454

  36. Quantification of Tissue Properties in Small Volumes

    SciTech Connect

Mourant, J.; et al.

    2000-12-01

    The quantification of tissue properties by optical measurements will facilitate the development of noninvasive methods of cancer diagnosis and detection. Optical measurements are sensitive to tissue structure which is known to change during tumorigenesis. The goals of the work presented in this paper were to verify that the primary scatterers of light in cells are structures much smaller than the nucleus and then to develop an optical technique that can quantify parameters of structures the same size as the scattering features in cells. Polarized, elastic back-scattering was found to be able to quantify changes in scattering properties for turbid media consisting of scatterers of the size found in tissue.

  37. Tutorial examples for uncertainty quantification methods.

    SciTech Connect

    De Bord, Sarah

    2015-08-01

    This report details the work accomplished during my 2015 SULI summer internship at Sandia National Laboratories in Livermore, CA. During this internship, I worked on multiple tasks with the common goal of making uncertainty quantification (UQ) methods more accessible to the general scientific community. As part of my work, I created a comprehensive numerical integration example to incorporate into the user manual of a UQ software package. Further, I developed examples involving heat transfer through a window to incorporate into tutorial lectures that serve as an introduction to UQ methods.

  38. Adjoint-Based Uncertainty Quantification with MCNP

    SciTech Connect

    Seifried, Jeffrey E.

    2011-09-01

    This work quantifies the instantaneous uncertainties in neutron transport simulations arising from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties with respect to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.
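
    The propagation step in this kind of analysis is commonly the first-order "sandwich rule", which combines a sensitivity vector (such as one obtained from adjoint calculations) with a nuclear-data covariance matrix. The numbers below are illustrative stand-ins, not values from the LIFE blanket study.

      import numpy as np

      # Sandwich rule: var(R)/R^2 = S^T C S, with S the relative
      # sensitivities (dR/R per dp/p) and C the relative covariance
      # matrix of the nuclear-data parameters.
      S = np.array([0.9, -0.3, 0.05])
      C = np.array([[4.0e-4, 1.0e-4, 0.0],
                    [1.0e-4, 9.0e-4, 0.0],
                    [0.0,    0.0,    2.5e-3]])
      rel_var = S @ C @ S
      print(f"relative std of response: {np.sqrt(rel_var) * 100:.2f}%")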

  39. Concurrent quantification of multiple nanoparticle bound states

    PubMed Central

    Rauwerdink, Adam M.; Weaver, John B.

    2011-01-01

    Purpose: The binding of nanoparticles to in vivo targets impacts their use for medical imaging, therapy, and the study of diseases and disease biomarkers. Though an array of techniques can detect binding in vitro, the search for a robust in vivo method continues. The spectral response of magnetic nanoparticles can be influenced by a variety of changes in their physical environment including viscosity and binding. Here, the authors show that nanoparticles in these different environmental states produce spectral responses, which are sufficiently unique to allow for simultaneous quantification of the proportion of nanoparticles within each state. Methods: The authors measured the response to restricted Brownian motion using an array of magnetic nanoparticle designs. With a chosen optimal particle type, the authors prepared particle samples in three distinct environmental states. Various combinations of particles within these three states were measured concurrently and the authors attempted to solve for the quantity of particles within each physical state. Results: The authors found the spectral response of the nanoparticles to be sufficiently unique to allow for accurate quantification of up to three bound states with errors on the order of 1.5%. Furthermore, the authors discuss numerous paths for translating these measurements to in vivo applications. Conclusions: Multiple nanoparticle environmental states can be concurrently quantified using the spectral response of the particles. Such an ability, if translated to the in vivo realm, could provide valuable information about the fate of nanoparticles in vivo or improve the efficacy of nanoparticle based treatments. PMID:21520825

  40. Virus detection and quantification using electrical parameters

    PubMed Central

    Ahmad, Mahmoud Al; Mustafa, Farah; Ali, Lizna M.; Rizvi, Tahir A.

    2014-01-01

    Here we identify and quantitate two similar viruses, human and feline immunodeficiency viruses (HIV and FIV), suspended in a liquid medium without labeling, using a semiconductor technique. The virus count was estimated by calculating the impurities inside a defined volume by observing the change in electrical parameters. Empirically, the virus count was similar to the absolute value of the ratio of the change of the virus suspension dopant concentration relative to the mock dopant over the change in virus suspension Debye volume relative to mock Debye volume. The virus type was identified by constructing a concentration-mobility relationship which is unique for each kind of virus, allowing for a fast (within minutes) and label-free virus quantification and identification. For validation, the HIV and FIV virus preparations were further quantified by a biochemical technique and the results obtained by both approaches corroborated well. We further demonstrate that the electrical technique could be applied to accurately measure and characterize silica nanoparticles that resemble the virus particles in size. Based on these results, we anticipate our present approach to be a starting point towards establishing the foundation for label-free electrical-based identification and quantification of an unlimited number of viruses and other nano-sized particles. PMID:25355078
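
    The verbal counting relation above can be written compactly. The symbols are my own naming (illustrative): N_v is the estimated virus count, n the extracted dopant concentration, V_D the Debye volume, with subscripts s and m for the virus suspension and the mock control.

      % LaTeX rendering of the empirical counting relation
      N_v \approx \left| \frac{n_s - n_m}{V_{D,s} - V_{D,m}} \right|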

  41. Quantification of ontogenetic allometry in ammonoids.

    PubMed

    Korn, Dieter

    2012-01-01

    Ammonoids are well-known objects for studies of ontogeny and phylogeny, but a quantification of ontogenetic change has not yet been carried out. Their planispirally coiled conchs allow the study of "longitudinal" ontogenetic data, that is, data on ontogenetic trajectories that can be obtained from a single specimen. They therefore provide a good model for ontogenetic studies of geometry in other shelled organisms. Using modifications of three cardinal conch dimensions, computer simulations can model artificial conchs. The trajectories of ontogenetic allometry of these simulations can be analyzed in great detail in a theoretical morphospace. A method for the classification of conch ontogeny and the quantification of the degree of allometry is proposed. Using high-precision cross-sections, the allometric conch growth of real ammonoids can be documented and compared. The members of the Ammonoidea show a wide variety of allometric growth, ranging from near isometry to monophasic, biphasic, or polyphasic allometry. Selected examples of Palaeozoic and Mesozoic ammonoids are shown with respect to their degree of conch change during ontogeny. PMID:23134208
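
    One concrete way to quantify an ontogenetic trajectory is to track a conch index, such as the whorl expansion rate WER = (dm / (dm - ah))^2, against conch diameter. The sketch below does this for invented measurements of diameter (dm) and apertural height (ah); it illustrates the general approach rather than the paper's specific morphospace analysis.

      import numpy as np

      # Measurements at successive growth stages on one cross-section
      # (invented for illustration).
      dm = np.array([2.0, 3.1, 4.9, 7.8, 12.4, 19.7, 31.2])   # conch diameter, mm
      ah = np.array([0.7, 1.1, 1.8, 2.9, 4.4, 6.6, 9.8])      # apertural height, mm

      wer = (dm / (dm - ah)) ** 2           # whorl expansion rate per stage

      # Degree of allometry: slope of the index against log conch diameter;
      # a near-zero slope indicates isometry, a nonzero slope allometry.
      slope = np.polyfit(np.log(dm), wer, 1)[0]
      print("WER per stage:", wer.round(3))
      print(f"allometry slope (dWER / dlog dm): {slope:+.4f}")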

  42. Quantification noise in single cell experiments

    PubMed Central

    Reiter, M.; Kirchner, B.; Müller, H.; Holzhauer, C.; Mann, W.; Pfaffl, M. W.

    2011-01-01

    In quantitative single-cell studies, the critical issues are the low amount of nucleic acids present and the resulting experimental variation. In addition, biological data obtained from heterogeneous tissue do not reflect the expression behaviour of every single cell. These variations can derive from natural biological variance or can be introduced externally; both have negative effects on the quantification result. The aim of this study is to make quantitative single-cell studies more transparent and reliable in order to fulfil the MIQE guidelines at the single-cell level. The technical variability introduced by RT, pre-amplification, evaporation, biological material, and qPCR itself was evaluated using RNA or DNA standards. Secondly, the biological expression variance of GAPDH, TNFα, IL-1β, and TLR4 was measured in an mRNA profiling experiment on single lymphocytes. The quantification setup used was sensitive enough to detect single standard copies and transcripts from one solitary cell. Most variability was introduced by RT, followed by evaporation and pre-amplification. The qPCR analysis and the biological matrix introduced only minor variability. Both studies impressively demonstrate the heterogeneity of expression patterns in individual cells and clearly show today's limitations in quantitative single-cell expression analysis. PMID:21745823

  5. Simple quantification of in planta fungal biomass.

    PubMed

    Ayliffe, Michael; Periyannan, Sambasivam K; Feechan, Angela; Dry, Ian; Schumann, Ulrike; Lagudah, Evans; Pryor, Anthony

    2014-01-01

    An accurate assessment of the disease resistance status of plants to fungal pathogens is an essential requirement for the development of resistant crop plants. Many disease resistance phenotypes are partial rather than obvious immunity and are frequently scored using subjective, qualitative estimates of pathogen development or plant disease symptoms. Here we report a method for the accurate comparison of total fungal biomass in plant tissues. This method, called the WAC assay, is based upon the specific binding of the plant lectin wheat germ agglutinin to fungal chitin. The assay is simple, high-throughput, and sensitive enough to discriminate between single Puccinia graminis f. sp. tritici infection sites on a wheat leaf segment. It lends itself well to replication, as large volumes of tissue can be pooled from independent experiments and assayed to provide truly representative quantification; alternatively, fungal growth on a single, small leaf segment can be quantified. In addition, as the assay is based upon a microscopic technique, pathogen infection sites can be examined at high magnification prior to quantification if desired, and average infection site areas can be determined. Previously, we demonstrated the application of the WAC assay for quantifying the growth of several different pathogen species in both glasshouse-grown material and large-scale field plots. Details of the method are provided within. PMID:24643560

  6. Centerline optimization using vessel quantification model

    NASA Astrophysics Data System (ADS)

    Cai, Wenli; Dachille, Frank; Meissner, Michael

    2005-04-01

    An accurate and reproducible centerline is needed in many vascular applications, such as virtual angioscopy, vessel quantification, and surgery planning. This paper presents a progressive optimization algorithm to refine a centerline after it is extracted. A new centerline model definition is proposed that allows the minimum cross-sectional area to be quantified. A centerline is divided into a number of segments, each corresponding to a local generalized cylinder. A reference frame (cross-section) is set up at the center point of each cylinder. The position and the orientation of the cross-section are optimized within each cylinder by finding the minimum cross-sectional area. All locally optimized center points are then approximated globally by a NURBS curve, and the curve is re-sampled to give the refined set of center points. This refinement iteration, local optimization plus global approximation, converges to the optimal centerline, yielding a smooth and accurate central axis curve. The applications discussed in this paper are vessel quantification and virtual angioscopy. However, the algorithm is a general centerline refinement method that can be applied to other applications that need accurate and reproducible centerlines.
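
    A minimal Python sketch of the local-optimization-plus-global-approximation loop described above. The callable area_fn stands in for the user's segmentation-based cross-sectional area measurement (hypothetical here), and a smoothing B-spline takes the place of the paper's NURBS approximation:

        import numpy as np
        from scipy.interpolate import splprep, splev

        def unit_vectors(rng, n):
            """n random unit vectors, used as candidate cross-section normals."""
            v = rng.standard_normal((n, 3))
            return v / np.linalg.norm(v, axis=1, keepdims=True)

        def refine_centerline(points, area_fn, n_pos=8, n_orient=32, radius=0.5, n_iter=5):
            """Refine a centerline by alternating local optimization and global smoothing.

            points  : (N, 3) initial centerline estimate
            area_fn : callable(center, normal) -> cross-sectional area of the vessel
                      cut by the plane through center with the given unit normal
            """
            pts = np.asarray(points, dtype=float)
            rng = np.random.default_rng(0)
            for _ in range(n_iter):
                refined = []
                for p in pts:
                    # Local step: sample candidate centers and plane orientations,
                    # keep the pair giving the minimum cross-sectional area.
                    cands = p + radius * rng.standard_normal((n_pos, 3))
                    cands[0] = p  # always include the current point
                    _, best = min((area_fn(c, n), tuple(c))
                                  for c in cands for n in unit_vectors(rng, n_orient))
                    refined.append(best)
                # Global step: approximate the local optima with a smoothing
                # B-spline and re-sample it to the refined set of center points.
                tck, _ = splprep(np.array(refined).T, s=0.01 * len(pts))
                pts = np.array(splev(np.linspace(0.0, 1.0, len(pts)), tck)).T
            return pts

        # Toy usage: a straight vessel along z; this stand-in "area" is smallest
        # for planes perpendicular to z with centres on the axis.
        toy_area = lambda c, n: np.pi / (abs(n[2]) + 1e-6) + c[0]**2 + c[1]**2
        rng = np.random.default_rng(1)
        init = np.column_stack([0.3 * rng.standard_normal((20, 2)),
                                np.linspace(0.0, 10.0, 20)])
        print(refine_centerline(init, toy_area)[:3])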

  7. Quantification of prebiotics in commercial infant formulas.

    PubMed

    Sabater, Carlos; Prodanov, Marin; Olano, Agustín; Corzo, Nieves; Montilla, Antonia

    2016-03-01

    Since breastfeeding is not always possible, infant formulas (IFs) are supplemented with prebiotic oligosaccharides, such as galactooligosaccharides (GOS) and/or fructooligosaccharides (FOS), to exert effects similar to those of breast milk. Nowadays, a great number of infant formulas enriched with prebiotics are available on the market; however, data on their composition are scarce. In this study, two chromatographic methods (GC-FID and HPLC-RID) were used in combination for the quantification of carbohydrates present in commercial infant formulas. According to the results obtained by GC-FID for products containing prebiotics, the contents of FOS, GOS and GOS/FOS were in the ranges of 1.6-5.0, 1.7-3.2, and 0.08-0.25/2.3-3.8 g/100 g of product, respectively. HPLC-RID analysis allowed quantification of maltodextrins with a degree of polymerization (DP) up to 19. The methodology proposed here may be used for routine quality control of infant formulas and other food ingredients containing prebiotics. PMID:26471520

  8. Survey and Evaluate Uncertainty Quantification Methodologies

    SciTech Connect

    Lin, Guang; Engel, David W.; Eslinger, Paul W.

    2012-02-01

    The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that will develop and deploy state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately widespread deployment at hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models with uncertainty quantification, optimization, risk analysis and decision making capabilities. The CCSI Toolset will incorporate commercial and open-source software currently in use by industry and will also develop new software tools as necessary to fill technology gaps identified during execution of the project. The CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. The goal of CCSI is to deliver a toolset that can simulate the scale-up of a broad set of new carbon capture technologies from laboratory scale to full commercial scale. To provide a framework around which the toolset can be developed and demonstrated, we will focus on three Industrial Challenge Problems (ICPs) related to carbon capture technologies relevant to U.S. pulverized coal (PC) power plants. Post combustion capture by solid sorbents is the technology focus of the initial ICP (referred to as ICP A). The goal of the uncertainty quantification (UQ) task (Task 6) is to provide a set of capabilities to the user community for the quantification of uncertainties associated with the carbon capture simulation tools.

  9. Quantification of adipose tissue insulin sensitivity.

    PubMed

    Søndergaard, Esben; Jensen, Michael D

    2016-06-01

    In metabolically healthy humans, adipose tissue is exquisitely sensitive to insulin. Similar to muscle and liver, adipose tissue lipolysis is insulin resistant in adults with central obesity and type 2 diabetes. Perhaps uniquely, however, insulin resistance in adipose tissue may directly contribute to the development of insulin resistance in muscle and liver because of the increased delivery of free fatty acids to those tissues. It has been hypothesized that adipose tissue insulin resistance may precede other metabolic defects in obesity and type 2 diabetes. Therefore, precise and reproducible in vivo quantification of adipose tissue insulin sensitivity in humans is important. Unfortunately, no consensus exists on how to determine adipose tissue insulin sensitivity. We review the methods available to quantify adipose tissue insulin sensitivity and discuss their strengths and weaknesses. PMID:27073214

  10. Quantification of Glutathione in Caenorhabditis elegans

    PubMed Central

    Caito, Samuel W.; Aschner, Michael

    2015-01-01

    Glutathione (GSH) is the most abundant intracellular thiol, with diverse functions in redox signaling, xenobiotic detoxification, and apoptosis. The quantification of GSH is an important measure of redox capacity and oxidative stress. This protocol quantifies total GSH from Caenorhabditis elegans, an emerging model organism for toxicology studies. GSH is measured using the 5,5′-dithiobis-(2-nitrobenzoic acid) (DTNB) cycling method, originally created for cell and tissue samples but optimized here for whole-worm extracts. DTNB reacts with GSH to form a 5′-thio-2-nitrobenzoic acid (TNB) chromophore with an absorbance maximum at 412 nm. This method is both rapid and sensitive, making it ideal for studies involving a large number of transgenic nematode strains. PMID:26309452
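
    In the cycling variant of the DTNB assay, the rate of TNB formation at 412 nm is proportional to total GSH, so samples are read against a standard curve. A minimal Python sketch with hypothetical rates and concentrations:

        import numpy as np

        # Hypothetical rates of A412 increase (mOD/min) for GSH standards and
        # two worm-extract samples in the DTNB recycling assay.
        std_conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0])     # microM total GSH
        std_rate = np.array([2.1, 7.9, 14.2, 26.8, 52.5])  # mOD/min
        sample_rate = np.array([18.7, 33.4])               # mOD/min

        # Fit the linear standard curve: rate = m * conc + b.
        m, b = np.polyfit(std_conc, std_rate, 1)

        # Invert the curve for the samples; in practice the result is then
        # corrected for dilution and normalized to protein content.
        sample_conc = (sample_rate - b) / m
        print("Total GSH (microM):", np.round(sample_conc, 2))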

  11. Recurrence quantification analysis of global stock markets

    NASA Astrophysics Data System (ADS)

    Bastos, João A.; Caiado, Jorge

    2011-04-01

    This study investigates the presence of deterministic dependencies in international stock markets using recurrence plots and recurrence quantification analysis (RQA). The results are based on a large set of free float-adjusted market capitalization stock indices, covering a period of 15 years. The statistical tests suggest that the dynamics of stock prices in emerging markets is characterized by higher values of RQA measures when compared to their developed counterparts. The behavior of stock markets during critical financial events, such as the burst of the technology bubble, the Asian currency crisis, and the recent subprime mortgage crisis, is analyzed by performing RQA in sliding windows. It is shown that during these events stock markets exhibit a distinctive behavior that is characterized by temporary decreases in the fraction of recurrence points contained in diagonal and vertical structures.
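
    A minimal recurrence quantification sketch in Python, computing the recurrence rate and determinism (the fraction of recurrence points in diagonal structures, the quantity whose temporary decrease is described above) for a synthetic return series; the embedding parameters and threshold heuristic are illustrative:

        import numpy as np

        def rqa_measures(x, dim=3, tau=1, eps=None, lmin=2):
            """Recurrence rate (RR) and determinism (DET) for a scalar series,
            via time-delay embedding and a thresholded distance matrix."""
            n = len(x) - (dim - 1) * tau
            emb = np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])
            d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=2)
            if eps is None:
                eps = 0.1 * d.max()      # common heuristic: 10% of max distance
            R = (d < eps).astype(int)
            rr = R.sum() / R.size        # recurrence rate
            # DET: recurrence points on diagonal lines of length >= lmin (the
            # line of identity is kept here for brevity; standard RQA drops it).
            diag_pts = 0
            for k in range(-(n - 1), n):
                line = np.diag(R, k)
                runs = np.diff(np.flatnonzero(np.diff(np.r_[0, line, 0])))[::2]
                diag_pts += runs[runs >= lmin].sum()
            return rr, diag_pts / max(R.sum(), 1)

        # Example on log-returns of a synthetic random-walk price series.
        rng = np.random.default_rng(1)
        prices = 100 * np.exp(np.cumsum(0.01 * rng.standard_normal(400)))
        rr, det = rqa_measures(np.diff(np.log(prices)))
        print(f"RR = {rr:.3f}, DET = {det:.3f}")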

  12. Quantification of variability in trichome patterns

    PubMed Central

    Greese, Bettina; Hülskamp, Martin; Fleck, Christian

    2014-01-01

    While pattern formation is studied in various areas of biology, little is known about the noise leading to variations between individual realizations of a pattern. One prominent example of de novo pattern formation in plants is the patterning of trichomes on Arabidopsis leaves, which involves genetic regulation and cell-to-cell communication. These processes are potentially variable due to, e.g., the abundance of cell components or environmental conditions. To deepen the understanding of the regulatory processes underlying pattern formation, it is crucial to quantitatively analyze the variability in naturally occurring patterns. Here, we review recent approaches toward the characterization of noise in trichome initiation. We present methods for the quantification of spatial patterns, which are the basis for data-driven mathematical modeling and enable the analysis of noise from different sources. Besides the insight gained on trichome formation, the examination of observed trichome patterns also shows that highly regulated biological processes can be substantially affected by variability. PMID:25431575

  13. Feature isolation and quantification of evolving datasets

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Identifying and isolating features is an important part of visualization and a crucial step for the analysis and understanding of large time-dependent data sets (either from observation or simulation). In this proposal, we address these concerns, namely the investigation and implementation of basic 2D and 3D feature-based methods to enhance current visualization techniques and provide the building blocks for automatic feature recognition, tracking, and correlation. These methods incorporate ideas from scientific visualization, computer vision, image processing, and mathematical morphology. Our focus is in the area of fluid dynamics, and we show the applicability of these methods to the quantification and tracking of three-dimensional vortex and turbulence bursts.

  14. Quantification of Osteon Morphology Using Geometric Histomorphometrics.

    PubMed

    Dillon, Scott; Cunningham, Craig; Felts, Paul

    2016-03-01

    Many histological methods in forensic anthropology utilize combinations of traditional histomorphometric parameters which may not accurately describe the morphology of microstructural features. Here, we report the novel application of a geometric morphometric method suitable when considering structures without anatomically homologous landmarks for the quantification of complete secondary osteon size and morphology. The method is tested for its suitability in the measurement of intact secondary osteons using osteons digitized from transverse femoral diaphyseal sections prepared from two human individuals. The results of methodological testing demonstrate the efficacy of the technique when applied to intact secondary osteons. In providing accurate characterization of micromorphology within the robust mathematical framework of geometric morphometrics, this method may surpass traditional histomorphometric variables currently employed in forensic research and practice. A preliminary study of the intersectional histomorphometric variation within the femoral diaphysis is made using this geometric histomorphometric method to demonstrate its potential. PMID:26478136

  15. Quantification of diacylglycerol by mass spectrometry.

    PubMed

    vom Dorp, Katharina; Dombrink, Isabel; Dörmann, Peter

    2013-01-01

    Diacylglycerol (DAG) is an important intermediate of lipid metabolism and a component of phospholipase C signal transduction. Quantification of DAG in plant membranes represents a challenging task because of its low abundance. DAG can be measured by direct infusion mass spectrometry (MS) on a quadrupole time-of-flight mass spectrometer after purification from the crude plant lipid extract via solid-phase extraction on silica columns. Different internal standards are employed to compensate for the dependence of the MS and MS/MS signals on the chain length and the presence of double bonds in the acyl moieties. Thus, using a combination of single MS and MS/MS experiments, quantitative results for the different molecular species of DAGs from Arabidopsis can be obtained. PMID:23681522

  16. Uncertainty quantification in DIC with Kriging regression

    NASA Astrophysics Data System (ADS)

    Wang, Dezhi; DiazDelaO, F. A.; Wang, Weizhuo; Lin, Xiaoshan; Patterson, Eann A.; Mottershead, John E.

    2016-03-01

    A Kriging regression model is developed as a post-processing technique for the treatment of measurement uncertainty in classical subset-based Digital Image Correlation (DIC). Regression is achieved by regularising the sample-point correlation matrix using a local, subset-based, assessment of the measurement error with assumed statistical normality and based on the Sum of Squared Differences (SSD) criterion. This leads to a Kriging-regression model in the form of a Gaussian process representing uncertainty on the Kriging estimate of the measured displacement field. The method is demonstrated using numerical and experimental examples. Kriging estimates of displacement fields are shown to be in excellent agreement with 'true' values for the numerical cases and in the experimental example uncertainty quantification is carried out using the Gaussian random process that forms part of the Kriging model. The root mean square error (RMSE) on the estimated displacements is produced and standard deviations on local strain estimates are determined.
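
    The core idea, Gaussian-process (Kriging) regression of noisy DIC displacements with a regularising noise term, can be sketched with scikit-learn; the displacement field, noise level, and kernel choice below are illustrative, not the paper's formulation:

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        rng = np.random.default_rng(0)
        x = np.linspace(0.0, 10.0, 60)[:, None]   # subset centres (mm)
        noise = 0.02                              # assumed subset-level noise (px)
        u = 0.5 * np.sin(0.8 * x.ravel()) + noise * rng.standard_normal(60)

        # The WhiteKernel term plays the role of the SSD-based, subset-level
        # error assessment that regularises the correlation matrix.
        gp = GaussianProcessRegressor(RBF(1.0) + WhiteKernel(noise**2),
                                      normalize_y=True)
        gp.fit(x, u)

        xq = np.linspace(0.0, 10.0, 200)[:, None]
        mean, std = gp.predict(xq, return_std=True)  # estimate + pointwise uncertainty
        rmse = np.sqrt(np.mean((mean - 0.5 * np.sin(0.8 * xq.ravel()))**2))
        print(f"RMSE vs true field: {rmse:.4f} px, max predictive std: {std.max():.4f} px")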

  17. Carotenoid Extraction and Quantification from Capsicum annuum

    PubMed Central

    Richins, Richard D.; Kilcrease, James; Rodgriguez-Uribe, Laura; O'Connell, Mary A.

    2016-01-01

    Carotenoids are ubiquitous pigments that play key roles in photosynthesis and also accumulate to high levels in fruit and flowers. Specific carotenoids play essential roles in human health as these compounds are precursors for Vitamin A; other specific carotenoids are important sources of macular pigments and all carotenoids are important anti-oxidants. Accurate determination of the composition and concentration of this complex set of natural products is therefore important in many different scientific areas. One of the richest sources of these compounds is the fruit of Capsicum; these red, yellow and orange fruit accumulate multiple carotenes and xanthophylls. This report describes the detailed method for the extraction and quantification of specific carotenes and xanthophylls.

  18. Multispectral image analysis for algal biomass quantification.

    PubMed

    Murphy, Thomas E; Macon, Keith; Berberoglu, Halil

    2013-01-01

    This article reports a novel multispectral image processing technique for rapid, noninvasive quantification of biomass concentration in attached and suspended algae cultures. Monitoring the biomass concentration is critical for efficient production of biofuel feedstocks, food supplements, and bioactive chemicals. Particularly, noninvasive and rapid detection techniques can significantly aid in providing delay-free process control feedback in large-scale cultivation platforms. In this technique, three-band spectral images of Anabaena variabilis cultures were acquired and separated into their red, green, and blue components. A correlation between the magnitude of the green component and the areal biomass concentration was generated. The correlation predicted the biomass concentrations of independently prepared attached and suspended cultures with errors of 7 and 15%, respectively, and the effects of varying lighting conditions and background color were investigated. This method can provide the necessary feedback for dilution and harvesting strategies to maximize photosynthetic conversion efficiency in large-scale operation. PMID:23554374
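
    The essence of the method is a calibration curve relating the green-channel magnitude to areal biomass. A Python sketch with hypothetical calibration values:

        import numpy as np

        # Hypothetical calibration pairs: mean green-channel value of a culture
        # image versus independently measured areal biomass (g m^-2).
        green_cal = np.array([212.0, 188.0, 161.0, 140.0, 118.0])
        biomass_cal = np.array([1.0, 2.5, 4.8, 7.2, 10.1])

        # Linear calibration curve standing in for the reported correlation.
        m, b = np.polyfit(green_cal, biomass_cal, 1)

        def biomass_from_image(rgb):
            """Estimate areal biomass from an (H, W, 3) RGB image array."""
            green = rgb[..., 1].mean()   # magnitude of the green component
            return m * green + b

        synthetic = np.full((64, 64, 3), (40.0, 150.0, 60.0))   # toy culture image
        print(f"estimated biomass: {biomass_from_image(synthetic):.1f} g m^-2")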

  19. Kinetic quantification of plyometric exercise intensity.

    PubMed

    Ebben, William P; Fauth, McKenzie L; Garceau, Luke R; Petushek, Erich J

    2011-12-01

    Ebben, WP, Fauth, ML, Garceau, LR, and Petushek, EJ. Kinetic quantification of plyometric exercise intensity. J Strength Cond Res 25(12): 3288-3298, 2011-Quantification of plyometric exercise intensity is necessary to understand the characteristics of these exercises and the proper progression of this mode of exercise. The purpose of this study was to assess the kinetic characteristics of a variety of plyometric exercises. This study also sought to assess gender differences in these variables. Twenty-six men and 23 women with previous experience in performing plyometric training served as subjects. The subjects performed a variety of plyometric exercises including line hops, 15.24-cm cone hops, squat jumps, tuck jumps, countermovement jumps (CMJs), loaded CMJs equal to 30% of 1 repetition maximum squat, depth jumps normalized to the subject's jump height (JH), and single leg jumps. All plyometric exercises were assessed with a force platform. Outcome variables associated with the takeoff, airborne, and landing phase of each plyometric exercise were evaluated. These variables included the peak vertical ground reaction force (GRF) during takeoff, the time to takeoff, flight time, JH, peak power, landing rate of force development, and peak vertical GRF during landing. A 2-way mixed analysis of variance with repeated measures for plyometric exercise type demonstrated main effects for exercise type and all outcome variables (p ≤ 0.05) and for the interaction between gender and peak vertical GRF during takeoff (p ≤ 0.05). Bonferroni-adjusted pairwise comparisons identified a number of differences between the plyometric exercises for the outcome variables assessed (p ≤ 0.05). These findings can be used to guide the progression of plyometric training by incorporating exercises of increasing intensity over the course of a program. PMID:22080319
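
    One of the outcome variables above, jump height (JH), is typically derived from force-platform flight time by projectile motion. A minimal sketch (the flight time is hypothetical, and identical take-off and landing posture is assumed):

        # Jump height from flight time: the centre of mass rises h = g * t_f^2 / 8.
        G = 9.81                      # gravitational acceleration, m s^-2

        def jump_height(flight_time_s: float) -> float:
            return G * flight_time_s ** 2 / 8.0

        print(f"JH for a 0.52 s flight: {100 * jump_height(0.52):.1f} cm")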

  20. Quantification of heterogeneity observed in medical images

    PubMed Central

    2013-01-01

    Background There has been much recent interest in the quantification of visually evident heterogeneity within functional grayscale medical images, such as those obtained via magnetic resonance or positron emission tomography. In the case of images of cancerous tumors, variations in grayscale intensity imply variations in crucial tumor biology. Despite these considerable clinical implications, there is as yet no standardized method for measuring the heterogeneity observed via these imaging modalities. Methods In this work, we motivate and derive a statistical measure of image heterogeneity. This statistic measures the distance-dependent average deviation from the smoothest intensity gradation feasible. We show how this statistic may be used to automatically rank images of in vivo human tumors in order of increasing heterogeneity. We test this method against the current practice of ranking images via expert visual inspection. Results We find that this statistic provides a means of heterogeneity quantification beyond that given by other statistics traditionally used for the same purpose. We demonstrate the effect of tumor shape upon our ranking method and find the method applicable to a wide variety of clinically relevant tumor images. We find that the automated heterogeneity rankings agree very closely with those performed visually by experts. Conclusions These results indicate that our automated method may be used reliably to rank, in order of increasing heterogeneity, tumor images whether or not object shape is considered to contribute to that heterogeneity. Automated heterogeneity ranking yields objective results which are more consistent than visual rankings. Reducing variability in image interpretation will enable more researchers to better study potential clinical implications of observed tumor heterogeneity. PMID:23453000

  1. Uncertainty Quantification of Equilibrium Climate Sensitivity

    NASA Astrophysics Data System (ADS)

    Lucas, D. D.; Brandon, S. T.; Covey, C. C.; Domyancic, D. M.; Johannesson, G.; Klein, R.; Tannahill, J.; Zhang, Y.

    2011-12-01

    Significant uncertainties exist in the temperature response of the climate system to changes in the levels of atmospheric carbon dioxide. We report progress to quantify the uncertainties of equilibrium climate sensitivity using perturbed parameter ensembles of the Community Earth System Model (CESM). Through a strategic initiative at the Lawrence Livermore National Laboratory, we have been developing uncertainty quantification (UQ) methods and incorporating them into a software framework called the UQ Pipeline. We have applied this framework to generate a large number of ensemble simulations using Latin Hypercube and other schemes to sample up to three dozen uncertain parameters in the atmospheric (CAM) and sea ice (CICE) model components of CESM. The parameters sampled are related to many highly uncertain processes, including deep and shallow convection, boundary layer turbulence, cloud optical and microphysical properties, and sea ice albedo. An extensive ensemble database comprised of more than 46,000 simulated climate-model-years of recent climate conditions has been assembled. This database is being used to train surrogate models of CESM responses and to perform statistical calibrations of the CAM and CICE models given observational data constraints. The calibrated models serve as a basis for propagating uncertainties forward through climate change simulations using a slab ocean model configuration of CESM. This procedure is being used to quantify the probability density function of equilibrium climate sensitivity accounting for uncertainties in climate model processes. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and was funded by the Uncertainty Quantification Strategic Initiative Laboratory Directed Research and Development Project at LLNL under project tracking code 10-SI-013. (LLNL-ABS-491765)
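
    Ensemble designs like the one described are often generated with a Latin hypercube sampler such as the one in scipy.stats.qmc; a sketch with placeholder parameter names and ranges (not the study's actual values):

        from scipy.stats import qmc

        # Three stand-ins for the uncertain CAM/CICE parameters named above;
        # the bounds are illustrative placeholders.
        names = ["deep_convection_tau", "cloud_droplet_radius", "sea_ice_albedo"]
        lower = [1800.0, 8.0e-6, 0.70]
        upper = [28800.0, 14.0e-6, 0.85]

        sampler = qmc.LatinHypercube(d=len(names), seed=0)
        unit = sampler.random(n=256)              # 256 points in the unit cube
        ensemble = qmc.scale(unit, lower, upper)  # one row per ensemble member

        print(ensemble[:3])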

  2. Quantification of uncertainties for application in detonation simulation

    NASA Astrophysics Data System (ADS)

    Zheng, Miao; Ma, Zhibo

    2016-06-01

    Numerical simulation has become an important means of designing detonation systems, and quantification of its uncertainty is necessary for reliability certification. In quantifying the uncertainty, the most important steps are to analyze how the uncertainties arise and propagate, and how the simulations evolve from benchmark models to new models. Based on the practical needs of engineering and the technology of verification and validation, a framework for QU (quantification of uncertainty) is proposed for the case in which simulation is used on a detonation system for scientific prediction. An example is offered to describe the general idea of quantifying simulation uncertainties.

  3. Extended Forward Sensitivity Analysis for Uncertainty Quantification

    SciTech Connect

    Haihua Zhao; Vincent A. Mousseau

    2008-09-01

    This report presents the forward sensitivity analysis method as a means for quantification of uncertainty in system analysis. The traditional approach to uncertainty quantification is based on a “black box” approach: the simulation tool is treated as an unknown signal generator, a distribution of inputs according to assumed probability density functions is sent in, and the distribution of the outputs is measured and correlated back to the original input distribution. This approach requires a large number of simulation runs and therefore has a high computational cost. In contrast to the “black box” method, a more efficient sensitivity approach can take advantage of intimate knowledge of the simulation code. In this approach, equations for the propagation of uncertainty are constructed and the sensitivities are solved for as variables in the same simulation. This “glass box” method can generate sensitivity information similar to that of the “black box” approach with only a few runs covering a large uncertainty region. Because only a small number of runs is required, those runs can be done with high accuracy in space and time, ensuring that the uncertainty of the physical model is being measured and not simply the numerical error caused by coarse discretization. In the forward sensitivity method, the model is differentiated with respect to each parameter to yield an additional system of the same size as the original one, the result of which is the solution sensitivity. The sensitivity of any output variable can then be obtained directly from these sensitivities by applying the chain rule of differentiation. We extend the forward sensitivity method to include time and spatial steps as special parameters so that the numerical errors can be quantified against other physical parameters. This extension makes the forward sensitivity method a much more powerful tool for uncertainty analysis, since the relative sensitivity of the time and space steps can be compared directly with that of the physical parameters.
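
    A toy "glass box" example of the augmented-system idea, for the scalar model dy/dt = -k*y (the model, parameter value, and uncertainty are illustrative):

        import numpy as np
        from scipy.integrate import solve_ivp

        # Differentiating dy/dt = -k*y with respect to k gives the sensitivity
        # s = dy/dk, which obeys ds/dt = -y - k*s and is solved in the same run.
        k, y0 = 0.7, 2.0

        def rhs(t, z):
            y, s = z
            return [-k * y, -y - k * s]

        sol = solve_ivp(rhs, (0.0, 5.0), [y0, 0.0], rtol=1e-8)
        y_end, s_end = sol.y[:, -1]

        # Chain rule: first-order change in y(5) caused by an uncertainty dk in k.
        dk = 0.05
        print(f"y(5) = {y_end:.4f}, dy/dk = {s_end:.4f}, dy ≈ {s_end * dk:+.4f}")
        print("analytic dy/dk:", -5.0 * y0 * np.exp(-k * 5.0))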

  4. Quantification of ecotoxicological tests based on bioluminescence using Polaroid film.

    PubMed

    Tamminen, Manu V; Virta, Marko P J

    2007-01-01

    Assays based on the measurement of bacterial luminescence are widely used in ecotoxicology. Bacterial strains responding either to general toxicity or specific pollutants are rapid, cost-effective and easy to use. However, quantification of the signal requires relatively expensive instrumentation. We show here that the detection of luminescence of BioTox, a Vibrio fischeri-based toxicity test, and of a specific recombinant bacterial strain for arsenic determination, is possible using common Polaroid film. The exposed films can be used for visual or computer-assisted quantification of the signal. Qualitative visual comparison to standards can be used in the rapid and relatively accurate estimation of toxicity or pollutant concentration. The computer-assisted method significantly improves the accuracy and quantification of the results. The results obtained by computer-assisted quantification were in good agreement with the values obtained with a luminometer. PMID:16949132

  5. Recent application of quantification II in Japanese medical research.

    PubMed Central

    Suzuki, T; Kudo, A

    1979-01-01

    Hayashi's Quantification II is a method of multivariate discriminant analysis that handles attribute data as predictor variables. It is very useful in medical research for estimation, diagnosis, prognosis, evaluation of epidemiological factors, and other problems based on a multiplicity of attribute data. In Japan, this method is so well known that most computer program packages include the Hayashi Quantification, but the method seems to be unfamiliar to researchers outside Japan. In view of this situation, we introduce 19 selected articles representing recent applications of Quantification II in Japanese medical research. In reviewing these papers, special attention is paid to how well the findings provided by the method satisfied the researchers. At the same time, some recommendations are made about terminology and program packages. A brief discussion of the background of the quantification methods is also given, with special reference to the Behaviormetric Society of Japan. PMID:540587

  6. Experimental quantification of the tactile spatial responsivity of human cornea.

    PubMed

    Beiderman, Yevgeny; Belkin, Michael; Rotenstreich, Ygal; Zalevsky, Zeev

    2015-01-01

    We present the first experimental quantification of the tactile spatial responsivity of the human cornea and show that a subject can be taught to recognize spatial tactile shapes stimulated on the cornea. PMID:26158088

  7. Software-assisted serum metabolite quantification using NMR.

    PubMed

    Jung, Young-Sang; Hyeon, Jin-Seong; Hwang, Geum-Sook

    2016-08-31

    The goal of metabolomics is to analyze a whole metabolome under a given set of conditions, so accurate and reliable quantitation of metabolites is crucial. Absolute concentration is more valuable than relative concentration; however, the methods most commonly used in NMR-based serum metabolic profiling, bin-based and full-data-point peak quantification, provide only relative concentration levels of metabolites and are not reliable when metabolite peaks overlap in a spectrum. In this study, we present the software-assisted serum metabolite quantification (SASMeQ) method, which allows us to identify and quantify metabolites in NMR spectra using Chenomx software. This software uses the ERETIC2 utility from TopSpin to add a digitally synthesized peak to a spectrum. The SASMeQ method will advance NMR-based serum metabolic profiling by providing an accurate and reliable method for absolute quantification that is superior to bin-based quantification. PMID:27506360

  8. Neutron-encoded mass signatures for multiplexed proteome quantification.

    PubMed

    Hebert, Alexander S; Merrill, Anna E; Bailey, Derek J; Still, Amelia J; Westphall, Michael S; Strieter, Eric R; Pagliarini, David J; Coon, Joshua J

    2013-04-01

    We describe a protein quantification method called neutron encoding that exploits the subtle mass differences caused by nuclear binding energy variation in stable isotopes. These mass differences are synthetically encoded into amino acids and incorporated into yeast and mouse proteins via metabolic labeling. Mass spectrometry analysis with high mass resolution (>200,000) reveals the isotopologue-embedded peptide signals, permitting quantification. Neutron encoding will enable highly multiplexed proteome analysis with excellent dynamic range and accuracy. PMID:23435260

  9. Quality Quantification of Evaluated Cross Section Covariances

    SciTech Connect

    Varet, S.; Dossantos-Uzarralde, P.

    2015-01-15

    Presently, several methods are used to estimate the covariance matrix of evaluated nuclear cross sections. Because the resulting covariance matrices can differ according to the method used and its assumptions, we propose a general and objective approach to quantify the quality of the covariance estimation for evaluated cross sections. The first step consists of defining an objective criterion; the second step is computation of the criterion. In this paper, the Kullback-Leibler distance is proposed for the quality quantification of a covariance matrix estimation and its inverse, based on the distance to the true covariance matrix. A method based on the bootstrap is presented for the estimation of this criterion, which can be applied with most methods for covariance matrix estimation and without knowledge of the true covariance matrix. The full approach is illustrated on the ⁸⁵Rb nucleus evaluations, and the results are then used for a discussion of scoring and Monte Carlo approaches for covariance matrix estimation of the cross section evaluations.
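
    For reference, the Kullback-Leibler distance between two zero-mean Gaussians defined by covariance matrices has a closed form; a small numpy sketch with hypothetical 2x2 covariance estimates:

        import numpy as np

        def kl_gauss(S_est, S_ref):
            """KL( N(0, S_est) || N(0, S_ref) ) between zero-mean Gaussians --
            the criterion above, with the true covariance replaced by a
            reference estimate."""
            k = S_est.shape[0]
            _, ld_est = np.linalg.slogdet(S_est)
            _, ld_ref = np.linalg.slogdet(S_ref)
            return 0.5 * (np.trace(np.linalg.inv(S_ref) @ S_est) - k
                          + ld_ref - ld_est)

        # Hypothetical covariance estimates of the same cross sections from
        # two different estimation methods.
        S_a = np.array([[1.00, 0.30], [0.30, 0.50]])
        S_b = np.array([[1.10, 0.25], [0.25, 0.55]])
        print(f"KL(a||b) = {kl_gauss(S_a, S_b):.4f}")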

  10. Numerical Uncertainty Quantification for Radiation Analysis Tools

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha

    2007-01-01

    Recently a new emphasis has been placed on engineering applications of space radiation analyses, and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used for radiation analysis for vehicle design and mission planning has begun. There are two sources of uncertainty in geometric discretization addressed in this paper that need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. One source of uncertainty is in ray tracing: as the number of rays increases, the associated uncertainty decreases, but the computational expense increases. Thus, a cost-benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose-versus-depth curves that is needed to determine the radiation exposure. The question, then, is how many thicknesses are needed to get an accurate result, so convergence testing is performed to quantify the uncertainty associated with interpolating over different shield-thickness spatial grids.

  11. Quantification of biological aging in young adults

    PubMed Central

    Belsky, Daniel W.; Caspi, Avshalom; Houts, Renate; Cohen, Harvey J.; Corcoran, David L.; Danese, Andrea; Harrington, HonaLee; Israel, Salomon; Levine, Morgan E.; Schaefer, Jonathan D.; Sugden, Karen; Williams, Ben; Yashin, Anatoli I.; Poulton, Richie; Moffitt, Terrie E.

    2015-01-01

    Antiaging therapies show promise in model organism research. Translation to humans is needed to address the challenges of an aging global population. Interventions to slow human aging will need to be applied to still-young individuals. However, most human aging research examines older adults, many with chronic disease. As a result, little is known about aging in young humans. We studied aging in 954 young humans, the Dunedin Study birth cohort, tracking multiple biomarkers across three time points spanning their third and fourth decades of life. We developed and validated two methods by which aging can be measured in young adults, one cross-sectional and one longitudinal. Our longitudinal measure allows quantification of the pace of coordinated physiological deterioration across multiple organ systems (e.g., pulmonary, periodontal, cardiovascular, renal, hepatic, and immune function). We applied these methods to assess biological aging in young humans who had not yet developed age-related diseases. Young individuals of the same chronological age varied in their “biological aging” (declining integrity of multiple organ systems). Already, before midlife, individuals who were aging more rapidly were less physically able, showed cognitive decline and brain aging, self-reported worse health, and looked older. Measured biological aging in young adults can be used to identify causes of aging and evaluate rejuvenation therapies. PMID:26150497

  12. Shape regression for vertebra fracture quantification

    NASA Astrophysics Data System (ADS)

    Lund, Michael Tillge; de Bruijne, Marleen; Tanko, Laszlo B.; Nielsen, Mads

    2005-04-01

    Accurate and reliable identification and quantification of vertebral fractures constitute a challenge both in clinical trials and in the diagnosis of osteoporosis. Various efforts have been made to develop reliable, objective, and reproducible methods for assessing vertebral fractures, but at present there is no consensus concerning a universally accepted diagnostic definition of vertebral fractures. In this project we investigate whether it is possible to accurately reconstruct the shape of a normal vertebra using a neighbouring vertebra as prior information. The reconstructed shape can then be used to develop a novel vertebral fracture measure, by comparing the segmented vertebra shape with its reconstructed normal shape. The vertebrae in lateral x-rays of the lumbar spine were manually annotated by a medical expert. With this dataset we built a shape model with equidistant point distribution between the four corner points. Based on the shape model, a multiple linear regression model of normal vertebra shape was developed for each dataset using leave-one-out cross-validation, and the reconstructed shape was calculated for each dataset using these regression models. The prediction error for the annotated shape was on average 3%.
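
    The regression step can be sketched as leave-one-out multiple linear regression from a neighbouring vertebra's landmarks to the target vertebra's landmarks; the data below are synthetic stand-ins for annotated shape-model coordinates:

        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import LeaveOneOut

        rng = np.random.default_rng(0)
        n_spines, n_coords = 30, 24   # e.g. 12 landmarks x (x, y) per vertebra

        # Synthetic stand-in: target shapes depend linearly on the neighbour
        # plus individual variation (in practice both come from annotations).
        neighbour = rng.standard_normal((n_spines, n_coords))
        W = 0.1 * rng.standard_normal((n_coords, n_coords))
        target = neighbour @ W + 0.05 * rng.standard_normal((n_spines, n_coords))

        errors = []
        for train, test in LeaveOneOut().split(neighbour):
            model = LinearRegression().fit(neighbour[train], target[train])
            pred = model.predict(neighbour[test])
            errors.append(np.linalg.norm(pred - target[test])
                          / np.linalg.norm(target[test]))

        print(f"average relative prediction error: {np.mean(errors):.1%}")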

  13. Damage detection using multivariate recurrence quantification analysis

    NASA Astrophysics Data System (ADS)

    Nichols, J. M.; Trickey, S. T.; Seaver, M.

    2006-02-01

    Recurrence-quantification analysis (RQA) has emerged as a useful tool for detecting subtle non-stationarities and/or changes in time-series data. Here, we extend the RQA analysis methods to multivariate observations and present a method by which the "length scale" parameter ɛ (the only parameter required for RQA) may be selected. We then apply the technique to the difficult engineering problem of damage detection. The structure considered is a finite element model of a rectangular steel plate where damage is represented as a cut in the plate, starting at one edge and extending from 0% to 25% of the plate width in 5% increments. Time series, recorded at nine separate locations on the structure, are used to reconstruct the phase space of the system's dynamics and subsequently generate the multivariate recurrence (and cross-recurrence) plots. Multivariate RQA is then used to detect damage-induced changes to the structural dynamics. These results are then compared with shifts in the plate's natural frequencies. Two of the RQA-based features are found to be more sensitive to damage than are the plate's frequencies.

  14. Quantification of contaminants associated with LDEF

    NASA Technical Reports Server (NTRS)

    Crutcher, E. R.; Nishimura, L. S.; Warner, K. J.; Wascher, W. W.

    1992-01-01

    The quantification of contaminants on the Long Duration Exposure Facility (LDEF) and associated hardware or tools is addressed. The purpose of this study was to provide a background database for the evaluation of the surface of the LDEF and the effects of orbital exposure on that surface. This study necessarily discusses the change in the distribution of contaminants on the LDEF with time and environmental exposure. Much of this information may be of value for the improvement of contamination control procedures during ground-based operations. The particulate data represent the results of NASA contractor monitoring as well as the results of samples collected and analyzed by the authors. The data from the tapelifts collected in the Space Shuttle Bay at Edwards Air Force Base and KSC are also presented. The amount of molecular film distributed over the surface of the LDEF is estimated based on measurements made at specific locations and extrapolated over the surface area of the LDEF. Some consideration of the total amount of volatile-condensible material available to form the resultant deposit is also presented. All assumptions underlying these estimates are presented along with the rationale for the conclusions. Each section is presented in a subsection for particles and another for molecular films.

  15. Uncertainty Quantification of Modelling of Equiaxed Solidification

    NASA Astrophysics Data System (ADS)

    Fezi, K.; Krane, M. J. M.

    2016-07-01

    Numerical simulations of metal alloy solidification are used to gain insight into physical phenomena that cannot be observed experimentally. Validation of such models has often been done through comparison to sparse experimental data, and agreement with such data can be misinterpreted due to both model and experimental uncertainty. Uncertainty quantification (UQ) and sensitivity analysis are performed on a transient model of solidification of Al-4.5 wt.% Cu in a rectangular cavity, with equiaxed (grain-refined) solidification morphology. This model solves equations for momentum, temperature, and species conservation; UQ and sensitivity analysis are performed for the degree of macrosegregation. A Smolyak sparse grid algorithm is used to select input values to construct a response surface fit to model outputs. The response surface is then used as a surrogate for the solidification model to determine the sensitivities and probability density functions of the model outputs. Uncertain model inputs of interest include the secondary dendrite arm spacing, equiaxed particle size, and fraction solid at which the rigid mushy zone forms. Similar analysis was also performed on a transient model of direct chill casting of the same alloy.

  16. Quantification of extracellular UDP-galactose

    PubMed Central

    Lazarowski, Eduardo R.

    2009-01-01

    The human P2Y14 receptor is potently activated by UDP-glucose (UDP-Glc), UDP-galactose (UDP-Gal), UDP-N-acetylglucosamine (UDP-GlcNAc), and UDP-glucuronic acid. Recently, cellular release of UDP-Glc and UDP-GlcNAc has been reported, but whether additional UDP-sugars are endogenous agonists for the P2Y14 receptor remains poorly defined. In the present study, we describe an assay for the quantification of UDP-Gal with sub-nanomolar sensitivity. This assay is based on the enzymatic conversion of UDP-Gal to UDP, using 1–4-β-galactosyltransferase. UDP is subsequently phosphorylated by nucleoside diphosphokinase in the presence of [γ32P]ATP and the formation of [γ32P]UTP is monitored by high performance liquid chromatography. The overall conversion of UDP-Gal to [γ32P]UTP was linear between 0.5 and 30 nM UDP-Gal. Extracellular UDP-Gal was detected on resting cultures of various cell types, and increased release of UDP-Gal was observed in 1321N1 human astrocytoma cells stimulated with the protease-activated receptor agonist thrombin. Occurrence of regulated release of UDP-Gal suggests that, in addition to its role in glycosylation reactions, UDP-Gal is an important extracellular signaling molecule. PMID:19699703

  17. Classification and quantification of leaf curvature

    PubMed Central

    Liu, Zhongyuan; Jia, Liguo; Mao, Yanfei; He, Yuke

    2010-01-01

    Various mutants of Arabidopsis thaliana deficient in polarity, cell division, and auxin response are characterized by certain types of leaf curvature. However, comparison of curvature for clarification of gene function can be difficult without a quantitative measurement of curvature. Here, a novel method for classification and quantification of leaf curvature is reported. Twenty-two mutant alleles from Arabidopsis mutants and transgenic lines deficient in leaf flatness were selected. The mutants were classified according to the direction, axis, position, and extent of leaf curvature. Based on a global measure of whole leaves and a local measure of four regions in the leaves, the curvature index (CI) was proposed to quantify the leaf curvature. The CI values accounted for the direction, axis, position, and extent of leaf curvature in all of the Arabidopsis mutants grown in growth chambers. Comparison of CI values between mutants reveals the spatial and temporal variations of leaf curvature, indicating the strength of the mutant alleles and the activities of the corresponding genes. Using the curvature indices, the extent of curvature in a complicated genetic background becomes quantitative and comparable, thus providing a useful tool for defining the genetic components of leaf development and for breeding new varieties with leaf curvature desirable for the efficient capture of sunlight for photosynthesis and high yields. PMID:20400533

  18. Quantification of the vocal folds’ dynamic displacements

    NASA Astrophysics Data System (ADS)

    del Socorro Hernández-Montes, María; Muñoz, Silvino; De La Torre, Manuel; Flores, Mauricio; Pérez, Carlos; Mendoza-Santoyo, Fernando

    2016-05-01

    Fast dynamic data acquisition techniques are required to investigate the motional behavior of the vocal folds (VFs) when they are subjected to a steady air-flow through the trachea. High-speed digital holographic interferometry (DHI) is a non-invasive full-field-of-view technique that has proved its usefulness to study rapid and non-repetitive object movements. Hence it is an ideal technique used here to measure VF displacements and vibration patterns at 2000 fps. Analyses from a set of 200 displacement images showed that VFs’ vibration cycles are established along their width (y) and length (x). Furthermore, the maximum deformation for the right and left VFs’ area may be quantified from these images, which in itself represents an important result in the characterization of this structure. At a controlled air pressure, VF displacements fall within the range ~100-1740 nm, with a calculated precision and accuracy that yields a variation coefficient of 1.91%. High-speed acquisition of full-field images of VFs and their displacement quantification are on their own significant data in the study of their functional and physiological behavior since voice quality and production depend on how they vibrate, i.e. their displacement amplitude and frequency. Additionally, the use of high speed DHI avoids prolonged examinations and represents a significant scientific and technological alternative contribution in advancing the knowledge and working mechanisms of these tissues.

  19. Uncertainty quantification for systems of conservation laws

    SciTech Connect

    Poette, Gael; Despres, Bruno; Lucor, Didier

    2009-04-20

    Uncertainty quantification through stochastic spectral methods has recently been applied to several kinds of non-linear stochastic PDEs. In this paper, we introduce a formalism based on kinetic theory to tackle uncertain hyperbolic systems of conservation laws with Polynomial Chaos (PC) methods. The idea is to introduce a new variable, the entropic variable, in bijection with our vector of unknowns, which we develop on the polynomial basis: by performing a Galerkin projection, we obtain a deterministic system of conservation laws. We state several properties of this deterministic system in the case of a general uncertain system of conservation laws. We then apply the method to the case of the inviscid Burgers' equation with random initial conditions and we present some preliminary results for the Euler system. We systematically compare results from our new approach to results from the stochastic Galerkin method. In the vicinity of discontinuities, the new method bounds the oscillations due to the Gibbs phenomenon to a certain range through the entropy of the system, without the use of any adaptive random-space discretizations. It is found to be more precise than the stochastic Galerkin method for smooth cases but above all for discontinuous cases.

  20. Quantification of rigidity in Parkinson's disease.

    PubMed

    Sepehri, Behrooz; Esteki, Ali; Ebrahimi-Takamjani, Esmaeal; Shahidi, Golam-Ali; Khamseh, Fatemeh; Moinodin, Marzieh

    2007-12-01

    In this paper, a new method for quantification of rigidity in the elbow joint of Parkinsonian patients is introduced. One of the best-known signs of Parkinson's disease (PD) is increased passive stiffness in muscles, which leads to rigidity in joints. Clinical evaluation of stiffness in the wrist and/or elbow, as commonly used by clinicians, is based on the Unified Parkinson's Disease Rating Scale (UPDRS). The subjective nature of this method may influence the accuracy and precision of evaluations; hence, introducing an objective standard method based on quantitative measurements may be helpful. A test rig was designed and fabricated to measure the range of motion and the viscous and elastic components of passive stiffness in the elbow joint. Measurements were done for 41 patients and 11 controls. Measures were extracted using Matlab R14 software, and statistical analyses were done with SPSS 13. The relation between each computed measure and the level of illness was analyzed. Results showed a better correlation between the viscous component of stiffness and the UPDRS score than for the elastic component. The results of this research may help to introduce a standard objective method for the evaluation of PD. PMID:17909970

  1. Quantification of moving target cyber defenses

    NASA Astrophysics Data System (ADS)

    Farris, Katheryn A.; Cybenko, George

    2015-05-01

    Current network and information systems are static, making it simple for attackers to maintain an advantage. Adaptive defenses, such as Moving Target Defenses (MTD), have been developed as potential "game-changers" in an effort to increase the attacker's workload. With many new methods being developed, it is difficult to accurately quantify and compare their overall costs and effectiveness. This paper compares the tradeoffs between current approaches to the quantification of MTDs. We present results from an expert opinion survey on quantifying the overall effectiveness and the upfront and operating costs of a select set of MTD techniques. We find that gathering informed scientific opinions can be advantageous for evaluating such new technologies, as it offers a more comprehensive assessment. We end by presenting a coarse ordering of a set of MTD techniques from most to least dominant, and find that seven out of 23 methods rank as the more dominant techniques, five of which are variants of either address space layout randomization or instruction set randomization. The remaining two techniques are applicable to software and computer platforms. Among the techniques that performed the worst are those primarily aimed at network randomization.

  2. Benchmarking RNA-Seq quantification tools

    PubMed Central

    Chandramohan, R.; Wu, Po-Yen; Phan, J.H.; Wang, M.D.

    2016-01-01

    RNA-Seq, a deep sequencing technique, promises to be a potential successor to microarrays for studying the transcriptome. One of many aspects of transcriptomics that are of interest to researchers is gene expression estimation. With rapid development in RNA-Seq, there are numerous tools available to estimate gene expression, each producing different results; however, we do not know which of these tools produces the most accurate gene expression estimates. In this study we addressed this issue using Cufflinks, IsoEM, HTSeq, and RSEM to quantify RNA-Seq expression profiles. Comparing the results of these quantification tools, we observe that RNA-Seq relative expression estimates correlate with RT-qPCR measurements in the range of 0.85 to 0.89, with HTSeq exhibiting the highest correlation. However, in terms of root-mean-square deviation of RNA-Seq relative expression estimates from RT-qPCR measurements, we find HTSeq to produce the greatest deviation. Therefore, we conclude that, though Cufflinks, RSEM, and IsoEM might not correlate as well as HTSeq with RT-qPCR measurements, they may produce expression values with higher accuracy. PMID:24109770
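
    The two comparison metrics used above are straightforward to compute; a sketch with hypothetical log2 expression values for a handful of genes:

        import numpy as np

        # Hypothetical log2 expression estimates for the same genes from an
        # RNA-Seq quantification tool and from RT-qPCR.
        rnaseq = np.array([2.1, 5.3, 0.8, 7.4, 3.9, 6.1])
        rtqpcr = np.array([2.4, 5.0, 1.1, 7.9, 3.5, 6.0])

        r = np.corrcoef(rnaseq, rtqpcr)[0, 1]            # Pearson correlation
        rmsd = np.sqrt(np.mean((rnaseq - rtqpcr) ** 2))  # root-mean-square deviation

        # A tool can rank well on one metric and poorly on the other, which is
        # why both are reported: correlation ignores the systematic offsets
        # that RMSD captures.
        print(f"r = {r:.3f}, RMSD = {rmsd:.3f}")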

  3. Comparison of analysis methods for airway quantification

    NASA Astrophysics Data System (ADS)

    Odry, Benjamin L.; Kiraly, Atilla P.; Novak, Carol L.; Naidich, David P.

    2012-03-01

    Diseased airways have been known for several years as a possible contributing factor to airflow limitation in Chronic Obstructive Pulmonary Disease (COPD). Quantification of disease severity through the evaluation of airway dimensions - wall thickness and lumen diameter - has gained increased attention thanks to the availability of multi-slice computed tomography (CT). Novel approaches have focused on automated methods of measurement as a faster and more objective means than the visual assessment routinely employed in the clinic. Since the Full-Width Half-Maximum (FWHM) method of airway measurement was introduced two decades ago [1], several new techniques for quantifying airways have been detailed in the literature, but no approach has truly become a standard for such analysis. Our own research group has presented two alternative approaches for determining airway dimensions, one involving a minimum path and the other active contours [2, 3]. With an increasing number of techniques dedicated to the same goal, we decided to take a step back and analyze the differences between these methods. We consequently put to the test our two methods of analysis and the FWHM approach. We first measured a set of 5 airways from a phantom of known dimensions. Then we compared measurements from the three methods to those of two independent readers, performed on 35 airways in 5 patients. We elaborate on the differences between the approaches and draw conclusions about which could be considered the best one.
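
    The baseline FWHM technique measures where an intensity profile cast across the airway wall falls to half of its peak; a self-contained Python sketch on a synthetic profile:

        import numpy as np

        def fwhm_edges(profile, x=None):
            """Full-Width Half-Maximum of a 1-D intensity profile across an
            airway wall: locate the half-peak crossings on either side of the
            peak, with linear interpolation between samples."""
            y = np.asarray(profile, dtype=float)
            x = np.arange(len(y)) if x is None else np.asarray(x, dtype=float)
            i = y.argmax()
            half = y.min() + 0.5 * (y[i] - y.min())
            li = np.where(y[:i] < half)[0][-1]        # last sample below half, left side
            ri = i + np.where(y[i:] < half)[0][0]     # first sample below half, right side
            xl = np.interp(half, [y[li], y[li + 1]], [x[li], x[li + 1]])
            xr = np.interp(half, [y[ri], y[ri - 1]], [x[ri], x[ri - 1]])
            return xl, xr, xr - xl   # edge positions and wall-thickness estimate

        # Synthetic ray through an airway wall: bright wall on a dark background.
        x = np.linspace(-5.0, 5.0, 201)
        profile = np.exp(-((x - 1.0) / 0.6) ** 2)     # wall centred at 1.0 mm
        print(fwhm_edges(profile, x))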

  4. Isolation, quantification, and analysis of chloroplast DNA.

    PubMed

    Rowan, Beth A; Bendich, Arnold J

    2011-01-01

    Many areas of chloroplast research require methods that can assess the quality and quantity of chloroplast DNA (cpDNA). The study of chloroplast functions that depend on the proper maintenance and expression of the chloroplast genome, understanding cpDNA replication and repair, and the development of technologies for chloroplast transformation are just some of the disciplines that require the isolation of high-quality cpDNA. Arabidopsis thaliana offers several advantages for studying these processes because of the sizeable collection of mutants and natural varieties (accessions) available from stock centers and a broad community of researchers that has developed many other genetic resources. Several approaches for the isolation and quantification of cpDNA have been developed, but little consideration has been given to the strengths and weaknesses and the type of information obtained by each method, especially with respect to A. thaliana. Here, we provide protocols for obtaining high-quality cpDNA for PCR and other applications, and we evaluate several different isolation and analytical methods in order to build a robust framework for the study of cpDNA with this model organism. PMID:21822838

  5. Uncertainty quantification in reacting flow modeling.

    SciTech Connect

    Le Maître, Olivier P.; Reagan, Matthew T.; Knio, Omar M.; Ghanem, Roger Georges; Najm, Habib N.

    2003-10-01

    Uncertainty quantification (UQ) in the computational modeling of physical systems is important for scientific investigation, engineering design, and model validation. In this work we develop techniques for UQ based on spectral and pseudo-spectral polynomial chaos (PC) expansions, and we apply these constructions in computations of reacting flow. We develop and compare both intrusive and non-intrusive spectral PC techniques. In the intrusive construction, the deterministic model equations are reformulated using Galerkin projection into a set of equations for the time evolution of the field variable PC expansion mode strengths. The mode strengths relate specific parametric uncertainties to their effects on model outputs. The non-intrusive construction uses sampling of many realizations of the original deterministic model, and projects the resulting statistics onto the PC modes, arriving at the PC expansions of the model outputs. We investigate and discuss the strengths and weaknesses of each approach, and identify their utility under different conditions. We also outline areas where ongoing and future research are needed to address challenges with both approaches.
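
    A toy non-intrusive spectral projection, the second construction described above: sample a stand-in model at Gauss-Hermite quadrature nodes and project onto probabilists' Hermite polynomials to recover PC coefficients (the model and expansion order are illustrative, not a reacting-flow code):

        import numpy as np
        from math import factorial
        from numpy.polynomial.hermite_e import hermegauss, hermeval

        f = lambda xi: np.exp(0.3 * xi)   # stand-in model output y = f(xi), xi ~ N(0, 1)

        order = 4
        nodes, weights = hermegauss(20)          # quadrature for weight exp(-x^2/2)
        weights = weights / np.sqrt(2 * np.pi)   # normalise to the Gaussian measure

        # Project sampled outputs onto each mode: c_k = <f, He_k> / <He_k, He_k>.
        coeffs = [np.sum(weights * f(nodes) * hermeval(nodes, [0] * k + [1]))
                  / factorial(k) for k in range(order + 1)]

        # Output statistics follow directly from the PC expansion coefficients.
        mean = coeffs[0]
        var = sum(c**2 * factorial(k) for k, c in enumerate(coeffs) if k > 0)
        print(f"mean ≈ {mean:.4f}, var ≈ {var:.4f} "
              f"(exact: {np.exp(0.045):.4f}, {np.exp(0.09)*(np.exp(0.09)-1):.4f})")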

  6. Legionella spp. isolation and quantification from greywater

    PubMed Central

    Rodríguez-Martínez, Sara; Blanky, Marina; Friedler, Eran; Halpern, Malka

    2015-01-01

    Legionella, an opportunistic human pathogen whose natural environment is water, is transmitted to humans through inhalation of contaminated aerosols. Legionella has been isolated from a high diversity of water types. Due to its importance as a pathogen, two ISO protocols have been developed for its monitoring. However, these two protocols are not suitable for analyzing Legionella in greywater (GW). GW is domestic wastewater excluding the inputs from toilets and kitchen. It can serve as an alternative water source, mainly for toilet flushing and garden irrigation, both of which produce aerosols that can pose a risk of Legionella infection. Hence, before reuse, GW has to be treated and its quality needs to be monitored. The difficulty of Legionella isolation from GW lies in the very high load of contaminant bacteria. Here we describe a modification of ISO protocol 11731:1998 that enables the isolation and quantification of Legionella from GW samples. The following modifications were made:
    • To enable isolation of Legionella from greywater, a pre-filtration step that removes coarse matter is recommended.
    • Legionella can be isolated after a combined acid-thermic treatment that eliminates the high load of contaminant bacteria in the sample. PMID:26740925

  7. Quantification of periodic breathing in premature infants

    PubMed Central

    Mohr, Mary A.; Fairchild, Karen D.; Patel, Manisha; Sinkin, Robert A.; Clark, Matthew T.; Moorman, J. Randall; Lake, Douglas E.; Kattwinkel, John; Delos, John B.

    2015-01-01

    Background Periodic breathing (PB), regular cycles of short apneic pauses and breaths, is common in newborn infants. To characterize normal and potentially pathologic PB, we used our automated apnea detection system and developed a novel method for quantifying PB. We identified a preterm infant who died of SIDS and who, on review of her breathing pattern while in the NICU, had exaggerated PB. Methods We analyzed the chest impedance signal for short apneic pauses and developed a wavelet transform method to identify repetitive 10–40 second cycles of apnea/breathing. Clinical validation was performed to distinguish PB from apnea clusters and to determine the wavelet coefficient cutoff with optimum diagnostic utility. We applied this method to analyze the chest impedance signals throughout the entire NICU stays of all 70 infants born at 32 weeks’ gestation admitted over a two-and-a-half year period. This group includes an infant who died of SIDS and her twin. Results For infants of 32 weeks’ gestation, the fraction of time spent in PB peaks 7–14 days after birth at 6.5%. During that time the infant who died of SIDS spent 40% of each day in PB and her twin spent 15% of each day in PB. Conclusions This wavelet transform method allows quantification of normal and potentially pathologic PB in NICU patients. PMID:26012526
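
    The abstract does not give the wavelet details, but the idea of scoring power in 10–40 second apnea/breathing cycles can be sketched with a standard Morlet wavelet applied to a synthetic apnea indicator signal; the sampling rate, threshold, and signal below are invented for illustration and are not the clinical algorithm.

      import numpy as np

      def morlet_power(x, fs, period, w0=6.0):
          # Wavelet power of signal x at one target period (seconds), via a
          # Morlet wavelet and direct convolution.
          dt = 1.0 / fs
          scale = period * (w0 + np.sqrt(2.0 + w0 ** 2)) / (4.0 * np.pi)
          tw = np.arange(-4.0 * scale, 4.0 * scale, dt)
          psi = np.pi ** -0.25 * np.exp(1j * w0 * tw / scale - (tw / scale) ** 2 / 2.0)
          coef = np.convolve(x - x.mean(), psi[::-1].conj(), mode="same") * dt / np.sqrt(scale)
          return np.abs(coef) ** 2

      # Hypothetical apnea indicator at 2 Hz: 1 during apneic pauses, 0 while
      # breathing; ~20 s cycles mimic periodic breathing.
      fs = 2.0
      t = np.arange(0.0, 600.0, 1.0 / fs)
      apnea = (np.sin(2.0 * np.pi * t / 20.0) > 0.6).astype(float)

      # Average power over the 10-40 s period band, then threshold to flag PB.
      band = np.mean([morlet_power(apnea, fs, p) for p in (10, 20, 30, 40)], axis=0)
      in_pb = band > 0.5 * band.max()   # illustrative cutoff, not the clinical one
      print(f"fraction of time in PB: {in_pb.mean():.2f}")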

  8. Broadband acoustic quantification of stratified turbulence.

    PubMed

    Lavery, Andone C; Geyer, W Rockwell; Scully, Malcolm E

    2013-07-01

    High-frequency broadband acoustic scattering techniques have enabled the remote, high-resolution imaging and quantification of highly salt-stratified turbulence in an estuary. Turbulent salinity spectra in the stratified shear layer have been measured acoustically and by in situ turbulence sensors. The acoustic frequencies used span 120-600 kHz, which, for the highly stratified and dynamic estuarine environment, correspond to wavenumbers in the viscous-convective subrange (500–2500 m⁻¹). The acoustically measured spectral levels are in close agreement with spectral levels measured with closely co-located micro-conductivity probes. The acoustically measured spectral shapes allow discrimination between scattering dominated by turbulent salinity microstructure and suspended sediments or swim-bladdered fish, the two primary sources of scattering observed in the estuary in addition to turbulent salinity microstructure. The direct comparison of salinity spectra inferred acoustically and by the in situ turbulence sensors provides a test of both the acoustic scattering model and the quantitative skill of acoustical remote sensing of turbulence dissipation in a strongly sheared and salt-stratified estuary. PMID:23862783

  10. Designing a simple physically-based bucket SVAT model for spatialisation of water needs

    NASA Astrophysics Data System (ADS)

    Lakhal, A.; Boulet, G.; Lakhal, L.; Er-Raki, S.; Duchemin, B.; Chehbouni, G.; Timouk, F.

    2003-04-01

    Within the framework of the IRRIMED and SUDMED projects, one needs a robust and simple tool to provide space-time estimates of the water requirements in flat semi-arid agricultural zones. This is the task of the simplest water balance equations, which can be seen as simple SVAT schemes. Most of the simplest SVAT schemes use the classical bucket representation of soil moisture exchange through the soil-canopy-air continuum. They usually rely on empirical relationships such as the “beta function” that are not well suited for all climate, soil and vegetation conditions. Some of them, for instance, greatly simplify the deep drainage parameterization, or overlook the first- to second-stage evaporation processes. Several authors have proposed physically-based simple expressions, such as the desorptive approach, which gives accurate integrated capillary flows under constant boundary conditions. We propose here a simple SVAT scheme that uses the same approach but reduces as much as possible the number of empirical relationships. It is tested against 1) SiSPAT, a physically based complex SVAT scheme, and 2) experimental data acquired during the SALSA and the SUDMED field experiments in Mexico and Morocco (respectively) for a large range of vegetation types (olive trees, wheat crop, grassland). This simple SVAT is well suited to simulate long time series of soil moisture evolution, and proves to give accurate predictions of first- to second-stage evaporation time series for bare soil and fully vegetated cover conditions. An insight into model adjustment for sparse vegetation (which usually prevails under semi-arid conditions) is proposed and partially evaluated against SiSPAT outputs.
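
    For contrast with the physically based scheme the authors advocate, a minimal classical bucket with a "beta function"-style evaporation reduction, of the kind criticized above, can be written in a few lines; all parameter values and forcing below are hypothetical.

      import numpy as np

      def bucket_svat(precip, et_pot, theta_max=150.0, theta0=75.0, drain_coef=0.02):
          # Minimal daily bucket water balance (illustrative, not the paper's
          # scheme): the soil store theta (mm) gains rainfall and loses actual
          # evaporation and drainage. Evaporation is reduced by relative soil
          # moisture -- exactly the "beta function" empiricism that the
          # desorptive, physically based expressions aim to replace.
          theta, rows = theta0, []
          for p, e0 in zip(precip, et_pot):
              e_act = (theta / theta_max) * e0      # beta-style reduction
              drain = drain_coef * theta            # simplistic deep drainage
              theta = np.clip(theta + p - e_act - drain, 0.0, theta_max)
              rows.append((theta, e_act, drain))
          return np.array(rows)

      # Hypothetical forcing: a 20 mm rain event followed by 29 dry days.
      precip = np.zeros(30); precip[0] = 20.0
      et_pot = np.full(30, 5.0)                     # mm/day potential evaporation
      series = bucket_svat(precip, et_pot)
      print(series[:3])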

  11. Spatialised fate factors for nitrate in catchments: modelling approach and implication for LCA results.

    PubMed

    Basset-Mens, Claudine; Anibar, Lamiaa; Durand, Patrick; van der Werf, Hayo M G

    2006-08-15

    The challenge for environmental assessment tools, such as Life Cycle Assessment (LCA), is to provide a holistic picture of the environmental impacts of a given system, while being relevant both at a global scale, i.e., for global impact categories such as climate change, and at a smaller scale, i.e., for regional impact categories such as aquatic eutrophication. To this end, the environmental mechanisms between emission and impact should be taken into account. For eutrophication in particular, which is one of the main impacts of farming systems, the fate factor of eutrophying pollutants in catchments, and particularly of nitrate, reflects one of these important and complex environmental mechanisms. We define this fate factor as the ratio of the amount of nitrate at the outlet of the catchment over the nitrate emitted from the catchment's soils. In LCA, this fate factor is most often assumed equal to 1, while the observed fate factor is generally less than 1. A generic approach for estimating the range of variation of nitrate fate factors in a region of intensive agriculture was proposed. This approach was based on the analysis of different catchment scenarios combining different catchment types and different effective rainfalls. The evolution over time of the nitrate fate factor as well as the steady-state fate factor for each catchment scenario was obtained using the INCA simulation model. In line with the general LCA model, the implications of the steady-state fate factors for nitrate were investigated for the eutrophication impact result in the framework of an LCA of pig production. A sensitivity analysis to the fraction of nitrate lost as N₂O was presented for the climate change impact category. This study highlighted the difference between the observed fate factor at a given time, which aggregates both storage and transformation processes, and a "steady-state fate factor" specific to the system considered. The range of steady-state fate factors obtained for the study region was wide, from 0.44 to 0.86, depending primarily on the catchment type and secondarily on the effective rainfall. The sensitivity of the LCA of pig production to the fate factors was significant concerning eutrophication, but potentially much larger concerning climate change. The potential for producing improved eutrophication results by using spatially differentiated fate factors was demonstrated. Additionally, the urgent need for quantitative studies on the N₂O/N₂ ratio in riparian zone denitrification was highlighted. PMID:16488466
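
    The effect of replacing the default fate factor of 1 with the study's steady-state range is simple to make concrete; in the sketch below only the 0.44-0.86 range comes from the study, while the emission and the CML-style characterization factor are illustrative placeholders.

      # Only the 0.44-0.86 steady-state range comes from the study; the
      # emission and the characterization factor are placeholders.
      nitrate_emitted = 45.0   # kg NO3- leaving the soils, per functional unit
      cf_nitrate = 0.1         # kg PO4-eq per kg NO3- (illustrative)

      for fate_factor in (1.0, 0.86, 0.44):   # LCA default vs. study range
          impact = nitrate_emitted * fate_factor * cf_nitrate
          print(f"fate factor {fate_factor:.2f}: {impact:.2f} kg PO4-eq")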

  12. Erratum to: Integrating the soil filter and buffer function into hydrogeological vulnerability assessment

    NASA Astrophysics Data System (ADS)

    Wirsing, Tobias; Neukum, Christoph; Goldscheider, Nico; Maier, Matthias

    2015-09-01

    Vulnerability maps are standard tools for the assessment of groundwater sensitivity to contamination. Due to their increased use in technical guidelines, vulnerability maps have become state-of-the-art tools in resource management. However, most approaches have been developed by hydrogeologists and soil scientists who incorporate the process understanding of their specific disciplines very well but are limited in how they treat processes from other disciplines. The soil-specific database for vulnerability assessment has been significantly improved by soil scientists over the past several years, in quality, spatial coverage and availability. Hence, it is time to integrate this database into hydrogeological concepts. This work presents a vulnerability mapping approach that incorporates a new soil database, available since 2014 for the entire Baden-Württemberg region at a scale of 1:50,000, by adapting the well-established GLA and PI methods. Thanks to the newly developed classification scheme for the protective function, this approach provides a more balanced and meaningful classification. This leads to a more differentiated picture of the study area and a better interpretation of vulnerability.

  13. An overview of methods and applications for the validation of vulnerability assessments

    NASA Astrophysics Data System (ADS)

    Neukum, Christoph

    2013-03-01

    Groundwater vulnerability maps have been applied over the past several decades for assessing groundwater sensitivity to pollution. Many different methods, with various approaches and associated information content, have been developed over the years. However, application of different methods to the same area may lead to different or even contradictory results, which may render vulnerability mapping unreliable. This manuscript presents a selection of methods that have been applied to validate vulnerability mapping approaches under different boundary conditions and at various scales. The validation approaches are explained and their advantages and disadvantages are discussed. A key result is that validation is an important part of vulnerability mapping and that it contributes to a sound interpretation.

  14. Quantification of isotopic turnover in agricultural systems

    NASA Astrophysics Data System (ADS)

    Braun, A.; Auerswald, K.; Schnyder, H.

    2012-04-01

    The isotopic turnover, which is a proxy for the metabolic rate, is gaining scientific importance. It is quantified for an increasing range of organisms, from microorganisms over plants to animals, including agricultural livestock. Additionally, the isotopic turnover is analyzed on different scales, from organs to organisms to ecosystems and even to the biosphere. In particular, the quantification of the isotopic turnover of specific tissues within the same organism, e.g. organs like liver and muscle and products like milk and faeces, has brought new insights that improve our understanding of nutrient cycles and fluxes. Thus, the knowledge of isotopic turnover is important in many areas, including physiology, e.g. milk synthesis, ecology, e.g. soil retention time of water, and medical science, e.g. cancer diagnosis. So far, the isotopic turnover is quantified by applying time-, cost- and expertise-intensive tracer experiments. Usually, this comprises two isotopic equilibration periods. A first equilibration period with a constant isotopic input signal is followed by a second equilibration period with a distinct constant isotopic input signal. This yields a smooth signal change from the first to the second signal in the object under consideration. This approach presents at least three major problems. (i) The input signals must be controlled isotopically, which is almost impossible in many realistic cases like free-ranging animals. (ii) Both equilibration periods may be very long, especially when the turnover rate of the object under consideration is very slow, which aggravates the first problem. (iii) The detection of small or slow pools is improved by large isotopic signal changes, but large isotopic changes also involve a considerable change in the input material; e.g. animal studies are usually carried out as diet-switch experiments, where the diet is switched between C3 and C4 plants, since C3 and C4 plants differ strongly in their isotopic signal. The

  15. GPU-accelerated voxelwise hepatic perfusion quantification.

    PubMed

    Wang, H; Cao, Y

    2012-09-01

    Voxelwise quantification of hepatic perfusion parameters from dynamic contrast enhanced (DCE) imaging greatly contributes to assessment of liver function in response to radiation therapy. However, the efficiency of estimating hepatic perfusion parameters voxel-by-voxel in the whole liver using a dual-input single-compartment model requires substantial improvement for routine clinical applications. In this paper, we utilize the parallel computation power of a graphics processing unit (GPU) to accelerate the computation, while maintaining the same accuracy as the conventional method. Using compute unified device architecture-GPU, the hepatic perfusion computations over multiple voxels are run across the GPU blocks concurrently but independently. At each voxel, nonlinear least-squares fitting of the liver DCE time series to the compartmental model is distributed to multiple threads in a block, and the computations of different time points are performed simultaneously and synchronously. An efficient fast Fourier transform in a block is also developed for the convolution computation in the model. The GPU computations of the voxel-by-voxel hepatic perfusion images are compared with those by the CPU using the simulated DCE data and the experimental DCE MR images from patients. The computation speed is improved by 30 times using an NVIDIA Tesla C2050 GPU compared to a 2.67 GHz Intel Xeon CPU processor. To obtain liver perfusion maps with 626 400 voxels in a patient's liver, it takes 0.9 min with the GPU-accelerated voxelwise computation, compared to 110 min with the CPU, while both methods result in perfusion parameter differences less than 10⁻⁶. The method will be useful for generating liver perfusion images in clinical settings. PMID:22892645
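
    The GPU kernel itself is not reproduced in the abstract, but the voxel-independence that makes the problem parallel can be sketched on the CPU: each worker fits a dual-input single-compartment model to one voxel's time series, mirroring the one-block-per-voxel layout. The model form, input functions, and data below are illustrative assumptions, not the clinical pipeline.

      import numpy as np
      from multiprocessing import Pool
      from scipy.optimize import curve_fit

      def liver_tac(t, k1a, k1p, k2, ca, cp):
          # Dual-input single-compartment model: arterial (ca) and portal (cp)
          # inputs, one clearance constant; discrete convolution stands in for
          # the FFT-based convolution the authors run inside a GPU block.
          dt = t[1] - t[0]
          return dt * np.convolve(k1a * ca + k1p * cp, np.exp(-k2 * t))[: t.size]

      def fit_voxel(args):
          t, tac, ca, cp = args
          model = lambda t, k1a, k1p, k2: liver_tac(t, k1a, k1p, k2, ca, cp)
          params, _ = curve_fit(model, t, tac, p0=[0.1, 0.5, 1.0], bounds=(0.0, 5.0))
          return params

      if __name__ == "__main__":
          t = np.linspace(0.0, 5.0, 60)                 # minutes
          ca = np.exp(-((t - 0.5) ** 2) / 0.05)         # hypothetical arterial input
          cp = np.exp(-((t - 1.0) ** 2) / 0.2)          # hypothetical portal input
          clean = liver_tac(t, 0.2, 0.8, 1.5, ca, cp)
          voxels = [clean + np.random.normal(0.0, 0.01, t.size) for _ in range(500)]
          # Voxels are independent, so they map cleanly onto parallel workers,
          # just as they map onto concurrent GPU blocks in the paper.
          with Pool() as pool:
              maps = pool.map(fit_voxel, [(t, v, ca, cp) for v in voxels])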

  16. Quantification of asphaltene precipitation by scaling equation

    NASA Astrophysics Data System (ADS)

    Janier, Josefina Barnachea; Jalil, Mohamad Afzal B. Abd.; Samin, Mohamad Izhar B. Mohd; Karim, Samsul Ariffin B. A.

    2015-02-01

    Asphaltene precipitation from crude oil is one of the issues for the oil industry. The deposition of asphaltene occurs during production, transportation and separation processes. The injection of carbon dioxide (CO2) during enhanced oil recovery (EOR) is believed to contribute much to the precipitation of asphaltene. Precipitation can be affected by changes in the temperature and pressure of the crude oil; however, a reduction in pressure contributes more to the instability of asphaltene than temperature does. This paper discusses the quantification of precipitated asphaltene in crude oil at different high pressures and at constant temperature. The derived scaling equation is based on reservoir conditions, with the amount of carbon dioxide (CO2) mixed with Dulang, the light crude oil sample used in the experiment, varied to probe the stability of asphaltene. A FluidEval PVT cell with a Solid Detection System (SDS) was the instrument used to gain experimental knowledge of the behavior of the fluid at reservoir conditions. Two conditions were followed in the experiment: first, 45 cc of the light crude oil was mixed with 18 cc (40%) of CO2; second, the same amount of crude oil was mixed with 27 cc (60%) of CO2. Results showed that the 45 cc crude oil sample combined with 18 cc (40%) of CO2 gas had a saturation pressure of 1498.37 psi and an asphaltene onset point of 1620 psi. For the same amount of crude oil combined with 27 cc (60%) of CO2, the saturation pressure was 2046.502 psi and the asphaltene onset point was 2230 psi. The derivation of the scaling equation considered reservoir temperature, pressure, bubble point pressure, mole percent of the precipitant (the injected CO2 gas), and the gas molecular weight. The scaling equation resulted in a third-order polynomial that can be used to quantify the amount of asphaltene in crude oil.
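
    A third-order polynomial of the kind the scaling equation reduces to is straightforward to fit once scaled data are available; the (pressure, precipitate) pairs below are invented stand-ins, not the paper's measurements or its scaled variables.

      import numpy as np

      # Invented (pressure, wt% precipitated) pairs standing in for the SDS
      # measurements; the paper's actual data and scaled variables differ.
      pressure = np.array([2230.0, 2046.5, 1800.0, 1620.0, 1498.4, 1300.0])  # psi
      precip = np.array([0.0, 0.4, 1.1, 1.6, 2.3, 3.4])                      # wt%

      # Fit the third-order polynomial the scaling equation reduces to;
      # np.polyfit returns coefficients from highest to lowest order.
      coeffs = np.polyfit(pressure, precip, deg=3)
      print(np.polyval(coeffs, 1900.0))   # predicted wt% precipitated at 1900 psi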

  17. Quantification of nitrotyrosine in nitrated proteins

    PubMed Central

    Zhang, Yingyi; Pöschl, Ulrich

    2010-01-01

    For kinetic studies of protein nitration reactions, we have developed a method for the quantification of nitrotyrosine residues in protein molecules by liquid chromatography coupled to a diode array detector of ultraviolet-visible absorption. Nitrated bovine serum albumin (BSA) and nitrated ovalbumin (OVA) were synthesized and used as standards for the determination of the protein nitration degree (ND), which is defined as the average number of nitrotyrosine residues divided by the total number of tyrosine residues in a protein molecule. The obtained calibration curves of the ratio of chromatographic peak areas of absorbance at 357 and at 280 nm vs. nitration degree are nearly the same for BSA and OVA (relative deviations <5%). They are near-linear at low ND (<0.1) and can be described by a second-order polynomial fit up to ND = 0.5 (R² > 0.99). A change of chromatographic column led to changes in absolute peak areas but not in the peak area ratios and related calibration functions, which confirms the robustness of the analytical method. First results of laboratory experiments confirm that the method is applicable for the investigation of the reaction kinetics of protein nitration. The main advantage over alternative methods is that nitration degrees can be efficiently determined without hydrolysis or digestion of the investigated protein molecules. PMID:20300739
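
    In practice one measures the peak-area ratio of an unknown sample and inverts the second-order calibration for ND; a sketch with invented standard values (the real coefficients come from the nitrated BSA/OVA standards) could look like this.

      import numpy as np

      # Invented calibration standards: nitration degree (ND) vs. measured
      # A357/A280 peak-area ratio; real values come from the BSA/OVA standards.
      nd_std = np.array([0.00, 0.05, 0.10, 0.20, 0.30, 0.40, 0.50])
      ratio_std = np.array([0.00, 0.11, 0.21, 0.38, 0.52, 0.63, 0.72])

      a, b, c = np.polyfit(nd_std, ratio_std, 2)   # ratio = a*ND^2 + b*ND + c

      def nd_from_ratio(r):
          # Invert the quadratic, keeping the root inside the validated range.
          roots = np.roots([a, b, c - r])
          real = [x.real for x in roots if abs(x.imag) < 1e-9 and 0.0 <= x.real <= 0.5]
          return real[0] if real else None

      print(nd_from_ratio(0.45))   # ND of an unknown sample with ratio 0.45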

  18. Extended quantification of the generalized recurrence plot

    NASA Astrophysics Data System (ADS)

    Riedl, Maik; Marwan, Norbert; Kurths, Jürgen

    2016-04-01

    The generalized recurrence plot is a modern tool for the quantification of complex spatial patterns. Its application spans the analysis of trabecular bone structures, Turing structures, turbulent spatial plankton patterns, and fractals. But it is also successfully applied to the description of spatio-temporal dynamics and the detection of regime shifts, such as in the complex Ginzburg-Landau equation. The recurrence-plot-based determinism is a central measure in this framework, quantifying the level of regularity in temporal and spatial structures. We extend this measure for the generalized recurrence plot by considering symmetry operations beyond simple translation. It is tested not only on two-dimensional regular patterns and noise but also on complex spatial patterns, reconstructing the parameter space of the complex Ginzburg-Landau equation. The extended version of the determinism yields values that are consistent with the original recurrence plot approach. Furthermore, the proposed method allows the determinism to be split into parts based on laminar and non-laminar regions of the two-dimensional pattern of the complex Ginzburg-Landau equation. A comparison of these parts with a standard method of image classification, the co-occurrence matrix approach, shows differences especially in the description of patterns associated with turbulence. In that case, it seems that the extended version of the determinism allows phase turbulence and defect turbulence to be distinguished by means of their spatial patterns. This ability of the proposed method promises new insights into other systems with turbulent dynamics coming from climatology, biology, ecology, and social sciences, for example.
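
    For readers unfamiliar with the determinism measure, the ordinary one-dimensional version (the fraction of recurrence points lying on diagonal lines) is easy to state in code; the generalized, spatial variant with extra symmetry operations discussed above builds on the same counting idea. The threshold and test signals below are illustrative.

      import numpy as np

      def determinism(x, eps=0.1, lmin=2):
          # Fraction of recurrence points lying on diagonal lines of length
          # >= lmin in an ordinary (1-D) recurrence plot; the generalized,
          # spatial version extends this counting to richer symmetries.
          R = (np.abs(x[:, None] - x[None, :]) < eps).astype(int)
          n = x.size
          on_lines = 0
          for k in range(-(n - 1), n):                          # every diagonal
              run = 0
              for v in list(np.diagonal(R, offset=k)) + [0]:    # 0 flushes last run
                  if v:
                      run += 1
                  else:
                      if run >= lmin:
                          on_lines += run
                      run = 0
          return on_lines / R.sum()

      rng = np.random.default_rng(0)
      print(determinism(np.sin(np.linspace(0.0, 20.0, 400))))  # regular: near 1
      print(determinism(rng.standard_normal(400)))             # noise: lower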

  19. Rapid digital quantification of microfracture populations

    NASA Astrophysics Data System (ADS)

    Gomez, Leonel A.; Laubach, Stephen E.

    2006-03-01

    Populations of microfractures are a structural fabric in many rocks deformed at upper crustal conditions. In some cases these fractures are visible in transmitted-light microscopy as fluid-inclusion planes or cement-filled microfractures, but because SEM-based cathodoluminescence (CL) reveals more fractures and delineates their shapes, sizes, and crosscutting relations, it is a more effective structural tool. Yet at magnifications of 150-300×, at which many microfractures are visible, SEM-CL detectors image only small sample areas (0.5-0.1 mm²) relative to fracture population patterns. The substantial effort required to image and measure centimeter-size areas at high magnification has impeded quantitative study of microfractures. We present a method for efficient collection of mosaics of high-resolution CL imagery, a preparation method that allows samples to be any size while retaining continuous imagery of rock (no gaps), and software that facilitates fracture mapping and data reduction. Although the method introduced here was developed for CL imagery, it can be used with any other kind of images, including mosaics from petrographic microscopes. Compared with manual measurements, the new method increases severalfold the number of microfractures imaged without a proportional increase in level of effort, increases the accuracy and repeatability of fracture measurements, and speeds quantification and display of fracture population attributes. We illustrate the method on microfracture arrays in dolostone from NE Mexico and sandstone from NW Scotland. We show that key aspects of microfracture population attributes are only fully manifest at scales larger than a single thin section.

  20. Detection and Quantification of Citrullinated Chemokines

    PubMed Central

    Moelants, Eva A. V.; Van Damme, Jo; Proost, Paul

    2011-01-01

    Background Posttranslational deimination or citrullination by peptidylarginine deiminases (PAD) regulates the biological function of proteins and may be involved in the development of autoimmune diseases such as rheumatoid arthritis and multiple sclerosis. This posttranslational modification of arginine was recently discovered on inflammatory chemokines including CXCL8 and CXCL10, and significantly reduced their biological activity. To evaluate the importance of these modified chemokines in patients, methods for the detection and quantification of citrullinated chemokines are needed. Since citrullination only results in an increase in the protein mass by one mass unit and the loss of one positive charge, selective biochemical detection is difficult. Therefore, we developed an antibody-based method to specifically detect and quantify citrullination on a protein of interest. Methodology/Principal Findings First, the citrullinated proteins were chemically modified with antipyrine and 2,3-butanedione at low pH. Such selectively modified citrullines were subsequently detected and quantified by specific antibodies raised against a modified citrulline-containing peptide. The specificity of this two-step procedure was validated for citrullinated CXCL8 ([Cit5]CXCL8). Specific detection of [Cit5]CXCL8 concentrations between 1 and 50 ng/ml was possible, also in complex samples containing an excess of contaminating proteins. This novel detection method was used to evaluate the effect of lipopolysaccharide (LPS) on the citrullination of inflammatory chemokines induced in peripheral blood mononuclear cells (PBMCs) and granulocytes. LPS had no significant effect on the induction of CXCL8 citrullination in human PBMCs and granulocytes. However, granulocytes, known to contain PAD, were essential for the production of significant amounts of [Cit5]CXCL8. Conclusion/Significance The newly developed antibody-based method to specifically detect and quantify chemically modified

  1. Statistical Quantification of Methylation Levels by Next-Generation Sequencing

    PubMed Central

    Wu, Guodong; Yi, Nengjun; Absher, Devin; Zhi, Degui

    2011-01-01

    Background/Aims Recently, next-generation sequencing-based technologies have enabled DNA methylation profiling at high resolution and low cost. Methyl-Seq and Reduced Representation Bisulfite Sequencing (RRBS) are two such technologies that interrogate methylation levels at CpG sites throughout the entire human genome. With rapid reduction of sequencing costs, these technologies will enable epigenotyping of large cohorts for phenotypic association studies. Existing quantification methods for sequencing-based methylation profiling are simplistic and do not deal with the noise due to the random sampling nature of sequencing and various experimental artifacts. Therefore, there is a need to investigate the statistical issues related to the quantification of methylation levels for these emerging technologies, with the goal of developing an accurate quantification method. Methods In this paper, we propose two methods for Methyl-Seq quantification. The first method, the Maximum Likelihood estimate, is both conceptually intuitive and computationally simple. However, this estimate is biased at extreme methylation levels and does not provide variance estimation. The second method, based on a Bayesian hierarchical model, allows variance estimation of methylation levels, and provides a flexible framework to adjust technical bias in the sequencing process. Results We compare the previously proposed binary method, the Maximum Likelihood (ML) method, and the Bayesian method. In both simulation and real data analysis of Methyl-Seq data, the Bayesian method offers the most accurate quantification. The ML method is slightly less accurate than the Bayesian method. But both our proposed methods outperform the original binary method in Methyl-Seq. In addition, we apply these quantification methods to simulation data and show that, with sequencing depth above 40–300 (which varies with different tissue samples) per cleavage site, Methyl-Seq offers a comparable quantification
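
    A minimal stand-in for the two estimators, assuming a plain beta-binomial model without the paper's bias-adjustment layer, shows why the Bayesian estimate behaves better at extreme methylation levels and low coverage.

      # Per-CpG counts from bisulfite sequencing: y methylated reads of n total.
      # ML is the raw proportion; the posterior mean under a Beta(alpha, beta)
      # prior (a minimal beta-binomial stand-in for the paper's hierarchical
      # model, without its bias adjustment) shrinks boundary estimates.
      def ml_estimate(y, n):
          return y / n

      def bayes_estimate(y, n, alpha=1.0, beta=1.0):
          return (y + alpha) / (n + alpha + beta)

      y, n = 0, 5                            # low coverage, no methylated reads
      print(ml_estimate(y, n))               # 0.0   -- stuck at the boundary
      print(round(bayes_estimate(y, n), 3))  # 0.143 -- shrunk toward the prior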

  2. Quantification of chemical gaseous plumes on hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Niu, Sidi

    The passive remote chemical plume quantification problem may be approached from multiple aspects, corresponding to a variety of physical effects that may be exploited. Accordingly, a diversity of statistical quantification algorithms has been proposed in the literature. The ultimate performance and algorithmic complexity of each is influenced by the assumptions made about the scene, which may include the presence of ancillary measurements or particular background/plume features that may or may not be present. In this work, we evaluate and investigate the advantages and limitations of a number of quantification algorithms that span a variety of such assumptions. With the in-depth insights gained, a new quantification algorithm is proposed for single-gas quantification that is superior to state-of-the-art algorithms in almost every aspect, including applicability, accuracy, and efficiency. The new method, called the selected-band algorithm, achieves its superior performance through an accurate estimation of the unobservable off-plume radiance. Off-plume radiance is recoverable because of a common observation: most chemical gases exhibit strong absorptive behavior only in certain spectral bands. The spectral bands where the gas absorption is almost zero or small are ideal for background estimation. In this thesis, the new selected-band algorithm is first derived for the favorable case of a narrow-band, sharp-featured gas and then extended to an iterative algorithm that suits all kinds of gases. The performance improvement is verified by simulated data for a variety of experimental settings.
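
    The core selected-band idea can be caricatured in a few lines: fit the background in channels where gas absorption is negligible, extrapolate it into the absorbing band, and read the plume strength from the residual depth. Everything below is synthetic and optically thin by assumption; it is not the thesis algorithm itself.

      import numpy as np

      # Synthetic single-gas scene: in channels where absorption is ~0 the
      # observed radiance equals the background, so fit the background there
      # and extrapolate it into the absorbing band (optically thin toy case).
      wn = np.linspace(900.0, 1100.0, 200)            # wavenumber grid, cm^-1
      absorb = np.exp(-((wn - 1000.0) ** 2) / 50.0)   # gas absorption feature
      background = 1e-3 * wn + 0.2                    # smooth off-plume radiance
      observed = background * (1.0 - 0.3 * absorb)    # plume-attenuated signal

      weak = absorb < 0.05                            # the "selected bands"
      bg_fit = np.polyval(np.polyfit(wn[weak], observed[weak], 2), wn)

      # Band-integrated relative depth recovers the attenuation strength.
      strength = np.trapz((bg_fit - observed) / bg_fit, wn) / np.trapz(absorb, wn)
      print(strength)   # ~0.3, the value used to synthesize the plume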

  3. GMO quantification: valuable experience and insights for the future.

    PubMed

    Milavec, Mojca; Dobnik, David; Yang, Litao; Zhang, Dabing; Gruden, Kristina; Zel, Jana

    2014-10-01

    Cultivation and marketing of genetically modified organisms (GMOs) have been unevenly adopted worldwide. To facilitate international trade and to provide information to consumers, labelling requirements have been set up in many countries. Quantitative real-time polymerase chain reaction (qPCR) is currently the method of choice for detection, identification and quantification of GMOs. This has been critically assessed and the requirements for the method performance have been set. Nevertheless, there are challenges that should still be highlighted, such as measuring the quantity and quality of DNA, and determining the qPCR efficiency, possible sequence mismatches, characteristics of taxon-specific genes and appropriate units of measurement, as these remain potential sources of measurement uncertainty. To overcome these problems and to cope with the continuous increase in the number and variety of GMOs, new approaches are needed. Statistical strategies of quantification have already been proposed and expanded with the development of digital PCR. The first attempts have been made to use next-generation sequencing also for quantitative purposes, although accurate quantification of the contents of GMOs using this technology is still a challenge for the future, and especially for mixed samples. New approaches are needed also for the quantification of stacks, and for potential quantification of organisms produced by new plant breeding techniques. PMID:25182968

  4. Absolute and relative quantification of RNA modifications via biosynthetic isotopomers

    PubMed Central

    Kellner, Stefanie; Ochel, Antonia; Thüring, Kathrin; Spenkuch, Felix; Neumann, Jennifer; Sharma, Sunny; Entian, Karl-Dieter; Schneider, Dirk; Helm, Mark

    2014-01-01

    In the resurging field of RNA modifications, quantification is a bottleneck blocking many exciting avenues. With currently over 150 known nucleoside alterations, detection and quantification methods must encompass multiple modifications for a comprehensive profile. LC–MS/MS approaches offer a perspective for comprehensive parallel quantification of all the various modifications found in total RNA of a given organism. By feeding 13C-glucose as sole carbon source, we have generated a stable isotope-labeled internal standard (SIL-IS) for bacterial RNA, which facilitates relative comparison of all modifications. While conventional SIL-IS approaches require the chemical synthesis of single modifications in weighable quantities, this SIL-IS consists of a nucleoside mixture covering all detectable RNA modifications of Escherichia coli, yet in small and initially unknown quantities. For absolute in addition to relative quantification, those quantities were determined by a combination of external calibration and sample spiking of the biosynthetic SIL-IS. For each nucleoside, we thus obtained a very robust relative response factor, which permits direct conversion of the MS signal to absolute amounts of substance. The application of the validated SIL-IS allowed highly precise quantification with standard deviations <2% during a 12-week period, and a linear dynamic range that was extended by two orders of magnitude. PMID:25129236

  5. Extended Forward Sensitivity Analysis for Uncertainty Quantification

    SciTech Connect

    Haihua Zhao; Vincent A. Mousseau

    2011-09-01

    Verification and validation (V&V) are playing increasingly important roles in quantifying uncertainties and realizing high-fidelity simulations in engineering system analyses, such as transients in a complex nuclear reactor system. Traditional V&V in reactor system analysis has focused more on the validation part or has not differentiated verification from validation. The traditional approach to uncertainty quantification is based on a 'black box' approach. The simulation tool is treated as an unknown signal generator: a distribution of inputs according to assumed probability density functions is sent in, and the distribution of the outputs is measured and correlated back to the original input distribution. The 'black box' method mixes numerical errors with all other uncertainties. It is also inefficient for sensitivity analysis. Contrary to the 'black box' method, a more efficient sensitivity approach can take advantage of intimate knowledge of the simulation code. In these types of approaches, equations for the propagation of uncertainty are constructed and the sensitivities are directly solved for as variables in the simulation. This paper presents forward sensitivity analysis as a method to help uncertainty quantification. By including the time step and potentially the spatial step as special sensitivity parameters, the forward sensitivity method is extended as one method to quantify numerical errors. Note that by integrating local truncation errors over the whole system through the forward sensitivity analysis process, the generated time-step and spatial-step sensitivity information reflects global numerical errors. The discretization errors can be systematically compared against uncertainties due to other physical parameters. This extension makes the forward sensitivity method a much more powerful tool to help uncertainty quantification. By knowing the relative sensitivity of time and space steps with other interested physical parameters, the simulation is allowed
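
    The mechanics of forward sensitivity analysis are easy to show on a scalar ODE: augmenting the state with the derivative of the solution with respect to a parameter yields the sensitivity in the same solve, rather than by black-box input sampling. The decay model below is our illustration, not the reactor system application.

      import numpy as np
      from scipy.integrate import solve_ivp

      # Forward sensitivity for dy/dt = -k*y: the sensitivity s = dy/dk obeys
      # ds/dt = d/dk(-k*y) = -y - k*s, so solving the augmented system gives
      # the parameter sensitivity directly alongside the solution.
      def rhs(t, state, k):
          y, s = state
          return [-k * y, -y - k * s]

      k, y0 = 0.5, 1.0
      sol = solve_ivp(rhs, (0.0, 4.0), [y0, 0.0], args=(k,), rtol=1e-8)
      y_end, s_end = sol.y[:, -1]
      print(s_end, -4.0 * y0 * np.exp(-k * 4.0))   # matches analytic dy/dk = -t*y0*exp(-k*t)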

  6. Methane bubbling: from speculation to quantification

    NASA Astrophysics Data System (ADS)

    Grinham, A. R.; Dunbabin, M.; Yuan, Z.

    2013-12-01

    magnitude from 500 to 100 000 mg m⁻² d⁻¹ depending on time of day and water depth. Average storage bubble flux rates between reservoirs varied by two orders of magnitude from 1 200 to 15 000 mg m⁻² d⁻¹, with the primary driver likely to be catchment forest cover. The relative contribution of bubbling to total fluxes varied from 10% to more than 90% depending on the reservoir and time of sampling. This method was consistently shown to greatly improve the spatial mapping and quantification of methane bubbling rates from reservoir surfaces and reduces the uncertainty associated with determining the relative contribution of bubbling to total flux.

  7. Uncertainty Quantification Techniques of SCALE/TSUNAMI

    SciTech Connect

    Rearden, Bradley T; Mueller, Don

    2011-01-01

    The Standardized Computer Analysis for Licensing Evaluation (SCALE) code system developed at Oak Ridge National Laboratory (ORNL) includes Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI). The TSUNAMI code suite can quantify the predicted change in system responses, such as k_eff, reactivity differences, or ratios of fluxes or reaction rates, due to changes in the energy-dependent, nuclide-reaction-specific cross-section data. Where uncertainties in the neutron cross-section data are available, the sensitivity of the system to the cross-section data can be applied to propagate the uncertainties in the cross-section data to an uncertainty in the system response. Uncertainty quantification is useful for identifying potential sources of computational biases and highlighting parameters important to code validation. Traditional validation techniques often examine one or more average physical parameters to characterize a system and identify applicable benchmark experiments. However, with TSUNAMI, correlation coefficients are developed by propagating the uncertainties in neutron cross-section data to uncertainties in the computed responses for experiments and safety applications through sensitivity coefficients. The bias in the experiments, as a function of their correlation coefficient with the intended application, is extrapolated to predict the bias and bias uncertainty in the application through trending analysis or generalized linear least squares techniques, often referred to as 'data adjustment.' Even with advanced tools to identify benchmark experiments, analysts occasionally find that the application models include some feature or material for which adequately similar benchmark experiments do not exist to support validation. For example, a criticality safety analyst may want to take credit for the presence of fission products in spent nuclear fuel. In such cases, analysts sometimes rely on 'expert judgment' to select an

  8. Next generation of food allergen quantification using mass spectrometric systems.

    PubMed

    Koeberl, Martina; Clarke, Dean; Lopata, Andreas L

    2014-08-01

    Food allergies are increasing worldwide and becoming a public health concern. Food legislation requires detailed declarations of potential allergens in food products and therefore an increased capability to analyze for the presence of food allergens. Currently, antibody-based methods are mainly utilized to quantify allergens; however, these methods have several disadvantages. Recently, mass spectrometry (MS) techniques have been developed and applied to food allergen analysis. At present, 46 allergens from 11 different food sources have been characterized using different MS approaches and some specific signature peptides have been published. However, quantification of allergens using MS is not routinely employed. This review compares the different aspects of food allergen quantification using advanced MS techniques including multiple reaction monitoring. The latter provides low limits of quantification for multiple allergens in simple or complex food matrices, while being robust and reproducible. This review provides an overview of current approaches to analyze food allergens, with specific focus on MS systems and applications. PMID:24824675

  9. Isobaric Labeling-Based Relative Quantification in Shotgun Proteomics

    PubMed Central

    2015-01-01

    Mass spectrometry plays a key role in relative quantitative comparisons of proteins in order to understand their functional role in biological systems upon perturbation. Here, we examine studies that address different aspects of isobaric labeling-based relative quantification for shotgun proteomic analysis. In particular, we focus on different types of isobaric reagents and their reaction chemistry (e.g., amine-, carbonyl-, and sulfhydryl-reactive). Various factors, such as ratio compression, reporter ion dynamic range, and others, cause an underestimation of changes in the relative abundance of proteins across samples, undermining the ability of the isobaric labeling approach to be truly quantitative. We discuss these factors and suggest combinations of experimental design and optimal data acquisition methods that increase the precision and accuracy of the measurements. Finally, the extended application of the isobaric labeling approach in hyperplexing strategies, targeted quantification, and phosphopeptide analysis is also examined. PMID:25337643

  10. Symmetry quantification and mapping using convergent beam electron diffraction.

    PubMed

    Kim, Kyou-Hyun; Zuo, Jian-Min

    2013-01-01

    We propose a new algorithm to quantify symmetry recorded in convergent beam electron diffraction (CBED) patterns and use it for symmetry mapping in materials applications. We evaluate the effectiveness of the profile R-factor (R_p) and the normalized cross-correlation coefficient (γ) for quantifying the amount of symmetry in a CBED pattern. The symmetry quantification procedures are automated and the algorithm is implemented as a DM (DigitalMicrograph©) script. Experimental and simulated CBED patterns recorded from a Si single crystal are used to calibrate the proposed algorithm for symmetry quantification. The proposed algorithm is then applied to a Si sample with defects to test the sensitivity of symmetry quantification to defects. Using mirror symmetry as an example, we demonstrate that the normalized cross-correlation coefficient provides an effective and robust measurement of the symmetry recorded in experimental CBED patterns. PMID:23142747
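
    A normalized cross-correlation test for mirror symmetry reduces to a few lines on a 2-D array; the synthetic pattern below stands in for an experimental CBED disc pattern and is not the paper's implementation.

      import numpy as np

      def mirror_symmetry_ncc(img, axis=1):
          # Normalized cross-correlation between an image and its mirror
          # reflection about the given axis; 1.0 means perfect symmetry.
          a = img - img.mean()
          b = np.flip(img, axis=axis) - img.mean()
          return float((a * b).sum() / np.sqrt((a ** 2).sum() * (b ** 2).sum()))

      # Synthetic stand-in for a CBED disc pattern, mirror-symmetric in x.
      yy, xx = np.mgrid[-64:64, -64:64]
      pattern = np.exp(-(xx ** 2 + yy ** 2) / 800.0) * np.cos(xx / 5.0) ** 2
      print(mirror_symmetry_ncc(pattern))                 # ~1.0
      rng = np.random.default_rng(1)
      noisy = pattern + 0.2 * rng.random(pattern.shape)
      print(mirror_symmetry_ncc(noisy))                   # degraded symmetry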

  11. Superlattice band structure: New and simple energy quantification condition

    NASA Astrophysics Data System (ADS)

    Maiz, F.

    2014-10-01

    Assuming an approximate effective mass and using Bastard's boundary conditions, a simple method is used to calculate the subband structure of periodic semiconducting heterostructures. Our method consists of deriving and solving the energy quantification condition (EQC), a simple real equation composed of trigonometric and hyperbolic functions that requires no programming effort or sophisticated machinery to solve. For heterostructures with fewer than ten wells, we have derived and simplified the energy quantification conditions. The subband is built point by point; each point represents an energy level. Our simple energy quantification condition is used to calculate the subband structure of GaAs/Ga0.5Al0.5As heterostructures, and to build the subband point by point for 4 and 20 wells. Our findings show good agreement with previously published results.
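
    The paper's exact EQC depends on Bastard's boundary conditions and the number of wells, so the sketch below instead solves the classic Kronig-Penney-type condition for a periodic well/barrier stack, which has the same trigonometric-plus-hyperbolic form; all material parameters are illustrative.

      import numpy as np

      # Classic Kronig-Penney-type EQC for a periodic stack of wells (width a)
      # and barriers (width b, height V0): energies E are allowed where
      # |f(E)| <= 1, with f built from trigonometric and hyperbolic terms.
      # Parameters are illustrative; this is not the paper's exact EQC.
      hbar2_2m = 3.81             # hbar^2 / 2m for a free electron, in eV*A^2
      V0, a, b = 0.3, 50.0, 10.0  # eV, Angstrom, Angstrom

      def f(E):
          k = np.sqrt(E / hbar2_2m)          # wave number in the wells
          q = np.sqrt((V0 - E) / hbar2_2m)   # decay constant in the barriers
          return (np.cos(k * a) * np.cosh(q * b)
                  + (q ** 2 - k ** 2) / (2 * k * q) * np.sin(k * a) * np.sinh(q * b))

      # Build the subband structure "point by point": scan the energy axis and
      # keep contiguous runs where the quantification condition is satisfied.
      E = np.linspace(1e-4, V0 - 1e-4, 4000)
      allowed = np.flatnonzero(np.abs(f(E)) <= 1.0)
      for run in np.split(allowed, np.where(np.diff(allowed) > 1)[0] + 1):
          if run.size:
              print(f"miniband: {E[run[0]]:.4f} - {E[run[-1]]:.4f} eV")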

  12. Clinical PET Myocardial Perfusion Imaging and Flow Quantification.

    PubMed

    Juneau, Daniel; Erthal, Fernanda; Ohira, Hiroshi; Mc Ardle, Brian; Hessian, Renée; deKemp, Robert A; Beanlands, Rob S B

    2016-02-01

    Cardiac PET imaging is a powerful tool for the assessment of coronary artery disease. Many tracers with different advantages and disadvantages are available. It has several advantages over single photon emission computed tomography, including superior accuracy and lower radiation exposure. It provides powerful prognostic information, which can help to stratify patients and guide clinicians. The addition of flow quantification enables better detection of multivessel disease while providing incremental prognostic information. Flow quantification provides important physiologic information, which may be useful to individualize patient therapy. This approach is being applied in some centers, but requires standardization before it is more widely applied. PMID:26590781

  13. Detection and quantification of chimerism by droplet digital PCR.

    PubMed

    George, David; Czech, Juliann; John, Bobby; Yu, Min; Jennings, Lawrence J

    2013-01-01

    Accurate quantification of chimerism and microchimerism is proving to be increasingly valuable for hematopoietic cell transplantation as well as non-transplant conditions. However, the methods that are available to quantify low-level chimerism lack accuracy. Therefore, we developed and validated a method for quantifying chimerism based on digital PCR technology. We demonstrate accurate quantification down to 0.01%, far exceeding what is possible with analog qPCR, with the potential to go even lower. Also, this method is inherently more informative than qPCR. We expect the advantages of digital PCR will make it the preferred method for chimerism analysis. PMID:23974275
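
    The Poisson correction underlying droplet digital PCR quantification is standard and compact; the droplet counts below are invented, and the ~0.85 nL droplet volume is typical of common commercial systems rather than a value from this paper.

      import numpy as np

      def copies_per_ul(n_pos, n_total, droplet_nl=0.85):
          # Poisson correction: mean copies per droplet lam = -ln(1 - p),
          # converted to copies per microliter of reaction volume.
          lam = -np.log(1.0 - n_pos / n_total)
          return lam / (droplet_nl * 1e-3)

      # Invented droplet counts for one well.
      donor = copies_per_ul(14, 15000)        # donor-informative marker
      total = copies_per_ul(13000, 15000)     # reference marker (all genomes)
      print(f"chimerism: {100.0 * donor / total:.3f}%")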

  14. Quantification of Cellular Proliferation in Mouse Atherosclerotic Lesions.

    PubMed

    Fuster, José J

    2015-01-01

    Excessive cell proliferation within atherosclerotic plaques plays an important role in the progression of atherosclerosis. Macrophage proliferation in particular has become a major focus of attention in the cardiovascular field because it appears to mediate most of the macrophage expansion in mouse atherosclerotic arteries. Therefore, quantification of cell proliferation is an essential part of the characterization of atherosclerotic plaques in experimental studies. This chapter describes two variants of a simple immunostaining protocol that allow for the quantification of cellular proliferation in mouse atherosclerotic lesions based on the detection of the proliferation-associated antigen Ki-67. PMID:26445791

  15. A quick colorimetric method for total lipid quantification in microalgae.

    PubMed

    Byreddy, Avinesh R; Gupta, Adarsha; Barrow, Colin J; Puri, Munish

    2016-06-01

    Discovering microalgae with high lipid productivity is among the key milestones for achieving sustainable biodiesel production. Current methods of lipid quantification are time-intensive and costly. A rapid colorimetric method based on the sulfo-phospho-vanillin (SPV) reaction was developed for the quantification of microbial lipids to facilitate screening for lipid-producing microalgae. This method was successfully tested on marine thraustochytrid strains and vegetable oils. The colorimetric method results correlated well with gravimetric method estimates. The new method was less time-consuming than gravimetric analysis and is quantitative for lipid determination, even in the presence of carbohydrates, proteins and glycerol. PMID:27050419

  16. Quantification of toxicological effects for dichloromethane. Draft report (Final)

    SciTech Connect

    Not Available

    1990-04-01

    The source documents for background information used to develop the report on the quantification of toxicological effects for dichloromethane are the health assessment document (HAD) for dichloromethane and a subsequent addendum to the HAD (U.S. EPA, 1985b). In addition, some references published since 1985 are discussed. To summarize the results of the quantification of toxicological effects, a One-day Health Advisory of 10,000 µg/L for a 10-kg child was calculated, based on an acute oral study in rats reported by Kimura et al. (1971). No suitable data for the derivation of a Ten-day Health Advisory were found in the available literature.

  17. Brief review of uncertainty quantification for particle image velocimetry

    NASA Astrophysics Data System (ADS)

    Farias, M. H.; Teixeira, R. S.; Koiller, J.; Santos, A. M.

    2016-07-01

    Metrological studies of particle image velocimetry (PIV) are recent in the literature. Attempts to evaluate the uncertainty quantification (UQ) of PIV velocity fields are in evidence. Therefore, a short review of the main sources of uncertainty in PIV and the available methodologies for its quantification is presented. In addition, the potential of some mathematical techniques, coming from the area of geometric mechanics and control, that could interest the fluids UQ community is highlighted as a promising possibility. “We must measure what is measurable and make measurable what cannot be measured” (Galileo)

  18. Practical quantification of necrosis in histological whole-slide images.

    PubMed

    Homeyer, André; Schenk, Andrea; Arlt, Janine; Dahmen, Uta; Dirsch, Olaf; Hahn, Horst K

    2013-06-01

    Since the histological quantification of necrosis is a common task in medical research and practice, we evaluate different image analysis methods for quantifying necrosis in whole-slide images. In a practical usage scenario, we assess the impact of different classification algorithms and feature sets on both accuracy and computation time. We show how a well-chosen combination of multiresolution features and an efficient postprocessing step enables the accurate quantification of necrosis in gigapixel images in less than a minute. The results are general enough to be applied to other areas of histological image analysis as well. PMID:23796718

  19. Quantification is Neither Necessary Nor Sufficient for Measurement

    NASA Astrophysics Data System (ADS)

    Mari, Luca; Maul, Andrew; Torres Irribarra, David; Wilson, Mark

    2013-09-01

    Being an infrastructural, widespread activity, measurement is laden with stereotypes. Some of these concern the role of measurement in the relation between quality and quantity. In particular, it is sometimes argued or assumed that quantification is necessary for measurement; it is also sometimes argued or assumed that quantification is sufficient for or synonymous with measurement. To assess the validity of these positions the concepts of measurement and quantitative evaluation should be independently defined and their relationship analyzed. We contend that the defining characteristic of measurement should be the structure of the process, not a feature of its results. Under this perspective, quantitative evaluation is neither sufficient nor necessary for measurement.

  20. Improved Strategies and Optimization of Calibration Models for Real-time PCR Absolute Quantification

    EPA Science Inventory

    Real-time PCR absolute quantification applications rely on the use of standard curves to make estimates of DNA target concentrations in unknown samples. Traditional absolute quantification approaches dictate that a standard curve must accompany each experimental run. However, t...

  1. Reliability quantification and visualization for electric microgrids

    NASA Astrophysics Data System (ADS)

    Panwar, Mayank

    and parallel with the area Electric Power Systems (EPS), (3) includes the local EPS and may include portions of the area EPS, and (4) is intentionally planned. A more reliable electric power grid requires microgrids to operate in tandem with the EPS. The reliability can be quantified through various metrics for performance measurement. In North America, this is done through North American Electric Reliability Corporation (NERC) metrics. The microgrid differs significantly from the traditional EPS, especially at the asset level, due to the heterogeneity of its assets. Thus, its performance cannot be quantified by the same metrics as used for the EPS. Some of the NERC metrics are calculated and interpreted in this work to quantify performance for a single asset and a group of assets in a microgrid. Two more metrics are introduced for system-level performance quantification. The next step is a better representation of the large amount of data generated by the microgrid. Visualization is one such form of representation, which is explored in detail, and a graphical user interface (GUI) is developed as a deliverable tool for the operator to support informative decision making and planning. Electronic appendices I and II contain data and MATLAB© program codes for analysis and visualization for this work.

  2. 15 CFR 990.52 - Injury assessment-quantification.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 15 Commerce and Foreign Trade 3 2014-01-01 2014-01-01 false Injury assessment-quantification. 990.52 Section 990.52 Commerce and Foreign Trade Regulations Relating to Commerce and Foreign Trade (Continued) NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE OIL POLLUTION ACT REGULATIONS NATURAL RESOURCE DAMAGE...

  3. Quantification of Wheat Grain Arabinoxylans Using a Phloroglucinol Colorimetric Assay

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Arabinoxylans (AX) play a critical role in end-use quality and nutrition of wheat (Triticum aestivum L.). An efficient, accurate method of AX quantification is desirable as AX plays an important role in processing, end use quality and human health. The objective of this work was to evaluate a stand...

  4. Comparison of DNA Quantification Methods for Next Generation Sequencing

    PubMed Central

    Robin, Jérôme D.; Ludlow, Andrew T.; LaRanger, Ryan; Wright, Woodring E.; Shay, Jerry W.

    2016-01-01

    Next Generation Sequencing (NGS) is a powerful tool that depends on loading a precise amount of DNA onto a flowcell. NGS strategies have expanded our ability to investigate genomic phenomena by referencing mutations in cancer and diseases through large-scale genotyping, developing methods to map rare chromatin interactions (4C; 5C and Hi-C) and identifying chromatin features associated with regulatory elements (ChIP-seq, Bis-Seq, ChiA-PET). While many methods are available for DNA library quantification, there is no unambiguous gold standard. Most techniques use PCR to amplify DNA libraries to obtain sufficient quantities for optical density measurement. However, increased PCR cycles can distort the library’s heterogeneity and prevent the detection of rare variants. In this analysis, we compared new digital PCR technologies (droplet digital PCR; ddPCR, ddPCR-Tail) with standard methods for the titration of NGS libraries. DdPCR-Tail is comparable to qPCR and fluorometry (QuBit) and allows sensitive quantification by analysis of barcode repartition after sequencing of multiplexed samples. This study provides a direct comparison between quantification methods throughout a complete sequencing experiment and provides the impetus to use ddPCR-based quantification for improvement of NGS quality. PMID:27048884

  5. DeMix-Q: Quantification-Centered Data Processing Workflow.

    PubMed

    Zhang, Bo; Käll, Lukas; Zubarev, Roman A

    2016-04-01

    For historical reasons, most proteomics workflows focus on MS/MS identification but consider quantification as the end point of a comparative study. The stochastic data-dependent MS/MS acquisition (DDA) gives low reproducibility of peptide identifications from one run to another, which inevitably results in problems with missing values when quantifying the same peptide across a series of label-free experiments. However, the signal from the molecular ion is almost always present among the MS1 spectra. Contrary to what is frequently claimed, missing values do not have to be an intrinsic problem of DDA approaches that perform quantification at the MS1 level. The challenge is to perform sound peptide identity propagation across multiple high-resolution LC-MS/MS experiments, from runs with MS/MS-based identifications to runs where such information is absent. Here, we present a new analytical workflow DeMix-Q (https://github.com/userbz/DeMix-Q), which performs such propagation that recovers missing values reliably by using a novel scoring scheme for quality control. Compared with traditional workflows for DDA as well as previous DIA studies, DeMix-Q achieves deeper proteome coverage, fewer missing values, and lower quantification variance on a benchmark dataset. This quantification-centered workflow also enables flexible and robust proteome characterization based on covariation of peptide abundances. PMID:26729709

  6. Quantification and Single-Spore Detection of Phakopsora pachyrhizi

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The microscopic identification and quantification of Phakopsora pachyrhizi spores from environmental samples, spore traps, and laboratory specimens can represent a challenge. Such reports, especially from passive spore traps, commonly describe the number of “rust-like” spores; for other forensic sa...

  7. Colorimetric Quantification and in Situ Detection of Collagen

    ERIC Educational Resources Information Center

    Esteban, Francisco J.; del Moral, Maria L.; Sanchez-Lopez, Ana M.; Blanco, Santos; Jimenez, Ana; Hernandez, Raquel; Pedrosa, Juan A.; Peinado, Maria A.

    2005-01-01

    A simple multidisciplinary and inexpensive laboratory exercise is proposed, in which the undergraduate student may correlate biochemical and anatomical findings. The entire practical session can be completed in one 2.5-3 hour laboratory period, and consists of the quantification of collagen and total protein content from tissue sections--without…

  8. Identification and quantification of methanogenic archaea in adult chicken ceca

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Methanogens, members of the domain Archaea, have been isolated from various animals, but few reports exist regarding the isolation of methanogens from chicken, goose, and turkey feces. By using molecular methods for the identification and quantification of methanogenic archaea in adult chicken ceca,...

  9. Literacy and Language Education: The Quantification of Learning

    ERIC Educational Resources Information Center

    Gibb, Tara

    2015-01-01

    This chapter describes international policy contexts of adult literacy and language assessment and the shift toward standardization through measurement tools. It considers the implications the quantification of learning outcomes has for pedagogy and practice and for the social inclusion of transnational migrants.

  10. Current Issues in the Quantification of Federal Reserved Water Rights

    NASA Astrophysics Data System (ADS)

    Brookshire, David S.; Watts, Gary L.; Merrill, James L.

    1985-11-01

    This paper examines the quantification of federal reserved water rights from legal, institutional, and economic perspectives. Special attention is directed toward Indian reserved water rights and the concept of practicably irrigable acreage. We conclude by examining current trends and exploring alternative approaches to the dilemma of quantifying Indian reserved water rights.

  11. Infectious Viral Quantification of Chikungunya Virus-Virus Plaque Assay.

    PubMed

    Kaur, Parveen; Lee, Regina Ching Hua; Chu, Justin Jang Hann

    2016-01-01

    The plaque assay is an essential method for quantification of infectious virus titer. Cells infected with virus particles are overlaid with a viscous substrate. A suitable incubation period results in the formation of plaques, which can be fixed and stained for visualization. Here, we describe a method for measuring Chikungunya virus (CHIKV) titers via virus plaque assays. PMID:27233264
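
    Titers from a plaque assay follow the standard relation PFU/mL = mean plaque count / (dilution factor x inoculum volume). A small sketch; the counts and volumes below are made up for illustration:

        def plaque_titer(plaque_counts, dilution, inoculum_ml):
            """Infectious titer (PFU/mL) from replicate plaque counts.

            dilution is the dilution factor of the plated sample (e.g. 1e-6);
            inoculum_ml is the volume plated per well.
            """
            mean_plaques = sum(plaque_counts) / len(plaque_counts)
            return mean_plaques / (dilution * inoculum_ml)

        # Duplicate wells with 42 and 38 plaques at a 10^-6 dilution, 0.1 mL plated
        print(f"{plaque_titer([42, 38], 1e-6, 0.1):.2e} PFU/mL")  # 4.00e+08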

  12. Identification and Quantification Soil Redoximorphic Features by Digital Image Processing

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Soil redoximorphic features (SRFs) have provided scientists and land managers with insight into relative soil moisture for approximately 60 years. The overall objective of this study was to develop a new method of SRF identification and quantification from soil cores using a digital camera and imag...

  13. Macroscopic inspection of ape feces: what's in a quantification method?

    PubMed

    Phillips, Caroline A; McGrew, William C

    2014-06-01

    Macroscopic inspection of feces has been used to investigate primate diet. The limitations of this method for identifying food-items to species level have long been recognized, but aspects of diet (e.g., folivory) can be ascertained by quantifying food-items in feces. Quantification methods applied include rating food-items using a scale of abundance, estimating their percentage volume, and weighing food-items. However, it has not been verified whether composition data differ depending on which quantification method is used during macroscopic inspection. We analyzed feces collected from ten adult chimpanzees (Pan troglodytes schweinfurthii) of the Kanyawara community in Kibale National Park, Uganda. We compare dietary composition totals obtained using different quantification methods and ascertain whether sieve mesh size influences the totals calculated. Finally, this study validates findings from direct observation of feeding by the same individuals from whom the fecal samples had been collected. Contrasting diet composition totals obtained using different quantification methods and sieve mesh sizes can influence folivory and frugivory estimates. However, our findings were based on the assumption that fibrous matter contained pith and leaf fragments only, which remains to be verified. We advocate that macroscopic inspection of feces can be a valuable tool for providing a generalized overview of dietary composition for primate populations. As most populations remain unhabituated, scrutinizing and validating indirect measures are important if they are to be applied to further understand inter- and intra-species dietary variation. PMID:24482001

  14. Comparison of DNA Quantification Methods for Next Generation Sequencing.

    PubMed

    Robin, Jérôme D; Ludlow, Andrew T; LaRanger, Ryan; Wright, Woodring E; Shay, Jerry W

    2016-01-01

    Next Generation Sequencing (NGS) is a powerful tool that depends on loading a precise amount of DNA onto a flowcell. NGS strategies have expanded our ability to investigate genomic phenomena by referencing mutations in cancer and diseases through large-scale genotyping, developing methods to map rare chromatin interactions (4C, 5C, and Hi-C) and identifying chromatin features associated with regulatory elements (ChIP-seq, Bis-Seq, ChiA-PET). While many methods are available for DNA library quantification, there is no unambiguous gold standard. Most techniques use PCR to amplify DNA libraries to obtain sufficient quantities for optical density measurement. However, increased PCR cycles can distort the library's heterogeneity and prevent the detection of rare variants. In this analysis, we compared new digital PCR technologies (droplet digital PCR; ddPCR, ddPCR-Tail) with standard methods for the titration of NGS libraries. DdPCR-Tail is comparable to qPCR and fluorometry (Qubit) and allows sensitive quantification by analysis of barcode repartition after sequencing of multiplexed samples. This study provides a direct comparison between quantification methods throughout a complete sequencing experiment and provides the impetus to use ddPCR-based quantification for improvement of NGS quality. PMID:27048884

  15. Juvenile Hormone Extraction, Purification, and Quantification in Ants

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Juvenile hormone (JH) is an important insect hormone known to have many effects on development, reproduction, and behavior in both solitary and social insects. A number of questions using ants as a model involve JH. This procedure allows for quantification of circulating levels of JH III, which can ...

  16. The Role of Uncertainty Quantification for Reactor Physics

    SciTech Connect

    Salvatores, M.; Aliberti, G.; Palmiotti, G.

    2015-01-01

    The quantification of uncertainties is a crucial step in design. Comparing a-priori uncertainties with the target accuracies allows needs and priorities for uncertainty reduction to be defined. In view of their impact, the uncertainty analysis requires a reliability assessment of the uncertainty data used. The choice of the appropriate approach and the consistency of different approaches are discussed.

  17. THE QUANTIFICATION OF FUNCTIONAL LOAD--A LINGUISTIC PROBLEM.

    ERIC Educational Resources Information Center

    HOCKETT, C.F.

    Measurement criteria are developed for the quantification of the functional load of the phonemes of a language. The concept of functional load or yield, from certain theories of linguistic change, states that some contrasts between the distinctive sounds of a language do more work than others by occurring more frequently and in more linguistic…

  18. Quantification of confocal images of biofilms grown on irregular surfaces.

    PubMed

    Sommerfeld Ross, Stacy; Tu, Mai Han; Falsetta, Megan L; Ketterer, Margaret R; Kiedrowski, Megan R; Horswill, Alexander R; Apicella, Michael A; Reinhardt, Joseph M; Fiegel, Jennifer

    2014-05-01

    Bacterial biofilms grow on many types of surfaces, including flat surfaces such as glass and metal and irregular surfaces such as rocks, biological tissues and polymers. While laser scanning confocal microscopy can provide high-resolution images of biofilms grown on any surface, quantification of biofilm-associated bacteria is currently limited to bacteria grown on flat surfaces. This can limit researchers studying irregular surfaces to qualitative analysis or quantification of only the total bacteria in an image. In this work, we introduce a new algorithm called modified connected volume filtration (MCVF) to quantify bacteria grown on top of an irregular surface that is fluorescently labeled or reflective. Using the MCVF algorithm, two new quantification parameters are introduced. The modified substratum coverage parameter enables quantification of the connected-biofilm bacteria on top of the surface and on the imaging substratum. The utility of MCVF and the modified substratum coverage parameter were shown with Pseudomonas aeruginosa and Staphylococcus aureus biofilms grown on human airway epithelial cells. A second parameter, the percent association, provides quantified data on the colocalization of the bacteria with a labeled component, including bacteria within a labeled tissue. The utility of quantifying the bacteria associated with the cell cytoplasm was demonstrated with Neisseria gonorrhoeae biofilms grown on cervical epithelial cells. This algorithm provides more flexibility and quantitative ability to researchers studying biofilms grown on a variety of irregular substrata. PMID:24632515
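
    The core of connected-volume filtration is a 3D connected-component analysis seeded at the substratum. A simplified scipy sketch of that core step; the array layout is an assumption, and the published MCVF extensions for labeled or reflective irregular surfaces are not shown:

        import numpy as np
        from scipy import ndimage

        def connected_biofilm(stack, substratum_mask):
            """Keep only biomass voxels connected to the substratum surface.

            stack:           3D boolean array (z, y, x) of thresholded biomass
            substratum_mask: 3D boolean array marking the surface voxels
            """
            labels, _ = ndimage.label(stack)               # 3D components
            seeds = np.unique(labels[substratum_mask])     # components touching surface
            seeds = seeds[seeds != 0]                      # drop background label
            return np.isin(labels, seeds)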

  19. A Quantification Approach to Popular American Theatre: Outline.

    ERIC Educational Resources Information Center

    Woods, Alan

    A previously relatively unexplored area of theater history studies is the quantification of titles, authors, and locations of productions of plays in Canada and the United States. Little is known, for example, about the number of times any one play was staged, especially in the earlier days of American drama. A project which counts productions on…

  20. The Role of Uncertainty Quantification for Reactor Physics

    SciTech Connect

    Salvatores, M.; Aliberti, G.; Palmiotti, G.

    2015-01-15

    The quantification of uncertainties is a crucial step in design. Comparing a-priori uncertainties with the target accuracies allows needs and priorities for uncertainty reduction to be defined. In view of their impact, the uncertainty analysis requires a reliability assessment of the uncertainty data used. The choice of the appropriate approach and the consistency of different approaches are discussed.

  1. Luminometric Label Array for Quantification and Identification of Metal Ions.

    PubMed

    Pihlasalo, Sari; Montoya Perez, Ileana; Hollo, Niklas; Hokkanen, Elina; Pahikkala, Tapio; Härmä, Harri

    2016-05-17

    Quantification and identification of metal ions have gained interest in drinking water and environmental analyses. We have developed a novel label array method for the quantification and identification of metal ions in drinking water. This simple ready-to-go method is based on the nonspecific interactions of multiple unstable lanthanide chelates and nonantenna ligands with the sample, leading to a luminescence signal profile unique to the sample components. A limit of detection at the ppb concentration level and an average coefficient of variation of 10% were achieved with the developed label array. The identification of 15 different metal ions, including the different oxidation states Cr(3+)/Cr(6+), Cu(+)/Cu(2+), Fe(2+)/Fe(3+), and Pb(2+)/Pb(4+), was demonstrated. Moreover, a binary mixture of Cu(2+) and Fe(3+) and a ternary mixture of Cd(2+), Ni(2+), and Pb(2+) were measured and the individual ions were distinguished. PMID:27086705

  2. Quantification of viable helminth eggs in samples of sewage sludge.

    PubMed

    Rocha, Maria Carolina Vieira da; Barés, Monica Eboly; Braga, Maria Cristina Borba

    2016-10-15

    For the application of sewage sludge as fertilizer, the absence of pathogenic organisms, such as viable helminth eggs, is of fundamental importance. Thus, the quantification of these organisms has to be carried out by means of reliable and accurate methodologies. Nevertheless, to date there is no consensus with regard to the adoption of a universal methodology for the detection and quantification of viable helminth eggs. It is therefore necessary to instigate a debate on the different protocols currently in use, as well as to assemble relevant information in order to assist in the development of a more comprehensive and accurate method to quantify viable helminth eggs in samples of sewage sludge and its derivatives. PMID:27470467

  3. Uncertainty Quantification and Validation for RANS Turbulence Models

    NASA Astrophysics Data System (ADS)

    Oliver, Todd; Moser, Robert

    2011-11-01

    Uncertainty quantification and validation procedures for RANS turbulence models are developed and applied. The procedures used here rely on a Bayesian view of probability. In particular, the uncertainty quantification methodology requires stochastic model development, model calibration, and model comparison, all of which are pursued using tools from Bayesian statistics. Model validation is also pursued in a probabilistic framework. The ideas and processes are demonstrated on a channel flow example. Specifically, a set of RANS models--including Baldwin-Lomax, Spalart-Allmaras, k-ε, k-ω, and v2-f--and uncertainty representations are analyzed using DNS data for fully-developed channel flow. Predictions of various quantities of interest and the validity (or invalidity) of the various models for making those predictions will be examined. This work is supported by the Department of Energy [National Nuclear Security Administration] under Award Number [DE-FC52-08NA28615].

  4. Statistical challenges in the quantification of gunshot residue evidence.

    PubMed

    Gauriot, Romain; Gunaratnam, Lawrence; Moroni, Rossana; Reinikainen, Tapani; Corander, Jukka

    2013-09-01

    The discharging of a gun results in the formation of extremely small particles known as gunshot residues (GSR). These may be deposited on the skin and clothing of the shooter, on other persons present, and on nearby items or surfaces. Several factors and their complex interactions affect the number of detectable GSR particles, which can deeply influence the conclusions drawn from likelihood ratios or posterior probabilities for prosecution hypotheses of interest. We present Bayesian network models for casework examples and demonstrate that probabilistic quantification of GSR evidence can be very sensitive to the assumptions concerning the model structure, prior probabilities, and the likelihood components. This finding has considerable implications for the use of statistical quantification of GSR evidence in the legal process. PMID:23822522

  5. Direct immunomagnetic quantification of lymphocyte subsets in blood.

    PubMed Central

    Brinchmann, J E; Vartdal, F; Gaudernack, G; Markussen, G; Funderud, S; Ugelstad, J; Thorsby, E

    1988-01-01

    A method is described where superparamagnetic polymer microspheres coated with monoclonal antibodies (MoAb) are used for the direct and fast quantification of the absolute number of cells of various lymphocyte subsets in blood. Blood samples were incubated with microspheres coated with a subset-specific MoAb. Using a magnet, the microsphere-rosetted cells were isolated and washed. Following lysis of the cell membranes to detach the microspheres, the cell nuclei were stained with acridine orange and counted in a haemocytometer using an immunofluorescence microscope. With MoAb specific for CD2, CD4, CD8 and CD19, reproducible absolute counts of the corresponding lymphocyte subsets were obtained which correlated closely with those obtained by an indirect quantification method. PMID:3349645
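
    The haemocytometer step converts counts per chamber square into absolute cells per microliter. A short sketch, assuming a standard Neubauer chamber in which one large square corresponds to 0.1 uL:

        def cells_per_microliter(counts_per_square, dilution_factor=1.0,
                                 square_volume_ul=0.1):
            """Absolute cell concentration from haemocytometer counts.

            A standard Neubauer large square holds 0.1 uL (1 mm x 1 mm area,
            0.1 mm chamber depth); adjust square_volume_ul for other chambers.
            """
            mean_count = sum(counts_per_square) / len(counts_per_square)
            return mean_count * dilution_factor / square_volume_ul

        # Four squares counted after a 1:2 dilution -> 1020 cells/uL
        print(cells_per_microliter([52, 47, 55, 50], dilution_factor=2))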

  6. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    SciTech Connect

    Langenbrunner, James R.; Booker, Jane M; Hemez, Francois M; Salazar, Issac F; Ross, Timothy J

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. To realistically assess uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore, PRAs lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with words, 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step for the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.

  7. Compositional Solution Space Quantification for Probabilistic Software Analysis

    NASA Technical Reports Server (NTRS)

    Borges, Mateus; Pasareanu, Corina S.; Filieri, Antonio; d'Amorim, Marcelo; Visser, Willem

    2014-01-01

    Probabilistic software analysis aims at quantifying how likely a target event is to occur during program execution. Current approaches rely on symbolic execution to identify the conditions to reach the target event and try to quantify the fraction of the input domain satisfying these conditions. Precise quantification is usually limited to linear constraints, while only approximate solutions can be provided in general through statistical approaches. However, statistical approaches may fail to converge to an acceptable accuracy within a reasonable time. We present a compositional statistical approach for the efficient quantification of solution spaces for arbitrarily complex constraints over bounded floating-point domains. The approach leverages interval constraint propagation to improve the accuracy of the estimation by focusing the sampling on the regions of the input domain containing the sought solutions. Preliminary experiments show significant improvement on previous approaches both in results accuracy and analysis time.
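
    Stripped of the interval-propagation refinement, the statistical core of solution-space quantification is a Monte Carlo estimate of the satisfying fraction of a bounded domain. A plain sketch; the function and parameter names are illustrative, not the paper's API:

        import random

        def solution_fraction(constraint, bounds, n_samples=100_000, seed=0):
            """Estimate the fraction of a bounded input domain satisfying
            a constraint.

            constraint: callable taking a tuple of floats and returning bool
            bounds:     list of (lo, hi) pairs, one per input dimension
            """
            rng = random.Random(seed)
            hits = 0
            for _ in range(n_samples):
                point = tuple(rng.uniform(lo, hi) for lo, hi in bounds)
                hits += constraint(point)
            return hits / n_samples

        # Fraction of [0,1]^2 where x^2 + y^2 <= 1 (expect about pi/4)
        print(solution_fraction(lambda p: p[0]**2 + p[1]**2 <= 1, [(0, 1), (0, 1)]))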

  8. Neutron-Encoded Protein Quantification by Peptide Carbamylation

    NASA Astrophysics Data System (ADS)

    Ulbrich, Arne; Merrill, Anna E.; Hebert, Alexander S.; Westphall, Michael S.; Keller, Mark P.; Attie, Alan D.; Coon, Joshua J.

    2014-01-01

    We describe a chemical tag for duplex proteome quantification using neutron encoding (NeuCode). The method utilizes the straightforward, efficient, and inexpensive carbamylation reaction. We demonstrate the utility of NeuCode carbamylation by accurately measuring quantitative ratios from tagged yeast lysates mixed in known ratios and by applying this method to quantify differential protein expression in mice fed either a control or a high-fat diet.

  9. Diagnostic utility of droplet digital PCR for HIV reservoir quantification.

    PubMed

    Trypsteen, Wim; Kiselinova, Maja; Vandekerckhove, Linos; De Spiegelaere, Ward

    2016-01-01

    Quantitative real-time PCR (qPCR) is implemented in many molecular laboratories worldwide for the quantification of viral nucleic acids. However, over the last two decades, there has been renewed interest in the concept of digital PCR (dPCR), as this platform offers direct quantification without the need for standard curves, a simplified workflow and the possibility to extend the current detection limit. These benefits are of great interest for the quantification of low viral levels in HIV reservoir research, because monitoring changes in the dynamics of residual HIV reservoirs will be important for HIV cure efforts. Here, we have implemented a systematic literature screening and text mining approach to map the use of droplet dPCR (ddPCR) in the context of HIV quantification. In addition, several technical aspects of ddPCR were compared with qPCR: accuracy, sensitivity, precision and reproducibility, to determine its diagnostic utility. We have observed that ddPCR was used in different body compartments in multiple HIV-1 and HIV-2 assays, with the majority of reported assays focusing on HIV-1 DNA-based applications (i.e. total HIV DNA). Furthermore, ddPCR showed a higher accuracy, precision and reproducibility, but similar sensitivity when compared to qPCR, due to reported false positive droplets in the negative template controls and a need for standardised data analysis (i.e. threshold determination). In the context of a low level of detection and HIV reservoir diagnostics, ddPCR can offer a valid alternative to qPCR-based assays, but before this platform can be clinically accredited, some remaining issues need to be resolved. PMID:27482456

  10. Neutron-encoded protein quantification by peptide carbamylation.

    PubMed

    Ulbrich, Arne; Merrill, Anna E; Hebert, Alexander S; Westphall, Michael S; Keller, Mark P; Attie, Alan D; Coon, Joshua J

    2014-01-01

    We describe a chemical tag for duplex proteome quantification using neutron encoding (NeuCode). The method utilizes the straightforward, efficient, and inexpensive carbamylation reaction. We demonstrate the utility of NeuCode carbamylation by accurately measuring quantitative ratios from tagged yeast lysates mixed in known ratios and by applying this method to quantify differential protein expression in mice fed either a control or a high-fat diet. PMID:24178922

  11. Neutron-encoded protein quantification by peptide carbamylation

    PubMed Central

    Ulbrich, Arne; Merrill, Anna E.; Hebert, Alexander S.; Westphall, Michael S.; Keller, Mark P.; Attie, Alan D.; Coon, Joshua J.

    2013-01-01

    We describe a chemical tag for duplex proteome quantification using neutron encoding (NeuCode). The method utilizes the straightforward, efficient, and inexpensive carbamylation reaction. We demonstrate the utility of NeuCode carbamylation by accurately measuring quantitative ratios from tagged yeast lysates mixed in known ratios and by applying this method to quantify differential protein expression in mice fed either a control or a high-fat diet. PMID:24178922

  12. Universal Quantification in a Constraint-Based Planner

    NASA Technical Reports Server (NTRS)

    Golden, Keith; Frank, Jeremy; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Constraints and universal quantification are both useful in planning, but handling universally quantified constraints presents some novel challenges. We present a general approach to proving the validity of universally quantified constraints. The approach essentially consists of checking that the constraint is not violated for all members of the universe. We show that this approach can sometimes be applied even when variable domains are infinite, and we present some useful special cases where this can be done efficiently.
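
    Over a finite universe, the validity check described above reduces to verifying that the constraint holds for every member. A minimal illustration; the paper's contribution is the machinery that avoids this brute-force enumeration for large or infinite domains, which is not shown:

        from itertools import product

        def holds_universally(constraint, domains):
            """Check a universally quantified constraint by testing every
            combination of values drawn from the finite domains."""
            return all(constraint(*values) for values in product(*domains))

        # forall x in {1,2,3}, y in {4,5}: x < y  -> True
        print(holds_universally(lambda x, y: x < y, [{1, 2, 3}, {4, 5}]))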

  13. Quantification of Water Absorption and Transport in Parchment

    NASA Astrophysics Data System (ADS)

    Herringer, Susan N.; Bilheux, Hassina Z.; Bearman, Greg

    Neutron radiography was utilized to quantify water absorption and desorption in parchment at the High Flux Isotope Reactor CG-1D imaging facility at Oak Ridge National Laboratory (ORNL). Sequential 60s radiographs of sections of a 15th century parchment were taken as the parchment underwent wetting and drying cycles. This provided time-resolved visualization and quantification of water absorption and transport in parchment.

  14. Incorporating Functional Gene Quantification into Traditional Decomposition Models

    NASA Astrophysics Data System (ADS)

    Todd-Brown, K. E.; Zhou, J.; Yin, H.; Wu, L.; Tiedje, J. M.; Schuur, E. A. G.; Konstantinidis, K.; Luo, Y.

    2014-12-01

    Incorporating new genetic quantification measurements into traditional substrate pool models represents a substantial challenge. These decomposition models are built around the idea that substrate availability, together with environmental drivers, limits carbon dioxide respiration rates. In this paradigm, microbial communities optimally adapt to a given substrate and environment on much shorter time scales than the carbon flux of interest. By characterizing the relative shift in biomass of these microbial communities, we informed previously poorly constrained parameters in traditional decomposition models. In this study we coupled a 9 month laboratory incubation study with quantitative gene measurements, traditional CO2 flux measurements, and initial soil organic carbon quantification. GeoChip 5.0 was used to quantify the functional genes associated with carbon cycling at 2 weeks, 3 months and 9 months. We then combined the genes which 'collapsed' over the experiment and assumed that this tracked the relative change in the biomass associated with the 'fast' pool. We further assumed that this biomass was proportional to the 'fast' SOC pool and thus were able to constrain the relative change in the fast SOC pool in our 3-pool decomposition model. We found that the biomass quantification described above, combined with traditional CO2 flux and SOC measurements, improves the transfer coefficient estimation in traditional decomposition models. Transfer coefficients are very difficult to characterize using traditional CO2 flux measurements, thus DNA quantification provides new and significant information about the system. Over a 100 year simulation, these new biologically informed parameters resulted in an additional 10% of SOC loss over the traditionally informed parameters.
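
    For concreteness, one Euler step of a generic three-pool, first-order decomposition model with the kind of transfer coefficients the authors constrain; all rates and fractions below are invented for illustration:

        import numpy as np

        def step_pools(pools, k, transfer, dt=1.0):
            """Advance a 3-pool soil-carbon model by one time step.

            pools:    pool sizes (fast, slow, passive)
            k:        first-order decay rates (1/time)
            transfer: transfer[i, j] = fraction of carbon leaving pool j
                      that enters pool i; the remainder is respired as CO2.
            Returns (new_pools, co2_flux).
            """
            decay = k * pools                  # carbon leaving each pool
            gains = transfer @ decay           # carbon redistributed to pools
            co2 = decay.sum() - gains.sum()    # respired fraction
            return pools + dt * (gains - decay), co2

        pools = np.array([10.0, 50.0, 100.0])
        k = np.array([0.5, 0.05, 0.001])
        transfer = np.array([[0.0, 0.0, 0.0],
                             [0.3, 0.0, 0.0],
                             [0.0, 0.1, 0.0]])
        print(step_pools(pools, k, transfer))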

  15. Pulsatility of Hypothalamo-Pituitary Hormones: A Challenge in Quantification.

    PubMed

    Keenan, Daniel M; Veldhuis, Johannes D

    2016-01-01

    Neuroendocrine systems control many of the most fundamental physiological processes, e.g., reproduction, growth, adaptations to stress, and metabolism. Each such system involves the hypothalamus, the pituitary, and a specific target gland or organ. In the quantification of the interactions among these components, biostatistical modeling has played an important role. In the present article, five key challenges to an understanding of the interactions of these systems are illustrated and discussed critically. PMID:26674550

  16. Visualization and Quantification of Rotor Tip Vortices in Helicopter Flows

    NASA Technical Reports Server (NTRS)

    Kao, David L.; Ahmad, Jasim U.; Holst, Terry L.

    2015-01-01

    This paper presents an automated approach for effective extraction, visualization, and quantification of vortex core radii from Navier-Stokes simulations of a UH-60A rotor in forward flight. We adopt a scaled Q-criterion to determine vortex regions and then perform vortex core profiling in these regions to calculate vortex core radii. This method provides an efficient way of visualizing and quantifying the blade tip vortices. Moreover, the vortex core radii are displayed graphically in a plane.
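
    The Q-criterion referred to above compares rotation to strain in the local velocity-gradient tensor: Q = 0.5(||Omega||^2 - ||S||^2), with vortex regions where Q > 0. A small sketch of the pointwise computation; the paper's scaling of Q and the subsequent core profiling are omitted:

        import numpy as np

        def q_criterion(grad_u):
            """Q-criterion from a 3x3 velocity-gradient tensor du_i/dx_j.

            S is the symmetric (strain-rate) part, Omega the antisymmetric
            (rotation) part; Q > 0 where rotation dominates strain.
            """
            S = 0.5 * (grad_u + grad_u.T)
            Omega = 0.5 * (grad_u - grad_u.T)
            return 0.5 * (np.sum(Omega**2) - np.sum(S**2))

        # Pure rotation about the z axis gives Q > 0
        print(q_criterion(np.array([[0.0, -1.0, 0.0],
                                    [1.0,  0.0, 0.0],
                                    [0.0,  0.0, 0.0]])))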

  17. The Challenges of Credible Thermal Protection System Reliability Quantification

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.

    2013-01-01

    The paper discusses several of the challenges associated with developing a credible reliability estimate for a human-rated crew capsule thermal protection system. The process of developing such a credible estimate is subject to the quantification, modeling and propagation of numerous uncertainties within a probabilistic analysis. The development of specific investment recommendations, to improve the reliability prediction, among various potential testing and programmatic options is then accomplished through Bayesian analysis.

  18. Near-optimal probabilistic RNA-seq quantification.

    PubMed

    Bray, Nicolas L; Pimentel, Harold; Melsted, Páll; Pachter, Lior

    2016-05-01

    We present kallisto, an RNA-seq quantification program that is two orders of magnitude faster than previous approaches and achieves similar accuracy. Kallisto pseudoaligns reads to a reference, producing a list of transcripts that are compatible with each read while avoiding alignment of individual bases. We use kallisto to analyze 30 million unaligned paired-end RNA-seq reads in <10 min on a standard laptop computer. This removes a major computational bottleneck in RNA-seq analysis. PMID:27043002
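
    Abundances from tools like kallisto are typically reported as transcripts per million (TPM), computed from estimated counts and effective transcript lengths. A minimal sketch of that normalization only; the estimation itself, via pseudoalignment, is the hard part and is not shown:

        import numpy as np

        def tpm(est_counts, eff_lengths):
            """Transcripts per million from counts and effective lengths."""
            rate = np.asarray(est_counts, float) / np.asarray(eff_lengths, float)
            return rate / rate.sum() * 1e6

        print(tpm([100, 300, 50], [1000.0, 1500.0, 500.0]))  # [250000. 500000. 250000.]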

  19. Assessment methods for angiogenesis and current approaches for its quantification.

    PubMed

    AlMalki, Waleed Hassan; Shahid, Imran; Mehdi, Abeer Yousaf; Hafeez, Muhammad Hassan

    2014-01-01

    Angiogenesis is a physiological process which describes the development of new blood vessels from existing vessels. It is a common and most important process in the formation and development of blood vessels, so it is supportive in the healing of wounds and granulation of tissues. The different assays for the evaluation of angiogenesis have been described with distinct advantages and some limitations. In order to develop angiogenic and antiangiogenic techniques, continuous efforts have resulted in animal models for more quantitative analysis of angiogenesis. Most of the studies on angiogenic inducers and inhibitors rely on various models, in vitro, in vivo, and in ovo, as indicators of efficacy. Angiogenesis assays are very helpful for testing the efficacy of both pro- and anti-angiogenic agents. The development of non-invasive procedures for quantification of angiogenesis will facilitate this process significantly. The main objective of this review article is to focus on the novel and existing methods of angiogenesis and their quantification techniques. These findings will be helpful for establishing the most convenient methods for the detection and quantification of angiogenesis and for developing novel, well tolerated and cost effective anti-angiogenic treatments in the near future. PMID:24987169

  20. In vivo behavior of NTBI revealed by automated quantification system.

    PubMed

    Ito, Satoshi; Ikuta, Katsuya; Kato, Daisuke; Lynda, Addo; Shibusa, Kotoe; Niizeki, Noriyasu; Toki, Yasumichi; Hatayama, Mayumi; Yamamoto, Masayo; Shindo, Motohiro; Iizuka, Naomi; Kohgo, Yutaka; Fujiya, Mikihiro

    2016-08-01

    Non-Tf-bound iron (NTBI), which appears in serum in iron overload, is thought to contribute to organ damage; the monitoring of serum NTBI levels may therefore be clinically useful in iron-overloaded patients. However, NTBI quantification methods remain complex, limiting their use in clinical practice. To overcome the technical difficulties often encountered, we recently developed a novel automated NTBI quantification system capable of measuring large numbers of samples. In the present study, we investigated the in vivo behavior of NTBI in human and animal serum using this newly established automated system. Average NTBI in healthy volunteers was 0.44 ± 0.076 μM (median 0.45 μM, range 0.28-0.66 μM), with no significant difference between sexes. Additionally, serum NTBI rapidly increased after iron loading, followed by a sudden disappearance. NTBI levels also decreased in inflammation. The results indicate that NTBI is a unique marker of iron metabolism, unlike other markers of iron metabolism, such as serum ferritin. Our new automated NTBI quantification method may help to reveal the clinical significance of NTBI and contribute to our understanding of iron overload. PMID:27086349

  1. Accurate mass spectrometry based protein quantification via shared peptides.

    PubMed

    Dost, Banu; Bandeira, Nuno; Li, Xiangqian; Shen, Zhouxin; Briggs, Steven P; Bafna, Vineet

    2012-04-01

    In mass spectrometry-based protein quantification, peptides that are shared across different protein sequences are often discarded as being uninformative with respect to each of the parent proteins. We investigate the use of shared peptides which are ubiquitous (~50% of peptides) in mass spectrometric data-sets for accurate protein identification and quantification. Different from existing approaches, we show how shared peptides can help compute the relative amounts of the proteins that contain them. Also, proteins with no unique peptide in the sample can still be analyzed for relative abundance. Our article uses shared peptides in protein quantification and makes use of combinatorial optimization to reduce the error in relative abundance measurements. We describe the topological and numerical properties required for robust estimates, and use them to improve our estimates for ill-conditioned systems. Extensive simulations validate our approach even in the presence of experimental error. We apply our method to a model of Arabidopsis thaliana root knot nematode infection, and investigate the differential role of several protein family members in mediating host response to the pathogen. PMID:22414154
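
    The estimation problem can be posed as a linear system over a peptide-protein incidence matrix, in which shared peptides contribute rows with more than one nonzero entry. A bare least-squares sketch; the paper's combinatorial optimization and conditioning analysis are not reproduced, and nonnegativity is not enforced here:

        import numpy as np

        def protein_abundances(A, peptide_intensities):
            """Least-squares protein abundances from peptide signals.

            A: (n_peptides x n_proteins) incidence matrix; A[i, j] = 1 if
            peptide i occurs in protein j, so a shared peptide has several
            ones in its row. Solves A @ x ~= intensities.
            """
            x, *_ = np.linalg.lstsq(A, np.asarray(peptide_intensities, float),
                                    rcond=None)
            return x

        # Peptide 2 is shared by both proteins; peptides 1 and 3 are unique.
        A = np.array([[1.0, 0.0],
                      [1.0, 1.0],
                      [0.0, 1.0]])
        print(protein_abundances(A, [10.0, 25.0, 15.0]))  # [10. 15.]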

  2. Outcome quantification using SPHARM-PDM toolbox in orthognathic surgery

    PubMed Central

    Cevidanes, Lucia; Zhu, HongTu; Styner, Martin

    2011-01-01

    Purpose Quantification of surgical outcomes in longitudinal studies has led to significant progress in the treatment of dentofacial deformity, both by offering options to patients who might not otherwise have been recommended for treatment and by clarifying the selection of appropriate treatment methods. Most existing surgical treatments have not been assessed in a systematic way. This paper presents the quantification of surgical outcomes in orthognathic surgery via our localized shape analysis framework. Methods In our setting, planning and surgical simulation is performed using the surgery planning software CMFapp. We then employ the SPHARM-PDM to measure the difference between pre-surgery and virtually simulated post-surgery models. This SPHARM-PDM shape framework is validated for use with craniofacial structures via simulating known 3D surgical changes within CMFapp. Results Our results show that SPHARM-PDM analysis accurately measures surgical displacements, compared with known displacement values. Visualization of color maps of virtually simulated surgical displacements describe corresponding surface distances that precisely describe location of changes, and difference vectors indicate directionality and magnitude of changes. Conclusions SPHARM-PDM-based quantification of surgical outcome is feasible. When compared to prior solutions, our method has the potential to make the surgical planning process more flexible, increase the level of detail and accuracy of the plan, yield higher operative precision and control and enhance the follow-up and documentation of clinical cases. PMID:21161693

  3. Quantification of Efficiency of Beneficiation of Lunar Regolith

    NASA Technical Reports Server (NTRS)

    Trigwell, Steve; Lane, John; Captain, James; Weis, Kyle; Quinn, Jacqueline; Watanabe, Fumiya

    2011-01-01

    Electrostatic beneficiation of lunar regolith is being researched at Kennedy Space Center to enhance the ilmenite concentration of the regolith for the production of oxygen in in-situ resource utilization on the lunar surface. Ilmenite enrichment of up to 200% was achieved using lunar simulants. For the most accurate quantification of the regolith particles, standard petrographic methods are typically followed, but in order to optimize the process, many hundreds of samples were generated in this study that made the standard analysis methods time prohibitive. In the current studies, X-ray photoelectron spectroscopy (XPS) and scanning electron microscopy/energy dispersive spectroscopy (SEM/EDS) were used that could automatically, and quickly, analyze many separated fractions of lunar simulant. In order to test the accuracy of the quantification, test mixture samples of known quantities of ilmenite (2, 5, 10, and 20 wt%) in silica (pure quartz powder) were analyzed by XPS and EDS. The results showed that quantification of low concentrations of ilmenite in silica could be accurately achieved by both XPS and EDS, knowing the limitations of the techniques.

  4. Radio-frequency energy quantification in magnetic resonance imaging

    NASA Astrophysics Data System (ADS)

    Alon, Leeor

    Mapping of radio frequency (RF) energy deposition has been challenging for 50+ years, especially when scanning patients in the magnetic resonance imaging (MRI) environment. As a result, electromagnetic simulation software is often used for estimating the specific absorption rate (SAR), the rate of RF energy deposition in tissue. The thesis work presents challenges associated with aligning information provided by electromagnetic simulation and MRI experiments. As a result of the limitations of simulations, experimental methods for the quantification of SAR were established. A system for quantification of the total RF energy deposition was developed for parallel transmit MRI (a system that uses multiple antennas to excite and image the body). The system is capable of monitoring and predicting channel-by-channel RF energy deposition and whole body SAR, and of tracking potential hardware failures that occur in the transmit chain and may cause the deposition of excessive energy into patients. Similarly, we demonstrated that local RF power deposition can be mapped and predicted for parallel transmit systems based on a series of MRI temperature mapping acquisitions. Resulting from the work, we developed tools for optimal reconstruction of temperature maps from MRI acquisitions. The tools developed for temperature mapping paved the way for utilizing MRI as a diagnostic tool for evaluation of RF/microwave emitting device safety. Quantification of the RF energy was demonstrated for both MRI-compatible and non-MRI-compatible devices (such as cell phones), while having the advantage of being noninvasive, of providing millimeter resolution and high accuracy.

  5. Theoretical limitations of quantification for noncompetitive sandwich immunoassays.

    PubMed

    Woolley, Christine F; Hayes, Mark A; Mahanti, Prasun; Douglass Gilman, S; Taylor, Tom

    2015-11-01

    Immunoassays exploit the highly selective interaction between antibodies and antigens to provide a vital method for biomolecule detection at low concentrations. Developers and practitioners of immunoassays have long known that non-specific binding often restricts immunoassay limits of quantification (LOQs). Aside from non-specific binding, most efforts by analytical chemists to reduce the LOQ for these techniques have focused on improving the signal amplification methods and minimizing the limitations of the detection system. However, with detection technology now capable of sensing single fluorescent molecules, this approach is unlikely to lead to dramatic improvements in the future. Here, fundamental interactions based on the law of mass action are analytically connected to signal generation, replacing the four- and five-parameter fittings commercially used to approximate sigmoidal immunoassay curves and allowing quantitative consideration of non-specific binding and statistical limitations in order to understand the ultimate detection capabilities of immunoassays. The restrictions imposed on limits of quantification by instrumental noise, non-specific binding, and counting statistics are discussed based on equilibrium relations for a sandwich immunoassay. Understanding the maximal capabilities of immunoassays for each of these regimes can greatly assist in the development and evaluation of immunoassay platforms. While many studies suggest that single molecule detection is possible through immunoassay techniques, here, it is demonstrated that the fundamental limit of quantification (precision of 10% or better) for an immunoassay is approximately 131 molecules, and this limit is based on fundamental and unavoidable statistical limitations. PMID:26342315

  6. Assessment methods for angiogenesis and current approaches for its quantification

    PubMed Central

    AlMalki, Waleed Hassan; Shahid, Imran; Mehdi, Abeer Yousaf; Hafeez, Muhammad Hassan

    2014-01-01

    Angiogenesis is a physiological process which describes the development of new blood vessels from existing vessels. It is a common and most important process in the formation and development of blood vessels, so it is supportive in the healing of wounds and granulation of tissues. The different assays for the evaluation of angiogenesis have been described with distinct advantages and some limitations. In order to develop angiogenic and antiangiogenic techniques, continuous efforts have resulted in animal models for more quantitative analysis of angiogenesis. Most of the studies on angiogenic inducers and inhibitors rely on various models, in vitro, in vivo, and in ovo, as indicators of efficacy. Angiogenesis assays are very helpful for testing the efficacy of both pro- and anti-angiogenic agents. The development of non-invasive procedures for quantification of angiogenesis will facilitate this process significantly. The main objective of this review article is to focus on the novel and existing methods of angiogenesis and their quantification techniques. These findings will be helpful for establishing the most convenient methods for the detection and quantification of angiogenesis and for developing novel, well tolerated and cost effective anti-angiogenic treatments in the near future. PMID:24987169

  7. Gas plume quantification in downlooking hyperspectral longwave infrared images

    NASA Astrophysics Data System (ADS)

    Turcotte, Caroline S.; Davenport, Michael R.

    2010-10-01

    Algorithms have been developed to support quantitative analysis of a gas plume using down-looking airborne hyperspectral long-wave infrared (LWIR) imagery. The resulting gas quantification "GQ" tool estimates the quantity of one or more gases at each pixel, and estimates uncertainty based on factors such as atmospheric transmittance, background clutter, and plume temperature contrast. GQ uses gas-insensitive segmentation algorithms to classify the background very precisely so that it can infer gas quantities from the differences between plume-bearing pixels and similar non-plume pixels. It also includes MODTRAN-based algorithms to iteratively assess various profiles of air temperature, water vapour, and ozone, and select the one that implies smooth emissivity curves for the (unknown) materials on the ground. GQ then uses a generalized least-squares (GLS) algorithm to simultaneously estimate the most likely mixture of background (terrain) material and foreground plume gases. Cross-linking of plume temperature to the estimated gas quantity is very non-linear, so the GLS solution was iteratively assessed over a range of plume temperatures to find the best fit to the observed spectrum. Quantification errors due to local variations in the camera-to-pixel distance were suppressed using a subspace projection operator. Lacking detailed depth-maps for real plumes, the GQ algorithm was tested on synthetic scenes generated by the Digital Imaging and Remote Sensing Image Generation (DIRSIG) software. Initial results showed pixel-by-pixel gas quantification errors of less than 15% for a Freon 134a plume.
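
    The GLS step mentioned above has the standard closed form x = (A' W A)^-1 A' W y, with W the inverse noise/clutter covariance. A compact sketch of the linear step only; the matrix names are assumptions, and the iteration over candidate plume temperatures is not shown:

        import numpy as np

        def gls(A, y, Sigma):
            """Generalized least-squares estimate of endmember abundances.

            A:     (n_bands x n_endmembers) background and gas spectra
            y:     observed pixel spectrum (n_bands)
            Sigma: (n_bands x n_bands) noise/clutter covariance
            """
            W = np.linalg.inv(Sigma)
            return np.linalg.solve(A.T @ W @ A, A.T @ W @ y)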

  8. Zonated quantification of steatosis in an entire mouse liver.

    PubMed

    Schwen, Lars Ole; Homeyer, André; Schwier, Michael; Dahmen, Uta; Dirsch, Olaf; Schenk, Arne; Kuepfer, Lars; Preusser, Tobias; Schenk, Andrea

    2016-06-01

    Many physiological processes and pathological conditions in livers are spatially heterogeneous, forming patterns at the lobular length scale or varying across the organ. Steatosis, a common liver disease characterized by lipids accumulating in hepatocytes, exhibits heterogeneity at both these spatial scales. The main goal of the present study was to provide a method for zonated quantification of the steatosis patterns found in an entire mouse liver. As an example application, the results were employed in a pharmacokinetics simulation. For the analysis, an automatic detection of the lipid vacuoles was used in multiple slides of histological serial sections covering an entire mouse liver. Lobuli were determined semi-automatically and zones were defined within the lobuli. Subsequently, the lipid content of each zone was computed. The steatosis patterns were found to be predominantly periportal, with a notable organ-scale heterogeneity. The analysis provides a quantitative description of the extent of steatosis in unprecedented detail. The resulting steatosis patterns were successfully used as a perturbation to the liver as part of an exemplary whole-body pharmacokinetics simulation for the antitussive drug dextromethorphan. The zonated quantification is also applicable to other pathological conditions that can be detected in histological images. Besides being a descriptive research tool, this quantification could perspectively complement diagnosis based on visual assessment of histological images. PMID:27104496

  9. Leveraging transcript quantification for fast computation of alternative splicing profiles.

    PubMed

    Alamancos, Gael P; Pagès, Amadís; Trincado, Juan L; Bellora, Nicolás; Eyras, Eduardo

    2015-09-01

    Alternative splicing plays an essential role in many cellular processes and bears major relevance in the understanding of multiple diseases, including cancer. High-throughput RNA sequencing allows genome-wide analyses of splicing across multiple conditions. However, the increasing number of available data sets represents a major challenge in terms of computation time and storage requirements. We describe SUPPA, a computational tool to calculate relative inclusion values of alternative splicing events, exploiting fast transcript quantification. SUPPA accuracy is comparable and sometimes superior to standard methods using simulated as well as real RNA-sequencing data compared with experimentally validated events. We assess the variability in terms of the choice of annotation and provide evidence that using complete transcripts rather than more transcripts per gene provides better estimates. Moreover, SUPPA coupled with de novo transcript reconstruction methods does not achieve accuracies as high as using quantification of known transcripts, but remains comparable to existing methods. Finally, we show that SUPPA is more than 1000 times faster than standard methods. Coupled with fast transcript quantification, SUPPA provides inclusion values at a much higher speed than existing methods without compromising accuracy, thereby facilitating the systematic splicing analysis of large data sets with limited computational resources. The software is implemented in Python 2.7 and is available under the MIT license at https://bitbucket.org/regulatorygenomicsupf/suppa. PMID:26179515
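
    SUPPA-style inclusion values follow directly from transcript abundances: the PSI of an event is the summed abundance of transcripts carrying the inclusion form divided by the summed abundance of all transcripts of the gene. A minimal sketch with invented identifiers:

        def psi(tpm_by_transcript, inclusion_ids, gene_ids):
            """Percent spliced-in from transcript TPMs."""
            inc = sum(tpm_by_transcript[t] for t in inclusion_ids)
            tot = sum(tpm_by_transcript[t] for t in gene_ids)
            return inc / tot if tot > 0 else float('nan')

        tpms = {'tx1': 12.0, 'tx2': 4.0, 'tx3': 0.5}
        print(psi(tpms, ['tx1'], ['tx1', 'tx2', 'tx3']))  # ~0.727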

  10. Nuclear and mitochondrial DNA quantification of various forensic materials.

    PubMed

    Andréasson, H; Nilsson, M; Budowle, B; Lundberg, H; Allen, M

    2006-12-01

    Due to the different types and quality of forensic evidence materials, their DNA content can vary substantially, and particularly low quantities can impact the results in an identification analysis. In this study, the quantity of mitochondrial and nuclear DNA was determined in a variety of materials using a previously described real-time PCR method. DNA quantification in the roots and distal sections of plucked and shed head hairs revealed large variations in DNA content particularly between the root and the shaft of plucked hairs. Also large intra- and inter-individual variations were found among hairs. In addition, DNA content was estimated in samples collected from fingerprints and accessories. The quantification of DNA on various items also displayed large variations, with some materials containing large amounts of nuclear DNA while no detectable nuclear DNA and only limited amounts of mitochondrial DNA were seen in others. Using this sensitive real-time PCR quantification assay, a better understanding was obtained regarding DNA content and variation in commonly analysed forensic evidence materials and this may guide the forensic scientist as to the best molecular biology approach for analysing various forensic evidence materials. PMID:16427750

  11. Systematic Assessment of RNA-Seq Quantification Tools Using Simulated Sequence Data

    PubMed Central

    Chandramohan, Raghu; Wu, Po-Yen; Phan, John H.; Wang, May D.

    2016-01-01

    RNA-sequencing (RNA-seq) technology has emerged as the preferred method for quantification of gene and isoform expression. Numerous RNA-seq quantification tools have been proposed and developed, bringing us closer to developing expression-based diagnostic tests based on this technology. However, because of the rapidly evolving technologies and algorithms, it is essential to establish a systematic method for evaluating the quality of RNA-seq quantification. We investigate how different RNA-seq experimental designs (i.e., variations in sequencing depth and read length) affect various quantification algorithms (i.e., HTSeq, Cufflinks, and MISO). Using simulated data, we evaluate the quantification tools based on four metrics, namely: (1) total number of usable fragments for quantification, (2) detection of genes and isoforms, (3) correlation, and (4) accuracy of expression quantification with respect to the ground truth. Results show that Cufflinks is able to use the largest number of fragments for quantification, leading to better detection of genes and isoforms. However, HTSeq produces more accurate expression estimates. Moreover, each quantification algorithm is affected differently by varying sequencing depth and read length, suggesting that the selection of quantification algorithms should be application-dependent.

  12. Systematic development of a group quantification method using evaporative light scattering detector for relative quantification of ginsenosides in ginseng products.

    PubMed

    Lee, Gwang Jin; Shin, Byong-Kyu; Yu, Yun-Hyun; Ahn, Jongsung; Kwon, Sung Won; Park, Jeong Hill

    2016-09-01

    The determination of the contents of multiple components in ginseng products has come to the fore owing to demands for in-depth information, but the associated industries confront the high cost of securing pure standards for the continuous quality evaluation of the products. This study aimed to develop a prospective high-performance liquid chromatography-evaporative light scattering detector (HPLC-ELSD) method for relative quantification of ginsenosides in ginseng products without a considerable change from the conventional gradient analysis. We investigated the effects of mobile phase composition and elution bandwidth, which are potential variables affecting the ELSD response in gradient analysis. Similar ELSD response curves of nine major ginsenosides were obtained under identical flow injection conditions, and the response increased as the percentage of organic solvent increased. The nine ginsenosides were divided into three groups to confirm the effect of elution bandwidth. The ELSD response decreased significantly for the late-eluting ginsenoside in each group under isocratic conditions. With consideration of these two effects, stepwise changes of the gradient condition were carried out to arrive at a group quantification method. The inconsistent responses of the nine ginsenosides were reconstituted into three normalized responses by the stepwise changes of the gradient condition, making relative quantification within the individual groups possible. The applicability of the method was confirmed by comparing the ginsenoside contents in a base material of ginseng products determined by the direct and group quantification methods. The largest difference in the determination results from the two methods was 8.26%, and the difference in total contents was only 0.91%. PMID:27262109

  13. Evaluation of a new method for stenosis quantification from 3D x-ray angiography images

    NASA Astrophysics Data System (ADS)

    Betting, Fabienne; Moris, Gilles; Knoplioch, Jerome; Trousset, Yves L.; Sureda, Francisco; Launay, Laurent

    2001-05-01

    A new method for stenosis quantification from 3D X-ray angiography images has been evaluated on both phantom and clinical data. On phantoms, for vessel segments with diameters greater than or equal to 3 mm, the standard deviation of the measurement error was always less than or equal to 0.4 mm, and the maximum measurement error less than 0.17 mm. No clear relationship was observed between the performance of the quantification method and the acquisition FoV. On clinical data, the 3D quantification method proved to be more robust to vessel bifurcations than its 2D equivalent. On a total of 15 clinical cases, the differences between 2D and 3D quantification were always less than 0.7 mm. The conclusion is that stenosis quantification from 3D X-ray angiography images is an attractive alternative to quantification from 2D X-ray images.

  14. The variability of manual and computer assisted quantification of multiple sclerosis lesion volumes.

    PubMed

    Mitchell, J R; Karlik, S J; Lee, D H; Eliasziw, M; Rice, G P; Fenster, A

    1996-01-01

    The high resolution and excellent soft tissue contrast of Magnetic Resonance Imaging (MRI) have enabled direct, noninvasive visualization of Multiple Sclerosis (MS) lesions in vivo. This has allowed the quantification of changes in the appearance of lesions in MR exams to be used as a measure of disease state. Nevertheless, accurate quantification techniques are subject to inter- and intra-operator variability, which may hinder monitoring of disease progression. We have developed a computer program to assist an experienced operator in the quantification of MS lesions in standard spin-echo MR exams. The accuracy of assisted and manual quantification under known conditions was studied using exams of a test phantom, while inter- and intra-operator reliability and variability were studied using exams of a MS patient. Results from the phantom study show that accuracy is improved by assisted quantification. The patient exam results indicate that assisted quantification reduced inter-operator variability from 0.34 to 0.17 cm3, and reduced intra-operator variability from 0.23 to 0.15 cm3. In addition, the minimum significant change between two successive measurements of lesion volume by the same operator was 0.64 cm3 for manual quantification and 0.42 cm3 for assisted quantification. For two different operators making successive measurements, the minimum significant change was 0.94 cm3 for manual quantification, but only 0.47 cm3 for assisted quantification. Finally, the number of lesions to be monitored for an average change in volume at a given power and significance level was reduced by a factor of 2-4 by assisted quantification. These results suggest that assisted quantification may have practical applications in clinical trials, especially those that are large, multicenter, or extended over time, and therefore require lesion measurements by one or more operators. PMID:8700036
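
    The minimum significant change figures quoted above are consistent with the standard repeatability threshold MSC = 1.96 x sqrt(2) x SD for the difference of two measurements (e.g., 1.96 x sqrt(2) x 0.15 is approximately 0.42 cm3). A one-function sketch:

        import math

        def minimum_significant_change(sd, z=1.96):
            """Smallest change distinguishable from measurement noise at the
            95% level; the difference of two independent measurements with
            standard deviation sd has standard deviation sqrt(2)*sd."""
            return z * math.sqrt(2.0) * sd

        print(round(minimum_significant_change(0.15), 2))  # 0.42 (assisted, intra-operator)
        print(round(minimum_significant_change(0.34), 2))  # 0.94 (manual, inter-operator)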

  15. Quantification of Hepatic Steatosis With Dual-Energy Computed Tomography

    PubMed Central

    Artz, Nathan S.; Hines, Catherine D.G.; Brunner, Stephen T.; Agni, Rashmi M.; Kühn, Jens-Peter; Roldan-Alzate, Alejandro; Chen, Guang-Hong; Reeder, Scott B.

    2012-01-01

    Objective The aim of this study was to compare dual-energy computed tomography (DECT) and magnetic resonance imaging (MRI) for fat quantification using tissue triglyceride concentration and histology as references in an animal model of hepatic steatosis. Materials and Methods This animal study was approved by our institution's Research Animal Resource Center. After validation of DECT and MRI using a phantom consisting of different triglyceride concentrations, a leptin-deficient obese mouse model (ob/ob) was used for this study. Twenty mice were divided into 3 groups based on expected levels of hepatic steatosis: low (n = 6), medium (n = 7), and high (n = 7) fat. After MRI at 3 T, a DECT scan was immediately performed. The caudate lobe of the liver was harvested and analyzed for triglyceride concentration using a colorimetric assay. The left lateral lobe was also extracted for histology. Magnetic resonance imaging fat-fraction (FF) and DECT measurements (attenuation, fat density, and effective atomic number) were compared with triglycerides and histology. Results Phantom results demonstrated excellent correlation between triglyceride content and each of the MRI and DECT measurements (r2 ≥ 0.96, P ≤ 0.003). In vivo, however, excellent triglyceride correlation was observed only with attenuation (r2 = 0.89, P < 0.001) and MRI-FF (r2 = 0.92, P < 0.001). Strong correlation existed between attenuation and MRI-FF (r2 = 0.86, P < 0.001). Nonlinear correlation with histology was also excellent for attenuation and MRI-FF. Conclusions Dual-energy computed tomography (CT) data generated by the current Gemstone Spectral Imaging analysis tool do not improve the accuracy of fat quantification in the liver beyond what CT attenuation can already provide. Furthermore, MRI may provide an excellent reference standard for liver fat quantification when validating new CT or DECT methods in human subjects. PMID:22836309

  16. Automated epicardial fat volume quantification from non-contrast CT

    NASA Astrophysics Data System (ADS)

    Ding, Xiaowei; Terzopoulos, Demetri; Diaz-Zamudio, Mariana; Berman, Daniel S.; Slomka, Piotr J.; Dey, Damini

    2014-03-01

    Epicardial fat volume (EFV) is now regarded as a significant imaging biomarker for cardiovascular risk stratification. Manual or semi-automated quantification of EFV requires tedious and careful contour drawing of the pericardium on fine image features. We aimed to develop and validate a fully automated, accurate algorithm for EFV quantification from non-contrast CT using active contours and multi-atlas registration. This knowledge-based model segments both the heart and the pericardium accurately by initializing the location and shape of the heart at large scale from multiple co-registered atlases and then actively locking onto the pericardium. The deformation process is driven by pericardium detection, extracting only the white contours representing the pericardium in the CT images. Following this step, we can calculate fat volume within this region (epicardial fat) using the standard fat attenuation range. We validated our algorithm on CT datasets from 15 patients who underwent routine assessment of coronary calcium. Epicardial fat volume quantified by the algorithm (69.15 +/- 8.25 cm3) and the expert (69.46 +/- 8.80 cm3) showed excellent correlation (r = 0.96, p < 0.0001) with no significant differences by comparison of individual data points (p = 0.9). The algorithm achieved a Dice overlap of 0.93 (range 0.88 - 0.95). The total time was less than 60 sec on a standard Windows computer. Our results show that fast, accurate, automated, knowledge-based quantification of epicardial fat volume from non-contrast CT is feasible. To our knowledge, this is also the first fully automated algorithm reported for this task.
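
    The final measurement step, once a pericardium mask exists, reduces to thresholding on the standard CT fat attenuation range (about -190 to -30 HU) and summing voxel volumes. A minimal sketch of that step only (the segmentation itself is the hard part and is not reproduced here); array and spacing names are hypothetical:

        import numpy as np

        def epicardial_fat_volume(ct_hu, pericardium_mask, voxel_spacing_mm):
            """Fat volume (cm^3) inside a pericardium mask, by HU thresholding.

            ct_hu: 3-D array of CT attenuation values (HU).
            pericardium_mask: boolean array, True inside the pericardium.
            voxel_spacing_mm: (dz, dy, dx) voxel dimensions in mm.
            """
            fat = (ct_hu >= -190) & (ct_hu <= -30) & pericardium_mask
            voxel_volume_mm3 = float(np.prod(voxel_spacing_mm))
            return fat.sum() * voxel_volume_mm3 / 1000.0  # mm^3 -> cm^3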

  17. Quantification of breast arterial calcification using full field digital mammography

    SciTech Connect

    Molloi, Sabee; Xu Tong; Ducote, Justin; Iribarren, Carlos

    2008-04-15

    Breast arterial calcification is commonly detected on mammograms. Previous studies indicate that breast arterial calcification is evidence of general atherosclerotic vascular disease and may be a useful marker of coronary artery disease. It can potentially be a useful tool for assessment of coronary artery disease in women, since mammography is widely used as a screening tool for early detection of breast cancer. However, there are currently no available techniques for quantification of calcium mass using mammography. The purpose of this study was to determine whether it is possible to quantify breast arterial calcium mass using standard digital mammography. An anthropomorphic breast phantom along with a vessel calcification phantom was imaged using a full field digital mammography system. Densitometry was used to quantify calcium mass. A calcium calibration measurement was performed at each phantom thickness and beam energy. The known (K) and measured (M) calcium mass on 5 and 9 cm thickness phantoms were related by M=0.964K-0.288 mg (r=0.997 and SEE=0.878 mg) and M=1.004K+0.324 mg (r=0.994 and SEE=1.32 mg), respectively. The results indicate that accurate calcium mass measurements can be made without correction for scatter glare as long as careful calcium calibration is made for each breast thickness. The results also indicate that composition variations and differences of approximately 1 cm between calibration phantom and breast thickness introduce only minimal error in calcium measurement. The uncertainty in magnification is expected to cause up to 5% and 15% error in calcium mass for 5 and 9 cm breast thicknesses, respectively. In conclusion, a densitometry technique for quantification of breast arterial calcium mass was validated using standard full field digital mammography. The results demonstrated the feasibility and potential utility of the densitometry technique for accurate quantification of breast arterial calcium mass using standard digital mammography.
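
    Given calibrations of the reported form M = aK + b, a measured mass M can be corrected back to an estimated true calcium mass K by inversion, K = (M - b)/a. A small illustration with the coefficients quoted above:

        def true_calcium_mass(measured_mg, slope, intercept_mg):
            """Invert the calibration M = slope*K + intercept to recover K (mg)."""
            return (measured_mg - intercept_mg) / slope

        # Calibrations from the abstract:
        # 5 cm phantom: M = 0.964K - 0.288; 9 cm phantom: M = 1.004K + 0.324
        print(true_calcium_mass(10.0, 0.964, -0.288))  # ~10.67 mg at 5 cm
        print(true_calcium_mass(10.0, 1.004, 0.324))   # ~9.64 mg at 9 cm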

  18. Quantification of Carnosine-Aldehyde Adducts in Human Urine.

    PubMed

    da Silva Bispo, Vanderson; Di Mascio, Paolo; Medeiros, Marisa

    2014-10-01

    Lipid peroxidation generates several reactive carbonyl species, including 4-hydroxy-2-nonenal (HNE), acrolein (ACR), 4-hydroxy-2-hexenal (HHE) and malondialdehyde. One major pathway of aldehyde detoxification is conjugation with glutathione, catalyzed by glutathione-S-transferases, or, alternatively, conjugation with endogenous histidine-containing dipeptides, such as carnosine (CAR). In this study, on-line reverse-phase high-performance liquid chromatography (HPLC) separation with tandem mass spectrometry detection was utilized for the accurate quantification of CAR-ACR, CAR-HHE and CAR-HNE adducts in human urinary samples from young adult non-smokers. Standard adducts were prepared and isolated by HPLC. The results showed the presence of a new product from the reaction of CAR with ACR. This new adduct was completely characterized by HPLC/MS-MSn, 1H NMR, COSY and HSQC. A new HPLC/MS/MS methodology employing stable isotope-labeled internal standards (CAR-HHEd5 and CAR-HNEd11) was developed for adduct quantification. This methodology permits quantification of 10 pmol of CAR-HHE and 1 pmol of CAR-ACR and CAR-HNE. Accurate determinations in human urine samples were performed and showed 4.65±1.71 nmol/mg creatinine for CAR-ACR, 5.13±1.76 for CAR-HHE and 5.99±3.19 for CAR-HNE. Our results indicate that the carnosine pathway can be an important detoxification route of α,β-unsaturated aldehydes. Moreover, carnosine adducts may be useful as redox stress indicators. PMID:26461323

  19. A flexible numerical approach for quantification of epistemic uncertainty

    SciTech Connect

    Chen, Xiaoxiao; Park, Eun-Jae; Xiu, Dongbin

    2013-05-01

    In the field of uncertainty quantification (UQ), epistemic uncertainty often refers to the kind of uncertainty whose complete probabilistic description is not available, largely due to our lack of knowledge about the uncertainty. Quantification of the impacts of epistemic uncertainty is naturally difficult, because most of the existing stochastic tools rely on the specification of the probability distributions and thus do not readily apply to epistemic uncertainty. Few studies and methods exist to deal with epistemic uncertainty. A recent work can be found in [J. Jakeman, M. Eldred, D. Xiu, Numerical approach for quantification of epistemic uncertainty, J. Comput. Phys. 229 (2010) 4648–4663], where a framework for numerical treatment of epistemic uncertainty was proposed. The method is based on solving an encapsulation problem, without using any probability information, in a hypercube that encapsulates the unknown epistemic probability space. If more probabilistic information about the epistemic variables is known a posteriori, the solution statistics can then be evaluated at post-process steps. In this paper, we present a new method, similar to that of Jakeman et al. but significantly extending its capabilities. Most notably, the new method (1) does not require the encapsulation problem to be in a bounded domain such as a hypercube; (2) does not require the solution of the encapsulation problem to converge point-wise. In the current formulation, the encapsulation problem could reside in an unbounded domain, and more importantly, its numerical approximation could be sought in the Lp norm. These features make the new approach more flexible and amenable to practical implementation. Both the mathematical framework and numerical analysis are presented to demonstrate the effectiveness of the new approach.
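
    To make the encapsulation idea concrete, here is a toy sketch (not the authors' formulation): the forward problem is solved over samples drawn from a set that encapsulates the unknown epistemic parameter, and statistics are evaluated only afterwards, once a distribution is assumed. The model, range, and distribution below are invented for illustration:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)

        def model(z):
            """Toy forward model with one epistemic parameter z."""
            return np.exp(-z) * np.sin(z)

        # Encapsulation step: sample a hypercube [0, 4] known to contain the
        # (unknown) epistemic probability space; no distribution is assumed yet.
        z_samples = rng.uniform(0.0, 4.0, 20000)
        u_samples = model(z_samples)

        # Post-process step: if z is later learned to follow Beta(2, 5) scaled
        # to [0, 4], reweight the encapsulation samples to get statistics.
        weights = stats.beta.pdf(z_samples / 4.0, 2, 5)
        weights /= weights.sum()
        print("posterior mean of u:", np.sum(weights * u_samples))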

  20. A flexible numerical approach for quantification of epistemic uncertainty

    NASA Astrophysics Data System (ADS)

    Chen, Xiaoxiao; Park, Eun-Jae; Xiu, Dongbin

    2013-05-01

    In the field of uncertainty quantification (UQ), epistemic uncertainty often refers to the kind of uncertainty whose complete probabilistic description is not available, largely due to our lack of knowledge about the uncertainty. Quantification of the impacts of epistemic uncertainty is naturally difficult, because most of the existing stochastic tools rely on the specification of the probability distributions and thus do not readily apply to epistemic uncertainty. Few studies and methods exist to deal with epistemic uncertainty. A recent work can be found in [J. Jakeman, M. Eldred, D. Xiu, Numerical approach for quantification of epistemic uncertainty, J. Comput. Phys. 229 (2010) 4648-4663], where a framework for numerical treatment of epistemic uncertainty was proposed. The method is based on solving an encapsulation problem, without using any probability information, in a hypercube that encapsulates the unknown epistemic probability space. If more probabilistic information about the epistemic variables is known a posteriori, the solution statistics can then be evaluated at post-process steps. In this paper, we present a new method, similar to that of Jakeman et al. but significantly extending its capabilities. Most notably, the new method (1) does not require the encapsulation problem to be in a bounded domain such as a hypercube; (2) does not require the solution of the encapsulation problem to converge point-wise. In the current formulation, the encapsulation problem could reside in an unbounded domain, and more importantly, its numerical approximation could be sought in the Lp norm. These features make the new approach more flexible and amenable to practical implementation. Both the mathematical framework and numerical analysis are presented to demonstrate the effectiveness of the new approach.

  1. Uncertainty quantification for characterization of high enthalpy facilities

    NASA Astrophysics Data System (ADS)

    Villedieu, N.; Cappaert, J.; Garcia Galache, J. P.; Magin, T. E.

    2013-06-01

    The postflight analysis of a space mission requires accurate determination of the free-stream conditions along the trajectory. The Mach number, temperature, and pressure conditions can be rebuilt from the heat flux and pressure measured on the spacecraft by means of a Flush Air Data System (FADS). This instrumentation comprises a set of sensors flush mounted in the thermal protection system to measure the static pressure (pressure taps) and heat flux (calorimeters). Because experimental data suffer from errors, this methodology needs to integrate quantification of uncertainties. Epistemic uncertainties on the models for chemistry in the bulk and at the wall (surface catalysis) should also be taken into account. Studying this problem requires solving a stochastic backward problem. This paper focuses on a preliminary sensitivity analysis of the forward problem to understand which uncertainties need to be accounted for. In section 2, the uncertainty quantification methodologies used in this work are presented. Section 3 is dedicated to one-dimensional (1D) simulations of the shock layer to identify which chemical reactions of the mechanism need to be accounted for in the Uncertainty Quantification (UQ). After this triage procedure, the two-dimensional (2D) axisymmetric flow around the blunt nose of EXPERT (EXPErimental Reentry Test-bed) is simulated for two trajectory points, and the propagation of the uncertainties on the stagnation pressure and heat flux is studied. For this study, the open-source software DAKOTA from Sandia National Laboratories [1] is coupled with two in-house codes: SHOCKING, which simulates the evolution of the chemical relaxation in the shock layer [2], and COSMIC, which simulates axisymmetric chemically reacting flows [3].

  2. [Quantification of levels of serum antirabies antibodies in vaccinated individuals].

    PubMed

    Süliová, J; Benísek, Z; Svrcek, S; Durove, A; Závadová, J

    1994-02-01

    The authors developed a kit for the assessment of anti-rabies antibodies by the ELISA immunoenzymatic method in sera from immunized humans. The results of the detection and quantification of anti-rabies antibodies acquired by the ELISA method were compared with those from classical procedures (virus-neutralizing test in mice, indirect hemagglutination test), and sufficient correlation and sensitivity of the immunoenzymatic method were found. The developed test makes it possible to determine the level of anti-rabies virus-neutralizing IgG antibodies. (Tab. 2, Fig. 1, Ref. 25). PMID:7922630

  3. Recurrence plots and recurrence quantification analysis of human motion data

    NASA Astrophysics Data System (ADS)

    Josiński, Henryk; Michalczuk, Agnieszka; Świtoński, Adam; Szczesna, Agnieszka; Wojciechowski, Konrad

    2016-06-01

    The authors present exemplary application of recurrence plots, cross recurrence plots and recurrence quantification analysis for the purpose of exploration of experimental time series describing selected aspects of human motion. Time series were extracted from treadmill gait sequences which were recorded in the Human Motion Laboratory (HML) of the Polish-Japanese Academy of Information Technology in Bytom, Poland by means of the Vicon system. Analysis was focused on the time series representing movements of hip, knee, ankle and wrist joints in the sagittal plane.
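
    For readers unfamiliar with the technique, a recurrence plot marks pairs of delay-embedded states that fall within a distance threshold, and the recurrence rate is the simplest recurrence quantification measure. A generic numpy sketch (embedding parameters are illustrative, not those used in the study):

        import numpy as np

        def recurrence_matrix(x, dim=3, delay=2, eps=0.1):
            """Recurrence plot of a 1-D series after delay embedding."""
            n = len(x) - (dim - 1) * delay
            emb = np.column_stack([x[i * delay:i * delay + n] for i in range(dim)])
            dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
            return dist < eps

        t = np.linspace(0, 8 * np.pi, 400)
        R = recurrence_matrix(np.sin(t))
        print("recurrence rate:", R.mean())  # fraction of recurrent point pairs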

  4. Quantification of skin wrinkles using low coherence interferometry

    NASA Astrophysics Data System (ADS)

    Oh, Jung-Taek; Kim, Beop-Min; Son, Sang-Ryoon; Lee, Sang-Won; Kim, Dong-Yoon; Kim, Youn-Soo

    2004-07-01

    We measure the skin wrinkle topology by means of low coherence interferometry (LCI), which forms the basis of optical coherence tomography (OCT). The skin topology obtained using LCI and a corresponding 2-D fast Fourier transform allow quantification of skin wrinkles. It took approximately 2 minutes to obtain a 2.1 mm x 2.1 mm topological image with 4 μm and 16 μm resolutions in the axial and transverse directions, respectively. Measurement examples show the particular cases of skin contour change after anti-wrinkle cosmeceutical treatments and in atopic dermatitis.
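
    The quantification route described, a topography map followed by a 2-D FFT, can be sketched generically: the radial location of the spectral peak gives the dominant wrinkle spatial frequency. A synthetic illustration (all values invented):

        import numpy as np

        n, pitch_mm = 128, 2.1 / 128           # 2.1 mm field of view
        yy, xx = np.mgrid[0:n, 0:n]
        height = 0.02 * np.sin(2 * np.pi * xx * pitch_mm / 0.3)  # 0.3 mm wrinkles

        spectrum = np.abs(np.fft.fftshift(np.fft.fft2(height))) ** 2
        freqs = np.fft.fftshift(np.fft.fftfreq(n, d=pitch_mm))
        spectrum[n // 2, n // 2] = 0.0          # suppress the DC term
        iy, ix = np.unravel_index(np.argmax(spectrum), spectrum.shape)
        print("dominant frequency: %.2f cycles/mm" % np.hypot(freqs[iy], freqs[ix]))
        # prints ~3.33 cycles/mm, i.e. the 0.3 mm wrinkle spacing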

  5. Quantification of risks from technology for improved plant reliability

    SciTech Connect

    Rode, D.M.

    1996-12-31

    Among the least understood, and therefore least appreciated, threats to profitability are risks from power plant technologies such as steam generators, turbines, and electrical systems. To effectively manage technological risks, business decisions need to be based on knowledge. The paper describes a risk quantification process that combines technical knowledge and judgments with commercial consequences. The three principal alternatives for managing risks, as well as risk mitigation techniques for significant equipment within a power plant, are reported. The result is to equip the decision maker with a comprehensive picture of the risk exposures, enabling cost-effective activities to be undertaken to improve a plant's reliability.

  6. Uncertainty Quantification and Statistical Engineering for Hypersonic Entry Applications

    NASA Technical Reports Server (NTRS)

    Cozmuta, Ioana

    2011-01-01

    NASA has invested significant resources in developing and validating a mathematical construct for TPS margin management: a) Tailorable for low/high reliability missions; b) Tailorable for ablative/reusable TPS; c) Uncertainty Quantification and Statistical Engineering are valuable tools not exploited enough; and d) Need to define strategies combining both Theoretical Tools and Experimental Methods. The main reason for this lecture is to give a flavor of where UQ and SE could contribute and hope that the broader community will work with us to improve in these areas.

  7. Nuclear Data Uncertainty Quantification: Past, Present and Future

    SciTech Connect

    Smith, D.L.

    2015-01-15

    An historical overview is provided of the mathematical foundations of uncertainty quantification and the roles played in the more recent past by nuclear data uncertainties in nuclear data evaluations and nuclear applications. Significant advances that have established the mathematical framework for contemporary nuclear data evaluation methods, as well as the use of uncertainty information in nuclear data evaluation and nuclear applications, are described. This is followed by a brief examination of the current status concerning nuclear data evaluation methodology, covariance data generation, and the application of evaluated nuclear data uncertainties in contemporary nuclear technology. A few possible areas for future investigation of this subject are also suggested.

  8. Quantification of protein concentration using UV absorbance and Coomassie dyes.

    PubMed

    Noble, James E

    2014-01-01

    The measurement of solubilized protein concentration in solution is an important assay in biochemistry research and development labs for applications ranging from enzymatic studies to providing data for biopharmaceutical lot release. Spectrophotometric protein quantification assays use UV and visible spectroscopy to rapidly determine the concentration of protein, relative to a standard or using an assigned extinction coefficient. Where multiple samples need measurement, and/or the sample volume and concentration are limited, preparations of the Coomassie dye commonly known as the Bradford assay can be used. PMID:24423263
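
    The direct UV route is just the Beer-Lambert law, c = A/(ε·l). A minimal sketch; the default coefficient below is a rough generic assumption, not a recommendation for any particular protein:

        def protein_conc_mg_ml(a280, ext_coeff_ml_mg_cm=1.0, path_cm=1.0):
            """Beer-Lambert: concentration = A / (extinction coefficient * path).

            ext_coeff_ml_mg_cm is the mass extinction coefficient in
            (mg/mL)^-1 cm^-1; 1.0 is a crude generic default, and a real assay
            should use the assigned coefficient for the protein in question.
            """
            return a280 / (ext_coeff_ml_mg_cm * path_cm)

        print(protein_conc_mg_ml(0.75))  # 0.75 mg/mL under the default assumptions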

  9. Progressive damage state evolution and quantification in composites

    NASA Astrophysics Data System (ADS)

    Patra, Subir; Banerjee, Sourav

    2016-04-01

    Precursor damage state quantification can be helpful for the safety and operation of aircraft and defense equipment. Damage develops in composite materials in the form of matrix cracking, fiber breakage and debonding, etc. However, detection and quantification of these damage modes at their very early stage is not possible unless the existing indispensable techniques are modified, particularly for the quantification of multiscale damage at an early stage. Here, we present a novel nonlocal-mechanics-based damage detection technique for precursor damage state quantification. Micro-continuum physics is used by modifying the Christoffel equation. American Society for Testing and Materials (ASTM) standard woven carbon fiber (CFRP) specimens were tested under tension-tension fatigue loading at intervals of 25,000 cycles up to 500,000 cycles. Scanning Acoustic Microscopy (SAM) and Optical Microscopy (OM) were used to examine damage development at the same intervals. Surface Acoustic Wave (SAW) velocity profiles on a representative volume element (RVE) of the specimen were calculated at regular intervals of 50,000 cycles. Nonlocal parameters were calculated from the micromorphic wave dispersion curve at a frequency of 50 MHz. We used a previously formulated parameter called "damage entropy," a measure of damage growth in the material calculated as a function of loading cycle. Damage entropy (DE) was calculated at every pixel on the RVE, and the mean DE was plotted at loading intervals of 25,000 cycles. Growth of DE with fatigue loading cycles was observed. Optical imaging was also performed at intervals of 25,000 cycles to investigate damage development inside the materials. We also calculated the mean Surface Acoustic Wave (SAW) velocity, plotted it against fatigue cycles, and correlated it with Damage Entropy (DE). Statistical analysis of the SAW profiles obtained at different

  10. A stochastic approach to uncertainty quantification in residual moveout analysis

    NASA Astrophysics Data System (ADS)

    Johng-Ay, T.; Landa, E.; Dossou-Gbété, S.; Bordes, L.

    2015-06-01

    Oil and gas exploration and production usually rely on the interpretation of a single seismic image, which is obtained from observed data. However, the statistical nature of seismic data and the various approximations and assumptions are sources of uncertainties which may corrupt the evaluation of parameters. The quantification of these uncertainties is a major issue, meant to support decisions that have important social and commercial implications. Residual moveout analysis, an important step in seismic data processing, is usually performed by a deterministic approach. In this paper we discuss a Bayesian approach to the uncertainty analysis.

  11. Predicting human age with bloodstains by sjTREC quantification.

    PubMed

    Ou, Xue-ling; Gao, Jun; Wang, Huan; Wang, Hong-sheng; Lu, Hui-ling; Sun, Hong-yu

    2012-01-01

    The age-related decline of signal joint T-cell receptor rearrangement excision circles (sjTRECs) in human peripheral blood has been demonstrated in our previous study and other reports. Until now, only a few studies on sjTREC detection in bloodstain samples have been reported, based on small samples of subjects of a limited age range, although bloodstains are much more frequently encountered in forensic practice. In the present study, we adopted the sensitive Taqman real-time quantitative polymerase chain reaction (qPCR) method to perform sjTREC quantification in bloodstains from individuals ranging from 0-86 years old (n = 264). The results revealed that sjTREC contents in human bloodstains declined in an age-dependent manner (r = -0.8712). The age-estimation formula was Age = -7.1815Y - 42.458 ± 9.42, where Y = dCt(TBP - sjTREC) and 9.42 is the standard error. Furthermore, we tested for the influence of short or long storage times by analyzing fresh and stored bloodstains from the same individuals. Remarkably, no statistically significant difference in sjTREC contents was found between the fresh and old DNA samples over 4 weeks of storage. However, a significant loss (0.16-1.93 dCt) in sjTREC contents was detected after 1.5 years of storage in 31 samples. Moreover, preliminary sjTREC quantification in bloodstains up to 20 years old showed that, though sjTREC contents were detectable in all samples and highly correlated with donor age, a time-dependent decrease in the correlation coefficient r was found, suggesting the prediction accuracy of the described assay would deteriorate in aged samples. Our findings show that sjTREC quantification might also be suitable for age prediction in bloodstains, and future research into time-dependent or other potential impacts on sjTREC quantification might allow further improvement of the prediction accuracy. PMID:22879970
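
    Applying the reported regression is a one-liner; the input value below is illustrative only:

        def predict_age(y):
            """Age from the reported bloodstain sjTREC regression.

            y is dCt(TBP - sjTREC); 9.42 is the quoted standard error.
            """
            return -7.1815 * y - 42.458

        print("age = %.1f +/- 9.4 years" % predict_age(-10.0))  # ~29.4 years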

  12. Experimental validation of equations for 2D DIC uncertainty quantification.

    SciTech Connect

    Reu, Phillip L.; Miller, Timothy J.

    2010-03-01

    Uncertainty quantification (UQ) equations have been derived for predicting matching uncertainty in two-dimensional image correlation a priori. These equations include terms that represent the image noise and image contrast. Researchers at the University of South Carolina have extended previous 1D work to calculate matching errors in 2D. These 2D equations have been coded into a Sandia National Laboratories UQ software package to predict the uncertainty for DIC images. This paper presents those equations and the resulting error surfaces for trial speckle images. Comparison of the UQ results with experimentally subpixel-shifted images is also discussed.

  13. Development of magnetic resonance technology for noninvasive boron quantification

    SciTech Connect

    Bradshaw, K.M.

    1990-11-01

    Boron magnetic resonance imaging (MRI) and spectroscopy (MRS) were developed in support of the noninvasive boron quantification task of the Idaho National Engineering Laboratory (INEL) Power Burst Facility/Boron Neutron Capture Therapy (PBF/BNCT) program. The hardware and software described in this report are modifications specific to a GE Signa{trademark} MRI system, release 3.X and are necessary for boron magnetic resonance operation. The technology developed in this task has been applied to obtaining animal pharmacokinetic data of boron compounds (drug time response) and the in-vivo localization of boron in animal tissue noninvasively. 9 refs., 21 figs.

  14. Quantification of quantum discord in an antiferromagnetic Heisenberg compound

    SciTech Connect

    Singh, H.; Chakraborty, T.; Mitra, C.

    2014-04-24

    An experimental quantification of concurrence and quantum discord from heat capacity (Cp) measurements performed on a solid state system is reported. In this work, thermodynamic measurements were performed on copper nitrate (CN, Cu(NO3)2·2.5H2O) single crystals, an alternating Heisenberg antiferromagnetic spin-1/2 system. CN, being a weakly dimerized antiferromagnet, is an ideal system for investigating correlations between spins. Theoretical expressions were used to obtain concurrence and quantum discord curves as a function of temperature from heat capacity data of a real macroscopic system, CN.

  15. Aspect-Oriented Programming is Quantification and Implicit Invocation

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.; Friedman, Daniel P.; Koga, Dennis (Technical Monitor)

    2001-01-01

    We propose that the distinguishing characteristic of Aspect-Oriented Programming (AOP) languages is that they allow programming by making quantified programmatic assertions over programs that lack local notation indicating the invocation of these assertions. This suggests that AOP systems can be analyzed with respect to three critical dimensions: the kinds of quantifications allowed, the nature of the interactions that can be asserted, and the mechanism for combining base-level actions with asserted actions. Consequences of this perspective are the recognition that certain systems are not AOP and that some mechanisms are meta-AOP: they are sufficiently expressive to allow straightforwardly programming an AOP system within them.
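
    The core claim, quantified assertions over code that carries no local notation, can be sketched in a few lines of Python (an illustration of the idea, not any particular AOP system):

        import functools

        def apply_aspect(module_dict, predicate, advice):
            """Wrap every function matched by predicate with the given advice."""
            for name, obj in list(module_dict.items()):
                if callable(obj) and predicate(name):
                    module_dict[name] = advice(obj)

        def log_calls(fn):
            @functools.wraps(fn)
            def wrapper(*args, **kwargs):
                print("calling", fn.__name__, args)
                return fn(*args, **kwargs)
            return wrapper

        # Base code: oblivious, no local notation for the aspect.
        def get_balance(acct): return 100
        def set_balance(acct, v): return v

        # Quantified assertion: advice applies to all getters/setters at once.
        apply_aspect(globals(), lambda n: n.startswith(("get_", "set_")), log_calls)
        get_balance("a1")  # now implicitly invokes the logging advice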

  16. Uncertainty Quantification in Fission Cross Section Measurements at LANSCE

    SciTech Connect

    Tovesson, F.

    2015-01-15

    Neutron-induced fission cross sections have been measured for several isotopes of uranium and plutonium at the Los Alamos Neutron Science Center (LANSCE) over a wide range of incident neutron energies. The total uncertainties in these measurements are in the range 3–5% above 100 keV of incident neutron energy, resulting from uncertainties in the target, neutron source, and detector system. The individual sources of uncertainty are assumed to be uncorrelated; however, correlations in the cross section across neutron energy bins are considered. The quantification of the uncertainty contributions is described here.
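
    Uncorrelated sources combining into a 3-5% total corresponds to a quadrature sum. A worked sketch with invented component values chosen to land in that band:

        import math

        def total_uncertainty(*components_pct):
            """Quadrature sum of uncorrelated relative uncertainties (percent)."""
            return math.sqrt(sum(c * c for c in components_pct))

        # Illustrative components only (target mass, neutron flux, detector)
        print(total_uncertainty(2.0, 2.5, 1.5))  # ~3.5 %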

  17. Preliminary Results on Uncertainty Quantification for Pattern Analytics

    SciTech Connect

    Stracuzzi, David John; Brost, Randolph; Chen, Maximillian Gene; Malinas, Rebecca; Peterson, Matthew Gregor; Phillips, Cynthia A.; Robinson, David G.; Woodbridge, Diane

    2015-09-01

    This report summarizes preliminary research into uncertainty quantification for pattern analytics within the context of the Pattern Analytics to Support High-Performance Exploitation and Reasoning (PANTHER) project. The primary focus of PANTHER was to make large quantities of remote sensing data searchable by analysts. The work described in this report adds nuance to both the initial data preparation steps and the search process. Search queries are transformed from "does the specified pattern exist in the data?" to "how certain is the system that the returned results match the query?" We show example results for both data processing and search, and discuss a number of possible improvements for each.

  18. Current peptidomics: Applications, purification, identification, quantification, and functional analysis

    PubMed Central

    Dallas, David C.; Guerrero, Andres; Parker, Evan A.; Robinson, Randall C.; Gan, Junai; German, J. Bruce; Barile, Daniela; Lebrilla, Carlito B.

    2015-01-01

    Peptidomics is an emerging field branching from proteomics that targets endogenously produced protein fragments. Endogenous peptides are often functional within the body, and can be both beneficial and detrimental. This review covers the use of peptidomics in understanding digestion and in identifying functional peptides and biomarkers. Various techniques for peptide and glycopeptide extraction, both at analytical and preparative scales, and available options for peptide detection with MS are discussed. Current algorithms for peptide sequence determination, and both analytical and computational techniques for quantification, are compared. Techniques for statistical analysis, sequence mapping, enzyme prediction, and peptide function and structure prediction are explored. PMID:25429922

  19. Source-Code Instrumentation and Quantification of Events

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.; Havelund, Klaus; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Aspect Oriented Programming (AOP) is making quantified programmatic assertions over programs that otherwise are not annotated to receive these assertions. Varieties of AOP systems are characterized by which quantified assertions they allow, what they permit in the actions of the assertions (including how the actions interact with the base code), and what mechanisms they use to achieve the overall effect. Here, we argue that all quantification is over dynamic events, and describe our preliminary work in developing a system that maps dynamic events to transformations over source code. We discuss possible applications of this system, particularly with respect to debugging concurrent systems.
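
    Python's built-in tracing hook shows what quantification over dynamic events looks like in practice: one registration point observes every call event, with no annotation at the call sites. sys.settrace is a real API; the event filter and target function are illustrative:

        import sys

        def on_event(frame, event, arg):
            # Quantified assertion: applies to every 'call' event in the
            # program, without any notation at the call sites themselves.
            if event == "call":
                print("event: call ->", frame.f_code.co_name)
            return None  # no per-line tracing needed

        def target(n):
            return n * 2

        sys.settrace(on_event)
        target(21)
        sys.settrace(None)  # remove the instrumentation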

  20. Current peptidomics: applications, purification, identification, quantification, and functional analysis.

    PubMed

    Dallas, David C; Guerrero, Andres; Parker, Evan A; Robinson, Randall C; Gan, Junai; German, J Bruce; Barile, Daniela; Lebrilla, Carlito B

    2015-03-01

    Peptidomics is an emerging field branching from proteomics that targets endogenously produced protein fragments. Endogenous peptides are often functional within the body, and can be both beneficial and detrimental. This review covers the use of peptidomics in understanding digestion and in identifying functional peptides and biomarkers. Various techniques for peptide and glycopeptide extraction, both at analytical and preparative scales, and available options for peptide detection with MS are discussed. Current algorithms for peptide sequence determination, and both analytical and computational techniques for quantification, are compared. Techniques for statistical analysis, sequence mapping, enzyme prediction, and peptide function and structure prediction are explored. PMID:25429922

  1. Uncertainty quantification in fission cross section measurements at LANSCE

    SciTech Connect

    Tovesson, F.

    2015-01-09

    Neutron-induced fission cross sections have been measured for several isotopes of uranium and plutonium at the Los Alamos Neutron Science Center (LANSCE) over a wide range of incident neutron energies. The total uncertainties in these measurements are in the range 3–5% above 100 keV of incident neutron energy, resulting from uncertainties in the target, neutron source, and detector system. The individual sources of uncertainty are assumed to be uncorrelated; however, correlations in the cross section across neutron energy bins are considered. The quantification of the uncertainty contributions is described here.

  2. Quantification of toxicological effects for dichloromethane. Final report

    SciTech Connect

    Not Available

    1992-01-01

    The document discusses the quantification of non-carcinogenic and carcinogenic effects of dichloromethane. The evaluation of non-carcinogenic effects includes a study of short- and long-term effects in animals and humans, as well as the development of one-day, ten-day, and long-term health advisories. The evaluation of carcinogenic effects includes a categorization of carcinogenic potential and risk estimates. There is a brief discussion of existing guidelines or standards and special considerations such as high-risk groups.

  3. Prospective Comparison of Liver Stiffness Measurements between Two Point Shear Wave Elastography Methods: Virtual Touch Quantification and Elastography Point Quantification

    PubMed Central

    Yoo, Hyunsuk; Yoon, Jeong Hee; Lee, Dong Ho; Chang, Won; Han, Joon Koo

    2016-01-01

    Objective To prospectively compare technical success rate and reliable measurements of virtual touch quantification (VTQ) elastography and elastography point quantification (ElastPQ), and to correlate liver stiffness (LS) measurements obtained by the two elastography techniques. Materials and Methods Our study included 85 patients, 80 of whom were previously diagnosed with chronic liver disease. The technical success rate and reliable measurements of the two kinds of point shear wave elastography (pSWE) techniques were compared by χ2 analysis. LS values measured using the two techniques were compared and correlated via Wilcoxon signed-rank test, Spearman correlation coefficient, and 95% Bland-Altman limit of agreement. The intraobserver reproducibility of ElastPQ was determined by 95% Bland-Altman limit of agreement and intraclass correlation coefficient (ICC). Results The two pSWE techniques showed similar technical success rate (98.8% for VTQ vs. 95.3% for ElastPQ, p = 0.823) and reliable LS measurements (95.3% for VTQ vs. 90.6% for ElastPQ, p = 0.509). The mean LS measurements obtained by VTQ (1.71 ± 0.47 m/s) and ElastPQ (1.66 ± 0.41 m/s) were not significantly different (p = 0.209). The LS measurements obtained by the two techniques showed strong correlation (r = 0.820); in addition, the 95% limit of agreement of the two methods was 27.5% of the mean. Finally, the ICC of repeat ElastPQ measurements was 0.991. Conclusion Virtual touch quantification and ElastPQ showed similar technical success rate and reliable measurements, with strongly correlated LS measurements. However, the two methods are not interchangeable due to the large limit of agreement. PMID:27587964

  4. Selected Reaction Monitoring Mass Spectrometry for Absolute Protein Quantification.

    PubMed

    Manes, Nathan P; Mann, Jessica M; Nita-Lazar, Aleksandra

    2015-01-01

    Absolute quantification of target proteins within complex biological samples is critical to a wide range of research and clinical applications. This protocol provides step-by-step instructions for the development and application of quantitative assays using selected reaction monitoring (SRM) mass spectrometry (MS). First, likely quantotypic target peptides are identified based on numerous criteria. This includes identifying proteotypic peptides, avoiding sites of posttranslational modification, and analyzing the uniqueness of the target peptide to the target protein. Next, crude external peptide standards are synthesized and used to develop SRM assays, and the resulting assays are used to perform qualitative analyses of the biological samples. Finally, purified, quantified, heavy isotope labeled internal peptide standards are prepared and used to perform isotope dilution series SRM assays. Analysis of all of the resulting MS data is presented. This protocol was used to accurately assay the absolute abundance of proteins of the chemotaxis signaling pathway within RAW 264.7 cells (a mouse monocyte/macrophage cell line). The quantification of Gi2 (a heterotrimeric G-protein α-subunit) is described in detail. PMID:26325288
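
    The isotope dilution arithmetic at the heart of the final step is simple: the light-to-heavy peak-area ratio scales the known amount of spiked heavy internal standard. Values below are invented:

        def endogenous_amount_fmol(light_area, heavy_area, heavy_spike_fmol):
            """Isotope dilution: endogenous = (light/heavy area ratio) * spike."""
            return (light_area / heavy_area) * heavy_spike_fmol

        # e.g., light transition area 4.2e5, heavy 2.8e5, 50 fmol heavy spiked
        print(endogenous_amount_fmol(4.2e5, 2.8e5, 50.0))  # 75 fmol on-column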

  5. Automated quantification of nuclear immunohistochemical markers with different complexity.

    PubMed

    López, Carlos; Lejeune, Marylène; Salvadó, María Teresa; Escrivà, Patricia; Bosch, Ramón; Pons, Lluis E; Alvaro, Tomás; Roig, Jordi; Cugat, Xavier; Baucells, Jordi; Jaén, Joaquín

    2008-03-01

    Manual quantification of immunohistochemically stained nuclear markers is still laborious and subjective, and the use of computerized systems for digital image analysis has not yet resolved the problems of nuclear clustering. In this study, we designed a new automatic procedure for quantifying various immunohistochemical nuclear markers with variable clustering complexity. This procedure consisted of two combined macros. The first, developed with commercial software, enabled the analysis of digital images using color and morphological segmentation, including a masking process. All information extracted with this first macro was automatically exported to an Excel datasheet, where a second macro composed of four different algorithms analyzed the information and calculated the definitive number of positive nuclei for each image. One hundred and eighteen images with different levels of clustering complexity were analyzed and compared with the manual quantification obtained by a trained observer. Statistical analysis indicated high reliability (intra-class correlation coefficient > 0.950) and no significant differences between the two methods. Bland-Altman plots and Kaplan-Meier curves indicated that the results of the two methods were concordant for around 90% of the analyzed images. In conclusion, this new automated procedure is an objective, faster and reproducible method with an excellent level of accuracy, even for digital images with high complexity. PMID:18172664
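
    As a generic stand-in for this kind of pipeline (not the authors' macros), positive nuclei can be counted by thresholding a stain-intensity channel and labeling connected components; threshold and size cutoff are illustrative:

        import numpy as np
        from scipy import ndimage

        def count_positive_nuclei(stain_channel, threshold=0.4, min_area_px=30):
            """Threshold a stain-intensity image and count blobs above a size cutoff."""
            mask = stain_channel > threshold
            labels, n = ndimage.label(mask)
            sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
            return int(np.sum(sizes >= min_area_px))

        img = np.random.rand(256, 256)  # placeholder for a deconvolved stain channel
        print(count_positive_nuclei(img))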

  6. Antibiotic Resistome: Improving Detection and Quantification Accuracy for Comparative Metagenomics.

    PubMed

    Elbehery, Ali H A; Aziz, Ramy K; Siam, Rania

    2016-04-01

    The unprecedented rise of life-threatening antibiotic resistance (AR), combined with the unparalleled advances in DNA sequencing of genomes and metagenomes, has pushed the need for in silico detection of the resistance potential of clinical and environmental metagenomic samples through the quantification of AR genes (i.e., genes conferring antibiotic resistance). Therefore, determining an optimal methodology to quantitatively and accurately assess AR genes in a given environment is pivotal. Here, we optimized and improved existing AR detection methodologies from metagenomic datasets to properly account for AR-generating mutations in antibiotic target genes. Through comparative metagenomic analysis of previously published AR gene abundance in three publicly available metagenomes, we illustrate how mutation-generated resistance genes are either falsely assigned or neglected, which alters the detection and quantification of the antibiotic resistome. In addition, we inspected factors influencing the outcome of AR gene quantification using metagenome simulation experiments, and identified that genome size, AR gene length, total number of metagenomic reads and selected sequencing platforms had pronounced effects on the level of detected AR. In conclusion, our proposed improvements in the current methodologies for accurate AR detection and resistome assessment show reliable results when tested on real and simulated metagenomic datasets. PMID:27031878
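
    One normalization that addresses the gene-length and sequencing-depth effects identified in the simulations is an RPKM-style abundance; whether the study used exactly this metric is not stated in the abstract:

        def ar_gene_abundance(reads_mapped, gene_length_bp, total_reads):
            """RPKM-style abundance: corrects for gene length and library size."""
            return reads_mapped / (gene_length_bp / 1e3) / (total_reads / 1e6)

        # e.g., 240 reads on a 1.2 kb beta-lactamase gene in a 20M-read metagenome
        print(ar_gene_abundance(240, 1200, 20_000_000))  # 10.0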

  7. A Spanish model for quantification and management of construction waste.

    PubMed

    Solís-Guzmán, Jaime; Marrero, Madelyn; Montes-Delgado, Maria Victoria; Ramírez-de-Arellano, Antonio

    2009-09-01

    Currently, construction and demolition waste (C&D waste) is a worldwide issue that concerns not only governments but also the building actors involved in construction activity. In Spain, a new national decree has been regulating the production and management of C&D waste since February 2008. The present work describes the waste management model that has inspired this decree: the Alcores model implemented with good results in Los Alcores Community (Seville, Spain). A detailed model is also provided to estimate the volume of waste that is expected to be generated on the building site. The quantification of C&D waste volume, from the project stage, is essential for the building actors to properly plan and control its disposal. This quantification model has been developed by studying 100 dwelling projects, especially their bill of quantities, and defining three coefficients to estimate the demolished volume (CT), the wreckage volume (CR) and the packaging volume (CE). Finally, two case studies are included to illustrate the usefulness of the model to estimate C&D waste volume in both new construction and demolition projects. PMID:19523801
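
    The abstract names the three coefficients (CT, CR, CE) but not their functional form, so the following sketch is purely hypothetical: it assumes each coefficient converts built area into a per-stream volume. The real definitions are in the paper:

        def cd_waste_volumes(built_area_m2, ct, cr, ce):
            """Hypothetical use of the Alcores coefficients: volume per stream as
            coefficient (m3 per m2 built) times built area. The model's actual
            functional form is defined in the paper, not reproduced here."""
            return {
                "demolished_m3": ct * built_area_m2,
                "wreckage_m3": cr * built_area_m2,
                "packaging_m3": ce * built_area_m2,
            }

        print(cd_waste_volumes(1500.0, ct=0.02, cr=0.08, ce=0.01))  # invented values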

  8. A Spanish model for quantification and management of construction waste

    SciTech Connect

    Solis-Guzman, Jaime; Marrero, Madelyn; Montes-Delgado, Maria Victoria; Ramirez-de-Arellano, Antonio

    2009-09-15

    Currently, construction and demolition waste (C&D waste) is a worldwide issue that concerns not only governments but also the building actors involved in construction activity. In Spain, a new national decree has been regulating the production and management of C&D waste since February 2008. The present work describes the waste management model that has inspired this decree: the Alcores model implemented with good results in Los Alcores Community (Seville, Spain). A detailed model is also provided to estimate the volume of waste that is expected to be generated on the building site. The quantification of C&D waste volume, from the project stage, is essential for the building actors to properly plan and control its disposal. This quantification model has been developed by studying 100 dwelling projects, especially their bill of quantities, and defining three coefficients to estimate the demolished volume (CT), the wreckage volume (CR) and the packaging volume (CE). Finally, two case studies are included to illustrate the usefulness of the model to estimate C&D waste volume in both new construction and demolition projects.

  9. Concurrent quantification of tryptophan and its major metabolites

    PubMed Central

    Lesniak, Wojciech G.; Jyoti, Amar; Mishra, Manoj K.; Louissaint, Nicolette; Romero, Roberto; Chugani, Diane C.; Kannan, Sujatha; Kannan, Rangaramanujam M.

    2014-01-01

    An imbalance in tryptophan (TRP) metabolites is associated with several neurological and inflammatory disorders; analytical methods allowing simultaneous quantification of TRP and its major metabolites are therefore highly desirable, as these metabolites may be valuable biomarkers. We have developed an HPLC method for concurrent quantitative determination of tryptophan, serotonin, 5-hydroxyindoleacetic acid, kynurenine, and kynurenic acid in tissue and fluids. The method utilizes the intrinsic spectroscopic properties of TRP and its metabolites, which enable UV absorbance and fluorescence detection by HPLC without additional labeling. The origin of the peaks related to the analytes of interest was confirmed by UV–Vis spectral patterns using a PDA detector and by mass spectrometry. The developed methods were validated in rabbit fetal brain and amniotic fluid at gestational day 29. Results are in excellent agreement with those reported in the literature for the same regions. This method allows rapid quantification of tryptophan and four of its major metabolites concurrently. A change in the relative ratios of these metabolites can provide important insights in predicting the presence and progression of neuroinflammation in disorders such as cerebral palsy, autism, multiple sclerosis, Alzheimer disease, and schizophrenia. PMID:24036037

  10. Ultrasound strain imaging for quantification of tissue function: cardiovascular applications

    NASA Astrophysics Data System (ADS)

    de Korte, Chris L.; Lopata, Richard G. P.; Hansen, Hendrik H. G.

    2013-03-01

    With ultrasound imaging, the motion and deformation of tissue can be measured. Tissue can be deformed by applying a force to it, and the resulting deformation is a function of its mechanical properties. Quantification of this deformation to assess the mechanical properties of tissue is called elastography. If the tissue under interrogation is actively deforming, the deformation is directly related to its function, and quantification of this deformation is normally referred to as 'strain imaging'. Elastography can be used to characterize atherosclerotic plaques, while the contractility of the heart or skeletal muscles can be assessed with strain imaging. We developed radio frequency (RF) based ultrasound methods to assess deformation at higher resolution and with higher accuracy than commercial methods using conventional image data (Tissue Doppler Imaging and 2D speckle tracking). However, the improvement in accuracy is mainly achieved when measuring strain along the ultrasound beam direction, i.e., in 1D. We further extended this method to multiple directions and further improved precision by compounding data acquired at multiple beam-steered angles. In arteries, the presence of vulnerable plaques may lead to acute events like stroke and myocardial infarction. Consequently, timely detection of these plaques is of great diagnostic value. Non-invasive ultrasound strain compounding is currently being evaluated as a diagnostic tool to identify the vulnerability of plaques. In the heart, we determined strain locally and at high resolution, resulting in a local assessment in contrast to conventional global functional parameters like cardiac output or shortening fraction.
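
    A heavily simplified sketch of RF-based strain estimation: local displacement is found by cross-correlating windowed pre- and post-deformation RF lines, and strain is the axial gradient of that displacement. Window sizes and the synthetic 1% compression are illustrative:

        import numpy as np

        def axial_strain(rf_pre, rf_post, win=64, hop=32):
            """Axial strain from two RF lines by windowed cross-correlation."""
            shifts, centers = [], []
            for start in range(0, len(rf_pre) - win, hop):
                a = rf_pre[start:start + win] - rf_pre[start:start + win].mean()
                b = rf_post[start:start + win] - rf_post[start:start + win].mean()
                xc = np.correlate(b, a, mode="full")
                shifts.append(np.argmax(xc) - (win - 1))  # lag in samples
                centers.append(start + win // 2)
            return np.gradient(np.array(shifts, float), np.array(centers, float))

        depth = np.arange(2048)
        pre = np.random.randn(2048)
        post = np.interp(depth * 0.99, depth, pre)  # synthetic 1% compression
        print(axial_strain(pre, post).mean())  # ~0.01, i.e. the 1% deformation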

  11. Quantification of liver fibrosis in chronic hepatitis B virus infection

    PubMed Central

    Jieanu, CF; Ungureanu, BS; Săndulescu, DL; Gheonea, IA; Tudorașcu, DR; Ciurea, ME; Purcărea, VL

    2015-01-01

    Chronic hepatitis B virus (HBV) infection is considered a global public health issue, with more than 78,000 people per year dying from its progression. With liver transplantation the only viable therapeutic option, and only in end-stage disease, hepatitis B progression may generally be influenced by various factors. Assessing fibrosis stage plays an important part in future decisions on the patient's health, with available antiviral agents capable of preventing fibrosis from progressing to end-stage liver disease. Several methods have been considered as alternatives for quantifying fibrosis status in HBV infection, such as imaging techniques and serum-based biomarkers. Magnetic resonance imaging, ultrasound, and elastography are non-invasive imaging techniques frequently used to quantify disease progression as well as the patient's prognosis. Both direct and indirect biomarkers have also been studied for differentiating between fibrosis stages. This paper reviews the current standing of non-invasive liver fibrosis quantification in HBV infection, presenting the prognostic factors and available assessment procedures that might eventually replace liver biopsy. PMID:26351528

  12. Quantification of HEV RNA by Droplet Digital PCR

    PubMed Central

    Nicot, Florence; Cazabat, Michelle; Lhomme, Sébastien; Marion, Olivier; Sauné, Karine; Chiabrando, Julie; Dubois, Martine; Kamar, Nassim; Abravanel, Florence; Izopet, Jacques

    2016-01-01

    The sensitivity of real-time PCR for hepatitis E virus (HEV) RNA quantification differs greatly among techniques. Standardized tools that measure the real quantity of virus are needed. We assessed the performance of a reverse transcription droplet digital PCR (RT-ddPCR) assay that gives absolute quantities of HEV RNA. Analytical and clinical validation was done on HEV genotypes 1, 3 and 4, and was based on open reading frame (ORF)3 amplification. The within-run and between-run reproducibilities were very good, the analytical sensitivity was 80 HEV RNA international units (IU)/mL and linearities of HEV genotype 1, 3 and 4 were very similar. Clinical validation based on 45 samples of genotype 1, 3 or 4 gave results that correlated well with a validated reverse transcription quantitative PCR (RT-qPCR) assay (Spearman rs = 0.89, p < 0.0001). The RT-ddPCR assay is a sensitive method and could be a promising tool for standardizing HEV RNA quantification in various sample types. PMID:27548205

  13. Accurate quantification of supercoiled DNA by digital PCR.

    PubMed

    Dong, Lianhua; Yoo, Hee-Bong; Wang, Jing; Park, Sang-Ryoul

    2016-01-01

    Digital PCR (dPCR), as an enumeration-based quantification method, is capable of quantifying DNA copy number without the help of standards. However, it can generate false results when the PCR conditions are not optimized. A recent international comparison (CCQM P154) showed that most laboratories significantly underestimated the concentration of supercoiled plasmid DNA by dPCR. Mostly, supercoiled DNAs are linearized before dPCR to avoid such underestimations. The present study was conducted to overcome this problem. In the bilateral comparison, the National Institute of Metrology, China (NIM) optimized and applied dPCR for supercoiled DNA determination, whereas the Korea Research Institute of Standards and Science (KRISS) prepared the unknown samples and quantified them by flow cytometry. In this study, several factors, such as the selection of the PCR master mix, the fluorescent label, and the position of the primers, were evaluated for quantifying supercoiled DNA by dPCR. This work confirmed that a 16S PCR master mix avoided poor amplification of the supercoiled DNA, whereas HEX labels on the dPCR probe resulted in robust amplification curves. Optimizing the dPCR assay based on these two observations resulted in accurate quantification of supercoiled DNA without preanalytical linearization. The result agreed closely (101-113%) with that from flow cytometry. PMID:27063649
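
    The enumeration arithmetic that makes dPCR standard-free is Poisson statistics: with a fraction p of positive partitions, the mean copies per partition is -ln(1 - p). A sketch; the 0.85 nL droplet volume is an assumed, typical value, not from this study:

        import math

        def dpcr_copies_per_ul(positive, total, partition_volume_nl=0.85):
            """Absolute concentration from partition counts via Poisson statistics."""
            p = positive / total
            lam = -math.log(1.0 - p)                   # mean copies per partition
            return lam / (partition_volume_nl * 1e-3)  # copies per microliter

        print(dpcr_copies_per_ul(5000, 20000))  # ~338 copies/uL of reaction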

  14. A critical view on microplastic quantification in aquatic organisms.

    PubMed

    Vandermeersch, Griet; Van Cauwenberghe, Lisbeth; Janssen, Colin R; Marques, Antonio; Granby, Kit; Fait, Gabriella; Kotterman, Michiel J J; Diogène, Jorge; Bekaert, Karen; Robbens, Johan; Devriese, Lisa

    2015-11-01

    Microplastics, plastic particles and fragments smaller than 5 mm, are ubiquitous in the marine environment. Ingestion and accumulation of microplastics have previously been demonstrated for diverse marine species ranging from zooplankton to bivalves and fish, implying the potential for microplastics to accumulate in the marine food web. In this way, microplastics can potentially impact food safety and human health. Although a few methods to quantify microplastics in biota have been described, no comparison and/or intercalibration of these techniques have been performed. Here we conducted a literature review of all available extraction and quantification methods. Two of these methods, involving wet acid destruction, were used to evaluate the presence of microplastics in field-collected mussels (Mytilus galloprovincialis) from three different "hotspot" locations in Europe (Po estuary, Italy; Tagus estuary, Portugal; Ebro estuary, Spain). An average of 0.18±0.14 total microplastics g⁻¹ w.w. for the Acid mix Method and 0.12±0.04 total microplastics g⁻¹ w.w. for the Nitric acid Method was established. Additionally, in a pilot study, an average load of 0.13±0.14 total microplastics g⁻¹ w.w. was recorded in commercial mussels (Mytilus edulis and M. galloprovincialis) from five European countries (France, Italy, Denmark, Spain and The Netherlands). A detailed analysis and comparison of methods indicated the need for further research to develop a standardised operating protocol for microplastic quantification and monitoring. PMID:26249746

  15. Applying uncertainty quantification to multiphase flow computational fluid dynamics

    SciTech Connect

    Gel, A; Garg, R; Tong, C; Shahnam, M; Guenther, C

    2013-07-01

    Multiphase computational fluid dynamics plays a major role in the design and optimization of fossil fuel based reactors. There is a growing interest in accounting for the influence of uncertainties associated with physical systems to increase the reliability of computational simulation based engineering analysis. The U.S. Department of Energy's National Energy Technology Laboratory (NETL) has recently undertaken an initiative to characterize uncertainties associated with computer simulation of reacting multiphase flows encountered in energy producing systems such as a coal gasifier. The current work presents preliminary results in applying non-intrusive parametric uncertainty quantification and propagation techniques with NETL's open-source multiphase computational fluid dynamics software MFIX. For this purpose, an open-source uncertainty quantification toolkit, PSUADE, developed at Lawrence Livermore National Laboratory (LLNL), has been interfaced with the MFIX software. In this study, the sources of uncertainty associated with numerical approximation and model form have been neglected, and only model input parametric uncertainty with forward propagation has been investigated by constructing a surrogate model based on a data-fitted response surface for a multiphase flow demonstration problem. Monte Carlo simulation was employed for forward propagation of the aleatory type input uncertainties. Several insights gained from the outcome of these simulations are presented, such as how inadequate characterization of uncertainties can affect the reliability of the prediction results. Also, a global sensitivity study using Sobol' indices was performed to better understand the contribution of input parameters to the variability observed in the response variable.
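
    A minimal sketch of the two ingredients described, Monte Carlo forward propagation through a fitted surrogate and first-order Sobol' indices, using an invented quadratic response surface in place of a surrogate fitted to MFIX runs:

        import numpy as np

        rng = np.random.default_rng(42)

        def surrogate(x1, x2):
            """Stand-in for a response surface fitted to simulation runs."""
            return 1.0 + 2.0 * x1 + 0.5 * x2 + 1.5 * x1 * x2

        # Forward propagation: sample the (aleatory) input distributions
        n = 100_000
        x1 = rng.normal(1.0, 0.2, n)   # e.g., uncertain drag-law multiplier
        x2 = rng.uniform(0.8, 1.2, n)  # e.g., uncertain inlet velocity factor
        y = surrogate(x1, x2)
        print("mean=%.3f  std=%.3f" % (y.mean(), y.std()))

        # Crude first-order Sobol' index: Var(E[Y|Xi]) / Var(Y), via binning
        def first_order_sobol(x, y, bins=50):
            edges = np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1])
            idx = np.digitize(x, edges)
            cond_means = np.array([y[idx == b].mean() for b in range(bins)])
            counts = np.array([(idx == b).sum() for b in range(bins)])
            var_cond = np.average((cond_means - y.mean()) ** 2, weights=counts)
            return var_cond / y.var()

        print("S1(x1)=%.2f  S1(x2)=%.2f"
              % (first_order_sobol(x1, y), first_order_sobol(x2, y)))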

  16. Investigations of Some Liquid Matrixes for Analyte Quantification by MALDI

    NASA Astrophysics Data System (ADS)

    Moon, Jeong Hee; Park, Kyung Man; Ahn, Sung Hee; Lee, Seong Hoon; Kim, Myung Soo

    2015-06-01

    Sample inhomogeneity is one of the obstacles preventing the generation of reproducible mass spectra by MALDI and to their use for the purpose of analyte quantification. As a potential solution to this problem, we investigated MALDI with some liquid matrixes prepared by nonstoichiometric mixing of acids and bases. Out of 27 combinations of acids and bases, liquid matrixes could be produced from seven. When the overall spectral features were considered, two liquid matrixes using α-cyano-4-hydroxycinnamic acid as the acid and 3-aminoquinoline and N,N-diethylaniline as bases were the best choices. In our previous study of MALDI with solid matrixes, we found that three requirements had to be met for the generation of reproducible spectra and for analyte quantification: (1) controlling the temperature by fixing the total ion count, (2) plotting the analyte-to-matrix ion ratio versus the analyte concentration as the calibration curve, and (3) keeping the matrix suppression below a critical value. We found that the same requirements had to be met in MALDI with liquid matrixes as well. In particular, although the liquid matrixes tested here were homogeneous, they failed to display spot-to-spot spectral reproducibility unless the first requirement above was met. We also found that analyte-derived ions could not be produced efficiently by MALDI with the above liquid matrixes unless the analyte was sufficiently basic. In this sense, MALDI processes with solid and liquid matrixes should be regarded as complementary techniques rather than as competing ones.

  17. Volumetric loss quantification using ultrasonic inductively coupled transducers

    NASA Astrophysics Data System (ADS)

    Gong, Peng; Hay, Thomas R.; Greve, David W.; Oppenheim, Irving J.

    2015-03-01

    The pulse-echo method is widely used for plate and pipe thickness measurement. However, it does not work well for detecting localized volumetric loss in thick-wall tubes, as created by erosion damage, when the morphology of the volumetric loss is irregular and can reflect ultrasonic pulses away from the transducer, making it difficult to detect an echo. In this paper, we propose a novel method using an inductively coupled transducer to generate longitudinal waves propagating in a thick-wall aluminum tube for volumetric loss quantification. In the experiment, the longitudinal waves exhibit diffraction effects during propagation, which can be explained by the Huygens-Fresnel principle. The diffracted waves are also shown to be significantly delayed by machined volumetric loss on the inside surface of the thick-wall aluminum tube. It is also shown that the inductively coupled transducers can generate and receive ultrasonic waves similar to those from wired transducers, and they perform as well as wired transducers in volumetric loss quantification when other conditions are the same.

  18. Quantification of HEV RNA by Droplet Digital PCR.

    PubMed

    Nicot, Florence; Cazabat, Michelle; Lhomme, Sébastien; Marion, Olivier; Sauné, Karine; Chiabrando, Julie; Dubois, Martine; Kamar, Nassim; Abravanel, Florence; Izopet, Jacques

    2016-01-01

    The sensitivity of real-time PCR for hepatitis E virus (HEV) RNA quantification differs greatly among techniques. Standardized tools that measure the real quantity of virus are needed. We assessed the performance of a reverse transcription droplet digital PCR (RT-ddPCR) assay that gives absolute quantities of HEV RNA. Analytical and clinical validation were performed on HEV genotypes 1, 3, and 4, based on open reading frame 3 (ORF3) amplification. The within-run and between-run reproducibilities were very good, the analytical sensitivity was 80 HEV RNA international units (IU)/mL, and the linearities for HEV genotypes 1, 3, and 4 were very similar. Clinical validation based on 45 samples of genotype 1, 3, or 4 gave results that correlated well with a validated reverse transcription quantitative PCR (RT-qPCR) assay (Spearman rs = 0.89, p < 0.0001). The RT-ddPCR assay is a sensitive method and could be a promising tool for standardizing HEV RNA quantification in various sample types. PMID:27548205
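
    The absolute quantities in ddPCR come from Poisson statistics on the droplet partition counts; a minimal sketch, where the droplet counts and the ~0.85 nL droplet volume are assumed values rather than figures from this study:

        import numpy as np

        n_total = 15000          # accepted droplets (assumed)
        n_negative = 11200       # droplets with no amplification (assumed)
        v_droplet_ul = 0.85e-3   # droplet volume in microliters (~0.85 nL, assumed)

        # Poisson correction: mean copies per droplet from the negative fraction.
        lam = -np.log(n_negative / n_total)
        copies_per_ul = lam / v_droplet_ul
        print(f"{copies_per_ul:.0f} copies/uL of reaction")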

  19. Simple and inexpensive quantification of ammonia in whole blood.

    PubMed

    Ayyub, Omar B; Behrens, Adam M; Heligman, Brian T; Natoli, Mary E; Ayoub, Joseph J; Cunningham, Gary; Summar, Marshall; Kofinas, Peter

    2015-01-01

    Quantification of ammonia in whole blood has applications in the diagnosis and management of many hepatic diseases, including cirrhosis and rare urea cycle disorders, which together affect more than 5 million patients in the United States. Current techniques for ammonia measurement suffer from limited range, poor resolution, false positives, or large, complex sensor set-ups. Here we demonstrate a technique utilizing inexpensive reagents and simple methods for quantifying ammonia in 100 μL of whole blood. The sensor comprises a modified form of the indophenol reaction, which resists sources of destructive interference in blood, in conjunction with a cation-exchange membrane. The presented sensing scheme is selective against other amine-containing molecules such as amino acids and has a shelf life of at least 50 days. Additionally, the resulting system has high sensitivity and allows accurate, reliable quantification of ammonia in whole human blood over a range of at least 25 to 500 μM, which is clinically relevant for rare hyperammonemic disorders and liver disease. Furthermore, concentrations of 50 and 100 μM ammonia could be reliably discerned with p = 0.0001. PMID:25936660

  20. An on-bacterium flow cytometric immunoassay for protein quantification.

    PubMed

    Lan, Wen-Jun; Lan, Wei; Wang, Hai-Yan; Yan, Lei; Wang, Zhe-Li

    2013-09-01

    The polystyrene bead-based flow cytometric immunoassay has been widely reported. However, the preparation of functional polystyrene beads remains inconvenient. This study describes a simple and easy on-bacterium flow cytometric immunoassay for protein quantification, in which Staphylococcus aureus (SAC) is used as an antibody-antigen carrier in place of the polystyrene bead. The SAC beads were prepared by carboxyfluorescein diacetate succinimidyl ester (CFSE) labeling, paraformaldehyde fixation, and antibody binding. Carcinoembryonic antigen (CEA) and cytokeratin-19 fragment (CYFRA 21-1) were used as model proteins in the test system. Using the prepared SAC beads, biotinylated proteins, and streptavidin-phycoerythrin (SA-PE), the on-bacterium flow cytometric immunoassay was validated by quantifying CEA and CYFRA 21-1 in samples. The data demonstrated a linear relationship between the logarithm of the protein concentration and the logarithm of the PE mean fluorescence intensity (MFI). The limit of detection (LOD) in this immunoassay was at least 0.25 ng/ml. Precision and accuracy assessments showed that both the relative standard deviation (R.S.D.) and the relative error (R.E.) were <10%. Comparison between this immunoassay and a polystyrene bead-based flow cytometric immunoassay showed a correlation coefficient of 0.998 for serum CEA and 0.996 for serum CYFRA 21-1. In conclusion, the on-bacterium flow cytometric immunoassay may be of use in the quantification of serum proteins. PMID:23739299
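
    The log-log linearity reported above is the basis for back-calculating unknowns; a minimal sketch, with invented calibration numbers, of fitting and inverting such a curve:

        import numpy as np

        # Hypothetical CEA calibration points: concentration (ng/ml) vs. PE MFI.
        conc = np.array([0.25, 1.0, 4.0, 16.0, 64.0])
        mfi  = np.array([52.0, 190.0, 730.0, 2800.0, 10500.0])

        # Linear fit in log-log space, as in the abstract.
        slope, intercept = np.polyfit(np.log10(conc), np.log10(mfi), 1)

        # Back-calculate an unknown sample from its measured MFI.
        unknown_mfi = 1500.0
        conc_est = 10 ** ((np.log10(unknown_mfi) - intercept) / slope)
        print(f"estimated CEA ~ {conc_est:.2f} ng/ml")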

  1. Is HBsAg quantification ready for prime time?

    PubMed

    Chevaliez, Stéphane

    2013-12-01

    Despite the availability of an efficient hepatitis B vaccine, approximately 240 million individuals are chronically infected with hepatitis B virus worldwide. One-fourth of hepatitis B surface antigen (HBsAg)-positive patients will develop complications, such as cirrhosis or hepatocellular carcinoma, both major causes of liver-related deaths. Antiviral therapies, such as pegylated interferon alpha or nucleoside/nucleotide analogues, are effective in suppressing HBV DNA and reducing the subsequent risk of fibrosis progression, cirrhosis, and hepatocellular carcinoma. HBsAg has proven to be a steady, reliable marker of chronic HBV carriage that can also be used to predict clinical outcomes. Three commercial enzyme immunoassays are now available for HBsAg quantification. A number of recent studies have shown the clinical utility of HBsAg quantification in combination with HBV DNA levels to identify inactive carriers who need antiviral therapy, and in interferon-treated patients to predict the virological response to pegylated interferon alpha. PMID:23932705

  2. Functional error modeling for uncertainty quantification in hydrogeology

    NASA Astrophysics Data System (ADS)

    Josset, L.; Ginsbourger, D.; Lunati, I.

    2015-02-01

    Approximate models (proxies) can be employed to reduce the computational costs of estimating uncertainty. The price to pay is that the approximations introduced by the proxy model can lead to a biased estimation. To avoid this problem and ensure a reliable uncertainty quantification, we propose to combine functional data analysis and machine learning to build error models that allow us to obtain an accurate prediction of the exact response without solving the exact model for all realizations. We build the relationship between proxy and exact model on a learning set of geostatistical realizations for which both exact and approximate solvers are run. Functional principal components analysis (FPCA) is used to investigate the variability in the two sets of curves and reduce the dimensionality of the problem while maximizing the retained information. Once obtained, the error model can be used to predict the exact response of any realization on the basis of the sole proxy response. This methodology is purpose-oriented as the error model is constructed directly for the quantity of interest, rather than for the state of the system. Also, the dimensionality reduction performed by FPCA allows a diagnostic of the quality of the error model to assess the informativeness of the learning set and the fidelity of the proxy to the exact model. The possibility of obtaining a prediction of the exact response for any newly generated realization suggests that the methodology can be effectively used beyond the context of uncertainty quantification, in particular for Bayesian inference and optimization.
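
    A minimal sketch of the error-model idea on synthetic curves. Ordinary PCA on the discretized proxy curves stands in for FPCA, and a plain least-squares map stands in for the machine-learning step; the authors' actual choices may differ.

        import numpy as np

        rng = np.random.default_rng(1)
        t = np.linspace(0.0, 1.0, 50)             # time axis of the response curves

        def exact(a): return np.sin(2 * np.pi * a * t) + 0.3 * a * t
        def proxy(a): return np.sin(2 * np.pi * a * t)   # cheap, biased model

        # Learning set: run both solvers on a handful of realizations.
        a_train = rng.uniform(0.5, 1.5, size=20)
        P = np.array([proxy(a) for a in a_train])  # proxy curves
        E = np.array([exact(a) for a in a_train])  # exact curves

        # Dimensionality reduction of the proxy curves (PCA, a stand-in for FPCA).
        P_mean = P.mean(axis=0)
        U, s, Vt = np.linalg.svd(P - P_mean, full_matrices=False)
        k = 3
        scores = (P - P_mean) @ Vt[:k].T           # retained principal scores

        # Error model: predict the exact curve from the proxy's principal scores.
        X = np.hstack([scores, np.ones((len(scores), 1))])
        W, *_ = np.linalg.lstsq(X, E, rcond=None)

        # Predict the exact response of a new realization from its proxy run alone.
        a_new = 1.1
        s_new = (proxy(a_new) - P_mean) @ Vt[:k].T
        E_pred = np.hstack([s_new, 1.0]) @ W
        print(np.abs(E_pred - exact(a_new)).max())  # prediction error vs. truth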

  3. Amperometric quantification based on serial dilution microfluidic systems.

    PubMed

    Stephan, Khaled; Pittet, Patrick; Sigaud, Monique; Renaud, Louis; Vittori, Olivier; Morin, Pierre; Ouaini, Naim; Ferrigno, Rosaria

    2009-03-01

    This paper describes a microfluidic device, fabricated in poly(dimethylsiloxane), that was employed to perform amperometric quantifications using on-chip calibration curves and on-chip standard addition methods. The device integrates a network of Au electrodes within a microfluidic structure designed for automatic preparation of a series of solutions containing an electroactive molecule at linearly decreasing concentrations. The device was first characterized by fluorescence microscopy and then evaluated with a model electroactive molecule, Fe(CN)6^4-. Performing quantification in this parallel microfluidic format rather than in batch mode reduces analysis time. Moreover, the microfluidic approach is compatible with on-chip calibration of the sensors simultaneously with the analysis, thereby preventing problems due to sensor response drift over time. Using the on-chip calibration and on-chip standard addition methods, we achieved concentration estimates accurate to better than 5%. We also demonstrated that, compared to the calibration curve approach, the standard addition mode is less complex to operate: in this case, it is not necessary to account for flow rate discrepancies, as it is in the calibration approach. PMID:19238282
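
    The standard addition method quantifies the unknown by spiking known increments of standard and extrapolating the signal to zero; a minimal sketch with invented currents:

        import numpy as np

        # Standard addition: amperometric current vs. added standard concentration.
        added = np.array([0.0, 50.0, 100.0, 150.0])   # uM Fe(CN)6^4- added
        current = np.array([0.42, 0.63, 0.85, 1.06])  # uA (assumed readings)

        slope, intercept = np.polyfit(added, current, 1)
        # Extrapolate to zero signal: the x-intercept magnitude is the unknown.
        c_unknown = intercept / slope
        print(f"unknown ~ {c_unknown:.1f} uM")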

  4. Quantification of Methylated Selenium, Sulfur, and Arsenic in the Environment

    PubMed Central

    Vriens, Bas; Ammann, Adrian A.; Hagendorfer, Harald; Lenz, Markus; Berg, Michael; Winkel, Lenny H. E.

    2014-01-01

    Biomethylation and volatilization of trace elements may contribute to their redistribution in the environment. However, quantification of volatile, methylated species in the environment is complicated by a lack of straightforward and field-deployable air sampling methods that preserve element speciation. This paper presents a robust and versatile gas trapping method for the simultaneous preconcentration of volatile selenium (Se), sulfur (S), and arsenic (As) species. Using HPLC-HR-ICP-MS and ESI-MS/MS analyses, we demonstrate that volatile Se and S species are efficiently transformed into specific non-volatile compounds during trapping, which enables deduction of the original gaseous speciation. With minor adaptations, the presented HPLC-HR-ICP-MS method also allows quantification of 13 non-volatile methylated species and oxyanions of Se, S, and As in natural waters. Application of these methods in a peatland indicated that, at the selected sites, fluxes were 190–210 ng Se·m−2·d−1, 90–270 ng As·m−2·d−1, and 4–14 µg S·m−2·d−1, and comprised at least 70% methylated Se and S species. In the surface water, methylated species were particularly abundant for As (>50% of total As). Our results indicate that methylation plays a significant role in the biogeochemical cycles of these elements. PMID:25047128

  5. Instantaneous Wavenumber Estimation for Damage Quantification in Layered Plate Structures

    NASA Technical Reports Server (NTRS)

    Mesnil, Olivier; Leckey, Cara A. C.; Ruzzene, Massimo

    2014-01-01

    This paper illustrates the application of instantaneous and local wavenumber damage quantification techniques for high-frequency guided wave interrogation. The proposed methodologies can be considered first steps towards a hybrid structural health monitoring/nondestructive evaluation (SHM/NDE) approach for damage assessment in composites. The challenges and opportunities related to this type of interrogation and signal processing are explored through the analysis of numerical data obtained via EFIT simulations of damage in CFRP plates. Realistic damage configurations are modeled from x-ray CT scan data of plates subjected to actual impacts, in order to accurately predict wave-damage interactions in terms of scattering and mode conversions. The simulation data is utilized to enhance the information provided by instantaneous and local wavenumbers and to mitigate the complexity related to the multi-modal content of the plate response. Signal processing strategies considered for this purpose include modal decoupling through filtering in the frequency/wavenumber domain, the combination of displacement components, and the exploitation of polarization information for the various modes as evaluated through dispersion analysis of the considered laminate lay-up sequence. The results presented assess the effectiveness of the proposed wavefield processing approaches as a hybrid SHM/NDE technique for damage detection and quantification in composite, plate-like structures.

  6. Guided Wave Delamination Detection and Quantification With Wavefield Data Analysis

    NASA Technical Reports Server (NTRS)

    Tian, Zhenhua; Campbell Leckey, Cara A.; Seebo, Jeffrey P.; Yu, Lingyu

    2014-01-01

    Unexpected damage can occur in aerospace composites due to impact events or material stress during off-nominal loading events. In particular, laminated composites are susceptible to delamination damage due to weak transverse tensile and inter-laminar shear strengths. Development of reliable and quantitative techniques to detect delamination damage in laminated composites is imperative for safe and functional optimally-designed next-generation composite structures. In this paper, we investigate guided wave interactions with delamination damage and develop quantification algorithms by using wavefield data analysis. The trapped guided waves in the delamination region are observed from the wavefield data and further quantitatively interpreted by using different wavenumber analysis methods. The frequency-wavenumber representation of the wavefield shows that new wavenumbers are present and correlate to trapped waves in the damage region. These new wavenumbers are used to detect and quantify the delamination damage through wavenumber analysis, which can show how the wavenumber changes as a function of wave propagation distance. The location and spatial duration of the new wavenumbers can be identified, providing a useful means not only for detecting the presence of delamination damage but also for estimating the delamination size. Our method has been applied to detect and quantify real delamination damage with complex geometry (grown using a quasi-static indentation technique). The detection and quantification results show the location, size, and shape of the delamination damage.
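
    This record and the preceding one both rest on mapping a measured space-time wavefield into the frequency-wavenumber domain; a minimal sketch on a synthetic one-dimensional wavefield, with all values assumed:

        import numpy as np

        # Synthetic wavefield v(x, t): a single guided mode with wavenumber k0.
        nx, nt = 256, 1024
        dx, dt = 1e-3, 1e-6                  # 1 mm spatial, 1 us temporal sampling
        x = np.arange(nx) * dx
        t = np.arange(nt) * dt
        k0, f0 = 800.0, 200e3                # rad/m and Hz (assumed mode)
        v = np.sin(k0 * x[:, None] - 2 * np.pi * f0 * t[None, :])

        # Frequency-wavenumber representation via a 2D FFT.
        V = np.fft.fftshift(np.fft.fft2(v))
        k_axis = np.fft.fftshift(np.fft.fftfreq(nx, dx)) * 2 * np.pi  # rad/m
        f_axis = np.fft.fftshift(np.fft.fftfreq(nt, dt))              # Hz

        # The dominant ridge in |V| recovers the propagating wavenumber; a new,
        # delamination-induced wavenumber would appear as an extra ridge.
        ik, jf = np.unravel_index(np.argmax(np.abs(V)), V.shape)
        print(f"peak at k = {abs(k_axis[ik]):.0f} rad/m, "
              f"f = {abs(f_axis[jf]) / 1e3:.0f} kHz")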

  7. Quantification of nerve agent biomarkers in human serum and urine.

    PubMed

    Røen, Bent Tore; Sellevåg, Stig Rune; Lundanes, Elsa

    2014-12-01

    A novel method for rapid and sensitive quantification of the nerve agent metabolites ethyl, isopropyl, isobutyl, cyclohexyl, and pinacolyl methylphosphonic acid has been established by combining salting-out assisted liquid-liquid extraction (SALLE) and online solid phase extraction-liquid chromatography-tandem mass spectrometry (SPE-LC-MS/MS). The procedure allows confirmation of nerve agent exposure within 30 min from receiving a sample, with very low detection limits for the biomarkers of 0.04-0.12 ng/mL. Sample preparation by SALLE was performed in less than 10 min, with a common procedure for both serum and urine. Analyte recoveries of 70-100% were obtained using tetrahydrofuran as extraction solvent and Na2SO4 to achieve phase separation. After SALLE, selective analyte retention was obtained on a ZrO2 column by Lewis acid-base and hydrophilic interactions with acetonitrile/1% CH3COOH (82/18) as the loading mobile phase. The phosphonic acids were backflush-desorbed onto a polymeric zwitterionic column at pH 9.8 and separated by hydrophilic interaction liquid chromatography. The method was linear (R² ≥ 0.995) from the limits of quantification to 50 ng/mL, and the within- and between-assay repeatabilities at 20 ng/mL were below 5% and 10% relative standard deviation, respectively. PMID:25371246

  8. Simple and Inexpensive Quantification of Ammonia in Whole Blood

    PubMed Central

    Ayyub, Omar B.; Behrens, Adam M.; Heligman, Brian T.; Natoli, Mary E.; Ayoub, Joseph J.; Cunningham, Gary; Summar, Marshall; Kofinas, Peter

    2015-01-01

    Quantification of ammonia in whole blood has applications in the diagnosis and management of many hepatic diseases, including cirrhosis and rare urea cycle disorders, which together affect more than 5 million patients in the United States. Current techniques for ammonia measurement suffer from limited range, poor resolution, false positives, or large, complex sensor set-ups. Here we demonstrate a technique utilizing inexpensive reagents and simple methods for quantifying ammonia in 100 μl of whole blood. The sensor comprises a modified form of the indophenol reaction, which resists sources of destructive interference in blood, in conjunction with a cation-exchange membrane. The presented sensing scheme is selective against other amine-containing molecules such as amino acids and has a shelf life of at least 50 days. Additionally, the resulting system has high sensitivity and allows accurate, reliable quantification of ammonia in whole human blood over a range of at least 25 to 500 μM, which is clinically relevant for rare hyperammonemic disorders and liver disease. Furthermore, concentrations of 50 and 100 μM ammonia could be reliably discerned with p=0.0001. PMID:25936660

  9. Generation, Quantification, and Tracing of Metabolically Labeled Fluorescent Exosomes.

    PubMed

    Coscia, Carolina; Parolini, Isabella; Sanchez, Massimo; Biffoni, Mauro; Boussadia, Zaira; Zanetti, Cristiana; Fiani, Maria Luisa; Sargiacomo, Massimo

    2016-01-01

    Over the last 10 years, the constant progression in exosome (Exo)-related studies has highlighted the importance of these cell-derived nano-sized vesicles in cell biology and pathophysiology. Functional studies on Exo uptake and intracellular trafficking require accurate quantification to determine the number of Exo particles sufficient and/or necessary to elicit measurable effects on target cells. We used commercially available BODIPY(®) fatty acid analogues to label a primary melanoma cell line (Me501) that highly and spontaneously secretes nanovesicles. Upon addition to cell culture, BODIPY fatty acids are rapidly incorporated into the major phospholipid classes, ultimately producing fluorescent Exo as a direct result of biogenesis. Our metabolic labeling protocol produced bright fluorescent Exo that can be examined and quantified with conventional, non-customized flow cytometry (FC) instruments by exploiting their fluorescent emission rather than light-scattering detection. Furthermore, our methodology permits measurement of the transfer of single-Exo-associated fluorescence to cells, making the correlation between Exo uptake and activation of cellular processes quantitative. Thus, the protocol presented here is an appropriate tool for investigating mechanisms of Exo function, in that it allows direct and rapid characterization and quantification of fluorescent Exo number, intensity, and size, and eventual evaluation of their kinetics of uptake/secretion in target cells. PMID:27317184

  10. Accurate quantification of supercoiled DNA by digital PCR

    PubMed Central

    Dong, Lianhua; Yoo, Hee-Bong; Wang, Jing; Park, Sang-Ryoul

    2016-01-01

    Digital PCR (dPCR), as an enumeration-based quantification method, is capable of quantifying DNA copy number without the help of standards. However, it can generate false results when the PCR conditions are not optimized. A recent international comparison (CCQM P154) showed that most laboratories significantly underestimated the concentration of supercoiled plasmid DNA by dPCR. Usually, supercoiled DNA is linearized before dPCR to avoid such underestimation. The present study was conducted to overcome this problem. In the bilateral comparison, the National Institute of Metrology, China (NIM) optimized and applied dPCR for supercoiled DNA determination, whereas the Korea Research Institute of Standards and Science (KRISS) prepared the unknown samples and quantified them by flow cytometry. In this study, several factors, such as the choice of PCR master mix, fluorescent label, and primer positions, were evaluated for quantifying supercoiled DNA by dPCR. This work confirmed that a 16S PCR master mix avoided poor amplification of the supercoiled DNA, whereas HEX labels on the dPCR probe resulted in robust amplification curves. Optimizing the dPCR assay based on these two observations resulted in accurate quantification of supercoiled DNA without preanalytical linearization. The result agreed closely (101-113%) with that from flow cytometry. PMID:27063649

  11. Dimensionality reduction for uncertainty quantification of nuclear engineering models.

    SciTech Connect

    Roderick, O.; Wang, Z.; Anitescu, M.

    2011-01-01

    The task of uncertainty quantification consists of relating the available information on uncertainties in the model setup to the resulting variation in the outputs of the model. Uncertainty quantification plays an important role in complex simulation models of nuclear engineering, where better understanding of uncertainty results in greater confidence in the model and in the improved safety and efficiency of engineering projects. In our previous work, we have shown that the effect of uncertainty can be approximated by polynomial regression with derivatives (PRD): a hybrid regression method that uses first-order derivatives of the model output as additional fitting conditions for a polynomial expansion. Numerical experiments have demonstrated the advantage of this approach over classical methods of uncertainty analysis: in precision, computational efficiency, or both. To obtain derivatives, we used automatic differentiation (AD) on the simulation code; hand-coded derivatives are acceptable for simpler models. We now present improvements on the method. We use a tuned version of the method of snapshots, a technique based on proper orthogonal decomposition (POD), to set up the reduced order representation of essential information on uncertainty in the model inputs. The automatically obtained sensitivity information is required to set up the method. Dimensionality reduction in combination with PRD allows analysis on a larger dimension of the uncertainty space (>100), at modest computational cost.
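
    A toy sketch of the PRD regression itself, for a one-dimensional model whose derivative is known (in practice supplied by automatic differentiation); the value rows and derivative rows enter a single least-squares system:

        import numpy as np

        # Model output and its derivative at a few sampled inputs
        # (the derivative would come from automatic differentiation).
        def f(x):  return np.exp(0.5 * x)
        def df(x): return 0.5 * np.exp(0.5 * x)

        xs = np.array([0.0, 0.5, 1.0])

        # Quadratic expansion f(x) ~ c0 + c1*x + c2*x^2.
        V  = np.vander(xs, 3, increasing=True)        # rows [1, x, x^2]
        dV = np.column_stack([np.zeros_like(xs),      # d/dx of each basis term
                              np.ones_like(xs),
                              2 * xs])
        A = np.vstack([V, dV])                        # values + derivatives
        b = np.concatenate([f(xs), df(xs)])

        coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
        print(coeffs)                                 # fitted c0, c1, c2
        print(np.polyval(coeffs[::-1], 0.75), f(0.75))  # check at a new point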

  12. 43 CFR 11.73 - Quantification phase-resource recoverability analysis.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    43 CFR § 11.73, Public Lands: Interior, Office of the Secretary of the Interior, Natural Resource Damage Assessments, Type B Procedures. Quantification phase—resource recoverability analysis. (a) Requirement. The time needed...

  13. 21 CFR 530.24 - Procedure for announcing analytical methods for drug residue quantification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    21 CFR § 530.24, Food and Drugs, ...-Producing Animals. Procedure for announcing analytical methods for drug residue quantification. (a) FDA may issue an order announcing a specific analytical method or methods for the quantification...

  14. Quantification of L-Citrulline and other physiologic amino acids in watermelon and selected cucurbits

    Technology Transfer Automated Retrieval System (TEKTRAN)

    High performance liquid chromatography of dabsyl derivatives of amino acids was employed for quantification of physiologic amino acids in cucurbits. This method is particularly useful because the dabsyl derivatives of glutamine and citrulline are sufficiently separated to allow quantification of ea...

  15. The quantification of hydrogen and methane in contaminated groundwater: validation of robust procedures for sampling and quantification.

    PubMed

    Dorgerloh, Ute; Becker, Roland; Theissen, Hubert; Nehls, Irene

    2010-10-01

    A number of currently recommended sampling techniques for the determination of hydrogen in contaminated groundwater were compared with regard to their practical performance in field campaigns. Key characteristics of an appropriate sampling procedure are reproducibility of results and robustness against varying field conditions such as hydrostatic pressure, aquifer flow, and biological activity. Laboratory set-ups were used to investigate the most promising techniques. Bubble stripping with gas sampling bulbs yielded reproducible recoveries of hydrogen and methane, which were verified for groundwater sampled in two field campaigns. The methane content of the groundwater was confirmed by analysis of directly pumped samples, supporting the trueness of the stripping results. Laboratory set-ups and field campaigns revealed that bubble stripping of hydrogen may depend on the type of pump used: concentrations of dissolved hydrogen after bubble stripping with an electrically driven submersible pump were about one order of magnitude higher than those obtained by diffusion sampling. Gas chromatographic determination of hydrogen and methane requires manual injection of gas samples and detection with a pulsed discharge detector (PDD), and achieves limits of quantification of 3 nM for dissolved hydrogen and 1 µg L⁻¹ for dissolved methane in groundwater. The combined standard uncertainty of the bubble stripping and GC/PDD quantification of hydrogen in field samples was 7% at 7.8 nM and 18% at 78 nM. PMID:20730246

  16. Proteomics technologies for the global identification and quantification of proteins.

    PubMed

    Brewis, Ian A; Brennan, P

    2010-01-01

    This review provides an introduction for the nonspecialist to proteomics and, in particular, the major approaches available for global protein identification and quantification. Proteomics technologies offer considerable opportunities for improved biological understanding and biomarker discovery. The central platform for proteomics is tandem mass spectrometry (MS), but a number of other technologies, resources, and areas of expertise are absolutely required to perform meaningful experiments. These include protein separation science (and protein biochemistry in general), genomics, and bioinformatics. A range of workflows is available for protein (or peptide) separation prior to tandem MS and for the subsequent bioinformatics analysis needed to achieve protein identifications. The predominant approaches are 2D electrophoresis (2DE) with subsequent MS, liquid chromatography-MS (LC-MS), and GeLC-MS. Beyond protein identification, a number of well-established options are available for protein quantification. Difference gel electrophoresis (DIGE) following 2DE is one option, but MS-based methods (most commonly iTRAQ, Isobaric Tags for Relative and Absolute Quantification, or SILAC, Stable Isotope Labeling by Amino Acids in Cell Culture) are now the preferred options. Sample preparation is critical to performing good experiments, and subcellular fractionation can additionally provide protein localization information compared with whole-cell lysates. Differential detergent solubilization is another valid option. With biological fluids, it is possible to remove the most abundant proteins by immunodepletion. Sample enrichment is also used extensively in certain analyses, most commonly in phosphoproteomics with the initial purification of phosphopeptides. Proteomics produces considerable datasets, and the resources to facilitate the necessary extended analysis of these data are improving all the time. Beyond the opportunities afforded by proteomics, there are definite challenges to achieving full proteomic coverage.

  17. Critical points of DNA quantification by real-time PCR – effects of DNA extraction method and sample matrix on quantification of genetically modified organisms

    PubMed Central

    Cankar, Katarina; Štebih, Dejan; Dreo, Tanja; Žel, Jana; Gruden, Kristina

    2006-01-01

    Background: Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs), quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve, therefore similarity of PCR efficiency for the sample and standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Results: Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition, 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess variability of PCR performance within each sample matrix. Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results, therefore it was chosen as the primary

  18. Enhanced techniques for asymmetry quantification in brain imagery

    NASA Astrophysics Data System (ADS)

    Liu, Xin; Imielinska, Celina; Rosiene, Joel; Connolly, E. S.; D'Ambrosio, Anthony L.

    2006-03-01

    We present an automated, generic methodology for symmetry identification and asymmetry quantification: a novel method for identifying and delineating brain pathology by analyzing the opposing sides of the brain, exploiting its inherent left-right symmetry. After the symmetry axis has been detected, we apply non-parametric statistical tests operating on pairs of samples to identify initial seed points, defined as the pixels where the most statistically significant difference appears. Local region growing is then performed on the difference map, with the seeds aggregating neighboring pixels until all 8-way connected high signals in the difference map are captured. We illustrate the capability of our method with examples ranging from tumors in patient MR data to animal stroke data. Validation on rat stroke data has shown that this approach has promise to achieve high precision and full automation in segmenting lesions in reflectionally symmetric objects.
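
    A minimal sketch of the seed-and-grow step, assuming the statistical difference map has already been computed; the 8-way connectivity is implemented with an explicit stack:

        import numpy as np

        def grow_region(diff_map, seed, threshold):
            """Aggregate all 8-connected pixels above `threshold`, from `seed`."""
            mask = np.zeros(diff_map.shape, dtype=bool)
            stack = [seed]
            while stack:
                r, c = stack.pop()
                if mask[r, c] or diff_map[r, c] < threshold:
                    continue
                mask[r, c] = True
                for dr in (-1, 0, 1):             # push all 8 neighbors
                    for dc in (-1, 0, 1):
                        rr, cc = r + dr, c + dc
                        if 0 <= rr < diff_map.shape[0] and 0 <= cc < diff_map.shape[1]:
                            stack.append((rr, cc))
            return mask

        # Toy difference map with one high-signal blob; the seed is its hottest pixel.
        dm = np.zeros((64, 64))
        dm[20:30, 35:45] = 5.0
        seed = np.unravel_index(np.argmax(dm), dm.shape)
        print(grow_region(dm, seed, threshold=2.5).sum())  # -> 100 pixels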

  19. Quantification of intracerebral steal in patients with arteriovenous malformation

    SciTech Connect

    Homan, R.W.; Devous, M.D. Sr.; Stokely, E.M.; Bonte, F.J.

    1986-08-01

    Eleven patients with angiographically and/or pathologically proved arteriovenous malformations (AVMs) were studied using dynamic, single-photon-emission computed tomography (DSPECT). Quantification of regional cerebral blood flow in structurally normal areas remote from the AVM disclosed areas of decreased flow compared with normal controls in eight of 11 patients examined. Areas of hypoperfusion correlated with altered function as manifested by epileptogenic foci and impaired cognitive function. Dynamic, single-photon-emission computed tomography provides a noninvasive technique to monitor quantitatively hemodynamic changes associated with AVMs. Our findings suggest that such changes are present in the majority of patients with AVMs and that they may be clinically significant. The potential application of regional cerebral blood flow imaging by DSPECT in the management of patients with AVMs is discussed.

  20. Quantification of Posterior Globe Flattening: Methodology Development and Validation

    NASA Technical Reports Server (NTRS)

    Lumpkins, Sarah B.; Garcia, Kathleen M.; Sargsyan, Ashot E.; Hamilton, Douglas R.; Berggren, Michael D.; Ebert, Douglas

    2012-01-01

    Microgravity exposure affects visual acuity in a subset of astronauts, and mechanisms may include structural changes in the posterior globe and orbit. In particular, posterior globe flattening has been implicated in the eyes of several astronauts. This phenomenon is known to affect some terrestrial patient populations and has been shown to be associated with intracranial hypertension. It is commonly assessed by magnetic resonance imaging (MRI), computed tomography (CT), or B-mode ultrasound (US), without consistent objective criteria. NASA uses a semiquantitative scale of 0-3 as part of eye/orbit MRI and US analysis for occupational monitoring purposes. The goal of this study was to initiate development of an objective quantification methodology to monitor small changes in posterior globe flattening.

  1. Segmentation and quantification of adipose tissue by magnetic resonance imaging.

    PubMed

    Hu, Houchun Harry; Chen, Jun; Shen, Wei

    2016-04-01

    In this brief review, introductory concepts in animal and human adipose tissue segmentation using proton magnetic resonance imaging (MRI) and computed tomography are summarized in the context of obesity research. Adipose tissue segmentation and quantification using spin relaxation-based (e.g., T1-weighted, T2-weighted), relaxometry-based (e.g., T1-, T2-, T2*-mapping), chemical-shift selective, and chemical-shift encoded water-fat MRI pulse sequences are briefly discussed. The continuing interest to classify subcutaneous and visceral adipose tissue depots into smaller sub-depot compartments is mentioned. The use of a single slice, a stack of slices across a limited anatomical region, or a whole body protocol is considered. Common image post-processing steps and emerging atlas-based automated segmentation techniques are noted. Finally, the article identifies some directions of future research, including a discussion on the growing topic of brown adipose tissue and related segmentation considerations. PMID:26336839

  2. NeuCode Labels for Relative Protein Quantification *

    PubMed Central

    Merrill, Anna E.; Hebert, Alexander S.; MacGilvray, Matthew E.; Rose, Christopher M.; Bailey, Derek J.; Bradley, Joel C.; Wood, William W.; El Masri, Marwan; Westphall, Michael S.; Gasch, Audrey P.; Coon, Joshua J.

    2014-01-01

    We describe a synthesis strategy for the preparation of lysine isotopologues that differ in mass by as little as 6 mDa. We demonstrate that incorporation of these molecules into the proteomes of actively growing cells does not affect cellular proliferation, and we discuss how to use the embedded mass signatures (neutron encoding (NeuCode)) for multiplexed proteome quantification by means of high-resolution mass spectrometry. NeuCode SILAC amalgamates the quantitative accuracy of SILAC with the multiplexing of isobaric tags and, in doing so, offers up new opportunities for biological investigation. We applied NeuCode SILAC to examine the relationship between transcript and protein levels in yeast cells responding to environmental stress. Finally, we monitored the time-resolved responses of five signaling mutants in a single 18-plex experiment. PMID:24938287

  3. Quantification of tidal parameters from Solar System data

    NASA Astrophysics Data System (ADS)

    Lainey, Valéry

    2016-05-01

    Tidal dissipation is the main driver of the orbital evolution of natural satellites and a key to understanding exoplanetary system configurations. Despite its importance, its quantification from observations remains difficult for most objects in our own Solar System. In this work, we review the method that has been used to determine the tidal parameters directly from observations, with emphasis on the Love number k_2 and the tidal quality factor Q. Up-to-date values of these tidal parameters are summarized. Finally, we assess the prospects for determining the tidal ratio k_2/Q of Uranus and Neptune. This may be particularly relevant for upcoming astrometric campaigns and future space missions focused on these systems.

  4. A novel definition for quantification of mode shape complexity

    NASA Astrophysics Data System (ADS)

    Koruk, Hasan; Sanliturk, Kenan Y.

    2013-07-01

    Complex mode shapes are quite often encountered in structural dynamics. However, there is no universally accepted parameter for the quantification of mode shape complexity. After reviewing the existing methods, a novel approach is proposed in this paper to quantify mode shape complexity for general structures. The new parameter is based on the principle of conservation of energy for a structure vibrating in a specific mode over one period of vibration. The complexity levels of the individual mode shapes of a sample structure are then quantified using the proposed parameter and the other parameters available in the literature. The corresponding results are compared, and the validity and generality of the new parameter are demonstrated for various damping scenarios.

  5. Quantification of hydroxyacetone and glycolaldehyde using chemical ionization mass spectrometry

    NASA Astrophysics Data System (ADS)

    Spencer, K. M.; Beaver, M. R.; St. Clair, J. M.; Crounse, J. D.; Paulot, F.; Wennberg, P. O.

    2011-08-01

    Chemical ionization mass spectrometry (CIMS) enables online, fast, in situ detection and quantification of hydroxyacetone and glycolaldehyde. Two different CIMS approaches are demonstrated employing the strengths of single quadrupole mass spectrometry and triple quadrupole (tandem) mass spectrometry. Both methods are capable of the measurement of hydroxyacetone, an analyte with minimal isobaric interferences. Tandem mass spectrometry provides direct separation of the isobaric compounds glycolaldehyde and acetic acid using distinct, collision-induced dissociation daughter ions. Measurement of hydroxyacetone and glycolaldehyde by these methods was demonstrated during the ARCTAS-CARB 2008 campaign and the BEARPEX 2009 campaign. Enhancement ratios of these compounds in ambient biomass burning plumes are reported for the ARCTAS-CARB campaign. BEARPEX observations are compared to simple photochemical box model predictions of biogenic volatile organic compound oxidation at the site.

  6. From Quantification to Visualization: A Taxonomy of Uncertainty Visualization Approaches

    PubMed Central

    Potter, Kristin; Rosen, Paul; Johnson, Chris R.

    2014-01-01

    Quantifying uncertainty is an increasingly important topic across many domains. The uncertainties present in data come with many diverse representations having originated from a wide variety of disciplines. Communicating these uncertainties is a task often left to visualization without clear connection between the quantification and visualization. In this paper, we first identify frequently occurring types of uncertainty. Second, we connect those uncertainty representations to ones commonly used in visualization. We then look at various approaches to visualizing this uncertainty by partitioning the work based on the dimensionality of the data and the dimensionality of the uncertainty. We also discuss noteworthy exceptions to our taxonomy along with future research directions for the uncertainty visualization community. PMID:25663949

  7. Methods for the efficient quantification of fruit provitamin A contents.

    PubMed

    Davey, Mark W; Keulemans, Johan; Swennen, Rony

    2006-12-15

    As part of a screening program to identify micronutrient-rich banana and plantain (Musa) varieties, a simple, robust, and comparatively rapid protocol for the quantification of the provitamin A carotenoid content of fruit pulp and peel tissues by HPLC and by spectrophotometry has been developed. Major points to note include the use of lyophilisation and extensive tissue disruption to ensure quantitative recoveries, and the avoidance of saponification and/or concentration steps, which lead to significant losses of provitamin A carotenoids. The protocol showed excellent reproducibility between replicate extractions, without the need for an internal standard. Application of the methodology demonstrated that Musa fruit pulp has a relatively simple provitamin A carotenoid content, quite different from that of the overlying peel, and that the proportions of alpha- and beta-carotene are characteristic of each genotype. The protocol was also used to profile the provitamin A carotenoids of several other fruits. PMID:17049540

  8. Dielectrophoretic immobilization of proteins: Quantification by atomic force microscopy.

    PubMed

    Laux, Eva-Maria; Knigge, Xenia; Bier, Frank F; Wenger, Christian; Hölzel, Ralph

    2015-09-01

    The combination of alternating electric fields with nanometer-sized electrodes allows the permanent immobilization of proteins by dielectrophoretic force. Here, atomic force microscopy is introduced as a quantification method, and the results are compared with fluorescence microscopy. Experimental parameters, for example the applied voltage and the duration of field application, are varied systematically, and their influence on the amount of immobilized protein is investigated. Atomic force microscopy revealed a linear dependence on the duration of field application, and both microscopy methods yield a quadratic dependence of the amount of immobilized protein on the applied voltage. While fluorescence microscopy allows real-time imaging, atomic force microscopy reveals immobilized proteins obscured in fluorescence images by low S/N. Furthermore, the higher spatial resolution of the atomic force microscope enables visualization of the protein distribution on single nanoelectrodes. The electric field distribution is calculated and compared to experimental results, with very good agreement with the atomic force microscopy measurements. PMID:26010162

  9. Graphene wrinkling induced by monodisperse nanoparticles: facile control and quantification

    PubMed Central

    Vejpravova, Jana; Pacakova, Barbara; Endres, Jan; Mantlikova, Alice; Verhagen, Tim; Vales, Vaclav; Frank, Otakar; Kalbac, Martin

    2015-01-01

    Controlled wrinkling of single-layer graphene (1-LG) at the nanometer scale was achieved by introducing monodisperse nanoparticles (NPs), with size comparable to the strain coherence length, underneath the 1-LG. A typical fingerprint of the delaminated fraction is identified as a substantial contribution to the principal Raman modes of the 1-LG (G and G'). Correlation analysis of the Raman shifts of the G and G' modes clearly resolved the 1-LG in contact with, and delaminated from, the substrate. The intensity of the Raman features of the delaminated 1-LG increases linearly with the amount of wrinkles, as determined by advanced processing of atomic force microscopy data. Our study thus offers a universal approach for both fine tuning and facile quantification of graphene topography up to ~60% wrinkling. PMID:26530787

  10. Quantification of Posterior Globe Flattening: Methodology Development and Validation

    NASA Technical Reports Server (NTRS)

    Lumpkins, S. B.; Garcia, K. M.; Sargsyan, A. E.; Hamilton, D. R.; Berggren, M. D.; Antonsen, E.; Ebert, D.

    2011-01-01

    Microgravity exposure affects visual acuity in a subset of astronauts, and mechanisms may include structural changes in the posterior globe and orbit. Particularly, posterior globe flattening has been implicated in several astronauts. This phenomenon is known to affect some terrestrial patient populations, and has been shown to be associated with intracranial hypertension. It is commonly assessed by magnetic resonance imaging (MRI), computed tomography (CT), or B-mode ultrasound (US), without consistent objective criteria. NASA uses a semi-quantitative scale of 0-3 as part of eye/orbit MRI and US analysis for occupational monitoring purposes. The goal of this study was to initiate development of an objective quantification methodology for posterior globe flattening.