Science.gov

Sample records for quantification spatialisation vulnerabilite

  1. RASOR Project: Rapid Analysis and Spatialisation of Risk, from Hazard to Risk using EO data

    NASA Astrophysics Data System (ADS)

    Rossi, Lauro; Rudari, Roberto

    2016-04-01

    Over recent decades there has been a dramatic rise in disasters and in their impact on human populations. The growing complexity of our societies is making risks increasingly difficult to understand and is changing the ways in which hazards interact with each other. The Rapid Analysis and Spatialisation Of Risk (RASOR) project developed a multi-hazard risk analysis platform to support the full cycle of disaster management. RASOR provides up-to-date hazard information across floods and geohazards, up-to-date exposure data from known sources and newly generated EO-based data, and quantitative characterisations of vulnerability. RASOR also adapts the newly developed 12 m resolution global TanDEM-X Digital Elevation Model (DEM) to risk management applications, using it as a base layer for developing specific disaster scenarios. RASOR overlays archived and near-real-time very-high-resolution optical and radar satellite data, combined with in situ data, for both global and local applications. A scenario-driven query system allows users to project situations into the future and to model multi-hazard risk both before and during an event. Applications at several case-study sites are presented to illustrate the platform's potential.

  2. Dystrophin quantification

    PubMed Central

    Anthony, Karen; Arechavala-Gomeza, Virginia; Taylor, Laura E.; Vulin, Adeline; Kaminoh, Yuuki; Torelli, Silvia; Feng, Lucy; Janghra, Narinder; Bonne, Gisèle; Beuvin, Maud; Barresi, Rita; Henderson, Matt; Laval, Steven; Lourbakos, Afrodite; Campion, Giles; Straub, Volker; Voit, Thomas; Sewry, Caroline A.; Morgan, Jennifer E.; Flanigan, Kevin M.

    2014-01-01

    Objective: We formed a multi-institution collaboration in order to compare dystrophin quantification methods, reach a consensus on the most reliable method, and report its biological significance in the context of clinical trials. Methods: Five laboratories with expertise in dystrophin quantification performed a data-driven comparative analysis of a single reference set of normal and dystrophinopathy muscle biopsies using quantitative immunohistochemistry and Western blotting. We developed standardized protocols and assessed inter- and intralaboratory variability over a wide range of dystrophin expression levels. Results: Results from the different laboratories were highly concordant with minimal inter- and intralaboratory variability, particularly with quantitative immunohistochemistry. There was a good level of agreement between data generated by immunohistochemistry and Western blotting, although immunohistochemistry was more sensitive. Furthermore, mean dystrophin levels determined by alternative quantitative immunohistochemistry methods were highly comparable. Conclusions: Considering the biological function of dystrophin at the sarcolemma, our data indicate that quantitative immunohistochemistry and Western blotting, used in combination, are reliable biochemical outcome measures for Duchenne muscular dystrophy clinical trials, and that standardized protocols can be comparable between competent laboratories. The methodology validated in our study will facilitate the development of experimental therapies focused on dystrophin production and their regulatory approval. PMID:25355828

  3. Scoliosis quantification: an overview

    PubMed Central

    Kawchuk, Greg; McArthur, Ross

    1997-01-01

    Scoliotic curvatures have long been a focus of attention for clinicians and research scientists alike. The study, treatment and, ultimately, the prevention of this prevalent health condition are impeded by the absence of an accurate, reliable, convenient and safe method of scoliosis quantification. The purpose of this paper is to provide an overview of the current methods of scoliosis quantification for clinicians who address this condition in their practices.

  4. Quantification of nonclassicality

    NASA Astrophysics Data System (ADS)

    Gehrke, C.; Sperling, J.; Vogel, W.

    2012-11-01

    To quantify single-mode nonclassicality, we start from an operational approach. A positive semidefinite observable is introduced to describe a measurement setup. The quantification is based on the negativity of the normally ordered version of this observable. Perfect operational quantumness corresponds to the quantum-noise-free measurement of the chosen observable. Surprisingly, even moderately squeezed states may exhibit perfect quantumness for a properly designed measurement. The quantification is also considered from an axiomatic viewpoint, based on the algebraic structure of the quantum states and the quantum superposition principle. Basic conclusions from both approaches are consistent with this fundamental principle of the quantum world.

  5. Quantificational logic of context

    SciTech Connect

    Buvac, Sasa

    1996-12-31

    In this paper we extend the Propositional Logic of Context, to the quantificational (predicate calculus) case. This extension is important in the declarative representation of knowledge for two reasons. Firstly, since contexts are objects in the semantics which can be denoted by terms in the language and which can be quantified over, the extension enables us to express arbitrary first-order properties of contexts. Secondly, since the extended language is no longer only propositional, we can express that an arbitrary predicate calculus formula is true in a context. The paper describes the syntax and the semantics of a quantificational language of context, gives a Hilbert style formal system, and outlines a proof of the system's completeness.
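
    As a flavor of what the extended language can express (an illustrative formula of my own, not one taken from the paper), one can quantify over contexts c and assert that a predicate-calculus formula holds in a context via the ist modality:

        ∀c. ist(c, ∀x (Bird(x) → Flies(x)))

    i.e., in every context it is true that all birds fly.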

  6. Wrappers, Aspects, Quantification and Events

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.

    2005-01-01

    Talk overview: Object Infrastructure Framework (OIF), a system developed to simplify building distributed applications by allowing independent implementation of multiple concerns. Essence and state of AOP. Trinity. Quantification over events. Current work on a generalized AOP technology.

  7. Nitrogen quantification with SNMS

    NASA Astrophysics Data System (ADS)

    Goschnick, J.; Natzeck, C.; Sommer, M.

    1999-04-01

    Plasma-based secondary neutral mass spectrometry (plasma SNMS) is a powerful analytical method for determining the elemental concentrations of almost any kind of material at low cost by using a cheap quadrupole mass filter. However, a quadrupole-based mass spectrometer is limited to nominal mass resolution. Atomic signals are sometimes superimposed by molecular signals (two- or three-atom clusters such as CH+, CH2+, or metal oxide clusters) and/or intensities of doubly charged species. Especially in the case of nitrogen, several interferences can impede the quantification. This article reports on methods to recognize and deconvolute superpositions of N+ with CH2+, Li2+, and Si2+ at mass 14 Da (dalton) occurring during analysis of organic and inorganic substances. The recognition is based on the signal pattern of N+, Li+, CH+, and Si+; the latter serve as indicators for a probable interference of molecular or doubly charged species with N at mass 14 Da. The subsequent deconvolution uses the different shapes of atomic and cluster kinetic energy distributions (kEDs) to determine the intensity components by a linear fit of N+ and non-atomic kEDs, obtained from several organic and inorganic standards, to the measured kED. The atomic intensity fraction yields a much better nitrogen concentration than the total intensity at mass 14 Da after correction.
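
    For intuition, a minimal sketch of the deconvolution step described above (my own construction, not the authors' code): the measured kED at mass 14 Da is fit as a non-negative linear combination of reference kEDs taken from standards, and the atomic fraction is read off the fitted coefficients.

```python
# Hypothetical illustration of the kED deconvolution: fit the measured
# kinetic energy distribution (kED) at nominal mass 14 Da as a non-negative
# linear combination of reference kEDs measured on standards.
import numpy as np
from scipy.optimize import nnls

energy = np.linspace(0, 50, 100)  # eV grid (arbitrary)

def gauss(mu, sigma):
    return np.exp(-0.5 * ((energy - mu) / sigma) ** 2)

# Reference kEDs (shapes are illustrative): the atomic N+ distribution is
# broader and peaks at higher energy than the molecular CH2+ distribution.
ked_N = gauss(12.0, 8.0)
ked_CH2 = gauss(4.0, 2.5)

# Synthetic "measured" kED: 70% atomic, 30% molecular, plus noise.
measured = 0.7 * ked_N + 0.3 * ked_CH2 + 0.01 * np.random.rand(energy.size)

A = np.column_stack([ked_N, ked_CH2])
coeffs, _ = nnls(A, measured)            # non-negative linear fit
print(f"estimated atomic N+ fraction: {coeffs[0] / coeffs.sum():.2f}")
```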

  8. Quantification of human responses

    NASA Technical Reports Server (NTRS)

    Steinlage, R. C.; Gantner, T. E.; Lim, P. Y. W.

    1992-01-01

    Human perception is a complex phenomenon which is difficult to quantify with instruments. For this reason, large panels of people are often used to elicit and aggregate subjective judgments. Print quality, taste, smell, sound quality of a stereo system, softness, and the grading of Olympic divers and skaters are some examples of situations where subjective measurements or judgments are paramount. We usually express what is in our minds through the medium of language, but languages offer only a limited choice of vocabulary, and as a result our verbalizations are only approximate expressions of what we really have in mind. For lack of better methods to quantify subjective judgments, it is customary to set up a numerical scale such as 1, 2, 3, 4, 5 or 1, 2, 3, ..., 9, 10 for characterizing human responses, with no justification except that these scales are easy to understand and convenient to use. But such numerical scales are arbitrary simplifications of the complex human mind; the human mind is not restricted to such simple numerical variations. In fact, human responses and subjective judgments are psychophysical phenomena that are fuzzy entities and therefore difficult to handle with conventional mathematics and probability theory. The fuzzy mathematical approach provides a more realistic insight into understanding and quantifying human responses. This paper presents a method for quantifying human responses and subjective judgments without assuming a pattern of linear or numerical variation. In particular, the quantification and evaluation of linguistic judgments were investigated.
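
    As a flavor of the fuzzy approach (a minimal sketch, not the authors' method), a linguistic label can be modeled as a triangular fuzzy set over the response axis, so that a single response belongs partially to several labels at once:

```python
# Minimal sketch (not the authors' method): a linguistic label such as
# "good" is a fuzzy set, so a response belongs to it to a degree in [0, 1].
def triangular(x, a, b, c):
    """Triangular membership function peaking at b, zero outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    if x == b:
        return 1.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Hypothetical fuzzy sets for three linguistic labels on a 0-10 axis.
labels = {
    "poor": (0.0, 0.0, 4.0),
    "fair": (2.0, 5.0, 8.0),
    "good": (6.0, 10.0, 10.0),
}

# A response of 7 belongs partially to "fair" and "good" at once, rather
# than falling into a single numerical category.
for name, (a, b, c) in labels.items():
    print(f"{name}: {triangular(7.0, a, b, c):.2f}")
```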

  9. MAMA Software Features: Quantification Verification Documentation-1

    SciTech Connect

    Ruggiero, Christy E.; Porter, Reid B.

    2014-05-21

    This document reviews the verification of the basic shape quantification attributes in the MAMA software against hand calculations, in order to show that the calculations are implemented correctly and give the expected quantification results.

  10. Statistical Approach to Protein Quantification

    PubMed Central

    Gerster, Sarah; Kwon, Taejoon; Ludwig, Christina; Matondo, Mariette; Vogel, Christine; Marcotte, Edward M.; Aebersold, Ruedi; Bühlmann, Peter

    2014-01-01

    A major goal in proteomics is the comprehensive and accurate description of a proteome. This task includes not only the identification of proteins in a sample, but also the accurate quantification of their abundance. Although mass spectrometry typically provides information on peptide identity and abundance in a sample, it does not directly measure the concentration of the corresponding proteins. Specifically, most mass-spectrometry-based approaches (e.g. shotgun proteomics or selected reaction monitoring) allow one to quantify peptides using chromatographic peak intensities or spectral counting information. Ultimately, based on these measurements, one wants to infer the concentrations of the corresponding proteins. Inferring properties of the proteins based on experimental peptide evidence is often a complex problem because of the ambiguity of peptide assignments and different chemical properties of the peptides that affect the observed concentrations. We present SCAMPI, a novel generic and statistically sound framework for computing protein abundance scores based on quantified peptides. In contrast to most previous approaches, our model explicitly includes information from shared peptides to improve protein quantitation, especially in eukaryotes with many homologous sequences. The model accounts for uncertainty in the input data, leading to statistical prediction intervals for the protein scores. Furthermore, peptides with extreme abundances can be reassessed and classified as either regular data points or actual outliers. We used the proposed model with several datasets and compared its performance to that of other, previously used approaches for protein quantification in bottom-up mass spectrometry. PMID:24255132
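
    A toy illustration of why shared peptides matter (not the SCAMPI model itself): if peptide intensities are modeled as sums of the abundances of all proteins that can produce them, protein abundances can be recovered by non-negative least squares over the peptide-protein mapping.

```python
# Toy shared-peptide model (not SCAMPI itself): each peptide intensity is
# the sum of the abundances of the proteins it maps to, plus noise, and
# protein abundances are recovered by non-negative least squares.
import numpy as np
from scipy.optimize import nnls

mapping = np.array([          # rows: peptides, columns: proteins A, B
    [1, 0],                   # peptide 0 unique to A
    [1, 0],                   # peptide 1 unique to A
    [1, 1],                   # peptide 2 shared by A and B
    [0, 1],                   # peptide 3 unique to B
], dtype=float)

true_abundance = np.array([10.0, 3.0])
intensities = mapping @ true_abundance + np.random.normal(0, 0.1, 4)

estimate, _ = nnls(mapping, intensities)
print("estimated protein abundances:", estimate.round(2))
```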

  11. Quantification of wastewater sludge dewatering.

    PubMed

    Skinner, Samuel J; Studer, Lindsay J; Dixon, David R; Hillis, Peter; Rees, Catherine A; Wall, Rachael C; Cavalida, Raul G; Usher, Shane P; Stickland, Anthony D; Scales, Peter J

    2015-10-01

    Quantification and comparison of the dewatering characteristics of fifteen sewage sludges from a range of digestion scenarios are described. The method proposed uses laboratory dewatering measurements and integrity analysis of the extracted material properties. These properties were used as inputs into a model of filtration, the output of which provides the dewatering comparison. This method is shown to be necessary for quantification and comparison of dewaterability, as the permeability and compressibility of the sludges vary by up to ten orders of magnitude over the range of solids concentrations of interest to industry. This causes a high sensitivity of the dewaterability comparison to the starting concentration of the laboratory tests, so a simple dewaterability comparison based on parameters such as the specific resistance to filtration is difficult. The new approach is demonstrated to be robust relative to traditional methods such as specific resistance to filtration analysis and has an in-built integrity check. Comparison of the quantified dewaterability of the fifteen sludges to the relative volatile solids content showed a very strong correlation in the volatile solids range from 40 to 80%. The data indicate that the volatile solids parameter is a strong indicator of the dewatering behaviour of sewage sludges. PMID:26003332

  12. Detection and Quantification of Neurotransmitters in Dialysates

    PubMed Central

    Zapata, Agustin; Chefer, Vladimir I.; Shippenberg, Toni S.; Denoroy, Luc

    2010-01-01

    Sensitive analytical methods are needed for the separation and quantification of neurotransmitters obtained in microdialysate studies. This unit describes methods that permit quantification of nanomolar concentrations of monoamines and their metabolites (high-performance liquid chromatography with electrochemical detection), acetylcholine (HPLC coupled to an enzyme reactor), and amino acids (HPLC with fluorescence detection; capillary electrophoresis with laser-induced fluorescence detection). PMID:19575473

  13. Advancing agricultural greenhouse gas quantification

    NASA Astrophysics Data System (ADS)

    Olander, Lydia; Wollenberg, Eva; Tubiello, Francesco; Herold, Martin

    2013-03-01

    1. Introduction Better information on greenhouse gas (GHG) emissions and mitigation potential in the agricultural sector is necessary to manage these emissions and identify responses that are consistent with the food security and economic development priorities of countries. Critical activity data (what crops or livestock are managed in what way) are poor or lacking for many agricultural systems, especially in developing countries. In addition, the currently available methods for quantifying emissions and mitigation are often too expensive or complex or not sufficiently user friendly for widespread use. The purpose of this focus issue is to capture the state of the art in quantifying greenhouse gases from agricultural systems, with the goal of better understanding our current capabilities and near-term potential for improvement, with particular attention to quantification issues relevant to smallholders in developing countries. This work is timely in light of international discussions and negotiations around how agriculture should be included in efforts to reduce and adapt to climate change impacts, and considering that significant climate financing to developing countries in post-2012 agreements may be linked to their increased ability to identify and report GHG emissions (Murphy et al 2010, CCAFS 2011, FAO 2011). 2. Agriculture and climate change mitigation The main agricultural GHGs—methane and nitrous oxide—account for 10%-12% of anthropogenic emissions globally (Smith et al 2008), or around 50% and 60% of total anthropogenic methane and nitrous oxide emissions, respectively, in 2005. Net carbon dioxide fluxes between agricultural land and the atmosphere linked to food production are relatively small, although significant carbon emissions are associated with degradation of organic soils for plantations in tropical regions (Smith et al 2007, FAO 2012). Population growth and shifts in dietary patterns toward more meat and dairy consumption will lead to

  14. Protein inference: A protein quantification perspective.

    PubMed

    He, Zengyou; Huang, Ting; Liu, Xiaoqing; Zhu, Peijun; Teng, Ben; Deng, Shengchun

    2016-08-01

    In mass spectrometry-based shotgun proteomics, protein quantification and protein identification are two major computational problems. To quantify the protein abundance, a list of proteins must first be inferred from the raw data. Then the relative or absolute protein abundance is estimated with quantification methods, such as spectral counting. Until now, most researchers have been dealing with these two processes separately. In fact, the protein inference problem can be regarded as a special protein quantification problem, in the sense that truly present proteins are those proteins whose abundance values are not zero. Some recently published papers have conceptually discussed this possibility. However, there is still a lack of rigorous experimental studies to test this hypothesis. In this paper, we investigate the feasibility of using protein quantification methods to solve the protein inference problem. Protein inference methods aim to determine whether each candidate protein is present in the sample or not. Protein quantification methods estimate the abundance value of each inferred protein. Naturally, the abundance value of an absent protein should be zero. Thus, we argue that the protein inference problem can be viewed as a special protein quantification problem in which one protein is considered to be present if its abundance is not zero. Based on this idea, our paper tries to use three simple protein quantification methods to solve the protein inference problem effectively. The experimental results on six data sets show that these three methods are competitive with previous protein inference algorithms. This demonstrates that it is plausible to model the protein inference problem as a special protein quantification task, which opens the door to devising more effective protein inference algorithms from a quantification perspective. The source codes of our methods are available at: http://code.google.com/p/protein-inference/. PMID:26935399
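
    A minimal sketch of the central idea (not the authors' code): estimate abundances with a naive spectral-count split across mapped proteins, then call a protein present exactly when its estimated abundance is nonzero.

```python
# Naive sketch: abundance by spectral counting with shared counts split
# evenly, then "present" means abundance > 0 (the paper's viewpoint that
# inference is a special case of quantification).
from collections import defaultdict

# Hypothetical matches: (peptide, candidate proteins, spectral count)
psms = [
    ("PEPTIDEA", ["P1"], 12),
    ("PEPTIDEB", ["P1", "P2"], 4),   # shared peptide
    ("PEPTIDEC", ["P3"], 0),         # candidate with no spectral support
]

abundance = defaultdict(float)
for _, proteins, count in psms:
    for p in proteins:
        abundance[p] += count / len(proteins)

present = sorted(p for p, a in abundance.items() if a > 0)
print("inferred proteins:", present)     # P3 drops out (zero abundance)
```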

  15. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

  16. MAMA Software Features: Visual Examples of Quantification

    SciTech Connect

    Ruggiero, Christy E.; Porter, Reid B.

    2014-05-20

    This document shows examples of the results from quantifying objects of certain sizes and types in the software. It is intended to give users a better feel for some of the quantification calculations, and, more importantly, to help users understand the challenges with using a small set of ‘shape’ quantification calculations for objects that can vary widely in shapes and features. We will add more examples to this in the coming year.

  17. Stirling Convertor Fasteners Reliability Quantification

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Korovaichuk, Igor; Kovacevich, Tiodor; Schreiber, Jeffrey G.

    2006-01-01

    Onboard Radioisotope Power Systems (RPS) being developed for NASA's deep-space science and exploration missions require reliable operation for up to 14 years and beyond. Stirling power conversion is a candidate for use in an RPS because it offers a multifold increase in the conversion efficiency of heat to electric power and a reduced inventory of radioactive material. Structural fasteners are responsible for maintaining the structural integrity of the Stirling power convertor, which is critical to ensure reliable performance during the entire mission. The design of fasteners involves variables related to fabrication, manufacturing, the material behavior of the fasteners and joined parts, the structural geometry of the joined components, the size and spacing of fasteners, mission loads, boundary conditions, etc. These variables have inherent uncertainties, which need to be accounted for in the reliability assessment. This paper describes these uncertainties along with a methodology to quantify reliability, and provides results of the analysis in terms of the quantified reliability and the sensitivity of Stirling power conversion reliability to the design variables. Quantification of the reliability includes both structural and functional aspects of the joined components. Based on the results, the paper also describes guidelines to improve reliability and verification testing.

  18. Uncertainty Quantification in Solidification Modelling

    NASA Astrophysics Data System (ADS)

    Fezi, K.; Krane, M. J. M.

    2015-06-01

    Numerical models have been used to simulate solidification processes and to gain insight into physical phenomena that cannot be observed experimentally. Validation of such models has often been done through comparison to a few experiments, or a single one, in which agreement depends on both model and experimental uncertainty. As a first step toward quantifying the uncertainty in the models, sensitivity and uncertainty analyses were performed on a simple steady-state 1D solidification model of the continuous casting of weld filler rod. This model, which includes conduction, advection, and the release of latent heat, was developed for use in uncertainty quantification of the calculated positions of the liquidus and solidus and of the solidification time. Using this model, a Smolyak sparse-grid algorithm constructed a response surface that fits model outputs over the range of uncertainty of the model inputs. The response surface was then used to determine the probability density functions (PDFs) of the model outputs and the sensitivities to the inputs. This process was carried out for a linear fraction-solid/temperature relationship, for which there is an analytical solution, and for a Scheil relationship. Similar analysis was also performed on a transient 2D model of solidification in a rectangular domain.
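
    A schematic of the surrogate-based workflow described above, with a plain quadratic fit standing in for the Smolyak sparse-grid construction; the model, input, and ranges are invented for illustration.

```python
# Schematic response-surface UQ: evaluate the "expensive" model at a few
# design points, fit a cheap surrogate, then Monte Carlo the surrogate.
# (A dense quadratic fit stands in for the Smolyak sparse grid.)
import numpy as np

def model(k):
    """Stand-in solidification model: solidus position vs. an uncertain
    thermal conductivity k (purely illustrative)."""
    return 0.3 * np.sqrt(k) + 0.05 * k

k_design = np.linspace(20.0, 40.0, 7)       # design points over input range
surface = np.poly1d(np.polyfit(k_design, model(k_design), deg=2))

k_samples = np.random.normal(30.0, 3.0, 100_000)   # input uncertainty
y = surface(k_samples)                             # cheap propagation
print(f"output mean {y.mean():.4f}, std {y.std():.4f}")
```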

  19. Uncertainty Quantification in Climate Modeling

    NASA Astrophysics Data System (ADS)

    Sargsyan, K.; Safta, C.; Berry, R.; Debusschere, B.; Najm, H.

    2011-12-01

    We address challenges that sensitivity analysis and uncertainty quantification methods face when dealing with complex computational models. In particular, climate models are computationally expensive and typically depend on a large number of input parameters. We consider the Community Land Model (CLM), which consists of a nested computational grid hierarchy designed to represent the spatial heterogeneity of the land surface. Each computational cell can be composed of multiple land types, and each land type can incorporate one or more sub-models describing the spatial and depth variability. Even for simulations at a regional scale, the computational cost of a single run is quite high and the number of parameters that control the model behavior is very large. Therefore, the parameter sensitivity analysis and uncertainty propagation face significant difficulties for climate models. This work employs several algorithmic avenues to address some of the challenges encountered by classical uncertainty quantification methodologies when dealing with expensive computational models, specifically focusing on the CLM as a primary application. First of all, since the available climate model predictions are extremely sparse due to the high computational cost of model runs, we adopt a Bayesian framework that effectively incorporates this lack-of-knowledge as a source of uncertainty, and produces robust predictions with quantified uncertainty even if the model runs are extremely sparse. In particular, we infer Polynomial Chaos spectral expansions that effectively encode the uncertain input-output relationship and allow efficient propagation of all sources of input uncertainties to outputs of interest. Secondly, the predictability analysis of climate models strongly suffers from the curse of dimensionality, i.e. the large number of input parameters. While single-parameter perturbation studies can be efficiently performed in a parallel fashion, the multivariate uncertainty analysis

  20. Separation and quantification of microalgal carbohydrates.

    PubMed

    Templeton, David W; Quinn, Matthew; Van Wychen, Stefanie; Hyman, Deborah; Laurens, Lieve M L

    2012-12-28

    Structural carbohydrates can constitute a large fraction of the dry weight of algal biomass, and thus accurate identification and quantification is important for summative mass closure. Two limitations to the accurate characterization of microalgal carbohydrates are the lack of a robust analytical procedure to hydrolyze polymeric carbohydrates to their respective monomers and the subsequent identification and quantification of those monosaccharides. We address the second limitation, chromatographic separation of monosaccharides, here by identifying optimum conditions for the resolution of a synthetic mixture of 13 microalgae-specific monosaccharides, comprising 8 neutral sugars, 2 amino sugars, 2 uronic acids and 1 alditol (myo-inositol as an internal standard). The synthetic 13-carbohydrate mix showed incomplete resolution across 11 traditional high-performance liquid chromatography (HPLC) methods, but showed improved resolution and accurate quantification using high-performance anion-exchange chromatography (HPAEC), as well as alditol acetate derivatization followed by gas chromatography (for the neutral and amino sugars only). We demonstrate the application of monosaccharide quantification using optimized chromatography conditions after sulfuric acid analytical hydrolysis for three model algae strains, and compare the quantification and complexity of monosaccharides in analytical hydrolysates relative to a typical terrestrial feedstock, sugarcane bagasse. PMID:23177152

  1. Carotid intraplaque neovascularization quantification software (CINQS).

    PubMed

    Akkus, Zeynettin; van Burken, Gerard; van den Oord, Stijn C H; Schinkel, Arend F L; de Jong, Nico; van der Steen, Antonius F W; Bosch, Johan G

    2015-01-01

    Intraplaque neovascularization (IPN) is an important biomarker of atherosclerotic plaque vulnerability. As IPN can be detected by contrast-enhanced ultrasound (CEUS), imaging biomarkers derived from CEUS may allow early prediction of plaque vulnerability. To select the best quantitative imaging biomarkers for prediction of plaque vulnerability, a systematic analysis of IPN with existing and new analysis algorithms is necessary. Currently available commercial contrast quantification tools are not applicable for quantitative analysis of carotid IPN due to substantial motion of the carotid artery, artifacts, and intermittent perfusion of plaques. We therefore developed a specialized software package called Carotid Intraplaque Neovascularization Quantification Software (CINQS). It was designed for effective and systematic comparison of sets of quantitative imaging biomarkers. CINQS includes several analysis algorithms for carotid IPN quantification and overcomes the limitations of current contrast quantification tools and existing carotid IPN quantification approaches. CINQS has a modular design which allows integrating new analysis tools. Wizard-like analysis tools and a graphical user interface facilitate its use. In this paper, we describe the concept, analysis tools, and performance of CINQS and present analysis results of 45 plaques of 23 patients. The results in 45 plaques showed excellent agreement with visual IPN scores for two quantitative imaging biomarkers (areas under the receiver operating characteristic curve of 0.92 and 0.93). PMID:25561454

  2. Quantification of sweat gland innervation

    PubMed Central

    Gibbons, Christopher H.; Illigens, Ben M. W.; Wang, Ningshan; Freeman, Roy

    2009-01-01

    Objective: To evaluate a novel method to quantify the density of nerve fibers innervating sweat glands in healthy control and diabetic subjects, to compare the results to an unbiased stereologic technique, and to identify the relationship to standardized physical examination and patient-reported symptom scores. Methods: Thirty diabetic and 64 healthy subjects had skin biopsies performed at the distal leg and distal and proximal thigh. Nerve fibers innervating sweat glands, stained with PGP 9.5, were imaged by light microscopy. Sweat gland nerve fiber density (SGNFD) was quantified by manual morphometry. As a gold standard, three additional subjects had biopsies analyzed by confocal microscopy using unbiased stereologic quantification. Severity of neuropathy was measured by standardized instruments including the Neuropathy Impairment Score in the Lower Limb (NIS-LL) while symptoms were measured by the Michigan Neuropathy Screening Instrument. Results: Manual morphometry increased with unbiased stereology (r = 0.93, p < 0.01). Diabetic subjects had reduced SGNFD compared to controls at the distal leg (p < 0.001), distal thigh (p < 0.01), and proximal thigh (p < 0.05). The SGNFD at the distal leg of diabetic subjects decreased as the NIS-LL worsened (r = −0.89, p < 0.001) and was concordant with symptoms of reduced sweat production (p < 0.01). Conclusions: We describe a novel method to quantify the density of nerve fibers innervating sweat glands. The technique differentiates groups of patients with mild diabetic neuropathy from healthy control subjects and correlates with both physical examination scores and symptoms relevant to sudomotor dysfunction. This method provides a reliable structural measure of sweat gland innervation that complements the investigation of small fiber neuropathies. GLOSSARY AOI = area of interest; CI = confidence interval; ICC = intraclass correlation coefficient; IENFD = intraepidermal nerve fiber density; IgG = immunoglobulin G; NIS

  3. Tumor Quantification in Clinical Positron Emission Tomography

    PubMed Central

    Bai, Bing; Bading, James; Conti, Peter S

    2013-01-01

    Positron emission tomography (PET) is used extensively in clinical oncology for tumor detection, staging and therapy response assessment. Quantitative measurements of tumor uptake, usually in the form of standardized uptake values (SUVs), have enhanced or replaced qualitative interpretation. In this paper we review the current status of tumor quantification methods and their applications to clinical oncology. Factors that impede quantitative assessment and limit its accuracy and reproducibility are summarized, with special emphasis on SUV analysis. We describe current efforts to improve the accuracy of tumor uptake measurements, characterize overall metabolic tumor burden and heterogeneity of tumor uptake, and account for the effects of image noise. We also summarize recent developments in PET instrumentation and image reconstruction and their impact on tumor quantification. Finally, we offer our assessment of the current development needs in PET tumor quantification, including practical techniques for fully quantitative, pharmacokinetic measurements. PMID:24312151
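
    For reference, the body-weight SUV mentioned above is a simple ratio of tissue activity concentration to injected dose per unit body mass; a minimal implementation (assuming decay correction upstream and the usual 1 g/mL tissue approximation):

```python
# Body-weight SUV: tissue activity concentration divided by injected dose
# per unit body mass (decay correction assumed done upstream; 1 mL of
# tissue is taken as 1 g, the usual approximation).
def suv_bw(activity_kbq_per_ml, injected_dose_mbq, body_weight_kg):
    dose_kbq = injected_dose_mbq * 1000.0        # MBq -> kBq
    weight_g = body_weight_kg * 1000.0           # kg -> g
    return activity_kbq_per_ml / (dose_kbq / weight_g)

# Example: 5 kBq/mL in a lesion, 370 MBq injected, 70 kg patient -> ~0.95.
print(round(suv_bw(5.0, 370.0, 70.0), 2))
```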

  4. HPC Analytics Support. Requirements for Uncertainty Quantification Benchmarks

    SciTech Connect

    Paulson, Patrick R.; Purohit, Sumit; Rodriguez, Luke R.

    2015-05-01

    This report outlines techniques for extending benchmark generation products so they support uncertainty quantification by benchmarked systems. We describe how uncertainty quantification requirements can be presented to candidate analytical tools supporting SPARQL. We describe benchmark data sets for evaluating uncertainty quantification, as well as an approach for using our benchmark generator to produce such data sets.

  5. Quantification of Cannabinoid Content in Cannabis

    NASA Astrophysics Data System (ADS)

    Tian, Y.; Zhang, F.; Jia, K.; Wen, M.; Yuan, Ch.

    2015-09-01

    Cannabis is an economically important plant that is used in many fields, in addition to being the most commonly consumed illicit drug worldwide. Monitoring the spatial distribution of cannabis cultivation and judging whether it is drug- or fiber-type cannabis is critical for governments and international communities to understand the scale of the illegal drug trade. The aim of this study was to investigate whether the cannabinoid content in cannabis could be spectrally quantified using a spectrometer and to identify the optimal wavebands for quantifying the cannabinoid content. Spectral reflectance data of dried cannabis leaf samples and the cannabis canopy were measured in the laboratory and in the field, respectively. Correlation analysis and the stepwise multivariate regression method were used to select the optimal wavebands for cannabinoid content quantification based on the laboratory-measured spectral data. The results indicated that the delta-9-tetrahydrocannabinol (THC) content in cannabis leaves could be quantified using laboratory-measured spectral reflectance data and that the 695 nm band is the optimal band for THC content quantification. This study provides prerequisite information for designing spectral equipment to enable immediate quantification of THC content in cannabis and to discriminate drug- from fiber-type cannabis based on THC content quantification in the field.
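
    A toy version of the band-selection step on synthetic data (the study's actual finding was that 695 nm is optimal; here the signal is planted at that band on purpose):

```python
# Synthetic sketch of the band-selection step: correlate THC content with
# reflectance at each waveband and pick the most predictive band. The
# THC-related absorption is planted at 695 nm to mirror the study's result.
import numpy as np

rng = np.random.default_rng(0)
wavelengths = np.arange(400, 1000, 5)               # nm
n = 40
thc = rng.uniform(0.2, 15.0, n)                     # % THC (synthetic)

reflectance = rng.normal(0.4, 0.05, (n, wavelengths.size))
band_695 = np.argmin(np.abs(wavelengths - 695))
reflectance[:, band_695] -= 0.02 * thc              # absorption grows with THC

corr = np.array([np.corrcoef(thc, reflectance[:, i])[0, 1]
                 for i in range(wavelengths.size)])
idx = np.argmax(np.abs(corr))
print(f"most THC-correlated band: {wavelengths[idx]} nm (r = {corr[idx]:+.2f})")
```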

  6. Cues, quantification, and agreement in language comprehension.

    PubMed

    Tanner, Darren; Bulkes, Nyssa Z

    2015-12-01

    We investigated factors that affect the comprehension of subject-verb agreement in English, using quantification as a window into the relationship between morphosyntactic processes in language production and comprehension. Event-related brain potentials (ERPs) were recorded while participants read sentences with grammatical and ungrammatical verbs, in which the plurality of the subject noun phrase was either doubly marked (via overt plural quantification and morphological marking on the noun) or singly marked (via only plural morphology on the noun). Both acceptability judgments and the ERP data showed heightened sensitivity to agreement violations when quantification provided an additional cue to the grammatical number of the subject noun phrase, over and above plural morphology. This is consistent with models of grammatical comprehension that emphasize feature prediction in tandem with cue-based memory retrieval. Our results additionally contrast with those of prior studies that showed no effects of plural quantification on agreement in language production. These findings therefore highlight some nontrivial divergences in the cues and mechanisms supporting morphosyntactic processing in language production and comprehension. PMID:25987192

  7. Colour thresholding and objective quantification in bioimaging

    NASA Technical Reports Server (NTRS)

    Fermin, C. D.; Gerber, M. A.; Torre-Bueno, J. R.

    1992-01-01

    Computer imaging is rapidly becoming an indispensable tool for the quantification of variables in research and medicine. Whilst its use in medicine has largely been limited to qualitative observations, imaging in applied basic sciences, medical research and biotechnology demands objective quantification of the variables in question. In black-and-white densitometry (256 intensity levels), the separation of subtle differences between closely related hues from stains is sometimes very difficult. True-colour and real-time video microscopy analysis offer choices not previously available with monochrome systems. In this paper we demonstrate the usefulness of colour thresholding, which has so far proven indispensable for proper objective quantification of the products of histochemical reactions and/or subtle differences in tissue and cells. In addition, we provide interested, but untrained, readers with basic information that may assist decisions regarding the most suitable set-up for a project under consideration. Data from projects in progress at Tulane are shown to illustrate the advantage of colour thresholding over monochrome densitometry and for objective quantification of subtle colour differences between experimental and control samples.
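
    A bare-bones sketch of colour thresholding for objective quantification (my own illustration, not the authors' system): segment by hue and saturation on a synthetic image, then quantify the stained area as a pixel fraction.

```python
# Segment a synthetic RGB image by hue and saturation, then quantify the
# "stained" area as a pixel fraction. Real systems add calibration steps.
import colorsys
import numpy as np

rng = np.random.default_rng(1)
img = rng.uniform(0, 1, (64, 64, 3))        # random background
img[20:40, 20:40] = (0.8, 0.1, 0.1)         # a red "stained" region

hsv = np.apply_along_axis(lambda p: colorsys.rgb_to_hsv(*p), 2, img)
hue, sat = hsv[..., 0], hsv[..., 1]

# Threshold: hue near red (wraps around 0) and reasonably saturated.
mask = ((hue < 0.05) | (hue > 0.95)) & (sat > 0.6)
print(f"stained fraction: {mask.mean():.3f}")  # ~400/4096 plus stray pixels
```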

  8. Adaptive Fourier modeling for quantification of tremor.

    PubMed

    Riviere, C N; Reich, S G; Thakor, N V

    1997-06-01

    A new computational method for quantification of tremor, the weighted frequency Fourier linear combiner (WFLC), is presented. This technique rapidly determines the frequency and amplitude of tremor by adjusting its filter weights according to a gradient search method. It provides continual tracking of frequency and amplitude modulations over the course of a test. By quantifying time-varying characteristics, the WFLC assists in correctly interpreting the results of spectral analysis, particularly for recordings exhibiting multiple spectral peaks. It therefore supplements spectral analysis, providing a more accurate picture of tremor than spectral analysis alone. The method has been incorporated into a desktop tremor measurement system to provide clinically useful analysis of tremor recorded during handwriting and drawing using a digitizing tablet. Simulated data clearly demonstrate tracking of variations in frequency and amplitude. Clinical recordings then show specific examples of quantification of time-varying aspects of tremor. PMID:9210577
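
    The WFLC update equations are compact enough to sketch; below is a single-harmonic reconstruction following the commonly published form (one LMS loop adapts the frequency estimate, another the sine/cosine amplitudes), with illustrative gains and a synthetic 5 Hz tremor:

```python
# Single-harmonic WFLC reconstruction (commonly published form): one LMS
# loop adapts the frequency estimate, another the sine/cosine amplitudes.
# Gains and the test signal are illustrative only.
import numpy as np

fs = 100.0                                    # sample rate, Hz
t = np.arange(0.0, 20.0, 1.0 / fs)
signal = 0.5 * np.sin(2 * np.pi * 5.0 * t)    # synthetic 5 Hz tremor

mu0, mu1 = 1e-3, 0.02                         # frequency / amplitude gains
omega = 2 * np.pi * 4.0 / fs                  # initial guess: 4 Hz
phase, w = 0.0, np.zeros(2)

for s in signal:
    phase += omega
    x = np.array([np.sin(phase), np.cos(phase)])
    err = s - w @ x
    omega += 2 * mu0 * err * (w[0] * x[1] - w[1] * x[0])   # frequency LMS
    w += 2 * mu1 * err * x                                 # amplitude LMS

print(f"tracked frequency: {omega * fs / (2 * np.pi):.2f} Hz")
print(f"tracked amplitude: {np.hypot(w[0], w[1]):.2f}")
```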

  9. Uncertainty quantification for porous media flows

    NASA Astrophysics Data System (ADS)

    Christie, Mike; Demyanov, Vasily; Erbas, Demet

    2006-09-01

    Uncertainty quantification is an increasingly important aspect of many areas of computational science, where the challenge is to make reliable predictions about the performance of complex physical systems in the absence of complete or reliable data. Predicting flows of oil and water through oil reservoirs is an example of a complex system where accuracy in prediction is needed primarily for financial reasons. Simulation of fluid flow in oil reservoirs is usually carried out using large commercially written finite difference simulators solving conservation equations describing the multi-phase flow through the porous reservoir rocks. This paper examines a Bayesian Framework for uncertainty quantification in porous media flows that uses a stochastic sampling algorithm to generate models that match observed data. Machine learning algorithms are used to speed up the identification of regions in parameter space where good matches to observed data can be found.
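
    A skeleton of the stochastic-sampling step (random-walk Metropolis over one uncertain parameter, with a cheap stand-in for the expensive reservoir simulator; all values are invented):

```python
# Random-walk Metropolis over an uncertain log-permeability, accepting
# models by misfit to one observed datum. The "simulator" is a cheap
# stand-in for a finite-difference reservoir simulation.
import numpy as np

rng = np.random.default_rng(42)
observed, sigma = 3.0, 0.2               # e.g. breakthrough time, years

def simulator(log_perm):
    return 12.0 * np.exp(-0.5 * log_perm)

def log_like(theta):
    return -0.5 * ((simulator(theta) - observed) / sigma) ** 2

theta, ll = 2.0, log_like(2.0)
samples = []
for _ in range(20_000):
    prop = theta + rng.normal(0.0, 0.1)
    ll_prop = log_like(prop)
    if np.log(rng.uniform()) < ll_prop - ll:   # Metropolis accept test
        theta, ll = prop, ll_prop
    samples.append(theta)

post = np.array(samples[5_000:])               # discard burn-in
print(f"posterior log-perm: {post.mean():.2f} +/- {post.std():.2f}")
```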

  10. Uncertainty quantification of effective nuclear interactions

    NASA Astrophysics Data System (ADS)

    Pérez, R. Navarro; Amaro, J. E.; Arriola, E. Ruiz

    2016-03-01

    We give a brief review of the development of phenomenological NN interactions and the corresponding quantification of statistical uncertainties. We look into the uncertainty of effective interactions broadly used in mean-field calculations, through the Skyrme parameters and effective-field-theory counterterms, by estimating both statistical and systematic uncertainties stemming from the NN interaction. We also comment on the role played by different fitting strategies in the light of recent developments.

  11. Whitepaper on Uncertainty Quantification for MPACT

    SciTech Connect

    Williams, Mark L.

    2015-12-17

    The MPACT code provides the ability to perform high-fidelity deterministic calculations to obtain a wide variety of detailed results for very complex reactor core models. However MPACT currently does not have the capability to propagate the effects of input data uncertainties to provide uncertainties in the calculated results. This white paper discusses a potential method for MPACT uncertainty quantification (UQ) based on stochastic sampling.

  12. Multiplexed quantification for data-independent acquisition.

    PubMed

    Minogue, Catherine E; Hebert, Alexander S; Rensvold, Jarred W; Westphall, Michael S; Pagliarini, David J; Coon, Joshua J

    2015-03-01

    Data-independent acquisition (DIA) strategies provide a sensitive and reproducible alternative to data-dependent acquisition (DDA) methods for large-scale quantitative proteomic analyses. Unfortunately, DIA methods suffer from incompatibility with common multiplexed quantification methods, specifically stable isotope labeling approaches such as isobaric tags and stable isotope labeling of amino acids in cell culture (SILAC). Here we expand the use of neutron-encoded (NeuCode) SILAC to DIA applications (NeuCoDIA), producing a strategy that enables multiplexing within DIA scans without further convoluting the already complex MS(2) spectra. We demonstrate duplex NeuCoDIA analysis of both mixed-ratio (1:1 and 10:1) yeast and mouse embryo myogenesis proteomes. Analysis of the mixed-ratio yeast samples revealed the strong accuracy and precision of our NeuCoDIA method, both of which were comparable to our established MS(1)-based quantification approach. NeuCoDIA also uncovered the dynamic protein changes that occur during myogenic differentiation, demonstrating the feasibility of this methodology for biological applications. We consequently establish DIA quantification of NeuCode SILAC as a useful and practical alternative to DDA-based approaches. PMID:25621425

  13. Fluorescence-linked Antigen Quantification (FLAQ) Assay for Fast Quantification of HIV-1 p24Gag

    PubMed Central

    Gesner, Marianne; Maiti, Mekhala; Grant, Robert; Cavrois, Marielle

    2016-01-01

    The fluorescence-linked antigen quantification (FLAQ) assay allows fast quantification of the HIV-1 p24Gag antigen. Viral supernatants are lysed and incubated with polystyrene microspheres coated with polyclonal antibodies against HIV-1 p24Gag and with detector antibodies conjugated to fluorochromes (Figure 1). After washes, the fluorescence of the microspheres is measured by flow cytometry and reflects the abundance of the antigen in the lysate. The speed, simplicity, and wide dynamic range of the FLAQ assay make it optimal for many applications performed in HIV-1 research laboratories.

  14. Development of a VHH-Based Erythropoietin Quantification Assay.

    PubMed

    Kol, Stefan; Kallehauge, Thomas Beuchert; Adema, Simon; Hermans, Pim

    2015-08-01

    Erythropoietin (EPO) quantification during cell line selection and bioreactor cultivation has traditionally been performed with ELISA or HPLC. As these techniques suffer from several drawbacks, we developed a novel EPO quantification assay. A camelid single-domain antibody fragment directed against human EPO was evaluated as a capturing antibody in a label-free biolayer interferometry-based quantification assay. Human recombinant EPO can be specifically detected in Chinese hamster ovary cell supernatants in a sensitive and pH-dependent manner. This method enables rapid and robust quantification of EPO in a high-throughput setting. PMID:25764454

  15. QconCAT: Internal Standard for Protein Quantification.

    PubMed

    Scott, Kerry Bauer; Turko, Illarion V; Phinney, Karen W

    2016-01-01

    Protein quantification based on stable isotope labeling-mass spectrometry involves adding known quantities of stable isotope-labeled internal standards into biological samples. The internal standards are analogous to analyte molecules and quantification is achieved by comparing signals from isotope-labeled and analyte molecules. This methodology is broadly applicable to proteomics research, biomarker discovery and validation, and clinical studies, which require accurate and precise protein abundance measurements. One such internal standard platform for protein quantification is concatenated peptides (QconCAT). This chapter describes a protocol for the design, expression, characterization, and application of the QconCAT strategy for protein quantification. PMID:26791984

  16. Quantification of Tissue Properties in Small Volumes

    SciTech Connect

    Mourant, J.; et al.

    2000-12-01

    The quantification of tissue properties by optical measurements will facilitate the development of noninvasive methods of cancer diagnosis and detection. Optical measurements are sensitive to tissue structure which is known to change during tumorigenesis. The goals of the work presented in this paper were to verify that the primary scatterers of light in cells are structures much smaller than the nucleus and then to develop an optical technique that can quantify parameters of structures the same size as the scattering features in cells. Polarized, elastic back-scattering was found to be able to quantify changes in scattering properties for turbid media consisting of scatterers of the size found in tissue.

  17. Tutorial examples for uncertainty quantification methods.

    SciTech Connect

    De Bord, Sarah

    2015-08-01

    This report details the work accomplished during my 2015 SULI summer internship at Sandia National Laboratories in Livermore, CA. During this internship, I worked on multiple tasks with the common goal of making uncertainty quantification (UQ) methods more accessible to the general scientific community. As part of my work, I created a comprehensive numerical integration example to incorporate into the user manual of a UQ software package. Further, I developed examples involving heat transfer through a window to incorporate into tutorial lectures that serve as an introduction to UQ methods.
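
    In the spirit of the window heat-transfer example mentioned above, a minimal Monte Carlo uncertainty propagation sketch (my own reconstruction, not the internship code): conduction Q = kAΔT/L with uncertain conductivity and temperature difference.

```python
# Monte Carlo propagation through Q = k*A*dT/L for a single glass pane;
# all distributions and dimensions are invented for illustration.
import numpy as np

rng = np.random.default_rng(7)
n = 100_000
k = rng.normal(1.0, 0.05, n)      # glass conductivity, W/(m K)
dT = rng.normal(20.0, 2.0, n)     # temperature difference across pane, K
A, L = 1.5, 0.006                 # pane area (m^2) and thickness (m)

Q = k * A * dT / L                # conductive heat flow, W
print(f"Q = {Q.mean():.0f} +/- {Q.std():.0f} W")
print("95% interval (W):", np.percentile(Q, [2.5, 97.5]).round(0))
```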

  18. Adjoint-Based Uncertainty Quantification with MCNP

    SciTech Connect

    Seifried, Jeffrey E.

    2011-09-01

    This work serves to quantify the instantaneous uncertainties in neutron transport simulations arising from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying the sensitivities and uncertainties of its important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.
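
    Adjoint-derived sensitivities are typically folded into an uncertainty estimate through the first-order "sandwich rule": for sensitivity vector S and nuclear-data relative covariance C, the relative output variance is S^T C S. A minimal numeric illustration with invented values:

```python
# First-order "sandwich rule": relative variance of a figure of merit is
# S^T C S for sensitivity vector S and relative covariance matrix C.
# All numbers are invented for illustration.
import numpy as np

S = np.array([0.45, -0.12])              # relative sensitivities of k-eff

C = np.array([[4.0e-4, 1.0e-4],          # relative covariance of the two
              [1.0e-4, 9.0e-4]])         # underlying cross sections

rel_var = S @ C @ S
print(f"relative uncertainty in k-eff: {np.sqrt(rel_var):.2%}")
```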

  1. Centerline optimization using vessel quantification model

    NASA Astrophysics Data System (ADS)

    Cai, Wenli; Dachille, Frank; Meissner, Michael

    2005-04-01

    An accurate and reproducible centerline is needed in many vascular applications, such as virtual angioscopy, vessel quantification, and surgery planning. This paper presents a progressive optimization algorithm to refine a centerline after it is extracted. A new centerline model definition is proposed that allows a quantifiable minimum cross-sectional area. A centerline is divided into a number of segments. Each segment corresponds to a local generalized cylinder. A reference frame (cross-section) is set up at the center point of each cylinder. The position and the orientation of the cross-section are optimized within each cylinder by finding the minimum cross-sectional area. All locally optimized center points are approximated by a NURBS curve globally, and the curve is re-sampled to yield the refined set of center points. This refinement iteration, local optimization plus global approximation, converges to the optimal centerline, yielding a smooth and accurate central axis curve. The application discussed in this paper is vessel quantification and virtual angioscopy. However, the algorithm is a general centerline refinement method that can be applied to other applications that need accurate and reproducible centerlines.

  2. Virus detection and quantification using electrical parameters

    PubMed Central

    Ahmad, Mahmoud Al; Mustafa, Farah; Ali, Lizna M.; Rizvi, Tahir A.

    2014-01-01

    Here we identify and quantitate two similar viruses, human and feline immunodeficiency viruses (HIV and FIV), suspended in a liquid medium without labeling, using a semiconductor technique. The virus count was estimated by calculating the impurities inside a defined volume by observing the change in electrical parameters. Empirically, the virus count was similar to the absolute value of the ratio of the change in dopant concentration of the virus suspension relative to the mock suspension, to the change in Debye volume of the virus suspension relative to the mock suspension. The virus type was identified by constructing a concentration-mobility relationship, which is unique for each kind of virus, allowing for a fast (within minutes) and label-free virus quantification and identification. For validation, the HIV and FIV virus preparations were further quantified by a biochemical technique, and the results obtained by both approaches corroborated each other well. We further demonstrate that the electrical technique can be applied to accurately measure and characterize silica nanoparticles that resemble the virus particles in size. Based on these results, we anticipate our present approach to be a starting point towards establishing the foundation for label-free electrical-based identification and quantification of an unlimited number of viruses and other nano-sized particles. PMID:25355078
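
    The stated empirical relation transcribes directly into a formula; the numbers below are placeholders purely to exercise it:

```python
# Placeholder transcription of the stated relation: virus count ~
# |relative change in dopant concentration| / |relative change in Debye
# volume|, each measured against the mock suspension. Inputs are invented.
def virus_count(n_virus, n_mock, vd_virus, vd_mock):
    d_conc = (n_virus - n_mock) / n_mock        # relative dopant change
    d_debye = (vd_virus - vd_mock) / vd_mock    # relative Debye-volume change
    return abs(d_conc / d_debye)

print(f"{virus_count(1.12e15, 1.0e15, 4.999999e-19, 5.0e-19):.2e}")
```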

  3. Quantification of ontogenetic allometry in ammonoids.

    PubMed

    Korn, Dieter

    2012-01-01

    Ammonoids are well-known objects used for studies on ontogeny and phylogeny, but a quantification of ontogenetic change has not yet been carried out. Their planispirally coiled conchs allow for a study of "longitudinal" ontogenetic data, that is, data on ontogenetic trajectories that can be obtained from a single specimen. Therefore, they provide a good model for ontogenetic studies of geometry in other shelled organisms. Using modifications of three cardinal conch dimensions, computer simulations can model artificial conchs. The trajectories of ontogenetic allometry of these simulations can be analyzed in great detail in a theoretical morphospace. A method for the classification of conch ontogeny and quantification of the degree of allometry is proposed. Using high-precision cross-sections, the allometric conch growth of real ammonoids can be documented and compared. The members of the Ammonoidea show a wide variety of allometric growth, ranging from near isometry to monophasic, biphasic, or polyphasic allometry. Selected examples of Palaeozoic and Mesozoic ammonoids are shown with respect to their degree of change during ontogeny of the conch. PMID:23134208

  4. Quantification noise in single cell experiments

    PubMed Central

    Reiter, M.; Kirchner, B.; Müller, H.; Holzhauer, C.; Mann, W.; Pfaffl, M. W.

    2011-01-01

    In quantitative single-cell studies, the critical part is the low amount of nucleic acids present and the resulting experimental variations. In addition, biological data obtained from heterogeneous tissue do not reflect the expression behaviour of every single cell. These variations can derive from natural biological variance or can be introduced externally. Both have negative effects on the quantification result. The aim of this study is to make quantitative single-cell studies more transparent and reliable in order to fulfil the MIQE guidelines at the single-cell level. The technical variability introduced by RT, pre-amplification, evaporation, biological material and qPCR itself was evaluated by using RNA or DNA standards. Secondly, the biological expression variances of GAPDH, TNFα, IL-1β and TLR4 were measured by an mRNA profiling experiment in single lymphocytes. The quantification setup used was sensitive enough to detect single standard copies and transcripts out of one solitary cell. Most variability was introduced by RT, followed by evaporation and pre-amplification. The qPCR analysis and the biological matrix introduced only minor variability. Both studies impressively demonstrate the heterogeneity of expression patterns in individual cells and clearly show today's limitations of quantitative single-cell expression analysis. PMID:21745823

  5. Simple quantification of in planta fungal biomass.

    PubMed

    Ayliffe, Michael; Periyannan, Sambasivam K; Feechan, Angela; Dry, Ian; Schumann, Ulrike; Lagudah, Evans; Pryor, Anthony

    2014-01-01

    An accurate assessment of the disease resistance status of plants to fungal pathogens is an essential requirement for the development of resistant crop plants. Many disease resistance phenotypes are partial rather than obvious immunity and are frequently scored using subjective qualitative estimates of pathogen development or plant disease symptoms. Here we report a method for the accurate comparison of total fungal biomass in plant tissues. This method, called the WAC assay, is based upon the specific binding of the plant lectin wheat germ agglutinin to fungal chitin. The assay is simple, high-throughput, and sensitive enough to discriminate between single Puccinia graminis f.sp tritici infection sites on a wheat leaf segment. It greatly lends itself to replication as large volumes of tissue can be pooled from independent experiments and assayed to provide truly representative quantification, or, alternatively, fungal growth on a single, small leaf segment can be quantified. In addition, as the assay is based upon a microscopic technique, pathogen infection sites can also be examined at high magnification prior to quantification if desired and average infection site areas are determined. Previously, we have demonstrated the application of the WAC assay for quantifying the growth of several different pathogen species in both glasshouse grown material and large-scale field plots. Details of this method are provided within. PMID:24643560

  6. Concurrent quantification of multiple nanoparticle bound states

    PubMed Central

    Rauwerdink, Adam M.; Weaver, John B.

    2011-01-01

    Purpose: The binding of nanoparticles to in vivo targets impacts their use for medical imaging, therapy, and the study of diseases and disease biomarkers. Though an array of techniques can detect binding in vitro, the search for a robust in vivo method continues. The spectral response of magnetic nanoparticles can be influenced by a variety of changes in their physical environment including viscosity and binding. Here, the authors show that nanoparticles in these different environmental states produce spectral responses, which are sufficiently unique to allow for simultaneous quantification of the proportion of nanoparticles within each state. Methods: The authors measured the response to restricted Brownian motion using an array of magnetic nanoparticle designs. With a chosen optimal particle type, the authors prepared particle samples in three distinct environmental states. Various combinations of particles within these three states were measured concurrently and the authors attempted to solve for the quantity of particles within each physical state. Results: The authors found the spectral response of the nanoparticles to be sufficiently unique to allow for accurate quantification of up to three bound states with errors on the order of 1.5%. Furthermore, the authors discuss numerous paths for translating these measurements to in vivo applications. Conclusions: Multiple nanoparticle environmental states can be concurrently quantified using the spectral response of the particles. Such an ability, if translated to the in vivo realm, could provide valuable information about the fate of nanoparticles in vivo or improve the efficacy of nanoparticle based treatments. PMID:21520825
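
    The concurrent quantification step can be read as a linear unmixing problem: the measured harmonic spectrum is treated as a non-negative combination of per-state reference spectra. The sketch below is a simplified stand-in for the authors' procedure (all matrices and values are synthetic), solving for the state proportions with non-negative least squares.

      import numpy as np
      from scipy.optimize import nnls

      # Reference spectral responses (columns), one per environmental state,
      # e.g. free, viscous, and bound particles. Synthetic stand-ins here.
      rng = np.random.default_rng(0)
      A = np.abs(rng.normal(size=(12, 3)))      # 12 harmonic amplitudes x 3 states

      true_fractions = np.array([0.5, 0.3, 0.2])
      measured = A @ true_fractions + rng.normal(scale=1e-3, size=12)

      fractions, residual = nnls(A, measured)   # non-negative least squares
      fractions /= fractions.sum()              # normalise to proportions
      print(fractions)                          # ~ [0.5, 0.3, 0.2]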

  7. Quantification of prebiotics in commercial infant formulas.

    PubMed

    Sabater, Carlos; Prodanov, Marin; Olano, Agustín; Corzo, Nieves; Montilla, Antonia

    2016-03-01

    Since breastfeeding is not always possible, infant formulas (IFs) are supplemented with prebiotic oligosaccharides, such as galactooligosaccharides (GOS) and/or fructooligosaccharides (FOS), to exert effects similar to those of breast milk. Nowadays, a great number of infant formulas enriched with prebiotics are available on the market; however, data about their composition are scarce. In this study, two chromatographic methods (GC-FID and HPLC-RID) were used in combination for the quantification of carbohydrates present in commercial infant formulas. According to the results obtained by GC-FID for products containing prebiotics, the content of FOS, GOS and GOS/FOS was in the ranges of 1.6-5.0, 1.7-3.2, and 0.08-0.25/2.3-3.8 g/100 g of product, respectively. HPLC-RID analysis allowed quantification of maltodextrins with degree of polymerization (DP) up to 19. The methodology proposed here may be used for routine quality control of infant formulas and other food ingredients containing prebiotics. PMID:26471520

  8. Survey and Evaluate Uncertainty Quantification Methodologies

    SciTech Connect

    Lin, Guang; Engel, David W.; Eslinger, Paul W.

    2012-02-01

    The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that will develop and deploy state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models with uncertainty quantification, optimization, risk analysis and decision making capabilities. The CCSI Toolset will incorporate commercial and open-source software currently in use by industry and will also develop new software tools as necessary to fill technology gaps identified during execution of the project. The CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. The goal of CCSI is to deliver a toolset that can simulate the scale-up of a broad set of new carbon capture technologies from laboratory scale to full commercial scale. To provide a framework around which the toolset can be developed and demonstrated, we will focus on three Industrial Challenge Problems (ICPs) related to carbon capture technologies relevant to U.S. pulverized coal (PC) power plants. Post-combustion capture by solid sorbents is the technology focus of the initial ICP (referred to as ICP A). The goal of the uncertainty quantification (UQ) task (Task 6) is to provide a set of capabilities to the user community for the quantification of uncertainties associated with the carbon capture process.

  9. Feature isolation and quantification of evolving datasets

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Identifying and isolating features is an important part of visualization and a crucial step for the analysis and understanding of large time-dependent data sets (either from observation or simulation). In this proposal, we address these concerns, namely the investigation and implementation of basic 2D and 3D feature-based methods to enhance current visualization techniques and provide the building blocks for automatic feature recognition, tracking, and correlation. These methods incorporate ideas from scientific visualization, computer vision, image processing, and mathematical morphology. Our focus is in the area of fluid dynamics, and we show the applicability of these methods to the quantification and tracking of three-dimensional vortex and turbulence bursts.

  10. Quantification of Osteon Morphology Using Geometric Histomorphometrics.

    PubMed

    Dillon, Scott; Cunningham, Craig; Felts, Paul

    2016-03-01

    Many histological methods in forensic anthropology utilize combinations of traditional histomorphometric parameters which may not accurately describe the morphology of microstructural features. Here, we report the novel application of a geometric morphometric method suitable when considering structures without anatomically homologous landmarks for the quantification of complete secondary osteon size and morphology. The method is tested for its suitability in the measurement of intact secondary osteons using osteons digitized from transverse femoral diaphyseal sections prepared from two human individuals. The results of methodological testing demonstrate the efficacy of the technique when applied to intact secondary osteons. In providing accurate characterization of micromorphology within the robust mathematical framework of geometric morphometrics, this method may surpass traditional histomorphometric variables currently employed in forensic research and practice. A preliminary study of the intersectional histomorphometric variation within the femoral diaphysis is made using this geometric histomorphometric method to demonstrate its potential. PMID:26478136

  11. Quantification of diacylglycerol by mass spectrometry.

    PubMed

    vom Dorp, Katharina; Dombrink, Isabel; Dörmann, Peter

    2013-01-01

    Diacylglycerol (DAG) is an important intermediate of lipid metabolism and a component of phospholipase C signal transduction. Quantification of DAG in plant membranes represents a challenging task because of its low abundance. DAG can be measured by direct infusion mass spectrometry (MS) on a quadrupole time-of-flight mass spectrometer after purification from the crude plant lipid extract via solid-phase extraction on silica columns. Different internal standards are employed to compensate for the dependence of the MS and MS/MS signals on the chain length and the presence of double bonds in the acyl moieties. Thus, using a combination of single MS and MS/MS experiments, quantitative results for the different molecular species of DAGs from Arabidopsis can be obtained. PMID:23681522

  12. Uncertainty quantification in DIC with Kriging regression

    NASA Astrophysics Data System (ADS)

    Wang, Dezhi; DiazDelaO, F. A.; Wang, Weizhuo; Lin, Xiaoshan; Patterson, Eann A.; Mottershead, John E.

    2016-03-01

    A Kriging regression model is developed as a post-processing technique for the treatment of measurement uncertainty in classical subset-based Digital Image Correlation (DIC). Regression is achieved by regularising the sample-point correlation matrix using a local, subset-based, assessment of the measurement error with assumed statistical normality and based on the Sum of Squared Differences (SSD) criterion. This leads to a Kriging-regression model in the form of a Gaussian process representing uncertainty on the Kriging estimate of the measured displacement field. The method is demonstrated using numerical and experimental examples. Kriging estimates of displacement fields are shown to be in excellent agreement with 'true' values for the numerical cases and in the experimental example uncertainty quantification is carried out using the Gaussian random process that forms part of the Kriging model. The root mean square error (RMSE) on the estimated displacements is produced and standard deviations on local strain estimates are determined.
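
    As a rough illustration of the idea (not the authors' exact regularisation), the sketch below fits a Gaussian-process (Kriging) model to a noisy synthetic displacement profile. A white-noise kernel stands in for the local, subset-based measurement-error assessment, and the posterior returns both the smoothed field and its pointwise uncertainty.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, WhiteKernel

      # Noisy 1-D "displacement field" from subset-based DIC (synthetic stand-in).
      x = np.linspace(0, 10, 40)[:, None]
      rng = np.random.default_rng(1)
      u_meas = 0.5 * np.sin(x).ravel() + rng.normal(scale=0.05, size=40)

      # WhiteKernel plays the role of the assumed-normal measurement error.
      kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=0.05**2)
      gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(x, u_meas)

      u_hat, u_std = gp.predict(x, return_std=True)   # estimate + uncertainty
      print(f"mean posterior std: {u_std.mean():.4f}")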

  13. Carotenoid Extraction and Quantification from Capsicum annuum

    PubMed Central

    Richins, Richard D.; Kilcrease, James; Rodgriguez-Uribe, Laura; O'Connell, Mary A.

    2016-01-01

    Carotenoids are ubiquitous pigments that play key roles in photosynthesis and also accumulate to high levels in fruit and flowers. Specific carotenoids play essential roles in human health as these compounds are precursors for Vitamin A; other specific carotenoids are important sources of macular pigments and all carotenoids are important anti-oxidants. Accurate determination of the composition and concentration of this complex set of natural products is therefore important in many different scientific areas. One of the richest sources of these compounds is the fruit of Capsicum; these red, yellow and orange fruit accumulate multiple carotenes and xanthophylls. This report describes the detailed method for the extraction and quantification of specific carotenes and xanthophylls.

  14. Quantification of adipose tissue insulin sensitivity.

    PubMed

    Søndergaard, Esben; Jensen, Michael D

    2016-06-01

    In metabolically healthy humans, adipose tissue is exquisitely sensitive to insulin. Similar to muscle and liver, adipose tissue lipolysis is insulin resistant in adults with central obesity and type 2 diabetes. Perhaps uniquely, however, insulin resistance in adipose tissue may directly contribute to the development of insulin resistance in muscle and liver because of the increased delivery of free fatty acids to those tissues. It has been hypothesized that adipose tissue insulin resistance may precede other metabolic defects in obesity and type 2 diabetes. Therefore, precise and reproducible quantification of adipose tissue insulin sensitivity, in vivo, in humans, is an important measure. Unfortunately, no consensus exists on how to determine adipose tissue insulin sensitivity. We review the methods available to quantify adipose tissue insulin sensitivity and discuss their strengths and weaknesses. PMID:27073214

  15. Quantification of Glutathione in Caenorhabditis elegans

    PubMed Central

    Caito, Samuel W.; Aschner, Michael

    2015-01-01

    Glutathione (GSH) is the most abundant intracellular thiol, with diverse functions in redox signaling, xenobiotic detoxification, and apoptosis. The quantification of GSH is an important measure of redox capacity and oxidative stress. This protocol quantifies total GSH from Caenorhabditis elegans, an emerging model organism for toxicology studies. GSH is measured using the 5,5′-dithiobis-(2-nitrobenzoic acid) (DTNB) cycling method, originally created for cell and tissue samples but optimized here for whole worm extracts. DTNB reacts with GSH to form a 5′-thio-2-nitrobenzoic acid (TNB) chromophore with maximum absorbance at 412 nm. This method is both rapid and sensitive, making it ideal for studies involving a large number of transgenic nematode strains. PMID:26309452
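
    In practice the cycling assay reduces to a linear standard curve: the rate of A412 increase is proportional to total GSH. A minimal sketch of the quantification step, with hypothetical plate-reader numbers:

      import numpy as np

      # Rates of A412 increase (dA/min) for GSH standards (values hypothetical).
      std_conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0])      # microM GSH
      std_rate = np.array([0.002, 0.011, 0.021, 0.040, 0.079])

      slope, intercept = np.polyfit(std_conc, std_rate, 1)  # linear standard curve

      def gsh_concentration(rate, dilution_factor=1.0):
          """Interpolate total GSH (microM) from a sample's A412 rate."""
          return dilution_factor * (rate - intercept) / slope

      print(f"{gsh_concentration(0.030, dilution_factor=2.0):.2f} microM")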

  16. Recurrence quantification analysis of global stock markets

    NASA Astrophysics Data System (ADS)

    Bastos, João A.; Caiado, Jorge

    2011-04-01

    This study investigates the presence of deterministic dependencies in international stock markets using recurrence plots and recurrence quantification analysis (RQA). The results are based on a large set of free float-adjusted market capitalization stock indices, covering a period of 15 years. The statistical tests suggest that the dynamics of stock prices in emerging markets is characterized by higher values of RQA measures when compared to their developed counterparts. The behavior of stock markets during critical financial events, such as the burst of the technology bubble, the Asian currency crisis, and the recent subprime mortgage crisis, is analyzed by performing RQA in sliding windows. It is shown that during these events stock markets exhibit a distinctive behavior that is characterized by temporary decreases in the fraction of recurrence points contained in diagonal and vertical structures.
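
    For readers unfamiliar with RQA, the sketch below computes two of its standard measures, recurrence rate and determinism, for an embedded scalar series. The embedding settings, threshold heuristic, and synthetic data are illustrative only, not the study's configuration.

      import numpy as np

      def rqa_measures(x, dim=3, tau=1, eps=None, lmin=2):
          """Recurrence rate and determinism for a scalar series x."""
          n = len(x) - (dim - 1) * tau
          emb = np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])
          dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
          if eps is None:
              eps = 0.1 * dist.max()              # common heuristic threshold
          R = (dist <= eps).astype(int)

          rr = R.sum() / R.size                   # recurrence rate
          diag_pts = 0                            # points on diagonals >= lmin
          for k in range(1, n):                   # off-main diagonals (upper half)
              d = np.diagonal(R, offset=k)
              runs = np.split(d, np.where(d == 0)[0])
              diag_pts += 2 * sum(len(r[r == 1]) for r in runs if len(r[r == 1]) >= lmin)
          det = diag_pts / max(R.sum() - np.trace(R), 1)   # determinism
          return rr, det

      rng = np.random.default_rng(2)
      returns = rng.normal(size=400)              # stand-in for index log-returns
      print(rqa_measures(returns))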

  17. Multispectral image analysis for algal biomass quantification.

    PubMed

    Murphy, Thomas E; Macon, Keith; Berberoglu, Halil

    2013-01-01

    This article reports a novel multispectral image processing technique for rapid, noninvasive quantification of biomass concentration in attached and suspended algae cultures. Monitoring the biomass concentration is critical for efficient production of biofuel feedstocks, food supplements, and bioactive chemicals. In particular, noninvasive and rapid detection techniques can significantly aid in providing delay-free process control feedback in large-scale cultivation platforms. In this technique, three-band spectral images of Anabaena variabilis cultures were acquired and separated into their red, green, and blue components. A correlation between the magnitude of the green component and the areal biomass concentration was generated. The correlation predicted the biomass concentrations of independently prepared attached and suspended cultures with errors of 7 and 15%, respectively, and the effects of varying lighting conditions and background color were investigated. This method can provide the necessary feedback for dilution and harvesting strategies to maximize photosynthetic conversion efficiency in large-scale operation. PMID:23554374
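
    The core of the technique is a calibration between mean green-channel intensity and known areal biomass. A minimal sketch, with synthetic image arrays standing in for the acquired three-band data and hypothetical biomass values:

      import numpy as np

      def green_intensity(rgb):
          """Mean green-channel intensity of an RGB image array (H, W, 3)."""
          return np.asarray(rgb, dtype=float)[:, :, 1].mean()

      # Synthetic calibration images: green level rises with areal biomass.
      rng = np.random.default_rng(3)
      biomass = np.array([2.0, 5.0, 10.0, 20.0, 40.0])          # g m^-2 (hypothetical)
      images = [np.clip(rng.normal([40, 60 + 3 * b, 40], 5, (64, 64, 3)), 0, 255)
                for b in biomass]
      g = np.array([green_intensity(im) for im in images])

      slope, intercept = np.polyfit(g, biomass, 1)               # calibration line
      test = np.clip(rng.normal([40, 90, 40], 5, (64, 64, 3)), 0, 255)
      print(slope * green_intensity(test) + intercept)           # predicted biomass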

  18. Quantification of variability in trichome patterns

    PubMed Central

    Greese, Bettina; Hülskamp, Martin; Fleck, Christian

    2014-01-01

    While pattern formation is studied in various areas of biology, little is known about the noise leading to variations between individual realizations of a pattern. One prominent example of de novo pattern formation in plants is the patterning of trichomes on Arabidopsis leaves, which involves genetic regulation and cell-to-cell communication. These processes are potentially variable due to, e.g., the abundance of cell components or environmental conditions. To advance the understanding of the regulatory processes underlying pattern formation, it is crucial to quantitatively analyze the variability in naturally occurring patterns. Here, we review recent approaches toward characterizing noise in trichome initiation. We present methods for the quantification of spatial patterns, which are the basis for data-driven mathematical modeling and enable the analysis of noise from different sources. Besides the insight gained on trichome formation, the examination of observed trichome patterns also shows that highly regulated biological processes can be substantially affected by variability. PMID:25431575

  19. Quantification of heterogeneity observed in medical images

    PubMed Central

    2013-01-01

    Background There has been much recent interest in the quantification of visually evident heterogeneity within functional grayscale medical images, such as those obtained via magnetic resonance or positron emission tomography. In the case of images of cancerous tumors, variations in grayscale intensity imply variations in crucial tumor biology. Despite these considerable clinical implications, there is as yet no standardized method for measuring the heterogeneity observed via these imaging modalities. Methods In this work, we motivate and derive a statistical measure of image heterogeneity. This statistic measures the distance-dependent average deviation from the smoothest intensity gradation feasible. We show how this statistic may be used to automatically rank images of in vivo human tumors in order of increasing heterogeneity. We test this method against the current practice of ranking images via expert visual inspection. Results We find that this statistic provides a means of heterogeneity quantification beyond that given by other statistics traditionally used for the same purpose. We demonstrate the effect of tumor shape upon our ranking method and find the method applicable to a wide variety of clinically relevant tumor images. We find that the automated heterogeneity rankings agree very closely with those performed visually by experts. Conclusions These results indicate that our automated method may be used reliably to rank, in order of increasing heterogeneity, tumor images whether or not object shape is considered to contribute to that heterogeneity. Automated heterogeneity ranking yields objective results which are more consistent than visual rankings. Reducing variability in image interpretation will enable more researchers to better study potential clinical implications of observed tumor heterogeneity. PMID:23453000

  20. Uncertainty Quantification of Equilibrium Climate Sensitivity

    NASA Astrophysics Data System (ADS)

    Lucas, D. D.; Brandon, S. T.; Covey, C. C.; Domyancic, D. M.; Johannesson, G.; Klein, R.; Tannahill, J.; Zhang, Y.

    2011-12-01

    Significant uncertainties exist in the temperature response of the climate system to changes in the levels of atmospheric carbon dioxide. We report progress to quantify the uncertainties of equilibrium climate sensitivity using perturbed parameter ensembles of the Community Earth System Model (CESM). Through a strategic initiative at the Lawrence Livermore National Laboratory, we have been developing uncertainty quantification (UQ) methods and incorporating them into a software framework called the UQ Pipeline. We have applied this framework to generate a large number of ensemble simulations using Latin Hypercube and other schemes to sample up to three dozen uncertain parameters in the atmospheric (CAM) and sea ice (CICE) model components of CESM. The parameters sampled are related to many highly uncertain processes, including deep and shallow convection, boundary layer turbulence, cloud optical and microphysical properties, and sea ice albedo. An extensive ensemble database comprised of more than 46,000 simulated climate-model-years of recent climate conditions has been assembled. This database is being used to train surrogate models of CESM responses and to perform statistical calibrations of the CAM and CICE models given observational data constraints. The calibrated models serve as a basis for propagating uncertainties forward through climate change simulations using a slab ocean model configuration of CESM. This procedure is being used to quantify the probability density function of equilibrium climate sensitivity accounting for uncertainties in climate model processes. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and was funded by the Uncertainty Quantification Strategic Initiative Laboratory Directed Research and Development Project at LLNL under project tracking code 10-SI-013. (LLNL-ABS-491765)
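
    Latin hypercube sampling of uncertain parameters, as used to build such perturbed-parameter ensembles, can be sketched in a few lines with SciPy's quasi-Monte Carlo module. The parameter names and ranges below are hypothetical stand-ins, not the actual CAM/CICE values.

      from scipy.stats import qmc

      # Three illustrative uncertain parameters with plausible ranges
      # (names and bounds are hypothetical, for illustration only).
      names = ["deep_convection_tau", "cloud_droplet_radius", "sea_ice_albedo"]
      lower = [1800.0, 8.0e-6, 0.60]
      upper = [28800.0, 14.0e-6, 0.85]

      sampler = qmc.LatinHypercube(d=3, seed=42)
      unit = sampler.random(n=100)                 # 100 members in [0, 1)^3
      ensemble = qmc.scale(unit, lower, upper)     # map to physical ranges

      # Each row is one perturbed-parameter model configuration.
      print(ensemble[:3])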

  1. Kinetic quantification of plyometric exercise intensity.

    PubMed

    Ebben, William P; Fauth, McKenzie L; Garceau, Luke R; Petushek, Erich J

    2011-12-01

    Ebben, WP, Fauth, ML, Garceau, LR, and Petushek, EJ. Kinetic quantification of plyometric exercise intensity. J Strength Cond Res 25(12): 3288-3298, 2011. Quantification of plyometric exercise intensity is necessary to understand the characteristics of these exercises and the proper progression of this mode of exercise. The purpose of this study was to assess the kinetic characteristics of a variety of plyometric exercises. This study also sought to assess gender differences in these variables. Twenty-six men and 23 women with previous experience in performing plyometric training served as subjects. The subjects performed a variety of plyometric exercises including line hops, 15.24-cm cone hops, squat jumps, tuck jumps, countermovement jumps (CMJs), loaded CMJs equal to 30% of 1 repetition maximum squat, depth jumps normalized to the subject's jump height (JH), and single leg jumps. All plyometric exercises were assessed with a force platform. Outcome variables associated with the takeoff, airborne, and landing phase of each plyometric exercise were evaluated. These variables included the peak vertical ground reaction force (GRF) during takeoff, the time to takeoff, flight time, JH, peak power, landing rate of force development, and peak vertical GRF during landing. A 2-way mixed analysis of variance with repeated measures for plyometric exercise type demonstrated main effects for exercise type for all outcome variables (p ≤ 0.05) and an interaction between gender and exercise type for peak vertical GRF during takeoff (p ≤ 0.05). Bonferroni-adjusted pairwise comparisons identified a number of differences between the plyometric exercises for the outcome variables assessed (p ≤ 0.05). These findings can be used to guide the progression of plyometric training by incorporating exercises of increasing intensity over the course of a program. PMID:22080319
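
    Several of these outcome variables follow from simple projectile mechanics. For example, jump height is commonly estimated from the flight time measured on the force platform (a standard force-platform relation, assuming identical takeoff and landing postures; the abstract does not state which estimator was used):

      g = 9.81  # m/s^2

      def jump_height_from_flight_time(t_flight):
          """Projectile-motion estimate: JH = g * t^2 / 8."""
          return g * t_flight ** 2 / 8.0

      print(f"{jump_height_from_flight_time(0.50):.3f} m")  # ~0.307 m for 0.5 s flight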

  2. Quantification of uncertainties for application in detonation simulation

    NASA Astrophysics Data System (ADS)

    Zheng, Miao; Ma, Zhibo

    2016-06-01

    Numerical simulation has become an important means of designing detonation systems, and quantification of its uncertainty is necessary for reliability certification. In quantifying the uncertainty, the most important steps are to analyze how the uncertainties arise and develop, and how the simulations progress from benchmark models to new models. Based on the practical needs of engineering and the technology of verification & validation, a framework of QU (quantification of uncertainty) is put forward for the case in which simulation is used on a detonation system for scientific prediction. An example is offered to describe the general idea of quantifying simulation uncertainties.

  3. Extended Forward Sensitivity Analysis for Uncertainty Quantification

    SciTech Connect

    Haihua Zhao; Vincent A. Mousseau

    2008-09-01

    This report presents the forward sensitivity analysis method as a means for quantification of uncertainty in system analysis. The traditional approach to uncertainty quantification is based on a “black box” approach. The simulation tool is treated as an unknown signal generator: a distribution of inputs according to assumed probability density functions is sent in, and the distribution of the outputs is measured and correlated back to the original input distribution. This approach requires a large number of simulation runs and therefore has a high computational cost. Contrary to the “black box” method, a more efficient sensitivity approach can take advantage of intimate knowledge of the simulation code. In this approach, equations for the propagation of uncertainty are constructed and the sensitivities are solved for as variables in the same simulation. This “glass box” method can generate sensitivity information similar to that of the “black box” approach with only a couple of runs to cover a large uncertainty region. Because only a small number of runs is required, those runs can be done with high accuracy in space and time, ensuring that the uncertainty of the physical model is being measured and not simply the numerical error caused by coarse discretization. In the forward sensitivity method, the model is differentiated with respect to each parameter to yield an additional system of the same size as the original one, the solution of which is the solution sensitivity. The sensitivity of any output variable can then be directly obtained from these sensitivities by applying the chain rule of differentiation. We extend the forward sensitivity method to include time and spatial steps as special parameters so that the numerical errors can be quantified against other physical parameters. This extension makes the forward sensitivity method a much more powerful tool for uncertainty analysis, since the relative sensitivities of the time and space steps can be compared directly with those of the physical parameters.
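
    A minimal sketch of the forward ("glass box") method for a one-parameter ODE: differentiating dy/dt = -k*y with respect to k yields an augmented system whose extra component is the solution sensitivity s = dy/dk, which then propagates parameter uncertainty to the output at first order. The model and numbers are illustrative, not from the report.

      import numpy as np
      from scipy.integrate import solve_ivp

      k = 0.7  # uncertain decay parameter

      def augmented(t, ys):
          """Original model plus its forward sensitivity s = dy/dk.
          dy/dt = -k*y ;  differentiating w.r.t. k:  ds/dt = -y - k*s."""
          y, s = ys
          return [-k * y, -y - k * s]

      sol = solve_ivp(augmented, [0.0, 5.0], [1.0, 0.0])
      y_end, s_end = sol.y[:, -1]

      # First-order uncertainty propagation: sigma_y ~ |dy/dk| * sigma_k
      sigma_k = 0.05
      print(f"y(5) = {y_end:.4f}, dy/dk = {s_end:.4f}, "
            f"sigma_y ≈ {abs(s_end) * sigma_k:.4f}")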

  4. Recent application of quantification II in Japanese medical research.

    PubMed Central

    Suzuki, T; Kudo, A

    1979-01-01

    Hayashi's Quantification II is a method of multivariate discriminant analysis for handling attribute data as predictor variables. It is very useful in the medical research field for estimation, diagnosis, prognosis, evaluation of epidemiological factors, and other problems based on a multiplicity of attribute data. In Japan, this method is so well known that most computer program packages include the Hayashi Quantification, but the method still seems to be unfamiliar to researchers outside Japan. In view of this situation, we introduce 19 selected articles illustrating recent applications of Quantification II in Japanese medical research. In reviewing these papers, special attention is paid to clarifying how well the findings provided by the method served the researchers. At the same time, some recommendations are made about terminology and program packages. A brief discussion of the background of the quantification methods is also given, with special reference to the Behaviormetric Society of Japan. PMID:540587

  5. Quantification of ecotoxicological tests based on bioluminescence using Polaroid film.

    PubMed

    Tamminen, Manu V; Virta, Marko P J

    2007-01-01

    Assays based on the measurement of bacterial luminescence are widely used in ecotoxicology. Bacterial strains responding either to general toxicity or specific pollutants are rapid, cost-effective and easy to use. However, quantification of the signal requires relatively expensive instrumentation. We show here that the detection of luminescence of BioTox, a Vibrio fischeri-based toxicity test, and of a specific recombinant bacterial strain for arsenic determination, is possible using common Polaroid film. The exposed films can be used for visual or computer-assisted quantification of the signal. Qualitative visual comparison to standards can be used in the rapid and relatively accurate estimation of toxicity or pollutant concentration. The computer-assisted method significantly improves the accuracy and quantification of the results. The results obtained by computer-assisted quantification were in good agreement with the values obtained with a luminometer. PMID:16949132

  6. Software-assisted serum metabolite quantification using NMR.

    PubMed

    Jung, Young-Sang; Hyeon, Jin-Seong; Hwang, Geum-Sook

    2016-08-31

    The goal of metabolomics is to analyze a whole metabolome under a given set of conditions, so accurate and reliable quantitation of metabolites is crucial. Absolute concentration is more valuable than relative concentration; however, the most commonly used methods in NMR-based serum metabolic profiling, bin-based and full-data-point peak quantification, provide relative concentration levels of metabolites and are not reliable when metabolite peaks overlap in a spectrum. In this study, we present the software-assisted serum metabolite quantification (SASMeQ) method, which allows us to identify and quantify metabolites in NMR spectra using Chenomx software. This software uses the ERETIC2 utility from TopSpin to add a digitally synthesized peak to a spectrum. The SASMeQ method will advance NMR-based serum metabolic profiling by providing an accurate and reliable method for absolute quantification that is superior to bin-based quantification. PMID:27506360

  7. Experimental quantification of the tactile spatial responsivity of human cornea.

    PubMed

    Beiderman, Yevgeny; Belkin, Michael; Rotenstreich, Ygal; Zalevsky, Zeev

    2015-01-01

    We present the first experimental quantification of the tactile spatial responsivity of the cornea and we teach a subject to recognize spatial tactile shapes that are stimulated on their cornea. PMID:26158088

  8. Neutron-encoded mass signatures for multiplexed proteome quantification.

    PubMed

    Hebert, Alexander S; Merrill, Anna E; Bailey, Derek J; Still, Amelia J; Westphall, Michael S; Strieter, Eric R; Pagliarini, David J; Coon, Joshua J

    2013-04-01

    We describe a protein quantification method called neutron encoding that exploits the subtle mass differences caused by nuclear binding energy variation in stable isotopes. These mass differences are synthetically encoded into amino acids and incorporated into yeast and mouse proteins via metabolic labeling. Mass spectrometry analysis with high mass resolution (>200,000) reveals the isotopologue-embedded peptide signals, permitting quantification. Neutron encoding will enable highly multiplexed proteome analysis with excellent dynamic range and accuracy. PMID:23435260

  9. Quantification of extracellular UDP-galactose

    PubMed Central

    Lazarowski, Eduardo R.

    2009-01-01

    The human P2Y14 receptor is potently activated by UDP-glucose (UDP-Glc), UDP-galactose (UDP-Gal), UDP-N-acetylglucosamine (UDP-GlcNAc), and UDP-glucuronic acid. Recently, cellular release of UDP-Glc and UDP-GlcNAc has been reported, but whether additional UDP-sugars are endogenous agonists for the P2Y14 receptor remains poorly defined. In the present study, we describe an assay for the quantification of UDP-Gal with sub-nanomolar sensitivity. This assay is based on the enzymatic conversion of UDP-Gal to UDP, using β-1,4-galactosyltransferase. UDP is subsequently phosphorylated by nucleoside diphosphokinase in the presence of [γ-32P]ATP, and the formation of [γ-32P]UTP is monitored by high performance liquid chromatography. The overall conversion of UDP-Gal to [γ-32P]UTP was linear between 0.5 and 30 nM UDP-Gal. Extracellular UDP-Gal was detected on resting cultures of various cell types, and increased release of UDP-Gal was observed in 1321N1 human astrocytoma cells stimulated with the protease-activated receptor agonist thrombin. The occurrence of regulated release of UDP-Gal suggests that, in addition to its role in glycosylation reactions, UDP-Gal is an important extracellular signaling molecule. PMID:19699703

  10. Classification and quantification of leaf curvature

    PubMed Central

    Liu, Zhongyuan; Jia, Liguo; Mao, Yanfei; He, Yuke

    2010-01-01

    Various mutants of Arabidopsis thaliana deficient in polarity, cell division, and auxin response are characterized by certain types of leaf curvature. However, comparison of curvature for clarification of gene function can be difficult without a quantitative measurement of curvature. Here, a novel method for the classification and quantification of leaf curvature is reported. Twenty-two mutant alleles from Arabidopsis mutants and transgenic lines deficient in leaf flatness were selected. The mutants were classified according to the direction, axis, position, and extent of leaf curvature. Based on a global measure of whole leaves and a local measure of four regions in the leaves, a curvature index (CI) was proposed to quantify the leaf curvature. The CI values accounted for the direction, axis, position, and extent of leaf curvature in all of the Arabidopsis mutants grown in growth chambers. Comparison of CI values between mutants reveals the spatial and temporal variations of leaf curvature, indicating the strength of the mutant alleles and the activities of the corresponding genes. Using the curvature indices, the extent of curvature in a complicated genetic background becomes quantitative and comparable, thus providing a useful tool for defining the genetic components of leaf development and for breeding new varieties with leaf curvature desirable for the efficient capture of sunlight for photosynthesis and high yields. PMID:20400533

  11. Quantification of the vocal folds’ dynamic displacements

    NASA Astrophysics Data System (ADS)

    del Socorro Hernández-Montes, María; Muñoz, Silvino; De La Torre, Manuel; Flores, Mauricio; Pérez, Carlos; Mendoza-Santoyo, Fernando

    2016-05-01

    Fast dynamic data acquisition techniques are required to investigate the motional behavior of the vocal folds (VFs) when they are subjected to a steady air-flow through the trachea. High-speed digital holographic interferometry (DHI) is a non-invasive full-field-of-view technique that has proved its usefulness to study rapid and non-repetitive object movements. Hence it is an ideal technique used here to measure VF displacements and vibration patterns at 2000 fps. Analyses from a set of 200 displacement images showed that VFs’ vibration cycles are established along their width (y) and length (x). Furthermore, the maximum deformation for the right and left VFs’ area may be quantified from these images, which in itself represents an important result in the characterization of this structure. At a controlled air pressure, VF displacements fall within the range ~100-1740 nm, with a calculated precision and accuracy that yields a variation coefficient of 1.91%. High-speed acquisition of full-field images of VFs and their displacement quantification are on their own significant data in the study of their functional and physiological behavior since voice quality and production depend on how they vibrate, i.e. their displacement amplitude and frequency. Additionally, the use of high speed DHI avoids prolonged examinations and represents a significant scientific and technological alternative contribution in advancing the knowledge and working mechanisms of these tissues.

  12. Uncertainty quantification for systems of conservation laws

    SciTech Connect

    Poette, Gael; Despres, Bruno; Lucor, Didier

    2009-04-20

    Uncertainty quantification through stochastic spectral methods has recently been applied to several kinds of non-linear stochastic PDEs. In this paper, we introduce a formalism based on kinetic theory to tackle uncertain hyperbolic systems of conservation laws with Polynomial Chaos (PC) methods. The idea is to introduce a new variable, the entropic variable, in bijection with our vector of unknowns, which we expand on the polynomial basis: by performing a Galerkin projection, we obtain a deterministic system of conservation laws. We state several properties of this deterministic system in the case of a general uncertain system of conservation laws. We then apply the method to the case of the inviscid Burgers' equation with random initial conditions and we present some preliminary results for the Euler system. We systematically compare results from our new approach to results from the stochastic Galerkin method. In the vicinity of discontinuities, the new method bounds the oscillations due to the Gibbs phenomenon to a certain range through the entropy of the system, without the use of any adaptive random space discretizations. It is found to be more precise than the stochastic Galerkin method for smooth cases and, above all, for discontinuous cases.

  13. Quantification of rigidity in Parkinson's disease.

    PubMed

    Sepehri, Behrooz; Esteki, Ali; Ebrahimi-Takamjani, Esmaeal; Shahidi, Golam-Ali; Khamseh, Fatemeh; Moinodin, Marzieh

    2007-12-01

    In this paper, a new method for quantification of rigidity in the elbow joint of Parkinsonian patients is introduced. One of the best-known syndromes in Parkinson's disease (PD) is increased passive stiffness in muscles, which leads to rigidity in joints. Clinical evaluation of stiffness in the wrist and/or elbow, commonly used by clinicians, is based on the Unified Parkinson's Disease Rating Scale (UPDRS). The subjective nature of this method may influence the accuracy and precision of evaluations. Hence, introducing an objective standard method based on quantitative measurements may be helpful. A test rig was designed and fabricated to measure range of motion and the viscous and elastic components of passive stiffness in the elbow joint. Measurements were done for 41 patients and 11 controls. Measures were extracted using MATLAB R14 software and statistical analyses were done with SPSS 13. The relation between each computed measure and the level of illness was analyzed. Results showed a better correlation between the viscous component of stiffness and the UPDRS score than for the elastic component. The results of this research may help to introduce a standard objective method for the evaluation of PD. PMID:17909970
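
    The viscous and elastic components can be separated by a least-squares fit of measured passive torque against joint angle and angular velocity. The sketch below uses synthetic oscillation data (not the authors' rig measurements) to recover the two coefficients from a model torque = k*theta + c*theta_dot.

      import numpy as np

      # Synthetic passive elbow oscillation: angle (rad) and measured torque (N*m).
      t = np.linspace(0, 10, 500)
      theta = 0.6 * np.sin(2 * np.pi * 0.5 * t)
      theta_dot = np.gradient(theta, t)

      k_true, c_true = 4.0, 1.2                  # elastic and viscous components
      rng = np.random.default_rng(4)
      torque = k_true * theta + c_true * theta_dot + rng.normal(scale=0.1, size=t.size)

      # Least-squares fit of torque = k*theta + c*theta_dot
      A = np.column_stack([theta, theta_dot])
      (k_est, c_est), *_ = np.linalg.lstsq(A, torque, rcond=None)
      print(f"elastic k ≈ {k_est:.2f} N*m/rad, viscous c ≈ {c_est:.2f} N*m*s/rad")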

  14. Quantification of moving target cyber defenses

    NASA Astrophysics Data System (ADS)

    Farris, Katheryn A.; Cybenko, George

    2015-05-01

    Current network and information systems are static, making it simple for attackers to maintain an advantage. Adaptive defenses, such as Moving Target Defenses (MTD), have been developed as potential "game-changers" in an effort to increase the attacker's workload. With many new methods being developed, it is difficult to accurately quantify and compare their overall costs and effectiveness. This paper compares the tradeoffs between current approaches to the quantification of MTDs. We present results from an expert opinion survey on quantifying the overall effectiveness and the upfront and operating costs of a select set of MTD techniques. We find that gathering informed scientific opinions can be advantageous for evaluating such new technologies, as it offers a more comprehensive assessment. We end by presenting a coarse ordering of a set of MTD techniques from most to least dominant. We found that seven out of 23 methods rank as the more dominant techniques, five of which involve either address space layout randomization or instruction set randomization. The remaining two techniques are applicable to software and computer platforms. Among the techniques that performed the worst are those primarily aimed at network randomization.

  15. Benchmarking RNA-Seq quantification tools

    PubMed Central

    Chandramohan, R.; Wu, Po-Yen; Phan, J.H.; Wang, M.D.

    2016-01-01

    RNA-Seq, a deep sequencing technique, promises to be a potential successor to microarrays for studying the transcriptome. One of many aspects of transcriptomics that are of interest to researchers is gene expression estimation. With rapid development in RNA-Seq, there are numerous tools available to estimate gene expression, each producing different results. However, we do not know which of these tools produces the most accurate gene expression estimates. In this study we have addressed this issue using Cufflinks, IsoEM, HTSeq, and RSEM to quantify RNA-Seq expression profiles. Comparing the results of these quantification tools, we observe that RNA-Seq relative expression estimates correlate with RT-qPCR measurements in the range of 0.85 to 0.89, with HTSeq exhibiting the highest correlation. But in terms of root-mean-square deviation of RNA-Seq relative expression estimates from RT-qPCR measurements, we find HTSeq to produce the greatest deviation. Therefore, we conclude that, though Cufflinks, RSEM, and IsoEM might not correlate as well as HTSeq with RT-qPCR measurements, they may produce expression values with higher accuracy. PMID:24109770
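
    The two evaluation metrics used in this comparison are easy to reproduce, and the sketch below (with synthetic expression values, not the study's data) shows why a tool can win on one and lose on the other: correlation is insensitive to systematic bias, while root-mean-square deviation is not.

      import numpy as np
      from scipy.stats import pearsonr

      rng = np.random.default_rng(6)
      rtqpcr = rng.normal(size=200)                      # RT-qPCR "truth" (log scale)
      rnaseq = rtqpcr + rng.normal(scale=0.35, size=200) # one tool's estimates

      r, _ = pearsonr(rnaseq, rtqpcr)
      rmsd = np.sqrt(np.mean((rnaseq - rtqpcr) ** 2))
      print(f"correlation = {r:.2f}, RMSD = {rmsd:.2f}")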

  16. Comparison of analysis methods for airway quantification

    NASA Astrophysics Data System (ADS)

    Odry, Benjamin L.; Kiraly, Atilla P.; Novak, Carol L.; Naidich, David P.

    2012-03-01

    Diseased airways have been known for several years as a possible contributing factor to airflow limitation in Chronic Obstructive Pulmonary Disease (COPD). Quantification of disease severity through the evaluation of airway dimensions - wall thickness and lumen diameter - has gained increased attention, thanks to the availability of multi-slice computed tomography (CT). Novel approaches have focused on automated methods of measurement as a faster and more objective means than the visual assessment routinely employed in the clinic. Since the Full-Width Half-Maximum (FWHM) method of airway measurement was introduced two decades ago [1], several new techniques for quantifying airways have been detailed in the literature, but no approach has truly become a standard for such analysis. Our own research group has presented two alternative approaches for determining airway dimensions, one involving a minimum path and the other active contours [2, 3]. With an increasing number of techniques dedicated to the same goal, we decided to take a step back and analyze the differences between these methods. We consequently put to the test our two methods of analysis and the FWHM approach. We first measured a set of 5 airways from a phantom of known dimensions. Then we compared measurements from the three methods to those of two independent readers, performed on 35 airways in 5 patients. We elaborate on the differences of each approach and draw conclusions on which could be considered the best one.
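
    For reference, a minimal implementation of the FWHM measurement on a one-dimensional intensity profile cast across an airway wall (assuming a single interior peak), with linear interpolation at the half-maximum crossings:

      import numpy as np

      def fwhm(profile, spacing=1.0):
          """Full width at half maximum of a 1-D intensity profile
          (assumes one peak away from the profile ends)."""
          p = np.asarray(profile, dtype=float)
          half = p.min() + 0.5 * (p.max() - p.min())
          above = np.where(p >= half)[0]
          i, j = above[0], above[-1]
          # interpolate the two crossings for sub-pixel precision
          left = i - 1 + (half - p[i - 1]) / (p[i] - p[i - 1])
          right = j + (half - p[j]) / (p[j + 1] - p[j])
          return (right - left) * spacing

      x = np.linspace(-5, 5, 201)
      wall = np.exp(-x**2 / (2 * 1.0**2))          # Gaussian-like wall profile
      print(f"{fwhm(wall, spacing=0.05):.3f}")     # ~2.355 for sigma = 1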

  17. Quantification of biological aging in young adults

    PubMed Central

    Belsky, Daniel W.; Caspi, Avshalom; Houts, Renate; Cohen, Harvey J.; Corcoran, David L.; Danese, Andrea; Harrington, HonaLee; Israel, Salomon; Levine, Morgan E.; Schaefer, Jonathan D.; Sugden, Karen; Williams, Ben; Yashin, Anatoli I.; Poulton, Richie; Moffitt, Terrie E.

    2015-01-01

    Antiaging therapies show promise in model organism research. Translation to humans is needed to address the challenges of an aging global population. Interventions to slow human aging will need to be applied to still-young individuals. However, most human aging research examines older adults, many with chronic disease. As a result, little is known about aging in young humans. We studied aging in 954 young humans, the Dunedin Study birth cohort, tracking multiple biomarkers across three time points spanning their third and fourth decades of life. We developed and validated two methods by which aging can be measured in young adults, one cross-sectional and one longitudinal. Our longitudinal measure allows quantification of the pace of coordinated physiological deterioration across multiple organ systems (e.g., pulmonary, periodontal, cardiovascular, renal, hepatic, and immune function). We applied these methods to assess biological aging in young humans who had not yet developed age-related diseases. Young individuals of the same chronological age varied in their “biological aging” (declining integrity of multiple organ systems). Already, before midlife, individuals who were aging more rapidly were less physically able, showed cognitive decline and brain aging, self-reported worse health, and looked older. Measured biological aging in young adults can be used to identify causes of aging and evaluate rejuvenation therapies. PMID:26150497

  18. Shape regression for vertebra fracture quantification

    NASA Astrophysics Data System (ADS)

    Lund, Michael Tillge; de Bruijne, Marleen; Tanko, Laszlo B.; Nielsen, Mads

    2005-04-01

    Accurate and reliable identification and quantification of vertebral fractures constitute a challenge both in clinical trials and in the diagnosis of osteoporosis. Various efforts have been made to develop reliable, objective, and reproducible methods for assessing vertebral fractures, but at present there is no consensus concerning a universally accepted diagnostic definition of vertebral fractures. In this project we investigate whether or not it is possible to accurately reconstruct the shape of a normal vertebra, using a neighbouring vertebra as prior information. The reconstructed shape can then be used to develop a novel vertebra fracture measure, by comparing the segmented vertebra shape with its reconstructed normal shape. The vertebrae in lateral x-rays of the lumbar spine were manually annotated by a medical expert. With this dataset we built a shape model, with equidistant point distribution between the four corner points. Based on the shape model, a multiple linear regression model of a normal vertebra shape was developed for each dataset using leave-one-out cross-validation. The reconstructed shape was calculated for each dataset using these regression models. The prediction error for the annotated shape was on average 3%.

  19. Damage detection using multivariate recurrence quantification analysis

    NASA Astrophysics Data System (ADS)

    Nichols, J. M.; Trickey, S. T.; Seaver, M.

    2006-02-01

    Recurrence-quantification analysis (RQA) has emerged as a useful tool for detecting subtle non-stationarities and/or changes in time-series data. Here, we extend the RQA analysis methods to multivariate observations and present a method by which the "length scale" parameter ɛ (the only parameter required for RQA) may be selected. We then apply the technique to the difficult engineering problem of damage detection. The structure considered is a finite element model of a rectangular steel plate where damage is represented as a cut in the plate, starting at one edge and extending from 0% to 25% of the plate width in 5% increments. Time series, recorded at nine separate locations on the structure, are used to reconstruct the phase space of the system's dynamics and subsequently generate the multivariate recurrence (and cross-recurrence) plots. Multivariate RQA is then used to detect damage-induced changes to the structural dynamics. These results are then compared with shifts in the plate's natural frequencies. Two of the RQA-based features are found to be more sensitive to damage than are the plate's frequencies.

  20. Quantification of contaminants associated with LDEF

    NASA Technical Reports Server (NTRS)

    Crutcher, E. R.; Nishimura, L. S.; Warner, K. J.; Wascher, W. W.

    1992-01-01

    The quantification of contaminants on the Long Duration Exposure Facility (LDEF) and associated hardware or tools is addressed. The purpose of this study was to provide a background data base for the evaluation of the surface of the LDEF and the effects of orbital exposure on that surface. This study necessarily discusses the change in the distribution of contaminants on the LDEF with time and environmental exposure. Much of this information may be of value for the improvement of contamination control procedures during ground based operations. The particulate data represents the results of NASA contractor monitoring as well as the results of samples collected and analyzed by the authors. The data from the tapelifts collected in the Space Shuttle Bay at Edwards Air Force Base and KSC are also presented. The amount of molecular film distributed over the surface of the LDEF is estimated based on measurements made at specific locations and extrapolated over the surface area of the LDEF. Some consideration of total amount of volatile-condensible materials available to form the resultant deposit is also presented. All assumptions underlying these estimates are presented along with the rationale for the conclusions. Each section is presented in a subsection for particles and another for molecular films.

  1. Quantification of periodic breathing in premature infants

    PubMed Central

    Mohr, Mary A.; Fairchild, Karen D.; Patel, Manisha; Sinkin, Robert A.; Clark, Matthew T.; Moorman, J. Randall; Lake, Douglas E.; Kattwinkel, John; Delos, John B.

    2015-01-01

    Background Periodic breathing (PB), regular cycles of short apneic pauses and breaths, is common in newborn infants. To characterize normal and potentially pathologic PB, we used our automated apnea detection system and developed a novel method for quantifying PB. We identified a preterm infant who died of SIDS and who, on review of her breathing pattern while in the NICU, had exaggerated PB. Methods We analyzed the chest impedance signal for short apneic pauses and developed a wavelet transform method to identify repetitive 10–40 second cycles of apnea/breathing. Clinical validation was performed to distinguish PB from apnea clusters and determine the wavelet coefficient cutoff having optimum diagnostic utility. We applied this method to analyze the chest impedance signals throughout the entire NICU stays of all 70 infants born at 32 weeks’ gestation admitted over a two-and-a-half year period. This group includes an infant who died of SIDS and her twin. Results For infants of 32 weeks’ gestation, the fraction of time spent in PB peaks 7–14 days after birth at 6.5%. During that time the infant that died of SIDS spent 40% of each day in PB and her twin spent 15% of each day in PB. Conclusions This wavelet transform method allows quantification of normal and potentially pathologic PB in NICU patients. PMID:26012526
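
    The detection idea can be sketched with a hand-rolled Morlet wavelet: power averaged over 10-40 s periods rises sharply during periodic breathing. Everything below (the synthetic signal, parameters, and threshold) is illustrative only, not the validated clinical algorithm.

      import numpy as np

      fs = 2.0                                  # chest-impedance-like signal at 2 Hz
      t = np.arange(0, 600, 1 / fs)

      rng = np.random.default_rng(5)
      sig = rng.normal(scale=0.3, size=t.size)  # irregular breathing baseline
      pb = (t >= 200) & (t < 400)               # inject 20-s periodic-breathing cycles
      sig[pb] += np.sign(np.sin(2 * np.pi * t[pb] / 20.0))

      def morlet_power(x, period, fs, w0=6.0):
          """Wavelet power at one period via convolution with a Morlet wavelet."""
          s = w0 * period * fs / (2 * np.pi)    # scale (in samples) for this period
          n = int(4 * s)                        # +/- 4 envelope widths is ample
          u = np.arange(-n, n + 1) / s
          wavelet = np.exp(1j * w0 * u) * np.exp(-u**2 / 2) / np.sqrt(s)
          return np.abs(np.convolve(x, wavelet, mode="same")) ** 2

      power = np.mean([morlet_power(sig, p, fs) for p in (10, 20, 30, 40)], axis=0)
      in_pb = power > 2 * np.median(power)      # illustrative threshold
      print(f"fraction of record flagged as PB: {in_pb.mean():.2%}")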

  2. Broadband acoustic quantification of stratified turbulence.

    PubMed

    Lavery, Andone C; Geyer, W Rockwell; Scully, Malcolm E

    2013-07-01

    High-frequency broadband acoustic scattering techniques have enabled the remote, high-resolution imaging and quantification of highly salt-stratified turbulence in an estuary. Turbulent salinity spectra in the stratified shear layer have been measured acoustically and by in situ turbulence sensors. The acoustic frequencies used span 120-600 kHz, which, for the highly stratified and dynamic estuarine environment, correspond to wavenumbers in the viscous-convective subrange (500-2500 m^-1). The acoustically measured spectral levels are in close agreement with spectral levels measured with closely co-located micro-conductivity probes. The acoustically measured spectral shapes allow discrimination between scattering dominated by turbulent salinity microstructure and suspended sediments or swim-bladdered fish, the two primary sources of scattering observed in the estuary in addition to turbulent salinity microstructure. The direct comparison of salinity spectra inferred acoustically and by the in situ turbulence sensors provides a test of both the acoustic scattering model and the quantitative skill of acoustical remote sensing of turbulence dissipation in a strongly sheared and salt-stratified estuary. PMID:23862783

  3. Legionella spp. isolation and quantification from greywater.

    PubMed

    Rodríguez-Martínez, Sara; Blanky, Marina; Friedler, Eran; Halpern, Malka

    2015-01-01

    Legionella, an opportunistic human pathogen whose natural environment is water, is transmitted to humans through inhalation of contaminated aerosols. Legionella has been isolated from a high diversity of water types. Due to its importance as a pathogen, two ISO protocols have been developed for its monitoring. However, these two protocols are not suitable for analyzing Legionella in greywater (GW). GW is domestic wastewater excluding the inputs from toilets and kitchen. It can serve as an alternative water source, mainly for toilet flushing and garden irrigation, both of which produce aerosols that can pose a risk of Legionella infection. Hence, before reuse, GW has to be treated and its quality needs to be monitored. The difficulty of Legionella isolation from GW lies in the very high load of contaminant bacteria. Here we describe a modification of ISO protocol 11731:1998 that enables the isolation and quantification of Legionella from GW samples. The following modifications were made:
    • To enable isolation of Legionella from greywater, a pre-filtration step that removes coarse matter is recommended.
    • Legionella can be isolated after a combined acid-thermic treatment that eliminates the high load of contaminant bacteria in the sample. PMID:26740925

  4. Uncertainty quantification in reacting flow modeling.

    SciTech Connect

    Le Maître, Olivier P.; Reagan, Matthew T.; Knio, Omar M.; Ghanem, Roger Georges; Najm, Habib N.

    2003-10-01

    Uncertainty quantification (UQ) in the computational modeling of physical systems is important for scientific investigation, engineering design, and model validation. In this work we develop techniques for UQ based on spectral and pseudo-spectral polynomial chaos (PC) expansions, and we apply these constructions in computations of reacting flow. We develop and compare both intrusive and non-intrusive spectral PC techniques. In the intrusive construction, the deterministic model equations are reformulated using Galerkin projection into a set of equations for the time evolution of the field variable PC expansion mode strengths. The mode strengths relate specific parametric uncertainties to their effects on model outputs. The non-intrusive construction uses sampling of many realizations of the original deterministic model, and projects the resulting statistics onto the PC modes, arriving at the PC expansions of the model outputs. We investigate and discuss the strengths and weaknesses of each approach, and identify their utility under different conditions. We also outline areas where ongoing and future research are needed to address challenges with both approaches.
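
    The non-intrusive construction can be illustrated in a few lines: sample the uncertain input, evaluate the model, and project the results onto the orthogonal polynomial basis. The sketch below does this for f(x) = exp(x) with a standard normal input and probabilists' Hermite polynomials, a textbook case (not one of the paper's reacting-flow computations) where the exact coefficients sqrt(e)/k! are known.

      import math
      import numpy as np
      from numpy.polynomial.hermite_e import hermeval

      # Non-intrusive PCE: Monte Carlo projection of f(x) = exp(x), x ~ N(0,1),
      # onto probabilists' Hermite polynomials He_k (E[He_j He_k] = k! delta_jk).
      rng = np.random.default_rng(7)
      x = rng.normal(size=200_000)
      f = np.exp(x)

      order = 5
      coeffs = []
      for k in range(order + 1):
          basis = hermeval(x, [0] * k + [1])                  # evaluates He_k(x)
          coeffs.append(np.mean(f * basis) / math.factorial(k))

      # Exact coefficients are sqrt(e)/k!; e.g. coeffs[0] ≈ exp(0.5).
      print(coeffs[:3], np.exp(0.5))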

  5. Numerical Uncertainty Quantification for Radiation Analysis Tools

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha

    2007-01-01

    Recently a new emphasis has been placed on engineering applications of space radiation analyses and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used for radiation analysis for vehicle design and mission planning has begun. There are two sources of uncertainty in geometric discretization addressed in this paper that need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. One source of uncertainty is in ray tracing, as the number of rays increase the associated uncertainty decreases, but the computational expense increases. Thus, a cost benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose vs. depth curves that is needed to determine the radiation exposure. The question, then, is what is the number of thicknesses that is needed to get an accurate result. So convergence testing is performed to quantify the uncertainty associated with interpolating over different shield thickness spatial grids.

  6. Quality Quantification of Evaluated Cross Section Covariances

    SciTech Connect

    Varet, S.; Dossantos-Uzarralde, P.

    2015-01-15

    Presently, several methods are used to estimate the covariance matrix of evaluated nuclear cross sections. Because the resulting covariance matrices can be different according to the method used and according to the assumptions of the method, we propose a general and objective approach to quantify the quality of the covariance estimation for evaluated cross sections. The first step consists in defining an objective criterion. The second step is computation of the criterion. In this paper the Kullback-Leibler distance is proposed for the quality quantification of a covariance matrix estimation and its inverse. It is based on the distance to the true covariance matrix. A method based on the bootstrap is presented for the estimation of this criterion, which can be applied with most methods for covariance matrix estimation and without the knowledge of the true covariance matrix. The full approach is illustrated on the 85Rb nucleus evaluations and the results are then used for a discussion on scoring and Monte Carlo approaches for covariance matrix estimation of the cross section evaluations.
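
    For zero-mean Gaussians, the Kullback-Leibler distance between two covariance matrices has a closed form, presumably the quantity being bootstrapped here. A minimal sketch with a synthetic 3x3 example:

      import numpy as np

      def kl_gaussian(S1, S2):
          """KL divergence between zero-mean Gaussians N(0, S1) and N(0, S2):
          0.5 * (tr(S2^-1 S1) - k + ln det S2 - ln det S1)."""
          k = S1.shape[0]
          term_trace = np.trace(np.linalg.inv(S2) @ S1)
          _, ld1 = np.linalg.slogdet(S1)
          _, ld2 = np.linalg.slogdet(S2)
          return 0.5 * (term_trace - k + ld2 - ld1)

      # Two competing estimates of a cross-section covariance (synthetic).
      S_true = np.array([[4.0, 1.0, 0.5], [1.0, 3.0, 0.8], [0.5, 0.8, 2.0]])
      S_est = S_true + 0.2 * np.eye(3)
      print(f"KL(est || true) = {kl_gaussian(S_est, S_true):.4f}")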

  7. Uncertainty Quantification of Modelling of Equiaxed Solidification

    NASA Astrophysics Data System (ADS)

    Fezi, K.; Krane, M. J. M.

    2016-07-01

    Numerical simulations of metal alloy solidification are used to gain insight into physical phenomena that cannot be observed experimentally. Validation of such models has often been done through comparison to sparse experimental data, where agreement can be misinterpreted owing to both model and experimental uncertainty. Uncertainty quantification (UQ) and sensitivity analysis are performed on a transient model of solidification of Al-4.5 wt.% Cu in a rectangular cavity, with equiaxed (grain-refined) solidification morphology. This model solves equations for momentum, temperature, and species conservation; UQ and sensitivity analysis are performed for the degree of macrosegregation. A Smolyak sparse grid algorithm is used to select input values to construct a response surface fit to model outputs. The response surface is then used as a surrogate for the solidification model to determine the sensitivities and probability density functions of the model outputs. Uncertain model inputs of interest include the secondary dendrite arm spacing, equiaxed particle size, and fraction solid at which the rigid mushy zone forms. Similar analysis was also performed on a transient model of direct chill casting of the same alloy.
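
A minimal surrogate-based UQ loop in the spirit of this record, with a plain least-squares quadratic response surface standing in for the Smolyak sparse-grid construction; the placeholder model, input names, and ranges are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def model(x):
    """Placeholder for an expensive solidification run: returns a
    macrosegregation measure from three scaled uncertain inputs
    (SDAS, equiaxed particle size, rigidity fraction solid)."""
    sdas, dp, fs = x.T
    return 0.3 * sdas + 0.1 * dp**2 - 0.2 * fs + 0.05 * sdas * fs

def feats(x):                     # quadratic response-surface features
    s, d, f = x.T
    return np.column_stack([np.ones(len(x)), s, d, f,
                            s**2, d**2, f**2, s*d, s*f, d*f])

X = rng.uniform(size=(60, 3))                      # training designs
coef, *_ = np.linalg.lstsq(feats(X), model(X), rcond=None)

Xmc = rng.uniform(size=(100_000, 3))               # cheap MC on the surrogate
ymc = feats(Xmc) @ coef
print(ymc.mean(), ymc.std())    # output statistics / input to a pdf estimate
```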

  8. Legionella spp. isolation and quantification from greywater

    PubMed Central

    Rodríguez-Martínez, Sara; Blanky, Marina; Friedler, Eran; Halpern, Malka

    2015-01-01

    Legionella, an opportunistic human pathogen whose natural environment is water, is transmitted to humans through inhalation of contaminated aerosols. Legionella has been isolated from a high diversity of water types. Due to its importance as a pathogen, two ISO protocols have been developed for its monitoring. However, these two protocols are not suitable for analyzing Legionella in greywater (GW). GW is domestic wastewater excluding the inputs from toilets and kitchen. It can serve as an alternative water source, mainly for toilet flushing and garden irrigation, both of which produce aerosols that can pose a risk of Legionella infection. Hence, before reuse, GW has to be treated and its quality needs to be monitored. The difficulty of Legionella isolation from GW stems from the very high load of contaminant bacteria. Here we describe a modification of ISO protocol 11731:1998 that enables the isolation and quantification of Legionella from GW samples. The following modifications were made:
    • To enable isolation of Legionella from greywater, a pre-filtration step that removes coarse matter is recommended.
    • Legionella can be isolated after a combined acid-thermic treatment that eliminates the high load of contaminant bacteria in the sample. PMID:26740925

  9. Isolation, quantification, and analysis of chloroplast DNA.

    PubMed

    Rowan, Beth A; Bendich, Arnold J

    2011-01-01

    Many areas of chloroplast research require methods that can assess the quality and quantity of chloroplast DNA (cpDNA). The study of chloroplast functions that depend on the proper maintenance and expression of the chloroplast genome, understanding cpDNA replication and repair, and the development of technologies for chloroplast transformation are just some of the disciplines that require the isolation of high-quality cpDNA. Arabidopsis thaliana offers several advantages for studying these processes because of the sizeable collection of mutants and natural varieties (accessions) available from stock centers and a broad community of researchers that has developed many other genetic resources. Several approaches for the isolation and quantification of cpDNA have been developed, but little consideration has been given to the strengths and weaknesses of each method or to the type of information it yields, especially with respect to A. thaliana. Here, we provide protocols for obtaining high-quality cpDNA for PCR and other applications, and we evaluate several different isolation and analytical methods in order to build a robust framework for the study of cpDNA with this model organism. PMID:21822838

  10. Spatialised fate factors for nitrate in catchments: modelling approach and implication for LCA results.

    PubMed

    Basset-Mens, Claudine; Anibar, Lamiaa; Durand, Patrick; van der Werf, Hayo M G

    2006-08-15

    The challenge for environmental assessment tools, such as Life Cycle Assessment (LCA), is to provide a holistic picture of the environmental impacts of a given system, while being relevant both at a global scale, i.e., for global impact categories such as climate change, and at a smaller scale, i.e., for regional impact categories such as aquatic eutrophication. To this end, the environmental mechanisms between emission and impact should be taken into account. For eutrophication in particular, which is one of the main impacts of farming systems, the fate factor of eutrophying pollutants in catchments, and particularly of nitrate, reflects one of these important and complex environmental mechanisms. We define this fate factor as the ratio of the amount of nitrate at the outlet of the catchment over the nitrate emitted from the catchment's soils. In LCA, this fate factor is most often assumed equal to 1, while the observed fate factor is generally less than 1. A generic approach for estimating the range of variation of nitrate fate factors in a region of intensive agriculture was proposed. This approach was based on the analysis of different catchment scenarios combining different catchment types and different effective rainfalls. The evolution over time of the nitrate fate factor as well as the steady-state fate factor for each catchment scenario was obtained using the INCA simulation model. In line with the general LCA model, the implications of the steady-state fate factors for nitrate were investigated for the eutrophication impact result in the framework of an LCA of pig production. A sensitivity analysis to the fraction of nitrate lost as N2O was presented for the climate change impact category. This study highlighted the difference between the observed fate factor at a given time, which aggregates both storage and transformation processes, and a "steady-state fate factor" specific to the system considered. The range of steady-state fate factors obtained for the study region was wide, from 0.44 to 0.86, depending primarily on the catchment type and secondarily on the effective rainfall. The sensitivity of the LCA of pig production to the fate factors was significant concerning eutrophication, but potentially much larger concerning climate change. The potential for producing improved eutrophication results by using spatially differentiated fate factors was demonstrated. Additionally, the urgent need for quantitative studies on the N2O/N2 ratio in riparian-zone denitrification was highlighted. PMID:16488466
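
In symbols, with the definition quoted above and the generic LCA aggregation (our notation; the characterization factors CF_i are not taken from the paper):

```latex
\mathrm{FF}_{\mathrm{NO_3}} \;=\; \frac{m_{\mathrm{NO_3},\,\mathrm{outlet}}}{m_{\mathrm{NO_3},\,\mathrm{emitted}}},
\qquad
\mathrm{EP} \;=\; \sum_i m_i\,\mathrm{FF}_i\,\mathrm{CF}_i
```

where m_i is the mass of pollutant i emitted and EP the eutrophication score; setting FF = 1, as is common in LCA, overstates the nitrate contribution whenever the true catchment fate factor is below 1.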

  11. Designing a simple physically-based bucket SVAT model for spatialisation of water needs

    NASA Astrophysics Data System (ADS)

    Lakhal, A.; Boulet, G.; Lakhal, L.; Er-Raki, S.; Duchemin, B.; Chehbouni, G.; Timouk, F.

    2003-04-01

    Within the frame of both the IRRIMED and SUDMED projects, one needs a robust and simple tool to provide space-time estimates of the water requirements in flat semi-arid agricultural zones. This is the task of the simplest water balance equations, which can be seen as simple SVAT schemes. Most of the simplest SVAT schemes use the classical bucket representation of soil moisture exchange through the soil-canopy-air continuum. They usually rely on empirical relationships such as the “beta function” that are not well suited for all climate, soil and vegetation conditions. Some of them, for instance, greatly simplify the deep drainage parameterization, or overlook the first- to second-stage evaporation processes. Several authors have proposed physically-based simple expressions, such as the desorptive approach, which gives accurate integrated capillary flows under constant boundary conditions. We propose here a simple SVAT scheme that uses the same approach but reduces as much as possible the number of empirical relationships. It is tested against (1) a physically based complex SVAT scheme, SiSPAT, and (2) experimental data acquired during the SALSA and SUDMED field experiments in Mexico and Morocco (respectively) for a large range of vegetation types (olive trees, wheat crop, grassland). This simple SVAT is well suited to simulate long time series of soil moisture evolution, and proves to give accurate predictions of first- to second-stage evaporation time series for bare soil and fully vegetated cover conditions. An insight into model adjustment for sparse vegetation (which usually prevails under semi-arid conditions) is proposed and partially evaluated against SiSPAT outputs.

  12. An overview of methods and applications for the validation of vulnerability assessments

    NASA Astrophysics Data System (ADS)

    Neukum, Christoph

    2013-03-01

    Groundwater vulnerability maps have been applied over the past several decades for assessing groundwater sensitivity to pollution. Many different methods, with various approaches and associated information content, have been developed over the years. However, application of different methods to the same area may lead to different or even contradictory results that may render vulnerability mapping unreliable. This manuscript presents a selection of methods that have been applied to validate vulnerability mapping approaches under different boundary conditions and at various scales. The validation approaches are explained and their advantages and disadvantages are discussed. A key result is that validation is an important part of vulnerability mapping and that it contributes to a sound interpretation.

  13. Erratum: Erratum to: Integration of the pedological filter and buffer function into the hydrogeological vulnerability assessment

    NASA Astrophysics Data System (ADS)

    Wirsing, Tobias; Neukum, Christoph; Goldscheider, Nico; Maier, Matthias

    2015-09-01

    Vulnerability maps are standard tools for the assessment of groundwater sensitivity to contamination. Due to their increased use in technical guidelines, vulnerability maps have become state-of-the-art tools in resource management. However, most approaches have been developed by hydrogeologists and soil scientists who incorporate the understanding of processes from their specific disciplines very well but are limited in considering processes in other disciplines. The soil-specific database for vulnerability assessment has been significantly improved by soil scientists over the past several years in terms of quality, spatial extent and availability. Hence, it is time to integrate this database into hydrogeological concepts. This work presents a vulnerability mapping approach that considers a new soil database, available since 2014 for the entire Baden-Württemberg region at a scale of 1:50,000, adapting the well-established GLA and PI methods. Due to the newly-developed classification scheme for the protective function, this approach provides a more balanced and meaningful classification. This leads to a more differentiated picture of the study area and a better interpretation of vulnerability.

  14. Quantification of isotopic turnover in agricultural systems

    NASA Astrophysics Data System (ADS)

    Braun, A.; Auerswald, K.; Schnyder, H.

    2012-04-01

    The isotopic turnover, which is a proxy for the metabolic rate, is gaining scientific importance. It is quantified for an increasing range of organisms, from microorganisms over plants to animals, including agricultural livestock. Additionally, the isotopic turnover is analyzed on different scales, from organs to organisms to ecosystems and even to the biosphere. In particular, the quantification of the isotopic turnover of specific tissues within the same organism, e.g. organs like liver and muscle and products like milk and faeces, has brought new insights that improve our understanding of nutrient cycles and fluxes. Thus, the knowledge of isotopic turnover is important in many areas, including physiology (e.g. milk synthesis), ecology (e.g. soil retention time of water), and medical science (e.g. cancer diagnosis). So far, the isotopic turnover is quantified by applying time-, cost- and expertise-intensive tracer experiments. Usually, this comprises two isotopic equilibration periods. A first equilibration period with a constant isotopic input signal is followed by a second equilibration period with a distinct constant isotopic input signal. This yields a smooth signal change from the first to the second signal in the object under consideration. This approach reveals at least three major problems. (i) The input signals must be controlled isotopically, which is almost impossible in many realistic cases like free-ranging animals. (ii) Both equilibration periods may be very long, especially when the turnover rate of the object under consideration is very slow, which aggravates the first problem. (iii) The detection of small or slow pools is improved by large isotopic signal changes, but large isotopic changes also involve a considerable change in the input material; e.g. animal studies are usually carried out as diet-switch experiments, where the diet is switched between C3 and C4 plants, since C3 and C4 plants differ strongly in their isotopic signal. The

  15. GPU-accelerated voxelwise hepatic perfusion quantification.

    PubMed

    Wang, H; Cao, Y

    2012-09-01

    Voxelwise quantification of hepatic perfusion parameters from dynamic contrast-enhanced (DCE) imaging greatly contributes to assessment of liver function in response to radiation therapy. However, the efficiency of estimating hepatic perfusion parameters voxel-by-voxel in the whole liver using a dual-input single-compartment model requires substantial improvement for routine clinical applications. In this paper, we utilize the parallel computation power of a graphics processing unit (GPU) to accelerate the computation, while maintaining the same accuracy as the conventional method. Using compute unified device architecture (CUDA)-GPU, the hepatic perfusion computations over multiple voxels are run across the GPU blocks concurrently but independently. At each voxel, nonlinear least-squares fitting of the time series of the liver DCE data to the compartmental model is distributed to multiple threads in a block, and the computations of different time points are performed simultaneously and synchronously. An efficient fast Fourier transform in a block is also developed for the convolution computation in the model. The GPU computations of the voxel-by-voxel hepatic perfusion images are compared with those by the CPU using simulated DCE data and experimental DCE MR images from patients. The computation speed is improved by a factor of 30 using an NVIDIA Tesla C2050 GPU compared to a 2.67 GHz Intel Xeon CPU processor. To obtain liver perfusion maps with 626 400 voxels in a patient's liver, it takes 0.9 min with the GPU-accelerated voxelwise computation, compared to 110 min with the CPU, while both methods result in perfusion parameter differences of less than 10⁻⁶. The method will be useful for generating liver perfusion images in clinical settings. PMID:22892645
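
The per-voxel fitting problem itself is compact; the sketch below reproduces it on a CPU with hypothetical kinetics and toy input functions (parameter names and values are assumptions, not from the paper), with the understanding that the paper's contribution is mapping each voxel's fit onto a GPU block.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical dual-input single-compartment kinetics (names assumed):
# dC/dt = k1a*Ca(t) + k1p*Cp(t) - k2*C(t)
t = np.linspace(0.0, 60.0, 121)
dt = t[1] - t[0]
Ca = np.exp(-((t - 10.0) / 4.0) ** 2)     # toy arterial input function
Cp = np.exp(-((t - 20.0) / 8.0) ** 2)     # toy portal-venous input function

def model(t, k1a, k1p, k2):
    inp = k1a * Ca + k1p * Cp
    return np.convolve(inp, np.exp(-k2 * t))[: len(t)] * dt

rng = np.random.default_rng(1)
true = (0.2, 0.5, 0.1)
voxels = model(t, *true) + 0.01 * rng.standard_normal((500, len(t)))

# Voxelwise nonlinear least squares; on the GPU each voxel maps to a block.
params = np.array([curve_fit(model, t, v, p0=(0.1, 0.1, 0.1))[0]
                   for v in voxels])
print(params.mean(axis=0))   # ~ (0.2, 0.5, 0.1)
```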

  16. Quantification of asphaltene precipitation by scaling equation

    NASA Astrophysics Data System (ADS)

    Janier, Josefina Barnachea; Jalil, Mohamad Afzal B. Abd.; Samin, Mohamad Izhar B. Mohd; Karim, Samsul Ariffin B. A.

    2015-02-01

    Asphaltene precipitation from crude oil is one of the issues facing the oil industry. The deposition of asphaltene occurs during production, transportation and separation processes. The injection of carbon dioxide (CO2) during enhanced oil recovery (EOR) is believed to contribute much to the precipitation of asphaltene. Precipitation can be affected by changes in temperature and pressure on the crude oil; however, a reduction in pressure contributes more to the instability of asphaltene than temperature does. This paper discusses the quantification of precipitated asphaltene in crude oil at different high pressures and at constant temperature. The derived scaling equation was based on reservoir conditions, with variation in the amount of carbon dioxide (CO2) mixed with Dulang, the light crude oil sample used in the experiment, to study asphaltene stability. A FluidEval PVT cell with a Solid Detection System (SDS) was the instrument used to gain experimental knowledge of the behavior of the fluid at reservoir conditions. Two conditions were followed in the conduct of the experiment. First, a 45 cc light crude oil sample was mixed with 18 cc (40%) of CO2; second, the same amount of crude oil sample was mixed with 27 cc (60%) of CO2. Results showed that the 45 cc crude oil sample combined with 18 cc (40%) of CO2 gas had a saturation pressure of 1498.37 psi and an asphaltene onset point of 1620 psi. For the same amount of crude oil combined with 27 cc (60%) of CO2, the saturation pressure was 2046.502 psi and the asphaltene onset point was 2230 psi. The derivation of the scaling equation considered reservoir temperature, pressure, bubble point pressure, mole percent of the precipitant (the injected CO2 gas), and the gas molecular weight. The scaling equation resulted in a third-order polynomial that can be used to quantify the amount of asphaltene in crude oil.
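
The final fitting step is ordinary cubic regression; a hedged sketch with invented numbers (not the Dulang measurements) would be:

```python
import numpy as np

# Illustrative only: fit a third-order polynomial W = f(x), where x lumps
# the scaled reservoir variables (P, Pb, CO2 mol%, gas MW); all numbers
# below are hypothetical, not the Dulang measurements.
x = np.array([0.2, 0.4, 0.6, 0.8, 1.0, 1.2])   # scaled variable
W = np.array([0.5, 0.9, 1.6, 2.8, 4.5, 7.0])   # wt% asphaltene precipitated
coeffs = np.polyfit(x, W, deg=3)               # cubic scaling equation
print(np.polyval(coeffs, 0.7))                 # predicted precipitation at x=0.7
```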

  17. Quantification of nitrotyrosine in nitrated proteins

    PubMed Central

    Zhang, Yingyi; Pöschl, Ulrich

    2010-01-01

    For kinetic studies of protein nitration reactions, we have developed a method for the quantification of nitrotyrosine residues in protein molecules by liquid chromatography coupled to a diode array detector of ultraviolet-visible absorption. Nitrated bovine serum albumin (BSA) and nitrated ovalbumin (OVA) were synthesized and used as standards for the determination of the protein nitration degree (ND), which is defined as the average number of nitrotyrosine residues divided by the total number of tyrosine residues in a protein molecule. The obtained calibration curves of the ratio of chromatographic peak areas of absorbance at 357 and at 280 nm vs. nitration degree are nearly the same for BSA and OVA (relative deviations <5%). They are near-linear at low ND (< 0.1) and can be described by a second-order polynomial fit up to ND = 0.5 (R² > 0.99). A change of chromatographic column led to changes in absolute peak areas but not in the peak area ratios and related calibration functions, which confirms the robustness of the analytical method. First results of laboratory experiments confirm that the method is applicable for the investigation of the reaction kinetics of protein nitration. The main advantage over alternative methods is that nitration degrees can be efficiently determined without hydrolysis or digestion of the investigated protein molecules. PMID:20300739
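
Applying such a calibration in practice amounts to fitting the second-order curve and inverting it for unknown samples; a sketch with synthetic coefficients (the paper's actual calibration data are not reproduced here):

```python
import numpy as np

# Hypothetical calibration: peak-area ratio A357/A280 versus nitration
# degree ND, second-order as in the abstract (synthetic coefficients).
nd = np.array([0.0, 0.05, 0.1, 0.2, 0.3, 0.4, 0.5])
ratio = 2.1 * nd - 0.9 * nd**2            # synthetic calibration data
a, b, c0 = np.polyfit(nd, ratio, deg=2)   # fitted calibration curve

def nd_from_ratio(r):
    """Invert a*ND^2 + b*ND + c0 = r for the root in the calibrated range."""
    roots = np.roots([a, b, c0 - r])
    return min(z.real for z in roots if abs(z.imag) < 1e-9 and 0 <= z.real <= 0.6)

print(nd_from_ratio(0.30))   # ND of an unknown sample with ratio 0.30
```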

  18. Extended quantification of the generalized recurrence plot

    NASA Astrophysics Data System (ADS)

    Riedl, Maik; Marwan, Norbert; Kurths, Jürgen

    2016-04-01

    The generalized recurrence plot is a modern tool for quantification of complex spatial patterns. Its application spans the analysis of trabecular bone structures, Turing structures, turbulent spatial plankton patterns, and fractals. But it is also successfully applied to the description of spatio-temporal dynamics and the detection of regime shifts, such as in the complex Ginzburg-Landau equation. The recurrence-plot-based determinism is a central measure in this framework, quantifying the level of regularity in temporal and spatial structures. We extend this measure for the generalized recurrence plot by considering symmetry operations beyond simple translation. It is tested not only on two-dimensional regular patterns and noise but also on complex spatial patterns, reconstructing the parameter space of the complex Ginzburg-Landau equation. The extended version of the determinism yields values consistent with the original recurrence plot approach. Furthermore, the proposed method allows a split of the determinism into parts based on laminar and non-laminar regions of the two-dimensional pattern of the complex Ginzburg-Landau equation. A comparison of these parts with a standard method of image classification, the co-occurrence matrix approach, shows differences especially in the description of patterns associated with turbulence. In that case, it seems that the extended version of the determinism allows a distinction of phase turbulence and defect turbulence by means of their spatial patterns. This ability of the proposed method promises new insights in other systems with turbulent dynamics coming from climatology, biology, ecology, and social sciences, for example.
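
For orientation, determinism in the classical one-dimensional recurrence plot can be computed in a few lines; the generalized, spatial version extended in the paper follows the same counting idea over higher-dimensional recurrence structures. The threshold and minimum line length below are arbitrary illustrative choices.

```python
import numpy as np

def determinism(x, eps=0.1, lmin=2):
    """DET of a 1-D series: fraction of recurrent points lying on diagonal
    lines of length >= lmin (a simplification of the generalized, spatial
    version in the paper; the line of identity is not excluded here)."""
    R = (np.abs(x[:, None] - x[None, :]) < eps).astype(int)
    total, on_lines, n = R.sum(), 0, len(x)
    for k in range(-(n - 1), n):                   # scan every diagonal
        run = 0
        for v in list(np.diagonal(R, k)) + [0]:    # sentinel flushes last run
            if v:
                run += 1
            else:
                if run >= lmin:
                    on_lines += run
                run = 0
    return on_lines / total if total else 0.0

regular = np.sin(np.linspace(0, 20 * np.pi, 500))
noise = np.random.default_rng(2).random(500)
print(determinism(regular), determinism(noise))   # regular signal scores higher
```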

  19. Rapid digital quantification of microfracture populations

    NASA Astrophysics Data System (ADS)

    Gomez, Leonel A.; Laubach, Stephen E.

    2006-03-01

    Populations of microfractures are a structural fabric in many rocks deformed at upper crustal conditions. In some cases these fractures are visible in transmitted-light microscopy as fluid-inclusion planes or cement-filled microfractures, but because SEM-based cathodoluminescence (CL) reveals more fractures and delineates their shapes, sizes, and crosscutting relations, it is a more effective structural tool. Yet at magnifications of 150-300×, at which many microfractures are visible, SEM-CL detectors image only small sample areas (0.5-0.1 mm²) relative to fracture population patterns. The substantial effort required to image and measure centimeter-size areas at high magnification has impeded quantitative study of microfractures. We present a method for efficient collection of mosaics of high-resolution CL imagery, a preparation method that allows samples to be any size while retaining continuous imagery of rock (no gaps), and software that facilitates fracture mapping and data reduction. Although the method introduced here was developed for CL imagery, it can be used with any other kind of images, including mosaics from petrographic microscopes. Compared with manual measurements, the new method increases severalfold the number of microfractures imaged without a proportional increase in level of effort, increases the accuracy and repeatability of fracture measurements, and speeds quantification and display of fracture population attributes. We illustrate the method on microfracture arrays in dolostone from NE Mexico and sandstone from NW Scotland. We show that key aspects of microfracture population attributes are only fully manifest at scales larger than a single thin section.

  20. Detection and Quantification of Citrullinated Chemokines

    PubMed Central

    Moelants, Eva A. V.; Van Damme, Jo; Proost, Paul

    2011-01-01

    Background Posttranslational deimination or citrullination by peptidylarginine deiminases (PAD) regulates the biological function of proteins and may be involved in the development of autoimmune diseases such as rheumatoid arthritis and multiple sclerosis. This posttranslational modification of arginine was recently discovered on inflammatory chemokines including CXCL8 and CXCL10, and significantly reduced their biological activity. To evaluate the importance of these modified chemokines in patients, methods for the detection and quantification of citrullinated chemokines are needed. Since citrullination only increases the protein mass by one mass unit and removes one positive charge, selective biochemical detection is difficult. Therefore, we developed an antibody-based method to specifically detect and quantify citrullination on a protein of interest. Methodology/Principal Findings First, the citrullinated proteins were chemically modified with antipyrine and 2,3-butanedione at low pH. Such selectively modified citrullines were subsequently detected and quantified by specific antibodies raised against a modified citrulline-containing peptide. The specificity of this two-step procedure was validated for citrullinated CXCL8 ([Cit5]CXCL8). Specific detection of [Cit5]CXCL8 concentrations between 1 and 50 ng/ml was possible, also in complex samples containing an excess of contaminating proteins. This novel detection method was used to evaluate the effect of lipopolysaccharide (LPS) on the citrullination of inflammatory chemokines induced in peripheral blood mononuclear cells (PBMCs) and granulocytes. LPS had no significant effect on the induction of CXCL8 citrullination in human PBMCs and granulocytes. However, granulocytes, known to contain PAD, were essential for the production of significant amounts of [Cit5]CXCL8. Conclusion/Significance The newly developed antibody-based method to specifically detect and quantify chemically modified

  1. Statistical Quantification of Methylation Levels by Next-Generation Sequencing

    PubMed Central

    Wu, Guodong; Yi, Nengjun; Absher, Devin; Zhi, Degui

    2011-01-01

    Background/Aims Recently, next-generation sequencing-based technologies have enabled DNA methylation profiling at high resolution and low cost. Methyl-Seq and Reduced Representation Bisulfite Sequencing (RRBS) are two such technologies that interrogate methylation levels at CpG sites throughout the entire human genome. With rapid reduction of sequencing costs, these technologies will enable epigenotyping of large cohorts for phenotypic association studies. Existing quantification methods for sequencing-based methylation profiling are simplistic and do not deal with the noise due to the random sampling nature of sequencing and various experimental artifacts. Therefore, there is a need to investigate the statistical issues related to the quantification of methylation levels for these emerging technologies, with the goal of developing an accurate quantification method. Methods In this paper, we propose two methods for Methyl-Seq quantification. The first method, the Maximum Likelihood estimate, is both conceptually intuitive and computationally simple. However, this estimate is biased at extreme methylation levels and does not provide variance estimation. The second method, based on Bayesian hierarchical model, allows variance estimation of methylation levels, and provides a flexible framework to adjust technical bias in the sequencing process. Results We compare the previously proposed binary method, the Maximum Likelihood (ML) method, and the Bayesian method. In both simulation and real data analysis of Methyl-Seq data, the Bayesian method offers the most accurate quantification. The ML method is slightly less accurate than the Bayesian method. But both our proposed methods outperform the original binary method in Methyl-Seq. In addition, we applied these quantification methods to simulation data and show that, with sequencing depth above 40–300 (which varies with different tissue samples) per cleavage site, Methyl-Seq offers a comparable quantification
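
The contrast between the two estimators is easy to state in miniature: the ML estimate is the raw proportion, while a Bayesian estimate shrinks it toward a prior and supplies a variance. The sketch below uses a fixed Beta prior for brevity, whereas the paper's hierarchical model learns its parameters from the data.

```python
# Per-CpG methylation from bisulfite sequencing counts: m of n reads methylated.
m, n = 3, 5

ml = m / n                       # maximum-likelihood estimate (biased at extremes)

# Simple Bayesian shrinkage with a Beta(a, b) prior; the hyperparameters are
# assumed here, whereas a hierarchical model would estimate them from the data.
a, b = 1.0, 1.0
post_mean = (m + a) / (n + a + b)
post_var = post_mean * (1 - post_mean) / (n + a + b + 1)
print(ml, post_mean, post_var)   # point estimate plus a variance estimate
```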

  2. Quantification of chemical gaseous plumes on hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Niu, Sidi

    The passive remote chemical plume quantification problem may be approached from multiple aspects, corresponding to a variety of physical effects that may be exploited. Accordingly, a diversity of statistical quantification algorithms has been proposed in the literature. The ultimate performance and algorithmic complexity of each is influenced by the assumptions made about the scene, which may include the presence of ancillary measurements or particular background/plume features that may or may not be present. In this work, we evaluate and investigate the advantages and limitations of a number of quantification algorithms that span a variety of such assumptions. With the in-depth insights we gain, a new quantification algorithm is proposed for single-gas quantification that is superior to state-of-the-art algorithms in almost every aspect, including applicability, accuracy, and efficiency. The new method, called the selected-band algorithm, achieves its superior performance through an accurate estimation of the unobservable off-plume radiance. Off-plume radiance is recoverable because of a common observation: most chemical gases exhibit strong absorptive behavior only in certain spectral bands. Those spectral bands where the gas absorption is zero or small are ideal for background estimation. In this thesis, the new selected-band algorithm is first derived for the favorable case of a narrow-band, sharp-featured gas and then extended to an iterative algorithm that suits all kinds of gases. The performance improvement is verified on simulated data for a variety of experimental settings.
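
A toy version of the selected-band idea, with an invented absorption signature and background (nothing here reproduces the thesis data): fit the background where the gas is transparent, interpolate across the absorbing bands, and invert for the column strength.

```python
import numpy as np

rng = np.random.default_rng(3)
bands = np.arange(100)
absorb = np.zeros(100)
absorb[40:55] = np.linspace(0.0, 1.0, 15) ** 2       # toy gas absorption signature
bg = 1.0 + 0.002 * bands + 0.01 * rng.standard_normal(100)  # smooth background
obs = bg * np.exp(-0.3 * absorb)                     # plume with column strength 0.3

# Selected-band step: fit the background only in transparent bands,
# interpolate it across the absorbing bands, then invert for the strength.
clear = absorb < 1e-6
bg_hat = np.polyval(np.polyfit(bands[clear], obs[clear], deg=2), bands)
tau = -np.log(obs / bg_hat)                          # retrieved optical depth
cd = tau[~clear] @ absorb[~clear] / (absorb[~clear] @ absorb[~clear])
print(cd)                                            # ~0.3
```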

  3. GMO quantification: valuable experience and insights for the future.

    PubMed

    Milavec, Mojca; Dobnik, David; Yang, Litao; Zhang, Dabing; Gruden, Kristina; Zel, Jana

    2014-10-01

    Cultivation and marketing of genetically modified organisms (GMOs) have been unevenly adopted worldwide. To facilitate international trade and to provide information to consumers, labelling requirements have been set up in many countries. Quantitative real-time polymerase chain reaction (qPCR) is currently the method of choice for detection, identification and quantification of GMOs. This has been critically assessed and the requirements for method performance have been set. Nevertheless, there are challenges that should still be highlighted, such as measuring the quantity and quality of DNA, and determining the qPCR efficiency, possible sequence mismatches, characteristics of taxon-specific genes and appropriate units of measurement, as these remain potential sources of measurement uncertainty. To overcome these problems and to cope with the continuous increase in the number and variety of GMOs, new approaches are needed. Statistical strategies of quantification have already been proposed and expanded with the development of digital PCR. The first attempts have been made to use next-generation sequencing for quantitative purposes as well, although accurate quantification of the contents of GMOs using this technology is still a challenge for the future, especially for mixed samples. New approaches are needed also for the quantification of stacks, and for potential quantification of organisms produced by new plant breeding techniques. PMID:25182968

  4. Absolute and relative quantification of RNA modifications via biosynthetic isotopomers

    PubMed Central

    Kellner, Stefanie; Ochel, Antonia; Thüring, Kathrin; Spenkuch, Felix; Neumann, Jennifer; Sharma, Sunny; Entian, Karl-Dieter; Schneider, Dirk; Helm, Mark

    2014-01-01

    In the resurging field of RNA modifications, quantification is a bottleneck blocking many exciting avenues. With currently over 150 known nucleoside alterations, detection and quantification methods must encompass multiple modifications for a comprehensive profile. LC–MS/MS approaches offer a perspective for comprehensive parallel quantification of all the various modifications found in total RNA of a given organism. By feeding 13C-glucose as the sole carbon source, we have generated a stable isotope-labeled internal standard (SIL-IS) for bacterial RNA, which facilitates relative comparison of all modifications. While conventional SIL-IS approaches require the chemical synthesis of single modifications in weighable quantities, this SIL-IS consists of a nucleoside mixture covering all detectable RNA modifications of Escherichia coli, yet in small and initially unknown quantities. For absolute as well as relative quantification, those quantities were determined by a combination of external calibration and sample spiking of the biosynthetic SIL-IS. For each nucleoside, we thus obtained a very robust relative response factor, which permits direct conversion of the MS signal to absolute amounts of substance. The application of the validated SIL-IS allowed highly precise quantification with standard deviations <2% during a 12-week period, and a linear dynamic range that was extended by two orders of magnitude. PMID:25129236
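
The conversion step is simple arithmetic once the relative response factor is in hand; a sketch with hypothetical numbers:

```python
# Relative response factor (RRF) converts an MS signal into an absolute
# amount of substance. All numbers are hypothetical placeholders.
signal_is, amount_is = 5.0e6, 2.0       # SIL-IS: counts, pmol (from calibration)
rrf = signal_is / amount_is             # counts per pmol for this nucleoside

signal_sample = 1.2e6                   # same nucleoside in the sample channel
print(signal_sample / rrf, "pmol")      # -> 0.48 pmol
```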

  5. Extended Forward Sensitivity Analysis for Uncertainty Quantification

    SciTech Connect

    Haihua Zhao; Vincent A. Mousseau

    2011-09-01

    Verification and validation (V&V) are playing more important roles in quantifying uncertainties and realizing high-fidelity simulations in engineering system analyses, such as transients in a complex nuclear reactor system. Traditional V&V in reactor system analysis focused more on the validation part or did not differentiate verification and validation. The traditional approach to uncertainty quantification is based on a 'black box' approach: the simulation tool is treated as an unknown signal generator, a distribution of inputs according to assumed probability density functions is sent in, and the distribution of the outputs is measured and correlated back to the original input distribution. The 'black box' method mixes numerical errors with all other uncertainties. It is also inefficient for sensitivity analysis. Contrary to the 'black box' method, a more efficient sensitivity approach can take advantage of intimate knowledge of the simulation code. In these types of approaches, equations for the propagation of uncertainty are constructed and the sensitivities are directly solved for as variables in the simulation. This paper presents forward sensitivity analysis as a method to aid uncertainty quantification. By including the time step and potentially the spatial step as special sensitivity parameters, the forward sensitivity method is extended as one method to quantify numerical errors. Note that by integrating local truncation errors over the whole system through the forward sensitivity analysis process, the generated time step and spatial step sensitivity information reflects global numerical errors. The discretization errors can be systematically compared against uncertainties due to other physical parameters. This extension makes the forward sensitivity method a much more powerful tool for uncertainty quantification. By knowing the relative sensitivity of time and space steps with other interested physical parameters, the simulation is allowed
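
The core construction is easiest to see on a scalar ODE: augment the state with the sensitivity and integrate both together. The decay model below is a textbook stand-in, not the reactor system of the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Forward sensitivity for dy/dt = -p*y: augment the state with s = dy/dp,
# which obeys ds/dt = (df/dy)*s + df/dp = -p*s - y.
p = 0.5

def rhs(t, z):
    y, s = z
    return [-p * y, -p * s - y]

sol = solve_ivp(rhs, [0.0, 4.0], [1.0, 0.0], dense_output=True, rtol=1e-8)
y, s = sol.sol(4.0)
print(s, -4.0 * np.exp(-p * 4.0))   # numeric vs exact sensitivity -t*exp(-p*t)
```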

  6. Uncertainty Quantification Techniques of SCALE/TSUNAMI

    SciTech Connect

    Rearden, Bradley T; Mueller, Don

    2011-01-01

    The Standardized Computer Analysis for Licensing Evaluation (SCALE) code system developed at Oak Ridge National Laboratory (ORNL) includes Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI). The TSUNAMI code suite can quantify the predicted change in system responses, such as k_eff, reactivity differences, or ratios of fluxes or reaction rates, due to changes in the energy-dependent, nuclide-reaction-specific cross-section data. Where uncertainties in the neutron cross-section data are available, the sensitivity of the system to the cross-section data can be applied to propagate the uncertainties in the cross-section data to an uncertainty in the system response. Uncertainty quantification is useful for identifying potential sources of computational biases and highlighting parameters important to code validation. Traditional validation techniques often examine one or more average physical parameters to characterize a system and identify applicable benchmark experiments. However, with TSUNAMI, correlation coefficients are developed by propagating the uncertainties in neutron cross-section data to uncertainties in the computed responses for experiments and safety applications through sensitivity coefficients. The bias in the experiments, as a function of their correlation coefficient with the intended application, is extrapolated to predict the bias and bias uncertainty in the application through trending analysis or generalized linear least squares techniques, often referred to as 'data adjustment.' Even with advanced tools to identify benchmark experiments, analysts occasionally find that the application models include some feature or material for which adequately similar benchmark experiments do not exist to support validation. For example, a criticality safety analyst may want to take credit for the presence of fission products in spent nuclear fuel. In such cases, analysts sometimes rely on 'expert judgment' to select an

  7. Methane bubbling: from speculation to quantification

    NASA Astrophysics Data System (ADS)

    Grinham, A. R.; Dunbabin, M.; Yuan, Z.

    2013-12-01

    magnitude from 500 to 100 000 mg m⁻² d⁻¹ depending on time of day and water depth. Average storage bubble flux rates between reservoirs varied by two orders of magnitude, from 1 200 to 15 000 mg m⁻² d⁻¹, with the primary driver likely to be catchment forest cover. The relative contribution of bubbling to total fluxes varied from 10% to more than 90% depending on the reservoir and time of sampling. This method was consistently shown to greatly improve the spatial mapping and quantification of methane bubbling rates from reservoir surfaces, and it reduces the uncertainty associated with determining the relative contribution of bubbling to total flux.

  8. Next generation of food allergen quantification using mass spectrometric systems.

    PubMed

    Koeberl, Martina; Clarke, Dean; Lopata, Andreas L

    2014-08-01

    Food allergies are increasing worldwide and becoming a public health concern. Food legislation requires detailed declarations of potential allergens in food products and therefore an increased capability to analyze for the presence of food allergens. Currently, antibody-based methods are mainly utilized to quantify allergens; however, these methods have several disadvantages. Recently, mass spectrometry (MS) techniques have been developed and applied to food allergen analysis. At present, 46 allergens from 11 different food sources have been characterized using different MS approaches and some specific signature peptides have been published. However, quantification of allergens using MS is not routinely employed. This review compares the different aspects of food allergen quantification using advanced MS techniques including multiple reaction monitoring. The latter provides low limits of quantification for multiple allergens in simple or complex food matrices, while being robust and reproducible. This review provides an overview of current approaches to analyze food allergens, with specific focus on MS systems and applications. PMID:24824675

  9. Isobaric Labeling-Based Relative Quantification in Shotgun Proteomics

    PubMed Central

    2015-01-01

    Mass spectrometry plays a key role in relative quantitative comparisons of proteins in order to understand their functional role in biological systems upon perturbation. In this review, we examine studies addressing different aspects of isobaric labeling-based relative quantification for shotgun proteomic analysis. In particular, we focus on different types of isobaric reagents and their reaction chemistry (e.g., amine-, carbonyl-, and sulfhydryl-reactive). Various factors, such as ratio compression and reporter ion dynamic range, cause an underestimation of changes in relative abundance of proteins across samples, undermining the ability of the isobaric labeling approach to be truly quantitative. These factors affecting quantification, and the suggested combinations of experimental design and optimal data acquisition methods to increase the precision and accuracy of the measurements, are discussed. Finally, the extended application of the isobaric labeling-based approach in hyperplexing strategies, targeted quantification, and phosphopeptide analysis is also examined. PMID:25337643

  10. Superlattice band structure: New and simple energy quantification condition

    NASA Astrophysics Data System (ADS)

    Maiz, F.

    2014-10-01

    Assuming an approximated effective mass and using Bastard's boundary conditions, a simple method is used to calculate the subband structure of periodic semiconducting heterostructures. Our method consists of deriving and solving the energy quantification condition (EQC), a simple real equation composed of trigonometric and hyperbolic functions that requires no programming effort or sophisticated machinery to solve. For heterostructures of fewer than ten wells, we have derived and simplified the energy quantification conditions. The subband is built point by point; each point represents an energy level. Our simple energy quantification condition is used to calculate the subband structure of GaAs/Ga0.5Al0.5As heterostructures and to build their subbands point by point for 4 and 20 wells. Our findings show good agreement with previously published results.
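
The paper's EQC is not reproduced here; as a stand-in with the same structure (a real equation mixing trigonometric and hyperbolic terms whose zeros are the allowed energies), a Kronig-Penney-type condition can be solved by simple bracketing and root finding. All parameters below are assumed for illustration.

```python
import numpy as np
from scipy.optimize import brentq

# A Kronig-Penney-type dispersion stands in for the paper's EQC; the
# effective-mass parameters and dimensions are assumptions, not GaAs/GaAlAs.
hbar2_2m = 0.0381                  # hbar^2/(2 m0), eV*nm^2
a, b, V0, q = 5.0, 2.0, 0.3, 0.0   # well/barrier widths (nm), height (eV), k=0

def eqc(E):  # allowed energies are the zeros of this real equation
    k = np.sqrt(E / hbar2_2m)
    kap = np.sqrt((V0 - E) / hbar2_2m)
    return (np.cos(k * a) * np.cosh(kap * b)
            + (kap**2 - k**2) / (2 * k * kap)
            * np.sin(k * a) * np.sinh(kap * b)
            - np.cos(q * (a + b)))

Es = np.linspace(1e-4, V0 - 1e-4, 2000)      # bracket sign changes below V0
vals = np.array([eqc(E) for E in Es])
levels = [brentq(eqc, Es[i], Es[i + 1])
          for i in range(len(Es) - 1) if vals[i] * vals[i + 1] < 0]
print(levels)   # subband energies (eV) at this q; repeat over q for E(q)
```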

  11. Symmetry quantification and mapping using convergent beam electron diffraction.

    PubMed

    Kim, Kyou-Hyun; Zuo, Jian-Min

    2013-01-01

    We propose a new algorithm to quantify symmetry recorded in convergent beam electron diffraction (CBED) patterns and use it for symmetry mapping in materials applications. We evaluate the effectiveness of the profile R-factor (R_p) and the normalized cross-correlation coefficient (γ) for quantifying the amount of symmetry in a CBED pattern. The symmetry quantification procedures are automated and the algorithm is implemented as a DM (Digital Micrograph©) script. Experimental and simulated CBED patterns recorded from a Si single crystal are used to calibrate the proposed algorithm for symmetry quantification. The proposed algorithm is then applied to a Si sample with defects to test the sensitivity of symmetry quantification to defects. Using mirror symmetry as an example, we demonstrate that the normalized cross-correlation coefficient provides an effective and robust measurement of the symmetry recorded in experimental CBED patterns. PMID:23142747
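
As an illustration of the correlation measure (not the authors' DM script), the mirror-symmetry case reduces to a normalized cross-correlation between one half of the pattern and the mirrored other half:

```python
import numpy as np

def mirror_ncc(img):
    """Normalized cross-correlation between the left half of a pattern and
    the mirrored right half; gamma near 1 indicates mirror symmetry."""
    h, w = img.shape
    a = img[:, : w // 2].ravel().astype(float)
    b = np.fliplr(img)[:, : w // 2].ravel().astype(float)
    a -= a.mean()
    b -= b.mean()
    return float(a @ b / np.sqrt((a @ a) * (b @ b)))

y, x = np.mgrid[-64:64, -64:64]
sym = np.exp(-(x**2 + y**2) / 500.0)                 # mirror-symmetric pattern
noisy = sym + 0.1 * np.random.default_rng(4).random((128, 128))
print(mirror_ncc(sym), mirror_ncc(noisy))            # 1.0 vs slightly less
```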

  12. A quick colorimetric method for total lipid quantification in microalgae.

    PubMed

    Byreddy, Avinesh R; Gupta, Adarsha; Barrow, Colin J; Puri, Munish

    2016-06-01

    Discovering microalgae with high lipid productivity is among the key milestones for achieving sustainable biodiesel production. Current methods of lipid quantification are time-intensive and costly. A rapid colorimetric method based on the sulfo-phospho-vanillin (SPV) reaction was developed for the quantification of microbial lipids to facilitate screening for lipid-producing microalgae. This method was successfully tested on marine thraustochytrid strains and vegetable oils. The colorimetric method results correlated well with gravimetric method estimates. The new method was less time-consuming than gravimetric analysis and is quantitative for lipid determination, even in the presence of carbohydrates, proteins and glycerol. PMID:27050419

  13. Detection and quantification of chimerism by droplet digital PCR.

    PubMed

    George, David; Czech, Juliann; John, Bobby; Yu, Min; Jennings, Lawrence J

    2013-01-01

    Accurate quantification of chimerism and microchimerism is proving to be increasingly valuable for hematopoietic cell transplantation as well as non-transplant conditions. However, methods that are available to quantify low-level chimerism lack accuracy. Therefore, we developed and validated a method for quantifying chimerism based on digital PCR technology. We demonstrate accurate quantification that far exceeds what is possible with analog qPCR down to 0.01% with the potential to go even lower. Also, this method is inherently more informative than qPCR. We expect the advantages of digital PCR will make it the preferred method for chimerism analysis. PMID:23974275

  14. Quantification of Cellular Proliferation in Mouse Atherosclerotic Lesions.

    PubMed

    Fuster, José J

    2015-01-01

    Excessive cell proliferation within atherosclerotic plaques plays an important role in the progression of atherosclerosis. Macrophage proliferation in particular has become a major focus of attention in the cardiovascular field because it appears to mediate most of the macrophage expansion in mouse atherosclerotic arteries. Therefore, quantification of cell proliferation is an essential part of the characterization of atherosclerotic plaques in experimental studies. This chapter describes two variants of a simple immunostaining protocol that allow for the quantification of cellular proliferation in mouse atherosclerotic lesions based on the detection of the proliferation-associated antigen Ki-67. PMID:26445791

  15. Quantification of toxicological effects for dichloromethane. Draft report (Final)

    SciTech Connect

    Not Available

    1990-04-01

    The source documents for background information used to develop the report on the quantification of toxicological effects for dichloromethane are the health assessment document (HAD) for dichloromethane and a subsequent addendum to the HAD (U.S. EPA, 1985b). In addition, some references published since 1985 are discussed. To summarize the results of the quantification of toxicological effects, a One-day Health Advisory of 10,000 µg/L for a 10-kg child was calculated, based on an acute oral study in rats reported by Kimura et al. (1971). No suitable data for the derivation of a Ten-day Health Advisory were found in the available literature.

  16. Brief review of uncertainty quantification for particle image velocimetry

    NASA Astrophysics Data System (ADS)

    Farias, M. H.; Teixeira, R. S.; Koiller, J.; Santos, A. M.

    2016-07-01

    Metrological studies for particle image velocimetry (PIV) are recent in the literature. Attempts to evaluate the uncertainty quantification (UQ) of PIV velocity fields are in evidence. Therefore, a short review of the main sources of uncertainty in PIV and the available methodologies for their quantification is presented. In addition, the potential of some mathematical techniques from the area of geometric mechanics and control that could interest the fluids UQ community is highlighted. “We must measure what is measurable and make measurable what cannot be measured” (Galileo)

  17. Practical quantification of necrosis in histological whole-slide images.

    PubMed

    Homeyer, André; Schenk, Andrea; Arlt, Janine; Dahmen, Uta; Dirsch, Olaf; Hahn, Horst K

    2013-06-01

    Since the histological quantification of necrosis is a common task in medical research and practice, we evaluate different image analysis methods for quantifying necrosis in whole-slide images. In a practical usage scenario, we assess the impact of different classification algorithms and feature sets on both accuracy and computation time. We show how a well-chosen combination of multiresolution features and an efficient postprocessing step enables the accurate quantification of necrosis in gigapixel images in less than a minute. The results are general enough to be applied to other areas of histological image analysis as well. PMID:23796718

  18. Clinical PET Myocardial Perfusion Imaging and Flow Quantification.

    PubMed

    Juneau, Daniel; Erthal, Fernanda; Ohira, Hiroshi; Mc Ardle, Brian; Hessian, Renée; deKemp, Robert A; Beanlands, Rob S B

    2016-02-01

    Cardiac PET imaging is a powerful tool for the assessment of coronary artery disease. Many tracers with different advantages and disadvantages are available. It has several advantages over single photon emission computed tomography, including superior accuracy and lower radiation exposure. It provides powerful prognostic information, which can help to stratify patients and guide clinicians. The addition of flow quantification enables better detection of multivessel disease while providing incremental prognostic information. Flow quantification provides important physiologic information, which may be useful to individualize patient therapy. This approach is being applied in some centers, but requires standardization before it is more widely applied. PMID:26590781

  19. Quantification is Neither Necessary Nor Sufficient for Measurement

    NASA Astrophysics Data System (ADS)

    Mari, Luca; Maul, Andrew; Torres Irribarra, David; Wilson, Mark

    2013-09-01

    Being an infrastructural, widespread activity, measurement is laden with stereotypes. Some of these concern the role of measurement in the relation between quality and quantity. In particular, it is sometimes argued or assumed that quantification is necessary for measurement; it is also sometimes argued or assumed that quantification is sufficient for or synonymous with measurement. To assess the validity of these positions the concepts of measurement and quantitative evaluation should be independently defined and their relationship analyzed. We contend that the defining characteristic of measurement should be the structure of the process, not a feature of its results. Under this perspective, quantitative evaluation is neither sufficient nor necessary for measurement.

  20. Improved Strategies and Optimization of Calibration Models for Real-time PCR Absolute Quantification

    EPA Science Inventory

    Real-time PCR absolute quantification applications rely on the use of standard curves to make estimates of DNA target concentrations in unknown samples. Traditional absolute quantification approaches dictate that a standard curve must accompany each experimental run. However, t...

  1. Reliability quantification and visualization for electric microgrids

    NASA Astrophysics Data System (ADS)

    Panwar, Mayank

    and parallel with the area Electric Power Systems (EPS), (3) includes the local EPS and may include portions of the area EPS, and (4) is intentionally planned. A more reliable electric power grid requires microgrids to operate in tandem with the EPS. The reliability can be quantified through various metrics for performance measurement. In North America, this is done through North American Electric Reliability Corporation (NERC) metrics. The microgrid differs significantly from the traditional EPS, especially at the asset level, due to heterogeneity in assets. Thus, its performance cannot be quantified by the same metrics as used for the EPS. Some of the NERC metrics are calculated and interpreted in this work to quantify performance for a single asset and for groups of assets in a microgrid. Two more metrics are introduced for system-level performance quantification. The next step is a better representation of the large amount of data generated by the microgrid. Visualization is one such form of representation, which is explored in detail, and a graphical user interface (GUI) is developed as a deliverable tool to the operator for informative decision making and planning. Electronic appendices I and II contain data and MATLAB program code for analysis and visualization for this work.

  2. Colorimetric Quantification and in Situ Detection of Collagen

    ERIC Educational Resources Information Center

    Esteban, Francisco J.; del Moral, Maria L.; Sanchez-Lopez, Ana M.; Blanco, Santos; Jimenez, Ana; Hernandez, Raquel; Pedrosa, Juan A.; Peinado, Maria A.

    2005-01-01

    A simple multidisciplinary and inexpensive laboratory exercise is proposed, in which the undergraduate student may correlate biochemical and anatomical findings. The entire practical session can be completed in one 2.5-3 hour laboratory period, and consists of the quantification of collagen and total protein content from tissue sections--without…

  3. Identification and quantification of methanogenic archaea in adult chicken ceca

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Methanogens, members of the domain Archaea, have been isolated from various animals, but few reports exist regarding the isolation of methanogens from chicken, goose, and turkey feces. By using molecular methods for the identification and quantification of methanogenic archaea in adult chicken ceca,...

  4. Literacy and Language Education: The Quantification of Learning

    ERIC Educational Resources Information Center

    Gibb, Tara

    2015-01-01

    This chapter describes international policy contexts of adult literacy and language assessment and the shift toward standardization through measurement tools. It considers the implications the quantification of learning outcomes has for pedagogy and practice and for the social inclusion of transnational migrants.

  5. Current Issues in the Quantification of Federal Reserved Water Rights

    NASA Astrophysics Data System (ADS)

    Brookshire, David S.; Watts, Gary L.; Merrill, James L.

    1985-11-01

    This paper examines the quantification of federal reserved water rights from legal, institutional, and economic perspectives. Special attention is directed toward Indian reserved water rights and the concept of practicably irrigable acreage. We conclude by examining current trends and exploring alternative approaches to the dilemma of quantifying Indian reserved water rights.

  6. Infectious Viral Quantification of Chikungunya Virus-Virus Plaque Assay.

    PubMed

    Kaur, Parveen; Lee, Regina Ching Hua; Chu, Justin Jang Hann

    2016-01-01

    The plaque assay is an essential method for quantification of infectious virus titer. Cells infected with virus particles are overlaid with a viscous substrate. A suitable incubation period results in the formation of plaques, which can be fixed and stained for visualization. Here, we describe a method for measuring Chikungunya virus (CHIKV) titers via virus plaque assays. PMID:27233264
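
The titer arithmetic behind the assay is a one-liner; a worked sketch with hypothetical counts:

```python
# Titer arithmetic for a plaque assay (counts and dilution are hypothetical):
# PFU/mL = plaques / (dilution factor x inoculated volume in mL)
plaques = 42          # plaques counted on the 10^-5 dilution plate
dilution = 1e-5
volume_ml = 0.1       # inoculum volume per well
titer = plaques / (dilution * volume_ml)
print(f"{titer:.2e} PFU/mL")   # 4.20e+07 PFU/mL
```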

  7. Comparison of DNA Quantification Methods for Next Generation Sequencing

    PubMed Central

    Robin, Jérôme D.; Ludlow, Andrew T.; LaRanger, Ryan; Wright, Woodring E.; Shay, Jerry W.

    2016-01-01

    Next Generation Sequencing (NGS) is a powerful tool that depends on loading a precise amount of DNA onto a flowcell. NGS strategies have expanded our ability to investigate genomic phenomena by referencing mutations in cancer and diseases through large-scale genotyping, developing methods to map rare chromatin interactions (4C; 5C and Hi-C) and identifying chromatin features associated with regulatory elements (ChIP-seq, Bis-Seq, ChiA-PET). While many methods are available for DNA library quantification, there is no unambiguous gold standard. Most techniques use PCR to amplify DNA libraries to obtain sufficient quantities for optical density measurement. However, increased PCR cycles can distort the library’s heterogeneity and prevent the detection of rare variants. In this analysis, we compared new digital PCR technologies (droplet digital PCR; ddPCR, ddPCR-Tail) with standard methods for the titration of NGS libraries. DdPCR-Tail is comparable to qPCR and fluorometry (QuBit) and allows sensitive quantification by analysis of barcode repartition after sequencing of multiplexed samples. This study provides a direct comparison between quantification methods throughout a complete sequencing experiment and provides the impetus to use ddPCR-based quantification for improvement of NGS quality. PMID:27048884

  8. DeMix-Q: Quantification-Centered Data Processing Workflow.

    PubMed

    Zhang, Bo; Käll, Lukas; Zubarev, Roman A

    2016-04-01

    For historical reasons, most proteomics workflows focus on MS/MS identification but consider quantification as the end point of a comparative study. The stochastic data-dependent MS/MS acquisition (DDA) gives low reproducibility of peptide identifications from one run to another, which inevitably results in problems with missing values when quantifying the same peptide across a series of label-free experiments. However, the signal from the molecular ion is almost always present among the MS1 spectra. Contrary to what is frequently claimed, missing values do not have to be an intrinsic problem of DDA approaches that perform quantification at the MS1 level. The challenge is to perform sound peptide identity propagation across multiple high-resolution LC-MS/MS experiments, from runs with MS/MS-based identifications to runs where such information is absent. Here, we present a new analytical workflow DeMix-Q (https://github.com/userbz/DeMix-Q), which performs such propagation that recovers missing values reliably by using a novel scoring scheme for quality control. Compared with traditional workflows for DDA as well as previous DIA studies, DeMix-Q achieves deeper proteome coverage, fewer missing values, and lower quantification variance on a benchmark dataset. This quantification-centered workflow also enables flexible and robust proteome characterization based on covariation of peptide abundances. PMID:26729709

  9. Quantification and Single-Spore Detection of Phakopsora pachyrhizi

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The microscopic identification and quantification of Phakopsora pachyrhizi spores from environmental samples, spore traps, and laboratory specimens can represent a challenge. Such reports, especially from passive spore traps, commonly describe the number of “rust-like” spores; for other forensic sa...

  10. The Role of Uncertainty Quantification for Reactor Physics

    SciTech Connect

    Salvatores, M.; Aliberti, G.; Palmiotti, G.

    2015-01-15

    The quantification of uncertainties is a crucial step in design. Comparing a priori uncertainties with the target accuracies makes it possible to define needs and priorities for uncertainty reduction. In view of their impact, uncertainty analysis requires a reliability assessment of the uncertainty data used. The choice of the appropriate approach and the consistency of different approaches are discussed.

  11. Identification and Quantification Soil Redoximorphic Features by Digital Image Processing

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Soil redoximorphic features (SRFs) have provided scientists and land managers with insight into relative soil moisture for approximately 60 years. The overall objective of this study was to develop a new method of SRF identification and quantification from soil cores using a digital camera and imag...

  12. Macroscopic inspection of ape feces: what's in a quantification method?

    PubMed

    Phillips, Caroline A; McGrew, William C

    2014-06-01

    Macroscopic inspection of feces has been used to investigate primate diet. The limitations of this method for identifying food items to species level have long been recognized, but ascertaining aspects of diet (e.g., folivory) is achievable by quantifying food items in feces. Quantification methods applied include rating food items using a scale of abundance, estimating their percentage volume, and weighing food items. However, whether composition data differ depending on which quantification method is used during macroscopic inspection has not been verified. We analyzed feces collected from ten adult chimpanzees (Pan troglodytes schweinfurthii) of the Kanyawara community in Kibale National Park, Uganda. We compare dietary composition totals obtained using different quantification methods and ascertain whether sieve mesh size influences the totals calculated. Finally, this study validates findings from direct observation of feeding by the same individuals from whom the fecal samples had been collected. Contrasting diet composition totals obtained using different quantification methods and sieve mesh sizes can influence folivory and frugivory estimates. However, our findings were based on the assumption that fibrous matter contained pith and leaf fragments only, which remains to be verified. We advocate that macroscopic inspection of feces can be a valuable tool to provide a generalized overview of dietary composition for primate populations. As most populations remain unhabituated, scrutinizing and validating indirect measures are important if they are to be applied to further understand inter- and intra-species dietary variation. PMID:24482001

  13. Comparison of DNA Quantification Methods for Next Generation Sequencing.

    PubMed

    Robin, Jérôme D; Ludlow, Andrew T; LaRanger, Ryan; Wright, Woodring E; Shay, Jerry W

    2016-01-01

    Next Generation Sequencing (NGS) is a powerful tool that depends on loading a precise amount of DNA onto a flowcell. NGS strategies have expanded our ability to investigate genomic phenomena by referencing mutations in cancer and diseases through large-scale genotyping, developing methods to map rare chromatin interactions (4C; 5C and Hi-C) and identifying chromatin features associated with regulatory elements (ChIP-seq, Bis-Seq, ChiA-PET). While many methods are available for DNA library quantification, there is no unambiguous gold standard. Most techniques use PCR to amplify DNA libraries to obtain sufficient quantities for optical density measurement. However, increased PCR cycles can distort the library's heterogeneity and prevent the detection of rare variants. In this analysis, we compared new digital PCR technologies (droplet digital PCR; ddPCR, ddPCR-Tail) with standard methods for the titration of NGS libraries. DdPCR-Tail is comparable to qPCR and fluorometry (QuBit) and allows sensitive quantification by analysis of barcode repartition after sequencing of multiplexed samples. This study provides a direct comparison between quantification methods throughout a complete sequencing experiment and provides the impetus to use ddPCR-based quantification for improvement of NGS quality. PMID:27048884

  14. Quantification of Wheat Grain Arabinoxylans Using a Phloroglucinol Colorimetric Assay

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Arabinoxylans (AX) play a critical role in end-use quality and nutrition of wheat (Triticum aestivum L.). An efficient, accurate method of AX quantification is desirable as AX plays an important role in processing, end use quality and human health. The objective of this work was to evaluate a stand...

  15. THE QUANTIFICATION OF FUNCTIONAL LOAD--A LINGUISTIC PROBLEM.

    ERIC Educational Resources Information Center

    HOCKETT, C.F.

    MEASUREMENT CRITERIA ARE DEVELOPED FOR THE QUANTIFICATION OF THE FUNCTIONAL LOAD OF THE PHONEMES OF A LANGUAGE. THE CONCEPT OF FUNCTIONAL LOAD OR YIELD, FROM CERTAIN THEORIES OF LINGUISTIC CHANGE, STATES THAT SOME CONTRASTS BETWEEN THE DISTINCTIVE SOUNDS OF A LANGUAGE DO MORE WORK THAN OTHERS BY OCCURRING MORE FREQUENTLY AND IN MORE LINGUISTIC…

  16. Juvenile Hormone Extraction, Purification, and Quantification in Ants

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Juvenile hormone (JH) is an important insect hormone known to have many effects on development, reproduction, and behavior in both solitary and social insects. A number of questions using ants as a model involve JH. This procedure allows for quantification of circulating levels of JH III, which can ...

  17. 15 CFR 990.52 - Injury assessment-quantification.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 15 Commerce and Foreign Trade 3 2014-01-01 2014-01-01 false Injury assessment-quantification. 990.52 Section 990.52 Commerce and Foreign Trade Regulations Relating to Commerce and Foreign Trade (Continued) NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE OIL POLLUTION ACT REGULATIONS NATURAL RESOURCE DAMAGE...

  18. Quantification of confocal images of biofilms grown on irregular surfaces.

    PubMed

    Sommerfeld Ross, Stacy; Tu, Mai Han; Falsetta, Megan L; Ketterer, Margaret R; Kiedrowski, Megan R; Horswill, Alexander R; Apicella, Michael A; Reinhardt, Joseph M; Fiegel, Jennifer

    2014-05-01

    Bacterial biofilms grow on many types of surfaces, including flat surfaces such as glass and metal and irregular surfaces such as rocks, biological tissues and polymers. While laser scanning confocal microscopy can provide high-resolution images of biofilms grown on any surface, quantification of biofilm-associated bacteria is currently limited to bacteria grown on flat surfaces. This can limit researchers studying irregular surfaces to qualitative analysis or quantification of only the total bacteria in an image. In this work, we introduce a new algorithm called modified connected volume filtration (MCVF) to quantify bacteria grown on top of an irregular surface that is fluorescently labeled or reflective. Using the MCVF algorithm, two new quantification parameters are introduced. The modified substratum coverage parameter enables quantification of the connected-biofilm bacteria on top of the surface and on the imaging substratum. The utility of MCVF and the modified substratum coverage parameter were shown with Pseudomonas aeruginosa and Staphylococcus aureus biofilms grown on human airway epithelial cells. A second parameter, the percent association, provides quantified data on the colocalization of the bacteria with a labeled component, including bacteria within a labeled tissue. The utility of quantifying the bacteria associated with the cell cytoplasm was demonstrated with Neisseria gonorrhoeae biofilms grown on cervical epithelial cells. This algorithm provides more flexibility and quantitative ability to researchers studying biofilms grown on a variety of irregular substrata. PMID:24632515
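
    The core idea, keeping only biomass connected to the (possibly irregular) surface, can be sketched with a 3D connected-component pass. This is a simplified stand-in for MCVF, not the published algorithm; it assumes boolean voxel masks for the bacteria and the surface:

        import numpy as np
        from scipy import ndimage

        def connected_biofilm(bacteria, surface):
            """Voxels of `bacteria` whose connected component touches the
            `surface` mask (both boolean 3D arrays, z-y-x order)."""
            labels, _ = ndimage.label(bacteria)
            near_surface = ndimage.binary_dilation(surface) & bacteria
            touching = np.unique(labels[near_surface])
            touching = touching[touching > 0]
            return np.isin(labels, touching)

        # a coverage metric can then be computed from the returned mask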

  19. A Quantification Approach to Popular American Theatre: Outline.

    ERIC Educational Resources Information Center

    Woods, Alan

    A previously relatively unexplored area of theater history studies is the quantification of titles, authors, and locations of productions of plays in Canada and the United States. Little is known, for example, about the number of times any one play was staged, especially in the earlier days of American drama. A project which counts productions on…

  1. Statistical challenges in the quantification of gunshot residue evidence.

    PubMed

    Gauriot, Romain; Gunaratnam, Lawrence; Moroni, Rossana; Reinikainen, Tapani; Corander, Jukka

    2013-09-01

    The discharging of a gun results in the formation of extremely small particles known as gunshot residues (GSR). These may be deposited on the skin and clothing of the shooter, on other persons present, and on nearby items or surfaces. Several factors and their complex interactions affect the number of detectable GSR particles, which can deeply influence the conclusions drawn from likelihood ratios or posterior probabilities for prosecution hypotheses of interest. We present Bayesian network models for casework examples and demonstrate that probabilistic quantification of GSR evidence can be very sensitive to the assumptions concerning the model structure, prior probabilities, and the likelihood components. This finding has considerable implications for the use of statistical quantification of GSR evidence in the legal process. PMID:23822522
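
    The sensitivity the authors describe is easy to reproduce with even the simplest probabilistic model. A toy likelihood ratio for a detected GSR particle count under Poisson assumptions; the means below are invented for illustration, not casework values:

        from math import exp, factorial

        def poisson_pmf(n, mu):
            return mu ** n * exp(-mu) / factorial(n)

        def likelihood_ratio(n, mu_prosecution, mu_defence):
            """LR for observing n particles under the two hypotheses."""
            return poisson_pmf(n, mu_prosecution) / poisson_pmf(n, mu_defence)

        # Small changes to the assumed means move the LR by orders of
        # magnitude, which is the paper's cautionary point.
        print(likelihood_ratio(5, mu_prosecution=8.0, mu_defence=0.5))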

  2. Direct immunomagnetic quantification of lymphocyte subsets in blood.

    PubMed Central

    Brinchmann, J E; Vartdal, F; Gaudernack, G; Markussen, G; Funderud, S; Ugelstad, J; Thorsby, E

    1988-01-01

    A method is described where superparamagnetic polymer microspheres coated with monoclonal antibodies (MoAb) are used for the direct and fast quantification of the absolute number of cells of various lymphocyte subsets in blood. Blood samples were incubated with microspheres coated with a subset-specific MoAb. Using a magnet, the microsphere-rosetted cells were isolated and washed. Following lysis of the cell walls to detach the microspheres, the cell nuclei were stained with acridine orange and counted in a haemocytometer using an immunofluorescence microscope. With MoAb specific for CD2, CD4, CD8 and CD19, reproducible absolute counts of the corresponding lymphocyte subsets were obtained which correlated closely with those obtained by an indirect quantification method. PMID:3349645

  3. Luminometric Label Array for Quantification and Identification of Metal Ions.

    PubMed

    Pihlasalo, Sari; Montoya Perez, Ileana; Hollo, Niklas; Hokkanen, Elina; Pahikkala, Tapio; Härmä, Harri

    2016-05-17

    Quantification and identification of metal ions have gained interest in drinking water and environmental analyses. We have developed a novel label array method for the quantification and identification of metal ions in drinking water. This simple ready-to-go method is based on the nonspecific interactions of multiple unstable lanthanide chelates and nonantenna ligands with the sample, leading to a luminescence signal profile unique to the sample components. A limit of detection at the ppb concentration level and an average coefficient of variation of 10% were achieved with the developed label array. The identification of 15 different metal ions, including the different oxidation states Cr(3+)/Cr(6+), Cu(+)/Cu(2+), Fe(2+)/Fe(3+), and Pb(2+)/Pb(4+), was demonstrated. Moreover, a binary mixture of Cu(2+) and Fe(3+) and a ternary mixture of Cd(2+), Ni(2+), and Pb(2+) were measured and the individual ions were distinguished. PMID:27086705

  4. Quantification of viable helminth eggs in samples of sewage sludge.

    PubMed

    Rocha, Maria Carolina Vieira da; Barés, Monica Eboly; Braga, Maria Cristina Borba

    2016-10-15

    For the application of sewage sludge as fertilizer, the absence of pathogenic organisms, such as viable helminth eggs, is of fundamental importance. The quantification of these organisms must therefore be carried out using reliable and accurate methodologies. Nevertheless, to date there is no consensus regarding the adoption of a universal methodology for the detection and quantification of viable helminth eggs. It is therefore necessary to instigate a debate on the different protocols currently in use, and to assemble relevant information to assist in the development of a more comprehensive and accurate method for quantifying viable helminth eggs in samples of sewage sludge and its derivatives. PMID:27470467

  5. Uncertainty Quantification and Validation for RANS Turbulence Models

    NASA Astrophysics Data System (ADS)

    Oliver, Todd; Moser, Robert

    2011-11-01

    Uncertainty quantification and validation procedures for RANS turbulence models are developed and applied. The procedures used here rely on a Bayesian view of probability. In particular, the uncertainty quantification methodology requires stochastic model development, model calibration, and model comparison, all of which are pursued using tools from Bayesian statistics. Model validation is also pursued in a probabilistic framework. The ideas and processes are demonstrated on a channel flow example. Specifically, a set of RANS models--including Baldwin-Lomax, Spalart-Allmaras, k-ɛ, k-ω, and v2-f--and uncertainty representations are analyzed using DNS data for fully-developed channel flow. Predictions of various quantities of interest and the validity (or invalidity) of the various models for making those predictions will be examined. This work is supported by the Department of Energy [National Nuclear Security Administration] under Award Number [DE-FC52-08NA28615].

  6. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    SciTech Connect

    Langenbrunner, James R.; Booker, Jane M; Hemez, Francois M; Salazar, Issac F; Ross, Timothy J

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. To realistically assess uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore PRAs lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with words, 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step for the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.

  7. Compositional Solution Space Quantification for Probabilistic Software Analysis

    NASA Technical Reports Server (NTRS)

    Borges, Mateus; Pasareanu, Corina S.; Filieri, Antonio; d'Amorim, Marcelo; Visser, Willem

    2014-01-01

    Probabilistic software analysis aims at quantifying how likely a target event is to occur during program execution. Current approaches rely on symbolic execution to identify the conditions to reach the target event and try to quantify the fraction of the input domain satisfying these conditions. Precise quantification is usually limited to linear constraints, while only approximate solutions can be provided in general through statistical approaches. However, statistical approaches may fail to converge to an acceptable accuracy within a reasonable time. We present a compositional statistical approach for the efficient quantification of solution spaces for arbitrarily complex constraints over bounded floating-point domains. The approach leverages interval constraint propagation to improve the accuracy of the estimation by focusing the sampling on the regions of the input domain containing the sought solutions. Preliminary experiments show significant improvement on previous approaches both in results accuracy and analysis time.
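
    At its simplest, quantifying a solution space means estimating the fraction of a bounded input box that satisfies a constraint. The sketch below does plain Monte Carlo over floating-point boxes, without the interval constraint propagation that gives the paper's approach its efficiency; all names are illustrative:

        import random

        def satisfying_fraction(constraint, box, n=100_000, seed=0):
            """Estimate P(constraint) for inputs uniform over `box`,
            a list of (low, high) bounds per dimension."""
            rng = random.Random(seed)
            hits = sum(
                constraint([rng.uniform(lo, hi) for lo, hi in box])
                for _ in range(n)
            )
            return hits / n

        # fraction of [-1,1]^2 inside the unit disc, approximately pi/4
        print(satisfying_fraction(lambda v: v[0]**2 + v[1]**2 <= 1.0,
                                  [(-1.0, 1.0), (-1.0, 1.0)]))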

  8. Universal Quantification in a Constraint-Based Planner

    NASA Technical Reports Server (NTRS)

    Golden, Keith; Frank, Jeremy; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Constraints and universal quantification are both useful in planning, but handling universally quantified constraints presents some novel challenges. We present a general approach to proving the validity of universally quantified constraints. The approach essentially consists of checking that the constraint is not violated for all members of the universe. We show that this approach can sometimes be applied even when variable domains are infinite, and we present some useful special cases where this can be done efficiently.
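
    The validity check described, that the constraint is violated by no member of the universe, reduces to an exhaustive search for a counterexample when the universe is finite. A minimal illustration; the names are ours, not the planner's:

        def find_violation(constraint, universe):
            """Return a witness violating the universally quantified
            constraint, or None if it holds for every member."""
            for x in universe:
                if not constraint(x):
                    return x
            return None

        # forall x in 1..9: x*x <= 81 -> no violation, prints None
        print(find_violation(lambda x: x * x <= 81, range(1, 10)))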

  9. Quantification of Water Absorption and Transport in Parchment

    NASA Astrophysics Data System (ADS)

    Herringer, Susan N.; Bilheux, Hassina Z.; Bearman, Greg

    Neutron radiography was utilized to quantify water absorption and desorption in parchment at the High Flux Isotope Reactor CG-1D imaging facility at Oak Ridge National Laboratory (ORNL). Sequential 60s radiographs of sections of a 15th century parchment were taken as the parchment underwent wetting and drying cycles. This provided time-resolved visualization and quantification of water absorption and transport in parchment.

  10. Incorporating Functional Gene Quantification into Traditional Decomposition Models

    NASA Astrophysics Data System (ADS)

    Todd-Brown, K. E.; Zhou, J.; Yin, H.; Wu, L.; Tiedje, J. M.; Schuur, E. A. G.; Konstantinidis, K.; Luo, Y.

    2014-12-01

    Incorporating new genetic quantification measurements into traditional substrate pool models represents a substantial challenge. These decomposition models are built around the idea that substrate availability, together with environmental drivers, limits carbon dioxide respiration rates. In this paradigm, microbial communities optimally adapt to a given substrate and environment on much shorter time scales than the carbon flux of interest. By characterizing the relative shift in biomass of these microbial communities, we informed previously poorly constrained parameters in traditional decomposition models. In this study we coupled a 9-month laboratory incubation study with quantitative gene measurements, traditional CO2 flux measurements, and initial soil organic carbon quantification. GeoChip 5.0 was used to quantify the functional genes associated with carbon cycling at 2 weeks, 3 months and 9 months. We then combined the genes which 'collapsed' over the experiment and assumed that this tracked the relative change in the biomass associated with the 'fast' pool. We further assumed that this biomass was proportional to the 'fast' SOC pool and were thus able to constrain the relative change in the fast SOC pool in our 3-pool decomposition model. We found that the biomass quantification described above, combined with traditional CO2 flux and SOC measurements, improves the transfer coefficient estimation in traditional decomposition models. Transfer coefficients are very difficult to characterize using traditional CO2 flux measurements, so DNA quantification provides new and significant information about the system. Over a 100-year simulation, these new biologically informed parameters resulted in an additional 10% of SOC loss over the traditionally informed parameters.
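
    For readers unfamiliar with the model class: a pool model is a small linear ODE system in which each pool decays first-order and transfer coefficients route part of the decayed carbon into other pools. A generic three-pool sketch; the rates and transfers below are invented for illustration, not the study's fitted values:

        import numpy as np

        def simulate_pools(c0, k, T, years, dt=0.01):
            """First-order pool model: dC/dt = T @ (k*C) - k*C.
            k[i] is the decay rate of pool i (1/yr); T[i, j] is the
            fraction of carbon leaving pool j that enters pool i (the
            transfer coefficients the gene data help constrain).
            Returns pool sizes after `years` of simple Euler stepping."""
            c, k, T = np.array(c0, float), np.asarray(k, float), np.asarray(T, float)
            for _ in range(int(years / dt)):
                decay = k * c
                c += dt * (T @ decay - decay)
            return c

        # fast, slow, passive pools over a century
        print(simulate_pools([10.0, 40.0, 50.0],
                             [2.0, 0.1, 0.002],
                             [[0.0, 0.0, 0.0],
                              [0.3, 0.0, 0.0],
                              [0.0, 0.05, 0.0]],
                             years=100))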

  11. Pulsatility of Hypothalamo-Pituitary Hormones: A Challenge in Quantification.

    PubMed

    Keenan, Daniel M; Veldhuis, Johannes D

    2016-01-01

    Neuroendocrine systems control many of the most fundamental physiological processes, e.g., reproduction, growth, adaptations to stress, and metabolism. Each such system involves the hypothalamus, the pituitary, and a specific target gland or organ. In the quantification of the interactions among these components, biostatistical modeling has played an important role. In the present article, five key challenges to an understanding of the interactions of these systems are illustrated and discussed critically. PMID:26674550

  12. Diagnostic utility of droplet digital PCR for HIV reservoir quantification.

    PubMed

    Trypsteen, Wim; Kiselinova, Maja; Vandekerckhove, Linos; De Spiegelaere, Ward

    2016-01-01

    Quantitative real-time PCR (qPCR) is implemented in many molecular laboratories worldwide for the quantification of viral nucleic acids. However, over the last two decades, there has been renewed interest in the concept of digital PCR (dPCR) as this platform offers direct quantification without the need for standard curves, a simplified workflow and the possibility to extend the current detection limit. These benefits are of great interest in terms of the quantification of low viral levels in HIV reservoir research because changes in the dynamics of residual HIV reservoirs will be important to monitor HIV cure efforts. Here, we have implemented a systematic literature screening and text mining approach to map the use of droplet dPCR (ddPCR) in the context of HIV quantification. In addition, several technical aspects of ddPCR were compared with qPCR: accuracy, sensitivity, precision and reproducibility, to determine its diagnostic utility. We have observed that ddPCR was used in different body compartments in multiple HIV-1 and HIV-2 assays, with the majority of reported assays focusing on HIV-1 DNA-based applications (i.e. total HIV DNA). Furthermore, ddPCR showed a higher accuracy, precision and reproducibility, but similar sensitivity when compared to qPCR due to reported false positive droplets in the negative template controls with a need for standardised data analysis (i.e. threshold determination). In the context of a low level of detection and HIV reservoir diagnostics, ddPCR can offer a valid alternative to qPCR-based assays but before this platform can be clinically accredited, some remaining issues need to be resolved. PMID:27482456
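
    The "direct quantification without standard curves" rests on Poisson statistics over partitions. A sketch of the standard calculation, assuming the roughly 0.85 nL droplet volume typical of common ddPCR instruments (not a universal constant):

        from math import log

        def ddpcr_copies_per_ul(n_positive, n_total, droplet_ul=0.00085):
            """Concentration from the fraction p of positive droplets:
            mean copies per droplet = -ln(1 - p), divided by the droplet
            volume in microliters."""
            p = n_positive / n_total
            return -log(1.0 - p) / droplet_ul

        # 4500 of 15000 droplets positive -> ~420 copies/uL of reaction
        print(ddpcr_copies_per_ul(4500, 15000))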

  13. Neutron-encoded protein quantification by peptide carbamylation.

    PubMed

    Ulbrich, Arne; Merrill, Anna E; Hebert, Alexander S; Westphall, Michael S; Keller, Mark P; Attie, Alan D; Coon, Joshua J

    2014-01-01

    We describe a chemical tag for duplex proteome quantification using neutron encoding (NeuCode). The method utilizes the straightforward, efficient, and inexpensive carbamylation reaction. We demonstrate the utility of NeuCode carbamylation by accurately measuring quantitative ratios from tagged yeast lysates mixed in known ratios and by applying this method to quantify differential protein expression in mice fed either a control or a high-fat diet. PMID:24178922

  14. Neutron-encoded protein quantification by peptide carbamylation

    PubMed Central

    Ulbrich, Arne; Merrill, Anna E.; Hebert, Alexander S.; Westphall, Michael S.; Keller, Mark P.; Attie, Alan D.; Coon, Joshua J.

    2013-01-01

    We describe a chemical tag for duplex proteome quantification using neutron encoding (NeuCode). The method utilizes the straightforward, efficient, and inexpensive carbamylation reaction. We demonstrate the utility of NeuCode carbamylation by accurately measuring quantitative ratios from tagged yeast lysates mixed in known ratios and by applying this method to quantify differential protein expression in mice fed either a control or a high-fat diet. PMID:24178922

  15. Near-optimal probabilistic RNA-seq quantification.

    PubMed

    Bray, Nicolas L; Pimentel, Harold; Melsted, Páll; Pachter, Lior

    2016-05-01

    We present kallisto, an RNA-seq quantification program that is two orders of magnitude faster than previous approaches and achieves similar accuracy. Kallisto pseudoaligns reads to a reference, producing a list of transcripts that are compatible with each read while avoiding alignment of individual bases. We use kallisto to analyze 30 million unaligned paired-end RNA-seq reads in <10 min on a standard laptop computer. This removes a major computational bottleneck in RNA-seq analysis. PMID:27043002
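
    Pseudoalignment reduces each read to an equivalence class of compatible transcripts, after which abundances are typically estimated by expectation-maximization. The toy EM below operates on such classes and ignores effective transcript lengths, so it illustrates the estimation idea rather than reproducing kallisto itself:

        import numpy as np

        def em_abundance(classes, counts, n_transcripts, iters=200):
            """classes[i]: list of transcript indices compatible with
            equivalence class i; counts[i]: its read count."""
            alpha = np.full(n_transcripts, 1.0 / n_transcripts)
            for _ in range(iters):
                new = np.zeros(n_transcripts)
                for cls, n in zip(classes, counts):
                    w = alpha[cls]
                    new[cls] += n * w / w.sum()   # E-step: split reads
                alpha = new / new.sum()           # M-step: renormalize
            return alpha

        # three transcripts; reads compatible with {0}, {0,1}, {1,2}
        print(em_abundance([[0], [0, 1], [1, 2]], [50, 30, 20], 3))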

  16. Visualization and Quantification of Rotor Tip Vortices in Helicopter Flows

    NASA Technical Reports Server (NTRS)

    Kao, David L.; Ahmad, Jasim U.; Holst, Terry L.

    2015-01-01

    This paper presents an automated approach for effective extraction, visualization, and quantification of vortex core radii from the Navier-Stokes simulations of a UH-60A rotor in forward flight. We adopt a scaled Q-criterion to determine vortex regions and then perform vortex core profiling in these regions to calculate vortex core radii. This method provides an efficient way of visualizing and quantifying the blade tip vortices. Moreover, the vortex core radii are displayed graphically in a plane.
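
    The Q-criterion marks vortex regions where rotation dominates strain, Q = 0.5*(||Omega||^2 - ||S||^2) > 0, computed from the velocity-gradient tensor. A sketch for velocity components on a regular grid; the scaling threshold is left to the caller, and this is not the paper's code:

        import numpy as np

        def q_criterion(u, v, w, dx, dy, dz):
            """Q field from 3D velocity components on a regular grid,
            using central-difference gradients."""
            grads = (np.gradient(u, dx, dy, dz)
                     + np.gradient(v, dx, dy, dz)
                     + np.gradient(w, dx, dy, dz))
            J = np.array(grads).reshape(3, 3, *u.shape)  # J[i,j] = du_i/dx_j
            Jt = J.transpose(1, 0, 2, 3, 4)
            S = 0.5 * (J + Jt)   # strain-rate tensor
            O = 0.5 * (J - Jt)   # rotation-rate tensor
            return 0.5 * ((O ** 2).sum((0, 1)) - (S ** 2).sum((0, 1)))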

  17. Neutron-Encoded Protein Quantification by Peptide Carbamylation

    NASA Astrophysics Data System (ADS)

    Ulbrich, Arne; Merrill, Anna E.; Hebert, Alexander S.; Westphall, Michael S.; Keller, Mark P.; Attie, Alan D.; Coon, Joshua J.

    2014-01-01

    We describe a chemical tag for duplex proteome quantification using neutron encoding (NeuCode). The method utilizes the straightforward, efficient, and inexpensive carbamylation reaction. We demonstrate the utility of NeuCode carbamylation by accurately measuring quantitative ratios from tagged yeast lysates mixed in known ratios and by applying this method to quantify differential protein expression in mice fed either a control or a high-fat diet.

  18. The Challenges of Credible Thermal Protection System Reliability Quantification

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.

    2013-01-01

    The paper discusses several of the challenges associated with developing a credible reliability estimate for a human-rated crew capsule thermal protection system. The process of developing such a credible estimate is subject to the quantification, modeling and propagation of numerous uncertainties within a probabilistic analysis. The development of specific investment recommendations, to improve the reliability prediction, among various potential testing and programmatic options is then accomplished through Bayesian analysis.

  19. Quantification of Efficiency of Beneficiation of Lunar Regolith

    NASA Technical Reports Server (NTRS)

    Trigwell, Steve; Lane, John; Captain, James; Weis, Kyle; Quinn, Jacqueline; Watanabe, Fumiya

    2011-01-01

    Electrostatic beneficiation of lunar regolith is being researched at Kennedy Space Center to enhance the ilmenite concentration of the regolith for the production of oxygen in in-situ resource utilization on the lunar surface. Ilmenite enrichment of up to 200% was achieved using lunar simulants. For the most accurate quantification of the regolith particles, standard petrographic methods are typically followed, but in order to optimize the process, many hundreds of samples were generated in this study, which made the standard analysis methods time prohibitive. In the current studies, X-ray photoelectron spectroscopy (XPS) and scanning electron microscopy/energy dispersive spectroscopy (SEM/EDS) were used, which could automatically and quickly analyze many separated fractions of lunar simulant. In order to test the accuracy of the quantification, test mixture samples of known quantities of ilmenite (2, 5, 10, and 20 wt%) in silica (pure quartz powder) were analyzed by XPS and EDS. The results showed that quantification of low concentrations of ilmenite in silica could be accurately achieved by both XPS and EDS, knowing the limitations of the techniques.

  20. Radio-frequency energy quantification in magnetic resonance imaging

    NASA Astrophysics Data System (ADS)

    Alon, Leeor

    Mapping of radio frequency (RF) energy deposition has been challenging for 50+ years, especially when scanning patients in the magnetic resonance imaging (MRI) environment. As a result, electromagnetic simulation software is often used for estimating the specific absorption rate (SAR), the rate of RF energy deposition in tissue. This thesis work presents challenges associated with aligning information provided by electromagnetic simulation and MRI experiments. Because of the limitations of simulations, experimental methods for the quantification of SAR were established. A system for quantification of the total RF energy deposition was developed for parallel transmit MRI (a system that uses multiple antennas to excite and image the body). The system is capable of monitoring and predicting channel-by-channel RF energy deposition and whole-body SAR, and of tracking potential hardware failures that occur in the transmit chain and may cause the deposition of excessive energy into patients. Similarly, we demonstrated that local RF power deposition can be mapped and predicted for parallel transmit systems based on a series of MRI temperature mapping acquisitions. As part of this work, we developed tools for optimal reconstruction of temperature maps from MRI acquisitions. The tools developed for temperature mapping paved the way for utilizing MRI as a diagnostic tool for evaluating the safety of RF/microwave-emitting devices. Quantification of the RF energy was demonstrated for both MRI-compatible and non-MRI-compatible devices (such as cell phones), with the advantages of being noninvasive and of providing millimeter resolution and high accuracy.

  1. In vivo behavior of NTBI revealed by automated quantification system.

    PubMed

    Ito, Satoshi; Ikuta, Katsuya; Kato, Daisuke; Lynda, Addo; Shibusa, Kotoe; Niizeki, Noriyasu; Toki, Yasumichi; Hatayama, Mayumi; Yamamoto, Masayo; Shindo, Motohiro; Iizuka, Naomi; Kohgo, Yutaka; Fujiya, Mikihiro

    2016-08-01

    Non-transferrin-bound iron (NTBI), which appears in serum in iron overload, is thought to contribute to organ damage; the monitoring of serum NTBI levels may therefore be clinically useful in iron-overloaded patients. However, NTBI quantification methods remain complex, limiting their use in clinical practice. To overcome the technical difficulties often encountered, we recently developed a novel automated NTBI quantification system capable of measuring large numbers of samples. In the present study, we investigated the in vivo behavior of NTBI in human and animal serum using this newly established automated system. Average NTBI in healthy volunteers was 0.44 ± 0.076 μM (median 0.45 μM, range 0.28-0.66 μM), with no significant difference between sexes. Additionally, serum NTBI rapidly increased after iron loading, followed by a sudden disappearance. NTBI levels also decreased in inflammation. The results indicate that NTBI is a unique marker of iron metabolism, behaving unlike other such markers, for example serum ferritin. Our new automated NTBI quantification method may help to reveal the clinical significance of NTBI and contribute to our understanding of iron overload. PMID:27086349

  2. Accurate mass spectrometry based protein quantification via shared peptides.

    PubMed

    Dost, Banu; Bandeira, Nuno; Li, Xiangqian; Shen, Zhouxin; Briggs, Steven P; Bafna, Vineet

    2012-04-01

    In mass spectrometry-based protein quantification, peptides that are shared across different protein sequences are often discarded as being uninformative with respect to each of the parent proteins. We investigate the use of shared peptides, which are ubiquitous (~50% of peptides) in mass spectrometric data sets, for accurate protein identification and quantification. Different from existing approaches, we show how shared peptides can help compute the relative amounts of the proteins that contain them. Also, proteins with no unique peptide in the sample can still be analyzed for relative abundance. Our article uses shared peptides in protein quantification and makes use of combinatorial optimization to reduce the error in relative abundance measurements. We describe the topological and numerical properties required for robust estimates, and use them to improve our estimates for ill-conditioned systems. Extensive simulations validate our approach even in the presence of experimental error. We apply our method to a model of Arabidopsis thaliana root knot nematode infection, and investigate the differential role of several protein family members in mediating host response to the pathogen. PMID:22414154
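
    The apportioning of shared-peptide signal can be seen as solving a non-negative linear system whose design matrix encodes which peptide occurs in which protein. A minimal sketch of that idea; the paper's actual formulation adds combinatorial optimization and robustness analysis, and the numbers here are invented:

        import numpy as np
        from scipy.optimize import nnls

        # rows = peptides, columns = proteins; 1 if peptide occurs in protein
        A = np.array([[1.0, 0.0],   # unique to protein 0
                      [1.0, 1.0],   # shared by both proteins
                      [0.0, 1.0]])  # unique to protein 1
        y = np.array([10.0, 25.0, 14.0])  # observed peptide intensities

        abundance, residual = nnls(A, y)  # non-negative least squares
        print(abundance)  # shared signal apportioned, not discarded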

  3. Outcome quantification using SPHARM-PDM toolbox in orthognathic surgery

    PubMed Central

    Cevidanes, Lucia; Zhu, HongTu; Styner, Martin

    2011-01-01

    Purpose Quantification of surgical outcomes in longitudinal studies has led to significant progress in the treatment of dentofacial deformity, both by offering options to patients who might not otherwise have been recommended for treatment and by clarifying the selection of appropriate treatment methods. Most existing surgical treatments have not been assessed in a systematic way. This paper presents the quantification of surgical outcomes in orthognathic surgery via our localized shape analysis framework. Methods In our setting, planning and surgical simulation is performed using the surgery planning software CMFapp. We then employ the SPHARM-PDM to measure the difference between pre-surgery and virtually simulated post-surgery models. This SPHARM-PDM shape framework is validated for use with craniofacial structures via simulating known 3D surgical changes within CMFapp. Results Our results show that SPHARM-PDM analysis accurately measures surgical displacements, compared with known displacement values. Visualization of color maps of virtually simulated surgical displacements describe corresponding surface distances that precisely describe location of changes, and difference vectors indicate directionality and magnitude of changes. Conclusions SPHARM-PDM-based quantification of surgical outcome is feasible. When compared to prior solutions, our method has the potential to make the surgical planning process more flexible, increase the level of detail and accuracy of the plan, yield higher operative precision and control and enhance the follow-up and documentation of clinical cases. PMID:21161693

  4. Zonated quantification of steatosis in an entire mouse liver.

    PubMed

    Schwen, Lars Ole; Homeyer, André; Schwier, Michael; Dahmen, Uta; Dirsch, Olaf; Schenk, Arne; Kuepfer, Lars; Preusser, Tobias; Schenk, Andrea

    2016-06-01

    Many physiological processes and pathological conditions in livers are spatially heterogeneous, forming patterns at the lobular length scale or varying across the organ. Steatosis, a common liver disease characterized by lipids accumulating in hepatocytes, exhibits heterogeneity at both these spatial scales. The main goal of the present study was to provide a method for zonated quantification of the steatosis patterns found in an entire mouse liver. As an example application, the results were employed in a pharmacokinetics simulation. For the analysis, an automatic detection of the lipid vacuoles was used in multiple slides of histological serial sections covering an entire mouse liver. Lobuli were determined semi-automatically and zones were defined within the lobuli. Subsequently, the lipid content of each zone was computed. The steatosis patterns were found to be predominantly periportal, with a notable organ-scale heterogeneity. The analysis provides a quantitative description of the extent of steatosis in unprecedented detail. The resulting steatosis patterns were successfully used as a perturbation to the liver as part of an exemplary whole-body pharmacokinetics simulation for the antitussive drug dextromethorphan. The zonated quantification is also applicable to other pathological conditions that can be detected in histological images. Besides being a descriptive research tool, this quantification could perspectively complement diagnosis based on visual assessment of histological images. PMID:27104496

  5. Leveraging transcript quantification for fast computation of alternative splicing profiles.

    PubMed

    Alamancos, Gael P; Pagès, Amadís; Trincado, Juan L; Bellora, Nicolás; Eyras, Eduardo

    2015-09-01

    Alternative splicing plays an essential role in many cellular processes and bears major relevance in the understanding of multiple diseases, including cancer. High-throughput RNA sequencing allows genome-wide analyses of splicing across multiple conditions. However, the increasing number of available data sets represents a major challenge in terms of computation time and storage requirements. We describe SUPPA, a computational tool to calculate relative inclusion values of alternative splicing events, exploiting fast transcript quantification. SUPPA accuracy is comparable and sometimes superior to standard methods using simulated as well as real RNA-sequencing data compared with experimentally validated events. We assess the variability in terms of the choice of annotation and provide evidence that using complete transcripts rather than more transcripts per gene provides better estimates. Moreover, SUPPA coupled with de novo transcript reconstruction methods does not achieve accuracies as high as using quantification of known transcripts, but remains comparable to existing methods. Finally, we show that SUPPA is more than 1000 times faster than standard methods. Coupled with fast transcript quantification, SUPPA provides inclusion values at a much higher speed than existing methods without compromising accuracy, thereby facilitating the systematic splicing analysis of large data sets with limited computational resources. The software is implemented in Python 2.7 and is available under the MIT license at https://bitbucket.org/regulatorygenomicsupf/suppa. PMID:26179515
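
    The relative inclusion value (PSI) that SUPPA derives from transcript quantification is the abundance of event-including transcripts over all transcripts defining the event. A minimal sketch of that ratio; the TPM values are invented:

        def psi(inclusion_tpms, all_tpms):
            """Fraction spliced-in from transcript abundances (TPM)."""
            total = sum(all_tpms)
            return sum(inclusion_tpms) / total if total > 0 else float("nan")

        # inclusion isoforms at 8 and 2 TPM, skipping isoform at 10 TPM
        print(psi([8.0, 2.0], [8.0, 2.0, 10.0]))  # 0.5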

  6. Nuclear and mitochondrial DNA quantification of various forensic materials.

    PubMed

    Andréasson, H; Nilsson, M; Budowle, B; Lundberg, H; Allen, M

    2006-12-01

    Due to the different types and quality of forensic evidence materials, their DNA content can vary substantially, and particularly low quantities can impact the results in an identification analysis. In this study, the quantity of mitochondrial and nuclear DNA was determined in a variety of materials using a previously described real-time PCR method. DNA quantification in the roots and distal sections of plucked and shed head hairs revealed large variations in DNA content particularly between the root and the shaft of plucked hairs. Also large intra- and inter-individual variations were found among hairs. In addition, DNA content was estimated in samples collected from fingerprints and accessories. The quantification of DNA on various items also displayed large variations, with some materials containing large amounts of nuclear DNA while no detectable nuclear DNA and only limited amounts of mitochondrial DNA were seen in others. Using this sensitive real-time PCR quantification assay, a better understanding was obtained regarding DNA content and variation in commonly analysed forensic evidence materials and this may guide the forensic scientist as to the best molecular biology approach for analysing various forensic evidence materials. PMID:16427750

  7. Assessment methods for angiogenesis and current approaches for its quantification.

    PubMed

    AlMalki, Waleed Hassan; Shahid, Imran; Mehdi, Abeer Yousaf; Hafeez, Muhammad Hassan

    2014-01-01

    Angiogenesis is a physiological process which describes the development of new blood vessels from existing vessels. It is a common and most important process in the formation and development of blood vessels, so it is supportive in the healing of wounds and the granulation of tissues. The different assays for the evaluation of angiogenesis have been described, with distinct advantages and some limitations. In order to develop angiogenic and antiangiogenic techniques, continuous efforts have resulted in animal models for more quantitative analysis of angiogenesis. Most of the studies on angiogenic inducers and inhibitors rely on various models, in vitro, in vivo and in ovo, as indicators of efficacy. Angiogenesis assays are very helpful for testing the efficacy of both pro- and anti-angiogenic agents. The development of non-invasive procedures for quantification of angiogenesis will facilitate this process significantly. The main objective of this review article is to focus on the novel and existing methods of angiogenesis and their quantification techniques. These findings will be helpful for establishing the most convenient methods for the detection and quantification of angiogenesis and for developing a novel, well tolerated and cost-effective anti-angiogenic treatment in the near future. PMID:24987169

  8. Gas plume quantification in downlooking hyperspectral longwave infrared images

    NASA Astrophysics Data System (ADS)

    Turcotte, Caroline S.; Davenport, Michael R.

    2010-10-01

    Algorithms have been developed to support quantitative analysis of a gas plume using down-looking airborne hyperspectral long-wave infrared (LWIR) imagery. The resulting gas quantification "GQ" tool estimates the quantity of one or more gases at each pixel, and estimates uncertainty based on factors such as atmospheric transmittance, background clutter, and plume temperature contrast. GQ uses gas-insensitive segmentation algorithms to classify the background very precisely so that it can infer gas quantities from the differences between plume-bearing pixels and similar non-plume pixels. It also includes MODTRAN-based algorithms to iteratively assess various profiles of air temperature, water vapour, and ozone, and select the one that implies smooth emissivity curves for the (unknown) materials on the ground. GQ then uses a generalized least-squares (GLS) algorithm to simultaneously estimate the most likely mixture of background (terrain) material and foreground plume gases. Cross-linking of plume temperature to the estimated gas quantity is very non-linear, so the GLS solution was iteratively assessed over a range of plume temperatures to find the best fit to the observed spectrum. Quantification errors due to local variations in the camera-to-pixel distance were suppressed using a subspace projection operator. Lacking detailed depth-maps for real plumes, the GQ algorithm was tested on synthetic scenes generated by the Digital Imaging and Remote Sensing Image Generation (DIRSIG) software. Initial results showed pixel-by-pixel gas quantification errors of less than 15% for a Freon 134a plume.
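
    The GLS step amounts to the textbook estimator x = (A' C^-1 A)^-1 A' C^-1 y, with the columns of A holding background and gas signatures and C the clutter covariance. A toy sketch with invented dimensions; the real tool iterates this over candidate plume temperatures:

        import numpy as np

        def gls_estimate(A, y, C):
            """Generalized least-squares coefficients for y ~ A x."""
            Ci = np.linalg.inv(C)
            return np.linalg.solve(A.T @ Ci @ A, A.T @ Ci @ y)

        rng = np.random.default_rng(0)
        A = rng.normal(size=(50, 3))        # background + two gas signatures
        x_true = np.array([1.0, 0.2, 0.05])
        C = 0.01 * np.eye(50)               # clutter/noise covariance
        y = A @ x_true + rng.multivariate_normal(np.zeros(50), C)
        print(gls_estimate(A, y, C))        # close to x_true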

  9. Theoretical limitations of quantification for noncompetitive sandwich immunoassays.

    PubMed

    Woolley, Christine F; Hayes, Mark A; Mahanti, Prasun; Douglass Gilman, S; Taylor, Tom

    2015-11-01

    Immunoassays exploit the highly selective interaction between antibodies and antigens to provide a vital method for biomolecule detection at low concentrations. Developers and practitioners of immunoassays have long known that non-specific binding often restricts immunoassay limits of quantification (LOQs). Aside from non-specific binding, most efforts by analytical chemists to reduce the LOQ for these techniques have focused on improving the signal amplification methods and minimizing the limitations of the detection system. However, with detection technology now capable of sensing single-fluorescence molecules, this approach is unlikely to lead to dramatic improvements in the future. Here, fundamental interactions based on the law of mass action are analytically connected to signal generation, replacing the four- and five-parameter fittings commercially used to approximate sigmoidal immunoassay curves and allowing quantitative consideration of non-specific binding and statistical limitations in order to understand the ultimate detection capabilities of immunoassays. The restrictions imposed on limits of quantification by instrumental noise, non-specific binding, and counting statistics are discussed based on equilibrium relations for a sandwich immunoassay. Understanding the maximal capabilities of immunoassays for each of these regimes can greatly assist in the development and evaluation of immunoassay platforms. While many studies suggest that single molecule detection is possible through immunoassay techniques, here, it is demonstrated that the fundamental limit of quantification (precision of 10 % or better) for an immunoassay is approximately 131 molecules and this limit is based on fundamental and unavoidable statistical limitations. PMID:26342315
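
    The counting-statistics part of that limit follows from the Poisson relative error 1/sqrt(N). The sketch below reproduces only this scaling; the paper's figure of approximately 131 molecules comes from its fuller treatment of noise and non-specific binding:

        from math import ceil

        def molecules_for_cv(target_cv):
            """Molecules needed so that Poisson counting noise alone gives
            relative SD <= target_cv (the SD of a count N is sqrt(N))."""
            return ceil(1.0 / target_cv ** 2)

        print(molecules_for_cv(0.10))  # 100 by pure counting statistics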

  10. Assessment methods for angiogenesis and current approaches for its quantification

    PubMed Central

    AlMalki, Waleed Hassan; Shahid, Imran; Mehdi, Abeer Yousaf; Hafeez, Muhammad Hassan

    2014-01-01

    Angiogenesis is a physiological process which describes the development of new blood vessels from the existing vessels. It is a common and the most important process in the formation and development of blood vessels, so it is supportive in the healing of wounds and granulation of tissues. The different assays for the evaluation of angiogenesis have been described with distinct advantages and some limitations. In order to develop angiogenic and antiangiogenic techniques, continuous efforts have been resulted to give animal models for more quantitative analysis of angiogenesis. Most of the studies on angiogenic inducers and inhibitors rely on various models, both in vitro, in vivo and in ova, as indicators of efficacy. The angiogenesis assays are very much helpful to test efficacy of both pro- and anti- angiogenic agents. The development of non-invasive procedures for quantification of angiogenesis will facilitate this process significantly. The main objective of this review article is to focus on the novel and existing methods of angiogenesis and their quantification techniques. These findings will be helpful to establish the most convenient methods for the detection, quantification of angiogenesis and to develop a novel, well tolerated and cost effective anti-angiogenic treatment in the near future. PMID:24987169

  11. Systematic Assessment of RNA-Seq Quantification Tools Using Simulated Sequence Data

    PubMed Central

    Chandramohan, Raghu; Wu, Po-Yen; Phan, John H.; Wang, May D.

    2016-01-01

    RNA-sequencing (RNA-seq) technology has emerged as the preferred method for quantification of gene and isoform expression. Numerous RNA-seq quantification tools have been proposed and developed, bringing us closer to developing expression-based diagnostic tests based on this technology. However, because of the rapidly evolving technologies and algorithms, it is essential to establish a systematic method for evaluating the quality of RNA-seq quantification. We investigate how different RNA-seq experimental designs (i.e., variations in sequencing depth and read length) affect various quantification algorithms (i.e., HTSeq, Cufflinks, and MISO). Using simulated data, we evaluate the quantification tools based on four metrics, namely: (1) total number of usable fragments for quantification, (2) detection of genes and isoforms, (3) correlation, and (4) accuracy of expression quantification with respect to the ground truth. Results show that Cufflinks is able to use the largest number of fragments for quantification, leading to better detection of genes and isoforms. However, HTSeq produces more accurate expression estimates. Moreover, each quantification algorithm is affected differently by varying sequencing depth and read length, suggesting that the selection of quantification algorithms should be application-dependent.

  12. Systematic development of a group quantification method using evaporative light scattering detector for relative quantification of ginsenosides in ginseng products.

    PubMed

    Lee, Gwang Jin; Shin, Byong-Kyu; Yu, Yun-Hyun; Ahn, Jongsung; Kwon, Sung Won; Park, Jeong Hill

    2016-09-01

    The determination of the contents of multiple components in ginseng products has come to the fore with demands for in-depth information, but the associated industries confront the high cost of securing pure standards for the continuous quality evaluation of the products. This study aimed to develop a prospective high-performance liquid chromatography-evaporative light scattering detector (HPLC-ELSD) method for relative quantification of ginsenosides in ginseng products without a considerable change from the conventional gradient analysis. We investigated the effects of mobile phase composition and elution bandwidth, which are potential variables affecting the ELSD response in gradient analysis. Similar ELSD response curves of nine major ginsenosides were obtained under identical flow injection conditions, and the response increased as the percentage of organic solvent increased. The nine ginsenosides were divided into three groups to confirm the effect of elution bandwidth. The ELSD response significantly decreased for the late-eluting ginsenosides in the individual groups under isocratic conditions. Taking these two important effects into account, stepwise changes of the gradient condition were carried out to reach a group quantification method. The inconsistent responses of the nine ginsenosides were reconstituted into three normalized responses by the stepwise changes of the gradient condition, making relative quantification within the individual groups possible. The availability was confirmed by comparing the ginsenoside contents in a base material of ginseng products determined by the direct and group quantification methods. The largest difference between the determination results of the two methods was 8.26%, and the difference in total contents was only 0.91%. PMID:27262109

  13. Evaluation of a new method for stenosis quantification from 3D x-ray angiography images

    NASA Astrophysics Data System (ADS)

    Betting, Fabienne; Moris, Gilles; Knoplioch, Jerome; Trousset, Yves L.; Sureda, Francisco; Launay, Laurent

    2001-05-01

    A new method for stenosis quantification from 3D X-ray angiography images has been evaluated on both phantom and clinical data. On phantoms, for vessel parts 3 mm or larger, the standard deviation of the measurement error was always found to be at most 0.4 mm, and the maximum measurement error less than 0.17 mm. No clear relationship was observed between the performance of the quantification method and the acquisition FoV. On clinical data, the 3D quantification method proved to be more robust to vessel bifurcations than its 2D equivalent. On a total of 15 clinical cases, the differences between 2D and 3D quantification were always less than 0.7 mm. The conclusion is that stenosis quantification from 3D X-ray angiography images is an attractive alternative to quantification from 2D X-ray images.

  14. The variability of manual and computer assisted quantification of multiple sclerosis lesion volumes.

    PubMed

    Mitchell, J R; Karlik, S J; Lee, D H; Eliasziw, M; Rice, G P; Fenster, A

    1996-01-01

    The high resolution and excellent soft tissue contrast of Magnetic Resonance Imaging (MRI) have enabled direct, noninvasive visualization of Multiple Sclerosis (MS) lesions in vivo. This has allowed the quantification of changes in the appearance of lesions in MR exams to be used as a measure of disease state. Nevertheless, accurate quantification techniques are subject to inter- and intra-operator variability, which may hinder monitoring of disease progression. We have developed a computer program to assist an experienced operator in the quantification of MS lesions in standard spin-echo MR exams. The accuracy of assisted and manual quantification under known conditions was studied using exams of a test phantom, while inter- and intra-operator reliability and variability were studied using exams of an MS patient. Results from the phantom study show that accuracy is improved by assisted quantification. The patient exam results indicate that assisted quantification reduced inter-operator variability from 0.34 to 0.17 cm3, and reduced intra-operator variability from 0.23 to 0.15 cm3. In addition, the minimum significant change between two successive measurements of lesion volume by the same operator was 0.64 cm3 for manual quantification and 0.42 cm3 for assisted quantification. For two different operators making successive measurements, the minimum significant change was 0.94 cm3 for manual quantification, but only 0.47 cm3 for assisted quantification. Finally, the number of lesions to be monitored for an average change in volume at a given power and significance level was reduced by a factor of 2-4 by assisted quantification. These results suggest that assisted quantification may have practical applications in clinical trials, especially those that are large, multicenter, or extended over time, and therefore require lesion measurements by one or more operators. PMID:8700036
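
    The minimum significant change figures quoted above follow from the usual repeatability formula MSC = 1.96 * sqrt(2) * SD for the difference of two measurements at the 95% level, which reproduces the reported values:

        from math import sqrt

        def minimum_significant_change(sd, z=1.96):
            """Smallest volume change distinguishable from measurement
            noise between two measurements (95% level)."""
            return z * sqrt(2.0) * sd

        for sd in (0.23, 0.15, 0.34, 0.17):           # SDs from the study
            print(sd, round(minimum_significant_change(sd), 2))
        # -> 0.64, 0.42, 0.94, 0.47 cm3, matching the abstract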

  15. Automated epicardial fat volume quantification from non-contrast CT

    NASA Astrophysics Data System (ADS)

    Ding, Xiaowei; Terzopoulos, Demetri; Diaz-Zamudio, Mariana; Berman, Daniel S.; Slomka, Piotr J.; Dey, Damini

    2014-03-01

    Epicardial fat volume (EFV) is now regarded as a significant imaging biomarker for cardiovascular risk stratification. Manual or semi-automated quantification of EFV includes tedious and careful contour drawing of the pericardium on fine image features. We aimed to develop and validate a fully automated, accurate algorithm for EFV quantification from non-contrast CT using active contours and multiple-atlas registration. This is a knowledge-based model that can segment both the heart and pericardium accurately by initializing the location and shape of the heart at large scale from multiple co-registered atlases and locking itself onto the pericardium actively. The deformation process is driven by pericardium detection, extracting only the white contours representing the pericardium in the CT images. Following this step, we can calculate the fat volume within this region (epicardial fat) using the standard fat attenuation range. We validate our algorithm on CT datasets from 15 patients who underwent routine assessment of coronary calcium. Epicardial fat volume quantified by the algorithm (69.15 +/- 8.25 cm3) and the expert (69.46 +/- 8.80 cm3) showed excellent correlation (r = 0.96, p < 0.0001) with no significant differences by comparison of individual data points (p = 0.9). The algorithm achieved a Dice overlap of 0.93 (range 0.88 - 0.95). The total time was less than 60 sec on a standard Windows computer. Our results show that fast, accurate, automated, knowledge-based quantification of epicardial fat volume from non-contrast CT is feasible. To our knowledge, this is also the first fully automated algorithm reported for this task.
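
    Once the pericardium is segmented, the fat measurement itself is a thresholded voxel count. A sketch using the conventional CT fat window of about -190 to -30 HU; the window and names here are generic assumptions, not the paper's parameters:

        import numpy as np

        def fat_volume_cm3(hu, pericardium_mask, voxel_mm3,
                           lo=-190.0, hi=-30.0):
            """Fat volume inside the pericardial mask, from a HU volume
            and the per-voxel volume in mm^3."""
            fat = pericardium_mask & (hu >= lo) & (hu <= hi)
            return fat.sum() * voxel_mm3 / 1000.0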

  16. Quantification of breast arterial calcification using full field digital mammography

    SciTech Connect

    Molloi, Sabee; Xu, Tong; Ducote, Justin; Iribarren, Carlos

    2008-04-15

    Breast arterial calcification is commonly detected on some mammograms. Previous studies indicate that breast arterial calcification is evidence of general atherosclerotic vascular disease and it may be a useful marker of coronary artery disease. It can potentially be a useful tool for assessment of coronary artery disease in women since mammography is widely used as a screening tool for early detection of breast cancer. However, there are currently no available techniques for quantification of calcium mass using mammography. The purpose of this study was to determine whether it is possible to quantify breast arterial calcium mass using standard digital mammography. An anthropomorphic breast phantom along with a vessel calcification phantom was imaged using a full field digital mammography system. Densitometry was used to quantify calcium mass. A calcium calibration measurement was performed at each phantom thickness and beam energy. The known (K) and measured (M) calcium mass on 5 and 9 cm thickness phantoms were related by M=0.964K-0.288 mg (r=0.997 and SEE=0.878 mg) and M=1.004K+0.324 mg (r=0.994 and SEE=1.32 mg), respectively. The results indicate that accurate calcium mass measurements can be made without correction for scatter glare as long as careful calcium calibration is made for each breast thickness. The results also indicate that composition variations and differences of approximately 1 cm between calibration phantom and breast thickness introduce only minimal error in calcium measurement. The uncertainty in magnification is expected to cause up to 5% and 15% error in calcium mass for 5 and 9 cm breast thicknesses, respectively. In conclusion, a densitometry technique for quantification of breast arterial calcium mass was validated using standard full field digital mammography. The results demonstrated the feasibility and potential utility of the densitometry technique for accurate quantification of breast arterial calcium mass using standard digital mammography.
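
    The calibration described is a per-thickness linear fit of measured to known calcium mass, inverted to convert new readings. A sketch with invented calibration points; the study's own fits are the M=0.964K-0.288 and M=1.004K+0.324 lines quoted above:

        import numpy as np

        # known calcium masses (mg) vs densitometry readings at one phantom
        # thickness and beam energy; these values are illustrative only
        known = np.array([5.0, 10.0, 20.0, 40.0])
        measured = np.array([4.6, 9.4, 19.1, 38.3])

        slope, intercept = np.polyfit(known, measured, 1)

        # invert the fitted line to turn a new reading into calcium mass
        reading = 15.0
        calcium_mg = (reading - intercept) / slope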

  17. Uncertainty quantification for characterization of high enthalpy facilities

    NASA Astrophysics Data System (ADS)

    Villedieu, N.; Cappaert, J.; Garcia Galache, J. P.; Magin, T. E.

    2013-06-01

    The postflight analysis of a space mission requires accurate determination of the free-stream conditions along the trajectory. The Mach number, temperature, and pressure can be rebuilt from the heat flux and pressure measured on the spacecraft by means of a Flush Air Data System (FADS). This instrumentation comprises a set of sensors flush-mounted in the thermal protection system to measure the static pressure (pressure taps) and heat flux (calorimeters). Since experimental data suffer from errors, this methodology needs to integrate quantification of uncertainties. Epistemic uncertainties on the models for chemistry in the bulk and at the wall (surface catalysis) should also be taken into account. Studying this problem requires solving a stochastic backward problem. This paper focuses on a preliminary sensitivity analysis of the forward problem to understand which uncertainties need to be accounted for. In Section 2, the uncertainty quantification methodologies used in this work are presented. Section 3 is dedicated to one-dimensional (1D) simulations of the shock layer to identify which chemical reactions of the mechanism need to be accounted for in the Uncertainty Quantification (UQ). After this triage procedure, the two-dimensional (2D) axisymmetric flow around the blunt nose was simulated for two trajectory points of EXPERT (EXPErimental Reentry Test-bed), and the propagation of the uncertainties to the stagnation pressure and heat flux was studied. For this study, the open-source software DAKOTA from Sandia National Laboratories [1] is coupled with two in-house codes: SHOCKING, which simulates the evolution of the chemical relaxation in the shock layer [2], and COSMIC, which simulates axisymmetric chemically reacting flows [3].
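    The propagation step lends itself to a compact illustration. The sketch below uses plain Monte Carlo sampling through a toy heat flux surrogate; the model, parameter names, and ranges are illustrative stand-ins, not the SHOCKING/COSMIC chain used in the paper.

```python
import numpy as np

def heat_flux(rate_multiplier, wall_catalysis):
    """Toy stand-in for an expensive reacting-flow solver."""
    return 1.2e6 * rate_multiplier**0.3 * (1.0 + 0.5 * wall_catalysis)

rng = np.random.default_rng(0)
rates = rng.uniform(0.5, 2.0, 10_000)    # chemistry rate multiplier
gammas = rng.uniform(0.0, 0.1, 10_000)   # catalytic recombination coeff.

q = heat_flux(rates, gammas)
print(f"mean = {q.mean():.3e} W/m^2, std = {q.std():.3e} W/m^2")
for name, x in [("rate", rates), ("catalysis", gammas)]:
    # Crude sensitivity ranking via input/output correlation
    print(name, round(np.corrcoef(x, q)[0, 1], 3))
```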

  18. Quantification of Carnosine-Aldehyde Adducts in Human Urine.

    PubMed

    da Silva Bispo, Vanderson; Di Mascio, Paolo; Medeiros, Marisa

    2014-10-01

    Lipid peroxidation generates several reactive carbonyl species, including 4-hydroxy-2-nonenal (HNE), acrolein (ACR), 4-hydroxy-2-hexenal (HHE) and malondialdehyde. One major pathway of aldehyde detoxification is conjugation with glutathione, catalyzed by glutathione-S-transferases, or, alternatively, conjugation with endogenous histidine-containing dipeptides such as carnosine (CAR). In this study, on-line reverse-phase high-performance liquid chromatography (HPLC) separation with tandem mass spectrometry detection was used for the accurate quantification of CAR-ACR, CAR-HHE and CAR-HNE adducts in urine samples from non-smoking young adults. Standard adducts were prepared and isolated by HPLC. The results showed the presence of a new product from the reaction of CAR with ACR. This new adduct was completely characterized by HPLC/MS-MSn, 1H NMR, COSY and HSQC. A new HPLC/MS/MS methodology employing stable isotope-labeled internal standards (CAR-HHE-d5 and CAR-HNE-d11) was developed for adduct quantification. This methodology permits quantification of 10 pmol of CAR-HHE and 1 pmol of CAR-ACR and CAR-HNE. Accurate determinations in human urine samples gave 4.65±1.71 nmol/mg creatinine for CAR-ACR, 5.13±1.76 for CAR-HHE and 5.99±3.19 for CAR-HNE. Our results indicate that the carnosine pathway can be an important detoxification route for α,β-unsaturated aldehydes. Moreover, carnosine adducts may be useful as redox stress indicators. PMID:26461323

  19. A flexible numerical approach for quantification of epistemic uncertainty

    SciTech Connect

    Chen, Xiaoxiao; Park, Eun-Jae; Xiu, Dongbin

    2013-05-01

    In the field of uncertainty quantification (UQ), epistemic uncertainty often refers to the kind of uncertainty whose complete probabilistic description is not available, largely due to our lack of knowledge about the uncertainty. Quantification of the impacts of epistemic uncertainty is naturally difficult, because most of the existing stochastic tools rely on the specification of probability distributions and thus do not readily apply to epistemic uncertainty. There have been few studies and methods to deal with epistemic uncertainty. A recent work can be found in [J. Jakeman, M. Eldred, D. Xiu, Numerical approach for quantification of epistemic uncertainty, J. Comput. Phys. 229 (2010) 4648–4663], where a framework for numerical treatment of epistemic uncertainty was proposed. The method is based on solving an encapsulation problem, without using any probability information, in a hypercube that encapsulates the unknown epistemic probability space. If more probabilistic information about the epistemic variables is known a posteriori, the solution statistics can then be evaluated at post-process steps. In this paper, we present a new method, similar to that of Jakeman et al. but significantly extending its capabilities. Most notably, the new method (1) does not require the encapsulation problem to be in a bounded domain such as a hypercube; (2) does not require the solution of the encapsulation problem to converge point-wise. In the current formulation, the encapsulation problem could reside in an unbounded domain, and more importantly, its numerical approximation could be sought in the L^p norm. These features thus make the new approach more flexible and amenable to practical implementation. Both the mathematical framework and numerical analysis are presented to demonstrate the effectiveness of the new approach.
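    The encapsulation idea admits a short illustration: sample the epistemic variable over a bounding set without committing to a distribution, build a cheap surrogate there, and impose candidate probability distributions only at the post-processing stage. A minimal sketch under assumed toy choices (one epistemic variable, a polynomial surrogate):

```python
import numpy as np
from numpy.polynomial import Polynomial

def model(z):
    """Toy response with one epistemic parameter z (pdf unknown)."""
    return np.exp(-z) * np.sin(3 * z)

# 1) Encapsulation step: sample z over a bounding interval, no pdf assumed
z_nodes = np.linspace(-1.0, 2.0, 25)
surrogate = Polynomial.fit(z_nodes, model(z_nodes), deg=10)

# 2) Post-processing: once a candidate pdf becomes available, evaluate
#    statistics on the cheap surrogate instead of re-solving the model
rng = np.random.default_rng(1)
z_samples = rng.beta(2, 5, size=100_000) * 3.0 - 1.0  # candidate pdf on [-1, 2]
print("mean under candidate pdf:", surrogate(z_samples).mean())
```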

  20. A flexible numerical approach for quantification of epistemic uncertainty

    NASA Astrophysics Data System (ADS)

    Chen, Xiaoxiao; Park, Eun-Jae; Xiu, Dongbin

    2013-05-01

    In the field of uncertainty quantification (UQ), epistemic uncertainty often refers to the kind of uncertainty whose complete probabilistic description is not available, largely due to our lack of knowledge about the uncertainty. Quantification of the impacts of epistemic uncertainty is naturally difficult, because most of the existing stochastic tools rely on the specification of probability distributions and thus do not readily apply to epistemic uncertainty. There have been few studies and methods to deal with epistemic uncertainty. A recent work can be found in [J. Jakeman, M. Eldred, D. Xiu, Numerical approach for quantification of epistemic uncertainty, J. Comput. Phys. 229 (2010) 4648-4663], where a framework for numerical treatment of epistemic uncertainty was proposed. The method is based on solving an encapsulation problem, without using any probability information, in a hypercube that encapsulates the unknown epistemic probability space. If more probabilistic information about the epistemic variables is known a posteriori, the solution statistics can then be evaluated at post-process steps. In this paper, we present a new method, similar to that of Jakeman et al. but significantly extending its capabilities. Most notably, the new method (1) does not require the encapsulation problem to be in a bounded domain such as a hypercube; (2) does not require the solution of the encapsulation problem to converge point-wise. In the current formulation, the encapsulation problem could reside in an unbounded domain, and more importantly, its numerical approximation could be sought in the L^p norm. These features thus make the new approach more flexible and amenable to practical implementation. Both the mathematical framework and numerical analysis are presented to demonstrate the effectiveness of the new approach.

  1. Quantification of Hepatic Steatosis With Dual-Energy Computed Tomography

    PubMed Central

    Artz, Nathan S.; Hines, Catherine D.G.; Brunner, Stephen T.; Agni, Rashmi M.; Kühn, Jens-Peter; Roldan-Alzate, Alejandro; Chen, Guang-Hong; Reeder, Scott B.

    2012-01-01

    Objective The aim of this study was to compare dual-energy computed tomography (DECT) and magnetic resonance imaging (MRI) for fat quantification using tissue triglyceride concentration and histology as references in an animal model of hepatic steatosis. Materials and Methods This animal study was approved by our institution's Research Animal Resource Center. After validation of DECT and MRI using a phantom consisting of different triglyceride concentrations, a leptin-deficient obese mouse model (ob/ob) was used for this study. Twenty mice were divided into 3 groups based on expected levels of hepatic steatosis: low (n = 6), medium (n = 7), and high (n = 7) fat. After MRI at 3 T, a DECT scan was immediately performed. The caudate lobe of the liver was harvested and analyzed for triglyceride concentration using a colorimetric assay. The left lateral lobe was also extracted for histology. Magnetic resonance imaging fat-fraction (FF) and DECT measurements (attenuation, fat density, and effective atomic number) were compared with triglycerides and histology. Results Phantom results demonstrated excellent correlation between triglyceride content and each of the MRI and DECT measurements (r2 ≥ 0.96, P ≤ 0.003). In vivo, however, excellent triglyceride correlation was observed only with attenuation (r2 = 0.89, P < 0.001) and MRI-FF (r2 = 0.92, P < 0.001). Strong correlation existed between attenuation and MRI-FF (r2 = 0.86, P < 0.001). Nonlinear correlation with histology was also excellent for attenuation and MRI-FF. Conclusions Dual-energy computed tomography (CT) data generated by the current Gemstone Spectral Imaging analysis tool do not improve the accuracy of fat quantification in the liver beyond what CT attenuation can already provide. Furthermore, MRI may provide an excellent reference standard for liver fat quantification when validating new CT or DECT methods in human subjects. PMID:22836309

  2. Nuclear Data Uncertainty Quantification: Past, Present and Future

    SciTech Connect

    Smith, D.L.

    2015-01-15

    An historical overview is provided of the mathematical foundations of uncertainty quantification and the roles played in the more recent past by nuclear data uncertainties in nuclear data evaluations and nuclear applications. Significant advances that have established the mathematical framework for contemporary nuclear data evaluation methods, as well as the use of uncertainty information in nuclear data evaluation and nuclear applications, are described. This is followed by a brief examination of the current status concerning nuclear data evaluation methodology, covariance data generation, and the application of evaluated nuclear data uncertainties in contemporary nuclear technology. A few possible areas for future investigation of this subject are also suggested.

  3. Quantification of protein concentration using UV absorbance and Coomassie dyes.

    PubMed

    Noble, James E

    2014-01-01

    The measurement of solubilized protein concentration in solution is an important assay in biochemistry research and development labs, with applications ranging from enzymatic studies to providing data for biopharmaceutical lot release. Spectrophotometric protein quantification assays use UV and visible spectroscopy to rapidly determine the concentration of protein relative to a standard, or using an assigned extinction coefficient. Where multiple samples need measurement, and/or the sample volume and concentration are limited, preparations of the Coomassie dye, commonly known as the Bradford assay, can be used. PMID:24423263
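    The direct-UV route mentioned above reduces to the Beer-Lambert law, A = ε·c·l, so concentration follows from a known extinction coefficient and path length. A minimal sketch; the coefficient value in the example is illustrative, not from the text:

```python
def protein_conc_mg_per_ml(a280, ext_coeff=1.0, path_cm=1.0):
    """Beer-Lambert: A = eps * c * l  =>  c = A / (eps * l).
    ext_coeff in (mg/mL)^-1 cm^-1 (illustrative value)."""
    return a280 / (ext_coeff * path_cm)

print(protein_conc_mg_per_ml(0.56))  # -> 0.56 mg/mL
```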

  4. Progressive damage state evolution and quantification in composites

    NASA Astrophysics Data System (ADS)

    Patra, Subir; Banerjee, Sourav

    2016-04-01

    Precursor damage state quantification can be helpful for the safety and operation of aircraft and defense equipment. Damage develops in composite materials in the form of matrix cracking, fiber breakage, debonding, etc. However, detection and quantification of the damage modes at their very early stage is not possible unless the existing indispensable techniques are modified, particularly for the quantification of multiscale damage at an early stage. Here, we present a novel nonlocal-mechanics-based damage detection technique for precursor damage state quantification. Micro-continuum physics is used by modifying the Christoffel equation. American Society for Testing and Materials (ASTM) standard woven carbon fiber (CFRP) specimens were tested under tension-tension fatigue loading at intervals of 25,000 cycles up to 500,000 cycles. Scanning Acoustic Microscopy (SAM) and Optical Microscopy (OM) were used to examine the damage development at the same intervals. Surface Acoustic Wave (SAW) velocity profiles on a representative volume element (RVE) of the specimen were calculated at regular intervals of 50,000 cycles. Nonlocal parameters were calculated from the micromorphic wave dispersion curve at a particular frequency of 50 MHz. We used a previously formulated parameter called "damage entropy," a measure of damage growth in the material calculated as a function of loading cycle. Damage entropy (DE) was calculated at every pixel of the RVE, and the mean DE was plotted at loading intervals of 25,000 cycles. Growth of DE with fatigue loading cycles was observed. Optical imaging was also performed at intervals of 25,000 cycles to investigate the development of damage inside the materials. We also calculated the mean SAW velocity and plotted it against fatigue cycles, correlating it further with damage entropy. Statistical analysis of the SAW velocity profiles obtained at different fatigue cycles was also performed.

  5. A stochastic approach to uncertainty quantification in residual moveout analysis

    NASA Astrophysics Data System (ADS)

    Johng-Ay, T.; Landa, E.; Dossou-Gbété, S.; Bordes, L.

    2015-06-01

    Oil and gas exploration and production usually rely on the interpretation of a single seismic image, which is obtained from observed data. However, the statistical nature of seismic data and the various approximations and assumptions are sources of uncertainties which may corrupt the evaluation of parameters. The quantification of these uncertainties is a major issue, intended to support decisions that have important social and commercial implications. Residual moveout analysis, an important step in seismic data processing, is usually performed by a deterministic approach. In this paper we discuss a Bayesian approach to the uncertainty analysis.

  6. Predicting human age with bloodstains by sjTREC quantification.

    PubMed

    Ou, Xue-ling; Gao, Jun; Wang, Huan; Wang, Hong-sheng; Lu, Hui-ling; Sun, Hong-yu

    2012-01-01

    The age-related decline of signal joint T-cell receptor rearrangement excision circles (sjTRECs) in human peripheral blood has been demonstrated in our previous study and other reports. Until now, only a few studies on sjTREC detection in bloodstain samples have been reported, and these were based on small numbers of subjects of a limited age range, although bloodstains are much more frequently encountered in forensic practice. In the present study, we adopted the sensitive TaqMan real-time quantitative polymerase chain reaction (qPCR) method to perform sjTREC quantification in bloodstains from individuals ranging from 0 to 86 years old (n = 264). The results revealed that sjTREC content in human bloodstains declined in an age-dependent manner (r = -0.8712). The age-estimation formula was Age = -7.1815Y - 42.458 ± 9.42, where Y = dCt(TBP - sjTREC) and 9.42 is the standard error. Furthermore, we tested for the influence of short- or long-term storage by analyzing fresh and stored bloodstains from the same individuals. Remarkably, no statistically significant difference in sjTREC content was found between the fresh and old DNA samples over 4 weeks of storage. However, a significant loss (0.16-1.93 dCt) in sjTREC content was detected after 1.5 years of storage in 31 samples. Moreover, preliminary sjTREC quantification from bloodstains up to 20 years old showed that, although sjTREC content was detectable in all samples and highly correlated with donor age, a time-dependent decrease in the correlation coefficient r was found, suggesting that the prediction accuracy of the described assay would deteriorate in aged samples. Our findings show that sjTREC quantification might also be suitable for age prediction in bloodstains, and future research into time-dependent or other potential impacts on sjTREC quantification might allow further improvement of the prediction accuracy. PMID:22879970
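    The reported regression translates directly into a one-line age predictor. A sketch using the formula as given (the example input is illustrative):

```python
def predict_age_years(dct):
    """Age = -7.1815*Y - 42.458 (+/- 9.42 yr SE), Y = dCt(TBP - sjTREC)."""
    return -7.1815 * dct - 42.458

print(predict_age_years(-10.0))  # ~29.4 years for an illustrative dCt
```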

  7. Recurrence plots and recurrence quantification analysis of human motion data

    NASA Astrophysics Data System (ADS)

    Josiński, Henryk; Michalczuk, Agnieszka; Świtoński, Adam; Szczesna, Agnieszka; Wojciechowski, Konrad

    2016-06-01

    The authors present exemplary application of recurrence plots, cross recurrence plots and recurrence quantification analysis for the purpose of exploration of experimental time series describing selected aspects of human motion. Time series were extracted from treadmill gait sequences which were recorded in the Human Motion Laboratory (HML) of the Polish-Japanese Academy of Information Technology in Bytom, Poland by means of the Vicon system. Analysis was focused on the time series representing movements of hip, knee, ankle and wrist joints in the sagittal plane.
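    As a pointer to how such an analysis proceeds, the sketch below builds a recurrence matrix from a scalar joint-angle series via time-delay embedding and computes the recurrence rate, the simplest RQA measure. The embedding parameters and the synthetic signal are illustrative assumptions, not values from the paper.

```python
import numpy as np

def recurrence_matrix(x, dim=3, tau=5, eps=0.1):
    """Time-delay embed a scalar series and threshold pairwise distances."""
    n = len(x) - (dim - 1) * tau
    emb = np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])
    dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    return (dist < eps).astype(int)

angle = np.sin(np.linspace(0, 20 * np.pi, 600))  # stand-in joint angle
R = recurrence_matrix(angle)
print("recurrence rate:", R.mean())  # fraction of recurrent point pairs
```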

  8. Quantification of skin wrinkles using low coherence interferometry

    NASA Astrophysics Data System (ADS)

    Oh, Jung-Taek; Kim, Beop-Min; Son, Sang-Ryoon; Lee, Sang-Won; Kim, Dong-Yoon; Kim, Youn-Soo

    2004-07-01

    We measure skin wrinkle topology by means of low-coherence interferometry (LCI), which forms the basis of optical coherence tomography (OCT). The skin topology obtained using LCI and a corresponding 2-D fast Fourier transform allow quantification of skin wrinkles. It took approximately 2 minutes to obtain a 2.1 mm x 2.1 mm topological image with 4 μm and 16 μm resolutions in the axial and transverse directions, respectively. Measurement examples show the particular case of skin contour change after anti-wrinkle cosmeceutical treatments and in atopic dermatitis.
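    The 2-D FFT quantification step can be sketched compactly: transform the height map and locate the dominant spatial frequency of the wrinkles. The synthetic topography below is an illustrative stand-in for an LCI measurement.

```python
import numpy as np

def wrinkle_spectrum(height_map, pixel_mm):
    """2-D power spectrum of a topography map (mean removed)."""
    f = np.fft.fftshift(np.fft.fft2(height_map - height_map.mean()))
    freq = np.fft.fftshift(np.fft.fftfreq(height_map.shape[0], d=pixel_mm))
    return freq, np.abs(f) ** 2

n, pix = 256, 2.1 / 256                    # 2.1 mm x 2.1 mm patch
y, x = np.mgrid[0:n, 0:n] * pix
topo = 0.02 * np.sin(2 * np.pi * x / 0.5)  # ridges spaced ~0.5 mm apart
freq, power = wrinkle_spectrum(topo, pix)
iy, ix = np.unravel_index(power.argmax(), power.shape)
print("dominant spatial frequency ~", abs(freq[ix]), "cycles/mm")  # ~2
```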

  9. Quantification of risks from technology for improved plant reliability

    SciTech Connect

    Rode, D.M.

    1996-12-31

    One of the least understood, and therefore least appreciated, threats to profitability is risk from power plant technologies such as steam generators, turbines, and electrical systems. To effectively manage technological risks, business decisions need to be based on knowledge. The paper describes a risk quantification process that combines technical knowledge and judgments with commercial consequences. The three principal alternatives for managing risk, as well as risk mitigation techniques for significant equipment within a power plant, are reported. The result is to equip the decision maker with a comprehensive picture of the risk exposures, enabling cost-effective activities to be undertaken to improve a plant's reliability.

  10. Uncertainty Quantification and Statistical Engineering for Hypersonic Entry Applications

    NASA Technical Reports Server (NTRS)

    Cozmuta, Ioana

    2011-01-01

    NASA has invested significant resources in developing and validating a mathematical construct for TPS margin management that is: a) tailorable for low/high reliability missions; and b) tailorable for ablative/reusable TPS. Uncertainty Quantification and Statistical Engineering are valuable tools that are not exploited enough, and strategies combining both theoretical tools and experimental methods need to be defined. The main purpose of this lecture is to give a flavor of where UQ and SE could contribute, in the hope that the broader community will work with us to improve in these areas.

  11. Current peptidomics: applications, purification, identification, quantification, and functional analysis.

    PubMed

    Dallas, David C; Guerrero, Andres; Parker, Evan A; Robinson, Randall C; Gan, Junai; German, J Bruce; Barile, Daniela; Lebrilla, Carlito B

    2015-03-01

    Peptidomics is an emerging field branching from proteomics that targets endogenously produced protein fragments. Endogenous peptides are often functional within the body, and can be both beneficial and detrimental. This review covers the use of peptidomics in understanding digestion and in identifying functional peptides and biomarkers. Various techniques for peptide and glycopeptide extraction, at both analytical and preparative scales, and available options for peptide detection with MS are discussed. Current algorithms for peptide sequence determination, and both analytical and computational techniques for quantification, are compared. Techniques for statistical analysis, sequence mapping, enzyme prediction, and peptide function and structure prediction are explored. PMID:25429922

  12. Uncertainty quantification in fission cross section measurements at LANSCE

    SciTech Connect

    Tovesson, F.

    2015-01-09

    Neutron-induced fission cross sections have been measured for several isotopes of uranium and plutonium at the Los Alamos Neutron Science Center (LANSCE) over a wide range of incident neutron energies. The total uncertainties in these measurements are in the range 3–5% for incident neutron energies above 100 keV, resulting from uncertainties in the target, neutron source, and detector system. The individual sources of uncertainty are assumed to be uncorrelated; however, correlations in the cross section across neutron energy bins are considered. The quantification of the uncertainty contributions is described here.

  13. Quantification of toxicological effects for dichloromethane. Final report

    SciTech Connect

    Not Available

    1992-01-01

    The document discusses the quantification of non-carcinogenic and carcinogenic effects for dichloromethane. The evaluation of non-carcinogenic effects includes a study of short- and long-term effects in animals and humans, as well as the development of one-day, ten-day, and long-term health advisories. The evaluation of carcinogenic effects includes a categorization of carcinogenic potential and risk estimates. There is a brief discussion of existing guidelines or standards and special considerations such as high-risk groups.

  14. Current peptidomics: Applications, purification, identification, quantification, and functional analysis

    PubMed Central

    Dallas, David C.; Guerrero, Andres; Parker, Evan A.; Robinson, Randall C.; Gan, Junai; German, J. Bruce; Barile, Daniela; Lebrilla, Carlito B.

    2015-01-01

    Peptidomics is an emerging field branching from proteomics that targets endogenously produced protein fragments. Endogenous peptides are often functional within the body—and can be both beneficial and detrimental. This review covers the use of peptidomics in understanding digestion and in identifying functional peptides and biomarkers. Various techniques for peptide and glycopeptide extraction, at both analytical and preparative scales, and available options for peptide detection with MS are discussed. Current algorithms for peptide sequence determination, and both analytical and computational techniques for quantification, are compared. Techniques for statistical analysis, sequence mapping, enzyme prediction, and peptide function and structure prediction are explored. PMID:25429922

  15. Source-Code Instrumentation and Quantification of Events

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.; Havelund, Klaus; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Aspect Oriented Programming (AOP) is making quantified programmatic assertions over programs that otherwise are not annotated to receive these assertions. Varieties of AOP systems are characterized by which quantified assertions they allow, what they permit in the actions of the assertions (including how the actions interact with the base code), and what mechanisms they use to achieve the overall effect. Here, we argue that all quantification is over dynamic events, and describe our preliminary work in developing a system that maps dynamic events to transformations over source code. We discuss possible applications of this system, particularly with respect to debugging concurrent systems.

  16. Quantification of quantum discord in an antiferromagnetic Heisenberg compound

    SciTech Connect

    Singh, H.; Chakraborty, T.; Mitra, C.

    2014-04-24

    An experimental quantification of concurrence and quantum discord from heat capacity (Cp) measurements performed on a solid-state system is reported. In this work, thermodynamic measurements were performed on copper nitrate (CN, Cu(NO3)2·2.5H2O) single crystals, an alternating Heisenberg antiferromagnetic spin-1/2 system. CN, being a weakly dimerized antiferromagnet, is an ideal system for investigating correlations between spins. Theoretical expressions were used to obtain concurrence and quantum discord curves as a function of temperature from the heat capacity data of a real macroscopic system, CN.

  17. Aspect-Oriented Programming is Quantification and Implicit Invocation

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.; Friedman, Daniel P.; Koga, Dennis (Technical Monitor)

    2001-01-01

    We propose that the distinguishing characteristic of Aspect-Oriented Programming (AOP) languages is that they allow programming by making quantified programmatic assertions over programs that lack local notation indicating the invocation of these assertions. This suggests that AOP systems can be analyzed with respect to three critical dimensions: the kinds of quantifications allowed, the nature of the interactions that can be asserted, and the mechanism for combining base-level actions with asserted actions. Consequences of this perspective are the recognition that certain systems are not AOP and that some mechanisms are metabolism: they are sufficiently expressive to allow straightforwardly programming an AOP system within them.

  18. Uncertainty Quantification in Fission Cross Section Measurements at LANSCE

    SciTech Connect

    Tovesson, F.

    2015-01-15

    Neutron-induced fission cross sections have been measured for several isotopes of uranium and plutonium at the Los Alamos Neutron Science Center (LANSCE) over a wide range of incident neutron energies. The total uncertainties in these measurements are in the range 3–5% for incident neutron energies above 100 keV, resulting from uncertainties in the target, neutron source, and detector system. The individual sources of uncertainty are assumed to be uncorrelated; however, correlations in the cross section across neutron energy bins are considered. The quantification of the uncertainty contributions is described here.

  19. Preliminary Results on Uncertainty Quantification for Pattern Analytics

    SciTech Connect

    Stracuzzi, David John; Brost, Randolph; Chen, Maximillian Gene; Malinas, Rebecca; Peterson, Matthew Gregor; Phillips, Cynthia A.; Robinson, David G.; Woodbridge, Diane

    2015-09-01

    This report summarizes preliminary research into uncertainty quantification for pattern analytics within the context of the Pattern Analytics to Support High-Performance Exploitation and Reasoning (PANTHER) project. The primary focus of PANTHER was to make large quantities of remote sensing data searchable by analysts. The work described in this report adds nuance to both the initial data preparation steps and the search process. Search queries are transformed from "does the specified pattern exist in the data?" to "how certain is the system that the returned results match the query?" We show example results for both data processing and search, and discuss a number of possible improvements for each.

  20. [Quantification of levels of serum antirabies antibodies in vaccinated individuals].

    PubMed

    Süliová, J; Benísek, Z; Svrcek, S; Durove, A; Závadová, J

    1994-02-01

    The authors developed a kit for the assessment of anti-rabies antibodies in human immunized sera by the ELISA immunoenzymatic method. The results of the detection and quantification of anti-rabies antibodies obtained by ELISA were compared with those from classical procedures (virus-neutralization test in mice, indirect hemagglutination test), and sufficient correlation and sensitivity of the immunoenzymatic method were demonstrated. The developed test makes it possible to determine the level of anti-rabies virus-neutralizing IgG antibodies. (Tab. 2, Fig. 1, Ref. 25). PMID:7922630

  1. Experimental validation of equations for 2D DIC uncertainty quantification.

    SciTech Connect

    Reu, Phillip L.; Miller, Timothy J.

    2010-03-01

    Uncertainty quantification (UQ) equations have been derived for predicting matching uncertainty in two-dimensional digital image correlation (DIC) a priori. These equations include terms that represent the image noise and image contrast. Researchers at the University of South Carolina have extended previous 1D work to calculate matching errors in 2D. These 2D equations have been coded into a Sandia National Laboratories UQ software package to predict the uncertainty for DIC images. This paper presents those equations and the resulting error surfaces for trial speckle images. Comparison of the UQ results with experimentally subpixel-shifted images is also discussed.
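    For orientation, a commonly cited form of such an a-priori DIC estimate predicts the matching standard deviation from the image noise and the summed squared intensity gradients over the subset. The sketch below uses that form as an assumption; it is our paraphrase of the DIC UQ literature, not the exact equations of this report.

```python
import numpy as np

def dic_displacement_std(subset, noise_std):
    """A-priori matching uncertainty along x for one subset:
    sigma_u ~ sqrt(2) * sigma_noise / sqrt(sum of squared x-gradients).
    (Assumed form from the DIC UQ literature.)"""
    gx = np.gradient(subset.astype(float), axis=1)
    return np.sqrt(2.0) * noise_std / np.sqrt((gx ** 2).sum())

rng = np.random.default_rng(0)
speckle = rng.integers(0, 256, size=(21, 21))  # synthetic speckle subset
print(dic_displacement_std(speckle, noise_std=2.0), "pixels")
```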

  2. Development of magnetic resonance technology for noninvasive boron quantification

    SciTech Connect

    Bradshaw, K.M.

    1990-11-01

    Boron magnetic resonance imaging (MRI) and spectroscopy (MRS) were developed in support of the noninvasive boron quantification task of the Idaho National Engineering Laboratory (INEL) Power Burst Facility/Boron Neutron Capture Therapy (PBF/BNCT) program. The hardware and software described in this report are modifications specific to a GE Signa(TM) MRI system, release 3.X, and are necessary for boron magnetic resonance operation. The technology developed in this task has been applied to obtaining animal pharmacokinetic data for boron compounds (drug time response) and the noninvasive in-vivo localization of boron in animal tissue. 9 refs., 21 figs.

  3. Prospective Comparison of Liver Stiffness Measurements between Two Point Shear Wave Elastography Methods: Virtual Touch Quantification and Elastography Point Quantification

    PubMed Central

    Yoo, Hyunsuk; Yoon, Jeong Hee; Lee, Dong Ho; Chang, Won; Han, Joon Koo

    2016-01-01

    Objective To prospectively compare technical success rate and reliable measurements of virtual touch quantification (VTQ) elastography and elastography point quantification (ElastPQ), and to correlate liver stiffness (LS) measurements obtained by the two elastography techniques. Materials and Methods Our study included 85 patients, 80 of whom were previously diagnosed with chronic liver disease. The technical success rate and reliable measurements of the two kinds of point shear wave elastography (pSWE) techniques were compared by χ2 analysis. LS values measured using the two techniques were compared and correlated via Wilcoxon signed-rank test, Spearman correlation coefficient, and 95% Bland-Altman limit of agreement. The intraobserver reproducibility of ElastPQ was determined by 95% Bland-Altman limit of agreement and intraclass correlation coefficient (ICC). Results The two pSWE techniques showed similar technical success rate (98.8% for VTQ vs. 95.3% for ElastPQ, p = 0.823) and reliable LS measurements (95.3% for VTQ vs. 90.6% for ElastPQ, p = 0.509). The mean LS measurements obtained by VTQ (1.71 ± 0.47 m/s) and ElastPQ (1.66 ± 0.41 m/s) were not significantly different (p = 0.209). The LS measurements obtained by the two techniques showed strong correlation (r = 0.820); in addition, the 95% limit of agreement of the two methods was 27.5% of the mean. Finally, the ICC of repeat ElastPQ measurements was 0.991. Conclusion Virtual touch quantification and ElastPQ showed similar technical success rate and reliable measurements, with strongly correlated LS measurements. However, the two methods are not interchangeable due to the large limit of agreement. PMID:27587964
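    The 95% limit-of-agreement statistic used above is straightforward to reproduce. A minimal sketch with illustrative stiffness values (not the study's data):

```python
import numpy as np

def bland_altman_limits(a, b):
    """Mean bias and 95% limits of agreement of paired measurements."""
    diff = np.asarray(a, float) - np.asarray(b, float)
    bias, sd = diff.mean(), diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

vtq = [1.2, 1.5, 1.8, 2.3, 1.4]  # illustrative stiffness values, m/s
epq = [1.1, 1.6, 1.7, 2.0, 1.5]
bias, (lo, hi) = bland_altman_limits(vtq, epq)
print(f"bias {bias:+.3f} m/s, LoA [{lo:.3f}, {hi:.3f}] m/s")
```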

  4. Volumetric loss quantification using ultrasonic inductively coupled transducers

    NASA Astrophysics Data System (ADS)

    Gong, Peng; Hay, Thomas R.; Greve, David W.; Oppenheim, Irving J.

    2015-03-01

    The pulse-echo method is widely used for plate and pipe thickness measurement. However, it does not work well for detecting localized volumetric loss in thick-wall tubes, as created by erosion damage, when the morphology of the volumetric loss is irregular and can reflect ultrasonic pulses away from the transducer, making it difficult to detect an echo. In this paper, we propose a novel method using an inductively coupled transducer to generate longitudinal waves propagating in a thick-wall aluminum tube for volumetric loss quantification. In the experiment, the longitudinal waves exhibit diffraction effects during propagation, which can be explained by the Huygens-Fresnel principle. The diffracted waves are also shown to be significantly delayed by machined volumetric loss on the inside surface of the thick-wall aluminum tube. It is further shown that the inductively coupled transducers can generate and receive ultrasonic waves similar to those from wired transducers, and that they perform as well as wired transducers in volumetric loss quantification under the same conditions.

  5. Quantification of HEV RNA by Droplet Digital PCR.

    PubMed

    Nicot, Florence; Cazabat, Michelle; Lhomme, Sébastien; Marion, Olivier; Sauné, Karine; Chiabrando, Julie; Dubois, Martine; Kamar, Nassim; Abravanel, Florence; Izopet, Jacques

    2016-01-01

    The sensitivity of real-time PCR for hepatitis E virus (HEV) RNA quantification differs greatly among techniques. Standardized tools that measure the real quantity of virus are needed. We assessed the performance of a reverse transcription droplet digital PCR (RT-ddPCR) assay that gives absolute quantities of HEV RNA. Analytical and clinical validation was done on HEV genotypes 1, 3 and 4, and was based on open reading frame (ORF)3 amplification. The within-run and between-run reproducibilities were very good, the analytical sensitivity was 80 HEV RNA international units (IU)/mL and linearities of HEV genotype 1, 3 and 4 were very similar. Clinical validation based on 45 samples of genotype 1, 3 or 4 gave results that correlated well with a validated reverse transcription quantitative PCR (RT-qPCR) assay (Spearman rs = 0.89, p < 0.0001). The RT-ddPCR assay is a sensitive method and could be a promising tool for standardizing HEV RNA quantification in various sample types. PMID:27548205

  6. Simple and inexpensive quantification of ammonia in whole blood.

    PubMed

    Ayyub, Omar B; Behrens, Adam M; Heligman, Brian T; Natoli, Mary E; Ayoub, Joseph J; Cunningham, Gary; Summar, Marshall; Kofinas, Peter

    2015-01-01

    Quantification of ammonia in whole blood has applications in the diagnosis and management of many hepatic diseases, including cirrhosis and rare urea cycle disorders, which together affect more than 5 million patients in the United States. Current techniques for ammonia measurement suffer from limited range, poor resolution, false positives, or large, complex sensor set-ups. Here we demonstrate a technique utilizing inexpensive reagents and simple methods for quantifying ammonia in 100 μL of whole blood. The sensor comprises a modified form of the indophenol reaction, which resists sources of destructive interference in blood, in conjunction with a cation-exchange membrane. The presented sensing scheme is selective against other amine-containing molecules such as amino acids and has a shelf life of at least 50 days. The resulting system has high sensitivity and allows accurate, reliable quantification of ammonia in whole human blood over at least the range 25 to 500 μM, which is clinically relevant for rare hyperammonemic disorders and liver disease. Furthermore, concentrations of 50 and 100 μM ammonia could be reliably discerned (p = 0.0001). PMID:25936660

  7. An on-bacterium flow cytometric immunoassay for protein quantification.

    PubMed

    Lan, Wen-Jun; Lan, Wei; Wang, Hai-Yan; Yan, Lei; Wang, Zhe-Li

    2013-09-01

    The polystyrene bead-based flow cytometric immunoassay has been widely reported. However, the preparation of functional polystyrene beads is still inconvenient. This study describes a simple and easy on-bacterium flow cytometric immunoassay for protein quantification, in which Staphylococcus aureus (SAC) is used as an antibody-antigen carrier to replace the polystyrene bead. The SAC beads were prepared by carboxyfluorescein diacetate succinimidyl ester (CFSE) labeling, paraformaldehyde fixation and antibody binding. Carcinoembryonic antigen (CEA) and cytokeratin-19 fragment (CYFRA 21-1) proteins were used as models in the test system. Using the prepared SAC beads, biotinylated proteins, and streptavidin-phycoerythrin (SA-PE), the on-bacterium flow cytometric immunoassay was validated by quantifying CEA and CYFRA 21-1 in samples. The data demonstrated a linear relationship between the logarithm of the protein concentration and the logarithm of the PE mean fluorescence intensity (MFI). The limit of detection (LOD) of this immunoassay was at least 0.25 ng/ml. Precision and accuracy assessments showed that both the relative standard deviation (R.S.D.) and the relative error (R.E.) were <10%. Comparison between this immunoassay and a polystyrene bead-based flow cytometric immunoassay showed a correlation coefficient of 0.998 for serum CEA and 0.996 for serum CYFRA 21-1. In conclusion, the on-bacterium flow cytometric immunoassay may be of use in the quantification of serum proteins. PMID:23739299

  8. Is HBsAg quantification ready for prime time?

    PubMed

    Chevaliez, Stéphane

    2013-12-01

    Despite the availability of an efficient hepatitis B vaccine, approximately 240 million individuals are chronically infected with hepatitis B virus worldwide. One-fourth of hepatitis B surface antigen (HBsAg)-positive patients will develop complications, such as cirrhosis or hepatocellular carcinoma, both major causes of liver-related deaths. Antiviral therapies, such as pegylated interferon alpha or nucleoside/nucleotide analogues, are effective in suppressing HBV DNA and reducing the subsequent risk of fibrosis progression, cirrhosis and hepatocellular carcinoma. HBsAg has proven to be a steady, reliable marker of chronic HBV carriage that can also be used to predict clinical outcomes. Three commercial enzyme immunoassays are now available for HBsAg quantification. A number of recent studies have shown the clinical utility of HBsAg quantification in combination with HBV DNA levels to identify inactive carriers who need antiviral therapy, and in interferon-treated patients to predict the virological response to pegylated interferon alpha. PMID:23932705

  9. Functional error modeling for uncertainty quantification in hydrogeology

    NASA Astrophysics Data System (ADS)

    Josset, L.; Ginsbourger, D.; Lunati, I.

    2015-02-01

    Approximate models (proxies) can be employed to reduce the computational costs of estimating uncertainty. The price to pay is that the approximations introduced by the proxy model can lead to a biased estimation. To avoid this problem and ensure a reliable uncertainty quantification, we propose to combine functional data analysis and machine learning to build error models that allow us to obtain an accurate prediction of the exact response without solving the exact model for all realizations. We build the relationship between proxy and exact model on a learning set of geostatistical realizations for which both exact and approximate solvers are run. Functional principal components analysis (FPCA) is used to investigate the variability in the two sets of curves and reduce the dimensionality of the problem while maximizing the retained information. Once obtained, the error model can be used to predict the exact response of any realization on the basis of the sole proxy response. This methodology is purpose-oriented as the error model is constructed directly for the quantity of interest, rather than for the state of the system. Also, the dimensionality reduction performed by FPCA allows a diagnostic of the quality of the error model to assess the informativeness of the learning set and the fidelity of the proxy to the exact model. The possibility of obtaining a prediction of the exact response for any newly generated realization suggests that the methodology can be effectively used beyond the context of uncertainty quantification, in particular for Bayesian inference and optimization.
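    The proxy-to-exact error-model idea can be sketched in a few lines: reduce both sets of curves with a principal component decomposition, then learn a map from proxy scores to exact scores. The sketch below uses plain PCA on discretized curves as a stand-in for FPCA and ridge regression as the learning step; the paper's actual choices may differ.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 100)

# Learning set: proxy and exact curves for the same 40 realizations
shifts = rng.normal(0.0, 0.1, 40)
proxy = np.array([np.sin(2 * np.pi * (t + s)) for s in shifts])
exact = proxy + 0.3 * np.cos(4 * np.pi * t) * (1 + 0.1 * shifts)[:, None]

# FPCA stand-in: plain PCA on the discretized curves
pca_p = PCA(n_components=5).fit(proxy)
pca_e = PCA(n_components=5).fit(exact)

# Error model: map proxy scores to exact scores
reg = Ridge(alpha=1e-3).fit(pca_p.transform(proxy), pca_e.transform(exact))

def predict_exact(new_proxy_curves):
    """Predict the exact response from the proxy response alone."""
    scores = reg.predict(pca_p.transform(new_proxy_curves))
    return pca_e.inverse_transform(scores)
```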

  10. Amperometric quantification based on serial dilution microfluidic systems.

    PubMed

    Stephan, Khaled; Pittet, Patrick; Sigaud, Monique; Renaud, Louis; Vittori, Olivier; Morin, Pierre; Ouaini, Naim; Ferrigno, Rosaria

    2009-03-01

    This paper describes a microfluidic device fabricated in poly(dimethylsiloxane) that was employed to perform amperometric quantifications using on-chip calibration curves and on-chip standard addition methods. The device integrates a network of Au electrodes within a microfluidic structure designed for automatic preparation of a series of solutions containing an electroactive molecule at linearly decreasing concentrations. The device was first characterized by fluorescence microscopy and then evaluated with a model electroactive molecule, Fe(CN)6^4-. Operating a quantification in this parallel microfluidic approach rather than in batch mode reduces analysis time. Moreover, the microfluidic approach is compatible with on-chip calibration of sensors simultaneously with the analysis, preventing problems due to sensor response drift over time. Using the on-chip calibration and on-chip standard addition methods, we achieved concentration estimates accurate to better than 5%. We also demonstrated that, compared to the calibration curve approach, the standard addition mode is less complex to operate: in this case, it is not necessary to take into account flow rate discrepancies as in the calibration approach. PMID:19238282
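    The on-chip standard addition method quantifies the unknown from the x-intercept of a signal-versus-added-concentration regression. A minimal sketch with illustrative numbers:

```python
import numpy as np

def standard_addition_conc(added, signal):
    """Fit signal = m*added + b; the unknown concentration is b/m,
    the magnitude of the regression line's x-intercept."""
    m, b = np.polyfit(added, signal, 1)
    return b / m

added = np.array([0.0, 10.0, 20.0, 30.0])  # added standard, uM
signal = np.array([1.0, 1.5, 2.0, 2.5])    # illustrative currents, uA
print(standard_addition_conc(added, signal), "uM in the sample")  # 20.0
```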

  11. Automated quantification of nuclear immunohistochemical markers with different complexity.

    PubMed

    López, Carlos; Lejeune, Marylène; Salvadó, María Teresa; Escrivà, Patricia; Bosch, Ramón; Pons, Lluis E; Alvaro, Tomás; Roig, Jordi; Cugat, Xavier; Baucells, Jordi; Jaén, Joaquín

    2008-03-01

    Manual quantification of immunohistochemically stained nuclear markers is still laborious and subjective, and computerized systems for digital image analysis have not yet resolved the problems of nuclear clustering. In this study, we designed a new automatic procedure for quantifying various immunohistochemical nuclear markers with variable clustering complexity. The procedure consists of two combined macros. The first, developed with commercial software, enables analysis of the digital images using color and morphological segmentation, including a masking process. All information extracted with this first macro is automatically exported to an Excel datasheet, where a second macro composed of four different algorithms analyzes the information and calculates the definitive number of positive nuclei for each image. One hundred and eighteen images with different levels of clustering complexity were analyzed and compared with the manual quantification obtained by a trained observer. Statistical analysis indicated high reliability (intra-class correlation coefficient > 0.950) and no significant differences between the two methods. A Bland-Altman plot and Kaplan-Meier curves indicated that the results of the two methods were concordant for around 90% of the analyzed images. In conclusion, this new automated procedure is an objective, faster and reproducible method with an excellent level of accuracy, even for digital images of high complexity. PMID:18172664

  12. Antibiotic Resistome: Improving Detection and Quantification Accuracy for Comparative Metagenomics.

    PubMed

    Elbehery, Ali H A; Aziz, Ramy K; Siam, Rania

    2016-04-01

    The unprecedented rise of life-threatening antibiotic resistance (AR), combined with the unparalleled advances in DNA sequencing of genomes and metagenomes, has pushed the need for in silico detection of the resistance potential of clinical and environmental metagenomic samples through the quantification of AR genes (i.e., genes conferring antibiotic resistance). Therefore, determining an optimal methodology to quantitatively and accurately assess AR genes in a given environment is pivotal. Here, we optimized and improved existing AR detection methodologies from metagenomic datasets to properly consider AR-generating mutations in antibiotic target genes. Through comparative metagenomic analysis of previously published AR gene abundance in three publicly available metagenomes, we illustrate how mutation-generated resistance genes are either falsely assigned or neglected, which alters the detection and quantitation of the antibiotic resistome. In addition, we inspected factors influencing the outcome of AR gene quantification using metagenome simulation experiments, and identified that genome size, AR gene length, total number of metagenomics reads and selected sequencing platforms had pronounced effects on the level of detected AR. In conclusion, our proposed improvements in the current methodologies for accurate AR detection and resistome assessment show reliable results when tested on real and simulated metagenomic datasets. PMID:27031878

  13. A Spanish model for quantification and management of construction waste.

    PubMed

    Solís-Guzmán, Jaime; Marrero, Madelyn; Montes-Delgado, Maria Victoria; Ramírez-de-Arellano, Antonio

    2009-09-01

    Currently, construction and demolition waste (C&D waste) is a worldwide issue that concerns not only governments but also the building actors involved in construction activity. In Spain, a new national decree has been regulating the production and management of C&D waste since February 2008. The present work describes the waste management model that has inspired this decree: the Alcores model implemented with good results in Los Alcores Community (Seville, Spain). A detailed model is also provided to estimate the volume of waste that is expected to be generated on the building site. The quantification of C&D waste volume, from the project stage, is essential for the building actors to properly plan and control its disposal. This quantification model has been developed by studying 100 dwelling projects, especially their bill of quantities, and defining three coefficients to estimate the demolished volume (CT), the wreckage volume (CR) and the packaging volume (CE). Finally, two case studies are included to illustrate the usefulness of the model to estimate C&D waste volume in both new construction and demolition projects. PMID:19523801
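    The three-coefficient estimate can be illustrated with a short sketch. The coefficient values and the per-square-metre basis below are hypothetical; the paper derives CT, CR and CE from the bills of quantities of 100 dwelling projects.

```python
def cd_waste_volume_m3(built_area_m2, ct=0.02, cr=0.08, ce=0.04):
    """Estimated C&D waste volume as the sum of three contributions.
    Coefficient values here are hypothetical placeholders."""
    demolished = ct * built_area_m2  # CT: demolished volume coefficient
    wreckage = cr * built_area_m2    # CR: wreckage volume coefficient
    packaging = ce * built_area_m2   # CE: packaging volume coefficient
    return demolished + wreckage + packaging

print(cd_waste_volume_m3(1200.0), "m^3 for a 1200 m^2 dwelling project")
```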

  14. A Spanish model for quantification and management of construction waste

    SciTech Connect

    Solis-Guzman, Jaime; Marrero, Madelyn; Montes-Delgado, Maria Victoria; Ramirez-de-Arellano, Antonio

    2009-09-15

    Currently, construction and demolition waste (C&D waste) is a worldwide issue that concerns not only governments but also the building actors involved in construction activity. In Spain, a new national decree has been regulating the production and management of C&D waste since February 2008. The present work describes the waste management model that has inspired this decree: the Alcores model implemented with good results in Los Alcores Community (Seville, Spain). A detailed model is also provided to estimate the volume of waste that is expected to be generated on the building site. The quantification of C&D waste volume, from the project stage, is essential for the building actors to properly plan and control its disposal. This quantification model has been developed by studying 100 dwelling projects, especially their bill of quantities, and defining three coefficients to estimate the demolished volume (CT), the wreckage volume (CR) and the packaging volume (CE). Finally, two case studies are included to illustrate the usefulness of the model to estimate C&D waste volume in both new construction and demolition projects.

  15. Concurrent quantification of tryptophan and its major metabolites

    PubMed Central

    Lesniak, Wojciech G.; Jyoti, Amar; Mishra, Manoj K.; Louissaint, Nicolette; Romero, Roberto; Chugani, Diane C.; Kannan, Sujatha; Kannan, Rangaramanujam M.

    2014-01-01

    An imbalance in tryptophan (TRP) metabolites is associated with several neurological and inflammatory disorders. Therefore, analytical methods allowing for simultaneous quantification of TRP and its major metabolites would be highly desirable, and may be valuable as potential biomarkers. We have developed a HPLC method for concurrent quantitative determination of tryptophan, serotonin, 5-hydroxyindoleacetic acid, kynurenine, and kynurenic acid in tissue and fluids. The method utilizes the intrinsic spectroscopic properties of TRP and its metabolites that enable UV absorbance and fluorescence detection by HPLC, without additional labeling. The origin of the peaks related to analytes of interest was confirmed by UV–Vis spectral patterns using a PDA detector and mass spectrometry. The developed methods were validated in rabbit fetal brain and amniotic fluid at gestational day 29. Results are in excellent agreement with those reported in the literature for the same regions. This method allows for rapid quantification of tryptophan and four of its major metabolites concurrently. A change in the relative ratios of these metabolites can provide important insights in predicting the presence and progression of neuroinflammation in disorders such as cerebral palsy, autism, multiple sclerosis, Alzheimer disease, and schizophrenia. PMID:24036037

  16. Ultrasound strain imaging for quantification of tissue function: cardiovascular applications

    NASA Astrophysics Data System (ADS)

    de Korte, Chris L.; Lopata, Richard G. P.; Hansen, Hendrik H. G.

    2013-03-01

    With ultrasound imaging, the motion and deformation of tissue can be measured. Tissue can be deformed by applying a force to it, and the resulting deformation is a function of its mechanical properties. Quantification of this resulting tissue deformation to assess the mechanical properties of tissue is called elastography. If the tissue under interrogation is actively deforming, the deformation is directly related to its function, and quantification of this deformation is normally referred to as 'strain imaging'. Elastography can be used for atherosclerotic plaque characterization, while the contractility of the heart or skeletal muscles can be assessed with strain imaging. We developed radio-frequency (RF) based ultrasound methods to assess the deformation at higher resolution and with higher accuracy than commercial methods using conventional image data (Tissue Doppler Imaging and 2D speckle tracking methods). However, the improvement in accuracy is mainly achieved when measuring strain along the ultrasound beam direction, i.e., in 1D. We further extended this method to multiple directions and improved precision by compounding data acquired at multiple beam-steered angles. In arteries, the presence of vulnerable plaques may lead to acute events like stroke and myocardial infarction. Consequently, timely detection of these plaques is of great diagnostic value. Non-invasive ultrasound strain compounding is currently being evaluated as a diagnostic tool to identify the vulnerability of plaques. In the heart, we determined the strain locally and at high resolution, resulting in a local assessment in contrast to conventional global functional parameters like cardiac output or shortening fraction.

  17. Quantification of liver fibrosis in chronic hepatitis B virus infection

    PubMed Central

    Jieanu, CF; Ungureanu, BS; Săndulescu, DL; Gheonea, IA; Tudorașcu, DR; Ciurea, ME; Purcărea, VL

    2015-01-01

    Chronic hepatitis B virus (HBV) infection is considered a global public health issue, with more than 78,000 people per year dying from its evolution. With liver transplantation the only viable therapeutic option, and only in end-stage disease, the progression of hepatitis B may be influenced by various factors. Assessing the fibrosis stage plays an important part in future decisions on the patient's health, with available antiviral agents capable of preventing fibrosis from progressing to end-stage liver disease. Several methods have been considered as alternatives for quantifying HBV-related fibrosis status, such as imaging techniques and serum-based biomarkers. Magnetic resonance imaging, ultrasound, and elastography are non-invasive imaging techniques frequently used to quantify disease progression as well as the patient's future prognosis. Both direct and indirect biomarkers have also been studied for differentiating between fibrosis stages. This paper reviews the current standing of non-invasive liver fibrosis quantification in HBV, presenting the prognostic factors and available assessment procedures that might eventually replace liver biopsy. PMID:26351528

  18. Quantification of HEV RNA by Droplet Digital PCR

    PubMed Central

    Nicot, Florence; Cazabat, Michelle; Lhomme, Sébastien; Marion, Olivier; Sauné, Karine; Chiabrando, Julie; Dubois, Martine; Kamar, Nassim; Abravanel, Florence; Izopet, Jacques

    2016-01-01

    The sensitivity of real-time PCR for hepatitis E virus (HEV) RNA quantification differs greatly among techniques. Standardized tools that measure the real quantity of virus are needed. We assessed the performance of a reverse transcription droplet digital PCR (RT-ddPCR) assay that gives absolute quantities of HEV RNA. Analytical and clinical validation was done on HEV genotypes 1, 3 and 4, and was based on open reading frame (ORF)3 amplification. The within-run and between-run reproducibilities were very good, the analytical sensitivity was 80 HEV RNA international units (IU)/mL and linearities of HEV genotype 1, 3 and 4 were very similar. Clinical validation based on 45 samples of genotype 1, 3 or 4 gave results that correlated well with a validated reverse transcription quantitative PCR (RT-qPCR) assay (Spearman rs = 0.89, p < 0.0001). The RT-ddPCR assay is a sensitive method and could be a promising tool for standardizing HEV RNA quantification in various sample types. PMID:27548205

  19. Accurate quantification of supercoiled DNA by digital PCR.

    PubMed

    Dong, Lianhua; Yoo, Hee-Bong; Wang, Jing; Park, Sang-Ryoul

    2016-01-01

    Digital PCR (dPCR), an enumeration-based quantification method, is capable of quantifying DNA copy number without the help of standards. However, it can generate false results when the PCR conditions are not optimized. A recent international comparison (CCQM P154) showed that most laboratories significantly underestimated the concentration of supercoiled plasmid DNA by dPCR. Usually, supercoiled DNA is linearized before dPCR to avoid such underestimation. The present study was conducted to overcome this problem. In the bilateral comparison, the National Institute of Metrology, China (NIM) optimized and applied dPCR for supercoiled DNA determination, whereas the Korea Research Institute of Standards and Science (KRISS) prepared the unknown samples and quantified them by flow cytometry. In this study, several factors, such as the selection of the PCR master mix, the fluorescent label, and the position of the primers, were evaluated for quantifying supercoiled DNA by dPCR. This work confirmed that a 16S PCR master mix avoided poor amplification of the supercoiled DNA, whereas HEX labels on the dPCR probe resulted in robust amplification curves. Optimizing the dPCR assay based on these two observations resulted in accurate quantification of supercoiled DNA without preanalytical linearization. The result agreed closely (101-113%) with that from flow cytometry. PMID:27063649
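    Although not spelled out in the abstract, the quantification core of any dPCR assay is the Poisson correction from the fraction of positive partitions. A minimal sketch with illustrative partition counts:

```python
import numpy as np

def dpcr_copies_per_ul(n_positive, n_total, partition_vol_nl):
    """Poisson correction: lambda = -ln(1 - p) mean copies per partition."""
    lam = -np.log(1.0 - n_positive / n_total)
    return lam / (partition_vol_nl * 1e-3)  # nL -> uL

# e.g. 4,500 positives among 20,000 partitions of 0.85 nL
print(dpcr_copies_per_ul(4500, 20000, 0.85), "copies/uL")  # ~300
```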

  20. A critical view on microplastic quantification in aquatic organisms.

    PubMed

    Vandermeersch, Griet; Van Cauwenberghe, Lisbeth; Janssen, Colin R; Marques, Antonio; Granby, Kit; Fait, Gabriella; Kotterman, Michiel J J; Diogène, Jorge; Bekaert, Karen; Robbens, Johan; Devriese, Lisa

    2015-11-01

    Microplastics, plastic particles and fragments smaller than 5 mm, are ubiquitous in the marine environment. Ingestion and accumulation of microplastics have previously been demonstrated for diverse marine species ranging from zooplankton to bivalves and fish, implying the potential for microplastics to accumulate in the marine food web. In this way, microplastics can potentially impact food safety and human health. Although a few methods to quantify microplastics in biota have been described, no comparison and/or intercalibration of these techniques have been performed. Here we conducted a literature review on all available extraction and quantification methods. Two of these methods, involving wet acid destruction, were used to evaluate the presence of microplastics in field-collected mussels (Mytilus galloprovincialis) from three different "hotspot" locations in Europe (Po estuary, Italy; Tagus estuary, Portugal; Ebro estuary, Spain). An average of 0.18±0.14 total microplastics g⁻¹ w.w. for the Acid mix Method and 0.12±0.04 total microplastics g⁻¹ w.w. for the Nitric acid Method was established. Additionally, in a pilot study an average load of 0.13±0.14 total microplastics g⁻¹ w.w. was recorded in commercial mussels (Mytilus edulis and M. galloprovincialis) from five European countries (France, Italy, Denmark, Spain and The Netherlands). A detailed analysis and comparison of methods indicated the need for further research to develop a standardised operating protocol for microplastic quantification and monitoring. PMID:26249746

  1. Generation, Quantification, and Tracing of Metabolically Labeled Fluorescent Exosomes.

    PubMed

    Coscia, Carolina; Parolini, Isabella; Sanchez, Massimo; Biffoni, Mauro; Boussadia, Zaira; Zanetti, Cristiana; Fiani, Maria Luisa; Sargiacomo, Massimo

    2016-01-01

    Over the last 10 years, constant progress in exosome (Exo)-related studies has highlighted the importance of these cell-derived nano-sized vesicles in cell biology and pathophysiology. Functional studies on Exo uptake and intracellular trafficking require accurate quantification to assess the quantity of Exo particles sufficient and/or necessary to elicit measurable effects on target cells. We used commercially available BODIPY® fatty acid analogues to label a primary melanoma cell line (Me501) that spontaneously secretes nanovesicles at high levels. Upon addition to cell culture, BODIPY fatty acids are rapidly incorporated into the major phospholipid classes, ultimately producing fluorescent Exo as a direct result of biogenesis. Our metabolic labeling protocol produced bright fluorescent Exo that can be examined and quantified with conventional, non-customized flow cytometry (FC) instruments by exploiting their fluorescent emission rather than light-scattering detection. Furthermore, our methodology permits the measurement of single Exo-associated fluorescence transfer to cells, making the correlation between Exo uptake and the activation of cellular processes quantitative. The protocol presented here is thus an appropriate tool for investigating the mechanisms of Exo function, in that it allows direct and rapid characterization and quantification of fluorescent Exo number, intensity and size, and eventually evaluation of their uptake/secretion kinetics in target cells. PMID:27317184

  3. Instantaneous Wavenumber Estimation for Damage Quantification in Layered Plate Structures

    NASA Technical Reports Server (NTRS)

    Mesnil, Olivier; Leckey, Cara A. C.; Ruzzene, Massimo

    2014-01-01

    This paper illustrates the application of instantaneous and local wavenumber damage quantification techniques for high-frequency guided wave interrogation. The proposed methodologies can be considered first steps towards a hybrid structural health monitoring/nondestructive evaluation (SHM/NDE) approach for damage assessment in composites. The challenges and opportunities related to this type of interrogation and signal processing are explored through the analysis of numerical data obtained via elastodynamic finite integration technique (EFIT) simulations of damage in CFRP plates. Realistic damage configurations are modeled from X-ray CT scan data of plates subjected to actual impacts, in order to accurately predict wave-damage interactions in terms of scattering and mode conversions. Simulation data are utilized to enhance the information provided by instantaneous and local wavenumbers and to mitigate the complexity related to the multi-modal content of the plate response. Signal processing strategies considered for this purpose include modal decoupling through filtering in the frequency/wavenumber domain, the combination of displacement components, and the exploitation of polarization information for the various modes as evaluated through dispersion analysis of the considered laminate lay-up sequence. The results presented assess the effectiveness of the proposed wavefield processing techniques as a hybrid SHM/NDE technique for damage detection and quantification in composite, plate-like structures.
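
    As a rough illustration of the instantaneous-wavenumber idea (not the authors' exact pipeline), the local wavenumber of a single-mode wavefield snapshot can be estimated as the spatial derivative of the unwrapped phase of its analytic signal. All values below are synthetic; a step in wavenumber stands in for a damage-induced change.

    ```python
    import numpy as np
    from scipy.signal import hilbert

    # Synthetic single-mode wavefield sampled along a line (illustrative values).
    dx = 1e-3                                    # spatial step [m]
    x = np.arange(0.0, 0.5, dx)
    k_true = np.where(x < 0.25, 400.0, 700.0)    # wavenumber step at a "damage" zone
    u = np.cos(np.cumsum(k_true) * dx)           # frozen-time wavefield snapshot

    # Instantaneous wavenumber = spatial gradient of the unwrapped analytic phase.
    k_inst = np.gradient(np.unwrap(np.angle(hilbert(u))), dx)
    print(k_inst[100], k_inst[400])  # ~400 rad/m in the pristine zone, ~700 in the damaged zone
    ```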

  4. Guided Wave Delamination Detection and Quantification With Wavefield Data Analysis

    NASA Technical Reports Server (NTRS)

    Tian, Zhenhua; Campbell Leckey, Cara A.; Seebo, Jeffrey P.; Yu, Lingyu

    2014-01-01

    Unexpected damage can occur in aerospace composites due to impact events or material stress during off-nominal loading events. In particular, laminated composites are susceptible to delamination damage due to weak transverse tensile and inter-laminar shear strengths. The development of reliable, quantitative techniques to detect delamination damage in laminated composites is imperative for safe and functional, optimally designed next-generation composite structures. In this paper, we investigate guided wave interactions with delamination damage and develop quantification algorithms using wavefield data analysis. Trapped guided waves in the delamination region are observed in the wavefield data and further quantitatively interpreted using different wavenumber analysis methods. The frequency-wavenumber representation of the wavefield shows that new wavenumbers are present and correlate to waves trapped in the damage region. These new wavenumbers are used to detect and quantify the delamination damage through wavenumber analysis, which can show how the wavenumber changes as a function of wave propagation distance. The location and spatial extent of the new wavenumbers can be identified, providing a useful means not only of detecting the presence of delamination damage but also of estimating the delamination size. Our method has been applied to detect and quantify real delamination damage with complex geometry (grown using a quasi-static indentation technique). The detection and quantification results show the location, size, and shape of the delamination damage.
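
    A minimal sketch of the frequency-wavenumber representation mentioned above: a 2D FFT of a space-time wavefield exposes any additional wavenumber ridge introduced by waves trapped in a damage region. The wavefield here is synthetic and all parameters are invented for illustration.

    ```python
    import numpy as np

    # Synthetic wavefield u(x, t): one incident mode plus a slower "trapped"
    # mode confined to a damage region (purely illustrative parameters).
    dx, dt = 1e-3, 1e-7
    x, t = np.arange(0, 0.2, dx), np.arange(0, 2e-4, dt)
    X, T = np.meshgrid(x, t, indexing="ij")
    f0 = 100e3                                   # excitation frequency [Hz]
    u = np.sin(500.0 * X - 2 * np.pi * f0 * T)   # nominal mode, k = 500 rad/m
    u += np.sin(900.0 * X - 2 * np.pi * f0 * T) * ((X > 0.08) & (X < 0.12))

    # Frequency-wavenumber amplitude spectrum.
    U = np.fft.fftshift(np.abs(np.fft.fft2(u)))
    k_axis = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(x.size, dx))
    f_axis = np.fft.fftshift(np.fft.fftfreq(t.size, dt))

    # At the excitation frequency, two wavenumber peaks appear: the nominal mode
    # and the new (trapped-wave) wavenumber; the sign encodes travel direction.
    fi = np.argmin(np.abs(f_axis - f0))
    print(k_axis[np.argsort(U[:, fi])[-2:]])     # ~[-911, -503] rad/m
    ```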

  5. Quantification of nerve agent biomarkers in human serum and urine.

    PubMed

    Røen, Bent Tore; Sellevåg, Stig Rune; Lundanes, Elsa

    2014-12-01

    A novel method for rapid and sensitive quantification of the nerve agent metabolites ethyl, isopropyl, isobutyl, cyclohexyl, and pinacolyl methylphosphonic acid has been established by combining salting-out assisted liquid-liquid extraction (SALLE) and online solid phase extraction-liquid chromatography-tandem mass spectrometry (SPE-LC-MS/MS). The procedure allows confirmation of nerve agent exposure within 30 min of receiving a sample, with very low detection limits for the biomarkers of 0.04-0.12 ng/mL. Sample preparation by SALLE was performed in less than 10 min, with a common procedure for both serum and urine. Analyte recoveries of 70-100% were obtained using tetrahydrofuran as the extraction solvent and Na₂SO₄ to achieve phase separation. After SALLE, selective analyte retention was obtained on a ZrO₂ column by Lewis acid-base and hydrophilic interactions, with acetonitrile/1% CH₃COOH (82/18) as the loading mobile phase. The phosphonic acids were backflush-desorbed onto a polymeric zwitterionic column at pH 9.8 and separated by hydrophilic interaction liquid chromatography. The method was linear (R² ≥ 0.995) from the limits of quantification to 50 ng/mL, and the within- and between-assay repeatabilities at 20 ng/mL were below 5% and 10% relative standard deviation, respectively. PMID:25371246

  6. Simple and Inexpensive Quantification of Ammonia in Whole Blood

    PubMed Central

    Ayyub, Omar B.; Behrens, Adam M.; Heligman, Brian T.; Natoli, Mary E.; Ayoub, Joseph J.; Cunningham, Gary; Summar, Marshall; Kofinas, Peter

    2015-01-01

    Quantification of ammonia in whole blood has applications in the diagnosis and management of many hepatic diseases, including cirrhosis and rare urea cycle disorders, amounting to more than 5 million patients in the United States. Current techniques for ammonia measurement suffer from limited range, poor resolution, false positives, or large, complex sensor set-ups. Here we demonstrate a technique utilizing inexpensive reagents and simple methods for quantifying ammonia in 100 μl of whole blood. The sensor comprises a modified form of the indophenol reaction, which resists sources of destructive interference in blood, in conjunction with a cation-exchange membrane. The presented sensing scheme is selective against other amine-containing molecules such as amino acids and has a shelf life of at least 50 days. Additionally, the resulting system has high sensitivity and allows accurate, reliable quantification of ammonia in whole human blood samples over a minimum range of 25 to 500 μM, which is clinically relevant for rare hyperammonemic disorders and liver disease. Furthermore, concentrations of 50 and 100 μM ammonia could be reliably discerned with p=0.0001. PMID:25936660

  7. Quantification of Methylated Selenium, Sulfur, and Arsenic in the Environment

    PubMed Central

    Vriens, Bas; Ammann, Adrian A.; Hagendorfer, Harald; Lenz, Markus; Berg, Michael; Winkel, Lenny H. E.

    2014-01-01

    Biomethylation and volatilization of trace elements may contribute to their redistribution in the environment. However, quantification of volatile, methylated species in the environment is complicated by a lack of straightforward and field-deployable air sampling methods that preserve element speciation. This paper presents a robust and versatile gas trapping method for the simultaneous preconcentration of volatile selenium (Se), sulfur (S), and arsenic (As) species. Using HPLC-HR-ICP-MS and ESI-MS/MS analyses, we demonstrate that volatile Se and S species efficiently transform into specific non-volatile compounds during trapping, which enables the deduction of the original gaseous speciation. With minor adaptations, the presented HPLC-HR-ICP-MS method also allows for the quantification of 13 non-volatile methylated species and oxyanions of Se, S, and As in natural waters. Application of these methods in a peatland indicated that, at the selected sites, fluxes varied between 190–210 ng Se·m⁻²·d⁻¹, 90–270 ng As·m⁻²·d⁻¹, and 4–14 µg S·m⁻²·d⁻¹, and contained at least 70% methylated Se and S species. In the surface water, methylated species were particularly abundant for As (>50% of total As). Our results indicate that methylation plays a significant role in the biogeochemical cycles of these elements. PMID:25047128

  8. Selected Reaction Monitoring Mass Spectrometry for Absolute Protein Quantification.

    PubMed

    Manes, Nathan P; Mann, Jessica M; Nita-Lazar, Aleksandra

    2015-01-01

    Absolute quantification of target proteins within complex biological samples is critical to a wide range of research and clinical applications. This protocol provides step-by-step instructions for the development and application of quantitative assays using selected reaction monitoring (SRM) mass spectrometry (MS). First, likely quantotypic target peptides are identified based on numerous criteria. This includes identifying proteotypic peptides, avoiding sites of posttranslational modification, and analyzing the uniqueness of the target peptide to the target protein. Next, crude external peptide standards are synthesized and used to develop SRM assays, and the resulting assays are used to perform qualitative analyses of the biological samples. Finally, purified, quantified, heavy isotope labeled internal peptide standards are prepared and used to perform isotope dilution series SRM assays. Analysis of all of the resulting MS data is presented. This protocol was used to accurately assay the absolute abundance of proteins of the chemotaxis signaling pathway within RAW 264.7 cells (a mouse monocyte/macrophage cell line). The quantification of Gi2 (a heterotrimeric G-protein α-subunit) is described in detail. PMID:26325288
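
    The closing isotope dilution step reduces to simple arithmetic: the endogenous (light) amount equals the measured light-to-heavy peak-area ratio multiplied by the known amount of spiked heavy standard. A minimal sketch with invented numbers, under the idealizing assumption that the heavy standard behaves identically to the light peptide:

    ```python
    def isotope_dilution_amount(light_area, heavy_area, heavy_spiked_fmol):
        """Absolute endogenous amount from one SRM isotope dilution measurement.

        Assumes the heavy internal standard co-elutes and ionizes identically
        to the light (endogenous) peptide, so peak-area ratio = molar ratio.
        """
        return (light_area / heavy_area) * heavy_spiked_fmol

    # Example: light/heavy area ratio of 0.42 against 50 fmol of heavy standard
    print(isotope_dilution_amount(4.2e5, 1.0e6, 50.0), "fmol")  # -> 21.0 fmol
    ```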

  9. Applying uncertainty quantification to multiphase flow computational fluid dynamics

    SciTech Connect

    Gel, A; Garg, R; Tong, C; Shahnam, M; Guenther, C

    2013-07-01

    Multiphase computational fluid dynamics plays a major role in design and optimization of fossil fuel based reactors. There is a growing interest in accounting for the influence of uncertainties associated with physical systems to increase the reliability of computational simulation based engineering analysis. The U.S. Department of Energy's National Energy Technology Laboratory (NETL) has recently undertaken an initiative to characterize uncertainties associated with computer simulation of reacting multiphase flows encountered in energy producing systems such as a coal gasifier. The current work presents preliminary results in applying non-intrusive parametric uncertainty quantification and propagation techniques with NETL's open-source multiphase computational fluid dynamics software MFIX. For this purpose an open-source uncertainty quantification toolkit, PSUADE, developed at Lawrence Livermore National Laboratory (LLNL), has been interfaced with the MFIX software. In this study, the sources of uncertainty associated with numerical approximation and model form have been neglected, and only model input parametric uncertainty with forward propagation has been investigated, by constructing a surrogate model based on a data-fitted response surface for a multiphase flow demonstration problem. Monte Carlo simulation was employed for forward propagation of the aleatory-type input uncertainties. Several insights gained from these simulations are presented, such as how inadequate characterization of uncertainties can affect the reliability of the prediction results. A global sensitivity study using Sobol' indices was also performed to better understand the contribution of input parameters to the variability observed in the response variable.
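
    The two ingredients named above, forward Monte Carlo propagation and Sobol' sensitivity indices, fit in a few lines once the expensive CFD model is replaced by a cheap analytic stand-in. The sketch below uses the Ishigami function, a standard UQ test case; it is not the MFIX/PSUADE workflow itself.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def model(x):
        """Cheap analytic stand-in for an expensive CFD response (Ishigami)."""
        return (np.sin(x[:, 0]) + 7.0 * np.sin(x[:, 1]) ** 2
                + 0.1 * x[:, 2] ** 4 * np.sin(x[:, 0]))

    n, d = 100_000, 3
    A = rng.uniform(-np.pi, np.pi, (n, d))   # forward Monte Carlo propagation
    B = rng.uniform(-np.pi, np.pi, (n, d))
    fA, fB = model(A), model(B)
    print("output mean/std:", fA.mean(), fA.std())

    # First-order Sobol' indices (Saltelli-style pick-and-freeze estimator).
    var = np.var(np.concatenate([fA, fB]))
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                  # freeze all inputs except input i
        print(f"S_{i + 1} ~ {np.mean(fB * (model(ABi) - fA)) / var:.2f}")
    ```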

  10. Investigations of Some Liquid Matrixes for Analyte Quantification by MALDI

    NASA Astrophysics Data System (ADS)

    Moon, Jeong Hee; Park, Kyung Man; Ahn, Sung Hee; Lee, Seong Hoon; Kim, Myung Soo

    2015-06-01

    Sample inhomogeneity is one of the obstacles preventing the generation of reproducible mass spectra by MALDI and hindering their use for analyte quantification. As a potential solution to this problem, we investigated MALDI with liquid matrixes prepared by nonstoichiometric mixing of acids and bases. Out of 27 combinations of acids and bases, liquid matrixes could be produced from seven. When the overall spectral features were considered, the two liquid matrixes using α-cyano-4-hydroxycinnamic acid as the acid and 3-aminoquinoline or N,N-diethylaniline as the base were the best choices. In our previous study of MALDI with solid matrixes, we found that three requirements had to be met for the generation of reproducible spectra and for analyte quantification: (1) controlling the temperature by fixing the total ion count, (2) plotting the analyte-to-matrix ion ratio versus the analyte concentration as the calibration curve, and (3) keeping the matrix suppression below a critical value. We found that the same requirements had to be met in MALDI with liquid matrixes as well. In particular, although the liquid matrixes tested here were homogeneous, they failed to display spot-to-spot spectral reproducibility unless the first requirement above was met. We also found that analyte-derived ions could not be produced efficiently by MALDI with the above liquid matrixes unless the analyte was sufficiently basic. In this sense, MALDI processes with solid and liquid matrixes should be regarded as complementary techniques rather than competing ones.

  11. Dimensionality reduction for uncertainty quantification of nuclear engineering models.

    SciTech Connect

    Roderick, O.; Wang, Z.; Anitescu, M.

    2011-01-01

    The task of uncertainty quantification consists of relating the available information on uncertainties in the model setup to the resulting variation in the outputs of the model. Uncertainty quantification plays an important role in complex simulation models of nuclear engineering, where better understanding of uncertainty results in greater confidence in the model and in the improved safety and efficiency of engineering projects. In our previous work, we have shown that the effect of uncertainty can be approximated by polynomial regression with derivatives (PRD): a hybrid regression method that uses first-order derivatives of the model output as additional fitting conditions for a polynomial expansion. Numerical experiments have demonstrated the advantage of this approach over classical methods of uncertainty analysis: in precision, computational efficiency, or both. To obtain derivatives, we used automatic differentiation (AD) on the simulation code; hand-coded derivatives are acceptable for simpler models. We now present improvements on the method. We use a tuned version of the method of snapshots, a technique based on proper orthogonal decomposition (POD), to set up the reduced order representation of essential information on uncertainty in the model inputs. The automatically obtained sensitivity information is required to set up the method. Dimensionality reduction in combination with PRD allows analysis on a larger dimension of the uncertainty space (>100), at modest computational cost.
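
    A minimal one-dimensional illustration of the PRD idea (not the authors' implementation): derivative observations, such as those produced by automatic differentiation, are appended as extra rows of the least-squares system for the polynomial coefficients, so each sample point constrains both the value and the slope of the surrogate.

    ```python
    import numpy as np

    def f(x):  return np.exp(0.3 * x)            # stand-in for an expensive model
    def df(x): return 0.3 * np.exp(0.3 * x)      # its derivative (e.g., from AD)

    xs = np.array([0.0, 1.0, 2.0])
    # Quadratic basis: value rows [1, x, x^2] and derivative rows [0, 1, 2x].
    V = np.column_stack([np.ones_like(xs), xs, xs ** 2])
    D = np.column_stack([np.zeros_like(xs), np.ones_like(xs), 2 * xs])
    A = np.vstack([V, D])                        # value + derivative conditions
    b = np.concatenate([f(xs), df(xs)])
    coef, *_ = np.linalg.lstsq(A, b, rcond=None)
    print(coef)   # quadratic surrogate fitted to 3 values and 3 slopes
    ```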

  12. 43 CFR 11.73 - Quantification phase-resource recoverability analysis.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 43 Public Lands: Interior 1 2013-10-01 2013-10-01 false Quantification phase-resource recoverability analysis. 11.73 Section 11.73 Public Lands: Interior Office of the Secretary of the Interior NATURAL RESOURCE DAMAGE ASSESSMENTS Type B Procedures § 11.73 Quantification phase—resource recoverability analysis. (a) Requirement. The time needed...

  13. 21 CFR 530.24 - Procedure for announcing analytical methods for drug residue quantification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 6 2010-04-01 2010-04-01 false Procedure for announcing analytical methods for...-Producing Animals § 530.24 Procedure for announcing analytical methods for drug residue quantification. (a) FDA may issue an order announcing a specific analytical method or methods for the quantification...

  14. Quantification of L-Citrulline and other physiologic amino acids in watermelon and selected cucurbits

    Technology Transfer Automated Retrieval System (TEKTRAN)

    High performance liquid chromatography of dabsyl derivatives of amino acids was employed for quantification of physiologic amino acids in cucurbits. This method is particularly useful because the dabsyl derivatives of glutamine and citrulline are sufficiently separated to allow quantification of ea...

  15. The quantification of hydrogen and methane in contaminated groundwater: validation of robust procedures for sampling and quantification.

    PubMed

    Dorgerloh, Ute; Becker, Roland; Theissen, Hubert; Nehls, Irene

    2010-10-01

    A number of currently recommended sampling techniques for the determination of hydrogen in contaminated groundwater were compared regarding the practical proficiency in field campaigns. Key characteristics of appropriate sampling procedures are reproducibility of results, robustness against varying field conditions such as hydrostatic pressure, aquifer flow, and biological activity. Laboratory set-ups were used to investigate the most promising techniques. Bubble stripping with gas sampling bulbs yielded reproducible recovery of hydrogen and methane which could be verified for groundwater sampled in two field campaigns. The methane content of the groundwater was confirmed by analysis of directly pumped samples thus supporting the trueness of the stripping results. Laboratory set-ups and field campaigns revealed that bubble stripping of hydrogen may be restricted to the type of used pump. Concentrations of dissolved hydrogen after bubble stripping with an electrically driven submersible pump were about one order of magnitude higher than those obtained from diffusion sampling. The gas chromatographic determination for hydrogen and methane requires manual injection of gas samples and detection by a pulsed discharge detector (PDD) and allows limits of quantification of 3 nM dissolved hydrogen and 1 µg L⁻¹ dissolved methane in groundwater. The combined standard uncertainty of the bubble stripping and GC/PDD quantification of hydrogen in field samples was 7% at 7.8 nM and 18% for 78 nM. PMID:20730246

  16. Proteomics technologies for the global identification and quantification of proteins.

    PubMed

    Brewis, Ian A; Brennan, P

    2010-01-01

    This review provides an introduction for the nonspecialist to proteomics and in particular the major approaches available for global protein identification and quantification. Proteomics technologies offer considerable opportunities for improved biological understanding and biomarker discovery. The central platform for proteomics is tandem mass spectrometry (MS) but a number of other technologies, resources, and expertise are absolutely required to perform meaningful experiments. These include protein separation science (and protein biochemistry in general), genomics, and bioinformatics. There are a range of workflows available for protein (or peptide) separation prior to tandem MS and subsequent bioinformatics analysis to achieve protein identifications. The predominant approaches are 2D electrophoresis (2DE) and subsequent MS, liquid chromatography-MS (LC-MS), and GeLC-MS. Beyond protein identification, there are a number of well-established options available for protein quantification. Difference gel electrophoresis (DIGE) following 2DE is one option but MS-based methods (most commonly iTRAQ-Isobaric Tags for Relative and Absolute Quantification or SILAC-Stable Isotope Labeling by Amino Acids) are now the preferred options. Sample preparation is critical to performing good experiments and subcellular fractionation can additionally provide protein localization information compared with whole cell lysates. Differential detergent solubilization is another valid option. With biological fluids, it is possible to remove the most abundant proteins by immunodepletion. Sample enrichment is also used extensively in certain analyses and most commonly in phosphoproteomics with the initial purification of phosphopeptides. Proteomics produces considerable datasets and resources to facilitate the necessary extended analysis of this data are improving all the time. Beyond the opportunities afforded by proteomics there are definite challenges to achieving full proteomic coverage

  17. Critical points of DNA quantification by real-time PCR – effects of DNA extraction method and sample matrix on quantification of genetically modified organisms

    PubMed Central

    Cankar, Katarina; Štebih, Dejan; Dreo, Tanja; Žel, Jana; Gruden, Kristina

    2006-01-01

    Background Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs) quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve, therefore similarity of PCR efficiency for the sample and standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Results Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess variability of PCR performance within each sample matrix. Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results, therefore it was chosen as the primary
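
    The abstract's emphasis on PCR efficiency can be made concrete. In standard-curve quantification, the quantification cycle (Cq) is regressed on the log of the copy number; the slope gives the amplification efficiency, and unknowns are read off the fitted line. The numbers below are invented for illustration.

    ```python
    import numpy as np

    # Illustrative standard curve: Cq versus known copy number.
    copies = np.array([1e2, 1e3, 1e4, 1e5, 1e6])
    cq = np.array([33.1, 29.8, 26.4, 23.1, 19.7])

    slope, intercept = np.polyfit(np.log10(copies), cq, 1)
    efficiency = 10 ** (-1.0 / slope) - 1.0   # 1.0 corresponds to perfect doubling
    print(f"slope {slope:.2f}, efficiency {efficiency:.0%}")

    # Quantify an unknown sample from its Cq using the fitted curve.
    cq_unknown = 25.0
    print(f"estimated copies: {10 ** ((cq_unknown - intercept) / slope):.0f}")
    ```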

  18. Quantification of Posterior Globe Flattening: Methodology Development and Validation

    NASA Technical Reports Server (NTRS)

    Lumpkins, S. B.; Garcia, K. M.; Sargsyan, A. E.; Hamilton, D. R.; Berggren, M. D.; Antonsen, E.; Ebert, D.

    2011-01-01

    Microgravity exposure affects visual acuity in a subset of astronauts, and mechanisms may include structural changes in the posterior globe and orbit. Particularly, posterior globe flattening has been implicated in several astronauts. This phenomenon is known to affect some terrestrial patient populations, and has been shown to be associated with intracranial hypertension. It is commonly assessed by magnetic resonance imaging (MRI), computed tomography (CT), or B-mode ultrasound (US), without consistent objective criteria. NASA uses a semi-quantitative scale of 0-3 as part of eye/orbit MRI and US analysis for occupational monitoring purposes. The goal of this study was to initiate development of an objective quantification methodology for posterior globe flattening.

  19. Improved semiquantitative Western blot technique with increased quantification range.

    PubMed

    Heidebrecht, F; Heidebrecht, A; Schulz, I; Behrens, S-E; Bader, A

    2009-06-30

    With the development of new interdisciplinary fields such as systems biology, the quantitative analysis of protein expression in biological samples is gaining importance. Although the most common method for this is ELISA, Western blotting also has advantages: the separation of proteins by size allows the evaluation of only specifically bound protein. This work examines the Western blot signal chain, determines some of the parameters relevant for quantitative analysis, and proposes a mathematical model of the reaction kinetics. Using this model, a semiquantitative Western blot method for the simultaneous quantification of different proteins using a hyperbolic calibration curve was developed. A program was written for hyperbolic regression that allows quick determination of the calibration curve coefficients. This program can also be used to approximate calibration curves in other applications such as ELISA, BCA or Bradford assays. PMID:19351538
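
    The authors' regression program is not reproduced here, but a hyperbolic calibration of the kind described can be sketched with a rectangular-hyperbola model, assuming the form y = a·x/(b + x); the densitometry data below are invented.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def hyperbola(x, a, b):
        """Rectangular hyperbola: a plausible form for a saturating blot response."""
        return a * x / (b + x)

    # Illustrative densitometry signal versus loaded protein amount.
    amount = np.array([0.5, 1, 2, 4, 8, 16, 32])
    signal = np.array([9.8, 18.5, 33.0, 54.0, 78.0, 99.0, 113.0])

    (a, b), _ = curve_fit(hyperbola, amount, signal, p0=(120.0, 8.0))
    print(f"a = {a:.1f}, b = {b:.1f}")

    # Invert the calibration to estimate an unknown sample's relative amount.
    y = 60.0
    print("estimated amount:", b * y / (a - y))
    ```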

  20. Technological and Analytical Methods for Arabinoxylan Quantification from Cereals.

    PubMed

    Döring, Clemens; Jekle, Mario; Becker, Thomas

    2016-04-25

    Arabinoxylan (AX) is the major nonstarch polysaccharide contained in various types of grain. AX consists of a backbone of β-(1,4)-D-xylopyranosyl residues with randomly linked α-L-arabinofuranosyl units. Once isolated and included as a food additive, AX affects foodstuff attributes and has positive effects on human health. AX can be classified into water-extractable and water-unextractable AX. For isolating AX from its natural matrix, a range of methods has been developed, adapted, and improved. This review presents a survey of the commonly used extraction methods for AX and the influence of the different techniques. It also provides a brief overview of the structural and technological impact of AX as a dough additive. A concluding section summarizes the different detection methods for analyzing and quantifying AX. PMID:25629383

  1. In situ quantification and visualization of lithium transport with neutrons.

    PubMed

    Liu, Danny X; Wang, Jinghui; Pan, Ke; Qiu, Jie; Canova, Marcello; Cao, Lei R; Co, Anne C

    2014-09-01

    A real-time quantification of Li transport using a nondestructive neutron method to measure the Li distribution upon charge and discharge in a Li-ion cell is reported. By using in situ neutron depth profiling (NDP), we probed the onset of lithiation in a high-capacity Sn anode and visualized the enrichment of Li atoms on the surface followed by their propagation into the bulk. The delithiation process shows the removal of Li near the surface, which leads to a decreased coulombic efficiency, likely because of trapped Li within the intermetallic material. The developed in situ NDP provides exceptional sensitivity in the temporal and spatial measurement of Li transport within the battery material. This diagnostic tool opens up possibilities to understand rates of Li transport and their distribution to guide materials development for efficient storage mechanisms. Our observations provide important mechanistic insights for the design of advanced battery materials. PMID:25044527

  2. Multichannel quantification of biomedical magnetic resonance spectroscopic signals

    NASA Astrophysics Data System (ADS)

    Vanhamme, Leen; Van Huffel, Sabine

    1998-10-01

    Quantification of individual magnetic resonance spectroscopy (MRS) signals, modeled as a sum of exponentially damped sinusoids, is possible using interactive nonlinear least-squares fitting methods, which provide maximum likelihood parameter estimates, or using fully automatic but statistically suboptimal black-box methods. In kinetic experiments, consecutive time series of MRS spectra are measured in which some of the parameters are known to remain constant over time. The purpose of this paper is to show how the previously mentioned methods can be extended to the simultaneous processing of all spectra in the time series using this additional information shared between the spectra. We show that this approach yields statistically better results than processing the different signals separately.
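
    A minimal, real-valued sketch of the model class named above: a single exponentially damped sinusoid fitted by nonlinear least squares. Actual MRS quantification fits sums of complex damped exponentials, often with parameters shared across the time series; the data here are simulated.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def damped_sinusoid(t, a, d, f, phi):
        """One exponentially damped sinusoid, the building block of MRS models."""
        return a * np.exp(-d * t) * np.cos(2 * np.pi * f * t + phi)

    t = np.linspace(0.0, 1.0, 512)
    rng = np.random.default_rng(1)
    y = damped_sinusoid(t, 1.0, 3.0, 12.0, 0.4) + 0.02 * rng.standard_normal(t.size)

    p, _ = curve_fit(damped_sinusoid, t, y, p0=(0.9, 2.5, 11.9, 0.3))
    print(p)   # recovered (amplitude, damping, frequency, phase)
    ```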

  3. UQTools: The Uncertainty Quantification Toolbox - Introduction and Tutorial

    NASA Technical Reports Server (NTRS)

    Kenny, Sean P.; Crespo, Luis G.; Giesy, Daniel P.

    2012-01-01

    UQTools is the short name for the Uncertainty Quantification Toolbox, a software package designed to efficiently quantify the impact of parametric uncertainty on engineering systems. UQTools is a MATLAB-based software package and was designed to be discipline independent, employing very generic representations of the system models and uncertainty. Specifically, UQTools accepts linear and nonlinear system models and permits arbitrary functional dependencies between the system's measures of interest and the probabilistic or non-probabilistic parametric uncertainty. One of the most significant features incorporated into UQTools is the theoretical development centered on homothetic deformations and their application to set bounding and approximating failure probabilities. Beyond the set bounding technique, UQTools provides a wide range of probabilistic and uncertainty-based tools to solve key problems in science and engineering.

  4. Experimental investigations for uncertainty quantification in brake squeal analysis

    NASA Astrophysics Data System (ADS)

    Renault, A.; Massa, F.; Lallemand, B.; Tison, T.

    2016-04-01

    The aim of this paper is to improve the correlation between the experimental and the numerical prediction of unstable frequencies for automotive brake systems considering uncertainty. First, an experimental quantification of uncertainty and a discussion analysing the contributions of uncertainty to a numerical squeal simulation are proposed. Frequency and transient simulations are performed considering nominal values of model parameters, determined experimentally. The obtained results are compared with those derived from experimental tests to highlight the limitations of deterministic simulations. The effects of the different kinds of uncertainty detected in the working conditions of the brake system (the pad boundary condition, the brake system material properties and the pad surface topography) are discussed by defining different unstable-mode classes. Finally, a correlation between experimental and numerical results considering uncertainty is successfully proposed for an industrial brake system. Results from the different comparisons also reveal a major influence of the pad topography and, consequently, of the contact distribution.

  5. Epidermal Nerve Fiber Quantification in the Assessment of Diabetic Neuropathy

    PubMed Central

    Beiswenger, Kristina K.; Calcutt, Nigel A.; Mizisin, Andrew P.

    2008-01-01

    Summary Assessment of cutaneous innervation in skin biopsies is emerging as a valuable means of both diagnosing and staging diabetic neuropathy. Immunolabeling, using antibodies to neuronal proteins such as protein gene product 9.5, allows for the visualization and quantification of intraepidermal nerve fibers. Multiple studies have shown reductions in intraepidermal nerve fiber density in skin biopsies from patients with both type 1 and type 2 diabetes. More recent studies have focused on correlating these changes with other measures of diabetic neuropathy. A loss of epidermal innervation similar to that observed in diabetic patients has been observed in rodent models of both type 1 and type 2 diabetes and several therapeutics have been reported to prevent reductions in intraepidermal nerve fiber density in these models. This review discusses the current literature describing diabetes-induced changes in cutaneous innervation in both human and animal models of diabetic neuropathy. PMID:18384843

  6. Visual quantification of embolism reveals leaf vulnerability to hydraulic failure.

    PubMed

    Brodribb, Timothy J; Skelton, Robert P; McAdam, Scott A M; Bienaimé, Diane; Lucani, Christopher J; Marmottant, Philippe

    2016-03-01

    Vascular plant mortality during drought has been strongly linked to a failure of the internal water transport system caused by the rapid invasion of air and subsequent blockage of xylem conduits. Quantification of this critical process is greatly complicated by the existence of high water tension in xylem cells making them prone to embolism during experimental manipulation. Here we describe a simple new optical method that can be used to record spatial and temporal patterns of embolism formation in the veins of water-stressed leaves for the first time. Applying this technique in four diverse angiosperm species we found very strong agreement between the dynamics of embolism formation during desiccation and decline of leaf hydraulic conductance. These data connect the failure of the leaf water transport network under drought stress to embolism formation in the leaf xylem, and suggest embolism occurs after stomatal closure under extreme water stress. PMID:26742653

  7. Image reconstruction with uncertainty quantification in photoacoustic tomography.

    PubMed

    Tick, Jenni; Pulkkinen, Aki; Tarvainen, Tanja

    2016-04-01

    Photoacoustic tomography is a hybrid imaging method that combines optical contrast and ultrasound resolution. The goal of photoacoustic tomography is to resolve an initial pressure distribution from detected ultrasound waves generated within an object due to an illumination of a short light pulse. In this work, a Bayesian approach to photoacoustic tomography is described. The solution of the inverse problem is derived and computation of the point estimates for image reconstruction and uncertainty quantification is described. The approach is investigated with simulations in different detector geometries, including limited view setup, and with different detector properties such as ideal point-like detectors, finite size detectors, and detectors with a finite bandwidth. The results show that the Bayesian approach can be used to provide accurate estimates of the initial pressure distribution, as well as information about the uncertainty of the estimates. PMID:27106341
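
    In the special case of a linear forward operator with additive Gaussian noise and a Gaussian prior, the Bayesian solution sketched above is available in closed form: the posterior is Gaussian, its mean is the reconstruction, and its covariance quantifies the uncertainty. A toy sketch with an arbitrary random operator standing in for the acoustic forward model:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n_obs, n_px = 40, 20
    A = rng.standard_normal((n_obs, n_px))  # stand-in for the acoustic forward model
    x_true = rng.standard_normal(n_px)      # "initial pressure" image (flattened)
    noise_std, prior_std = 0.1, 1.0
    y = A @ x_true + noise_std * rng.standard_normal(n_obs)

    # Gaussian posterior: covariance and mean in closed form.
    P = np.linalg.inv(A.T @ A / noise_std ** 2 + np.eye(n_px) / prior_std ** 2)
    x_mean = P @ A.T @ y / noise_std ** 2
    half_width = 1.96 * np.sqrt(np.diag(P))  # ~95% marginal credible half-widths
    print(np.mean(np.abs(x_mean - x_true) < half_width))  # empirical coverage
    ```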

  8. A surrogate accelerated multicanonical Monte Carlo method for uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Wu, Keyi; Li, Jinglai

    2016-09-01

    In this work we consider a class of uncertainty quantification problems where the system performance or reliability is characterized by a scalar parameter y. The performance parameter y is random due to the presence of various sources of uncertainty in the system, and our goal is to estimate the probability density function (PDF) of y. We propose to use the multicanonical Monte Carlo (MMC) method, a special type of adaptive importance sampling algorithm, to compute the PDF of interest. Moreover, we develop an adaptive algorithm to construct local Gaussian process surrogates to further accelerate the MMC iterations. With numerical examples we demonstrate that the proposed method can achieve several orders of magnitude of speedup over standard Monte Carlo methods.

  9. Quantification of Diffuse Hydrothermal Flows Using Multibeam Sonar

    NASA Astrophysics Data System (ADS)

    Ivakin, A. N.; Jackson, D. R.; Bemis, K. G.; Xu, G.

    2014-12-01

    The Cabled Observatory Vent Imaging Sonar (COVIS) deployed at the Main Endeavour node of the NEPTUNE Canada observatory has provided acoustic time series extending over 2 years. This includes 3D images of plume scattering strength and Doppler velocity measurements, as well as 2D images showing regions of diffuse flow. The diffuse-flow images display the level of decorrelation between sonar echoes from transmissions separated by 0.2 s. The present work aims to provide further information on the strength of diffuse flows. Two approaches are used: measurement of the dependence of decorrelation on lag, and measurement of the phase shift of sonar echoes, with lags in 3-hour increments up to several days. The phase shifts and decorrelation are linked to variations of temperature above the seabed, which allows quantification of those variations, their magnitudes, spatial and temporal scales, and energy spectra. These techniques are illustrated using COVIS data obtained near the Grotto vent complex.

  10. Portasystemic shunt fraction quantification with colonic iodine-123 iodoamphetamine

    SciTech Connect

    Yen, C.K.; Pollycove, M.; Crass, R.; Lin, T.H.; Baldwin, R.; Lamb, J.

    1986-08-01

    Portasystemic shunting was quantified in dogs with [¹²³I]iodoamphetamine (IMP) administered transrectally into the colon and monitored externally with a gamma camera. IMP was absorbed rapidly and unchanged from the colon. After direct injection into the portal vein, IMP was almost completely extracted by the liver on the first pass, and the washout half-life was approximately 60 min. Based on these kinetic data, computer simulation of this biologic system was carried out. Errors associated with simplified models are calculated. The simplest model with insignificant error, which assumed that the tracer behaved like microspheres, was used to quantitate the portasystemic shunt fraction in animals with surgically created shunts. Results were compared with the standard of ⁹⁹ᵐTc-labeled macroaggregated albumin infused into a branch of the inferior mesenteric vein. For shunt fractions ranging from 0 to 100%, an excellent correlation was seen, indicating that this approach is potentially a simple, noninvasive method of portasystemic shunt fraction quantification.

  11. Uncertainty quantification of an inflatable/rigidizable torus

    NASA Astrophysics Data System (ADS)

    Lew, Jiann-Shiun; Horta, Lucas G.; Reaves, Mercedes C.

    2006-06-01

    There is increasing interest in lightweight inflatable structures for space missions. The dynamic testing and model updating of these types of structures present many challenges in terms of model uncertainty and structural nonlinearity. This paper presents an experimental study of uncertainty quantification of a 3 m diameter inflatable torus. Model uncertainty can be thought of as coming from two different sources: uncertainty due to changes in controlled conditions, such as temperature and input force level, and uncertainty associated with other random factors, such as measurement noise. To precisely investigate and quantify model uncertainty from the different sources, experiments using sine-sweep excitation in specified narrow frequency bands were conducted to collect frequency response functions (FRFs) under various test conditions. To model the variation of the identified parameters, a singular value decomposition technique is applied to extract the principal components of the parameter change.
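
    The final step above can be sketched directly: identified parameters from repeated tests are stacked into a matrix, the nominal (mean) set is removed, and an SVD yields the principal directions of parameter change. The data below are random placeholders.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    # Rows: repeated tests; columns: identified modal parameters (placeholder
    # data with a deliberately dominant first direction of variation).
    tests = rng.standard_normal((30, 6)) * np.array([1.0, 0.5, 0.2, 0.1, 0.05, 0.02])

    X = tests - tests.mean(axis=0)             # remove the nominal parameter set
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    explained = s ** 2 / np.sum(s ** 2)
    print("variance explained per component:", np.round(explained, 3))
    print("dominant direction of parameter change:", np.round(Vt[0], 2))
    ```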

  12. Quantification of intracerebral steal in patients with arteriovenous malformation

    SciTech Connect

    Homan, R.W.; Devous, M.D. Sr.; Stokely, E.M.; Bonte, F.J.

    1986-08-01

    Eleven patients with angiographically and/or pathologically proved arteriovenous malformations (AVMs) were studied using dynamic, single-photon-emission computed tomography (DSPECT). Quantification of regional cerebral blood flow in structurally normal areas remote from the AVM disclosed areas of decreased flow compared with normal controls in eight of 11 patients examined. Areas of hypoperfusion correlated with altered function as manifested by epileptogenic foci and impaired cognitive function. Dynamic, single-photon-emission computed tomography provides a noninvasive technique to monitor quantitatively hemodynamic changes associated with AVMs. Our findings suggest that such changes are present in the majority of patients with AVMs and that they may be clinically significant. The potential application of regional cerebral blood flow imaging by DSPECT in the management of patients with AVMs is discussed.

  13. Quantification of Posterior Globe Flattening: Methodology Development and Validation

    NASA Technical Reports Server (NTRS)

    Lumpkins, Sarah B.; Garcia, Kathleen M.; Sargsyan, Ashot E.; Hamilton, Douglas R.; Berggren, Michael D.; Ebert, Douglas

    2012-01-01

    Microgravity exposure affects visual acuity in a subset of astronauts and mechanisms may include structural changes in the posterior globe and orbit. Particularly, posterior globe flattening has been implicated in the eyes of several astronauts. This phenomenon is known to affect some terrestrial patient populations and has been shown to be associated with intracranial hypertension. It is commonly assessed by magnetic resonance imaging (MRI), computed tomography (CT) or B-mode ultrasound (US), without consistent objective criteria. NASA uses a semiquantitative scale of 0-3 as part of eye/orbit MRI and US analysis for occupational monitoring purposes. The goal of this study was to initiate development of an objective quantification methodology to monitor small changes in posterior globe flattening.

  14. Segmentation and quantification of adipose tissue by magnetic resonance imaging.

    PubMed

    Hu, Houchun Harry; Chen, Jun; Shen, Wei

    2016-04-01

    In this brief review, introductory concepts in animal and human adipose tissue segmentation using proton magnetic resonance imaging (MRI) and computed tomography are summarized in the context of obesity research. Adipose tissue segmentation and quantification using spin relaxation-based (e.g., T1-weighted, T2-weighted), relaxometry-based (e.g., T1-, T2-, T2*-mapping), chemical-shift selective, and chemical-shift encoded water-fat MRI pulse sequences are briefly discussed. The continuing interest to classify subcutaneous and visceral adipose tissue depots into smaller sub-depot compartments is mentioned. The use of a single slice, a stack of slices across a limited anatomical region, or a whole body protocol is considered. Common image post-processing steps and emerging atlas-based automated segmentation techniques are noted. Finally, the article identifies some directions of future research, including a discussion on the growing topic of brown adipose tissue and related segmentation considerations. PMID:26336839

  15. NeuCode Labels for Relative Protein Quantification *

    PubMed Central

    Merrill, Anna E.; Hebert, Alexander S.; MacGilvray, Matthew E.; Rose, Christopher M.; Bailey, Derek J.; Bradley, Joel C.; Wood, William W.; El Masri, Marwan; Westphall, Michael S.; Gasch, Audrey P.; Coon, Joshua J.

    2014-01-01

    We describe a synthesis strategy for the preparation of lysine isotopologues that differ in mass by as little as 6 mDa. We demonstrate that incorporation of these molecules into the proteomes of actively growing cells does not affect cellular proliferation, and we discuss how to use the embedded mass signatures (neutron encoding (NeuCode)) for multiplexed proteome quantification by means of high-resolution mass spectrometry. NeuCode SILAC amalgamates the quantitative accuracy of SILAC with the multiplexing of isobaric tags and, in doing so, offers up new opportunities for biological investigation. We applied NeuCode SILAC to examine the relationship between transcript and protein levels in yeast cells responding to environmental stress. Finally, we monitored the time-resolved responses of five signaling mutants in a single 18-plex experiment. PMID:24938287

  16. Quantification of tidal parameters from Solar System data

    NASA Astrophysics Data System (ADS)

    Lainey, Valéry

    2016-05-01

    Tidal dissipation is the main driver of the orbital evolution of natural satellites and a key to understanding exoplanetary system configurations. Despite its importance, its quantification from observations still remains difficult for most objects of our own Solar System. In this work, we review the method that has been used to determine the tidal parameters directly from observations, with emphasis on the Love number k_2 and the tidal quality factor Q. Up-to-date values of these tidal parameters are summarized. Finally, the possible determination of the tidal ratio k_2/Q of Uranus and Neptune is assessed. This may be particularly relevant for coming astrometric campaigns and future space missions focused on these systems.

  17. A novel definition for quantification of mode shape complexity

    NASA Astrophysics Data System (ADS)

    Koruk, Hasan; Sanliturk, Kenan Y.

    2013-07-01

    Complex mode shapes are quite often encountered in structural dynamics. However, there is no universally accepted parameter for the quantification of mode shape complexity. After reviewing the existing methods, a novel approach is proposed in this paper in order to quantify mode shape complexity for general structures. The new parameter proposed in this paper is based on conservation of energy principle when a structure is vibrating at a specific mode during a period of vibration. The levels of complexity of the individual mode shapes of a sample structure are then quantified using the proposed new parameter and the other parameters available in the literature. The corresponding results are compared, the validity and the generality of the new parameter are demonstrated for various damping scenarios.
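
    The paper's proposed parameter is not reproduced in the abstract; for orientation, one widely used existing indicator is the modal phase collinearity (MPC), which is near 1 for an essentially real (normal) mode and drops as the mode shape becomes complex. A sketch:

    ```python
    import numpy as np

    def mpc(phi):
        """Modal phase collinearity of a complex mode shape vector."""
        xr, xi = phi.real, phi.imag
        S = np.array([[xr @ xr, xr @ xi],
                      [xr @ xi, xi @ xi]])
        lam = np.linalg.eigvalsh(S)            # ascending eigenvalues
        return ((lam[1] - lam[0]) / (lam[1] + lam[0])) ** 2

    real_mode = np.array([1.0, 0.8, -0.5, -1.0], dtype=complex)
    complex_mode = real_mode * np.exp(1j * np.linspace(0.0, 1.2, 4))
    print(mpc(real_mode), mpc(complex_mode))   # 1.0 versus a noticeably lower value
    ```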

  18. Quantification of hydroxyacetone and glycolaldehyde using chemical ionization mass spectrometry

    NASA Astrophysics Data System (ADS)

    Spencer, K. M.; Beaver, M. R.; St. Clair, J. M.; Crounse, J. D.; Paulot, F.; Wennberg, P. O.

    2011-08-01

    Chemical ionization mass spectrometry (CIMS) enables online, fast, in situ detection and quantification of hydroxyacetone and glycolaldehyde. Two different CIMS approaches are demonstrated employing the strengths of single quadrupole mass spectrometry and triple quadrupole (tandem) mass spectrometry. Both methods are capable of the measurement of hydroxyacetone, an analyte with minimal isobaric interferences. Tandem mass spectrometry provides direct separation of the isobaric compounds glycolaldehyde and acetic acid using distinct, collision-induced dissociation daughter ions. Measurement of hydroxyacetone and glycolaldehyde by these methods was demonstrated during the ARCTAS-CARB 2008 campaign and the BEARPEX 2009 campaign. Enhancement ratios of these compounds in ambient biomass burning plumes are reported for the ARCTAS-CARB campaign. BEARPEX observations are compared to simple photochemical box model predictions of biogenic volatile organic compound oxidation at the site.

  19. From Quantification to Visualization: A Taxonomy of Uncertainty Visualization Approaches

    PubMed Central

    Potter, Kristin; Rosen, Paul; Johnson, Chris R.

    2014-01-01

    Quantifying uncertainty is an increasingly important topic across many domains. The uncertainties present in data come with many diverse representations having originated from a wide variety of disciplines. Communicating these uncertainties is a task often left to visualization without clear connection between the quantification and visualization. In this paper, we first identify frequently occurring types of uncertainty. Second, we connect those uncertainty representations to ones commonly used in visualization. We then look at various approaches to visualizing this uncertainty by partitioning the work based on the dimensionality of the data and the dimensionality of the uncertainty. We also discuss noteworthy exceptions to our taxonomy along with future research directions for the uncertainty visualization community. PMID:25663949

  20. Methods for the efficient quantification of fruit provitamin A contents.

    PubMed

    Davey, Mark W; Keulemans, Johan; Swennen, Rony

    2006-12-15

    As part of a screening program to identify micronutrient-rich banana and plantain (Musa) varieties, a simple, robust, and comparatively rapid protocol has been developed for the quantification of the provitamin A carotenoid contents of fruit pulp and peel tissues by HPLC and by spectrophotometry. Major points to note include the use of lyophilisation and extensive tissue disruption procedures to ensure quantitative recoveries, and the avoidance of saponification and/or concentration steps, which lead to significant losses of provitamin A carotenoids. The protocol showed excellent reproducibility between replicate extractions, without the need for an internal standard. Application of the methodology demonstrated that Musa fruit pulp has a relatively simple provitamin A carotenoid content, quite different from that of the overlying peel, and that the proportions of alpha- and beta-carotene are characteristic of each genotype. The protocol was also used to profile the provitamin A carotenoids of several other fruits. PMID:17049540

  1. Quantification of osteolytic bone lesions in a preclinical rat trial

    NASA Astrophysics Data System (ADS)

    Fränzle, Andrea; Bretschi, Maren; Bäuerle, Tobias; Giske, Kristina; Hillengass, Jens; Bendl, Rolf

    2013-10-01

    In breast cancer, most patients who die have developed bone metastases in the course of disease progression. Bone metastases in breast cancer are mainly bone-destructive (osteolytic). To understand pathogenesis and to analyse response to different treatments, animal models, in our case rats, are examined. For assessment of treatment response to bone remodelling therapies, exact segmentation of osteolytic lesions is needed. Manual segmentations are not only time-consuming but also lack reproducibility; computerized segmentation tools are essential. In this paper we present an approach for the computerized quantification of osteolytic lesion volumes using a comparison to a healthy reference model. The presented qualitative and quantitative evaluation of the reconstructed bone volumes shows that the automatically segmented lesion volumes complete the missing bone in a reasonable way.

  2. [Demographic and epidemiological quantification in Balearic hygienism, 1850-1930].

    PubMed

    Pujadas-Mora, Joana-Maria

    2012-01-01

    At the end of the 19th century, social medicine promoted the use of quantification as a means to evaluate the health status of populations. In Majorca, hygienists such as the physicians Enric Fajarnés, Bernat Riera, Antoni Mayol and Emili Darder and the civil engineer Eusebi Estada sought a better understanding of health status by considering the population growth, the demographic and epidemiological profile and the influence of weather on mortality. These calculations showed that the Balearic population had a good health status in comparison to the population of mainland Spain, although less so in the international context. These results were explained by the benevolence of the insular climate, a factor that would also guarantee the success of the public health reforms proposed. PMID:22849220

  3. Thermostability of Biological Systems: Fundamentals, Challenges, and Quantification

    PubMed Central

    He, Xiaoming

    2011-01-01

    This review examines the fundamentals and challenges in engineering/understanding the thermostability of biological systems over a wide temperature range (from the cryogenic to hyperthermic regimen). Applications of the bio-thermostability engineering to either destroy unwanted or stabilize useful biologicals for the treatment of diseases in modern medicine are first introduced. Studies on the biological responses to cryogenic and hyperthermic temperatures for the various applications are reviewed to understand the mechanism of thermal (both cryo and hyperthermic) injury and its quantification at the molecular, cellular and tissue/organ levels. Methods for quantifying the thermophysical processes of the various applications are then summarized accounting for the effect of blood perfusion, metabolism, water transport across cell plasma membrane, and phase transition (both equilibrium and non-equilibrium such as ice formation and glass transition) of water. The review concludes with a summary of the status quo and future perspectives in engineering the thermostability of biological systems. PMID:21769301

  4. Aspect-Oriented Programming is Quantification and Obliviousness

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.; Friedman, Daniel P.; Norvig, Peter (Technical Monitor)

    2000-01-01

    This paper proposes that the distinguishing characteristic of Aspect-Oriented Programming (AOP) systems is that they allow programming by making quantified programmatic assertions over programs written by programmers oblivious to such assertions. Thus, AOP systems can be analyzed with respect to three critical dimensions: the kinds of quantifications allowed, the nature of the actions that can be asserted, and the mechanism for combining base-level actions with asserted actions. Consequences of this perspective are the recognition that certain systems are not AOP and that some mechanisms are expressive enough to allow programming an AOP system within them. A corollary is that while AOP can be applied to Object-Oriented Programming, it is an independent concept applicable to other programming styles.
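
    The paper's two hallmarks, quantification and obliviousness, can be made concrete in a few lines. In the hypothetical Python sketch below (names such as apply_aspect are illustrative, not from the paper), the loop is the quantifier, asserting advice over every method matched by a predicate, while the base class is oblivious: its author wrote no hook for the aspect.

      import functools

      def apply_aspect(cls, predicate, advice):
          """Quantified assertion: wrap every method of cls whose name
          satisfies predicate so that advice runs before it. The loop is
          the quantifier; the class author never mentions the aspect."""
          for name, attr in list(vars(cls).items()):
              if callable(attr) and predicate(name):
                  def make_wrapper(orig, method_name):
                      @functools.wraps(orig)
                      def wrapper(self, *args, **kwargs):
                          advice(method_name, args, kwargs)   # asserted action
                          return orig(self, *args, **kwargs)  # base-level action
                      return wrapper
                  setattr(cls, name, make_wrapper(attr, name))
          return cls

      class Account:  # written obliviously to any aspect
          def deposit(self, amount): return amount
          def withdraw(self, amount): return -amount

      apply_aspect(Account, lambda n: not n.startswith("_"),
                   lambda n, a, k: print(f"logging: {n}{a}"))
      Account().deposit(100)  # prints: logging: deposit(100,)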

  5. Enhanced techniques for asymmetry quantification in brain imagery

    NASA Astrophysics Data System (ADS)

    Liu, Xin; Imielinska, Celina; Rosiene, Joel; Connolly, E. S.; D'Ambrosio, Anthony L.

    2006-03-01

    We present an automated generic methodology for symmetry identification and asymmetry quantification: a novel method for identifying and delineating brain pathology by analyzing the opposing sides of the brain, exploiting the inherent left-right symmetry of the brain. After the symmetry axis has been detected, we apply non-parametric statistical tests operating on pairs of samples to identify initial seed points, defined as the pixels where the most statistically significant differences appear. Local region growing is then performed on the difference map, with the seeds aggregating until all 8-way connected high signals of the difference map are captured. We illustrate the capability of our method with examples ranging from tumors in patient MR data to animal stroke data. The validation results on rat stroke data show that this approach has promise to achieve high precision and full automation in segmenting lesions in reflectionally symmetric objects.
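
    A minimal sketch of the mirrored-difference idea, with two simplifying assumptions flagged up front: the symmetry axis is taken to be the vertical image midline (the paper detects it automatically), and a plain z-score stands in for the paper's non-parametric test.

      import numpy as np
      from scipy import ndimage

      def asymmetry_map(img):
          """Pair each pixel with its mirrored counterpart across the vertical
          midline and score the left-right differences."""
          diff = img - img[:, ::-1]
          z = (diff - diff.mean()) / diff.std()
          return diff, z

      def grow_lesion(z, seed_z=4.0, member_z=2.0):
          """Seed at the most significant mirrored differences, then keep every
          8-connected component of high difference that contains a seed
          (a simplified stand-in for the paper's region growing)."""
          high = np.abs(z) > member_z
          labels, _ = ndimage.label(high, structure=np.ones((3, 3)))  # 8-way
          seeded = np.unique(labels[np.abs(z) > seed_z])
          return np.isin(labels, seeded[seeded > 0])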

  6. Quantification of airway deposition of intact and fragmented pollens.

    PubMed

    Horváth, Alpár; Balásházy, Imre; Farkas, Arpád; Sárkány, Zoltán; Hofmann, Werner; Czitrovszky, Aladár; Dobos, Erik

    2011-12-01

    Although pollen is one of the most widespread agents that can cause allergy, its airway transport and deposition are far from being fully explored. The objective of this study was to characterize the airway deposition of pollens and to contribute to the debate related to the increasing number of asthma attacks registered after thunderstorms. For the quantification of the deposition of inhaled pollens in the airways, computer simulations were performed. Our results demonstrated that smaller and fragmented pollens may penetrate into the thoracic airways and deposit there, supporting the theory that fragmented pollen particles are responsible for the increasing incidence of asthma attacks following thunderstorms. Pollen deposition results also suggest that children are the most exposed to the allergic effects of pollens. Finally, pollens between 0.5 and 20 μm deposit more efficiently in the lungs of asthmatics than in healthy lungs, especially in the bronchial region. PMID:21563012

  7. Quantification of HER family receptors in breast cancer.

    PubMed

    Nuciforo, Paolo; Radosevic-Robin, Nina; Ng, Tony; Scaltriti, Maurizio

    2015-01-01

    The clinical success of trastuzumab in breast cancer taught us that appropriate tumor evaluation is mandatory for the correct identification of patients eligible for targeted therapies. Although HER2 protein expression by immunohistochemistry (IHC) and gene amplification by fluorescence in situ hybridization (FISH) assays are routinely used to select patients to receive trastuzumab, both assays only partially predict response to the drug. In the case of epidermal growth factor receptor (EGFR), the link between the presence of the receptor or its amplification and response to anti-EGFR therapies could not be demonstrated. Even less is known for HER3 and HER4, mainly due to lack of robust and validated assays detecting these proteins. It is becoming evident that, besides FISH and IHC, we need better assays to quantify HER receptors and categorize the patients for individualized treatments. Here, we present the current available methodologies to measure HER family receptors and discuss the clinical implications of target quantification. PMID:25887735

  8. Dielectrophoretic immobilization of proteins: Quantification by atomic force microscopy.

    PubMed

    Laux, Eva-Maria; Knigge, Xenia; Bier, Frank F; Wenger, Christian; Hölzel, Ralph

    2015-09-01

    The combination of alternating electric fields with nanometer-sized electrodes allows the permanent immobilization of proteins by dielectrophoretic force. Here, atomic force microscopy is introduced as a quantification method, and results are compared with fluorescence microscopy. Experimental parameters, for example the applied voltage and the duration of field application, are varied systematically, and their influence on the amount of immobilized protein is investigated. A linear correlation with the duration of field application was found by atomic force microscopy, and both microscopy methods yield a quadratic dependence of the amount of immobilized protein on the applied voltage. While fluorescence microscopy allows real-time imaging, atomic force microscopy reveals immobilized proteins that are obscured in fluorescence images due to a low signal-to-noise ratio. Furthermore, the higher spatial resolution of the atomic force microscope enables the visualization of the protein distribution on single nanoelectrodes. The electric field distribution is calculated and agrees very well with the atomic force microscopy measurements. PMID:26010162
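
    The reported quadratic voltage dependence is easy to test with an ordinary least-squares fit; the voltage/signal pairs below are invented for illustration.

      import numpy as np

      voltage = np.array([1.0, 2.0, 3.0, 4.0, 5.0])    # V (hypothetical)
      signal = np.array([0.9, 4.2, 9.1, 15.8, 25.3])   # immobilized protein, a.u.

      # Fit signal = a * V^2 (no offset) by least squares and inspect the fit.
      a = np.linalg.lstsq(voltage[:, None] ** 2, signal, rcond=None)[0][0]
      print(f"signal ~ {a:.2f} * V^2")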

  9. Graphene wrinkling induced by monodisperse nanoparticles: facile control and quantification

    PubMed Central

    Vejpravova, Jana; Pacakova, Barbara; Endres, Jan; Mantlikova, Alice; Verhagen, Tim; Vales, Vaclav; Frank, Otakar; Kalbac, Martin

    2015-01-01

    Controlled wrinkling of single-layer graphene (1-LG) at the nanometer scale was achieved by introducing monodisperse nanoparticles (NPs), with size comparable to the strain coherence length, underneath the 1-LG. A typical fingerprint of the delaminated fraction is identified as a substantial contribution to the principal Raman modes of the 1-LG (G and G'). Correlation analysis of the Raman shifts of the G and G' modes clearly resolved the 1-LG in contact with and delaminated from the substrate, respectively. The intensity of the Raman features of the delaminated 1-LG increases linearly with the amount of wrinkles, as determined by advanced processing of atomic force microscopy data. Our study thus offers a universal approach for both fine tuning and facile quantification of graphene topography up to ~60% wrinkling. PMID:26530787

  10. Quantification of asymmetric microtubule nucleation at sub-cellular structures

    PubMed Central

    Zhu, Xiaodong; Kaverina, Irina

    2012-01-01

    Cell polarization is important for multiple physiological processes. In polarized cells, microtubules (MTs) are organized into a spatially polarized array. Generally, in non-differentiated cells, it is assumed that MTs are symmetrically nucleated exclusively from the centrosome (microtubule organizing center, MTOC) and then reorganized into the asymmetric array. We have recently identified the Golgi complex as an additional MTOC that asymmetrically nucleates MTs toward one side of the cell. Methods used for alternative MTOC identification include microtubule re-growth after complete drug-induced depolymerization and tracking of growing microtubules using fluorescently labeled MT +TIP-binding proteins in living cells. These approaches can be used for the quantification of MT nucleation sites at diverse sub-cellular structures. PMID:21773933

  11. Expert judgement and uncertainty quantification for climate change

    NASA Astrophysics Data System (ADS)

    Oppenheimer, Michael; Little, Christopher M.; Cooke, Roger M.

    2016-05-01

    Expert judgement is an unavoidable element of the process-based numerical models used for climate change projections, and the statistical approaches used to characterize uncertainty across model ensembles. Here, we highlight the need for formalized approaches to unifying numerical modelling with expert judgement in order to facilitate characterization of uncertainty in a reproducible, consistent and transparent fashion. As an example, we use probabilistic inversion, a well-established technique used in many other applications outside of climate change, to fuse two recent analyses of twenty-first century Antarctic ice loss. Probabilistic inversion is but one of many possible approaches to formalizing the role of expert judgement, and the Antarctic ice sheet is only one possible climate-related application. We recommend indicators or signposts that characterize successful science-based uncertainty quantification.

  12. Raman spectroscopy for DNA quantification in cell nucleus.

    PubMed

    Okotrub, K A; Surovtsev, N V; Semeshin, V F; Omelyanchuk, L V

    2015-01-01

    Here we demonstrate the feasibility of a novel approach to quantify DNA in cell nuclei. This approach is based on spectroscopic analysis of Raman light scattering, and avoids the problem of non-stoichiometric binding of dyes to DNA, as it directly measures the signal from DNA. Quantitative analysis of the nuclear DNA contribution to the Raman spectrum can be reliably performed using the intensity of the phosphate mode at 1096 cm⁻¹. When compared to the known DNA standards from cells of different animals, our results matched those values with an error of 10%. We therefore suggest that this approach will be useful to expand the list of DNA standards, to properly adjust the duration of hydrolysis in Feulgen staining, to assay the applicability of fuchsines for DNA quantification, and to measure DNA content in cells with complex hydrolysis patterns, where Feulgen densitometry is inappropriate. PMID:25355529
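
    A sketch of how a band intensity such as the 1096 cm⁻¹ phosphate mode might be extracted from a spectrum; the band limits and the linear baseline below are assumptions, not the authors' processing.

      import numpy as np

      def phosphate_band_intensity(wavenumber, spectrum, center=1096.0, half_width=8.0):
          """Integrated intensity of the DNA phosphate mode near 1096 cm^-1 after
          subtracting a linear baseline drawn between the band edges (both the
          band limits and the baseline choice are illustrative assumptions)."""
          band = (wavenumber >= center - half_width) & (wavenumber <= center + half_width)
          x, y = wavenumber[band], spectrum[band]
          baseline = np.interp(x, [x[0], x[-1]], [y[0], y[-1]])
          net = y - baseline
          return float(np.dot(np.diff(x), (net[:-1] + net[1:]) / 2))  # trapezoid rule

      # DNA content then follows from calibration against nuclei of known DNA
      # content: dna_pg = k * phosphate_band_intensity(...), with k from standards.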

  13. Carrageenan analysis. Part 3: Quantification in swine plasma.

    PubMed

    Blakemore, William R; Brant, Ashley F; Bissland, Jonathan G; Bissland, Natalie D

    2014-01-01

    Development and validation of this method was conducted to support a 28-day piglet feeding study of swine-adapted infant formulations stabilised with carrageenan. The validation was performed in accordance with USFDA Good Laboratory Practice (GLP) Regulations and associated current bioanalytical guidelines. Separation of carrageenan from plasma protein was unsuccessful using saturated sodium chloride, due to the extremely strong cross-linking interactions between carrageenan and protein. Poligeenan is the deliberately acid-hydrolysed, low-molecular-weight polygalactan non-food product produced from carrageenan. Poligeenan molecules are nearly identical to carrageenan molecules with respect to molecular structure, the primary difference being molecular weight; their molecular weight is similar to that of the lowest-molecular-weight fraction of carrageenan, called the low molecular-weight tail (LMT). Poligeenan was separated from plasma protein using the salting procedure, owing to its significantly weaker interaction with protein caused by its shorter molecular chain length. Thus, poligeenan was applied as a chemical analyte surrogate for the LMT of carrageenan solely for the development and validation of the method. This method was used to detect the LMT of the carrageenan test material during the 28-day piglet feeding study and to determine whether it was absorbed into the bloodstream. Successful development and validation of the method was achieved using LC-MS/MS coupled with ESI in negative-ion mode. A standard curve of instrument response versus poligeenan concentration was developed using swine plasma spiked with a range of poligeenan concentrations. The lower limit of quantification (LLOQ) for poligeenan was 10.0 µg ml⁻¹, and the quantification range was 10.0-100.0 µg ml⁻¹. No animals were fed poligeenan. PMID:25164307
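
    Quantification against such a spiked-plasma standard curve reduces to a linear calibration clipped to the validated range; the peak-area responses below are invented for illustration.

      import numpy as np

      conc = np.array([10.0, 25.0, 50.0, 75.0, 100.0])        # µg/mL poligeenan
      area = np.array([1.1e4, 2.8e4, 5.5e4, 8.4e4, 1.12e5])   # hypothetical LC-MS/MS response

      slope, intercept = np.polyfit(conc, area, 1)

      def quantify(sample_area, lloq=10.0, uloq=100.0):
          """Back-calculate concentration; outside the validated range the
          result is reported as not quantifiable."""
          c = (sample_area - intercept) / slope
          return c if lloq <= c <= uloq else None

      print(quantify(6.0e4))   # ~54 µg/mL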

  14. A General Uncertainty Quantification Methodology for Cloud Microphysical Property Retrievals

    NASA Astrophysics Data System (ADS)

    Tang, Q.; Xie, S.; Chen, X.; Zhao, C.

    2014-12-01

    The US Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) program provides long-term (~20 years) ground-based cloud remote sensing observations. However, there are large uncertainties in the retrieval products of cloud microphysical properties based on the active and/or passive remote-sensing measurements. To address this uncertainty issue, a DOE Atmospheric System Research scientific focus study, Quantification of Uncertainties in Cloud Retrievals (QUICR), has been formed. In addition to an overview of recent progress of QUICR, we will demonstrate the capacity of an observation-based general uncertainty quantification (UQ) methodology via the ARM Climate Research Facility baseline cloud microphysical properties (MICROBASE) product. This UQ method utilizes the Karhunen-Loève expansion (KLE) and the Central Limit Theorem (CLT) to quantify the retrieval uncertainties from observations and algorithm parameters. The input perturbations are imposed on the major modes to take into account the cross-correlations between input data, which greatly reduces the dimension of the random variables (by up to a factor of 50) and quantifies vertically resolved full probability distribution functions of the retrieved quantities. Moreover, this KLE/CLT approach has the capability of attributing the uncertainties in the retrieval output to individual uncertainty sources and thus sheds light on improving the retrieval algorithm and observations. We will present the results of a case study for the ice water content at the Southern Great Plains during an intensive observing period on March 9, 2000. This work is performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
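
    The dimension-reduction step can be sketched as a truncated Karhunen-Loève expansion of an input covariance matrix; the variance threshold and the covariance itself are placeholders here, not the MICROBASE settings.

      import numpy as np

      def kle_modes(cov, variance_fraction=0.98):
          """Keep the leading eigenmodes of the input covariance that carry
          the requested fraction of the total variance."""
          vals, vecs = np.linalg.eigh(cov)
          vals, vecs = vals[::-1], vecs[:, ::-1]          # descending order
          k = np.searchsorted(np.cumsum(vals) / vals.sum(), variance_fraction) + 1
          return vals[:k], vecs[:, :k]

      def sample_perturbations(vals, vecs, n_samples, seed=0):
          """Correlated input perturbations built from a few independent N(0,1)
          variables: the dimension reduction the abstract describes."""
          xi = np.random.default_rng(seed).standard_normal((len(vals), n_samples))
          return vecs @ (np.sqrt(vals)[:, None] * xi)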

  15. Quantification of Blood Flow and Topology in Developing Vascular Networks

    PubMed Central

    Kloosterman, Astrid; Hierck, Beerend; Westerweel, Jerry; Poelma, Christian

    2014-01-01

    Since fluid dynamics plays a critical role in vascular remodeling, quantification of the hemodynamics is crucial to gain more insight into this complex process. Better understanding of vascular development can improve prediction of the process, and may eventually even be used to influence the vascular structure. In this study, a methodology to quantify hemodynamics and network structure of developing vascular networks is described. The hemodynamic parameters and topology are derived from detailed local blood flow velocities, obtained by in vivo micro-PIV measurements. The use of such detailed flow measurements is shown to be essential, as blood vessels with a similar diameter can have a large variation in flow rate. Measurements are performed in the yolk sacs of seven chicken embryos at two developmental stages between HH 13+ and 17+. A large range of flow velocities (1 µm/s to 1 mm/s) is measured in blood vessels with diameters in the range of 25–500 µm. The quality of the data sets is investigated by verifying the flow balances in the branching points. This shows that the quality of the data sets of the seven embryos is comparable for all stages observed, and the data is suitable for further analysis with known accuracy. When comparing two subsequently characterized networks of the same embryo, vascular remodeling is observed in all seven networks. However, the character of remodeling in the seven embryos differs and can be non-intuitive, which confirms the necessity of quantification. To illustrate the potential of the data, we present a preliminary quantitative study of key network topology parameters and we compare these with theoretical design rules. PMID:24823933
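
    The flow-balance check at a branching point amounts to comparing the parent vessel's volumetric flow with the sum over its daughters, assuming circular cross-sections; the diameters and velocities below are hypothetical.

      import numpy as np

      def flow_rate(diameter_um, velocity_um_s):
          """Volumetric flow rate (µm^3/s) from diameter and mean velocity,
          assuming a circular cross-section."""
          return np.pi * (diameter_um / 2.0) ** 2 * velocity_um_s

      def branch_imbalance(parent, daughters):
          """Relative mass-balance residual at a bifurcation; values near zero
          indicate self-consistent micro-PIV measurements."""
          q_in = flow_rate(*parent)
          q_out = sum(flow_rate(d, v) for d, v in daughters)
          return (q_in - q_out) / q_in

      # Hypothetical node: a 100 µm parent feeding two 70 µm daughters.
      print(branch_imbalance((100, 500), [(70, 480), (70, 520)]))  # 0.02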

  16. Quantification of surface emissions: An historical perspective from GEIA

    NASA Astrophysics Data System (ADS)

    Granier, C.; Denier Van Der Gon, H.; Doumbia, E. H. T.; Frost, G. J.; Guenther, A. B.; Hassler, B.; Janssens-Maenhout, G. G. A.; Lasslop, G.; Melamed, M. L.; Middleton, P.; Sindelarova, K.; Tarrason, L.; van Marle, M.; W Kaiser, J.; van der Werf, G.

    2015-12-01

    Assessments of the composition of the atmosphere and its evolution require accurate knowledge of the surface emissions of atmospheric compounds. The first community development of global surface emissions started in 1990, when GEIA was established as a component of the International Global Atmospheric Chemistry (IGAC) project. At that time, GEIA meant "Global Emissions Inventory Activity". Since its inception, GEIA has brought together people to understand emissions from anthropogenic, biomass burning and natural sources. The first goal of GEIA was to establish a "best" inventory for the base year 1985 at 1x1 degree resolution. Since then many inventories have been developed by various groups at the global and regional scale at different temporal and spatial resolutions. GEIA, which now means the "Global Emissions Initiative", has evolved into assessing, harmonizing and distributing emissions datasets. We will review the main achievements of GEIA, and show how the development and evaluation of surface emissions has evolved during the last 25 years. We will discuss the use of surface, in-situ and remote sensing observations to evaluate and improve the quantification of emissions. We will highlight the main uncertainties currently limiting emissions datasets, such as the spatial and temporal evolution of emissions at different resolutions, the quantification of emerging emission sources (such as oil/gas extraction and distribution, biofuels, etc.), the speciation of the emissions of volatile organic compounds and of particulate matter, the capacity building necessary for organizing the development of regional emissions across the world, emissions from shipping, etc. We will present the ECCAD (Emissions of Atmospheric Compounds and Compilation of Ancillary Data) database, developed as part of GEIA to facilitate the access and evaluation of emission inventories.

  17. Stochastic methods for uncertainty quantification in radiation transport

    SciTech Connect

    Fichtl, Erin D; Prinja, Anil K; Warsa, James S

    2009-01-01

    The use of generalized polynomial chaos (gPC) expansions is investigated for uncertainty quantification in radiation transport. The gPC represents second-order random processes in terms of an expansion of orthogonal polynomials of random variables and is used to represent the uncertain input(s) and unknown(s). We assume a single uncertain input, the total macroscopic cross section, although this does not represent a limitation of the approaches considered here. Two solution methods are examined: the Stochastic Finite Element Method (SFEM) and the Stochastic Collocation Method (SCM). The SFEM entails taking Galerkin projections onto the orthogonal basis, which, for fixed-source problems, yields a linear system of fully coupled equations for the PC coefficients of the unknown. For k-eigenvalue calculations, the SFEM system is non-linear and a Newton-Krylov method is employed to solve it. The SCM utilizes a suitable quadrature rule to compute the moments or PC coefficients of the unknown(s); thus the SCM solution involves a series of independent deterministic transport solutions. The accuracy and efficiency of the two methods are compared and contrasted. The PC coefficients are used to compute the moments and probability density functions of the unknown(s), which are shown to be accurate by comparison with Monte Carlo results. Our work demonstrates that stochastic spectral expansions are a viable alternative to sampling-based uncertainty quantification techniques, since both provide a complete characterization of the distribution of the flux and the k-eigenvalue. Furthermore, it is demonstrated that, unlike perturbation methods, SFEM and SCM can handle large parameter uncertainty.
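
    The SCM idea can be illustrated on a toy fixed-source problem, 1D attenuation u(x) = exp(-Σx) with a normally distributed total cross section, where each quadrature node plays the role of one deterministic transport solve; all numbers are placeholders.

      import numpy as np

      mu, sigma, x = 1.0, 0.1, 2.0                            # Σ ~ N(mu, sigma), depth x
      nodes, weights = np.polynomial.hermite_e.hermegauss(8)  # probabilists' Gauss-Hermite rule

      u = np.exp(-(mu + sigma * nodes) * x)    # independent deterministic "solves"
      norm = np.sqrt(2.0 * np.pi)              # the weights sum to sqrt(2*pi)
      mean = np.sum(weights * u) / norm
      var = np.sum(weights * u**2) / norm - mean**2
      print(mean, np.sqrt(var))                # moments of the attenuated flux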

  18. Neurostereology protocol for unbiased quantification of neuronal injury and neurodegeneration

    PubMed Central

    Golub, Victoria M.; Brewer, Jonathan; Wu, Xin; Kuruba, Ramkumar; Short, Jenessa; Manchi, Maunica; Swonke, Megan; Younus, Iyan; Reddy, Doodipala Samba

    2015-01-01

    Neuronal injury and neurodegeneration are the hallmark pathologies in a variety of neurological conditions such as epilepsy, stroke, traumatic brain injury, Parkinson's disease and Alzheimer's disease. Quantification of absolute neuron and interneuron counts in various brain regions is essential to understand the impact of neurological insults or neurodegenerative disease progression in animal models. However, conventional qualitative scoring-based protocols are superficial and less reliable for use in neuroprotection studies. Here, we describe an optimized stereology protocol for the quantification of neuronal injury and neurodegeneration by unbiased counting of neurons and interneurons. Every 20th section in each series of 20 sections was processed for NeuN(+) total neuron and parvalbumin(+) interneuron immunostaining. The sections containing the hippocampus were then delineated into five reliably predefined subregions. Each region was separately analyzed with a microscope driven by the stereology software. Regional tissue volume was determined using the Cavalieri estimator, while cell density and cell number were determined using the optical disector and the optical fractionator. This protocol yielded an estimate of 1.5 million total neurons and 0.05 million PV(+) interneurons within the rat hippocampus. The protocol has greater predictive power for absolute counts as it is based on 3D features rather than 2D images. The total neuron counts were consistent with literature values from sophisticated systems that are more expensive than our stereology system. This unbiased stereology protocol allows for sensitive, medium-throughput counting of total neurons in any brain region, and thus provides a quantitative tool for studies of neuronal injury and neurodegeneration in a variety of acute brain injury and chronic neurological models. PMID:26582988
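
    The optical fractionator arithmetic behind such estimates is a one-liner once the sampling fractions are fixed; apart from ssf = 1/20 (every 20th section), the fractions and raw count below are illustrative rather than the paper's exact settings.

      ssf = 1 / 20       # section sampling fraction (every 20th section)
      asf = 0.05         # area sampling fraction: disector area / grid step area (assumed)
      tsf = 0.7          # thickness sampling fraction: disector height / section thickness (assumed)
      raw_count = 2625   # neurons counted in the optical disectors (hypothetical)

      # Total estimate = raw count divided by the product of sampling fractions.
      total_neurons = raw_count / (ssf * asf * tsf)
      print(f"estimated total: {total_neurons:,.0f}")  # 1,500,000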

  19. The applications of statistical quantification techniques in nanomechanics and nanoelectronics

    NASA Astrophysics Data System (ADS)

    Mai, Wenjie; Deng, Xinwei

    2010-10-01

    Although nanoscience and nanotechnology have been developing for approximately two decades and have achieved numerous breakthroughs, experimental results from nanomaterials, with their higher noise levels and poorer repeatability than those from bulk materials, remain a practical issue and challenge many techniques for the quantification of nanomaterials. This work proposes a physical-statistical modeling approach and a global-fitting statistical method that use all the available discrete data or quasi-continuous curves to quantify a few targeted physical parameters, which can provide more accurate, efficient and reliable parameter estimates and give reasonable physical explanations. In the resonance method for measuring the elastic modulus of ZnO nanowires (Zhou et al 2006 Solid State Commun. 139 222-6), our statistical technique gives E = 128.33 GPa instead of the original E = 108 GPa, and unveils a negative bias adjustment f0; the cause is suggested to be a systematic bias in measuring the length of the nanowires. In the electronic measurement of the resistivity of a Mo nanowire (Zach et al 2000 Science 290 2120-3), the proposed method automatically identified the importance of accounting for the Ohmic contact resistance in the model of Ohmic behavior in nanoelectronics experiments. The 95% confidence interval of the resistivity in the proposed one-step procedure is determined to be (3.57 ± 0.0274) × 10⁻⁵ ohm cm, which should be a more reliable and precise estimate. The statistical quantification technique should find wide applications in obtaining better estimates in the presence of the various systematic errors and bias effects that become more significant at the nanoscale.

  20. Bayesian Proteoform Modeling Improves Protein Quantification of Global Proteomic Measurements

    SciTech Connect

    Webb-Robertson, Bobbie-Jo M.; Matzke, Melissa M.; Datta, Susmita; Payne, Samuel H.; Kang, Jiyun; Bramer, Lisa M.; Nicora, Carrie D.; Shukla, Anil K.; Metz, Thomas O.; Rodland, Karin D.; Smith, Richard D.; Tardiff, Mark F.; McDermott, Jason E.; Pounds, Joel G.; Waters, Katrina M.

    2014-12-01

    As the capability of mass spectrometry-based proteomics has matured, tens of thousands of peptides can be measured simultaneously, which has the benefit of offering a systems view of protein expression. However, a major challenge is that, with the increase in throughput, protein quantification estimation from the natively measured peptides has become a computational task. A limitation of existing computationally driven protein quantification methods is that most ignore protein variation, such as alternative splicing of the RNA transcript and post-translational modifications or other possible proteoforms, which will affect a significant fraction of the proteome. The consequence of this assumption is that statistical inference at the protein level, and consequently downstream analyses such as network and pathway modeling, have only limited power for biomarker discovery. Here, we describe a Bayesian model (BP-Quant) that uses statistically derived peptide signatures to identify peptides that are outside the dominant pattern, or the existence of multiple over-expressed patterns, to improve relative protein abundance estimates. It is a research-driven approach that utilizes the objectives of the experiment, defined in the context of a standard statistical hypothesis, to identify a set of peptides exhibiting similar statistical behavior relating to a protein. This approach infers that changes in relative protein abundance can be used as a surrogate for changes in function, without necessarily taking into account the effect of differential post-translational modifications, processing, or splicing in altering protein function. We verify the approach using a dilution study from mouse plasma samples and demonstrate that BP-Quant achieves similar accuracy to the current state-of-the-art methods at proteoform identification, with significantly better specificity. BP-Quant is available as MATLAB® and R packages at https://github.com/PNNL-Comp-Mass-Spec/BP-Quant.

  1. Quantification of the genetic risk of environmental mutagens

    SciTech Connect

    Ehling, U.H.

    1988-03-01

    Screening methods are used for hazard identification. Assays for heritable mutations in mammals are used for the confirmation of short-term test results and for the quantification of the genetic risk. There are two main approaches to making genetic risk estimates. One of these, termed the direct method, expresses risk in terms of the expected frequency of genetic changes induced per unit of exposure. The other, referred to as the doubling dose method or the indirect method, expresses risk in relation to the observed incidence of genetic disorders now present in man. The indirect method uses experimental data only for the calculation of the doubling dose. The quality of the risk estimation depends on the assumption of persistence of the induced mutations and the ability to determine the current incidence of genetic diseases. The difficulty of improving the estimates of the current incidence of genetic diseases, or of the persistence of the genes in the population, led to the development of an alternative method: the direct estimation of the genetic risk. The direct estimation uses experimental data on the induced frequency of dominant mutations in mice. For the verification of these quantifications one can use the data of Hiroshima and Nagasaki. According to the estimation with the direct method, one would expect less than 1 radiation-induced dominant cataract in 19,000 children with one or both parents exposed. The expected overall frequency of dominant mutations in the first generation would be 20-25, based on radiation-induced dominant cataract mutations. It is estimated that 10 times more recessive than dominant mutations are induced. The same approaches can be used to determine the impact of chemical mutagens.

  2. Respiratory Mucosal Proteome Quantification in Human Influenza Infections

    PubMed Central

    Marion, Tony; Elbahesh, Husni; Thomas, Paul G.; DeVincenzo, John P.; Webby, Richard; Schughart, Klaus

    2016-01-01

    Respiratory influenza virus infections represent a serious threat to human health. Underlying medical conditions and genetic make-up predispose some influenza patients to more severe forms of disease. To date, only a few studies have been performed in patients to correlate a selected group of cytokines and chemokines with influenza infection. Therefore, we evaluated the potential of a novel multiplex micro-proteomics technology, SOMAscan, to quantify proteins in the respiratory mucosa of influenza A and B infected individuals. The analysis included, but was not limited to, quantification of the cytokines and chemokines detected in previous studies. SOMAscan quantified more than 1,000 secreted proteins in small nasal wash volumes from infected and healthy individuals. Our results illustrate the utility of micro-proteomic technology for the analysis of proteins in small volumes of respiratory mucosal samples. Furthermore, when we compared nasal wash samples from influenza-infected patients with viral load ≥ 28 and increased IL-6 and CXCL10 to healthy controls, we identified 162 differentially expressed proteins (DEPs) between the two groups. This number greatly exceeds the number of DEPs identified in previous studies in human influenza patients. Most of the identified proteins were associated with the host immune response to infection, and changes in protein levels of 151 of the DEPs were significantly correlated with viral load. Most importantly, SOMAscan identified differentially expressed proteins heretofore not associated with respiratory influenza infection in humans. Our study is the first report of the use of SOMAscan to screen nasal secretions, and it establishes a precedent for micro-proteomic quantification of proteins that reflect the ongoing response to respiratory infection. PMID:27088501

  3. Rapid quantification method for Legionella pneumophila in surface water.

    PubMed

    Wunderlich, Anika; Torggler, Carmen; Elsässer, Dennis; Lück, Christian; Niessner, Reinhard; Seidel, Michael

    2016-03-01

    World-wide legionellosis outbreaks caused by evaporative cooling systems have shown that there is a need for rapid screening methods for Legionella pneumophila in water. Antibody-based methods for the quantification of L. pneumophila are rapid, non-laborious, and relatively cheap, but not sensitive enough to be established as a screening method for surface and drinking water. Therefore, preconcentration methods have to be applied in advance to reach the needed sensitivity. Monolithic adsorption filtration (MAF) was used as a primary preconcentration method that adsorbs L. pneumophila with high efficiency. Ten-liter water samples were concentrated in 10 min and further reduced to 1 mL by centrifugal ultrafiltration (CeUF). The quantification of L. pneumophila strains belonging to the monoclonal subtype Bellingham was performed via flow-based chemiluminescence sandwich microarray immunoassays (CL-SMIA) in 36 min; the whole analysis process takes 90 min. A polyclonal antibody (pAb) against L. pneumophila serogroups 1-12 and a monoclonal antibody (mAb) against the L. pneumophila SG 1 strain Bellingham were immobilized on a microarray chip. Without preconcentration, the detection limits were 4.0 × 10³ and 2.8 × 10³ CFU/mL as determined by the pAb and mAb 10/6, respectively. For samples processed by MAF-CeUF prior to SMIA detection, the limit of detection (LOD) could be decreased to 8.7 CFU/mL and 0.39 CFU/mL, respectively. A recovery of 99.8 ± 15.9% was achieved for concentrations between 1 and 1000 CFU/mL. The established combined analytical method is sensitive enough for rapid screening of surface and drinking water, allowing fast hygiene control of L. pneumophila. PMID:26873217

  4. Extracellular polymeric substances: quantification and use in erosion experiments

    NASA Astrophysics Data System (ADS)

    Perkins, R. G.; Paterson, D. M.; Sun, H.; Watson, J.; Player, M. A.

    2004-10-01

    Extracellular polymeric substances (EPS) is a generic term often applied to high molecular weight polymers implicated in the biostabilisation of natural sediments. Quantitative analysis of in situ EPS production rates and sediment contents has usually involved extraction of EPS in saline media prior to precipitation in alcohol and quantification against a glucose standard (phenol-sulphuric acid assay). Extracted and synthetic EPS has also been used to create engineered sediments for erosion experiments. This study investigated two steps in the EPS extraction procedure, saline extraction and alcohol precipitation. Comparisons of the effects of different extracted polymers were made in sediment erosion experiments using engineered sediments. Sediment EPS content decreased as the salinity of the extractant increased, with highest values obtained for extraction in fresh water. Potential errors were observed in the quantification of the soluble colloidal polymer fraction when divided into EPS and lower molecular weight polymers (LMW) as used in many studies. In erosion studies, 15 mg kg⁻¹ of alcohol (IMS) extracted EPS polymer (in 5 g kg⁻¹ IMS precipitate, equivalent to approximately 5 g salt kg⁻¹ sediment dry weight) decreased the erosion threshold of cohesive sediments whereas 30 mg kg⁻¹ (in 10 g kg⁻¹ IMS precipitate, approximately 10 g salt kg⁻¹ sediment dry weight) had no effect compared to controls. This could be due to the influence of EPS on water content: low levels of EPS did not bind but prevented desiccation, lowering sediment stability against controls. At higher EPS content, binding effects balanced water content effects. Salt alone (at 10 g kg⁻¹) slightly increased the erosion threshold after a 6-h desiccation period. In comparison, carbohydrates produced without alcohol precipitation (rotary evaporation) increased the erosion threshold at both 0.5 and 1.0 g EPS kg⁻¹ dry weight of sediment. It was concluded that the role of microphytobenthic polymers in

  5. Quantification of Carbohydrates in Grape Tissues Using Capillary Zone Electrophoresis

    PubMed Central

    Zhao, Lu; Chanon, Ann M.; Chattopadhyay, Nabanita; Dami, Imed E.; Blakeslee, Joshua J.

    2016-01-01

    Soluble sugars play an important role in freezing tolerance in both herbaceous and woody plants, functioning in both the reduction of freezing-induced dehydration and the cryoprotection of cellular constituents. The quantification of soluble sugars in plant tissues is, therefore, essential in understanding freezing tolerance. While a number of analytical techniques and methods have been used to quantify sugars, most of these are expensive and time-consuming due to complex sample preparation procedures which require the derivatization of the carbohydrates being analyzed. Analysis of soluble sugars using capillary zone electrophoresis (CZE) under alkaline conditions with direct UV detection has previously been used to quantify simple sugars in fruit juices. However, it was unclear whether CZE-based methods could be successfully used to quantify the broader range of sugars present in complex plant extracts. Here, we present the development of an optimized CZE method capable of separating and quantifying mono-, di-, and tri-saccharides isolated from plant tissues. This optimized CZE method employs a column electrolyte buffer containing 130 mM NaOH, pH 13.0, creating a current of 185 μA when a separation voltage of 10 kV is employed. The optimized CZE method provides limits-of-detection (an average of 1.5 ng/μL) for individual carbohydrates comparable or superior to those obtained using gas chromatography–mass spectrometry, and allows resolution of non-structural sugars and cell wall components (structural sugars). The optimized CZE method was successfully used to quantify sugars from grape leaves and buds, and is a robust tool for the quantification of plant sugars found in vegetative and woody tissues. The increased analytical efficiency of this CZE method makes it ideal for use in high-throughput metabolomics studies designed to quantify plant sugars. PMID:27379118

  6. Optimal uncertainty quantification with model uncertainty and legacy data

    NASA Astrophysics Data System (ADS)

    Kamga, P.-H. T.; Li, B.; McKerns, M.; Nguyen, L. H.; Ortiz, M.; Owhadi, H.; Sullivan, T. J.

    2014-12-01

    We present an optimal uncertainty quantification (OUQ) protocol for systems that are characterized by an existing physics-based model and for which only legacy data is available, i.e., no additional experimental testing of the system is possible. Specifically, the OUQ strategy developed in this work consists of using the legacy data to establish, in a probabilistic sense, the level of error of the model, or modeling error, and to subsequently use the validated model as a basis for the determination of probabilities of outcomes. The quantification of modeling uncertainty specifically establishes, to a specified confidence, the probability that the actual response of the system lies within a certain distance of the model. Once the extent of model uncertainty has been established in this manner, the model can be conveniently used to stand in for the actual or empirical response of the system in order to compute probabilities of outcomes. To this end, we resort to the OUQ reduction theorem of Owhadi et al. (2013) in order to reduce the computation of optimal upper and lower bounds on probabilities of outcomes to a finite-dimensional optimization problem. We illustrate the resulting UQ protocol by means of an application concerned with the response to hypervelocity impact of 6061-T6 aluminum plates by nylon 6/6 impactors at impact velocities in the range of 5-7 km/s. The ability of the legacy OUQ protocol to process diverse information on the system, and to supply rigorous bounds on system performance under realistic (and less than ideal) scenarios, as demonstrated by the hypervelocity impact application, is remarkable.

  7. Detection and quantification of Bacillus cereus group in milk by droplet digital PCR.

    PubMed

    Porcellato, Davide; Narvhus, Judith; Skeie, Siv Borghild

    2016-08-01

    Droplet digital PCR (ddPCR) is one of the newest and most promising methods for the detection and quantification of molecular targets by PCR. Here, we optimized and used a new ddPCR assay for the detection and quantification of the Bacillus cereus group in milk, and compared the ddPCR to a standard qPCR assay. The new ddPCR assay showed a similar coefficient of determination and a better limit of detection than the qPCR assay during quantification of the target molecules in the samples. However, the ddPCR assay has a limitation when quantifying high numbers of target molecules. The new assay was then tested for the quantification of the B. cereus group in 90 milk samples obtained over three months from two different dairies, with the milk stored at different temperatures before sampling. The ddPCR assay showed good agreement with the qPCR assay for the quantification of the B. cereus group in milk and, owing to its lower detection limit, more samples were detected as positive. The new ddPCR assay is a promising method for the quantification of target bacteria at low concentrations in milk. PMID:27211508
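
    The quantification itself follows from Poisson statistics on the droplet counts, and the same arithmetic shows the high-concentration limitation noted above: the estimate saturates as nearly all droplets turn positive. The 0.85 nL droplet volume is a commonly cited figure for commercial systems, assumed here.

      import numpy as np

      def ddpcr_copies_per_ul(positive, total, droplet_volume_nl=0.85):
          """Poisson estimate of target concentration from droplet counts."""
          p = positive / total
          lam = -np.log(1.0 - p)                    # mean copies per droplet
          return lam / (droplet_volume_nl * 1e-3)   # copies per µL of reaction

      print(ddpcr_copies_per_ul(4200, 15000))   # ~386 copies/µL
      # As positive -> total, lam blows up: precision is lost at high target loads.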

  8. Absolute protein quantification of the yeast chaperome under conditions of heat shock.

    PubMed

    Mackenzie, Rebecca J; Lawless, Craig; Holman, Stephen W; Lanthaler, Karin; Beynon, Robert J; Grant, Chris M; Hubbard, Simon J; Eyers, Claire E

    2016-08-01

    Chaperones are fundamental to regulating the heat shock response, mediating protein recovery from thermal-induced misfolding and aggregation. Using the QconCAT strategy and selected reaction monitoring (SRM) for absolute protein quantification, we have determined copy per cell values for 49 key chaperones in Saccharomyces cerevisiae under conditions of normal growth and heat shock. This work extends a previous chemostat quantification study by including up to five Q-peptides per protein to improve confidence in protein quantification. In contrast to the global proteome profile of S. cerevisiae in response to heat shock, which remains largely unchanged as determined by label-free quantification, many of the chaperones are upregulated with an average two-fold increase in protein abundance. Interestingly, eight of the significantly upregulated chaperones are direct gene targets of heat shock transcription factor-1. By performing absolute quantification of chaperones under heat stress for the first time, we were able to evaluate the individual protein-level response. Furthermore, this SRM data was used to calibrate label-free quantification values for the proteome in absolute terms, thus improving relative quantification between the two conditions. This study significantly enhances the largely transcriptomic data available in the field and illustrates a more nuanced response at the protein level. PMID:27252046

  9. A new objective method for acquisition and quantification of reflex receptive fields.

    PubMed

    Jensen, Michael Brun; Manresa, José Biurrun; Andersen, Ole Kæseler

    2015-03-01

    The nociceptive withdrawal reflex (NWR) is a polysynaptic spinal reflex correlated with pain perception. Assessment of this objective physiological measure constitutes the core of existing methods for quantification of reflex receptive fields (RRFs), which however still suffer from a certain degree of subjective involvement. This article proposes a strictly objective methodology for RRF quantification based on automated identification of NWR thresholds (NWR-Ts). Nociceptive withdrawal reflex thresholds were determined for 10 individual stimulation sites using an interleaved up-down staircase method. Reflexes were detected from electromyography by evaluation of interval peak z scores and application of conduction velocity analysis. Reflex receptive field areas were quantified from interpolated mappings of NWR-Ts and compared with existing RRF quantifications. A total of 3 repeated measures were performed in 2 different sessions to evaluate the test-retest reliability of the various quantifications, using coefficients of repeatability (CRs) and hypothetical sample sizes. The novel quantifications based on identification of NWR-Ts showed a similar level of reliability within and between sessions, whereas existing quantifications all demonstrated worse between-session than within-session reliability. The NWR-T-based quantifications required a smaller sample size than any of the existing RRF measures to detect a clinically relevant effect in a crossover study design involving more than 1 session. Of all measures, quantification from mapping of inversed NWR-Ts demonstrated superior reliability both within (CR, 0.25) and between sessions (CR, 0.28). The study presents a more reliable and robust quantification of the RRF to be used as biomarker of pain hypersensitivity in clinical and experimental research. PMID:25599237

  10. Comparison of colorimetric methods for the quantification of model proteins in aqueous two-phase systems.

    PubMed

    Glyk, Anna; Heinisch, Sandra L; Scheper, Thomas; Beutel, Sascha

    2015-05-15

    In the current study, the quantification of different model proteins in the presence of typical aqueous two-phase system components was investigated using the Bradford and bicinchoninic acid (BCA) assays. Phase-forming components above 1 wt% (Bradford) and 5 wt% (BCA) had considerable effects on protein quantification, resulting in diminished protein recoveries/absorbance values with increasing poly(ethylene glycol) (PEG)/salt concentration and PEG molecular weight. Therefore, dilution of both components to at most 1 and 5 wt%, respectively, before protein quantification is recommended, with the BCA assay favored over the Bradford assay. PMID:25684109

  11. Uncertainty Quantification Bayesian Framework for Porous Media Flows

    NASA Astrophysics Data System (ADS)

    Demyanov, V.; Christie, M.; Erbas, D.

    2005-12-01

    Uncertainty quantification is an increasingly important aspect of many areas of applied science, where the challenge is to make reliable predictions about the performance of complex physical systems in the absence of complete or reliable data. Predicting flows of fluids through subsurface reservoirs is an example of a complex system where accuracy in prediction is needed (e.g. in the oil industry it is essential for financial reasons). Simulation of fluid flow in oil reservoirs is usually carried out using large, commercially written finite-difference simulators solving conservation equations that describe the multi-phase flow through the porous reservoir rocks, which is a highly computationally expensive task. This work examines a Bayesian framework for uncertainty quantification in porous media flows that uses a stochastic sampling algorithm to generate models that match observed time series data. The framework is flexible enough for a wide range of general physical/statistical parametric models, which are used to describe the underlying hydro-geological process in its temporal dynamics. The approach is based on exploration of the parameter space and updating of the prior beliefs about the most likely model definitions. Optimization problems for highly parametric physical models usually have multiple solutions, which impact the uncertainty of the resulting predictions. A stochastic search algorithm (e.g. a genetic algorithm) allows the identification of multiple "good enough" models in the parameter space. Furthermore, inference over the generated model ensemble via an MCMC-based algorithm evaluates the posterior probability of the generated models and quantifies the uncertainty of the predictions. A machine learning algorithm, artificial neural networks (ANNs), is used to speed up the identification of regions in parameter space where good matches to observed data can be found. The adaptive nature of ANNs allows different ways of integrating them into the Bayesian framework: as direct time
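
    The MCMC half of such a framework can be sketched with a random-walk Metropolis sampler over model parameters; the toy forward model, noise level, and implicit flat prior below are assumptions standing in for the reservoir simulator and its data.

      import numpy as np

      def metropolis(log_post, theta0, n_steps, step=0.1, seed=1):
          """Random-walk Metropolis: propose a jitter, accept with probability
          min(1, posterior ratio), and collect the chain."""
          rng = np.random.default_rng(seed)
          theta = np.asarray(theta0, dtype=float)
          lp = log_post(theta)
          chain = []
          for _ in range(n_steps):
              prop = theta + step * rng.standard_normal(theta.shape)
              lp_prop = log_post(prop)
              if np.log(rng.uniform()) < lp_prop - lp:
                  theta, lp = prop, lp_prop
              chain.append(theta.copy())
          return np.array(chain)

      obs = np.array([1.0, 0.8, 0.7])        # synthetic "production data"
      def log_post(theta):                   # misfit only: a flat prior is assumed
          pred = theta[0] * np.exp(-theta[1] * np.arange(3))
          return -0.5 * np.sum((pred - obs) ** 2) / 0.05**2

      chain = metropolis(log_post, [1.0, 0.1], 5000)
      print(chain.mean(axis=0))              # posterior-mean parameters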

  12. Multimodality medical image fusion: probabilistic quantification, segmentation, and registration

    NASA Astrophysics Data System (ADS)

    Wang, Yue J.; Freedman, Matthew T.; Xuan, Jian Hua; Zheng, Qinfen; Mun, Seong K.

    1998-06-01

    Multimodality medical image fusion is becoming increasingly important in clinical applications; it involves information processing, registration and visualization of interventional and/or diagnostic images obtained from different modalities. This work develops a multimodality medical image fusion technique through probabilistic quantification, segmentation, and registration, based on statistical data mapping, multiple-feature correlation, and probabilistic mean ergodic theorems. The goal of image fusion is to geometrically align two or more image areas/volumes so that pixels/voxels representing the same underlying anatomical structure can be superimposed meaningfully. Three steps are involved. To accurately extract the regions of interest, we developed model-supported Bayesian relaxation labeling and integrated edge-detection and region-growing algorithms to segment the images into objects. After identifying the shift-invariant features (i.e., edge and region information), we provide an accurate and robust registration technique based on matching multiple binary feature images through a site-model-based image re-projection. The image is initially segmented into a specified number of regions, and a rough contour can be obtained by delineating and merging some of the segmented regions. We apply region growing and morphological filtering to extract the contour and remove disconnected residual pixels left after segmentation. The matching algorithm is implemented as follows: (1) the centroids of the PET/CT and MR images are computed and then translated to the center of both images; (2) a preliminary registration is performed first to determine an initial range of scaling factors and rotations, and the MR image is then resampled according to the specified parameters; (3) the total binary difference of the corresponding binary maps in both images is calculated for the selected registration parameters, and the final registration is achieved when the

  13. Coral Pigments: Quantification Using HPLC and Detection by Remote Sensing

    NASA Technical Reports Server (NTRS)

    Cottone, Mary C.

    1995-01-01

    Widespread coral bleaching (loss of pigments of symbiotic dinoflagellates), and the corresponding decline in coral reef health worldwide, mandates the monitoring of coral pigmentation. Samples of the corals Porites compressa and P. lobata were collected from a healthy reef at Puako, Hawaii, and chlorophyll (chl) a, peridinin, and Beta-carotene (Beta-car) were quantified using reverse-phase high performance liquid chromatography (HPLC). Detailed procedures are presented for the extraction of the coral pigments in 90% acetone, and the separation, identification, and quantification of the major zooxanthellar pigments using spectrophotometry and a modification of the HPLC system described by Mantoura and Llewellyn (1983). Beta-apo-8'-carotenal was found to be inadequate as an internal standard, due to coelution with chl b and/or chl a allomer in the sample extracts. Improvements are suggested, which may result in better resolution of the major pigments and greater accuracy in quantification. Average concentrations of peridinin, chl a, and Beta-car in corals on the reef were 5.01, 8.59, and 0.29 µg/cm², respectively. Average concentrations of peridinin and Beta-car did not differ significantly between the two coral species sampled; however, the mean chl a concentration in P. compressa specimens (7.81 µg/cm²) was significantly lower than that in P. lobata specimens (9.96 µg/cm²). Chl a concentrations determined spectrophotometrically were significantly higher than those generated through HPLC, suggesting that spectrophotometry overestimates chl a concentrations. The average ratio of chl a-to-peridinin concentrations was 1.90, with a large (53%) coefficient of variation and a significant difference between the two species sampled. Additional data are needed before conclusions can be drawn regarding average pigment concentrations in healthy corals and the consistency of the chl a/peridinin ratio. The HPLC pigment concentration values

  14. Development of hydrate risk quantification in oil and gas production

    NASA Astrophysics Data System (ADS)

    Chaudhari, Piyush N.

    order to reduce the parametric study that may require a long duration of time using The Colorado School of Mines Hydrate Kinetic Model (CSMHyK). The evolution of the hydrate plugging risk along flowline-riser systems is modeled for steady-state and transient operations, considering the effect of several critical parameters, such as oil-hydrate slip, duration of shut-in, and water droplet size, on a subsea tieback system. This research presents a novel platform for quantification of the hydrate plugging risk, which in turn will play an important role in improving and optimizing current hydrate management strategies. The predictive strength of the hydrate risk quantification and hydrate prediction models will have a significant impact on flow assurance engineering and design with respect to building safe and efficient hydrate management techniques for future deep-water developments.

  15. Collaborative framework for PIV uncertainty quantification: the experimental database

    NASA Astrophysics Data System (ADS)

    Neal, Douglas R.; Sciacchitano, Andrea; Smith, Barton L.; Scarano, Fulvio

    2015-07-01

    The uncertainty quantification of particle image velocimetry (PIV) measurements has recently become a topic of great interest as shown by the recent appearance of several different methods within the past few years. These approaches have different working principles, merits and limitations, which have been speculated upon in subsequent studies. This paper reports a unique experiment that has been performed specifically to test the efficacy of PIV uncertainty methods. The case of a rectangular jet, as previously studied by Timmins et al (2012) and Wilson and Smith (2013b), is used. The novel aspect of the experiment is simultaneous velocity measurements using two different time-resolved PIV systems and a hot-wire anemometry (HWA) system. The first PIV system, called the PIV measurement system (‘PIV-MS’), is intended for nominal measurements of which the uncertainty is to be evaluated. It is based on a single camera and features a dynamic velocity range (DVR) representative of typical PIV experiments. The second PIV system, called the ‘PIV-HDR’ (high dynamic range) system, features a significantly higher DVR obtained with a higher digital imaging resolution. The hot-wire is placed in close proximity to the PIV measurement domain. The three measurement systems were carefully set to simultaneously measure the flow velocity at the same time and location. The comparison between the PIV-HDR system and the HWA provides an estimate of the measurement precision of the reference velocity for evaluation of the instantaneous error in the measurement system. The discrepancy between the PIV-MS and the reference data provides the measurement error, which is later used to assess the different uncertainty quantification methods proposed in the literature. A detailed comparison of the uncertainty estimation methods based on the present datasets is presented in a second paper from Sciacchitano et al (2015). Furthermore, this database offers the potential to be used for

  16. CURVATURE EFFECT QUANTIFICATION FOR IN-VIVO IR THERMOGRAPHY.

    PubMed

    Cheng, Tze-Yuan; Deng, Daxiang; Herman, Cila

    2012-01-01

    Medical infrared (IR) imaging has become an important diagnostic tool over recent years. However, one underlying problem in medical diagnostics is associated with the accurate quantification of body surface temperatures. This problem is caused by artifacts induced by the curvature of objects, which leads to inaccurate temperature mapping and biased diagnostic results. Therefore, in our study, an experiment-based analysis is conducted to address the curvature effects on the 3D temperature reconstruction of the IR thermography image. For quantification purposes, an isothermal copper plate with a flat surface and a cylindrical metal container filled with water are imaged. For the flat surface, the tilting angle measured from the camera axis was varied incrementally from 0° to 60°, such that the effects of surface viewing angle and travel distance on the measured temperature could be explored. On the cylindrical curved surface, the points viewed from 0° to 90° with respect to the camera axis are simultaneously imaged at different temperature levels. The experimental data obtained for the flat surface indicate that both viewing angle and distance effects become noticeable for angles over 40°. The travel distance contributes a minor change compared with the viewing angle. The experimental results from the curved surface indicate that the curvature effect becomes pronounced when the viewing angle is larger than 60°. The measurement error on the curved surface is compared with a simulation using the non-dielectric model, and the normalized temperature difference relative to the 0° viewing angle was analyzed at six temperature levels. These results indicate that the linear formula associated with directional emissivity is a reasonable approximation for the measurement error, and the normalized error curves change consistently with viewing angle at various temperatures. Therefore, the analysis in this study implies that the directional emissivity based on the non

  18. Comparison of microvolume DNA quantification methods for use with volume-sensitive environmental DNA extracts

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Accurate DNA concentration estimates from environmental samples using minimal sample volumes are essential for most downstream applications. To compare the efficacy of microvolume quantification methods, DNA was extracted from soil, compost, and pure culture samples, and quantified using two absorba...

  19. The Effect of Human Genome Annotation Complexity on RNA-Seq Gene Expression Quantification

    PubMed Central

    Wu, Po-Yen; Phan, John H.; Wang, May D.

    2016-01-01

    Next-generation sequencing (NGS) has brought human genomic research to an unprecedented era. RNA-Seq is a branch of NGS that can be used to quantify gene expression and depends on accurate annotation of the human genome (i.e., the definition of genes and all of their variants or isoforms). Multiple annotations of the human genome exist with varying complexity. However, it is not clear how the choice of genome annotation influences RNA-Seq gene expression quantification. We assess the effect of different genome annotations in terms of (1) mapping quality, (2) quantification variation, (3) quantification accuracy (i.e., by comparing to qRT-PCR data), and (4) the concordance of detecting differentially expressed genes. External validation with qRT-PCR suggests that more complex genome annotations result in higher quantification variation.

  20. Monte Carlo Simulation for Quantification of Light Transport Features in Apples

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Light interaction with turbid biological materials involves absorption and scattering. Quantitative understanding of light propagation features in the fruit is critical to designing better optical systems for inspection of food quality. This article reports on the quantification of light propagation...

  1. Relative quantification of biomarkers using mixed-isotope labeling coupled with MS

    PubMed Central

    Chapman, Heidi M; Schutt, Katherine L; Dieter, Emily M; Lamos, Shane M

    2013-01-01

    The identification and quantification of important biomarkers is a critical first step in the elucidation of biological systems. Biomarkers take many forms as cellular responses to stimuli and can be manifested during transcription, translation, and/or metabolic processing. Increasingly, researchers have relied upon mixed-isotope labeling (MIL) coupled with MS to perform relative quantification of biomarkers between two or more biological samples. MIL effectively tags biomarkers of interest for ease of identification and quantification within the mass spectrometer by using isotopic labels that introduce a heavy and light form of the tag. In addition to MIL coupled with MS, a number of other approaches have been used to quantify biomarkers including protein gel staining, enzymatic labeling, metabolic labeling, and several label-free approaches that generate quantitative data from the MS signal response. This review focuses on MIL techniques coupled with MS for the quantification of protein and small-molecule biomarkers. PMID:23157360

  2. SIMPLE METHOD FOR THE REPRESENTATION, QUANTIFICATION, AND COMPARISON OF THE VOLUMES AND SHAPES OF CHEMICAL COMPOUNDS

    EPA Science Inventory

    A conceptually and computationally simple method for the definition, display, quantification, and comparison of the shapes of three-dimensional mathematical molecular models is presented. Molecular or solvent-accessible volume and surface area can also be calculated. Algorithms, ...

  3. Best practices for metabolite quantification in drug development: updated recommendation from the European Bioanalysis Forum.

    PubMed

    Timmerman, Philip; Blech, Stefan; White, Stephen; Green, Martha; Delatour, Claude; McDougall, Stuart; Mannens, Geert; Smeraglia, John; Williams, Stephen; Young, Graeme

    2016-06-01

    Metabolite quantification and profiling continues to grow in importance in today's drug development. The guidance provided by the 2008 FDA Metabolites in Safety Testing Guidance and the subsequent ICH M3(R2) Guidance (2009) has led to a more streamlined process to assess metabolite exposures in preclinical and clinical studies in industry. In addition, the European Bioanalysis Forum (EBF) identified an opportunity to refine its strategies on metabolite quantification, drawing on the experience gained since its 2010 recommendation paper on the subject and integrating recent discussions on the tiered approach to bioanalytical method validation, with a focus on metabolite quantification. The current manuscript summarizes the discussion and recommendations from a recent EBF Focus Workshop into an updated recommendation for metabolite quantification in drug development. PMID:27217058

  4. Detection and quantification of delamination in laminated plates from the phase of appropriate guided wave modes

    NASA Astrophysics Data System (ADS)

    Amjad, Umar; Yadav, Susheel Kumar; Kundu, Tribikram

    2016-01-01

    The applicability of specific Lamb wave modes for delamination detection and quantification in a laminated aluminum plate is investigated. The Lamb modes were generated in the plate using a broadband piezoelectric transducer structured with a rigid electrode. Appropriate excitation frequencies and modes for inspection were selected from theoretical dispersion curves. The sensitivity of antisymmetric and symmetric modes for delamination detection and quantification was investigated using the Hilbert-Huang transform. Mode conversion of Lamb waves during progressive delamination is observed. The antisymmetric mode is found to be more reliable for delamination detection and quantification. In this investigation, the changes in the phase of guided Lamb wave modes are related to the degree of delamination, unlike other studies, where mostly the attenuation of the propagating waves has been related to the extent of internal damage such as cracks and corrosion. Appropriate features for delamination detection and quantification are extracted from the experimental data.
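
    The phase-based feature extraction described above can be pictured with the analytic signal: the instantaneous phase of a received toneburst shifts when the wave path changes. Below is a minimal sketch assuming a narrowband signal; the signal model, sampling rate, and the 2 µs "damage" delay are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.signal import hilbert

def instantaneous_phase(signal):
    """Unwrapped instantaneous phase (rad) via the analytic signal."""
    return np.unwrap(np.angle(hilbert(signal)))

# toy toneburst: a 2 us extra delay mimics a longer path through damage
fs, f0 = 1e6, 100e3
t = np.arange(0, 200e-6, 1 / fs)
def burst(lag):
    return np.sin(2*np.pi*f0*(t - lag)) * np.exp(-((t - lag - 1e-4)/2e-5)**2)

ph_ref = instantaneous_phase(burst(0.0))
ph_dam = instantaneous_phase(burst(2e-6))
core = np.abs(burst(0.0)) > 0.05                 # evaluate where energy is
dphi = np.mean((ph_dam - ph_ref)[core])
print(f"phase shift ~ {dphi:.2f} rad (expected {2*np.pi*f0*2e-6:.2f})")
```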

  5. The Effect of AOP on Software Engineering, with Particular Attention to OIF and Event Quantification

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Filman, Robert; Korsmeyer, David (Technical Monitor)

    2003-01-01

    We consider the impact of Aspect-Oriented Programming on Software Engineering, and, in particular, analyze two AOP systems, one of which does component wrapping and the other, quantification over events, for their software engineering effects.

  6. Neutron-encoded mass signatures for multi-plexed proteome quantification

    PubMed Central

    Hebert, Alexander S; Merrill, Anna E; Bailey, Derek J; Still, Amelia J; Westphall, Michael S; Streiter, Eric R; Pagliarini, David J; Coon, Joshua J

    2013-01-01

    We describe a protein quantification method that exploits the subtle mass differences caused by neutron-binding energy variation in stable isotopes. These mass differences are synthetically encoded into amino acids and incorporated into yeast and mouse proteins with metabolic labeling; analysis with high mass resolution (>100,000) reveals the isotopologue-embedded peptide signals permitting quantification. We conclude neutron encoding will enable high levels of multi-plexing (> 10) with high dynamic range and accuracy. PMID:23435260

  7. Quantification of Internalized Silica Nanoparticles via STED Microscopy

    PubMed Central

    Peuschel, Henrike; Ruckelshausen, Thomas; Cavelius, Christian; Kraegeloh, Annette

    2015-01-01

    The development of safe engineered nanoparticles (NPs) requires a detailed understanding of their interaction mechanisms on a cellular level. Therefore, quantification of NP internalization is crucial to predict the potential impact of intracellular NP doses, providing essential information for risk assessment as well as for drug delivery applications. In this study, the internalization of 25 nm and 85 nm silica nanoparticles (SNPs) in alveolar type II cells (A549) was quantified by application of super-resolution STED (stimulated emission depletion) microscopy. Cells were exposed to equal particle number concentrations (9.2 × 10¹⁰ particles mL⁻¹) of each particle size and the sedimentation of particles during exposure was taken into account. Microscopy images revealed that particles of both sizes entered the cells after 5 h incubation in serum-supplemented and serum-free medium. According to the in vitro sedimentation, diffusion, and dosimetry (ISDD) model, 20–27% of the particles sedimented. In comparison, 10²–10³ NPs per cell were detected intracellularly in serum-containing medium. Furthermore, in the presence of serum, no cytotoxicity was induced by the SNPs. In serum-free medium, large agglomerates of both particle sizes covered the cells, whereas only high concentrations (≥ 3.8 × 10¹² particles mL⁻¹) of the smaller particles induced cytotoxicity. PMID:26125028

  8. Uncertainty quantification in the catalytic partial oxidation of methane

    NASA Astrophysics Data System (ADS)

    Navalho, Jorge E. P.; Pereira, José M. C.; Ervilha, Ana R.; Pereira, José C. F.

    2013-12-01

    This work focuses on the uncertainty quantification of eight random parameters required as input for 1D modelling of methane catalytic partial oxidation within a highly dense foam reactor. Parameters related to geometrical properties, reactor thermophysics and catalyst loading are taken as uncertain. A widely applied 1D heterogeneous mathematical model that accounts for the relevant transport and surface chemistry steps is used to evaluate the deterministic samples. The non-intrusive spectral projection approach based on polynomial chaos expansion is applied to determine the stochastic temperature and species profiles along the reactor axial direction, as well as their ensemble mean and error bars with a 95% confidence interval. Probability density functions of relevant variables in specific reactor sections are also analysed. Each random input contributes differently to the total uncertainty range. Porosity, specific surface area and catalyst loading appear as the major sources of uncertainty in the bulk gas and surface temperature and species molar profiles. Porosity and the mean pore diameter have an important impact on the pressure drop along the whole reactor, as expected. It is also concluded that, for a long-enough catalyst, any trace of uncertainty in the eight input random variables is almost dissipated near the catalyst outlet section, mainly due to the approach to thermodynamic equilibrium.
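
    For readers unfamiliar with non-intrusive spectral projection (NISP), the sketch below shows the idea in its simplest form: evaluate the deterministic model at Gauss quadrature nodes of the input distribution, project onto an orthogonal polynomial basis, and read the mean and variance off the polynomial chaos coefficients. This is a one-input toy with a made-up response, not the eight-parameter reactor model of the paper.

```python
import numpy as np
from math import factorial
from numpy.polynomial.hermite_e import hermegauss, hermeval

def nisp_1d(model, order=4, nquad=16):
    """Non-intrusive spectral projection for one standard-normal input.
    Returns PC coefficients c_k with model(xi) ~ sum_k c_k He_k(xi)."""
    x, w = hermegauss(nquad)
    w = w / np.sqrt(2.0 * np.pi)          # normalize to a probability measure
    y = model(x)                          # deterministic samples at the nodes
    coeffs = []
    for k in range(order + 1):
        basis = np.zeros(order + 1); basis[k] = 1.0
        hk = hermeval(x, basis)           # He_k at the quadrature nodes
        coeffs.append(np.sum(w * y * hk) / factorial(k))  # <y,He_k>/||He_k||^2
    mean = coeffs[0]
    var = sum(c**2 * factorial(k) for k, c in enumerate(coeffs) if k > 0)
    return np.array(coeffs), mean, var

# toy "reactor" response to one uncertain input xi ~ N(0,1)
_, mu, var = nisp_1d(lambda xi: np.exp(0.1 * xi))
print(f"mean={mu:.4f} (exact {np.exp(0.005):.4f}), std={np.sqrt(var):.4f}")
```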

  9. Quantification of Covariance in Tropical Cyclone Activity across Teleconnected Basins

    NASA Astrophysics Data System (ADS)

    Tolwinski-Ward, S. E.; Wang, D.

    2015-12-01

    Rigorous statistical quantification of natural hazard covariance across regions has important implications for risk management, and is also of fundamental scientific interest. We present a multivariate Bayesian Poisson regression model for inferring the covariance in tropical cyclone (TC) counts across multiple ocean basins and across Saffir-Simpson intensity categories. Such covariability results from the influence of large-scale modes of climate variability on local environments that can alternately suppress or enhance TC genesis and intensification, and our model also simultaneously quantifies the covariance of TC counts with various climatic modes in order to deduce the source of inter-basin TC covariability. The model explicitly treats the time-dependent uncertainty in observed maximum sustained wind data, and hence the nominal intensity category of each TC. Differences in annual TC counts as measured by different agencies are also formally addressed. The probabilistic output of the model can be probed for answers to such questions as: - Does the relationship between different categories of TCs differ statistically by basin? - Which climatic predictors have significant relationships with TC activity in each basin? - Are the relationships between counts in different basins conditionally independent given the climatic predictors, or are there other factors at play affecting inter-basin covariability? - How can a portfolio of insured property be optimized across space to minimize risk? Although we present results of our model applied to TCs, the framework is generalizable to covariance estimation between multivariate counts of natural hazards across regions and/or across peril types.
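
    The core of such a model is a Poisson regression of annual counts on climate covariates. The sketch below fits a single-basin maximum a posteriori version with a Gaussian prior on the coefficients; the paper's model is multivariate and fully Bayesian with observation-uncertainty treatment, which this toy does not attempt.

```python
import numpy as np
from scipy.optimize import minimize

def poisson_map(X, y, prior_sd=2.0):
    """MAP estimate for a Poisson regression log(lambda) = X @ beta
    with independent Gaussian priors on the coefficients."""
    def neg_log_post(beta):
        eta = X @ beta
        return np.sum(np.exp(eta) - y * eta) + 0.5 * np.sum((beta / prior_sd)**2)
    res = minimize(neg_log_post, np.zeros(X.shape[1]), method="BFGS")
    return res.x

# toy example: annual TC counts driven by an ENSO-like index
rng = np.random.default_rng(1)
enso = rng.normal(size=40)
X = np.column_stack([np.ones(40), enso])
y = rng.poisson(np.exp(1.5 - 0.4 * enso))     # suppression in warm years
print("intercept, ENSO coefficient:", poisson_map(X, y).round(2))
```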

  10. Automated quantification of one-dimensional nanostructure alignment on surfaces.

    PubMed

    Dong, Jianjin; Goldthorpe, Irene A; Abukhdeir, Nasser Mohieddin

    2016-06-10

    A method for automated quantification of the alignment of one-dimensional (1D) nanostructures from microscopy imaging is presented. Nanostructure alignment metrics are formulated and shown to be able to rigorously quantify the orientational order of nanostructures within a two-dimensional domain (surface). A complementary image processing method is also presented which enables robust processing of microscopy images where overlapping nanostructures might be present. Scanning electron microscopy (SEM) images of nanowire-covered surfaces are analyzed using the presented methods, and it is shown that past single-parameter alignment metrics are insufficient for highly aligned domains. Through the use of multiple-parameter alignment metrics, automated quantitative analysis of SEM images is shown to be possible, and the alignment characteristics of different samples can be quantitatively compared using a similarity metric. The results of this work provide researchers in nanoscience and nanotechnology with a rigorous method for the determination of structure/property relationships, where alignment of 1D nanostructures is significant. PMID:27119552
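
    A standard starting point for such metrics is the planar nematic order parameter, which maps a set of nanostructure orientations onto a single number between 0 (isotropic) and 1 (perfectly aligned). A minimal sketch of that classic scalar metric follows; note the paper argues that one such scalar is insufficient for highly aligned domains and supplements it with further parameters.

```python
import numpy as np

def nematic_order_2d(theta):
    """Planar orientational order for headless 1D nanostructures.
    theta: orientation angles in radians (theta and theta+pi equivalent).
    Returns S in [0, 1] and the mean director angle."""
    c, s = np.cos(2 * theta), np.sin(2 * theta)
    S = np.hypot(c.mean(), s.mean())
    director = 0.5 * np.arctan2(s.mean(), c.mean())
    return S, director

rng = np.random.default_rng(2)
aligned = rng.normal(0.3, 0.05, 500)            # nanowires near 0.3 rad
isotropic = rng.uniform(0, np.pi, 500)          # random orientations
print("aligned:   S = %.2f" % nematic_order_2d(aligned)[0])
print("isotropic: S = %.2f" % nematic_order_2d(isotropic)[0])
```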

  11. Electromagnetomechanical elastodynamic model for Lamb wave damage quantification in composites

    NASA Astrophysics Data System (ADS)

    Borkowski, Luke; Chattopadhyay, Aditi

    2014-03-01

    Physics-based wave propagation computational models play a key role in structural health monitoring (SHM) and the development of improved damage quantification methodologies. Guided waves (GWs), such as Lamb waves, provide the capability to monitor large plate-like aerospace structures with limited actuators and sensors and are sensitive to small-scale damage; however, due to the complex nature of GWs, accurate and efficient computational tools are necessary to investigate the mechanisms responsible for dispersion, coupling, and interaction with damage. In this paper, the local interaction simulation approach (LISA) coupled with the sharp interface model (SIM) solution methodology is used to solve the fully coupled electro-magneto-mechanical elastodynamic equations for the piezoelectric and piezomagnetic actuation and sensing of GWs in fiber reinforced composite material systems. The final framework provides the full three-dimensional displacement as well as electrical and magnetic potential fields for arbitrary plate and transducer geometries and excitation waveform and frequency. The model is validated experimentally and proven computationally efficient for a laminated composite plate. Studies are performed with surface-bonded piezoelectric and embedded piezomagnetic sensors to gain insight into the physics of experimental techniques used for SHM. The symmetric collocation of piezoelectric actuators is modeled to demonstrate mode suppression in laminated composites for the purpose of damage detection. The effect of delamination and damage (i.e., matrix cracking) on the GW propagation is demonstrated and quantified. The developed model provides a valuable tool for the improvement of SHM techniques due to its proven accuracy and computational efficiency.

  12. Arrays on disc for screening and quantification of pollutants.

    PubMed

    Navarro, Patricia; Morais, Sergi; Gabaldón, Jose A; Pérez, Antonio J; Puchades, Rosa; Maquieira, Angel

    2013-06-19

    A rapid compact-disc-based methodology for screening and quantification of organic pollutants in mandarin juices is presented. The assay is based on the indirect competitive coating-conjugate principle and developed in a disc-array configuration. Detection is based on the acquisition of attenuated reflective signals proportional to the optical density of the immunoreaction product. The competitive assay is applied to quantify, simultaneously and selectively, non-systemic insecticides in mandarin juices. The detection limits were 0.2 and 0.1 μg L⁻¹ and the sensitivities 2.1 and 1.5 μg L⁻¹ for chlorpyrifos and fenthion, respectively. Pollutants were directly quantified after sample dilution in a total time of 40 min. The implementation of positive and negative controls in the array configuration also served as an automatic quality-control test. The effect of thermal treatment on pesticide dissipation was studied and found to be insignificant under the conditions examined. Recoveries ranged from 96-105% and 94-103% for chlorpyrifos and fenthion, respectively, and were similar to those obtained with gas chromatography coupled to mass spectrometry. In the current configuration, 64 samples can be analyzed simultaneously on a disc at a very competitive cost, demonstrating the method's potential for high-throughput multiplexed screening in monitoring programs run in low-level labs or outside the lab setting. PMID:23746409
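
    Competitive immunoassays like this one are commonly calibrated with a four-parameter logistic (4PL) curve and samples are quantified by inverting it. The sketch below shows that generic workflow with invented calibration numbers; it is not the paper's calibration data or fitting procedure.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    """4-parameter logistic: a = zero-dose signal, d = high-dose asymptote,
    c = IC50, b = slope (competitive assays give decreasing curves)."""
    return d + (a - d) / (1.0 + (x / c) ** b)

def inverse_4pl(y, a, b, c, d):
    """Back-calculate concentration from a measured signal."""
    return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

# toy competitive-immunoassay calibration (signal drops with concentration)
conc = np.array([0.1, 0.3, 1, 3, 10, 30, 100])            # ug/L standards
sig = np.array([1.02, 0.98, 0.85, 0.60, 0.32, 0.15, 0.08])
popt, _ = curve_fit(four_pl, conc, sig, p0=[1.0, 1.0, 3.0, 0.05])
print("IC50 ~ %.2f ug/L; signal 0.5 -> %.2f ug/L"
      % (popt[2], inverse_4pl(0.5, *popt)))
```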

  13. NMR method for accurate quantification of polysorbate 80 copolymer composition.

    PubMed

    Zhang, Qi; Wang, Aifa; Meng, Yang; Ning, Tingting; Yang, Huaxin; Ding, Lixia; Xiao, Xinyue; Li, Xiaodong

    2015-10-01

    ¹³C NMR spectroscopic integration employing short relaxation delays and a 30° pulse width was evaluated as a quantitative tool for analyzing the components of polysorbate 80. ¹³C NMR analysis revealed that commercial polysorbate 80 formulations are a complex oligomeric mixture of sorbitan polyethoxylate esters and other intermediates, such as isosorbide polyethoxylate esters and poly(ethylene glycol) (PEG) esters. This novel approach facilitates the quantification of the component ratios. In this study, the ratios of the three major oligomers in polysorbate 80 were measured and the PEG series was found to be the major component of commercial polysorbate 80. The degree of polymerization of -CH₂CH₂O- groups and the ratio of free to bonded -CH₂CH₂O- end groups, which correlate with the hydrophilic/hydrophobic nature of the polymer, were analyzed, and were suggested to be key factors for assessing the likelihood of adverse biological reactions to polysorbate 80. The ¹³C NMR data suggest that the feed ratio of raw materials and reaction conditions in the production of polysorbate 80 are not well controlled. Our results demonstrate that ¹³C NMR is a universal, powerful tool for polysorbate analysis. Such analysis is crucial for the synthesis of a high-quality product, and is difficult to obtain by other methods. PMID:26356097

  14. Cloning and quantification of ferret serum amyloid A.

    PubMed

    Aratani, Hitoshi; Segawa, Takao; Itou, Takuya; Sakai, Takeo

    2013-01-31

    Serum amyloid A (SAA) is used as a biomarker for infections and inflammation in humans and veterinary medicine. We cloned ferret cDNA encoding SAA from the liver of a ferret via reverse transcription PCR (RT-PCR). The sequence of the cDNA clone revealed that ferret SAA has an open reading frame of 387 bp that encodes 129 amino acids. The deduced amino acid sequence of ferret SAA has 96.1, 89.9, 86.0, 83.8, 83.0, 73.8 and 65.3% similarity to the mink, dog, cat, cattle, horse, human and mouse SAA genes, respectively. Compared to human SAA, the deduced ferret SAA amino acid sequence had an insertion of an 8-amino acid fragment between amino acids 88 and 95. Recombinant ferret SAA (rfrSAA) was expressed using an Escherichia coli (E. coli) strain, BL21 Star. Using Western blot analysis, anti-SAA mAb provided with the multispecies SAA ELISA kit reacted with purified rfrSAA. A significant dose-response relationship was observed between the rfrSAA protein and a commercial multispecies SAA ELISA kit. In contrast, rfrSAA was not recognized with the antibodies included in a commercial human SAA ELISA kit. These results suggest that the structure of ferret SAA is antigenically similar to other domestic animal SAAs, and the multispecies ELISA kit allows for the detection and quantification of ferret SAA in vivo. PMID:22972465

  15. Significance of DNA quantification in testicular germ cell tumors.

    PubMed

    Codesal, J; Paniagua, R; Regadera, J; Fachal, C; Nistal, M

    1991-01-01

    A cytophotometric quantification of DNA in tumor cells was performed on histological sections of orchidectomy specimens from 36 men with testicular germ cell tumors (TGCT), 7 of them showing more than one tumor type. Among the seminoma variants (classic and spermatocytic), the lowest DNA content was found in spermatocytic seminoma. With respect to non-seminomatous tumors (yolk sac tumor, embryonal carcinoma, teratoma, and choriocarcinoma), choriocarcinomas showed the highest DNA content, and the lowest value was found in teratomas. No significant differences were found between the average DNA content of seminomas (all types) and non-seminomatous tumors (all types). Both embryonal carcinoma and yolk sac tumor showed similar DNA content whether they occurred as the sole tumor or in association with other tumor types. In this study, except for the 4 cases of teratoma and the case of spermatocytic seminoma, none of the TGCT examined showed modal DNA content values in the diploid range. Such an elevated frequency of aneuploidy in these tumors may be helpful for their diagnosis. PMID:1666273

  16. An Uncertainty Quantification System for Tabular Equations of State

    NASA Astrophysics Data System (ADS)

    Carpenter, John; Robinson, Allen; Debusschere, Bert; Mattsson, Ann; Drake, Richard; Rider, William

    2013-06-01

    Providing analysts with information regarding the accuracy of computational models is key for enabling predictive design and engineering. Uncertainty in material models can make significant contributions to the overall uncertainty in calculations. As a first step toward tackling this large problem, we present an uncertainty quantification system for tabular equations of state (EOS). First, a posterior distribution of EOS model parameters is inferred using Bayes' rule and a set of experimental and computational data. EOS tables are generated for parameter states sampled from the posterior distribution. A new unstructured triangular table format allows for capturing multi-phase model behavior. A principal component analysis then reduces this set of tables to a mean table and its most significant perturbations. This final set of tables is provided to hydrocodes for performing simulations using standard non-intrusive uncertainty propagation methods. A multi-phase aluminum model is used to demonstrate the system. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
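
    The table-reduction step can be pictured with a small SVD: subtract the ensemble mean, keep the leading singular vectors as perturbation modes, and hand the mean-plus-modes representation to downstream codes. A toy sketch with synthetic tables; the array shapes and the variance printout are illustrative only.

```python
import numpy as np

def reduce_table_ensemble(tables, n_modes=3):
    """PCA (via SVD) of an ensemble of sampled EOS tables: returns the mean
    table, the leading perturbation modes, and their explained-variance
    fractions. tables: array (n_samples, n_points) of flattened tables."""
    mean = tables.mean(axis=0)
    U, s, Vt = np.linalg.svd(tables - mean, full_matrices=False)
    frac = s**2 / np.sum(s**2)
    return mean, Vt[:n_modes], frac[:n_modes]

# toy ensemble: 50 posterior samples of a 200-point pressure curve whose
# shape varies through two underlying parameters
rng = np.random.default_rng(3)
grid = np.linspace(0.0, 1.0, 200)
samples = np.array([(1 + 0.05*a) * grid**2 + 0.01*b * grid
                    for a, b in rng.normal(size=(50, 2))])
mean, modes, frac = reduce_table_ensemble(samples)
print("explained variance of first 3 modes:", frac.round(3))
```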

  17. Quantification of sugars in breakfast cereals using capillary electrophoresis.

    PubMed

    Toutounji, Michelle R; Van Leeuwen, Matthew P; Oliver, James D; Shrestha, Ashok K; Castignolles, Patrice; Gaborieau, Marianne

    2015-05-18

    About 80% of the Australian population consumes breakfast cereal (BC) at least five days a week. With high prevalence rates of obesity and other diet-related diseases, improved methods for monitoring sugar levels in breakfast cereals would be useful in nutrition research. The heterogeneity of the complex BC matrix can make carbohydrate analysis challenging or necessitate tedious sample preparation, leading to potential sugar loss or starch degradation into sugars. A recently established, simple and robust free-solution capillary electrophoresis (CE) method was applied, in a new application, to 13 BCs available in Australia and compared with several established methods for the quantification of carbohydrates. Carbohydrates identified in the BCs by CE included sucrose, maltose, glucose and fructose. The CE method is simple, requiring no sample preparation or derivatization, and carbohydrates are detected by direct UV detection. CE was shown to be a more robust and accurate method for measuring carbohydrates than the Fehling method, the DNS (3,5-dinitrosalicylic acid) assay and HPLC (high-performance liquid chromatography). PMID:25841355

  18. Interactive image quantification tools in nuclear material forensics

    SciTech Connect

    Porter, Reid B; Ruggiero, Christy; Hush, Don; Harvey, Neal; Kelly, Pat; Scoggins, Wayne; Tandon, Lav

    2011-01-03

    Morphological and microstructural features visible in microscopy images of nuclear materials can give information about the processing history of a nuclear material. Extraction of these attributes currently requires a subject matter expert in both microscopy and nuclear material production processes, and is a time-consuming and at least partially manual task, often involving multiple software applications. One of the primary goals of computer vision is to find ways to extract and encode domain knowledge associated with imagery so that parts of this process can be automated. In this paper we describe a user-in-the-loop approach to the problem which attempts to both improve the efficiency of domain experts during image quantification as well as capture their domain knowledge over time. This is accomplished through a sophisticated user-monitoring system that accumulates user-computer interactions as users exploit their imagery. We provide a detailed discussion of the interactive feature extraction and segmentation tools we have developed and describe our initial results in exploiting the recorded user-computer interactions to improve user productivity over time.

  19. Interactive image quantification tools in nuclear material forensics

    NASA Astrophysics Data System (ADS)

    Porter, Reid; Ruggiero, Christy; Hush, Don; Harvey, Neal; Kelly, Patrick; Scoggins, Wayne; Tandon, Lav

    2011-03-01

    Morphological and microstructural features visible in microscopy images of nuclear materials can give information about the processing history of a nuclear material. Extraction of these attributes currently requires a subject matter expert in both microscopy and nuclear material production processes, and is a time-consuming and at least partially manual task, often involving multiple software applications. One of the primary goals of computer vision is to find ways to extract and encode domain knowledge associated with imagery so that parts of this process can be automated. In this paper we describe a user-in-the-loop approach to the problem which attempts to both improve the efficiency of domain experts during image quantification as well as capture their domain knowledge over time. This is accomplished through a sophisticated user-monitoring system that accumulates user-computer interactions as users exploit their imagery. We provide a detailed discussion of the interactive feature extraction and segmentation tools we have developed and describe our initial results in exploiting the recorded user-computer interactions to improve user productivity over time.

  20. Xanthine derivatives quantification in serum by capillary zone electrophoresis.

    PubMed

    Peris-Vicente, Juan; Rambla-Alegre, Maria; Durgavanshi, Abhilasha; Bose, Devasish; Esteve-Romero, Josep; Marco-Peiró, Sergio

    2014-10-01

    A capillary electrophoresis method was developed to quantify caffeine and theophylline, xanthine derivatives with bronchodilator activity. Buffer concentration, pH and applied voltage were optimized using a face-centred central composite design. Separation conditions were: silica capillary tube, 75 μm (i.d.) and 61 cm (total length); absorbance detection at 280 nm; borate buffer, 20 mM, pH 9.0; applied voltage, 25 kV; and injection at 1 psi for 8 s. Validation was performed in blank serum following the International Conference on Harmonisation guidelines: resolution (peaks without overlapping), linear range (0.125-50 µg/mL; r² > 0.9999), limits of detection and quantification (10 and 33 ppb for caffeine; 20 and 66 ppb for theophylline), intra- and inter-day precision (relative standard deviation lower than 1.9%) and accuracy (98-101%). Migration times were <8 min. The method is simple, specific and suitable, and yielded high label claims (98.7-100.4%) in the analysis of pharmaceutical formulations. Moreover, the method was applied to the monitoring of the analytes in serum of patients. PMID:24220991
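
    Limits of the kind quoted here follow from the calibration line via the standard ICH formulas LOD = 3.3 σ/S and LOQ = 10 σ/S, with σ the residual standard deviation and S the slope. A minimal sketch with invented calibration points, not the paper's data:

```python
import numpy as np

def lod_loq(conc, signal):
    """ICH-style detection/quantification limits from a linear calibration:
    LOD = 3.3*sigma/S, LOQ = 10*sigma/S, where sigma is the residual
    standard deviation and S the calibration slope."""
    slope, intercept = np.polyfit(conc, signal, 1)
    resid = signal - (slope * conc + intercept)
    sigma = resid.std(ddof=2)                # 2 fitted parameters
    return 3.3 * sigma / slope, 10 * sigma / slope

# toy CE calibration: peak area vs concentration (ug/mL)
conc = np.array([0.125, 0.5, 2.0, 10.0, 25.0, 50.0])
area = 4.0 * conc + np.array([0.02, -0.05, 0.04, -0.10, 0.08, -0.02])
lod, loq = lod_loq(conc, area)
print(f"LOD = {lod:.3f} ug/mL, LOQ = {loq:.3f} ug/mL")
```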

  1. In vivo cell tracking and quantification method in adult zebrafish

    NASA Astrophysics Data System (ADS)

    Zhang, Li; Alt, Clemens; Li, Pulin; White, Richard M.; Zon, Leonard I.; Wei, Xunbin; Lin, Charles P.

    2012-03-01

    Zebrafish have become a powerful vertebrate model organism for drug discovery, cancer and stem cell research. A recently developed transparent adult zebrafish double-pigmentation mutant, called casper, provides unparalleled imaging power for in vivo longitudinal analysis of biological processes at an anatomic resolution not readily achievable in murine or other systems. In this paper we introduce an optical method for simultaneous visualization and cell quantification, which combines laser scanning confocal microscopy (LSCM) and in vivo flow cytometry (IVFC). The system is designed specifically for non-invasive tracking of both stationary and circulating cells in adult casper zebrafish, under physiological conditions in the same fish over time. The confocal imaging part of this system serves the dual purposes of imaging fish tissue microstructure and providing a 3D navigation tool to locate a suitable vessel for circulating cell counting. The multi-color, multi-channel instrument allows the detection of multiple cell populations, tissues or organs simultaneously. We demonstrate initial testing of this novel instrument by imaging vasculature and tracking circulating cells in CD41:GFP/Gata1:DsRed transgenic casper fish, whose thrombocytes/erythrocytes express the green and red fluorescent proteins. Circulating fluorescent cell incidents were recorded and counted repeatedly over time and in different types of vessels. Promising applications in cancer and stem cell research are discussed.

  2. Uncertainty Quantification applied to flow simulations in thoracic aortic aneurysms

    NASA Astrophysics Data System (ADS)

    Boccadifuoco, Alessandro; Mariotti, Alessandro; Celi, Simona; Martini, Nicola; Salvetti, Maria Vittoria

    2015-11-01

    The thoracic aortic aneurysm is a progressive dilatation of the thoracic aorta that weakens the aortic wall and may eventually cause life-threatening events. Clinical decisions on treatment strategies are currently based on empirical criteria, such as the aortic diameter or its growth rate. Numerical simulations can quantify important indexes that are impossible to obtain through in vivo measurements and can thus provide supplementary information. Hemodynamic simulations are carried out using the open-source tool SimVascular on patient-specific geometries. One of the main issues in these simulations is the choice of suitable boundary conditions, which model the organs and vessels not included in the computational domain. The current practice is to use outflow conditions based on resistance and capacitance, whose values are tuned to obtain a physiological behavior of the patient pressure. However, it is not known a priori how this choice affects the results of the simulation. The impact of the uncertainties in these outflow parameters is investigated here using the generalized Polynomial Chaos approach. This analysis also permits calibration of the outflow-boundary parameters when patient-specific in vivo data are available.

  3. Bioluminescence regenerative cycle (BRC) system for nucleic acid quantification assays

    NASA Astrophysics Data System (ADS)

    Hassibi, Arjang; Lee, Thomas H.; Davis, Ronald W.; Pourmand, Nader

    2003-07-01

    A new label-free methodology for nucleic acid quantification has been developed, in which the number of pyrophosphate molecules (PPi) released during polymerization of the target nucleic acid is counted and correlated to DNA copy number. The technique uses the enzymatic complex of ATP-sulfurylase and firefly luciferase to generate photons from PPi. An enzymatic unity-gain positive feedback is also implemented to regenerate the photon generation process and compensate for any decay in light intensity by self-regulation. Due to this positive feedback, the total number of photons generated by the bioluminescence regenerative cycle (BRC) can potentially be orders of magnitude higher than in typical chemiluminescent processes. A system-level kinetic model that incorporates the effects of contamination and detector noise was used to show that the photon generation process is in fact steady and proportional to the nucleic acid quantity. Here we show that BRC is capable of detecting quantities of DNA as low as 1 amol (10⁻¹⁸ mol) in 40 μL aqueous solutions, and that this enzymatic assay has a controllable dynamic range of 5 orders of magnitude. The sensitivity of this technology, owing to the excess number of photons generated by the regenerative cycle, is not constrained by detector performance, but rather by possible PPi or ATP (adenosine triphosphate) contamination, or by the background bioluminescence of the enzymatic complex.

  4. Quantification of Nociceptive Escape Response in C. elegans

    NASA Astrophysics Data System (ADS)

    Leung, Kawai; Mohammadi, Aylia; Ryu, William; Nemenman, Ilya

    2013-03-01

    Animals cannot rank and communicate their pain consciously. Thus, in pain studies on animal models, one must infer the pain level from high-precision experimental characterization of behavior. This is not trivial, since behaviors are complex and multidimensional. Here we explore the feasibility of C. elegans as a model for pain transduction. The nematode has a robust, neurally mediated noxious escape response, which we show to be partially decoupled from other sensory behaviors. We developed a nociceptive behavioral response assay that allows us to apply controlled levels of pain by locally heating worms with an IR laser. The worms' motions are captured by machine vision with high spatiotemporal resolution. The resulting behavioral quantification allows us to build a statistical model for inferring the experienced pain level from the behavioral response. Based on the measured nociceptive escape of over 400 worms, we conclude that none of the simple characteristics of the response is a reliable indicator of the laser pulse strength. Nonetheless, a more reliable statistical inference of the pain stimulus level from the measured behavior is possible using a complexity-controlled regression model that takes into account the entire worm behavioral output. This work was partially supported by NSF grant No. IOS/1208126 and HFSP grant No. RGY0084/2011.
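
    "Complexity-controlled regression" here means penalizing model flexibility and letting cross-validation choose the penalty. A minimal sketch of that idea using ridge regression with closed-form leave-one-out error; the feature matrix and stimulus values are synthetic stand-ins, not the worm data.

```python
import numpy as np

def loo_ridge(X, y, lambdas):
    """Ridge regression with leave-one-out CV to control model complexity:
    returns (lambda, weights) minimizing the LOO mean squared error."""
    X = np.column_stack([np.ones(len(y)), X])      # bias term
    best = (np.inf, None)
    for lam in lambdas:
        A = X.T @ X + lam * np.eye(X.shape[1])
        H = X @ np.linalg.solve(A, X.T)
        resid = (y - H @ y) / (1.0 - np.diag(H))   # closed-form LOO residuals
        mse = np.mean(resid**2)
        if mse < best[0]:
            best = (mse, (lam, np.linalg.solve(A, X.T @ y)))
    return best[1]

# toy stimulus inference: many behavioral features, few informative
rng = np.random.default_rng(4)
X = rng.normal(size=(400, 30))                     # behavioral feature matrix
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(0, 0.5, 400)  # laser power
lam, w = loo_ridge(X, y, lambdas=10.0 ** np.arange(-3, 4))
print(f"chosen lambda = {lam:g}; leading weights: {w[1:4].round(2)}")
```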

  5. Accurate quantification of cells recovered by bronchoalveolar lavage.

    PubMed

    Saltini, C; Hance, A J; Ferrans, V J; Basset, F; Bitterman, P B; Crystal, R G

    1984-10-01

    Quantification of the differential cell count and total number of cells recovered from the lower respiratory tract by bronchoalveolar lavage is a valuable technique for evaluating the alveolitis of patients with inflammatory disorders of the lower respiratory tract. The most commonly used technique for the evaluation of cells recovered by lavage has been to concentrate cells by centrifugation and then to determine the total cell number using a hemocytometer and the differential cell count from a Wright-Giemsa-stained cytocentrifuge preparation. However, we have noted that the percentage of small cells present in the original cell suspension recovered by lavage is greater than the percentage of lymphocytes identified on cytocentrifuge preparations. Therefore, we developed procedures for determining differential cell counts on lavage cells collected on Millipore filters and stained with hematoxylin-eosin (filter preparations), and compared the results of differential cell counts performed on filter preparations with those obtained using cytocentrifuge preparations. When cells recovered by lavage were collected on filter preparations, accurate differential cell counts were obtained, as confirmed by performing differential cell counts on cell mixtures of known composition, and by comparing differential cell counts obtained using filter preparations stained with hematoxylin-eosin with those obtained using filter preparations stained with a peroxidase cytochemical stain. The morphology of cells displayed on filter preparations was excellent, and interobserver variability in quantitating cell types recovered by lavage was less than 3%.(ABSTRACT TRUNCATED AT 250 WORDS) PMID:6385789

  6. Mesh refinement for uncertainty quantification through model reduction

    NASA Astrophysics Data System (ADS)

    Li, Jing; Stinis, Panos

    2015-01-01

    We present a novel way of deciding when and where to refine a mesh in probability space in order to facilitate uncertainty quantification in the presence of discontinuities in random space. A discontinuity in random space makes the application of generalized polynomial chaos expansion techniques prohibitively expensive, because the expansion converges very slowly for discontinuous problems. An alternative to using higher-order terms in the expansion is to divide the random space into smaller elements where a lower-degree polynomial is adequate to describe the randomness. In general, the partition of the random space is a dynamic process, since some areas of the random space, particularly around the discontinuity, need more refinement than others as time evolves. In the current work we propose a way to decide when and where to refine the random-space mesh based on the use of a reduced model. The idea is that a good reduced model can accurately monitor, within a random-space element, the cascade of activity to higher-degree terms in the chaos expansion. In turn, this facilitates the efficient allocation of computational resources to the areas of random space where they are most needed. For the Kraichnan-Orszag system, the prototypical system for studying discontinuities in random space, we present theoretical results which show why the proposed method is sound and numerical results which corroborate the theory.
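
    One way to picture such a refinement decision: within each random-space element, monitor how much spectral energy leaks into the highest-degree chaos terms and split the element when that fraction exceeds a tolerance. The sketch below uses a simple tail-energy criterion as a stand-in for the paper's reduced-model monitor; the coefficients are invented.

```python
import numpy as np

def needs_refinement(coeffs, tol=1e-3):
    """Flag a random-space element for splitting when the energy in its
    highest-degree PC terms exceeds a tolerance fraction of the total
    spectral energy (a crude stand-in for a reduced-model monitor)."""
    energy = coeffs**2
    return energy[-2:].sum() / energy.sum() > tol

# toy: PC coefficients decay fast on a smooth element, slowly near a
# discontinuity, which is what triggers refinement there
smooth = np.array([1.0, 0.3, 0.05, 0.006, 0.0005])
rough  = np.array([1.0, 0.8, 0.6, 0.5, 0.45])
print("smooth element refine?", needs_refinement(smooth))   # False
print("rough element refine? ", needs_refinement(rough))    # True
```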

  7. Integrated Assessment Modeling for Carbon Storage Risk and Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Bromhal, G. S.; Dilmore, R.; Pawar, R.; Stauffer, P. H.; Gastelum, J.; Oldenburg, C. M.; Zhang, Y.; Chu, S.

    2013-12-01

    The National Risk Assessment Partnership (NRAP) has developed tools to perform quantitative risk assessment at site-specific locations for long-term carbon storage. The approach is to divide the storage and containment system into components (e.g., reservoirs, seals, wells, groundwater aquifers), to develop detailed models for each component, to generate reduced order models (ROMs) based on the detailed models, and to reconnect the reduced order models within an integrated assessment model (IAM). CO2-PENS, developed at Los Alamos National Lab, is used as the IAM for the simulations in this study. The benefit of this approach is that simulations of the complete system can be generated rapidly enough for Monte Carlo simulation to be performed. In this study, hundreds of thousands of runs of the IAMs have been generated to estimate likelihoods of the quantity of CO2 released to the atmosphere, the size of the aquifer impacted by pH, the size of the aquifer impacted by TDS, and the size of the aquifer with different metals concentrations. Correlations of the output variables with different reservoir, seal, wellbore, and aquifer parameters have been generated. Importance measures have been identified, and inputs have been ranked in order of their impact on the output quantities. The presentation will describe the approach used, representative results, and implications for how Monte Carlo analysis is implemented for uncertainty quantification.
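
    Input-importance measures of the kind mentioned are often computed directly from the Monte Carlo ensemble, for instance by ranking inputs by rank correlation with an output. A minimal sketch with a made-up three-parameter leakage model; NRAP's actual importance measures and component models may differ.

```python
import numpy as np
from scipy.stats import spearmanr

def rank_inputs(samples, output, names):
    """Rank uncertain inputs by |Spearman rank correlation| with an output,
    a simple importance measure for Monte Carlo ensembles."""
    rhos = []
    for j in range(samples.shape[1]):
        rho, _ = spearmanr(samples[:, j], output)
        rhos.append(rho)
    order = np.argsort(np.abs(rhos))[::-1]
    return [(names[j], round(rhos[j], 2)) for j in order]

# toy system: leakage driven mostly by wellbore permeability
rng = np.random.default_rng(5)
X = rng.uniform(size=(10000, 3))    # [well_perm, seal_thickness, porosity]
leak = X[:, 0]**3 * (1.2 - X[:, 1]) + 0.01 * rng.normal(size=10000)
print(rank_inputs(X, leak, ["well_perm", "seal_thickness", "porosity"]))
```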

  8. A heme fusion tag for protein affinity purification and quantification

    PubMed Central

    Asher, Wesley B; Bren, Kara L

    2010-01-01

    We report a novel affinity-based purification method for proteins expressed in Escherichia coli that uses the coordination of a heme tag to an l-histidine-immobilized sepharose (HIS) resin. This approach provides an affinity purification tag visible to the eye, facilitating tracking of the protein. We show that azurin and maltose binding protein are readily purified from cell lysate using the heme tag and HIS resin. Mild conditions are used; heme-tagged proteins are bound to the HIS resin in phosphate buffer, pH 7.0, and eluted by adding 200–500 mM imidazole or binding buffer at pH 5 or 8. The HIS resin exhibits a low level of nonspecific binding of untagged cellular proteins for the systems studied here. An additional advantage of the heme tag-HIS method for purification is that the heme tag can be used for protein quantification by using the pyridine hemochrome absorbance method for heme concentration determination. PMID:20665691

  9. Micelle Mediated Trace Level Sulfide Quantification through Cloud Point Extraction

    PubMed Central

    Devaramani, Samrat; Malingappa, Pandurangappa

    2012-01-01

    A simple cloud point extraction protocol has been proposed for the quantification of sulfide at trace level. The method is based on the reduction of iron (III) to iron (II) by the sulfide and the subsequent complexation of metal ion with nitroso-R salt in alkaline medium. The resulting green-colored complex was extracted through cloud point formation using cationic surfactant, that is, cetylpyridinium chloride, and the obtained surfactant phase was homogenized by ethanol before its absorbance measurement at 710 nm. The reaction variables like metal ion, ligand, surfactant concentration, and medium pH on the cloud point extraction of the metal-ligand complex have been optimized. The interference effect of the common anions and cations was studied. The proposed method has been successfully applied to quantify the trace level sulfide in the leachate samples of the landfill and water samples from bore wells and ponds. The validity of the proposed method has been studied by spiking the samples with known quantities of sulfide as well as comparing with the results obtained by the standard method. PMID:22619597

  10. Quantification of regional myocardial wall motion by cardiovascular magnetic resonance

    PubMed Central

    Jiang, Kai

    2014-01-01

    Cardiovascular magnetic resonance (CMR) is a versatile tool that allows comprehensive and accurate measurement of both global and regional myocardial contraction. Quantification of regional wall motion parameters, such as strain, strain rate, twist and torsion, has been shown to be more sensitive to early-stage functional alterations. Since the invention of CMR tagging by magnetization saturation in 1988, several CMR techniques have been developed to enable the measurement of regional myocardial wall motion, including myocardial tissue tagging, phase contrast mapping, displacement encoding with stimulated echoes (DENSE), and strain encoded (SENC) imaging, each with its own advantages and limitations. In this review, two widely used and closely related CMR techniques, i.e., tissue tagging and DENSE, are discussed from the perspective of pulse sequence development and image-processing techniques. The clinical and preclinical applications of tissue tagging and DENSE in assessing wall motion mechanics in both normal and diseased hearts, including coronary artery disease, hypertrophic cardiomyopathy, aortic stenosis, and Duchenne muscular dystrophy, are discussed. PMID:25392821

  11. Subspace-based Inverse Uncertainty Quantification for Nuclear Data Assessment

    NASA Astrophysics Data System (ADS)

    Khuwaileh, B. A.; Abdel-Khalik, H. S.

    2015-01-01

    Safety analysis and design optimization depend on the accurate prediction of various reactor attributes. Predictions can be enhanced by reducing the uncertainty associated with the attributes of interest. An inverse problem can be defined and solved to assess the sources of uncertainty, and experimental effort can be subsequently directed to further improve the uncertainty associated with these sources. In this work a subspace-based algorithm for inverse sensitivity/uncertainty quantification (IS/UQ) has been developed to enable analysts to account for all sources of nuclear data uncertainties in support of target accuracy assessment-type analysis. An approximate analytical solution of the optimization problem is used to guide the search for the dominant uncertainty subspace. By limiting the search to a subspace, the degrees of freedom available for the optimization search are significantly reduced. A quarter PWR fuel assembly is modeled and the accuracy of the multiplication factor and the fission reaction rate are used as reactor attributes whose uncertainties are to be reduced. Numerical experiments are used to demonstrate the computational efficiency of the proposed algorithm. Our ongoing work is focusing on extending the proposed algorithm to account for various forms of feedback, e.g., thermal-hydraulics and depletion effects.

  12. Sulfathiazole: analytical methods for quantification in seawater and macroalgae.

    PubMed

    Leston, Sara; Nebot, Carolina; Nunes, Margarida; Cepeda, Alberto; Pardal, Miguel Ângelo; Ramos, Fernando

    2015-01-01

    The awareness of the interconnection between pharmaceutical residues, human health, and aquaculture has highlighted concern about the potential harmful effects such residues can induce. To better understand the consequences, more research is needed, which in turn requires new methodologies for the detection and quantification of pharmaceuticals. Antibiotics are a major class of drugs included in the designation of emerging contaminants, representing a high risk to natural ecosystems. Among the most prescribed are sulfonamides, and sulfathiazole was the compound selected for investigation in this study. In the environment, macroalgae are an important group of producers, continuously exposed to contaminants, with a significant role in the trophic web. Due to these characteristics they are already under scrutiny as potential bioindicators. The present study describes two new liquid-chromatography-based methodologies for the determination of sulfathiazole in seawater and in the green macroalga Ulva lactuca. Both methods were validated according to international standards, with MS/MS detection showing higher sensitivity, as expected, with LODs of 2.79 ng/g and 1.40 ng/mL for algae and seawater, respectively. For UV detection the corresponding values were 2.83 μg/g and 2.88 μg/mL, making it more suitable for samples originating from more contaminated sites. The methods were also successfully applied to experimental samples, and the results show that macroalgae have potential as indicators of contamination. PMID:25473819

  13. Impact Induced Delamination Detection and Quantification With Guided Wavefield Analysis

    NASA Technical Reports Server (NTRS)

    Tian, Zhenhua; Leckey, Cara A. C.; Yu, Lingyu; Seebo, Jeffrey P.

    2015-01-01

    This paper studies impact-induced delamination detection and quantification using guided wavefield data and spatial wavenumber imaging. A complex-geometry, impact-like delamination is created through quasi-static indentation of a CFRP plate. To detect and quantify the impact delamination in the CFRP plate, PZT-SLDV sensing and spatial wavenumber imaging are performed. In the PZT-SLDV sensing, guided waves are generated by the PZT, and high-spatial-resolution guided wavefields are measured by the SLDV. The guided wavefield data acquired from the PZT-SLDV sensing represent guided wave propagation in the composite laminate, including guided wave interaction with the delamination damage. The measured wavefields are analyzed with the spatial wavenumber imaging method, which generates an image containing the dominant local wavenumber at each spatial location. For a simple single-layer Teflon-insert delamination, the spatial wavenumber imaging result provided quantitative information on damage size and location: the delaminated region is indicated by the area of larger wavenumbers in the spatial wavenumber image. For the impact-like delamination, the results only partially agreed with the actual damage size and shape, and also demonstrated a dependence on excitation frequency. Future work will further investigate the accuracy of the wavenumber imaging method for real composite damage and its dependence on excitation frequency.
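
    The essence of spatial wavenumber imaging can be sketched with a windowed 2D FFT: at every pixel, take the spectrum of a small neighborhood of the wavefield snapshot and record the wavenumber of the spectral peak. The window size, grid, and wavenumbers below are invented for illustration; the paper's implementation details may differ.

```python
import numpy as np

def spatial_wavenumber_map(field, dx, win=16):
    """Dominant local wavenumber magnitude (rad/m) at each interior pixel
    of a 2D wavefield snapshot, via a Hann-windowed local 2D FFT."""
    ny, nx = field.shape
    kmap = np.zeros((ny, nx))
    k1d = 2 * np.pi * np.fft.fftfreq(win, d=dx)
    K = np.hypot(*np.meshgrid(k1d, k1d))
    w2d = np.hanning(win)[:, None] * np.hanning(win)[None, :]
    h = win // 2
    for i in range(h, ny - h):
        for j in range(h, nx - h):
            spec = np.abs(np.fft.fft2(field[i-h:i+h, j-h:j+h] * w2d))
            spec[0, 0] = 0.0                     # suppress the DC component
            kmap[i, j] = K.flat[np.argmax(spec)]
    return kmap

# toy snapshot: background k = 800 rad/m, a patch at k = 1600 rad/m stands
# in for the locally shorter wavelength above a delamination
x = np.arange(128) * 1e-3                        # 1 mm grid
X, _ = np.meshgrid(x, x)
field = np.sin(800 * X)
field[40:70, 40:70] = np.sin(1600 * X[40:70, 40:70])
kmap = spatial_wavenumber_map(field, dx=1e-3)
print("background k ~ %.0f rad/m, damage k ~ %.0f rad/m"
      % (kmap[100, 100], kmap[55, 55]))
```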

  14. Comprehensive modeling of electrostatically actuated MEMS beams including uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Snow, Michael G.

    MEMS switches have offered dramatic improvements in the performance of RF systems. However, difficulties with reliability have slowed the adoption of MEMS switches in RF systems. These reliability issues are partly due to the poor manufacturing tolerances endemic to MEMS fabrication processes, which may cause significant variations in performance characteristics. This work focuses on electrostatically actuated MEMS beam capacitive shunt switches. A non-linear dynamic model for these switches was developed. The model accounts for a variety of physical effects, including beam stretching, residual stress, non-rigid boundary conditions, initial curvature, electrostatic fringing fields, finite electrodes, squeeze-film damping, and distributed contact. The effects of uncertain parameters on the outputs of the model are explored through response-surface-based uncertainty quantification techniques. The model accurately predicts the actuation voltages and switching times of these MEMS switches as well as the effects of uncertain parameters. The derived model is widely applicable and accurately reproduces the results of other models in the literature. Future researchers will be able to rapidly iterate designs and accurately understand the behavior of these switches.

  15. Quantification of hydroxyacetone and glycolaldehyde using chemical ionization mass spectrometry

    NASA Astrophysics Data System (ADS)

    St. Clair, J. M.; Spencer, K. M.; Beaver, M. R.; Crounse, J. D.; Paulot, F.; Wennberg, P. O.

    2014-04-01

    Chemical ionization mass spectrometry (CIMS) enables online, rapid, in situ detection and quantification of hydroxyacetone and glycolaldehyde. Two different CIMS approaches are demonstrated employing the strengths of single quadrupole mass spectrometry and triple quadrupole (tandem) mass spectrometry. Both methods are generally capable of the measurement of hydroxyacetone, an analyte with known but minimal isobaric interferences. Tandem mass spectrometry provides direct separation of the isobaric compounds glycolaldehyde and acetic acid using distinct, collision-induced dissociation daughter ions. The single quadrupole CIMS measurement of glycolaldehyde was demonstrated during the ARCTAS-CARB (Arctic Research of the Composition of the Troposphere from Aircraft and Satellites - California Air Resources Board) 2008 campaign, while triple quadrupole CIMS measurements of glycolaldehyde and hydroxyacetone were demonstrated during the BEARPEX (Biosphere Effects on Aerosols and Photochemistry Experiment) 2009 campaign. Enhancement ratios of glycolaldehyde in ambient biomass-burning plumes are reported for the ARCTAS-CARB campaign. BEARPEX observations are compared to simple photochemical box model predictions of biogenic volatile organic compound oxidation at the site.

  16. Atomic Resolution Imaging and Quantification of Chemical Functionality of Surfaces

    SciTech Connect

    Schwarz, Udo

    2014-12-10

    The work carried out from 2006-2014 under DoE support was targeted at developing new approaches to the atomic-scale characterization of surfaces that include species-selective imaging and an ability to quantify chemical surface interactions with site-specific accuracy. The newly established methods were subsequently applied to gain insight into the local chemical interactions that govern the catalytic properties of model catalysts of interest to DoE. The foundation of our work was the development of three-dimensional atomic force microscopy (3D-AFM), a new measurement mode that allows the mapping of the complete surface force and energy fields with picometer resolution in space (x, y, and z) and piconewton/millielectron volts in force/energy. From this experimental platform, we further expanded by adding the simultaneous recording of tunneling current (3D-AFM/STM) using chemically well-defined tips. Through comparison with simulations, we were able to achieve precise quantification and assignment of local chemical interactions to exact positions within the lattice. During the course of the project, the novel techniques were applied to surface-oxidized copper, titanium dioxide, and silicon oxide. On these materials, defect-induced changes to the chemical surface reactivity and electronic charge density were characterized with site-specific accuracy.

  17. Uncertainty quantification for large-scale ocean circulation predictions.

    SciTech Connect

    Safta, Cosmin; Debusschere, Bert J.; Najm, Habib N.; Sargsyan, Khachik

    2010-09-01

    Uncertainty quantification in climate models is challenged by the sparsity of the available climate data, due to the high computational cost of the model runs. Another feature that prevents classical uncertainty analyses from being easily applicable is the bifurcative behavior in the climate data with respect to certain parameters. A typical example is the Meridional Overturning Circulation in the Atlantic Ocean. The maximum overturning stream function exhibits a discontinuity across a curve in the space of two uncertain parameters, namely climate sensitivity and CO₂ forcing. We develop a methodology that performs uncertainty quantification in the presence of limited data that have a discontinuous character. Our approach is two-fold. First, we detect the discontinuity location with Bayesian inference, thus obtaining a probabilistic representation of the discontinuity curve location in the presence of arbitrarily distributed input parameter values. Second, we developed a spectral approach that relies on Polynomial Chaos (PC) expansions on each side of the discontinuity curve, leading to an averaged-PC representation of the forward model that allows efficient uncertainty quantification and propagation. The methodology is tested on synthetic examples of discontinuous data with adjustable sharpness and structure.

  18. MALDI-TOF MS quantification of coccidiostats in poultry feeds.

    PubMed

    Wang, J; Sporns, P

    2000-07-01

    Matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) is a relatively new technique that is having a great impact on analyses. This study is the first to demonstrate the use of linear MALDI-TOF MS to identify and quantify coccidiostats in poultry feeds. 2,5-Dihydroxybenzoic acid (DHB) was found to be the best matrix. In MALDI-TOF MS, coccidiostats form predominantly [M + Na]⁺ ions, with additional small amounts of [M + K]⁺ and [M − H + 2Na]⁺ ions, and no obvious fragment ions. Salinomycin and narasin were unstable in the concentrated DHB matrix solution but were stable when dried on the MALDI-TOF MS probe. A simple, fast Sep-Pak C18 cartridge purification procedure was developed for the MALDI-TOF MS quantification of coccidiostats in poultry feeds. The MALDI-TOF MS limit of detection for lasalocid, monensin, salinomycin, and narasin standards was 251, 22, 24, and 24 fmol, respectively. The method detection limit for salinomycin and narasin in poultry feeds was 2.4 μg/g. PMID:10898626

  19. Subspace-based Inverse Uncertainty Quantification for Nuclear Data Assessment

    SciTech Connect

    Khuwaileh, B.A.; Abdel-Khalik, H.S.

    2015-01-15

    Safety analysis and design optimization depend on the accurate prediction of various reactor attributes. Predictions can be enhanced by reducing the uncertainty associated with the attributes of interest. An inverse problem can be defined and solved to assess the sources of uncertainty, and experimental effort can subsequently be directed to further improve the uncertainty associated with these sources. In this work a subspace-based algorithm for inverse sensitivity/uncertainty quantification (IS/UQ) has been developed to enable analysts to account for all sources of nuclear data uncertainties in support of target accuracy assessment-type analysis. An approximate analytical solution of the optimization problem is used to guide the search for the dominant uncertainty subspace. By limiting the search to a subspace, the degrees of freedom available for the optimization search are significantly reduced. A quarter PWR fuel assembly is modeled, and the accuracy of the multiplication factor and the fission reaction rate are used as reactor attributes whose uncertainties are to be reduced. Numerical experiments are used to demonstrate the computational efficiency of the proposed algorithm. Our ongoing work is focusing on extending the proposed algorithm to account for various forms of feedback, e.g., thermal-hydraulic and depletion effects.

  20. Quantification of brodifacoum in plasma and liver tissue by HPLC.

    PubMed

    O'Bryan, S M; Constable, D J

    1991-01-01

    A simple high-performance liquid chromatographic method has been developed for the detection and quantification of brodifacoum in plasma and liver tissue. After adding difenacoum as the internal standard, brodifacoum and difenacoum are extracted from 2 mL of plasma with two sequential 10-mL volumes of acetonitrile-ethyl ether (9:1) and from 2 g of liver tissue by grinding the tissue with 10 mL acetonitrile. The extracts are evaporated to dryness under nitrogen, 2 mL of acetonitrile is added to reconstitute the residues, and the resulting solution is analyzed using reversed-phase chromatography and fluorescence detection. The limits of detection for plasma and tissue are 2 micrograms/L and 5 ng/g, respectively. Using internal standardization, the mean intra-assay recovery from plasma is 92% and the mean inter-assay recovery is 109%. The mean intra-assay and inter-assay recoveries from tissue are 96%. No interferences were observed with any of the following related compounds: brodifacoum, bromadiolone, coumarin, difenacoum, diphacinone, warfarin, and vitamin K1. PMID:1943058
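
    The sketch below illustrates, with hypothetical numbers, the internal-standard arithmetic underlying such an HPLC method: the analyte peak area is normalized to the internal standard response, and the concentration is read from a calibration of the area ratio. The calibration points and peak areas are invented for illustration.

    import numpy as np

    # Calibration: known brodifacoum concentrations (ug/L) spiked into plasma,
    # each with a fixed amount of difenacoum internal standard (IS).
    cal_conc = np.array([5.0, 20.0, 50.0, 100.0])
    cal_ratio = np.array([0.11, 0.43, 1.05, 2.12])   # area(analyte) / area(IS)

    slope, intercept = np.polyfit(cal_conc, cal_ratio, 1)

    def brodifacoum_conc(area_analyte, area_is):
        """Estimate plasma brodifacoum (ug/L) from the two peak areas."""
        ratio = area_analyte / area_is
        return (ratio - intercept) / slope

    print(brodifacoum_conc(area_analyte=5300.0, area_is=6100.0))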

  1. Accurate 3D quantification of the bronchial parameters in MDCT

    NASA Astrophysics Data System (ADS)

    Saragaglia, A.; Fetita, C.; Preteux, F.; Brillet, P. Y.; Grenier, P. A.

    2005-08-01

    The assessment of bronchial reactivity and wall remodeling in asthma plays a crucial role in better understanding such a disease and evaluating therapeutic responses. Today, multi-detector computed tomography (MDCT) makes it possible to perform an accurate estimation of bronchial parameters (lumen and wall areas) by allowing a quantitative analysis in a cross-section plane orthogonal to the bronchus axis. This paper provides the tools for such an analysis by developing a 3D investigation method which relies on 3D reconstruction of the bronchial lumen and central axis computation. Cross-section images at bronchial locations interactively selected along the central axis are generated at appropriate spatial resolution. An automated approach is then developed for accurately segmenting the inner and outer bronchial contours on the cross-section images. It combines mathematical morphology operators, such as "connection cost", and energy-controlled propagation in order to overcome the difficulties raised by vessel adjacencies and wall irregularities. The segmentation accuracy was validated with respect to a 3D mathematically-modeled phantom of a bronchus-vessel pair which mimics the characteristics of real data in terms of gray-level distribution, caliber, and orientation. When applying the developed quantification approach to such a model with calibers ranging from 3 to 10 mm in diameter, the lumen area relative errors varied from 3.7% to 0.15%, while the bronchus area was estimated with a relative error of less than 5.1%.

  2. Relative Quantification of Several Plasma Proteins during Liver Transplantation Surgery

    PubMed Central

    Parviainen, Ville; Joenväärä, Sakari; Tukiainen, Eija; Ilmakunnas, Minna; Isoniemi, Helena; Renkonen, Risto

    2011-01-01

    The plasma proteome is widely used in studying changes occurring in the human body during disease or other disturbances. Immunological methods are commonly used in such studies. In recent years, mass spectrometry has gained popularity in the high-throughput analysis of plasma proteins. In this study, we tested whether mass spectrometry and iTRAQ-based protein quantification might be used in proteomic analysis of human plasma during liver transplantation surgery to characterize changes in protein abundances occurring during early graft reperfusion. We sampled blood from the systemic circulation as well as blood entering and exiting the liver. After immunodepletion of six high-abundance plasma proteins, trypsin digestion, iTRAQ labeling, and cation-exchange fractionation, the peptides were analyzed by reverse-phase nano-LC-MS/MS. In total, 72 proteins were identified, of which 31 could be quantified in all patient specimens collected. Of these 31 proteins, ten, mostly medium-to-high abundance plasma proteins with a concentration range of 50–2000 mg/L, displayed a relative abundance change of more than 10%. The changes in protein abundance observed in this study allow further research on the role of several proteins in ischemia-reperfusion injury during liver transplantation and possibly in other surgery. PMID:22187521

  3. Germ cell DNA quantification shortly after IR laser radiation.

    PubMed

    Bermúdez, D; Carrasco, F; Diaz, F; Perez-de-Vargas, I

    1991-01-01

    The immediate effect of IR laser radiation on rat germ cells was studied by cytophotometric quantification of the nuclear DNA content in testicular sections. Two different levels of radiation were studied: one according to clinical application (28.05 J/cm2) and another known to increase the germ cell number (46.80 J/cm2). The laser beam induced changes in the germ cell DNA content depending on the cell type, the cell cycle phase, and the dose of radiation energy applied. Following irradiation at both doses, the percentage of spermatogonia showing a 4c DNA content was increased, while the percentage of those with a 2c DNA content was decreased. Likewise, the percentages of primary spermatocytes with a DNA content equal to 4c (at 28.05 J/cm2), between 2c and 4c (at 46.80 J/cm2), and higher than 4c (at both doses) were increased. No change in the mean spermatid DNA content was observed. Nevertheless, at 46.80 J/cm2 the percentages of elongated spermatids with a c or 2c DNA content differed from the controls. The data show that, even at laser radiation doses used in therapy, the germ cell DNA content is increased shortly after IR laser radiation. PMID:1772145

  4. Automated quantification of one-dimensional nanostructure alignment on surfaces

    NASA Astrophysics Data System (ADS)

    Dong, Jianjin; Goldthorpe, Irene A.; Mohieddin Abukhdeir, Nasser

    2016-06-01

    A method for automated quantification of the alignment of one-dimensional (1D) nanostructures from microscopy imaging is presented. Nanostructure alignment metrics are formulated and shown to be able to rigorously quantify the orientational order of nanostructures within a two-dimensional domain (surface). A complementary image processing method is also presented which enables robust processing of microscopy images where overlapping nanostructures might be present. Scanning electron microscopy (SEM) images of nanowire-covered surfaces are analyzed using the presented methods, and it is shown that past single-parameter alignment metrics are insufficient for highly aligned domains. Through the use of multi-parameter alignment metrics, automated quantitative analysis of SEM images is shown to be possible, and the alignment characteristics of different samples can be quantitatively compared using a similarity metric. The results of this work provide researchers in nanoscience and nanotechnology with a rigorous method for the determination of structure/property relationships, where alignment of 1D nanostructures is significant.
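
    A minimal sketch of one such single-parameter orientational order metric, of the kind the record shows to be insufficient on its own for highly aligned samples. In practice the orientation angles would come from image processing of SEM data; here they are synthetic.

    import numpy as np

    def alignment_order(theta):
        """2D orientational order in [0, 1] for axes with head-tail symmetry.

        Uses the magnitude of the mean of exp(2i*theta): 0 for isotropic
        orientations, 1 for perfect alignment. The factor 2 makes theta and
        theta + pi equivalent, as for a nanowire axis.
        """
        z = np.exp(2j * np.asarray(theta)).mean()
        return abs(z), 0.5 * np.angle(z)   # order parameter and mean director

    rng = np.random.default_rng(1)
    aligned = rng.normal(0.3, 0.05, 500)       # tightly aligned sample
    isotropic = rng.uniform(0, np.pi, 500)     # random orientations
    print(alignment_order(aligned)[0])         # close to 1
    print(alignment_order(isotropic)[0])       # close to 0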

  5. Methodological Issues in the Quantification of Respiratory Sinus Arrhythmia

    PubMed Central

    Denver, John W.; Reed, Shawn F.; Porges, Stephen W.

    2007-01-01

    Although respiratory sinus arrhythmia (RSA) is a commonly quantified physiological variable, the methods for quantification are not consistent. This manuscript questions the assumption that respiration frequency needs to be manipulated or monitored to generate an accurate measure of RSA amplitude. A review of recent papers is presented that contrasts RSA amplitude with measures that use respiratory parameters to adjust RSA amplitude. In addition, data from two studies are presented to evaluate empirically both the relation between RSA amplitude and respiration frequency and the covariation between RSA frequency and respiration frequency. The literature review demonstrates similar findings between both classes of measures. The first study demonstrates, during spontaneous breathing without task demands, that there is no relation between respiration frequency and RSA amplitude and that respiration frequency can be accurately derived from the heart period spectrum (i.e., the frequency of RSA). The second study demonstrates that respiration frequency is unaffected by atropine dose, a manipulation that systematically mediates the amplitude of RSA, and that the tight linkage between RSA frequency and respiration frequency is unaffected by atropine. The research shows that the amplitude of RSA is not affected by respiration frequency under either baseline conditions or vagal manipulation via atropine injection. Respiration frequency is therefore unlikely to be a concern under these conditions. Research examining conditions that produce (causal) deviations from the intrinsic relation between respiratory parameters and the amplitude of RSA, combined with appropriate statistical procedures for understanding these deviations, is necessary. PMID:17067734

  6. Pesticide residue quantification analysis by hyperspectral imaging sensors

    NASA Astrophysics Data System (ADS)

    Liao, Yuan-Hsun; Lo, Wei-Sheng; Guo, Horng-Yuh; Kao, Ching-Hua; Chou, Tau-Meu; Chen, Junne-Jih; Wen, Chia-Hsien; Lin, Chinsu; Chen, Hsian-Min; Ouyang, Yen-Chieh; Wu, Chao-Cheng; Chen, Shih-Yu; Chang, Chein-I.

    2015-05-01

    Pesticide residue detection in agricultural crops is a challenging issue, and it is even more difficult to quantify pesticide residues present in agricultural produce and fruits. This paper conducts a series of baseline experiments particularly designed for three specific pesticides commonly used in Taiwan. The materials used for the experiments are single leaves of vegetable produce contaminated with various concentrations of pesticides. Two sensors are used to collect data. One is Fourier Transform Infrared (FTIR) spectroscopy. The other is a hyperspectral sensor, called the Geophysical and Environmental Research (GER) 2600 spectroradiometer, which is a battery-operated field-portable spectroradiometer with full real-time data acquisition from 350 nm to 2500 nm. In order to quantify data with different levels of pesticide residue concentration, several measures for spectral discrimination are developed. More specifically, new measures for calculating relative power between two sensors are particularly designed to evaluate the effectiveness of each sensor in quantifying the used pesticide residues. The experimental results show that the GER is a better sensor than FTIR for pesticide residue quantification.

  7. Absolute Quantification of Selected Proteins in the Human Osteoarthritic Secretome

    PubMed Central

    Peffers, Mandy J.; Beynon, Robert J.; Clegg, Peter D.

    2013-01-01

    Osteoarthritis (OA) is characterized by a loss of extracellular matrix which is driven by catabolic cytokines. Proteomic analysis of the OA cartilage secretome enables the global study of secreted proteins. These are an important class of molecules with roles in numerous pathological mechanisms. Although cartilage studies have identified profiles of secreted proteins, few quantitative proteomics techniques have been implemented that would enable further biological questions to be addressed. To overcome this limitation, we used the secretome from human OA cartilage explants stimulated with IL-1β and compared proteins released into the media using a label-free LC-MS/MS-based strategy. We employed QconCAT technology to quantify specific proteins using selected reaction monitoring. A total of 252 proteins were identified, of which nine were differentially expressed upon IL-1β stimulation. Selected protein candidates were quantified in absolute amounts using QconCAT. These findings confirmed a significant reduction in TIMP-1 in the secretome following IL-1β stimulation. Label-free and QconCAT analysis produced equivocal results indicating no effect of cytokine stimulation on aggrecan, cartilage oligomeric matrix protein, fibromodulin, matrix metalloproteinases 1 and 3, or plasminogen release. This study enabled comparative protein profiling and absolute quantification of proteins involved in molecular pathways pertinent to understanding the pathogenesis of OA. PMID:24132152

  8. Preparation, imaging, and quantification of bacterial surface motility assays.

    PubMed

    Morales-Soto, Nydia; Anyan, Morgen E; Mattingly, Anne E; Madukoma, Chinedu S; Harvey, Cameron W; Alber, Mark; Déziel, Eric; Kearns, Daniel B; Shrout, Joshua D

    2015-01-01

    Bacterial surface motility, such as swarming, is commonly examined in the laboratory using plate assays that necessitate specific concentrations of agar and sometimes inclusion of specific nutrients in the growth medium. The preparation of such explicit media and surface growth conditions serves to provide the favorable conditions that allow not just bacterial growth but coordinated motility of bacteria over these surfaces within thin liquid films. Reproducibility of swarm plate and other surface motility plate assays can be a major challenge. Especially for more "temperate swarmers" that exhibit motility only within agar ranges of 0.4%-0.8% (wt/vol), minor changes in protocol or laboratory environment can greatly influence swarm assay results. "Wettability", or water content at the liquid-solid-air interface of these plate assays, is often a key variable to be controlled. An additional challenge in assessing swarming is how to quantify observed differences between any two (or more) experiments. Here we detail a versatile two-phase protocol to prepare and image swarm assays. We include guidelines to circumvent the challenges commonly associated with swarm assay media preparation and quantification of data from these assays. We specifically demonstrate our method using bacteria that express fluorescent or bioluminescent genetic reporters like green fluorescent protein (GFP), luciferase (lux operon), or cellular stains to enable time-lapse optical imaging. We further demonstrate the ability of our method to track competing swarming species in the same experiment. PMID:25938934

  9. Microvascular quantification based on contour-scanning photoacoustic microscopy

    NASA Astrophysics Data System (ADS)

    Yeh, Chenghung; Soetikno, Brian; Hu, Song; Maslov, Konstantin I.; Wang, Lihong V.

    2014-09-01

    Accurate quantification of microvasculature remains of interest in fundamental pathophysiological studies and clinical trials. Current photoacoustic microscopy can noninvasively quantify properties of the microvasculature, including vessel density and diameter, with a high spatial resolution. However, the depth range of focus (i.e., focal zone) of optical-resolution photoacoustic microscopy (OR-PAM) is often insufficient to encompass the depth variations of features of interest, such as blood vessels, due to uneven tissue surfaces. Thus, time-consuming image acquisitions at multiple focal planes are required to maintain the region of interest in the focal zone. We have developed continuous three-dimensional motorized contour-scanning OR-PAM, which enables real-time adjustment of the focal plane to track the vessels' profile. We have experimentally demonstrated that contour scanning improves the signal-to-noise ratio of conventional OR-PAM by as much as 41% and shortens the image acquisition time by a factor of 3.2. Moreover, contour-scanning OR-PAM more accurately quantifies vessel density and diameter, and has been applied to studying tumors with uneven surfaces.

  10. Preparation, Imaging, and Quantification of Bacterial Surface Motility Assays

    PubMed Central

    Morales-Soto, Nydia; Anyan, Morgen E.; Mattingly, Anne E.; Madukoma, Chinedu S.; Harvey, Cameron W.; Alber, Mark; Déziel, Eric; Kearns, Daniel B.; Shrout, Joshua D.

    2015-01-01

    Bacterial surface motility, such as swarming, is commonly examined in the laboratory using plate assays that necessitate specific concentrations of agar and sometimes inclusion of specific nutrients in the growth medium. The preparation of such explicit media and surface growth conditions serves to provide the favorable conditions that allow not just bacterial growth but coordinated motility of bacteria over these surfaces within thin liquid films. Reproducibility of swarm plate and other surface motility plate assays can be a major challenge. Especially for more “temperate swarmers” that exhibit motility only within agar ranges of 0.4%-0.8% (wt/vol), minor changes in protocol or laboratory environment can greatly influence swarm assay results. “Wettability”, or water content at the liquid-solid-air interface of these plate assays, is often a key variable to be controlled. An additional challenge in assessing swarming is how to quantify observed differences between any two (or more) experiments. Here we detail a versatile two-phase protocol to prepare and image swarm assays. We include guidelines to circumvent the challenges commonly associated with swarm assay media preparation and quantification of data from these assays. We specifically demonstrate our method using bacteria that express fluorescent or bioluminescent genetic reporters like green fluorescent protein (GFP), luciferase (lux operon), or cellular stains to enable time-lapse optical imaging. We further demonstrate the ability of our method to track competing swarming species in the same experiment. PMID:25938934

  11. Ratiometric Raman Spectroscopy for Quantification of Protein Oxidative Damage

    PubMed Central

    Jiang, Dongping; Yanney, Michael; Zou, Sige; Sygula, Andrzej

    2009-01-01

    A novel ratiometric Raman spectroscopic (RMRS) method has been developed for the quantitative determination of protein carbonyl levels. Oxidized bovine serum albumin (BSA) and oxidized lysozyme were used as model proteins to demonstrate this method. The technique involves conjugation of protein carbonyls with dinitrophenyl hydrazine (DNPH), followed by drop coating deposition Raman (DCDR) spectral acquisition. The RMRS method is easy to implement as it requires only one conjugation reaction and a single spectral acquisition, and it does not require sample calibration. Characteristic peaks from both the protein and DNPH moieties are obtained in a single spectral acquisition, allowing the protein carbonyl level to be calculated from the peak intensity ratio. Detection sensitivity for the RMRS method is ~0.33 pmol carbonyl/measurement. Fluorescence- and/or immunoassay-based techniques only detect a signal from the labeling molecule and thus yield no structural or quantitative information for the modified protein, while the RMRS technique provides protein identification and protein carbonyl quantification in a single experiment. PMID:19457432

  12. A surrogate-based uncertainty quantification with quantifiable errors

    SciTech Connect

    Bang, Y.; Abdel-Khalik, H. S.

    2012-07-01

    Surrogate models are often employed to reduce the computational cost required to complete uncertainty quantification, where one is interested in propagating input parameter uncertainties through a complex engineering model to estimate response uncertainties. An improved surrogate construction approach is introduced here which places a premium on reducing the associated computational cost. Unlike existing methods, where the surrogate is constructed first and then employed to propagate uncertainties, the new approach combines both sensitivity and uncertainty information to yield a further reduction in the computational cost. Mathematically, the reduction is described by a range finding algorithm that identifies a subspace in the parameter space whereby parameter uncertainties orthogonal to the subspace contribute a negligible amount to the propagated uncertainties. Moreover, the error resulting from the reduction can be upper-bounded. The new approach is demonstrated using a realistic nuclear assembly model and compared to existing methods in terms of computational cost and accuracy of the propagated uncertainties. Although we believe the algorithm is general, it is applied here to linear-based surrogates and Gaussian parameter uncertainties. The generalization to nonlinear models will be detailed in a separate article. (authors)
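
    The sketch below illustrates the general subspace idea with a randomized range finder applied to a synthetic linear parameter-to-response map. It is not the authors' algorithm: the adjoint evaluations, the rapidly decaying sensitivities, and all dimensions are assumptions of this toy setup.

    import numpy as np

    rng = np.random.default_rng(2)
    n_params, n_resp = 500, 40
    # Hypothetical sensitivity matrix with rapidly decaying singular values,
    # standing in for a linear(ized) engineering model.
    U, _ = np.linalg.qr(rng.standard_normal((n_resp, n_resp)))
    V, _ = np.linalg.qr(rng.standard_normal((n_params, n_resp)))
    S = np.diag(2.0 ** -np.arange(n_resp))
    A = U @ S @ V.T                  # responses = A @ parameter perturbations

    def dominant_param_subspace(adjoint_matvec, n_resp, k, rng):
        """Randomized range finder on the adjoint map response -> parameter."""
        Omega = rng.standard_normal((n_resp, k))
        Y = adjoint_matvec(Omega)    # k adjoint evaluations (the costly step)
        Q, _ = np.linalg.qr(Y)       # basis for the dominant parameter subspace
        return Q

    Q = dominant_param_subspace(lambda X: A.T @ X, n_resp, k=10, rng=rng)
    # Parameter uncertainty orthogonal to span(Q) contributes negligibly:
    print(np.linalg.norm(A @ Q @ Q.T) / np.linalg.norm(A))  # ~1.0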

  13. Absolute Quantification of Individual Biomass Concentrations in a Methanogenic Coculture

    PubMed Central

    2014-01-01

    Identification of individual biomass concentrations is a crucial step towards an improved understanding of anaerobic digestion processes and mixed microbial conversions in general. Knowledge of individual biomass concentrations allows for the calculation of biomass-specific conversion rates, which form the basis of anaerobic digestion models. Only a few attempts have addressed the absolute quantification of individual biomass concentrations in methanogenic microbial ecosystems, which has so far impaired the calculation of biomass-specific conversion rates and thus model validation. This study proposes a quantitative PCR (qPCR) approach for the direct determination of individual biomass concentrations in methanogenic microbial associations by correlating the native qPCR signal (cycle threshold, Ct) to individual biomass concentrations (mg dry matter/L). Unlike existing methods, the proposed approach circumvents error-prone conversion factors that are typically used to convert gene copy numbers or cell concentrations into actual biomass concentrations. The newly developed method was assessed and deemed suitable for the determination of individual biomass concentrations in a defined coculture of Desulfovibrio sp. G11 and Methanospirillum hungatei JF1. The obtained calibration curves showed high accuracy, indicating that the new approach is well suited for engineering applications where knowledge of individual biomass concentrations is required. PMID:24949269
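
    A minimal sketch of the calibration idea, with invented numbers: the native qPCR cycle threshold (Ct) is regressed directly against log10 of known dry-matter biomass concentrations, so an unknown sample can be quantified without gene-copy or cell-count conversion factors.

    import numpy as np

    # Standards: serial dilutions of a culture of known biomass (mg DM/L).
    biomass = np.array([100.0, 10.0, 1.0, 0.1])
    ct = np.array([14.2, 17.6, 21.1, 24.5])     # measured Ct per dilution

    slope, intercept = np.polyfit(np.log10(biomass), ct, 1)
    # Ideal qPCR doubling gives a slope near -3.32 (= -1/log10(2)).

    def biomass_from_ct(ct_sample):
        """Invert the calibration: Ct -> biomass concentration (mg DM/L)."""
        return 10 ** ((ct_sample - intercept) / slope)

    print(biomass_from_ct(19.0))   # a sample between the 10 and 1 mg/L standards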

  14. Quantification of motility of carabid beetles in farmland.

    PubMed

    Allema, A B; van der Werf, W; Groot, J C J; Hemerik, L; Gort, G; Rossing, W A H; van Lenteren, J C

    2015-04-01

    Quantification of the movement of insects at field and landscape levels helps us to understand their ecology and ecological functions. We conducted a meta-analysis on movement of carabid beetles (Coleoptera: Carabidae) to identify key factors affecting movement and population redistribution. We characterize the rate of redistribution using motility μ (L² T⁻¹), which is a measure for diffusion of a population in space and time that is consistent with ecological diffusion theory and which can be used for upscaling short-term data to longer time frames. Formulas are provided to calculate motility from literature data on movement distances. A field experiment was conducted to measure the redistribution of the mass-released carabid Pterostichus melanarius in a crop field, and to derive motility by fitting a Fokker-Planck diffusion model using inverse modelling. Bias in estimates of motility from literature data is elucidated using the data from the field experiment as a case study. The meta-analysis showed that motility is 5.6 times as high in farmland as in woody habitat. Species associated with forested habitats had greater motility than species associated with open field habitats, both in arable land and woody habitat. The meta-analysis did not identify consistent differences in motility at the species level, or between clusters of larger and smaller beetles. The results presented here provide a basis for calculating time-varying distribution patterns of carabids in farmland and woody habitat. The formulas for calculating motility can be used for other taxa. PMID:25673121
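
    One standard way to obtain a motility with units L² T⁻¹ from movement data is via the mean squared displacement of a 2D diffusion process, MSD = 4*mu*t. The sketch below applies this generic estimator to hypothetical recapture distances; it is not the record's Fokker-Planck inverse fit.

    import numpy as np

    def motility_from_displacements(distances_m, time_d):
        """Estimate mu (m^2/day) from straight-line displacements after time_d days."""
        msd = np.mean(np.asarray(distances_m) ** 2)
        return msd / (4.0 * time_d)    # 2D diffusion: MSD = 4 * mu * t

    # Hypothetical recaptures of marked beetles 3 days after release (metres).
    recapture_dist = [4.1, 7.3, 2.2, 10.5, 5.8, 3.3, 8.9]
    print(f"mu ~ {motility_from_displacements(recapture_dist, 3.0):.2f} m^2/day")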

  15. Detection and quantification of MS lesions using fuzzy topological principles

    NASA Astrophysics Data System (ADS)

    Udupa, Jayaram K.; Wei, Luogang; Samarasekera, Supun; Miki, Yukio; van Buchem, M. A.; Grossman, Robert I.

    1996-04-01

    Quantification of the severity of the multiple sclerosis (MS) disease through estimation of lesion volume via MR imaging is vital for understanding and monitoring the disease and its treatment. This paper presents a novel methodology and a system that can be routinely used for segmenting and estimating the volume of MS lesions via dual-echo spin-echo MR imagery. An operator indicates a few points in the images by pointing to the white matter, the gray matter, and the CSF. Each of these objects is then detected as a fuzzy connected set. The holes in the union of these objects correspond to potential lesion sites which are utilized to detect each potential lesion as a fuzzy connected object. These 3D objects are presented to the operator who indicates acceptance/rejection through the click of a mouse button. The volume of accepted lesions is then computed and output. Based on several evaluation studies and over 300 3D data sets that were processed, we conclude that the methodology is highly reliable and consistent, with a coefficient of variation (due to subjective operator actions) of less than 1.0% for volume.

  16. Uncertainty Quantification for Monitoring of Civil Structures from Vibration Measurements

    NASA Astrophysics Data System (ADS)

    Döhler, Michael; Mevel, Laurent

    2014-05-01

    Health monitoring of civil structures can be performed by detecting changes in the modal parameters of a structure, or more directly in the measured vibration signals. For continuous monitoring the excitation of a structure is usually ambient, thus unknown and assumed to be noise. Hence, all estimates from the vibration measurements are realizations of random variables with inherent uncertainty due to (unknown) process and measurement noise and finite data length. In this talk, a strategy for quantifying the uncertainties of modal parameter estimates from a subspace-based system identification approach is presented and the importance of uncertainty quantification in monitoring approaches is shown. Furthermore, a damage detection method is presented, which is based on the direct comparison of the measured vibration signals without estimating modal parameters, while taking the statistical uncertainty in the signals correctly into account. The usefulness of both strategies is illustrated on data from a progressive damage action on a prestressed concrete bridge.

    References:
    E. Carden and P. Fanning. Vibration based condition monitoring: a review. Structural Health Monitoring, 3(4):355-377, 2004.
    M. Döhler and L. Mevel. Efficient multi-order uncertainty computation for stochastic subspace identification. Mechanical Systems and Signal Processing, 38(2):346-366, 2013.
    M. Döhler, L. Mevel, and F. Hille. Subspace-based damage detection under changes in the ambient excitation statistics. Mechanical Systems and Signal Processing, 45(1):207-224, 2014.

  17. Comprehensive quantification of ceramide species in human stratum corneum.

    PubMed

    Masukawa, Yoshinori; Narita, Hirofumi; Sato, Hirayuki; Naoe, Ayano; Kondo, Naoki; Sugai, Yoshiya; Oba, Tsuyoshi; Homma, Rika; Ishikawa, Junko; Takagi, Yutaka; Kitahara, Takashi

    2009-08-01

    One of the key challenges in lipidomics is to quantify lipidomes of interest, as it is practically impossible to collect all authentic materials covering the targeted lipidomes. For diverse ceramides (CER) in human stratum corneum (SC) that play important physicochemical roles in the skin, we developed a novel method for quantification of the overall CER species by improving our previously reported profiling technique using normal-phase liquid chromatography-electrospray ionization-mass spectrometry (NPLC-ESI-MS). The use of simultaneous selected ion monitoring measurement of as many as 182 kinds of molecular-related ions enables the highly sensitive detection of the overall CER species, as they can be analyzed in only one SC-stripped tape as small as 5 mm x 10 mm. To comprehensively quantify CERs, including those not available as authentic species, we designed a procedure to estimate their levels using relative responses of representative authentic species covering the species targeted, considering the systematic error based on intra-/inter-day analyses. The CER levels obtained by this method were comparable to those determined by conventional thin-layer chromatography (TLC), which guarantees the validity of this method. This method opens lipidomics approaches for CERs in the SC. PMID:19349641

  18. Accuracy and Precision of Radioactivity Quantification in Nuclear Medicine Images

    PubMed Central

    Frey, Eric C.; Humm, John L.; Ljungberg, Michael

    2012-01-01

    The ability to reliably quantify activity in nuclear medicine has a number of increasingly important applications. Dosimetry for targeted therapy treatment planning or for approval of new imaging agents requires accurate estimation of the activity in organs, tumors, or voxels at several imaging time points. Another important application is the use of quantitative metrics derived from images, such as the standard uptake value commonly used in positron emission tomography (PET), to diagnose and follow treatment of tumors. These measures require quantification of organ or tumor activities in nuclear medicine images. However, there are a number of physical, patient, and technical factors that limit the quantitative reliability of nuclear medicine images. There have been a large number of improvements in instrumentation, including the development of hybrid single-photon emission computed tomography/computed tomography and PET/computed tomography systems, and reconstruction methods, including the use of statistical iterative reconstruction methods, which have substantially improved the ability to obtain reliable quantitative information from planar, single-photon emission computed tomography, and PET images. PMID:22475429

  19. VESGEN Software for Mapping and Quantification of Vascular Regulators

    NASA Technical Reports Server (NTRS)

    Parsons-Wingerter, Patricia A.; Vickerman, Mary B.; Keith, Patricia A.

    2012-01-01

    VESsel GENeration (VESGEN) Analysis is automated software that maps and quantifies the effects of vascular regulators on vascular morphology by analyzing important vessel parameters. Quantification parameters include vessel diameter, length, branch points, density, and fractal dimension. For vascular trees, measurements are reported as dependent functions of vessel branching generation. VESGEN maps and quantifies vascular morphological events according to fractal-based vascular branching generation. It also relies on careful imaging of branching and networked vascular form. It was developed as a plug-in for ImageJ (National Institutes of Health, USA). VESGEN uses the image-processing concepts of 8-neighbor pixel connectivity, skeleton, and distance map to analyze 2D, black-and-white (binary) images of vascular trees, networks, and tree-network composites. VESGEN maps typically 5 to 12 (or more) generations of vascular branching, starting from a single parent vessel. These generations are tracked and measured for critical vascular parameters that include vessel diameter, length, density and number, and tortuosity per branching generation. The effects of vascular therapeutics and regulators on vascular morphology and branching tested in human clinical or laboratory animal experimental studies are quantified by comparing vascular parameters with control groups. VESGEN provides a user interface to both guide and allow control over the user's vascular analysis process. An option is provided to select a morphological tissue type of vascular trees, networks, or tree-network composites, which determines the general collections of algorithms, intermediate images, and output images and measurements that will be produced.

  20. Accurate quantification of astaxanthin from Haematococcus crude extract spectrophotometrically

    NASA Astrophysics Data System (ADS)

    Li, Yeguang; Miao, Fengping; Geng, Yahong; Lu, Dayan; Zhang, Chengwu; Zeng, Mingtao

    2012-07-01

    The influence of alkali on astaxanthin and the optimal working wavelength for measurement of astaxanthin from Haematococcus crude extract were investigated, and a spectrophotometric method for precise quantification of the astaxanthin, based on the method of Boussiba et al., was established. According to Boussiba's method, alkali treatment destroys chlorophyll. However, we found that: 1) carotenoid content declined by about 25% in Haematococcus fresh cysts and by up to 30% in dry powder of broken Haematococcus cysts after alkali treatment; and 2) dimethyl sulfoxide (DMSO)-extracted chlorophyll of green Haematococcus shows little absorption at 520-550 nm. Interestingly, a good linear relationship existed between absorbance at 530 nm and astaxanthin content, while an unknown interference at 540-550 nm was detected in our study. Therefore, with 530 nm as the working wavelength, the alkali treatment to destroy chlorophyll was not necessary, and the influence of chlorophyll, other carotenoids, and the unknown interference could be avoided. The astaxanthin contents of two samples were measured at 492 nm and 530 nm; the measured values at 530 nm were 2.617 g/100 g and 1.811 g/100 g. Compared with the measured values at 492 nm, the values at 530 nm decreased by 6.93% and 11.96%, respectively. The measured values at 530 nm are closer to the true astaxanthin contents of the samples. The data show that 530 nm is the most suitable wavelength for spectrophotometric determination of the astaxanthin in Haematococcus crude extract.

  1. Immobilized Particle Imaging for Quantification of Nano- and Microparticles.

    PubMed

    Cui, Jiwei; Hibbs, Benjamin; Gunawan, Sylvia T; Braunger, Julia A; Chen, Xi; Richardson, Joseph J; Hanssen, Eric; Caruso, Frank

    2016-04-12

    The quantification of nano- and microparticles is critical for diverse applications relying on exact knowledge of the particle concentration. Although many techniques are available for counting particles, there are some limitations with regard to counting low-scattering materials and facile counting in harsh organic solvents. Herein, we introduce an easy and rapid particle counting technique, termed "immobilized particle imaging" (IPI), to quantify fluorescent particles with different compositions (i.e., inorganic or organic), structures (i.e., solid, porous, or hollow), and sizes (50-1000 nm) dispersed in either aqueous or organic solutions. IPI is achieved by immobilizing the particles of interest in a cell matrix-like scaffold (e.g., agarose) and imaging them using standard microscopy techniques. Imaging a defined volume of the immobilized particles allows the particle concentration to be calculated from the count numbers in a fixed volume. IPI provides a general and facile approach to quantify advanced nano- and microparticles, which may help researchers obtain new insights for different applications (e.g., nanomedicine). PMID:27032056
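
    The counting arithmetic behind IPI reduces to particles per imaged volume; the sketch below shows this conversion with a hypothetical field of view and dilution factor, both of which are assumptions of the example rather than values from the record.

    def ipi_concentration(counts_per_stack, fov_um=(100.0, 100.0, 20.0),
                          dilution_factor=2.0):
        """Particles/mL from mean count in an imaged volume (micrometres)."""
        vol_um3 = fov_um[0] * fov_um[1] * fov_um[2]
        vol_ml = vol_um3 * 1e-12       # 1 mL = 1 cm^3 = 1e12 um^3
        mean_count = sum(counts_per_stack) / len(counts_per_stack)
        # Correct for the dilution of the sample into the agarose scaffold.
        return dilution_factor * mean_count / vol_ml

    # e.g. five image stacks of the same immobilized sample:
    print(f"{ipi_concentration([212, 198, 205, 221, 190]):.3g} particles/mL")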

  2. Quantification of intracellular payload release from polymersome nanoparticles

    PubMed Central

    Scarpa, Edoardo; Bailey, Joanne L.; Janeczek, Agnieszka A.; Stumpf, Patrick S.; Johnston, Alexander H.; Oreffo, Richard O. C.; Woo, Yin L.; Cheong, Ying C.; Evans, Nicholas D.; Newman, Tracey A.

    2016-01-01

    Polymersome nanoparticles (PMs) are attractive candidates for spatio-temporally controlled delivery of therapeutic agents. Although many studies have addressed cellular uptake of solid nanoparticles, there is very little data available on the intracellular release of molecules encapsulated in membranous carriers, such as polymersomes. Here, we addressed this by developing a quantitative assay based on the hydrophilic dye fluorescein. Fluorescein was encapsulated stably in PMs of mean diameter 85 nm, with minimal leakage after sustained dialysis. No fluorescence was detectable from fluorescein PMs, indicating quenching. Following incubation of L929 cells with fluorescein PMs, there was a gradual increase in intracellular fluorescence, indicating PM disruption and cytosolic release of fluorescein. By combining absorbance measurements with flow cytometry, we quantified the real-time intracellular release of fluorescein at single-cell resolution. We found that 173 ± 38 polymersomes released their payload per cell, with significant heterogeneity in uptake, despite controlled synchronisation of the cell cycle. This novel method for quantification of the release of compounds from nanoparticles provides fundamental information on cellular uptake of nanoparticle-encapsulated compounds. It also illustrates the stochastic nature of population distribution in homogeneous cell populations, a factor that must be taken into account in clinical use of this technology. PMID:27404770

  3. Alkylpyridiniums. 2. Isolation and quantification in roasted and ground coffees.

    PubMed

    Stadler, Richard H; Varga, Natalia; Milo, Christian; Schilter, Benoit; Vera, Francia Arce; Welti, Dieter H

    2002-02-27

    Recent model studies on trigonelline decomposition have identified nonvolatile alkylpyridiniums as major reaction products under certain physicochemical conditions. The quaternary base 1-methylpyridinium was isolated from roasted and ground coffee and purified by ion exchange and thin-layer chromatography. The compound was characterized by nuclear magnetic resonance spectroscopy ((1)H, (13)C) and mass spectrometry techniques. A liquid chromatography-electrospray ionization tandem mass spectrometry method was developed to quantify the alkaloid in coffee by isotope dilution mass spectrometry. The formation of alkylpyridiniums is positively correlated to the roasting degree in arabica coffee, and highest levels of 1-methylpyridinium, reaching up to 0.25% on a per weight basis, were found in dark roasted coffee beans. Analyses of coffee extracts also showed the presence of dimethylpyridinium, at concentrations ranging from 5 to 25 mg/kg. This is the first report on the isolation and quantification of alkylpyridiniums in coffee. These compounds, described here in detail for the first time, may have an impact on the flavor/aroma profile of coffee directly (e.g., bitterness), or indirectly as precursors, and potentially open new avenues in the flavor/aroma modulation of coffee. PMID:11853504

  4. Toward automated quantification of biological microstructures using unbiased stereology

    NASA Astrophysics Data System (ADS)

    Bonam, Om P.; Elozory, Daniel; Kramer, Kurt; Goldgof, Dmitry; Hall, Lawrence O.; Mangual, Osvaldo; Mouton, Peter R.

    2011-03-01

    Quantitative analysis of biological microstructures using unbiased stereology plays a large and growing role in bioscience research. Our aim is to add a fully automatic, high-throughput mode to a commercially available, computerized stereology device (Stereologer). The current method for estimation of first- and second-order parameters of biological microstructures requires a trained user to manually select objects of interest (cells, fibers, etc.) while stepping through the depth of a stained tissue section in fixed intervals. The proposed approach uses a combination of color and gray-level processing. Color processing is used to identify the objects of interest, by training on the images to obtain the threshold range for objects of interest. In gray-level processing, a region-growing approach was used to assign a unique identity to the objects of interest and enumerate them. This automatic approach achieved an overall object detection rate of 93.27%. Thus, these results support the view that automatic color and gray-level processing combined with unbiased sampling and assumption- and model-free geometric probes can provide accurate and efficient quantification of biological objects.

  5. Rapid method for the quantification of hydroquinone concentration: chemiluminescent analysis.

    PubMed

    Chen, Tung-Sheng; Liou, Show-Yih; Kuo, Wei-Wen; Wu, Hsi-Chin; Jong, Gwo-Ping; Wang, Hsueh-Fang; Shen, Chia-Yao; Padma, V Vijaya; Huang, Chih-Yang; Chang, Yen-Lin

    2015-11-01

    Topical hydroquinone serves as a skin whitener and is usually available in cosmetics or on prescription based on the hydroquinone concentration. Quantification of hydroquinone content therefore becomes an important issue in topical agents. High-performance liquid chromatography (HPLC) is the commonest method for determining hydroquinone content in topical agents, but this method is time-consuming and uses many solvents that can become an environmental issue. We report a rapid method for quantifying hydroquinone content by chemiluminescent analysis. Hydroquinone induces the production of hydrogen peroxide in the presence of basic compounds. Hydrogen peroxide induced by hydroquinone oxidized light-emitting materials such as lucigenin, resulted in the production of ultra-weak chemiluminescence that was detected by a chemiluminescence analyzer. The intensity of the chemiluminescence was found to be proportional to the hydroquinone concentration. We suggest that the rapid (measurement time, 60 s) and virtually solvent-free (solvent volume, <2 mL) chemiluminescent method described here for quantifying hydroquinone content may be an alternative to HPLC analysis. PMID:25693839

  6. Automated angiogenesis quantification through advanced image processing techniques.

    PubMed

    Doukas, Charlampos N; Maglogiannis, Ilias; Chatziioannou, Aristotle; Papapetropoulos, Andreas

    2006-01-01

    Angiogenesis, the formation of blood vessels in tumors, is an interactive process between tumor, endothelial, and stromal cells that creates a network for the oxygen and nutrient supply necessary for tumor growth. Accordingly, angiogenic activity is considered a suitable indicator for detecting both tumor growth and its inhibition. The angiogenic potential is usually estimated by counting the number of blood vessels in particular sections. One of the most popular assay tissues for studying the angiogenesis phenomenon is the developing chick embryo and its chorioallantoic membrane (CAM), a highly vascular structure lining the inner surface of the egg shell. The aim of this study was to develop and validate an automated image analysis method that would give an unbiased quantification of micro-vessel density and growth in angiogenic CAM images. The presented method has been validated by comparing automated results to manual counts over a series of digital chick embryo photos. The results indicate the high accuracy of the tool, which has thus been used extensively for tumor growth detection at different stages of embryonic development. PMID:17946107

  7. Mesh refinement for uncertainty quantification through model reduction

    SciTech Connect

    Li, Jing Stinis, Panos

    2015-01-01

    We present a novel way of deciding when and where to refine a mesh in probability space in order to facilitate uncertainty quantification in the presence of discontinuities in random space. A discontinuity in random space makes the application of generalized polynomial chaos expansion techniques prohibitively expensive. The reason is that for discontinuous problems, the expansion converges very slowly. An alternative to using higher-order terms in the expansion is to divide the random space into smaller elements where a lower-degree polynomial is adequate to describe the randomness. In general, the partition of the random space is a dynamic process, since some areas of the random space, particularly around the discontinuity, need more refinement than others as time evolves. In the current work we propose a way to decide when and where to refine the random space mesh based on the use of a reduced model. The idea is that a good reduced model can monitor accurately, within a random space element, the cascade of activity to higher-degree terms in the chaos expansion. In turn, this facilitates the efficient allocation of computational resources to the areas of random space where they are most needed. For the Kraichnan–Orszag system, the prototypical system for studying discontinuities in random space, we present theoretical results which show why the proposed method is sound and numerical results which corroborate the theory.

  8. Uncertainty Quantification for Polynomial Systems via Bernstein Expansions

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2012-01-01

    This paper presents a unifying framework for uncertainty quantification of systems having polynomial response metrics that depend on both aleatory and epistemic uncertainties. The proposed approach, which is based on the Bernstein expansions of polynomials, enables bounding the range of moments and failure probabilities of response metrics as well as finding supersets of the extreme epistemic realizations where the limits of such ranges occur. These bounds and supersets, whose analytical structure renders them free of approximation error, can be made arbitrarily tight with additional computational effort. Furthermore, this framework enables determining the importance of particular uncertain parameters according to the extent to which they affect the first two moments of response metrics and failure probabilities. This analysis enables determining the parameters that should be considered uncertain as well as those that can be assumed to be constants without incurring significant error. The analytical nature of the approach eliminates the numerical error that characterizes the sampling-based techniques commonly used to propagate aleatory uncertainties, as well as the possibility of underpredicting the range of the statistic of interest that may result from searching for the best- and worst-case epistemic values via nonlinear optimization or sampling.
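
    The enclosure property that underlies the framework can be shown in a few lines: on [0, 1], the Bernstein coefficients of a polynomial bound its range, and subdivision or degree elevation (not shown here) tightens the bounds arbitrarily. A minimal sketch, independent of the paper's full moment and failure-probability machinery:

    from math import comb

    def bernstein_coeffs(a):
        """Bernstein coefficients of p(x) = sum a[i] * x**i on [0, 1]."""
        n = len(a) - 1
        return [sum(comb(j, i) / comb(n, i) * a[i] for i in range(j + 1))
                for j in range(n + 1)]

    def range_bounds(a):
        # min/max of the Bernstein coefficients enclose the range of p on [0, 1].
        b = bernstein_coeffs(a)
        return min(b), max(b)

    # p(x) = x - x^2 has true range [0, 0.25] on [0, 1].
    print(range_bounds([0.0, 1.0, -1.0]))   # (0.0, 0.5): a guaranteed enclosure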

  9. Microbial Maintenance: A Critical Review on Its Quantification

    PubMed Central

    2007-01-01

    Microbial maintenance is an important concept in microbiology. Its quantification, however, is a subject of continuous debate, which seems to be caused by (1) its definition, which includes nongrowth components other than maintenance; (2) the existence of partly overlapping concepts; (3) the evolution of variables as constants; and (4) the neglect of cell death in microbial dynamics. The two historically most important parameters describing maintenance, the specific maintenance rate and the maintenance coefficient, are based on partly different nongrowth components. There is thus no constant relation between these parameters and previous equations on this subject are wrong. In addition, the partial overlap between these parameters does not allow the use of a simple combination of these parameters. This also applies for combinations of a threshold concentration with one of the other estimates of maintenance. Maintenance estimates should ideally explicitly describe each nongrowth component. A conceptual model is introduced that describes their relative importance and reconciles the various concepts and definitions. The sensitivity of maintenance on underlying components was analyzed and indicated that overall maintenance depends nonlinearly on relative death rates, relative growth rates, growth yield, and endogenous metabolism. This quantitative sensitivity analysis explains the felt need to develop growth-dependent adaptations of existing maintenance parameters, and indicates the importance of distinguishing the various nongrowth components. Future experiments should verify the sensitivity of maintenance components under cellular and environmental conditions. PMID:17333428

  10. A posteriori uncertainty quantification of PIV-based pressure data

    NASA Astrophysics Data System (ADS)

    Azijli, Iliass; Sciacchitano, Andrea; Ragni, Daniele; Palha, Artur; Dwight, Richard P.

    2016-05-01

    A methodology for a posteriori uncertainty quantification of pressure data retrieved from particle image velocimetry (PIV) is proposed. It relies upon the Bayesian framework, where the posterior distribution (the probability distribution of the true velocity, given the PIV measurements) is obtained from the prior distribution (prior knowledge of properties of the velocity field, e.g., divergence-free) and the statistical model of PIV measurement uncertainty. Once the posterior covariance matrix of the velocity is known, it is propagated through the discretized Poisson equation for pressure. Numerical assessment of the proposed method on a steady Lamb-Oseen vortex shows excellent agreement with Monte Carlo simulations, while linear uncertainty propagation underestimates the uncertainty in the pressure by up to 30%. The method is finally applied to an experimental test case of a turbulent boundary layer in air, obtained using time-resolved tomographic PIV. Simultaneously with the PIV measurements, microphone measurements were carried out at the wall. The pressure reconstructed from the tomographic PIV data is compared to the microphone measurements. Since the uncertainty of the latter is significantly smaller than that of the PIV-based pressure, this comparison yields an estimate of the true error of the reconstructed pressure. The comparison between the true error and the estimated uncertainty demonstrates the accuracy of the uncertainty estimates on the pressure. In addition, enforcing the divergence-free constraint is found to result in a significantly more accurate reconstructed pressure field. The estimated uncertainty confirms this result.
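
    The covariance-propagation step described above can be sketched on a 1D model problem: given a covariance for the source term of a discretized Poisson equation A p = f, linear propagation gives Sigma_p = A^-1 Sigma_f A^-T. The geometry, boundary conditions, and covariance below are synthetic stand-ins, not the paper's velocity-derived quantities.

    import numpy as np

    n, h = 50, 1.0 / 51
    # Standard second-difference Laplacian with homogeneous Dirichlet ends.
    A = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
         + np.diag(np.ones(n - 1), -1)) / h**2

    # Hypothetical source-term covariance: smooth correlation plus i.i.d. noise.
    x = np.linspace(0.0, 1.0, n)
    Sigma_f = (0.1 * np.exp(-((x[:, None] - x[None, :]) / 0.1) ** 2)
               + 0.01 * np.eye(n))

    Ainv = np.linalg.inv(A)
    Sigma_p = Ainv @ Sigma_f @ Ainv.T
    sigma_p = np.sqrt(np.diag(Sigma_p))   # pointwise pressure uncertainty
    print(sigma_p.max())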

  11. Uncertainty quantification for personalized analyses of human proximal femurs.

    PubMed

    Wille, Hagen; Ruess, Martin; Rank, Ernst; Yosibash, Zohar

    2016-02-29

    Computational models for the personalized analysis of human femurs contain uncertainties in bone material properties and loads, which affect the simulation results. To quantify their influence, we developed a probabilistic framework based on polynomial chaos (PC) that propagates stochastic input variables through any computational model. We considered a stochastic E-ρ relationship and a stochastic hip contact force, representing realistic variability of experimental data. Their influence on the prediction of principal strains (ϵ1 and ϵ3) was quantified for one human proximal femur, including sensitivity and reliability analysis. Large variabilities in the principal strain predictions were found in the cortical shell of the femoral neck, with coefficients of variation of ≈40%. Between 60 and 80% of the variance in ϵ1 and ϵ3 is attributable to the uncertainty in the E-ρ relationship, while ≈10% is caused by the load magnitude and 5-30% by the load direction. Principal strain directions were unaffected by material and loading uncertainties. The antero-superior and medial-inferior sides of the neck exhibited the largest probabilities for tensile and compressive failure, although all probabilities were very small (pf < 0.001). In summary, uncertainty quantification with PC has been demonstrated to efficiently and accurately describe the influence of very different stochastic inputs, which increases the credibility and explanatory power of personalized analyses of human proximal femurs. PMID:26873282

  12. Uranium quantification in semen by inductively coupled plasma mass spectrometry.

    PubMed

    Todorov, Todor I; Ejnik, John W; Guandalini, Gustavo; Xu, Hanna; Hoover, Dennis; Anderson, Larry; Squibb, Katherine; McDiarmid, Melissa A; Centeno, Jose A

    2013-01-01

    In this study we report uranium analysis for human semen samples. Uranium quantification was performed by inductively coupled plasma mass spectrometry. No additives, such as chymotrypsin or bovine serum albumin, were used for semen liquefaction, as they showed significant uranium content. For method validation we spiked 2g aliquots of pooled control semen at three different levels of uranium: low at 5 pg/g, medium at 50 pg/g, and high at 1000 pg/g. The detection limit was determined to be 0.8 pg/g uranium in human semen. The data reproduced within 1.4-7% RSD and spike recoveries were 97-100%. The uranium level of the unspiked, pooled control semen was 2.9 pg/g of semen (n=10). In addition six semen samples from a cohort of Veterans exposed to depleted uranium (DU) in the 1991 Gulf War were analyzed with no knowledge of their exposure history. Uranium levels in the Veterans' semen samples ranged from undetectable (<0.8 pg/g) to 3350 pg/g. This wide concentration range for uranium in semen is consistent with known differences in current DU body burdens in these individuals, some of whom have retained embedded DU fragments. PMID:22944582

  13. Quantification of the degree of reaction of fly ash

    SciTech Connect

    Ben Haha, M.; De Weerdt, K.; Lothenbach, B.

    2010-11-15

    The quantification of the fly ash (FA) reaction in FA-blended cements is an important parameter for understanding the effect of the fly ash on the hydration of OPC and on microstructural development. The FA reaction in two different blended OPC-FA systems was studied using a selective dissolution technique based on EDTA/NaOH, dissolution in diluted NaOH solution, the portlandite content, and backscattered electron image analysis. The amount of reacted FA determined by selective dissolution using EDTA/NaOH is subject to significant potential error, as different assumptions lead to large differences in the estimate of FA reacted. In addition, at longer hydration times, the reaction of the FA is underestimated by this method due to the presence of non-dissolved hydrates and MgO-rich particles. The dissolution of FA in diluted NaOH solution agreed well during the first days with the dissolution observed by image analysis. At 28 days and longer, the formation of hydrates in the diluted solutions leads to an underestimation. Image analysis appears to give consistent results and to be the most reliable of the techniques studied.

  14. XPS quantification of the hetero-junction interface energy

    NASA Astrophysics Data System (ADS)

    Ma, Z. S.; Wang, Yan; Huang, Y. L.; Zhou, Z. F.; Zhou, Y. C.; Zheng, Weitao; Sun, Chang Q.

    2013-01-01

    We present an approach for quantifying the heterogeneous interface bond energy using X-ray photoelectron spectroscopy (XPS). First, by analyzing the XPS core-level shifts of the elemental surfaces, we obtained the energy levels of an isolated atom and their bulk shifts for the constituent elements as references; we then measured the shifts of the specific energy levels upon interface alloy formation. Subtracting the referential spectrum from that collected from the alloy, we can distil the interface effect on the binding energy. Calibrated against the energy levels and their bulk shifts derived from the elemental surfaces, we can derive the bond energy, energy density, atomic cohesive energy, and free energy in the interface region. This approach has enabled us to clarify the dominance of quantum entrapment at the CuPd interface and the dominance of polarization at the AgPd and BeW interfaces as the origin of the interface energy change. The developed approach not only enhances the power of XPS but also enables quantification of the interface energy at the atomic scale, which has long been a challenge.

  15. Crystal Violet and XTT Assays on Staphylococcus aureus Biofilm Quantification.

    PubMed

    Xu, Zhenbo; Liang, Yanrui; Lin, Shiqi; Chen, Dingqiang; Li, Bing; Li, Lin; Deng, Yang

    2016-10-01

    Staphylococcus aureus (S. aureus) is a common food-borne pathogenic microorganism. Biofilm formation remains the major obstruction to bacterial elimination. This study aims at providing a basis for determining S. aureus biofilm formation. 257 clinical samples of S. aureus isolates were identified by routine analysis and multiplex PCR detection and found to contain 227 MRSA, 16 MSSA, 11 MRCNS, and 3 MSCNS strains. Two assays for quantification of S. aureus biofilm formation, the crystal violet (CV) assay and the XTT (tetrazolium salt reduction) assay, were optimized, evaluated, and further compared. In the CV assay, most isolates formed weak biofilm (74.3%), while the rest formed moderate biofilm (23.3%) or strong biofilm (2.3%). However, most isolates in the XTT assay showed weak metabolic activity (77.0%), while the rest showed moderate metabolic activity (17.9%) or high metabolic activity (5.1%). In this study, we found a distinct strain-to-strain dissimilarity in terms of both biomass formation and metabolic activity, and it was concluded that the two assays are mutually complementary rather than directly comparable. PMID:27324342

  16. Method for Indirect Quantification of CH4 Production via H2O Production Using Hydrogenotrophic Methanogens.

    PubMed

    Taubner, Ruth-Sophie; Rittmann, Simon K-M R

    2016-01-01

    Hydrogenotrophic methanogens are an intriguing group of microorganisms from the domain Archaea. Methanogens exhibit extraordinary ecological, biochemical, and physiological characteristics and possess a huge biotechnological potential. Yet, the only way to assess the methane (CH4) production potential of hydrogenotrophic methanogens has been gas chromatographic quantification of CH4. In order to effectively screen pure cultures of hydrogenotrophic methanogens regarding their CH4 production potential, we developed a novel method for indirect quantification of the volumetric CH4 production rate by measuring the volumetric water production rate. This method was established in serum bottles for cultivation of methanogens in closed batch cultivation mode. Water production was estimated by determining the difference in mass increase in a quasi-isobaric setting. This novel CH4 quantification method is an accurate and precise analytical technique, which can be used to rapidly screen pure cultures of methanogens regarding their volumetric CH4 evolution rate. It is a cost-effective alternative to gas chromatographic CH4 quantification, especially if applied as a high-throughput method. Eventually, the method can be universally applied for quantification of CH4 production by psychrophilic, thermophilic, and hyperthermophilic hydrogenotrophic methanogens. PMID:27199898
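
    The premise of the indirect measurement is the stoichiometry of hydrogenotrophic methanogenesis, CO2 + 4 H2 -> CH4 + 2 H2O, so two moles of water accompany each mole of methane. A minimal sketch of that conversion is given below; the function name and example numbers are illustrative, not taken from the paper.

```python
# Minimal sketch: infer the volumetric CH4 production rate from measured
# water production, assuming the hydrogenotrophic stoichiometry
#   CO2 + 4 H2 -> CH4 + 2 H2O
# (two moles of H2O per mole of CH4). All numbers are illustrative.

M_H2O = 18.015  # g/mol, molar mass of water

def ch4_rate_from_water(delta_mass_g: float, hours: float, volume_L: float) -> float:
    """Volumetric CH4 evolution rate (mmol L^-1 h^-1) from water mass gain."""
    n_h2o = delta_mass_g / M_H2O   # mol H2O produced
    n_ch4 = n_h2o / 2.0            # stoichiometry: 2 H2O per CH4
    return n_ch4 * 1000.0 / (volume_L * hours)

# Example: 0.36 g of water gained over 24 h in a 0.05 L liquid culture
print(f"{ch4_rate_from_water(0.36, 24.0, 0.05):.2f} mmol L^-1 h^-1")
```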

  17. freeQuant: A Mass Spectrometry Label-Free Quantification Software Tool for Complex Proteome Analysis

    PubMed Central

    Li, Zhenye; Pan, Chao; Duan, Huilong

    2015-01-01

    The study of complex proteomes places higher demands on mass spectrometry-based quantification methods. In this paper, we present a mass spectrometry label-free quantification tool for complex proteomes, called freeQuant, which effectively integrates quantification with functional analysis. freeQuant consists of two well-integrated modules: label-free quantification and functional analysis with biomedical knowledge. freeQuant supports label-free quantitative analysis which makes full use of tandem mass spectrometry (MS/MS) spectral counts, protein sequence length, shared peptides, and ion intensity. It adopts spectral counting for quantitative analysis and implements a new method for shared peptides to accurately evaluate the abundance of isoforms. For proteins with low abundance, the MS/MS total ion count coupled with the spectral count is included to ensure accurate protein quantification. Furthermore, freeQuant supports large-scale functional annotation for complex proteomes. Mitochondrial proteomes from the mouse heart, the mouse liver, and the human heart were used to evaluate the usability and performance of freeQuant. The evaluation showed that the quantitative algorithms implemented in freeQuant improve the accuracy of quantification with a better dynamic range. PMID:26665161
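
    The abstract does not spell out freeQuant's algorithms. As a generic illustration of length-normalized spectral-count quantification of the kind described, a normalized spectral abundance factor (NSAF) can be computed as follows; this is a sketch of the general technique, not freeQuant's implementation, and the protein names are hypothetical.

```python
# Generic length-normalized spectral-count metric (NSAF), shown only to
# illustrate the kind of label-free quantification the abstract describes;
# this is NOT freeQuant's algorithm. For protein i with spectral count
# SpC_i and sequence length L_i: NSAF_i = (SpC_i/L_i) / sum_j(SpC_j/L_j).

def nsaf(spectral_counts: dict, lengths: dict) -> dict:
    saf = {p: spectral_counts[p] / lengths[p] for p in spectral_counts}
    total = sum(saf.values())
    return {p: v / total for p, v in saf.items()}

counts = {"P1": 120, "P2": 45, "P3": 9}      # hypothetical proteins
lengths = {"P1": 450, "P2": 300, "P3": 150}  # residues
for prot, frac in nsaf(counts, lengths).items():
    print(f"{prot}: NSAF = {frac:.3f}")
```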

  18. Quantification Bias Caused by Plasmid DNA Conformation in Quantitative Real-Time PCR Assay

    PubMed Central

    Lin, Chih-Hui; Chen, Yu-Chieh; Pan, Tzu-Ming

    2011-01-01

    Quantitative real-time PCR (qPCR) is the gold standard for the quantification of specific nucleic acid sequences. However, a serious concern has been raised in a recent report: supercoiled plasmid standards cause significant overestimation in qPCR quantification. In this study, we investigated the effect of plasmid DNA conformation on the quantification of DNA and the efficiency of qPCR. Our results suggest that plasmid DNA conformation has a significant impact on the accuracy of absolute quantification by qPCR. DNA standard curves shifted significantly among plasmid standards with different DNA conformations. Moreover, the choice of DNA measurement method and plasmid DNA conformation may also contribute to the measurement error of DNA standard curves. Due to the multiple effects of plasmid DNA conformation on the accuracy of qPCR, efforts should be made to assure the highest consistency of plasmid standards for qPCR. Thus, we suggest that the conformation, preparation, quantification, purification, handling, and storage of standard plasmid DNA be described and defined in the Minimum Information for Publication of Quantitative Real-Time PCR Experiments (MIQE) to assure the reproducibility and accuracy of qPCR absolute quantification. PMID:22194997
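
    For context, absolute quantification reads unknowns off a standard curve of Cq versus log10 copy number built from plasmid dilutions; the conformation effects described above shift exactly this curve. A minimal sketch with illustrative numbers:

```python
# Minimal sketch of qPCR absolute quantification against a plasmid
# standard curve: fit Cq vs. log10(copies), then invert for unknowns.
# Numbers are illustrative; real curves require the conformationally
# consistent standards argued for above.
import numpy as np

log_copies = np.array([3.0, 4.0, 5.0, 6.0, 7.0])  # standard dilutions
cq = np.array([31.1, 27.8, 24.4, 21.1, 17.7])     # measured Cq values

slope, intercept = np.polyfit(log_copies, cq, 1)
efficiency = 10.0 ** (-1.0 / slope) - 1.0          # ~1.0 means 100%

def copies_from_cq(cq_unknown: float) -> float:
    return 10.0 ** ((cq_unknown - intercept) / slope)

print(f"slope = {slope:.2f}, efficiency = {efficiency:.1%}, "
      f"unknown at Cq 25.0 ~ {copies_from_cq(25.0):.2e} copies")
```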

  19. PSAQ™ standards for accurate MS-based quantification of proteins: from the concept to biomedical applications.

    PubMed

    Picard, Guillaume; Lebert, Dorothée; Louwagie, Mathilde; Adrait, Annie; Huillet, Céline; Vandenesch, François; Bruley, Christophe; Garin, Jérôme; Jaquinod, Michel; Brun, Virginie

    2012-10-01

    Absolute protein quantification, i.e. determining protein concentrations in biological samples, is essential to our understanding of biological and physiopathological phenomena. Protein quantification methods based on the use of antibodies are very effective and widely used. However, over the last ten years, absolute protein quantification by mass spectrometry has attracted considerable interest, particularly for the study of systems biology and as part of biomarker development. This interest is mainly linked to the high multiplexing capacity of MS analysis and to the availability of stable-isotope-labelled standards for quantification. This article describes in detail how to produce, quality-control, and use a specific type of standard: Protein Standard Absolute Quantification (PSAQ™) standards. These standards are whole isotopically labelled proteins, analogues of the proteins to be assayed. PSAQ standards can be added early during sample treatment; thus they can correct for protein losses during sample prefractionation and for incomplete sample digestion. Because of this, quantification of target proteins is very accurate and precise using these standards. To illustrate the advantages of the PSAQ method, and to contribute to the increase in its use, selected applications in the biomedical field are detailed here. PMID:23019168

  1. Christiansen Revisited: Rethinking Quantification of Uniformity in Rainfall Simulator Studies

    NASA Astrophysics Data System (ADS)

    Green, Daniel; Pattison, Ian

    2016-04-01

    Rainfall simulators, whether based in a laboratory or field setting, are used extensively within a number of fields of research, including plot-scale runoff, infiltration and erosion studies, irrigation and crop management, and scaled investigations into urban flooding. Rainfall simulators offer a number of benefits, including the ability to create regulated and repeatable rainfall characteristics (e.g. intensity, duration, drop size distribution and kinetic energy) without relying on unpredictable natural precipitation regimes. Ensuring and quantifying spatially uniform simulated rainfall across the entirety of the plot area is of particular importance to researchers undertaking rainfall simulation. As a result, numerous studies have focused on the quantification and improvement of uniformity values. Several statistical methods for the assessment of rainfall simulator uniformity have been developed. However, the Christiansen Uniformity Coefficient (CUC) suggested by Christiansen (1942) is most frequently used. Despite this, there is no set methodology, and researchers can adapt or alter factors such as the quantity, spacing, and location of the measuring beakers used to derive CUC values. Because CUC values are highly sensitive to the resolution of the data, i.e. the number of observations taken, many densely distributed measuring containers subjected to the same experimental conditions may generate a significantly lower CUC value than fewer, more sparsely distributed measuring containers. Thus, the simulated rainfall under a higher resolution sampling method could appear less uniform than when using a coarser resolution sampling method, despite being derived from the same initial rainfall conditions. Expressing entire plot uniformity as a single, simplified percentage value disregards valuable qualitative information about plot uniformity, such as the small-scale spatial distribution of rainfall over the plot surface and whether these
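
    The coefficient itself is conventionally computed from the container catches x_i as CUC = 100 * (1 - sum|x_i - xbar| / sum x_i). The sketch below also illustrates the resolution sensitivity discussed above by sampling one synthetic rainfall field coarsely and densely; all data are hypothetical.

```python
# Christiansen Uniformity Coefficient (Christiansen, 1942):
#   CUC = 100 * (1 - sum(|x_i - mean(x)|) / sum(x_i))
# over the catches x_i of the measuring containers. The synthetic field
# below is hypothetical; it only demonstrates the resolution sensitivity.
import numpy as np

def cuc(catches) -> float:
    x = np.asarray(catches, dtype=float)
    return 100.0 * (1.0 - np.abs(x - x.mean()).sum() / x.sum())

rng = np.random.default_rng(0)
field = rng.normal(10.0, 1.5, size=(32, 32)).clip(min=0.0)  # mm of catch

coarse = field[::8, ::8].ravel()  # 16 sparsely spaced beakers
dense = field.ravel()             # 1024 densely spaced beakers
print(f"CUC (coarse sampling): {cuc(coarse):.1f}%")
print(f"CUC (dense sampling):  {cuc(dense):.1f}%")
```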

  2. Bacterial adhesion force quantification by fluidic force microscopy

    NASA Astrophysics Data System (ADS)

    Potthoff, Eva; Ossola, Dario; Zambelli, Tomaso; Vorholt, Julia A.

    2015-02-01

    Quantification of detachment forces between bacteria and substrates facilitates the understanding of the bacterial adhesion process that affects cell physiology and survival. Here, we present a method that allows for serial, single bacterial cell force spectroscopy by combining the force control of atomic force microscopy with microfluidics. Reversible bacterial cell immobilization under physiological conditions on the pyramidal tip of a microchanneled cantilever is achieved by underpressure. Using the fluidic force microscopy technology (FluidFM), we achieve immobilization forces greater than those of state-of-the-art cell-cantilever binding as demonstrated by the detachment of Escherichia coli from polydopamine with recorded forces between 4 and 8 nN for many cells. The contact time and setpoint dependence of the adhesion forces of E. coli and Streptococcus pyogenes, as well as the sequential detachment of bacteria out of a chain, are shown, revealing distinct force patterns in the detachment curves. This study demonstrates the potential of the FluidFM technology for quantitative bacterial adhesion measurements of cell-substrate and cell-cell interactions that are relevant in biofilms and infection biology.

  3. Quantification of structural distortions in the transmembrane helices of GPCRs.

    PubMed

    Deupi, Xavier

    2012-01-01

    A substantial part of the structural and much of the functional information about G protein-coupled receptors (GPCRs) comes from studies on rhodopsin. Thus, analysis tools for detailed structure comparison are key to seeing to what extent this information can be extended to other GPCRs. Among the methods to evaluate protein structures and, in particular, helix distortions, HELANAL has the advantage that it provides data (local bend and twist angles) that can be easily translated into structural effects, such as a local opening/tightening of the helix. In this work I show how HELANAL can be used to extract detailed structural information on the transmembrane bundle of GPCRs, and I provide some examples of how these data can be interpreted to study basic principles of protein structure, to compare homologous proteins, and to study mechanisms of receptor activation. Also, I show how, in combination with the sequence analysis tools provided by the program GMoS, distortions in individual receptors can be put in the context of the whole Class A GPCR family. Specifically, quantification of the strong proline-induced distortions in the transmembrane bundle of rhodopsin shows that they are not standard proline kinks. Moreover, the helix distortions in transmembrane helix (TMH) 5 and TMH 6 of rhodopsin are also present in the rest of the GPCR crystal structures obtained so far, and thus rhodopsin-based homology models have modeled these strongly distorted helices correctly. While in some cases the inherent "rhodopsin bias" of many of the GPCR models to date has not been a disadvantage, the availability of more templates will clearly result in better homology models. This type of analysis can, of course, be applied to any protein, and it may be particularly useful for the structural analysis of other membrane proteins. A detailed knowledge of the local structural changes related to ligand binding and how they are translated into larger-scale movements of transmembrane domains is key to

  4. Identification and Quantification of Volatile Organic Compounds at a Dairy

    NASA Astrophysics Data System (ADS)

    Filipy, J.; Mount, G.; Westberg, H.; Rumburg, B.

    2003-12-01

    Livestock operations in the United States are an escalating environmental concern. The increasing density of livestock within a farm results in increased emission of odorous gases, which have gained considerable attention from the public in recent years (National Research Council (NRC), 2002). Odorous compounds such as ammonia (NH3), volatile organic compounds (VOCs), and hydrogen sulfide (H2S) were reported to have a major effect on the quality of life of local residents living near livestock facilities (NRC, 2002). Little data has been collected on the identification and quantification of gaseous compounds from open-stall dairy operations in the United States. The research presented here identifies and quantifies VOCs produced from a dairy operation that contribute to odor and other air quality problems. Many different VOCs were identified in the air downwind of an open lactating cow stall area and near a waste lagoon at the Washington State University dairy using gas chromatography-mass spectrometry (GC-MS) analysis techniques. The identified compounds were very diverse and included many alcohols, aldehydes, amines, aromatics, esters, ethers, a fixed gas, halogenated hydrocarbons, hydrocarbons, ketones, other nitrogen-containing compounds, sulfur-containing compounds, and terpenes. The VOCs directly associated with cattle waste were dependent on ambient temperature, with the highest emissions produced during the summer months. Low to moderate wind speeds were ideal for VOC collection. Concentrations of quantified compounds were mostly below the odor detection thresholds found in the literature; however, the combined odor magnitude of the large number of compounds detected was most likely above any minimum detection threshold.

  5. Uncertainty Quantification of Calculated Temperatures for the AGR-1 Experiment

    SciTech Connect

    Binh T. Pham; Jeffrey J. Einerson; Grant L. Hawkes

    2012-04-01

    This report documents an effort to quantify the uncertainty of the calculated temperature data for the first Advanced Gas Reactor (AGR-1) fuel irradiation experiment conducted in the INL's Advanced Test Reactor (ATR) in support of the Next Generation Nuclear Plant (NGNP) R&D program. Recognizing uncertainties inherent in physics and thermal simulations of the AGR-1 test, the results of the numerical simulations can be used in combination with statistical analysis methods to improve qualification of measured data. Additionally, the temperature simulation data for AGR tests can be used for validation of the fuel transport and fuel performance simulation models. The crucial role of the calculated fuel temperatures in ensuring achievement of the AGR experimental program objectives requires accurate determination of the model temperature uncertainties. The report is organized into three chapters. Chapter 1 introduces the AGR Fuel Development and Qualification program and provides overviews of AGR-1 measured data, AGR-1 test configuration and test procedure, and thermal simulation. Chapter 2 describes the uncertainty quantification procedure for temperature simulation data of the AGR-1 experiment, namely: (i) identify and quantify uncertainty sources; (ii) perform sensitivity analysis for several thermal test conditions; (iii) use uncertainty propagation to quantify the overall response temperature uncertainty. A set of issues associated with modeling uncertainties resulting from the expert assessments is identified. This also includes the experimental design to estimate the main effects and interactions of the important thermal model parameters. Chapter 3 presents the overall uncertainty results for the six AGR-1 capsules. This includes uncertainties for the daily volume-average and peak fuel temperatures, daily average temperatures at TC locations, and time-average volume-average and time-average peak fuel temperatures.
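
    As a generic illustration of step (iii), independent 1-sigma uncertainty sources are commonly combined in quadrature; the source names and magnitudes below are hypothetical, not AGR-1 values.

```python
# Generic illustration of uncertainty propagation (step iii above):
# independent 1-sigma temperature uncertainty sources combined in
# quadrature. Source names and magnitudes are hypothetical, not AGR-1
# values.
import math

sources_C = {
    "fuel thermal conductivity": 18.0,
    "gas-gap dimensions": 12.0,
    "heat rate (neutronics)": 9.0,
    "coolant temperature": 4.0,
}

total = math.sqrt(sum(u ** 2 for u in sources_C.values()))
for name, u in sorted(sources_C.items(), key=lambda kv: -kv[1]):
    print(f"{name:28s} {u:5.1f} C  ({(u / total) ** 2:6.1%} of variance)")
print(f"{'combined (quadrature)':28s} {total:5.1f} C")
```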

  6. Quantification of plant chlorophyll content using Google Glass.

    PubMed

    Cortazar, Bingen; Koydemir, Hatice Ceylan; Tseng, Derek; Feng, Steve; Ozcan, Aydogan

    2015-04-01

    Measuring plant chlorophyll concentration is a well-known and commonly used method in agriculture and environmental applications for monitoring plant health, which also correlates with many other plant parameters including, e.g., carotenoids, nitrogen, maximum green fluorescence, etc. Direct chlorophyll measurement using chemical extraction is destructive, complex and time-consuming, which has led to the development of mobile optical readers, providing non-destructive but at the same time relatively expensive tools for evaluation of plant chlorophyll levels. Here we demonstrate accurate measurement of chlorophyll concentration in plant leaves using Google Glass and a custom-developed software application together with a cost-effective leaf holder and multi-spectral illuminator device. Two images, taken using Google Glass, of a leaf placed in our portable illuminator device under red and white (i.e., broadband) light-emitting-diode (LED) illumination are uploaded to our servers for remote digital processing and chlorophyll quantification, with results returned to the user in less than 10 seconds. Intensity measurements extracted from the uploaded images are mapped against gold-standard colorimetric measurements made through a commercially available reader to generate calibration curves for plant leaf chlorophyll concentration. Using five plant species to calibrate our system, we demonstrate that our approach can accurately and rapidly estimate chlorophyll concentration of fifteen different plant species under both indoor and outdoor lighting conditions. This Google Glass based chlorophyll measurement platform can display the results in spatiotemporal and tabular forms and would be highly useful for monitoring of plant health in environmental and agriculture related applications, including e.g., urban plant monitoring, indirect measurements of the effects of climate change, and as an early indicator for water, soil, and air quality degradation. PMID:25669673

  7. Automated Drusen Segmentation and Quantification in SD-OCT Images

    PubMed Central

    Chen, Qiang; Leng, Theodore; Zheng, Luoluo; Kutzscher, Lauren; Ma, Jeffrey; de Sisternes, Luis; Rubin, Daniel L.

    2013-01-01

    Spectral domain optical coherence tomography (SD-OCT) is a useful tool for the visualization of drusen, a retinal abnormality seen in patients with age-related macular degeneration (AMD); however, objective assessment of drusen is thwarted by the lack of a method to robustly quantify these lesions on serial OCT images. Here, we describe an automatic drusen segmentation method for SD-OCT retinal images, which leverages a priori knowledge of normal retinal morphology and anatomical features. The highly reflective and locally connected pixels located below the retinal nerve fiber layer (RNFL) are used to generate a segmentation of the retinal pigment epithelium (RPE) layer. The observed and expected contours of the RPE layer are obtained by interpolating and fitting the shape of the segmented RPE layer, respectively. The areas located between the interpolated and fitted RPE shapes (which have nonzero area when drusen occur) are marked as drusen. To enhance drusen quantification, we also developed a novel method of retinal projection to generate an en face retinal image based on the RPE extraction, which improves the quality of drusen visualization over the current approach of producing retinal projections from SD-OCT images based on a summed-voxel projection (SVP), and it provides a means of obtaining quantitative features of drusen in the en face projection. Visualization of the segmented drusen is refined through several post-processing steps: drusen detection to eliminate false-positive detections on consecutive slices, drusen refinement on a projection view of drusen, and drusen smoothing. Experimental evaluation results demonstrate that our method is effective for drusen segmentation. In a preliminary analysis of the potential clinical utility of our methods, quantitative drusen measurements, such as area and volume, can be correlated with drusen progression in non-exudative AMD, suggesting that our approach may produce useful quantitative imaging biomarkers.

  8. Concepts and Practice of Verification, Validation, and Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Oberkampf, W. L.

    2014-12-01

    Verification and validation (V&V) are the primary means to assess the numerical and physics modeling accuracy, respectively, in computational simulation. Code verification assesses the reliability of the software coding and the numerical algorithms used in obtaining a solution, while solution verification addresses numerical error estimation of the computational solution of a mathematical model for a specified set of initial and boundary conditions. Validation assesses the accuracy of the mathematical model as compared to experimentally measured response quantities of the system being modeled. As these experimental data are typically available only for simplified subsystems or components of the system, model validation commonly provides limited ability to assess model accuracy directly. Uncertainty quantification (UQ), specifically in regard to predictive capability of a mathematical model, attempts to characterize and estimate the total uncertainty for conditions where no experimental data are available. Specific sources of uncertainty that can impact the total predictive uncertainty are: the assumptions and approximations in the formulation of the mathematical model, the error incurred in the numerical solution of the discretized model, the information available for stochastic input data for the system, and the extrapolation of the mathematical model to conditions where no experimental data are available. This presentation will briefly discuss the principles and practices of VVUQ from both the perspective of computational modeling and simulation, as well as the difficult issue of estimating predictive capability. Contrasts will be drawn between weak and strong code verification testing, and model validation as opposed to model calibration. Closing remarks will address what needs to be done to improve the value of information generated by computational simulation for improved decision-making.
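
    As a concrete example of solution verification, the discretization error of a solution functional can be estimated from three systematically refined grids via Richardson extrapolation and a grid convergence index (GCI); the sketch below uses illustrative values, not results from the presentation.

```python
# Solution-verification sketch: observed order of accuracy, Richardson
# extrapolation, and a grid convergence index (GCI, safety factor 1.25)
# from three systematically refined grids. Values are illustrative.
import math

f_coarse, f_medium, f_fine = 1.2260, 1.2055, 1.2005  # solution functional
r = 2.0                                               # refinement ratio

p = math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)
f_extrapolated = f_fine + (f_fine - f_medium) / (r ** p - 1.0)
gci_fine = 1.25 * abs((f_fine - f_medium) / f_fine) / (r ** p - 1.0)

print(f"observed order of accuracy p = {p:.2f}")
print(f"extrapolated value ~ {f_extrapolated:.4f}")
print(f"fine-grid GCI ~ {gci_fine:.2%}")
```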

  9. An anatomically realistic brain phantom for quantification with positron tomography

    SciTech Connect

    Wong, D.F.; Links, J.M.; Molliver, M.E.; Hengst, T.C.; Clifford, C.M.; Buhle, L.; Bryan, M.; Stumpf, M.; Wagner, H.N. Jr.

    1984-01-01

    Phantom studies are useful in assessing and maximizing the accuracy and precision of quantification of absolute activity, assessing errors associated with patient positioning, and dosimetry. Most phantoms are limited by the use of simple shapes, which do not adequately reflect real anatomy. The authors have constructed an anatomically realistic, life-size brain phantom for positron tomography studies. The phantom consists of separately fillable R + L caudates, R + L putamens, R + L globus pallidus, and cerebellum. These structures are contained in proper anatomic orientation within a fillable cerebrum. Solid ventricles are also present. The entire clear vinyl cerebrum is placed in a human skull. The internal brain structures were fabricated from polyester resin, with dimensions, shapes, and sizes of the structures obtained from digitized contours of brain slices in the U.C.S.D. computerized brain atlas. The structures were filled with known concentrations of Ga-68 in water and scanned with our NeuroECAT. The phantom was aligned in the scanner for each structure such that the tomographic slice passed through that structure's center. After calibration of the scanner with a standard phantom for the counts/pixel to uCi/cc conversion, the measured activity concentrations were compared with the actual concentrations. The ratio of measured to actual activity concentration (''recovery coefficient'') for the caudate was 0.33; for the putamen, 0.42. For comparison, the ratios for spheres of diameters 9.5, 16, 19, and 25.4 mm were 0.23, 0.54, 0.81, and 0.93. This phantom provides a more realistic assessment of performance and allows calculation of correction factors.
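
    The reported recovery coefficients invert directly into a first-order partial-volume correction, which is how such phantom-derived correction factors are typically applied; a short sketch (the measured example value is hypothetical):

```python
# First-order partial-volume correction using the recovery coefficients
# (RC = measured/actual) characterized with the phantom. RC values are
# taken from the abstract; the measured example value is hypothetical.

recovery = {"caudate": 0.33, "putamen": 0.42}

def true_concentration(measured_uCi_cc: float, structure: str) -> float:
    """Correct a measured activity concentration for partial-volume loss."""
    return measured_uCi_cc / recovery[structure]

print(f"{true_concentration(0.50, 'caudate'):.2f} uCi/cc actual")
```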

  10. Uncertainty Quantification for CO2-Enhanced Oil Recovery

    NASA Astrophysics Data System (ADS)

    Dai, Z.; Middleton, R.; Bauman, J.; Viswanathan, H.; Fessenden-Rahn, J.; Pawar, R.; Lee, S.

    2013-12-01

    CO2-Enhanced Oil Recovery (EOR) is currently an option for permanently sequestering CO2 in oil reservoirs while economically increasing oil/gas production. In this study we have developed a framework for understanding CO2 storage potential within an EOR-sequestration environment at the Farnsworth Unit of the Anadarko Basin in northern Texas. By coupling an EOR tool, SENSOR (CEI, 2011), with an uncertainty quantification tool, PSUADE (Tong, 2011), we conduct an integrated Monte Carlo simulation of water, oil/gas components, and CO2 flow and reactive transport in the heterogeneous Morrow formation to identify the key controlling processes and optimal parameters for CO2 sequestration and EOR. A global sensitivity and response surface analysis is conducted with PSUADE to build numerically the relationship among CO2 injectivity, oil/gas production, reservoir parameters, and the distance between injection and production wells. The results indicate that the reservoir permeability and porosity are the key parameters controlling the CO2 injection and the oil and gas (CH4) recovery rates. The distance between the injection and production wells has a large impact on the oil and gas recovery and net CO2 injection rates. The CO2 injectivity increases with increasing reservoir permeability and porosity. The distance between injection and production wells is the key parameter for designing an EOR pattern (such as a five- (or nine-) spot pattern). The optimal distance for a five-spot-pattern EOR at this site is estimated from the response surface analysis to be around 400 meters. Next, we are building the machinery into our risk assessment framework CO2-PENS to utilize these response surfaces and evaluate the operational risk for CO2 sequestration and EOR at this site.
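
    A response-surface optimum of the kind described can be illustrated by fitting a quadratic surrogate of recovery versus well spacing to Monte Carlo results and taking its maximizer; the numbers below are hypothetical, not Farnsworth Unit data.

```python
# Sketch of a response-surface optimum: fit a quadratic surrogate of oil
# recovery vs. well spacing to Monte Carlo results and take its vertex.
# All numbers are hypothetical, not Farnsworth Unit data.
import numpy as np

spacing_m = np.array([200.0, 300.0, 400.0, 500.0, 600.0])
recovery = np.array([0.21, 0.27, 0.30, 0.29, 0.24])  # simulated recovery

a, b, c = np.polyfit(spacing_m, recovery, 2)  # quadratic response surface
optimal_spacing = -b / (2.0 * a)              # vertex of the parabola
print(f"estimated optimal spacing ~ {optimal_spacing:.0f} m")
```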

  11. Quantification of confocal images of human corneal endothelium

    NASA Astrophysics Data System (ADS)

    Laird, Jeffery A.; Beuerman, Roger W.; Kaufman, Stephen C.

    1996-05-01

    Real-time, in vivo, confocal microscopic examination permits visualization of human corneal endothelium cells as bright bodies organized into a densely packed hexagonal arrangement. Quantification of endothelial cell number would be useful in assessing the condition of this cell layer in various disease states. In this study, we sought to use an image analysis method developed in this laboratory that utilizes digital filtering techniques and morphological operations to determine the boundaries of each cell. Images were corrected to establish a uniform luminance level, and then convolved by various matrices until distinct peaks in luminance value were identified. These peaks were used as seed points from which cell boundaries were recursively expanded until they collided with other cell boundaries. This method automatically counts the number of cells and determines the size and position of each cell. The resulting histograms of cell size are readily indicative of changes in cellular density, cell loss, and deviation from uniform arrangement. The numbers of cells counted by this method are consistently within 3% of the numbers counted manually. Results relating cell counts obtained by manual and computerized methods are as follows: 200/184; 276/262; 87/87; 234/232; 236/232; 299/297; 145/147; 119/122; 237/243; 119/119; 245/253; 189/193. Thus, confocal microscopy coupled with these image analysis and statistical procedures provides an accurate quantitative approach to monitoring the endothelium under normal, pathological, and experimental conditions, such as those following surgery and trauma or in the evaluation of the efficacy of topical therapeutic agents.

  12. Simultaneous quantification and qualification of synacthen in plasma.

    PubMed

    Chaabo, Ayman; de Ceaurriz, Jacques; Buisson, Corinne; Tabet, Jean-Claude; Lasne, Françoise

    2011-02-01

    Tetracosactide (Synacthen), a synthetic analogue of adrenocorticotropic hormone (ACTH), can be used as a doping agent to increase the secretion of glucocorticoids by the adrenal glands. The only published method for anti-doping control of this drug in plasma relies on purification by immunoaffinity chromatography and LC/MS/MS analysis. Its limit of detection is 300 pg/mL, which corresponds to the peak value observed 12 h after 1 mg Synacthen IM administration. We report here a more sensitive method based on preparation of plasma by cation exchange chromatography and solid-phase extraction, and analysis by LC/MS/MS with positive-mode electrospray ionization using 7-38 ACTH as the internal standard. Identification of Synacthen was performed using two product ions, m/z 671.5 and m/z 223.0, from the parent [M + 5H](5+) ion, m/z 587.4. The recovery was estimated at 70%. A linear calibration curve was obtained from 25 to 600 pg/mL (R² > 0.99). The lower limit of detection was 8 pg/mL (S/N > 3). The lower limit of quantification was 15 pg/mL (S/N > 10; CV% < 20%). The performance of the method was illustrated by an 8-h kinetic analysis of plasma samples from nine subjects who received IM injections of either Synacthen® (five subjects) or Synacthen® Depot, the slow-release form of the drug (four subjects). Concentrations of Synacthen between 16 and 310 pg/mL were observed. A sensitive method for quantitation of Synacthen in plasma is proposed for anti-doping control analyses. PMID:21170520

  13. Improved Uncertainty Quantification in Groundwater Flux Estimation Using GRACE

    NASA Astrophysics Data System (ADS)

    Reager, J. T., II; Rao, P.; Famiglietti, J. S.; Turmon, M.

    2015-12-01

    Groundwater change is difficult to monitor over large scales. One of the most successful approaches is the remote sensing of time-variable gravity using NASA Gravity Recovery and Climate Experiment (GRACE) mission data, and successful case studies have created the opportunity to move towards a global groundwater monitoring framework for the world's largest aquifers. To achieve these estimates, several approximations are applied, including those in GRACE processing corrections, the formulation of the formal GRACE errors, destriping and signal recovery, and the numerical model estimation of the snow water, surface water, and soil moisture storage states used to isolate a groundwater component. A major weakness in these approaches is inconsistency: different studies have used different sources of primary and ancillary data and may achieve different results based on alternative choices in these approximations. In this study, we present two cases of groundwater change estimation, in California and the Colorado River basin, selected for their good data availability and varied climates. We achieve a robust numerical estimate of post-processing uncertainties resulting from land-surface model structural shortcomings and model resolution errors. Groundwater variations should demonstrate less variability than the overlying soil moisture state does, as groundwater has a longer memory of past events due to buffering by infiltration and drainage rate limits. We apply a model ensemble approach in a Bayesian framework constrained by the assumption of decreasing signal variability with depth in the soil column. We also discuss time-variable vs. time-constant errors, across-scale vs. across-model errors, and error spectral content (across scales and across models). More robust uncertainty quantification for GRACE-based groundwater estimates would take all of these issues into account, allowing for fairer use in management applications and for better integration of GRACE
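
    The groundwater component is typically isolated from GRACE total water storage (TWS) as a water-budget residual, dGW = dTWS - dSM - dSWE - dSW; a sketch with hypothetical monthly anomalies:

```python
# Water-budget residual commonly used to isolate groundwater from GRACE
# total water storage (TWS): dGW = dTWS - dSM - dSWE - dSW. All monthly
# anomalies below are hypothetical (cm equivalent water height).
import numpy as np

dTWS = np.array([2.1, 0.4, -1.8, -3.2])  # GRACE TWS anomaly
dSM = np.array([1.0, 0.2, -0.9, -1.1])   # soil moisture (land-surface model)
dSWE = np.array([0.6, 0.1, -0.2, 0.0])   # snow water equivalent
dSW = np.array([0.2, 0.0, -0.1, -0.3])   # surface water

dGW = dTWS - dSM - dSWE - dSW
print("groundwater anomaly (cm):", np.round(dGW, 2))

# If component errors were independent they would combine in quadrature;
# the model-structure error treated above is what breaks this assumption.
sigma = np.sqrt(1.0**2 + 0.8**2 + 0.3**2 + 0.2**2)  # hypothetical 1-sigma
print(f"residual 1-sigma error ~ {sigma:.2f} cm")
```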

  14. Local quantification of numerically-induced mixing and dissipation

    NASA Astrophysics Data System (ADS)

    Klingbeil, Knut; Mohammadi-Aragh, Mahdi; Gräwe, Ulf; Burchard, Hans

    2016-04-01

    The discretisation of the advection terms in transport equations introduces truncation errors in numerical models. These errors are usually associated with spurious diffusion, i.e. numerically-induced mixing of the advected quantities or dissipation of kinetic energy associated with the advection of momentum. The numerically-induced diapycnal mixing part is especially problematic for realistic model simulations. Since any diapycnal mixing of temperature and salinity increases the reference potential energy (RPE), numerically-induced mixing is often quantified in terms of RPE. However, this global bulk measure does not provide any information about the local amount of numerically-induced mixing of a single advected quantity. In this talk we will present a recently developed analysis method that quantifies the numerically-induced mixing of a single advected quantity locally (Klingbeil et al., 2014). The method is based on the local tracer variance decay in terms of variance fluxes associated with the corresponding advective tracer fluxes. Because of its physically sound definition, this analysis method provides a reliable diagnostic tool, e.g., to assess the performance of advection schemes and to identify hotspots of numerically-induced mixing. At these identified positions the model could be adapted in terms of resolution or the applied numerical schemes. In this context we will demonstrate how numerically-induced mixing of temperature and salinity can be substantially reduced by vertical meshes adapting towards stratification. Reference: Klingbeil, K., M. Mohammadi-Aragh, U. Gräwe, H. Burchard (2014). Quantification of spurious dissipation and mixing -- Discrete Variance Decay in a Finite-Volume framework. Ocean Modelling. doi:10.1016/j.ocemod.2014.06.001.

  15. Quantification of immunocompetent cells in testicular germ cell tumours.

    PubMed

    Torres, A; Casanova, J F; Nistal, M; Regadera, J

    1997-01-01

    The immunocompetent cells present in the different histological patterns of 43 testicular germ cell tumours were evaluated. CD3+ and CD45RO+ (UCHL1+) T lymphocytes, CD68+ and MAC 387+ macrophages, CD20+ (L26+) B lymphocytes, and kappa+ and lambda+ plasma cells were counted. The number of immunocompetent cells per mm2 of tumour tissue, excluding the necrotic areas, was evaluated. Microscopic fields were randomly selected by two observers. In order to guarantee randomization, each surface was divided into parts numbered through a lattice, and fields were chosen via a random numbers table. This procedure yielded significantly different counts from those obtained by subjective selection. The number of T lymphocytes and macrophages was higher in seminomas than in the non-seminomatous testicular germ cell tumours (P < 0.05). Embryonal carcinomas had more T lymphocytes than immature teratomas. No significant differences were found among testicular germ cell tumours with regard to the B lymphocytes, with the exception of the high number of B lymphocytes in mature teratomas. Kappa+ and lambda+ plasma cells were few in the testicular germ cell tumours. Randomization in the quantification of immunocompetent cells in testicular germ cell tumours is a good means for evaluating the immune response across the whole tumour mass, not only in the areas with the most intense inflammatory cell infiltrate, and permits comparison of testicular germ cell tumours with other malignant tumours. Study of immunocompetent cells in every histological type of testicular germ cell tumour is useful for comparing them with other extra-testicular germ cell tumours. PMID:9023554

  16. Combination Radioimmunotherapy Approaches and Quantification of Immuno-PET.

    PubMed

    Kim, Jin Su

    2016-06-01

    Monoclonal antibodies (mAbs), which play a prominent role in cancer therapy, can interact with specific antigens on cancer cells, thereby enhancing the patient's immune response via various mechanisms, or they can act against cell growth factors and thereby arrest the proliferation of tumor cells. Radionuclide-labeled mAbs, which are used in radioimmunotherapy (RIT), are effective for cancer treatment because tumor-associated mAbs linked to cytotoxic radionuclides can selectively bind to tumor antigens and release targeted cytotoxic radiation. Immunological positron emission tomography (immuno-PET), the combination of PET with mAbs, is an attractive option for improving tumor detection and mAb quantification. However, RIT remains a challenge because of the limited delivery of mAbs into tumors. The transport and uptake of mAbs into tumors is slow and heterogeneous, and the tumor microenvironment contributes to this limited delivery. Mechanical barriers such as collagen distribution, and physiological barriers such as high interstitial pressure or the absence of lymphatic vessels, can prevent the mAb from reaching the tumor at a potentially lethal concentration. When α-emitter-labeled mAbs are used, deep penetration of the mAb inside tumors is even more important because of the short range of the α emitter. Therefore, combination therapy strategies aimed at improving mAb tumor penetration and accumulation would be beneficial for maximizing their therapeutic efficacy against solid tumors. PMID:27275358

  17. Dating and quantification of erosion processes based on exposed roots

    NASA Astrophysics Data System (ADS)

    Stoffel, Markus; Corona, Christophe; Ballesteros-Cánovas, Juan Antonio; Bodoque, José Maria

    2013-08-01

    Soil erosion is a key driver of land degradation and heavily affects sustainable land management in various environments worldwide. An appropriate quantification of rates of soil erosion and a localization of hotspots are therefore critical, as sediment loss has been demonstrated to have drastic consequences for soil productivity and fertility. A consistent body of evidence also exists for a causal linkage between global changes and the temporal frequency and magnitude of erosion, which thus calls for an improved understanding of the dynamics and rates of soil erosion for appropriate management of landscapes and for the planning of preventive measures or countermeasures. Conventional measurement techniques to infer erosion rates are limited in their temporal resolution or extent. Long-term erosion rates in larger basins have been analyzed with cosmogenic nuclides, but with lower spatial and limited temporal resolution, thus limiting the possibility of inferring micro-geomorphic and climatic controls on the timing, amount, and localization of erosion. If based on exposed tree roots, rates of erosion can be inferred with up to seasonal resolution, over decades to centuries of the past, and for larger surfaces with homogeneous hydrological response units. Root-based erosion rates thus constitute a valuable alternative to empirical or physically-based approaches, especially in ungauged basins, but will be controlled by individual or a few extreme events, so that average annual rates of erosion might be highly skewed. In this contribution, we review the contribution made by this biomarker to the understanding of erosion processes and related landform evolution. We report on recent progress in root-based erosion research, illustrate possibilities, caveats, and limitations of reconstructed rates, and conclude with a call for further research on various aspects of root-erosion research and for work in new geographic regions.

  18. Concurrent Quantification of Cellular and Extracellular Components of Biofilms

    PubMed Central

    Khajotia, Sharukh S.; Smart, Kristin H.; Pilula, Mpala; Thompson, David M.

    2013-01-01

    Confocal laser scanning microscopy (CLSM) is a powerful tool for the investigation of biofilms. Very few investigations have successfully quantified the concurrent distribution of more than two components within biofilms because: 1) selection of fluorescent dyes having minimal spectral overlap is complicated, and 2) quantification of multiple fluorochromes poses a multifactorial problem. Objectives: Report a methodology to quantify and compare concurrent 3-dimensional distributions of three cellular/extracellular components of biofilms grown on relevant substrates. Methods: The method consists of distinct, interconnected steps involving biofilm growth, staining, CLSM imaging, biofilm structural analysis and visualization, and statistical analysis of structural parameters. Biofilms of Streptococcus mutans (strain UA159) were grown for 48 hr on sterile specimens of Point 4 and TPH3 resin composites. Specimens were subsequently immersed for 60 sec in either Biotène PBF (BIO) or Listerine Total Care (LTO) mouthwashes, or water (control group; n=5/group). Biofilms were stained with fluorochromes for extracellular polymeric substances, proteins, and nucleic acids before imaging with CLSM. The biofilm structural parameters calculated using the ISA3D image analysis software were biovolume and mean biofilm thickness. Mixed-models statistical analyses compared structural parameters between mouthwash and control groups (SAS software; α=0.05). Volocity software permitted visualization of 3D distributions of the overlaid biofilm components (fluorochromes). Results: Mouthwash BIO produced biofilm structures that differed significantly from the control (p<0.05) on both resin composites, whereas LTO did not produce differences (p>0.05) on either product. Conclusions: This methodology efficiently and successfully quantified and compared concurrent 3D distributions of three major components within S. mutans biofilms on relevant substrates, thus overcoming two challenges to simultaneous assessment of

  1. Gas Dynamics during Thermal Remediation: Visualization, Quantification and Enhancement

    NASA Astrophysics Data System (ADS)

    Mumford, K. G.; Hegele, P. R.

    2014-12-01

    In situ thermal treatment (ISTT) technologies, such as electrical resistance heating (ERH) and thermal conductive heating (TCH), rely on the in situ production of a gas phase composed of steam and vaporized volatile organic compounds (VOCs). This gas phase must be captured, extracted, and processed in an aboveground treatment system to meet remediation objectives. When ISTT is used to treat volatile non-aqueous phase liquids (NAPLs), gases can be created at temperatures below the boiling points of both the groundwater and the NAPL, in a process commonly referred to as co-boiling, and vaporized VOCs can condense if gases are transported to colder regions or are not captured before thermal treatment has stopped. As such, an understanding of gas formation, connection, and flow is important for the design and operation of ISTT technologies. A recent series of laboratory experiments focused on the visualization and quantification of gas dynamics during water boiling and NAPL-water co-boiling, and on the investigation of potential NAPL redistribution. Experiments were conducted in a sand-packed, glass-walled chamber (40 cm tall × 20 cm wide × 1 cm thick) heated by electrical resistance. Temperatures and electric currents were measured, and digital images were captured throughout the experiments to quantify gas saturations using light transmission techniques. Additional experiments also investigated the exsolution of dissolved gas as a technique to enhance gas production at lower temperatures. Results showed the development of disconnected and connected gas flow regimes, with disconnected flow occurring at early times and during co-boiling. Results also showed the potential for NAPL redistribution due to displacement by gas formed within pools and due to condensation in colder regions. These results highlight the need to carefully consider gases in the design of ISTT heating and gas extraction systems to ensure remediation performance.

  2. Uncertainty quantification in nanomechanical measurements using the atomic force microscope

    NASA Astrophysics Data System (ADS)

    Wagner, Ryan; Moon, Robert; Pratt, Jon; Shaw, Gordon; Raman, Arvind

    2011-11-01

    Quantifying uncertainty in measured properties of nanomaterials is a prerequisite for the manufacture of reliable nanoengineered materials and products. Yet, rigorous uncertainty quantification (UQ) is rarely applied for material property measurements with the atomic force microscope (AFM), a widely used instrument that can measure properties at nanometer scale resolution of both inorganic and biological surfaces and nanomaterials. We present a framework to ascribe uncertainty to local nanomechanical properties of any nanoparticle or surface measured with the AFM by taking into account the main uncertainty sources inherent in such measurements. We demonstrate the framework by quantifying uncertainty in AFM-based measurements of the transverse elastic modulus of cellulose nanocrystals (CNCs), an abundant, plant-derived nanomaterial whose mechanical properties are comparable to Kevlar fibers. For a single, isolated CNC the transverse elastic modulus was found to have a mean of 8.1 GPa and a 95% confidence interval of 2.7-20 GPa. A key result is that multiple replicates of force-distance curves do not sample the important sources of uncertainty, which are systematic in nature. The dominant source of uncertainty is the nondimensional photodiode sensitivity calibration rather than the cantilever stiffness or Z-piezo calibrations. The results underscore the great need for, and open a path towards, quantifying and minimizing uncertainty in AFM-based material property measurements of nanoparticles, nanostructured surfaces, thin films, polymers and biomaterials. This work is a partial contribution of the USDA Forest Service and NIST, agencies of the US government, and is not subject to copyright.
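
    The paper's framework is not reproduced in the abstract; as a generic illustration, systematic calibration uncertainties can be propagated into a derived modulus by Monte Carlo sampling. The measurement model and all distributions below are hypothetical stand-ins.

```python
# Generic Monte Carlo propagation of AFM calibration uncertainties into a
# derived modulus. The multiplicative measurement model and all spreads
# are hypothetical stand-ins, not the paper's framework.
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

sens = rng.normal(1.0, 0.08, N)  # photodiode sensitivity factor (dominant)
k = rng.normal(1.0, 0.05, N)     # cantilever stiffness factor
z = rng.normal(1.0, 0.02, N)     # Z-piezo calibration factor

E_nominal = 8.1  # GPa, the mean transverse CNC modulus quoted above
E = E_nominal * sens * k * z  # toy model: modulus scales with calibrations

lo, hi = np.percentile(E, [2.5, 97.5])
print(f"E ~ {E.mean():.1f} GPa, 95% interval [{lo:.1f}, {hi:.1f}] GPa")
```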

  3. Streptavidin conjugation and quantification-a method evaluation for nanoparticles.

    PubMed

    Quevedo, Pablo Darío; Behnke, Thomas; Resch-Genger, Ute

    2016-06-01

    Aiming at the development of validated protocols for protein conjugation of nanomaterials and the determination of protein labeling densities, we systematically assessed the conjugation of the model protein streptavidin (SAv) to 100-, 500-, and 1000-nm-sized polystyrene and silica nanoparticles and dye-encoded polymer particles with two established conjugation chemistries, based upon achievable coupling efficiencies and labeling densities. The bioconjugation reactions compared included EDC/sulfo-NHS ester chemistry for direct binding of the SAv to carboxyl groups at the particle surface and maleimide-thiol chemistry in conjunction with heterobifunctional PEG linkers and aminated nanoparticles (NPs). Quantification of the total and functional amounts of SAv on these nanomaterials, and of unreacted SAv in solution, was performed with the BCA assay and biotin-FITC (BF) titration, which rely on different signal generation principles and are thus prone to different interferences. Our results revealed a clear influence of the conjugation chemistry on the amount of NP crosslinking, yet under optimized reaction conditions, EDC/sulfo-NHS ester chemistry and attachment via heterobifunctional PEG linkers led to comparably efficient SAv coupling and good labeling densities. Particle size can clearly affect protein labeling densities and particularly protein functionality, especially for larger particles. For unstained nanoparticles, direct bioconjugation seems to be the most efficient strategy, whereas for dye-encoded nanoparticles, PEG linkers are to be favored to prevent dye-protein interactions, which can affect protein functionality, specifically in the case of direct SAv binding. PMID:27038055

  4. Quantification of adipose tissue in a rodent model of obesity

    NASA Astrophysics Data System (ADS)

    Johnson, David H.; Flask, Chris; Wan, Dinah; Ernsberger, Paul; Wilson, David L.

    2006-03-01

    Obesity is a global epidemic and a comorbidity for many diseases. We are using MRI to characterize obesity in rodents, especially with regard to visceral fat. Rats were scanned on a 1.5T clinical scanner, and a T1W, water-spoiled image (fat only) was divided by a matched T1W image (fat + water) to yield a ratio image related to the lipid content in each voxel. The ratio eliminated coil sensitivity inhomogeneity and gave flat values across a fat pad, except for outlier voxels (> 1.0) due to motion. Following sacrifice, fat pads were dissected and their volumes measured by displacement in canola oil. In our study of 6 lean (SHR), 6 dietary obese (SHR-DO), and 9 genetically obese rats (SHROB), significant differences in visceral fat volume were observed, with an average increase of 29 ± 16 ml due to diet and 84 ± 44 ml due to genetics relative to lean controls with a volume of 11 ± 4 ml. Subcutaneous fat increased 14 ± 8 ml due to diet and 198 ± 105 ml due to genetics relative to the lean controls with 7 ± 3 ml. Visceral fat correlated strongly between MRI and dissection (R² = 0.94), but MRI detected over five times the subcutaneous fat found with error-prone dissection. Using a semi-automated image segmentation method on the ratio images, intra-subject variation was very low. Fat pad composition as estimated from ratio images consistently differentiated the strains, with SHROB having a greater lipid concentration in adipose tissues. Future work will include in vivo studies of diet versus genetics, identification of new phenotypes, and corrective measures for obesity; technical efforts will focus on correction for motion and automation in quantification.
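
    The ratio-image computation described above is straightforward to sketch: a fat-only (water-spoiled) volume divided voxelwise by the matched fat-plus-water volume, with motion outliers (ratio > 1.0) masked. The arrays below are synthetic stand-ins for registered images.

```python
# Sketch of the ratio image: a fat-only (water-spoiled) T1W image divided
# voxelwise by the matched T1W (fat + water) image, with motion outliers
# (ratio > 1.0) masked out. Arrays are synthetic stand-ins for registered
# image volumes.
import numpy as np

rng = np.random.default_rng(1)
fat_water = rng.uniform(50.0, 200.0, size=(64, 64))          # T1W
fat_only = fat_water * rng.uniform(0.0, 1.1, size=(64, 64))  # spoiled T1W

ratio = fat_only / fat_water                  # lipid fraction per voxel
ratio = np.where(ratio > 1.0, np.nan, ratio)  # flag motion outliers

print(f"mean lipid fraction over valid voxels: {np.nanmean(ratio):.2f}")
```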

  5. Sulfur-based absolute quantification of proteins using isotope dilution inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Lee, Hyun-Seok; Heun Kim, Sook; Jeong, Ji-Seon; Lee, Yong-Moon; Yim, Yong-Hyeon

    2015-10-01

    An element-based reductive approach provides an effective means of realizing International System of Units (SI) traceability for high-purity biological standards. Here, we develop, for the first time, an absolute protein quantification method using double isotope dilution (ID) inductively coupled plasma mass spectrometry (ICP-MS) combined with microwave-assisted acid digestion. We validated the method and applied it to certify the candidate protein certified reference material (CRM) of human growth hormone (hGH). The concentration of hGH was determined by analysing the total amount of sulfur in hGH. Next, size-exclusion chromatography was used with ICP-MS to characterize and quantify sulfur-containing impurities. By subtracting the contribution of sulfur-containing impurities from the total sulfur content in the hGH CRM, we obtained an SI-traceable certification value. The quantification result obtained with the present method based on sulfur analysis was in excellent agreement with the result determined via a well-established protein quantification method based on amino acid analysis using conventional acid hydrolysis combined with ID liquid chromatography-tandem mass spectrometry. The element-based protein quantification method developed here can be used generally for SI-traceable absolute quantification of proteins, especially pure-protein standards.
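
    For readers unfamiliar with isotope dilution, the core blend relation can be sketched as below. This is the generic single-IDMS equation with isotope-amount ratios; the paper uses a double-ID scheme (with reverse calibration of the spike), and the sulfur isotope choice and names here are illustrative assumptions.

    def idms_sample_amount(n_spike_ref, r_spike, r_sample, r_blend):
        # Isotope-amount ratios R = n(34S)/n(32S): the measured blend ratio
        # r_blend lies between the spike ratio r_spike and the natural
        # sample ratio r_sample. Returns the amount of reference isotope
        # (32S) contributed by the sample, given the amount n_spike_ref
        # of reference isotope added with the spike.
        return n_spike_ref * (r_spike - r_blend) / (r_blend - r_sample)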

  6. Quantification of cytomegalovirus DNA in peripheral blood leukocytes by a branched-DNA signal amplification assay.

    PubMed Central

    Chernoff, D N; Miner, R C; Hoo, B S; Shen, L P; Kelso, R J; Jekic-McMullen, D; Lalezari, J P; Chou, S; Drew, W L; Kolberg, J A

    1997-01-01

    Quantification of cytomegalovirus (CMV) DNA in blood may aid in the identification of patients at highest risk for developing CMV disease, the evaluation of new therapeutics, and the prompt recognition of drug-resistant CMV strains. A branched-DNA (bDNA) assay was developed for the reliable quantification of CMV DNA in peripheral blood leukocytes. The bDNA assay allowed for the highly specific and reproducible quantification of CMV DNA in clinical specimens. Furthermore, the bDNA assay was at least as sensitive as culture techniques and displayed a nearly 3 log10 dynamic range in quantification. Changes in CMV DNA levels measured by the bDNA assay in a human immunodeficiency virus-positive patient undergoing therapy were consistent with CMV culture, antigen, and genotype results and correlated with disease progression and resistance markers. The bDNA assay for the quantification of CMV DNA may provide a useful tool that can be used to aid physicians in monitoring disease progression, evaluating therapeutic regimens, and recognizing viral resistance and drug failure. PMID:9350724

  7. Localized 2D COSY sequences: Method and experimental evaluation for a whole metabolite quantification approach

    NASA Astrophysics Data System (ADS)

    Martel, Dimitri; Tse Ve Koon, K.; Le Fur, Yann; Ratiney, Hélène

    2015-11-01

    Two-dimensional spectroscopy offers the possibility of unambiguously distinguishing metabolites by spreading out the multiplet structure of J-coupled spin systems into a second dimension. Quantification methods that perform parametric fitting of the 2D MRS signal have recently been proposed for J-resolved PRESS (JPRESS) but not explicitly for Localized Correlation Spectroscopy (LCOSY). Here, through a whole metabolite quantification approach, correlation spectroscopy quantification performances are studied. The ability to quantify metabolite relaxation time constants is studied for three localized 2D MRS sequences (LCOSY, LCTCOSY and JPRESS) in vitro on preclinical MR systems. The issues encountered during implementation and the quantification strategies are discussed with the help of the Fisher matrix formalism. The described parameterized models enable the computation of the lower bound for error variance - generally known as the Cramér-Rao bounds (CRBs), a standard of precision - on the parameters estimated from these 2D MRS signal fittings. LCOSY has a theoretical net signal loss of a factor of two per unit of acquisition time compared to JPRESS. A rapid analysis might suggest that the relative CRBs of LCOSY compared to JPRESS (expressed as a percentage of the concentration values) should be doubled, but we show that this is not necessarily true. Finally, the LCOSY quantification procedure has been applied to data acquired in vivo on a mouse brain.
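
    The Cramér-Rao bound computation mentioned above follows directly from the Fisher information matrix; a generic sketch for an additive white Gaussian noise model (assuming the Jacobian of the fitted 2D model with respect to its parameters is available) could be:

    import numpy as np

    def cramer_rao_bounds(jacobian, sigma):
        # Fisher information for white Gaussian noise of standard
        # deviation sigma: F = J^T J / sigma^2. The CRB on parameter i is
        # sqrt((F^-1)_ii), the lowest standard deviation any unbiased
        # estimator of that parameter can achieve.
        fisher = jacobian.T @ jacobian / sigma**2
        return np.sqrt(np.diag(np.linalg.inv(fisher)))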

  8. RNAontheBENCH: computational and empirical resources for benchmarking RNAseq quantification and differential expression methods.

    PubMed

    Germain, Pierre-Luc; Vitriolo, Alessandro; Adamo, Antonio; Laise, Pasquale; Das, Vivek; Testa, Giuseppe

    2016-06-20

    RNA sequencing (RNAseq) has become the method of choice for transcriptome analysis, yet no consensus exists as to the most appropriate pipeline for its analysis, and current benchmarks suffer from important limitations. Here, we address these challenges through a rich benchmarking resource harnessing (i) two RNAseq datasets including ERCC ExFold spike-ins; (ii) Nanostring measurements of a panel of 150 genes on the same samples; (iii) a set of internal, genetically-determined controls; (iv) a reanalysis of the SEQC dataset; and (v) a focus on relative quantification (i.e., across samples). We use this resource to compare different approaches to each step of RNAseq analysis, from alignment to differential expression testing. We show that methods providing the best absolute quantification do not necessarily provide good relative quantification across samples, that count-based methods are superior for gene-level relative quantification, and that the new generation of pseudo-alignment-based software performs as well as established methods, at a fraction of the computing time. We also assess the impact of library type and size on quantification and differential expression analysis. Finally, we have created an R package and a web platform to enable the simple and streamlined application of this resource to the benchmarking of future methods. PMID:27190234

  9. Quantification of low levels of amorphous content in crystalline celecoxib using dynamic vapor sorption (DVS).

    PubMed

    Sheokand, Sneha; Modi, Sameer R; Bansal, Arvind K

    2016-05-01

    A minor amount of amorphous phase, especially present on the surface of crystalline pharmaceutical actives, can have a significant impact on their processing and performance. Despite the availability of sophisticated analytical tools, detection and quantification of low levels of amorphous content pose significant challenges owing to issues of sensitivity, suitability, limit of detection and limit of quantitation. The current study encompasses the quantification of amorphous content in the crystalline form of celecoxib (CLB) using a dynamic vapor sorption (DVS) based method. Water, used as the solvent probe, achieved equilibration within a very short period of time (i.e., 6 h) due to the hydrophobic nature of CLB, thus allowing development of a rapid quantification method. The study included optimization of instrument- and sample-related parameters for the development of an analytical method. The calibration curve for amorphous CLB in crystalline CLB was prepared in the concentration range of 0-10% w/w. The analytical method was validated for linearity, range, accuracy and precision. The method was found to be linear, with an R(2) value of 0.999, as well as rapid and sensitive for quantification of low levels of amorphous CLB content. It was able to detect the presence of amorphous phase in a predominantly crystalline phase at concentrations as low as 0.3% w/w. The limit of quantitation was found to be 0.9% w/w. Moreover, the influence of mechanical processing on the amorphous content of crystalline CLB was also investigated. PMID:26948976
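
    The linearity, LOD and LOQ figures reported above can in principle be reproduced from a calibration line; the sketch below uses the common ICH-style 3.3*sigma/slope and 10*sigma/slope definitions, and the calibration numbers are made-up placeholders, not data from the paper.

    import numpy as np

    # Placeholder calibration data: amorphous content (% w/w) vs. DVS response.
    x = np.array([0.0, 0.5, 1.0, 2.5, 5.0, 10.0])
    y = np.array([0.01, 0.30, 0.61, 1.48, 2.96, 5.91])

    slope, intercept = np.polyfit(x, y, 1)
    resid_sd = (y - (slope * x + intercept)).std(ddof=2)  # residual std dev

    lod = 3.3 * resid_sd / slope   # limit of detection, % w/w
    loq = 10.0 * resid_sd / slope  # limit of quantitation, % w/w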

  10. RNAontheBENCH: computational and empirical resources for benchmarking RNAseq quantification and differential expression methods

    PubMed Central

    Germain, Pierre-Luc; Vitriolo, Alessandro; Adamo, Antonio; Laise, Pasquale; Das, Vivek; Testa, Giuseppe

    2016-01-01

    RNA sequencing (RNAseq) has become the method of choice for transcriptome analysis, yet no consensus exists as to the most appropriate pipeline for its analysis, and current benchmarks suffer from important limitations. Here, we address these challenges through a rich benchmarking resource harnessing (i) two RNAseq datasets including ERCC ExFold spike-ins; (ii) Nanostring measurements of a panel of 150 genes on the same samples; (iii) a set of internal, genetically-determined controls; (iv) a reanalysis of the SEQC dataset; and (v) a focus on relative quantification (i.e., across samples). We use this resource to compare different approaches to each step of RNAseq analysis, from alignment to differential expression testing. We show that methods providing the best absolute quantification do not necessarily provide good relative quantification across samples, that count-based methods are superior for gene-level relative quantification, and that the new generation of pseudo-alignment-based software performs as well as established methods, at a fraction of the computing time. We also assess the impact of library type and size on quantification and differential expression analysis. Finally, we have created an R package and a web platform to enable the simple and streamlined application of this resource to the benchmarking of future methods. PMID:27190234

  11. Quantification of Microbial Activities in Near-Surface Soils

    NASA Astrophysics Data System (ADS)

    Schroth, M. H.; Nauer, P.; Zeyer, J.

    2007-12-01

    Microbial processes in near-surface soils play an important role in carbon and nutrient cycling, and specifically in the turnover of greenhouse gases such as CO2 and CH4. We modified a recently developed technique, the gas push-pull test (GPPT), to allow for the in-situ quantification of microbial activities in near-surface soils. A GPPT consists of the controlled injection of a gas mixture containing reactive gases (e.g., CH4, O2, CO2) and nonreactive tracer gases (e.g., Ar, Ne) into the soil, followed by the extraction of the gas mixture/soil-air blend from the same location. Rates of microbial activities are computed from the gases' breakthrough curves obtained during the GPPT's extraction phase. For a GPPT to be applied successfully, it is important that sufficient mass of the injected gases can be recovered during the test, even after prolonged incubation in soil. This may be difficult to achieve during GPPTs performed in near-surface soils, where gas loss to the atmosphere can be substantial. Our modification consisted of performing GPPTs within a steel cylinder (8.4-cm radius), which was previously driven into the soil to a depth of 50 cm. During the GPPTs, the cylinder was temporarily closed with a removable lid to minimize gas loss to the atmosphere. We performed a series of numerical simulations as well as laboratory experiments to test the usefulness of this modification. Numerical simulations confirmed that without use of the cylinder, typical near-surface GPPTs (e.g., injection/extraction depth 20 cm below the soil surface) are subject to extensive gas loss to the atmosphere (mass recovery < 20% for most gases), whereas mass recovery of injected gases increased dramatically when the cylinder was employed (mass recovery > 90% for most gases). Results from laboratory experiments confirmed this observation. We will also present results of a first field application, in which a near-surface GPPT was successfully conducted in a sandy soil to quantify in
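
    The mass-recovery figure that determines whether a GPPT is usable can be obtained by integrating the extraction-phase breakthrough curve; a minimal sketch (trapezoidal integration; the function and variable names are assumptions) is:

    import numpy as np

    def mass_recovery(t, conc, flow, injected_mass):
        # Mass extracted = integral of (gas concentration x extraction
        # flow rate) over the extraction phase; recovery is the fraction
        # of the injected mass (>90% with the steel cylinder, per above).
        return np.trapz(conc * flow, t) / injected_mass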

  12. Quantification of the proliferation of arbuscular mycorrhizal fungi in soil

    NASA Astrophysics Data System (ADS)

    Zhang, Ning; Lilje, Osu; McGee, Peter

    2013-04-01

    Micro-computer aided tomography provides three-dimensional images of hyphal ramification through electron-lucent materials and enables the visualization and quantification of hyphae. Starch and the mixture of starch plus K2HPO4 stimulated hyphal proliferation, while K2HPO4 alone did not change the density of hyphae. The images also indicate that fungal hyphae attached to the surfaces of the particles rather than growing through the spaces between them. The capacity to quantify hyphae in three-dimensional space allows a wide range of questions to now be addressed. Apart from studying mechanisms of carbon turnover, more complex processes may now be considered. Soil is commonly thought of as a black box. That black box is now a shade of grey.

  13. Quantification of Emphysema: A Bullae Distribution Based Approach

    NASA Astrophysics Data System (ADS)

    Tan, Kok Liang; Tanaka, Toshiyuki; Nakamura, Hidetoshi; Shirahata, Toru; Sugiura, Hiroaki

    Computed tomography (CT)-based quantifications of emphysema encompass, but are not limited to, the ratio of the low-attenuation area, the bullae size, and the distribution of bullae in the lung. The standard CT-based emphysema-describing indices include the mean lung density, the percentage of area of low attenuation [the pixel index (PI)] and the bullae index (BI). These standard emphysema-describing indices are not expressive for describing the distribution of bullae in the lung. Consequently, the goal of this paper is to present a new emphysema-describing index, the bullae congregation index (BCI), that describes whether bullae gather in a specific area of the lung and form a nearly single mass, and if so, how dense the mass of bullae is in the lung. BCI ranges from zero to ten, corresponding to sparsely through densely distributed bullae. BCI is calculated based on the relative distance between every pair of bullae in the lung. The bullae pair distances are sorted into 200 distance classes. A smaller distance class corresponds to a closer proximity between the bullae. BCI is derived by calculating the percentage of the area of bullae in the lung that are separated by a certain distance class. Four bullae congregation classes are defined based on BCI. We evaluate BCI using 114 CT images that were hand-annotated by a radiologist into four bullae congregation classes. The average four-class classification accuracy of BCI is 88.21%. BCI correlates better than PI, BI and other standard statistical-dispersion-based methods with the radiological consensus-classified bullae congregation class. While BCI is not a specific index for indicating emphysema severity, it complements the existing set of emphysema-describing indices to facilitate a more thorough knowledge of the emphysematous conditions in the lung. BCI is especially useful when it comes to comparing the distribution of bullae for cases with approximately the same PI, BI, or PI and BI. BCI is easy
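
    The abstract does not give the exact weighting used to map distance classes onto the 0-10 scale, so the following is only a loose sketch of the idea: bin pairwise bullae distances into 200 classes and score how much of the distribution sits in the closest classes. The area weighting of the published BCI is omitted, and the 'close' cutoff is an assumption.

    import numpy as np
    from scipy.spatial.distance import pdist

    def bullae_congregation_index(centroids, n_classes=200, close_classes=20):
        # Pairwise distances between bullae centroids, binned into
        # n_classes distance classes as described above.
        d = pdist(np.asarray(centroids, dtype=float))
        edges = np.linspace(0.0, d.max(), n_classes + 1)
        cls = np.digitize(d, edges[1:-1])      # class index 0..n_classes-1
        # Fraction of pairs in the closest classes, mapped onto 0..10.
        return 10.0 * np.mean(cls < close_classes)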

  14. Quantification of soil production and erosion using isotopic techniques

    NASA Astrophysics Data System (ADS)

    Dosseto, Anthony; Suresh, P. O.

    2010-05-01

    Soil is a critical resource, especially in the context of a rapidly growing world population. Thus, it is crucial to be able to quantify how soil resources evolve with time and how fast they become depleted. Over the past few years, the application of cosmogenic isotopes has made it possible to constrain rates of soil denudation. By assuming constant soil thickness, it is also possible to use these denudation rates to infer soil production rates (Heimsath et al. 1997). However, in this case it is not possible to discuss any imbalance between erosion and production, which is the core question when interested in soil resource sustainability. Recently, the measurement of uranium-series isotopes in soils has been used to quantify the residence time of soil material in the weathering profile and to infer soil production rates (Dequincey et al. 2002; Dosseto et al. 2008). Thus, the combination of U-series and cosmogenic isotopes can be used to discuss how soil resources evolve with time, and whether they are depleting, increasing or in steady state. Recent work has been undertaken in temperate southeastern Australia, where a several-meter-thick saprolite is developed over granodioritic bedrock and underlies a meter or less of soil (Dosseto et al., 2008), and in tropical Puerto Rico, also in a granitic catchment. Results show that in an environment where human activity is minimal, soil and saprolite are renewed as fast as they are destroyed through denudation. Further work is investigating these processes at other sites in southeastern Australia (Frogs Hollow; Heimsath et al. 2001) and Puerto Rico (Rio Mameyes catchment; andesitic bedrock). Results will be presented and a review of the quantification of the rates of soil evolution using isotopic techniques will be given. Dequincey, O., F. Chabaux, et al. (2002). Chemical mobilizations in laterites: Evidence from trace elements and 238U-234U-230Th disequilibria. Geochim. Cosmochim. Acta 66(7): 1197-1210. Dosseto, A., S. P

  15. Desert Stone Mantles: Quantification and Significance of Self-Organisation

    NASA Astrophysics Data System (ADS)

    Higgitt, David; Rosser, Nick

    2010-05-01

    Desert stone mantles exhibit sorting patterns which are evidence of self-organisation. Previous investigations of stone mantles developed on Late Tertiary and Quaternary basalts in arid northeastern Jordan revealed distinct variations in the nature of stone cover both downslope and between lithologies of different age. However, manual field measurements of clast size and shape did not preserve information about the spatial configuration of the stone surface. Improved digital image capture and analysis techniques, including the use of a kite-based platform for vertical photography of the surface, have permitted the nature of stone mantles to be examined and modelled in greater detail. Image analysis has been assisted by the strong contrast in colour between the basalt clasts and the underlying surface, enabling a binary classification of images from which data on the size, shape and position of clasts can be readily acquired. Quantification of self-organisation through a box-counting technique for measuring fractal dimension and a procedure using Thiessen polygons to determine 'locking structures' indicates a general increase in organisation of the stone mantle downslope. Recognition of emergent behaviour requires an explanation in terms of positive feedback between the controlling process and the influence of surface form. A series of rainfall simulation and infiltration experiments have been undertaken on plots to assess the variation in surface hydrology as a response to variations in ground surface and slope profile form. The relative contribution of runoff events of varying size, and the degree to which the ground surface configuration accelerates or restricts modification of the surface, influence the overall evolution of slope profiles via the erosion, transfer and deposition of both surface clasts and the underlying fine-grained sediments. Critical to this modification is the interplay between the surface configuration, rainfall and runoff. The experiments presented
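
    The box-counting estimate of fractal dimension mentioned above has a compact standard form; a sketch for a binary clast mask (box sizes and the plain least-squares fit are conventional choices, not the authors' exact settings) is:

    import numpy as np

    def box_counting_dimension(mask, sizes=(2, 4, 8, 16, 32, 64)):
        # mask: binary image, True where a basalt clast is present. For
        # each box size s, count boxes containing any clast pixel, then
        # fit log N(s) against log(1/s); the slope estimates the fractal
        # dimension of the clast pattern.
        counts = []
        for s in sizes:
            h = (mask.shape[0] // s) * s
            w = (mask.shape[1] // s) * s
            blocks = mask[:h, :w].reshape(h // s, s, w // s, s)
            counts.append(blocks.any(axis=(1, 3)).sum())
        slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
        return slope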

  16. In vivo quantification of hyperoxic arterial blood water T1.

    PubMed

    Siero, Jeroen C W; Strother, Megan K; Faraco, Carlos C; Hoogduin, Hans; Hendrikse, Jeroen; Donahue, Manus J

    2015-11-01

    Normocapnic hyperoxic and hypercapnic hyperoxic gas challenges are increasingly being used in cerebrovascular reactivity (CVR) and calibrated functional MRI experiments. The longitudinal arterial blood water relaxation time (T1a) change with hyperoxia will influence signal quantification through mechanisms relating to elevated partial pressure of plasma-dissolved O2 (pO2) and increased oxygen bound to hemoglobin in arteries (Ya) and veins (Yv). The dependence of T1a on Ya and Yv has been elegantly characterized ex vivo; however, the combined influence of pO2, Ya and Yv on T1a in vivo under normal ventilation has not been reported. Here, T1a is calculated during hyperoxia in vivo by a heuristic approach that evaluates T1-dependent arterial spin labeling (ASL) signal changes in response to varying gas stimuli. Healthy volunteers (n = 14; age, 31.5 ± 7.2 years) were scanned using pseudo-continuous ASL in combination with room air (RA; 21% O2/79% N2), hypercapnic normoxic (HN; 5% CO2/21% O2/74% N2) and hypercapnic hyperoxic (HH; 5% CO2/95% O2) gas administration. HH T1a was calculated by requiring that the HN and HH cerebral blood flow (CBF) changes be identical. The HH protocol was then repeated in patients (n = 10; age, 61.4 ± 13.3 years) with intracranial stenosis to assess whether an HH T1a decrease prohibited ASL from being performed in subjects with known delayed blood arrival times. Arterial blood T1a decreased from 1.65 s at baseline to 1.49 ± 0.07 s during HH. In patients, CBF values in the affected flow territory for the HH condition were increased relative to baseline CBF values and were within the physiological range (RA CBF = 36.6 ± 8.2 mL/100 g/min; HH CBF = 45.2 ± 13.9 mL/100 g/min). It can be concluded that the hyperoxic (95% O2) 3-T arterial blood T1aHH = 1.49 ± 0.07 s, relative to a normoxic T1a of 1.65 s. PMID:26419505

  17. Quantification of Uncertainties in Projections of Hydro-meteorological Extremes

    NASA Astrophysics Data System (ADS)

    Meresa, Hadush; Romanowicz, Renata; Lawrence, Deborah

    2016-04-01

    The impact of climate change on hydrological extremes has been widely studied, particularly after the publication of the IPCC AR4 report in 2007. The methodology applied to derive hydrological extremes under climate change adopted by most scientists consists of running a cascade of models, starting from assumed emission scenarios applied to a global circulation model (GCM) and ending at hydrological model simulations. Therefore, the projected hydro-meteorological extremes are highly uncertain due to uncertainties inherent in all the links of the modelling chain. In addition, due to the complexity of hydrologic models that use a large number of parameters to characterize hydrologic processes, many challenges arise with respect to the quantification of uncertainty. This issue needs to be properly quantified to understand possible confidence ranges in extremes in the future. This paper aims to quantify the uncertainty in the hydrological projection of future extremes in streamflow and precipitation indices in mountainous and lowland catchments in Poland, using a multi-model approach based on climate projections obtained from the ENSEMBLES and EURO-CORDEX projects, multiple realizations of catchment-scale downscaled rainfall, two hydrological models (HBV and GR4J) and a number of hydrological model parameters. The time-span of the projections covers the 21st century. The potential sources of hydrological projection uncertainties are quantified through a Monte Carlo based simulation approach. We compare weights based on different goodness-of-fit criteria in their ability to constrain the uncertainty of the extremes. The results of the comparison show a considerable dependence of uncertainty ranges on the type of extremes (low or high flows) and on the criterion used. The predicted distribution of future streamflows considering all sources of uncertainty (climate model, bias correction and hydrological model) is used to derive marginal distributions of uncertainty related to

  18. Mandibular asymmetry: a three-dimensional quantification of bilateral condyles

    PubMed Central

    2013-01-01

    Introduction The shape and volume of the condyle are considered to play an important role in the pathogenesis of mandibular deviation. Curvature analysis is informative for objectively assessing whether the shape of the condyles matches that of the glenoid fossa. In this study, a three-dimensional (3-D) quantification of bilateral asymmetrical condyles was first conducted to identify the specific role of 3-D condylar configuration in mandibular asymmetry. Methods 55 adult patients, 26 males (26 ± 5 yrs) and 29 females (26 ± 5 yrs), diagnosed with mandibular asymmetry were included. Examination of the deviation of the chin point, deviation of the dental midlines, inclination of the occlusal plane, and depth of the mandibular occlusal plane was conducted. After the clinical investigation, computed tomography images from the patients were used to reconstruct 3-D mandibular models. Then the condylar volume, surface size, surface curvature and bone mineral density were evaluated independently for each patient on the non-deviated and deviated sides of the temporomandibular joint. Results Both the condylar surface size and volume were significantly larger on the deviated side (surface size: 1666.14 ± 318.3 mm2, volume: 1981.5 ± 418.3 mm3). The anterior slope of the condyle was flatter (0.12 ± 0.06) and the posterior slope (0.39 ± 0.08) was prominently convex on the deviated side. The corresponding bone mineral density values were 523.01 ± 118.1 HU and 549.07 ± 120.6 HU on the anterior and posterior slopes. Conclusions The incongruence presented on the deviated side resulted in a reduction in contact areas and, thus, an increase in contact stresses and changes of bone density. All the aforementioned results suggest that the difference existing between deviated and non-deviated condyles correlates with asymmetrical facial development. In mandibular asymmetry patients, the 3-D morphology of the condyle on the deviated side differs from that on the non-deviated side, which

  19. Uncertainty quantification of bacterial aerosol neutralization in shock heated gases

    NASA Astrophysics Data System (ADS)

    Schulz, J. C.; Gottiparthi, K. C.; Menon, S.

    2015-01-01

    A potential method for the neutralization of bacterial endospores is the use of explosive charges, since the high thermal and mechanical stresses in the post-detonation flow are thought to be sufficient to reduce endospore survivability to levels that pose no significant health threat. While several experiments have attempted to quantify endospore survivability by emulating such environments in shock tube configurations, numerical simulations are necessary to provide information in scenarios where experimental data are difficult to obtain. Since such numerical predictions require complex, multi-physics models, significant uncertainties could be present. This work investigates the uncertainty in determining endospore survivability from using a reduced-order model based on a critical endospore temperature. Understanding the uncertainty in such a model is necessary for quantifying the variability in predictions using large-scale, realistic simulations of bacterial endospore neutralization by explosive charges. This work extends the analysis of previous large-scale simulations of endospore neutralization [Gottiparthi et al., Shock Waves, 2014, doi:10.1007/s00193-014-0504-9] by focusing on the uncertainty quantification of predicting endospore neutralization. For a given initial mass distribution of the bacterial endospore aerosol, predictions of the intact endospore percentage using nominal values of the input parameters match the experimental data well. The uncertainty in these predictions is then investigated using the Dempster-Shafer theory of evidence and polynomial chaos expansion. The studies show that endospore survivability is governed largely by the endospores' mass distribution and their exposure or residence time at the elevated temperatures and pressures. Deviations from the nominal predictions can be as much as 20-30 % in the intermediate temperature ranges. At high temperatures, i.e., strong shocks, which are of the most interest, the

  20. Robust Radiomics Feature Quantification Using Semiautomatic Volumetric Segmentation

    PubMed Central

    Leijenaar, Ralph; Jermoumi, Mohammed; Carvalho, Sara; Mak, Raymond H.; Mitra, Sushmita; Shankar, B. Uma; Kikinis, Ron; Haibe-Kains, Benjamin; Lambin, Philippe; Aerts, Hugo J. W. L.

    2014-01-01

    Due to advances in the acquisition and analysis of medical imaging, it is currently possible to quantify the tumor phenotype. The emerging field of Radiomics addresses this issue by converting medical images into minable data through the extraction of a large number of quantitative imaging features. One of the main challenges of Radiomics is tumor segmentation. Whereas manual delineation is time-consuming and prone to inter-observer variability, semi-automated approaches have been shown to be fast and to reduce inter-observer variability. In this study, a semiautomatic region-growing volumetric segmentation algorithm, implemented in the free and publicly available 3D-Slicer platform, was investigated in terms of its robustness for quantitative imaging feature extraction. Fifty-six 3D-radiomic features, quantifying phenotypic differences based on tumor intensity, shape and texture, were extracted from the computed tomography images of twenty lung cancer patients. These radiomic features were derived from the 3D-tumor volumes defined twice by three independent observers using 3D-Slicer, and compared to manual slice-by-slice delineations of five independent physicians in terms of intra-class correlation coefficient (ICC) and feature range. Radiomic features extracted from 3D-Slicer segmentations had significantly higher reproducibility (ICC = 0.85±0.15, p = 0.0009) compared to the features extracted from the manual segmentations (ICC = 0.77±0.17). Furthermore, we found that features extracted from 3D-Slicer segmentations were more robust, as the range was significantly smaller across observers (p = 3.819e-07), and overlapping with the feature ranges extracted from manual contouring (boundary lower: p = 0.007, higher: p = 5.863e-06). Our results show that 3D-Slicer segmented tumor volumes provide a better alternative to manual delineation for feature quantification, as they yield more reproducible imaging descriptors. Therefore, 3D-Slicer can be
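
    The abstract does not state which ICC variant was used, so as a rough illustration only, a one-way random-effects ICC(1,1) for one radiomic feature rated by several observers can be computed as follows:

    import numpy as np

    def icc_oneway(ratings):
        # ratings: (n_tumors, n_observers) matrix of one radiomic feature.
        ratings = np.asarray(ratings, dtype=float)
        n, k = ratings.shape
        subject_means = ratings.mean(axis=1)
        ms_between = k * np.sum((subject_means - ratings.mean()) ** 2) / (n - 1)
        ms_within = np.sum((ratings - subject_means[:, None]) ** 2) / (n * (k - 1))
        return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)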

  1. Robust Radiomics feature quantification using semiautomatic volumetric segmentation.

    PubMed

    Parmar, Chintan; Rios Velazquez, Emmanuel; Leijenaar, Ralph; Jermoumi, Mohammed; Carvalho, Sara; Mak, Raymond H; Mitra, Sushmita; Shankar, B Uma; Kikinis, Ron; Haibe-Kains, Benjamin; Lambin, Philippe; Aerts, Hugo J W L

    2014-01-01

    Due to advances in the acquisition and analysis of medical imaging, it is currently possible to quantify the tumor phenotype. The emerging field of Radiomics addresses this issue by converting medical images into minable data through the extraction of a large number of quantitative imaging features. One of the main challenges of Radiomics is tumor segmentation. Whereas manual delineation is time-consuming and prone to inter-observer variability, semi-automated approaches have been shown to be fast and to reduce inter-observer variability. In this study, a semiautomatic region-growing volumetric segmentation algorithm, implemented in the free and publicly available 3D-Slicer platform, was investigated in terms of its robustness for quantitative imaging feature extraction. Fifty-six 3D-radiomic features, quantifying phenotypic differences based on tumor intensity, shape and texture, were extracted from the computed tomography images of twenty lung cancer patients. These radiomic features were derived from the 3D-tumor volumes defined twice by three independent observers using 3D-Slicer, and compared to manual slice-by-slice delineations of five independent physicians in terms of intra-class correlation coefficient (ICC) and feature range. Radiomic features extracted from 3D-Slicer segmentations had significantly higher reproducibility (ICC = 0.85±0.15, p = 0.0009) compared to the features extracted from the manual segmentations (ICC = 0.77±0.17). Furthermore, we found that features extracted from 3D-Slicer segmentations were more robust, as the range was significantly smaller across observers (p = 3.819e-07), and overlapping with the feature ranges extracted from manual contouring (boundary lower: p = 0.007, higher: p = 5.863e-06). Our results show that 3D-Slicer segmented tumor volumes provide a better alternative to manual delineation for feature quantification, as they yield more reproducible imaging descriptors. Therefore, 3D-Slicer can be

  2. Quantification of brain lipids by FTIR spectroscopy and partial least squares regression

    NASA Astrophysics Data System (ADS)

    Dreissig, Isabell; Machill, Susanne; Salzer, Reiner; Krafft, Christoph

    2009-01-01

    Brain tissue is characterized by a high lipid content. This content decreases, and the lipid composition changes, during transformation from normal brain tissue to tumors. Therefore, the analysis of brain lipids might complement the existing diagnostic tools to determine the tumor type and tumor grade. The objective of this work is to extract lipids from gray matter and white matter of porcine brain tissue, record infrared (IR) spectra of these extracts and develop a quantification model for the main lipids based on partial least squares (PLS) regression. IR spectra of the pure lipids cholesterol, cholesterol ester, phosphatidic acid, phosphatidylcholine, phosphatidylethanolamine, phosphatidylserine, phosphatidylinositol, sphingomyelin, galactocerebroside and sulfatide were used as references. Two lipid mixtures were prepared for training and validation of the quantification model. The composition of lipid extracts predicted by PLS regression of the IR spectra was compared with lipid quantification by thin layer chromatography.
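
    A PLS calibration of the kind described here can be prototyped in a few lines; the sketch below uses scikit-learn with random placeholder matrices standing in for the measured spectra and reference compositions, and the latent-variable count is an assumption.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)
    X = rng.random((20, 600))   # placeholder: 20 mixture spectra x 600 wavenumbers
    Y = rng.random((20, 10))    # placeholder: fractions of the 10 reference lipids

    pls = PLSRegression(n_components=8)   # component count is an assumption
    pls.fit(X, Y)
    composition = pls.predict(X[:1])      # predicted lipid composition of an extract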

  3. Quantification of nerolidol in mouse plasma using gas chromatography–mass spectrometry

    PubMed Central

    Saito, Alexandre Yukio; Sussmann, Rodrigo Antonio Ceschini; Kimura, Emilia Akemi; Cassera, Maria Belen; Katzin, Alejandro Miguel

    2015-01-01

    Nerolidol is a naturally occurring sesquiterpene found in the essential oils of many types of flowers and plants. It is frequently used in cosmetics, as a food flavoring agent, and in cleaning products. In addition, nerolidol is used as a skin penetration enhancer for transdermal delivery of therapeutic drugs. However, nerolidol is hemolytic at low concentrations. A simple and fast GC–MS method was developed for preliminary quantification and assessment of biological interferences of nerolidol in mouse plasma after oral dosing. Calibration curves were linear in the concentration range of 0.010–5 μg/mL nerolidol in mouse plasma with correlation coefficients (r) greater than 0.99. Limits of detection and quantification were 0.0017 and 0.0035 μg/mL, respectively. The optimized method was successfully applied to the quantification of nerolidol in mouse plasma. PMID:25880240

  4. Application of synchrotron radiation computed microtomography for quantification of bone microstructure in human and rat bones

    NASA Astrophysics Data System (ADS)

    Nogueira, Liebert Parreiras; Barroso, Regina Cély; de Almeida, André Pereira; Braz, Delson; de Almeida, Carlos Eduardo; de Andrade, Cherley Borba; Tromba, Giuliana

    2012-05-01

    This work aims to evaluate histomorphometric quantification by synchrotron radiation computed microtomography in bones of human and rat specimens. Bone specimens are classified as normal or pathological (for human samples) and irradiated or non-irradiated (for rat samples). Human bone specimens either were or were not affected by some injury; rat bone specimens either were or were not irradiated, simulating radiotherapy procedures. Images were obtained on the SYRMEP beamline at the Elettra Synchrotron Laboratory in Trieste, Italy. The system generated 14 μm tomographic images. The quantification of bone structures was performed directly on the 3D rendered images using home-made software. The resolution achieved was excellent, which facilitated quantification of bone microstructures.

  5. Optimization of diclofenac quantification from wastewater treatment plant sludge by ultrasonication assisted extraction.

    PubMed

    Topuz, Emel; Sari, Sevgi; Ozdemir, Gamze; Aydin, Egemen; Pehlivanoglu-Mantas, Elif; Okutman Tas, Didem

    2014-05-01

    A rapid method for the quantification of diclofenac in sludge samples through ultrasonication-assisted extraction and solid phase extraction (SPE) was developed and used to quantify diclofenac concentrations in sludge samples by liquid chromatography/tandem mass spectrometry (LC-MS/MS). Although the concentration of diclofenac in sludge samples taken from different units of wastewater treatment plants in Istanbul was below the limit of quantification (LOQ; 5 ng/g), an optimized method for sludge samples, along with total mass balances in a wastewater treatment plant, can be used to determine the phase with which diclofenac is mostly associated. Hence, the results will provide information on the fate and transport of diclofenac, as well as on the necessity of alternative removal processes. In addition, since the optimization procedure is provided in detail, other researchers can use this procedure as a starting point for the determination of other emerging pollutants in wastewater sludge samples. PMID:24704687

  6. Mass spectrometry based proteomics for absolute quantification of proteins from tumor cells

    PubMed Central

    Wang, Hong; Hanash, Sam

    2015-01-01

    In-depth quantitative profiling of the proteome and sub-proteomes of tumor cells has relevance to tumor classification, the development of novel therapeutics and of prognostic and predictive markers, and disease monitoring. In particular, the tumor cell surface represents a highly relevant compartment for the development of targeted therapeutics and immunotherapy. We have developed a proteomic platform to profile tumor cells that encompasses enrichment of surface membrane proteins, intact protein fractionation and label-free mass spectrometry based absolute quantification. Here we describe the methodology for capture, identification and quantification of cell surface proteins using biotinylation for labeling of the cell surface, avidin for capture of biotinylated proteins and ion mobility mass spectrometry for protein identification and quantification. PMID:25794949

  7. Quantification and normalization of noise variance with sparsity regularization to enhance diffuse optical tomography

    PubMed Central

    Yao, Jixing; Tian, Fenghua; Rakvongthai, Yothin; Oraintara, Soontorn; Liu, Hanli

    2015-01-01

    Conventional reconstruction of diffuse optical tomography (DOT) is based on the Tikhonov regularization and the white Gaussian noise assumption. Consequently, the reconstructed DOT images usually have a low spatial resolution. In this work, we have derived a novel quantification method for noise variance based on the linear Rytov approximation of the photon diffusion equation. Specifically, we have implemented this quantification of noise variance to normalize the measurement signals from all source-detector channels along with sparsity regularization to provide high-quality DOT images. Multiple experiments from computer simulations and laboratory phantoms were performed to validate and support the newly developed algorithm. The reconstructed images demonstrate that quantification and normalization of noise variance with sparsity regularization (QNNVSR) is an effective reconstruction approach to greatly enhance the spatial resolution and the shape fidelity for DOT images. Since noise variance can be estimated by our derived expression with relatively limited resources available, this approach is practically useful for many DOT applications. PMID:26309760
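
    The central idea, scaling each measurement channel by its estimated noise level before solving a sparsity-regularized inverse problem, can be sketched generically as below; this is a stand-in using an off-the-shelf L1 solver, not a reproduction of the authors' QNNVSR algorithm.

    import numpy as np
    from sklearn.linear_model import Lasso

    def noise_normalized_sparse_recon(A, y, noise_var, alpha=1e-3):
        # Whiten each source-detector channel by its estimated noise
        # standard deviation, then solve an L1-regularized least-squares
        # problem for the optical perturbation image.
        w = 1.0 / np.sqrt(noise_var)
        model = Lasso(alpha=alpha, fit_intercept=False, max_iter=10000)
        model.fit(A * w[:, None], y * w)
        return model.coef_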

  8. Application of synchrotron radiation computed microtomography for quantification of bone microstructure in human and rat bones

    SciTech Connect

    Parreiras Nogueira, Liebert; Barroso, Regina Cely; Pereira de Almeida, Andre; Braz, Delson; Almeida, Carlos Eduardo de; Borba de Andrade, Cherley; Tromba, Giuliana

    2012-05-17

    This work aims to evaluate histomorphometric quantification by synchrotron radiation computed microtomography in bones of human and rat specimens. Bone specimens are classified as normal or pathological (for human samples) and irradiated or non-irradiated (for rat samples). Human bone specimens either were or were not affected by some injury; rat bone specimens either were or were not irradiated, simulating radiotherapy procedures. Images were obtained on the SYRMEP beamline at the Elettra Synchrotron Laboratory in Trieste, Italy. The system generated 14 μm tomographic images. The quantification of bone structures was performed directly on the 3D rendered images using home-made software. The resolution achieved was excellent, which facilitated quantification of bone microstructures.

  9. Antioxidant Activity and Validation of Quantification Method for Lycopene Extracted from Tomato.

    PubMed

    Cefali, Letícia Caramori; Cazedey, Edith Cristina Laignier; Souza-Moreira, Tatiana Maria; Correa, Marcos Antônio; Salgado, Hérida Regina Nunes; Isaac, Vera Lucia Borges

    2015-01-01

    Lycopene is a carotenoid found in tomatoes with potent antioxidant activity. The aim of the study was to obtain an extract containing lycopene from four types of tomatoes, validate a quantification method for the extracts by HPLC, and assess their antioxidant activity. Results revealed that the tomatoes analyzed contained lycopene and exhibited antioxidant activity. Salad tomato presented the highest concentration of this carotenoid and the highest antioxidant activity. The quantification method exhibited linearity with a correlation coefficient of 0.9992. Tests for the assessment of precision, accuracy, and robustness achieved coefficients of variation of less than 5%. The LOD and LOQ were 0.0012 and 0.0039 μg/mL, respectively. Salad tomato can be used as a source of lycopene for the development of topical formulations, and based on the tests performed, the chosen method for the identification and quantification of lycopene was considered to be linear, precise, exact, selective, and robust. PMID:26525253

  10. An Approach for Assessing RNA-seq Quantification Algorithms in Replication Studies

    PubMed Central

    Wu, Po-Yen; Phan, John H.; Wang, May D.

    2016-01-01

    One way to gain a more comprehensive picture of the complex function of a cell is to study the transcriptome. A promising technology for studying the transcriptome is RNA sequencing, one application of which is to quantify elements in the transcriptome and to link quantitative observations to biology. Although numerous quantification algorithms are publicly available, no method of systematically assessing these algorithms has been developed. To meet the need for such an assessment, we present an approach that includes (1) simulated and real datasets, (2) three alignment strategies, and (3) six quantification algorithms. Examining the normalized root-mean-square error, the percentage error of the coefficient of variation, and the distribution of the coefficient of variation, we found that quantification algorithms taking as input sequence alignments reported in transcriptomic coordinates usually performed better in terms of the multiple metrics proposed in this study.
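
    The two headline metrics are simple to state; the sketch below gives plausible formulations (the paper's exact normalizations may differ).

    import numpy as np

    def nrmse(est, truth):
        # Root-mean-square error normalized by the range of the reference.
        return np.sqrt(np.mean((est - truth) ** 2)) / (truth.max() - truth.min())

    def cv_percent_error(est_reps, ref_reps):
        # Percentage error between coefficients of variation computed over
        # technical replicates (rows) for each transcript (columns).
        cv_est = est_reps.std(axis=0) / est_reps.mean(axis=0)
        cv_ref = ref_reps.std(axis=0) / ref_reps.mean(axis=0)
        return 100.0 * np.abs(cv_est - cv_ref) / cv_ref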

  11. A Constrained Genetic Algorithm with Adaptively Defined Fitness Function in MRS Quantification

    NASA Astrophysics Data System (ADS)

    Papakostas, G. A.; Karras, D. A.; Mertzios, B. G.; Graveron-Demilly, D.; van Ormondt, D.

    MRS signal quantification is a rather involved procedure and has attracted the interest of the medical engineering community regarding the development of computationally efficient methodologies. Significant contributions based on Computational Intelligence tools, such as Neural Networks (NNs), demonstrated good performance but not without drawbacks already discussed by the authors. On the other hand, preliminary application of Genetic Algorithms (GAs) has already been reported in the literature by the authors regarding the peak detection problem encountered in MRS quantification using the Voigt line shape model. This paper investigates a novel constrained genetic algorithm involving a generic and adaptively defined fitness function, which extends the simple genetic algorithm methodology to the case of noisy signals. The applicability of this new algorithm is scrutinized through experimentation on artificial MRS signals interleaved with noise, regarding its signal fitting capabilities. Although extensive experiments with real-world MRS signals are necessary, the performance shown herein illustrates the method's potential to be established as a generic MRS metabolite quantification procedure.
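
    As a structural illustration only, a real-coded genetic algorithm with box constraints can be skeletonized as below; the paper's adaptively defined fitness is abstracted into the fitness callable, and the selection, crossover and mutation choices here are generic assumptions.

    import numpy as np

    def constrained_ga(fitness, bounds, pop_size=50, n_gen=200, seed=0):
        # bounds: list of (low, high) pairs, one per model parameter.
        rng = np.random.default_rng(seed)
        lo, hi = np.asarray(bounds, dtype=float).T
        pop = rng.uniform(lo, hi, size=(pop_size, lo.size))
        for _ in range(n_gen):
            scores = np.array([fitness(ind) for ind in pop])
            parents = pop[np.argsort(scores)[: pop_size // 2]]  # keep best half
            i, j = rng.integers(len(parents), size=(2, pop_size))
            mix = rng.random((pop_size, 1))
            children = mix * parents[i] + (1.0 - mix) * parents[j]  # crossover
            children += rng.normal(0.0, 0.01 * (hi - lo), children.shape)  # mutation
            pop = np.clip(children, lo, hi)  # enforce box constraints
        scores = np.array([fitness(ind) for ind in pop])
        return pop[np.argmin(scores)]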

  12. Quantification of nerolidol in mouse plasma using gas chromatography-mass spectrometry.

    PubMed

    Saito, Alexandre Yukio; Sussmann, Rodrigo Antonio Ceschini; Kimura, Emilia Akemi; Cassera, Maria Belen; Katzin, Alejandro Miguel

    2015-01-01

    Nerolidol is a naturally occurring sesquiterpene found in the essential oils of many types of flowers and plants. It is frequently used in cosmetics, as a food flavoring agent, and in cleaning products. In addition, nerolidol is used as a skin penetration enhancer for transdermal delivery of therapeutic drugs. However, nerolidol is hemolytic at low concentrations. A simple and fast GC-MS method was developed for preliminary quantification and assessment of biological interferences of nerolidol in mouse plasma after oral dosing. Calibration curves were linear in the concentration range of 0.010-5 μg/mL nerolidol in mouse plasma with correlation coefficients (r) greater than 0.99. Limits of detection and quantification were 0.0017 and 0.0035 μg/mL, respectively. The optimized method was successfully applied to the quantification of nerolidol in mouse plasma. PMID:25880240

  13. Uncertainty quantification in the presence of limited climate model data with discontinuities.

    SciTech Connect

    Safta, Cosmin; Debusschere, Bert J.; Najm, Habib N.; Sargsyan, Khachik

    2009-12-01

    Uncertainty quantification in climate models is challenged by the sparsity of the available climate data due to the high computational cost of the model runs. Another feature that prevents classical uncertainty analyses from being easily applicable is the bifurcative behavior in the climate data with respect to certain parameters. A typical example is the Meridional Overturning Circulation in the Atlantic Ocean. The maximum overturning stream function exhibits discontinuity across a curve in the space of two uncertain parameters, namely climate sensitivity and CO2 forcing. We develop a methodology that performs uncertainty quantification in this context in the presence of limited data.

  14. In vivo quantification of cochlin in glaucomatous DBA/2J mice using optical coherence tomography

    PubMed Central

    Wang, Jianhua; Aljohani, Ayman; Carreon, Teresia; Gregori, Giovanni; Bhattacharya, Sanjoy K.

    2015-01-01

    The expression of cochlin in the trabecular meshwork (TM) precedes the clinical glaucoma symptoms in DBA/2J mice. The ability to quantify cochlin in the local tissue (TM) offers potential diagnostic and prognostic value. We present two (spectroscopic and magnetomotive) optical coherence tomography (OCT) approaches for periodic in vivo cochlin quantification. The cochlin-antibody OCT signal remains stable from 3.5 hours up to 24 hours after injection, allowing for repeated quantification in living mouse eyes. PMID:26047051

  15. Challenges for the in vivo quantification of brain neuropeptides using microdialysis sampling and LC-MS.

    PubMed

    Van Wanseele, Yannick; De Prins, An; De Bundel, Dimitri; Smolders, Ilse; Van Eeckhaut, Ann

    2016-09-01

    In recent years, neuropeptides and their receptors have received increased interest in neuropharmacological research. Although these molecules are considered relatively small compared with proteins, their in vivo quantification using microdialysis is more challenging than for small molecules. Low microdialysis recoveries, aspecific adsorption and the presence of various multiply charged precursor ions during ESI-MS/MS detection hamper the in vivo quantification of these low-abundance biomolecules. Every step in the workflow, from sampling to analysis, has to be optimized to enable the sensitive analysis of these compounds in microdialysates. PMID:27554986

  16. Quantification of hydrogen peroxide during the low-temperature oxidation of alkanes

    PubMed Central

    Bahrini, Chiheb; Herbinet, Olivier; Glaude, Pierre-Alexandre; Schoemaecker, Coralie; Fittschen, Christa; Battin-Leclerc, Frédérique

    2013-01-01

    The first reliable quantification of hydrogen peroxide (H2O2) formed during the low-temperature oxidation of an organic compound has been achieved thanks to a new system that couples a jet-stirred reactor to detection by continuous-wave cavity ring-down spectroscopy (cw-CRDS) in the near infrared. The quantification of this key compound of the hydrocarbon low-temperature oxidation regime has been obtained under conditions close to those actually observed before autoignition. The studied hydrocarbon was n-butane, the smallest alkane whose oxidation behaviour is close to that of the species present in gasoline and diesel fuels. PMID:22746212

  17. RNA-Skim: a rapid method for RNA-Seq quantification at transcript level

    PubMed Central

    Zhang, Zhaojun; Wang, Wei

    2014-01-01

    Motivation: The RNA-Seq technique has been demonstrated as a revolutionary means for exploring the transcriptome because it provides deep coverage and base pair-level resolution. RNA-Seq quantification has proven to be an efficient alternative to the Microarray technique in gene expression studies, and it is a critical component in RNA-Seq differential expression analysis. Most existing RNA-Seq quantification tools require the alignment of fragments to either a genome or a transcriptome, entailing a time-consuming and intricate alignment step. To improve the performance of RNA-Seq quantification, an alignment-free method, Sailfish, has recently been proposed to quantify transcript abundances using all k-mers in the transcriptome, demonstrating the feasibility of designing an efficient alignment-free method for transcriptome quantification. Even though Sailfish is substantially faster than alternative alignment-dependent methods such as Cufflinks, using all k-mers in the transcriptome impedes the scalability of the method. Results: We propose a novel RNA-Seq quantification method, RNA-Skim, which partitions the transcriptome into disjoint transcript clusters based on sequence similarity, and introduces the notion of sig-mers, which are a special type of k-mers uniquely associated with each cluster. We demonstrate that the sig-mer counts within a cluster are sufficient for estimating transcript abundances with accuracy comparable with any state-of-the-art method. This enables RNA-Skim to perform transcript quantification on each cluster independently, reducing a complex optimization problem into smaller optimization tasks that can be run in parallel. As a result, RNA-Skim uses <4% of the k-mers and <10% of the CPU time required by Sailfish. It is able to finish transcriptome quantification in <10 min per sample by using just a single thread on a commodity computer, which represents a >100-fold speedup over the state-of-the-art alignment-based methods, while delivering
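
    The sig-mer notion can be illustrated with a toy function: a k-mer is a sig-mer of a cluster if it occurs in that cluster's transcripts and nowhere else. This is only the underlying idea; RNA-Skim's actual sig-mer selection and counting machinery is far more elaborate.

    from collections import defaultdict

    def cluster_sig_mers(clusters, k=25):
        # clusters: dict mapping cluster id -> list of transcript sequences.
        owner = {}
        for cid, seqs in clusters.items():
            for seq in seqs:
                for p in range(len(seq) - k + 1):
                    kmer = seq[p:p + k]
                    # First sighting claims the k-mer; a sighting from a
                    # different cluster disqualifies it (owner = None).
                    owner[kmer] = cid if owner.get(kmer, cid) == cid else None
        sig = defaultdict(set)
        for kmer, cid in owner.items():
            if cid is not None:
                sig[cid].add(kmer)
        return dict(sig)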

  18. Metal Stable Isotope Tagging: Renaissance of Radioimmunoassay for Multiplex and Absolute Quantification of Biomolecules.

    PubMed

    Liu, Rui; Zhang, Shixi; Wei, Chao; Xing, Zhi; Zhang, Sichun; Zhang, Xinrong

    2016-05-17

    The unambiguous quantification of biomolecules is of great significance in fundamental biological research as well as practical clinical diagnosis. Due to the lack of a detectable moiety, the direct and highly sensitive quantification of biomolecules is often a "mission impossible". Consequently, tagging strategies to introduce detectable moieties for labeling target biomolecules were invented, which have had a long and significant impact on studies of biomolecules over the past decades. For instance, immunoassays were developed with radioisotope tagging by Yalow and Berson in the late 1950s. The later languishment of this technology can be ascribed almost exclusively to the use of radioactive isotopes, which led to the development of assays based on nonradioactive tagging strategies, such as the enzyme-linked immunosorbent assay, fluorescent immunoassay, and chemiluminescent and electrochemiluminescent immunoassays. Despite great success, these strategies suffered from drawbacks such as limited spectral window capacity for multiplex detection and an inability to provide absolute quantification of biomolecules. Recalling this sequence of tagging strategies raises an apparent question: why not use stable isotopes from the start? A reasonable explanation is the lack of reliable means for accurate and precise quantification of stable isotopes at that time. The situation has changed greatly at present, since several atomic mass spectrometric methods for measuring metal stable isotopes have been developed. Among the newly developed techniques, inductively coupled plasma mass spectrometry is an ideal technique for determining metal stable isotope-tagged biomolecules, owing to its high sensitivity, wide dynamic linear range, and, more importantly, its multiplex and absolute quantification ability. Since the first published report by our group, metal stable isotope tagging has become a revolutionary technique and gained great success in biomolecule quantification. An exciting research highlight in this area

  19. Synthesis of nanodiamond derivatives carrying amino functions and quantification by a modified Kaiser test

    PubMed Central

    Jarre, Gerald; Heyer, Steffen; Memmel, Elisabeth; Meinhardt, Thomas

    2014-01-01

    Nanodiamonds functionalized with different organic moieties carrying terminal amino groups have been synthesized. These include conjugates generated by Diels–Alder reactions of ortho-quinodimethanes formed in situ from pyrazine and 5,6-dihydrocyclobuta[d]pyrimidine derivatives. For the quantification of primary amino groups a modified photometric assay based on the Kaiser test has been developed and validated for different types of aminated nanodiamond. The results correspond well to values obtained by thermogravimetry. The method represents an alternative wet-chemical quantification method in cases where other techniques like elemental analysis fail due to unfavourable combustion behaviour of the analyte or other impediments. PMID:25550737
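
    A Kaiser-type photometric readout reduces to a Beer-Lambert back-calculation; the sketch below converts an absorbance into an amine loading, where the molar absorptivity (~15,000 L mol^-1 cm^-1, a value commonly cited for Ruhemann's purple) and the 1 cm path length are assumed values, not parameters taken from the paper.

    def amine_loading_umol_per_g(absorbance, volume_ml, mass_mg,
                                 epsilon=1.5e4, path_cm=1.0):
        # c = A / (epsilon * l) gives the chromophore concentration in
        # mol/L; convert to micromoles of primary amine per gram of
        # nanodiamond sample.
        conc_mol_per_l = absorbance / (epsilon * path_cm)
        amount_umol = conc_mol_per_l * (volume_ml / 1000.0) * 1e6
        return amount_umol / (mass_mg / 1000.0)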

  20. Dakota uncertainty quantification methods applied to the NEK-5000 SAHEX model.

    SciTech Connect

    Weirs, V. Gregory

    2014-03-01

    This report summarizes the results of a NEAMS project focused on the use of uncertainty and sensitivity analysis methods within the NEK-5000 and Dakota software framework for assessing failure probabilities as part of probabilistic risk assessment. NEK-5000 is a software tool under development at Argonne National Laboratory to perform computational fluid dynamics calculations for applications such as thermohydraulics of nuclear reactor cores. Dakota is a software tool developed at Sandia National Laboratories containing optimization, sensitivity analysis, and uncertainty quantification algorithms. The goal of this work is to demonstrate the use of uncertainty quantification methods in Dakota with NEK-5000.

  1. Application of relative quantification TaqMan real-time polymerase chain reaction technology for the identification and quantification of Thunnus alalunga and Thunnus albacares.

    PubMed

    Lopez, Itziar; Pardo, Miguel Angel

    2005-06-01

    A novel one-step methodology based on real-time Polymerase Chain Reaction (PCR) technology has been developed for the identification of two of the most valuable tuna species. Nowadays, species identification of seafood products is a major concern due to the import into Europe of new species from other countries. To achieve this aim, two specific TaqMan systems were devised to identify Thunnus alalunga and Thunnus albacares. Another system, specific to Scombroidei species, was devised as a consensus system. In addition, a relative quantification methodology was carried out to quantify T. alalunga and T. albacares in mixtures, in which the relative amount of the target was compared with the consensus. This relative quantification methodology does not require a known amount of standard, allowing many more samples to be analysed together and saving costs and time. The use of real-time PCR does not require sample handling, preventing contamination and resulting in much faster and higher-throughput results. PMID:15913324
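
    The generic arithmetic behind such relative quantification is the efficiency-corrected delta-Ct formula; the sketch below assumes equal amplification efficiencies for the target and consensus systems (E = 2.0, perfect doubling per cycle) and is the textbook formulation, not necessarily the authors' exact model.

    def relative_amount(ct_target, ct_consensus, efficiency=2.0):
        # Relative amount of the target species versus the consensus
        # Scombroidei signal measured in the same sample:
        # amount = E ** -(Ct_target - Ct_consensus).
        return efficiency ** (ct_consensus - ct_target)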

  2. Secondary Students' Quantification of Ratio and Rate: A Framework for Reasoning about Change in Covarying Quantities

    ERIC Educational Resources Information Center

    Johnson, Heather Lynn

    2015-01-01

    Contributing to a growing body of research addressing secondary students' quantitative and covariational reasoning, the multiple case study reported in this article investigated secondary students' quantification of ratio and rate. This article reports results from a study investigating students' quantification of rate and ratio as…

  3. Emphysema quantification from CT scans using novel application of diaphragm curvature estimation: comparison with standard quantification methods and pulmonary function data

    NASA Astrophysics Data System (ADS)

    Keller, Brad M.; Reeves, Anthony P.; Yankelevitz, David F.; Henschke, Claudia I.; Barr, R. Graham

    2009-02-01

    Emphysema is a disease of the lungs that destroys the alveolar air sacs and induces long-term respiratory dysfunction. CT scans allow for imaging of the anatomical basis of emphysema and quantification of the underlying disease state. Several measures have been introduced for the quantification of emphysema directly from CT data; most, however, are based on the analysis of density information provided by the CT scans, which varies by scanner and can be hard to standardize across sites and time. Given that one of the anatomical changes associated with the progression of emphysema is the flattening of the diaphragm due to the loss of elasticity in the lung parenchyma, curvature analysis of the diaphragm can provide information about emphysema from CT. We therefore propose a new, non-density-based measure of diaphragm curvature intended to allow robust quantification. To evaluate the new method, 24 whole-lung scans were analyzed using the ratios of lung height and of diaphragm width to diaphragm height as curvature estimates, with the emphysema index for comparison. Pearson correlation coefficients showed that several of the proposed diaphragm curvature measures tended to correlate more strongly with DLCO% and VA (up to r=0.57) than did the emphysema index. Furthermore, the emphysema index correlated with the proposed measures at only r=0.27, indicating that the proposed measures evaluate a different aspect of the disease.
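
    The curvature surrogates are simple shape ratios; a minimal sketch, assuming the ratio definitions below match the ones intended (extracting the three distances from the segmented CT is the hard part and is omitted):

```python
def curvature_estimates(lung_height_mm, diaphragm_width_mm, diaphragm_height_mm):
    """Shape ratios used as surrogates for diaphragm curvature: a
    flatter (more emphysematous) diaphragm has a smaller dome height,
    so both ratios grow as the disease progresses."""
    return {
        "lung_height_to_dome": lung_height_mm / diaphragm_height_mm,
        "width_to_dome": diaphragm_width_mm / diaphragm_height_mm,
    }

print(curvature_estimates(260.0, 240.0, 35.0))  # illustrative distances
```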

  4. Greenhouse Gas Source Attribution: Measurements Modeling and Uncertainty Quantification

    SciTech Connect

    Liu, Zhen; Safta, Cosmin; Sargsyan, Khachik; Najm, Habib N.; van Bloemen Waanders, Bart Gustaaf; LaFranchi, Brian W.; Ivey, Mark D.; Schrader, Paul E.; Michelsen, Hope A.; Bambha, Ray P.

    2014-09-01

    In this project we have developed atmospheric measurement capabilities and a suite of atmospheric modeling and analysis tools that are well suited to verifying emissions of greenhouse gases (GHGs) on an urban-through-regional scale. We have for the first time applied the Community Multiscale Air Quality (CMAQ) model to simulate atmospheric CO2. This allows the examination of regional-scale transport and distribution of CO2, along with air pollutants traditionally studied using CMAQ, at relatively high spatial and temporal resolution, with the goal of leveraging emissions verification efforts for both air quality and climate. We have developed a bias-enhanced Bayesian inference approach that can remedy the well-known problem of transport model errors in atmospheric CO2 inversions. We have tested the approach using data and model outputs from the TransCom3 global CO2 inversion comparison project. We have also performed two prototyping studies on inversion approaches in the generalized convection-diffusion context. One of these studies employed Polynomial Chaos Expansion to accelerate the evaluation of a regional transport model and enable efficient Markov Chain Monte Carlo sampling of the posterior for Bayesian inference. The other approach uses deterministic inversion of a convection-diffusion-reaction system in the presence of uncertainty. These approaches should, in principle, be applicable to realistic atmospheric problems with moderate adaptation. We outline a regional greenhouse gas source inference system that integrates (1) two approaches to atmospheric dispersion simulation and (2) a class of Bayesian inference and uncertainty quantification algorithms. We use two different and complementary approaches to simulate atmospheric dispersion: a Eulerian chemical transport model, CMAQ, and a Lagrangian particle dispersion model, FLEXPART-WRF. These two models share the same WRF
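
    The core of such a source inference is a linear Bayesian inversion; a minimal sketch under Gaussian assumptions, with a random matrix standing in for the transport operator that CMAQ/FLEXPART-style footprints would supply:

```python
import numpy as np

# Bayesian linear flux inversion: y = H x + e, x ~ N(x0, B), e ~ N(0, R).
# The posterior mean is the standard analytic update.
rng = np.random.default_rng(0)
n_src, n_obs = 4, 20
H = rng.random((n_obs, n_src))           # stand-in transport operator
x_true = np.array([1.0, 0.5, 2.0, 0.0])  # "true" fluxes (illustrative)
y = H @ x_true + 0.05 * rng.standard_normal(n_obs)

x0 = np.zeros(n_src)
B = np.eye(n_src)            # prior flux covariance
R = 0.05**2 * np.eye(n_obs)  # observation-error covariance

K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)  # Kalman-type gain
x_post = x0 + K @ (y - H @ x0)
print(np.round(x_post, 2))   # recovers x_true up to noise
```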

  5. Quantification of subsurface pore pressure through IODP drilling

    NASA Astrophysics Data System (ADS)

    Saffer, D. M.; Flemings, P. B.

    2010-12-01

    It is critical to understand the magnitude and distribution of subsurface pore fluid pressure: it controls effective stress and thus mechanical strength, slope stability, and sediment compaction. Elevated pore pressures also drive fluid flows that serve as agents of mass, solute, and heat fluxes. The Ocean Drilling Program (ODP) and Integrated Ocean Drilling Program (IODP) have provided important avenues to quantify pore pressure in a range of geologic and tectonic settings. These approaches include 1) analysis of continuous downhole logs and shipboard physical properties data to infer compaction state and in situ pressure and stress, 2) laboratory consolidation testing of core samples collected by drilling, 3) direct downhole measurements using pore pressure probes, 4) pore pressure and stress measurements using downhole tools that can be deployed in the wide-diameter pipe recently acquired for riser drilling, and 5) long-term monitoring of formation pore pressure in sealed boreholes within hydraulically isolated intervals. Here, we summarize key advances in the quantification of subsurface pore pressure rooted in scientific drilling, highlighted with examples from subduction zones, the Gulf of Mexico, and the New Jersey continental shelf. At the Nankai, Costa Rican, and Barbados subduction zones, consolidation testing of core samples, combined with analysis of physical properties data, indicates that even within a few km landward of the trench, pore pressures in and below plate boundary décollement zones reach a significant fraction of the lithostatic load (λ*=0.25-0.91). These results document a viable and quantifiable mechanism to explain the mechanical weakness of subduction décollements, and are corroborated by a small number of direct measurements in sealed boreholes and by inferences from seismic reflection data. Recent downhole measurements conducted during riser drilling using the modular formation dynamics tester wireline tool (MDT) in a forearc basin ~50
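
    λ* is conventionally the modified pore pressure ratio, normalizing overpressure between the hydrostatic (0) and lithostatic (1) bounds; a minimal sketch, assuming that convention applies here:

```python
def lambda_star(p_pore, p_hydro, sigma_v):
    """Modified pore pressure ratio: 0 = hydrostatic, 1 = lithostatic.
    All pressures in consistent units (e.g., MPa)."""
    return (p_pore - p_hydro) / (sigma_v - p_hydro)

# e.g. 60 MPa pore pressure, 40 MPa hydrostatic, 70 MPa lithostatic
print(round(lambda_star(60.0, 40.0, 70.0), 2))  # 0.67
```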

  6. Quantification of confidence in a geological model of Cumbria, UK

    NASA Astrophysics Data System (ADS)

    Waters, Colin; Lark, Murray; Mathers, Steve; Marchant, Andrew; Hulbert, Andrew

    2015-04-01

    A three-dimensional geological model of Cumbria was constructed by several geologists, applying expert judgment to interpret the available data, both as a fence network of cross-sections in GSI3D and as surfaces in GOCAD®. Direct statistical measures of the model's uncertainty are not available, nor is it feasible to undertake post hoc sampling at additional independent boreholes to estimate such measures. The study considered various qualitative and quantitative approaches to assessing the modelled surfaces and volumes. Modellers make judgments about the relative quality of different types of available data and the extent to which simple trends in the units of interest (e.g. a gentle dip) allow their structure to be extrapolated with confidence away from observations. Confidence decays with increasing distance from a hard observation, such as a field exposure or an interpreted borehole, or from a "softer" observation such as a geophysical measurement. In the study area it is possible to make qualitative assessments of four distinct structural domains, each marked by a different level of confidence in the interpretation of the geological model, reflecting factors such as the availability of deep borehole data, seismic lines and surface exposure, the complexity of the bedrock geology, and an appraisal of the extent, amount and quality of the data used to constrain the boundaries presented within the model. The study also explored quantitative approaches to assessing the type and distribution of data. The quantification of a Confidence Index uses expert elicitation to assess the certainty of subsurface interpretations of modelled surfaces and volumes, based upon a statistical analysis of proximity to subsurface data (boreholes and seismic data). Application of this approach is presented as elevation and thickness grids for a principal aquifer in the region. This approach is directly applicable in areas where bedrock strata are poorly exposed and the
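
    The distance-decay idea can be made concrete with a simple kernel; the exponential form and the 500 m decay length below are purely illustrative assumptions, not the study's elicited values:

```python
import numpy as np

def confidence(dist_m, decay_length_m=500.0):
    """Confidence in [0, 1] decaying with distance from the nearest
    hard observation (borehole, exposure, seismic line)."""
    return np.exp(-np.asarray(dist_m) / decay_length_m)

print(confidence([0, 250, 1000]))  # ~[1.0, 0.61, 0.14]
```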

  7. High Resolution Quantification of Cellular Forces for Rigidity Sensing

    NASA Astrophysics Data System (ADS)

    Liu, Shuaimin

    This thesis describes a comprehensive study of the mechanism of rigidity sensing through quantitative analysis using submicron pillar array substrates. From a mechanobiology perspective, we explore the molecular pathways involved in rigidity and force sensing at cell-matrix adhesions, with regard to cancer, regeneration, and development, using quantification methods. In Chapters 2 and 3, we developed fabrication and imaging techniques to enhance the performance of a submicron pillar device in terms of spatial and temporal measurement ability, and we discovered a correlation between rigidity-sensing forces and the corresponding proteins involved in early rigidity-sensing events. In Chapter 2, we introduce the optical effects arising from submicron structure imaging and describe a technique to identify the correct focal plane of the pillar tip by fabricating a substrate with designed-offset pillars. From the calibration results, we identified a correct focal plane that had previously been overlooked, and verified our findings with other imaging techniques. In Chapter 3, we describe several techniques to selectively functionalize the tops of elastomeric pillars and compare them in terms of purpose and fabrication complexity. These include direct labeling, such as stamping of fluorescent substances (organic dye, nano-diamond, q-dot) onto pillar tops, as well as indirect labeling that selectively modifies the surface of the molds with either metal or fluorescent substances. In Chapter 4, we examine the characteristics of local contractility forces and identify the components of a sarcomere-like contractile unit (CU) that cells use to sense rigidity. CUs were found to be assembled at the cell edge, to contain myosin II, alpha-actinin, tropomodulin and tropomyosin (Tm), and to resemble sarcomeres in size (~2 µm) and function. We then performed quantitative analysis of CUs to evaluate rigidity-sensing activity over an ~8 hour time course and found that
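
    Pillar-array force measurement rests on treating each pillar as a cantilevered beam, so force follows directly from tip deflection; a minimal sketch with illustrative PDMS modulus and pillar geometry (not the thesis's actual device parameters):

```python
import math

def pillar_force_nN(deflection_um, radius_um=0.25, length_um=1.5,
                    youngs_modulus_kPa=2000.0):
    """Force from a pillar's tip deflection, treating the pillar as a
    cantilevered Euler-Bernoulli beam: k = 3*E*pi*r^4 / (4*L^3).
    Geometry and modulus are assumed, illustrative values."""
    E = youngs_modulus_kPa * 1e3             # Pa
    r = radius_um * 1e-6                     # m
    L = length_um * 1e-6                     # m
    k = 3 * E * math.pi * r**4 / (4 * L**3)  # spring constant, N/m
    return k * deflection_um * 1e-6 * 1e9    # force in nN

print(round(pillar_force_nN(0.1), 2))  # ~0.55 nN for a 100 nm deflection
```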

  8. Photoacoustic sensor system for the quantification of soot aerosols (abstract)

    NASA Astrophysics Data System (ADS)

    Haisch, C.; Beck, H.; Niessner, R.

    2003-01-01

    The influence of soot particles on human health as well as on global and local climate is well established by now; hence, the need for fast and sensitive soot detection in urban and remote areas is obvious. The state-of-the-art thermochemical detection methods for soot analysis are based on filter sampling with subsequent wet-chemical analysis and combustion, which requires laborious and time-consuming sample preparation. Because the sample is integrated on a filter, time-resolved analysis is not possible. The presented photoacoustic sensor system is optimized for highly sensitive and fast on-line, in situ quantification of soot. Soot particles, as classical "black absorbers," absorb electromagnetic radiation over the whole spectrum. Two similar systems are introduced. The first system is designed for the development and testing of combustion engines, mainly the next generation of diesel engines. In the next decade, legal thresholds for extremely low particle emissions are foreseen; their implementation will only be possible if time-resolved soot detection with sufficient sensitivity can be realized, as the highest particle emissions from diesel engines are generated only for seconds during load changes, when the emitted soot concentration can rise by several orders of magnitude for a period of only a few seconds. The system combines a time resolution of 1 s (sampling rate 1 Hz) with an aerosol mass sensitivity better than 10 μg m-3. Up to a maximum particle dimension of about 800 nm, the signal is independent of particle size. Each system consists of two photoacoustic cells operated in a differential mode to avoid cross-sensitivities. The cells are built as acoustic resonators to increase sensitivity. A diode laser with a wavelength of 810 nm and an output power of 1.1 W is employed for excitation; its collimated beam passes first through the reference cell and then through the measurement cell. To avoid condensation of water, the cells are heated to

  9. Quantification of Water Erosion on Subalpine Grassland with Rain Simulators

    NASA Astrophysics Data System (ADS)

    Schindler, Y.; Alewell, Ch.; Burri, K.; Bänninger, D.

    2009-04-01

    Intensive land use and increasing storm events trigger rain erosion, making its quantification important. The aim of this study was to assess the influence of vegetation on runoff and water erosion in an alpine grassland area. Further, we estimated the influence of vegetation on soil characteristics (matrix stability and C/N ratio) and assessed the relationship between those parameters, as well as grain size distribution, and the erosion and runoff rates. To test these hypotheses, a field spray-nozzle/drop-former hybrid simulator was used, consisting of a full-cone Lechler nozzle with a mesh fixed below it to improve the raindrop distribution. Prior to the field experiment, we compared this simulator with a drop-former simulator in the laboratory at the Swiss Federal Institute for Forest, Snow and Landscape Research (WSL) in terms of drop size distribution and kinetic energy, which allowed us to estimate the accuracy of the field simulator. The raindrop size distribution and the total kinetic energy of the drops at a rain intensity of 60 mm h-1 were measured with a Joss-Waldvogel distrometer. To compare the effect of the two rain simulators as well as the influence of soil texture on erosion and runoff rates, we used 6 silty and 6 clayey soil monoliths. To obtain comparable initial conditions, every soil monolith was irrigated only once, starting at field capacity, and soil moisture was continuously recorded by TDR probes during the simulation. The comparison of the two rain simulators showed closely similar drop size distributions; for both simulators, the most frequent drop size class is around 1 mm in diameter. Natural rain typically shows a larger mean drop size at an intensity of 60 mm h-1, and in comparison to natural rain, the total kinetic energy of the simulated rain was likewise too small for both simulators. These results lead to the conclusion that a true simulation of natural rain is hardly realizable
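
    Rainfall kinetic energy follows from the measured drop-size distribution; a minimal sketch, with illustrative bin counts and terminal velocities standing in for actual distrometer output:

```python
import numpy as np

# Total kinetic energy flux of rain from a binned drop-size
# distribution (Joss-Waldvogel-style). All numbers illustrative.
RHO_W = 1000.0  # water density, kg/m^3

d_mm = np.array([0.5, 1.0, 2.0, 3.0])      # drop diameter bins
counts = np.array([800, 1200, 150, 10.0])  # drops per m^2 per s
v_ms = np.array([2.1, 4.0, 6.5, 8.1])      # terminal velocities, m/s

mass_kg = RHO_W * np.pi / 6.0 * (d_mm * 1e-3) ** 3  # mass per drop
ke_flux = np.sum(0.5 * mass_kg * v_ms**2 * counts)  # J per m^2 per s
print(f"kinetic energy flux: {ke_flux:.4f} J m^-2 s^-1")
```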

  10. Quantification of the molecular species of tetraacylglycerols in lesquerella (Physaria fendleri) Oil by HPLC and MS

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Thirteen molecular species of tetraacylglycerols in the seed oil of Physaria fendleri were recently identified. We report here the quantification of these tetraacylglycerols using HPLC with evaporative light scattering detector and the MS of the HPLC fractions. Ion signal intensities of MS1 from th...

  11. A Clinical Method for the Detection and Quantification of Quick Respiratory Hyperkinesia

    ERIC Educational Resources Information Center

    Hixon, Thomas J.; Hoit, Jeannette D.

    2006-01-01

    Purpose: Quick respiratory hyperkinesia can be difficult to detect with the naked eye. A clinical method is described for the detection and quantification of quick respiratory hyperkinesia. Method: Flow at the airway opening is sensed during spontaneous apnea (rest), voluntary breath holding (postural fixation), and voluntary volume displacement…

  12. Visualization and quantification of evolving datasets. Final report: 8-1-93 - 4-30-97

    SciTech Connect

    Zabusky, N.; Silver, D.

    1999-07-20

    This is the final technical/progress report of the Laboratory for Visiometrics and Modeling (Vizlab) for the grant entitled Visualization and Quantification of Evolving Phenomena. The work included coordination with DOE-supported scientists at Los Alamos National Laboratory (LANL) and Princeton Plasma Physics Laboratory (PPPL), and with theoretical and computational physicists at the National Institute of Fusion Science (NIFS) in Nagoya, Japan and the Institute of Laser Engineering (ILE) in Osaka, Japan. The authors' research areas included: enhancement and distribution of the DAVID environment, a 2D visualization environment incorporating many advanced quantifications and diagnostics useful for prediction, understanding, and reduced-model formation; feature extraction, tracking, and quantification of 3D time-dependent datasets from nonlinear and turbulent simulations, both compressible and incompressible, applicable to all 3D time-varying simulations; and visiometrics in shock-interface interactions and mixing for the Richtmyer-Meshkov (RM) environment, highlighting reduced models for nonlinear evolutions and the role of density-stratified interfaces (contact discontinuities), with application to supernova physics, laser fusion, and supersonic combustion. The collaborative projects included (1) feature extraction, tracking, and quantification in 3D compressible and incompressible turbulence; (2) the Numerical Tokamak Project (NTP); and (3) data projection and reduced modeling for shock-interface interactions and mixing in the Richtmyer-Meshkov environment relevant to laser fusion and combustion.

  13. AVQS: Attack Route-Based Vulnerability Quantification Scheme for Smart Grid

    PubMed Central

    Lim, Hyunwoo; Lee, Seokjun; Shon, Taeshik

    2014-01-01

    A smart grid is a large, consolidated electrical grid system that includes heterogeneous networks and systems. As a consequence, a smart grid system faces potential security threats through its network connectivity. To address this problem, we develop and apply a novel scheme to measure vulnerability in a smart grid domain. Vulnerability quantification can be the first step in security analysis because it helps prioritize security problems. However, existing vulnerability quantification schemes are not suitable for smart grids because they do not consider network vulnerabilities. We propose a novel attack route-based vulnerability quantification scheme that uses a network vulnerability score and an end-to-end security score, depending on the specific smart grid network environment, to calculate the vulnerability score for a particular attack route. To evaluate the proposed approach, we derive several attack scenarios from the advanced metering infrastructure domain. The experimental results of the proposed approach and the existing common vulnerability scoring system clearly show that network connectivity must be considered for better-optimized vulnerability quantification. PMID:25152923
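
    The abstract does not reproduce the paper's aggregation formula, so the sketch below is a hypothetical stand-in that only illustrates the idea of scoring one attack route from per-hop vulnerability values plus an end-to-end security term:

```python
# Hypothetical route-scoring model, NOT the paper's actual AVQS formula:
# mean of per-hop CVSS-like scores, discounted by end-to-end security.
def route_score(hop_scores, e2e_security):
    """hop_scores: CVSS-like values in [0, 10] along one attack route;
    e2e_security: [0, 1], higher means better end-to-end protection."""
    network_vuln = sum(hop_scores) / len(hop_scores)
    return network_vuln * (1.0 - e2e_security)

# An AMI-style route: meter -> collector -> head-end (illustrative)
print(round(route_score([6.5, 7.8, 9.0], e2e_security=0.4), 2))
```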

  14. Quantification and comparison of neurosurgical approaches in the preclinical setting: literature review.

    PubMed

    Doglietto, F; Radovanovic, I; Ravichandiran, M; Agur, A; Zadeh, G; Qiu, J; Kucharczyk, W; Fernandez, E; Fontanella, M M; Gentili, F

    2016-07-01

    There is a growing awareness of the need for evidence-based surgery and of the issues specific to research in surgery. Well-conducted anatomical studies can represent the first, preclinical step for evidence-based surgical innovation and evaluation. In the last two decades, various reports have quantified and compared neurosurgical approaches in the anatomy laboratory using different methods and technologies. The aim of this study was to critically review these papers. A PubMed and Scopus search was performed to select articles that quantified and compared different neurosurgical approaches in the preclinical setting. The basic characteristics that anatomically define a surgical approach were defined, and each study was analyzed for the features measured and the quantification method and technique used. Ninety-nine papers, published from 1990 to 2013, were included in this review. A heterogeneous use of terms to define the features of a surgical approach was evident. Different methods to study these features have been reported, generally based on the quantification of distances, angles, and areas. Measuring tools have evolved from the simple ruler to frameless stereotactic devices. Each of the reported methods has specific advantages and limitations; a common limitation is the lack of 3D visualization and surgical volume quantification. There is a need for a uniform nomenclature in anatomical studies. Frameless stereotactic devices provide a powerful tool for anatomical studies, but volume quantification and 3D visualization of the surgical approach are not provided by most available methods. PMID:26782812

  15. Automated pericardial fat quantification from coronary magnetic resonance angiography: feasibility study.

    PubMed

    Ding, Xiaowei; Pang, Jianing; Ren, Zhou; Diaz-Zamudio, Mariana; Jiang, Chenfanfu; Fan, Zhaoyang; Berman, Daniel S; Li, Debiao; Terzopoulos, Demetri; Slomka, Piotr J; Dey, Damini

    2016-01-01

    Pericardial fat volume (PFV) is emerging as an important parameter for cardiovascular risk stratification. We propose a hybrid approach for automated PFV quantification from water/fat-resolved whole-heart noncontrast coronary magnetic resonance angiography (MRA). Ten coronary MRA datasets were acquired. Image reconstruction and phase-based water-fat separation were conducted offline. Our proposed algorithm first roughly segments the heart region on the original image using a simplified atlas-based segmentation with four cases in the atlas. To get exact boundaries of pericardial fat, a three-dimensional graph-based segmentation is used to generate fat and nonfat components on the fat-only image. The algorithm then selects the components that represent pericardial fat. We validated the quantification results on the remaining six subjects and compared them with manual quantifications by an expert reader. The PFV quantified by our algorithm was [Formula: see text], compared to [Formula: see text] by the expert reader, which were not significantly different ([Formula: see text]) and showed excellent correlation ([Formula: see text],[Formula: see text]). The mean absolute difference in PFV between the algorithm and the expert reader was [Formula: see text]. The mean value of the paired differences was [Formula: see text] (95% confidence interval: [Formula: see text] to 6.21). The mean Dice coefficient of pericardial fat voxels was [Formula: see text]. Our approach may potentially be applied in a clinical setting, allowing for accurate magnetic resonance imaging (MRI)-based PFV quantification without tedious manual tracing. PMID:26958578

  16. Comparison of isolation and quantification methods to measure humic-like substances (HULIS) in atmospheric particles

    NASA Astrophysics Data System (ADS)

    Fan, Xingjun; Song, Jianzhong; Peng, Ping'an

    2012-12-01

    Humic-like substances (HULIS) comprise a significant fraction of the water-soluble organic aerosol mass and influence the cloud microphysical properties and climate effects of aerosols in the atmosphere. In this work, the most frequently used HULIS isolation and quantification methods, including ENVI-18, HLB, XAD-8, and DEAE, were comparatively characterized with two model standards, ten interfering compounds, and five ambient aerosol samples. Quantification of HULIS was performed with a TOC analyzer, complemented by an investigation of the chemical structure of the extracted fractions by UV-Vis spectroscopy. The results show that the four isolation methods are all characterized by high reliability, high reproducibility, and low limits of detection (LOD), indicating that each can efficiently recover Suwannee River Fulvic Acid (SRFA) and be applied to the quantification of the lower amounts of HULIS in atmospheric particles. The UV-Vis spectra of the isolated HULIS fractions also indicate that all methods favor the extraction of compounds of high UV absorbance, high MW, and high aromaticity, with this tendency most pronounced for the DEAE protocol. Compared with the DEAE method, which favors extraction of highly UV-absorbing and more aromatic compounds, the SRFA isolated by the ENVI-18, HLB, and XAD-8 protocols was more representative of the global matrix. Each method has its own advantages and disadvantages and is suitable for a particular application; no single method is ideal for both isolation and quantification of HULIS in atmospheric samples.

  17. The Qiagen Investigator® Quantiplex HYres as an alternative kit for DNA quantification.

    PubMed

    Frégeau, Chantal J; Laurin, Nancy

    2015-05-01

    The Investigator® Quantiplex HYres kit was evaluated as a potential replacement for dual DNA quantification of casework samples. The kit proved highly sensitive, with a limit of quantification of 0.0049 ng/μL and a limit of detection of 0.0003 ng/μL for both human and male DNA, using full or half reaction volumes. It was also accurate in assessing the amount of male DNA present in 96 mock and actual casework male:female mixtures (various ratios) processed in this exercise. The close correlation between the male/human DNA ratios (expressed as percentages) derived from the Investigator® Quantiplex HYres quantification results and the male DNA proportion calculated in mixed AmpFlSTR® Profiler® Plus or AmpFlSTR® Identifiler® Plus profiles, using the Amelogenin Y peak and STR loci, allowed guidelines to be developed to facilitate decisions on when to submit samples to Y-STR rather than autosomal STR profiling. The internal control (IC) target was shown to be more sensitive to inhibitors than the human and male DNA targets included in the Investigator® Quantiplex HYres kit, making it a good quality assessor of DNA extracts. The new kit met our criteria of enhanced sensitivity, accuracy, consistency, reliability and robustness for casework DNA quantification. PMID:25603128

  18. Protein Quantification by Elemental Mass Spectrometry: An Experiment for Graduate Students

    ERIC Educational Resources Information Center

    Schwarz, Gunnar; Ickert, Stefanie; Wegner, Nina; Nehring, Andreas; Beck, Sebastian; Tiemann, Ruediger; Linscheid, Michael W.

    2014-01-01

    A multiday laboratory experiment was designed to integrate inductively coupled plasma-mass spectrometry (ICP-MS) in the context of protein quantification into an advanced practical course in analytical and environmental chemistry. Graduate students were familiar with the analytical methods employed, whereas the combination of bioanalytical assays…

  19. What Hinders Child Semantic Computation: Children's Universal Quantification and the Development of Cognitive Control

    ERIC Educational Resources Information Center

    Minai, Utako; Jincho, Nobuyuki; Yamane, Naoto; Mazuka, Reiko

    2012-01-01

    Recent studies on the acquisition of semantics have argued that knowledge of the universal quantifier is adult-like throughout development. However, there are domains where children still exhibit non-adult-like universal quantification, and arguments for the early mastery of relevant semantic knowledge do not explain what causes such…

  20. Quantification of fungicides in snow-melt runoff from turf: A comparison of four extraction methods

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A variety of pesticides are used to control diverse stressors to turf. These pesticides have a wide range in physical and chemical properties. The objective of this project was to develop an extraction and analysis method for quantification of chlorothalonil and PCNB (pentachloronitrobenzene), two p...

  1. A SIMPLE METHOD FOR THE EXTRACTION AND QUANTIFICATION OF PHOTOPIGMENTS FROM SYMBIODINIUM SPP.

    EPA Science Inventory

    John E. Rogers and Dragoslav Marcovich. Submitted. Simple Method for the Extraction and Quantification of Photopigments from Symbiodinium spp.. Limnol. Oceanogr. Methods. 19 p. (ERL,GB 1192).

    We have developed a simple, mild extraction procedure using methanol which, when...

  2. Quantification of excess water loss in plant canopies warmed with infrared heating

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Here we investigate the extent to which infrared heating used to warm plant canopies in climate manipulation experiments increases transpiration. Concerns regarding the impact of the infrared heater technique on the water balance have been raised before, but a quantification is lacking. We calculate...

  3. Samplers for evaluation and quantification of ultra-low volume space sprays

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A field study was conducted to investigate the suitability of sampling devices for quantification of spray deposition from ULV space sprays. Five different samplers were included in an experiment conducted in an open grassy field. Samplers included horizontally stretched stationary cotton ribbon at ...

  4. How to Improve Your Impact Factor: Questioning the Quantification of Academic Quality

    ERIC Educational Resources Information Center

    Smeyers, Paul; Burbules, Nicholas C.

    2011-01-01

    A broad-scale quantification of the measure of quality for scholarship is under way. This trend has fundamental implications for the future of academic publishing and employment. In this essay we want to raise questions about these burgeoning practices, particularly how they affect philosophy of education and similar sub-disciplines. First,…

  5. Quantification of taurine in energy drinks using ¹H NMR.

    PubMed

    Hohmann, Monika; Felbinger, Christine; Christoph, Norbert; Wachter, Helmut; Wiest, Johannes; Holzgrabe, Ulrike

    2014-05-01

    The consumption of so-called energy drinks is increasing, especially among adolescents. These beverages commonly contain considerable amounts of the amino sulfonic acid taurine, which is associated with a multitude of physiological effects. The customary method to control the legal limit of taurine in energy drinks is LC-UV/vis with postcolumn derivatization using ninhydrin. In this paper we describe the quantification of taurine in energy drinks by (1)H NMR as an alternative to existing methods. Variation of the pH revealed a distinct, well-separated taurine signal in the (1)H NMR spectra, which was used for integration and quantification. Quantification was performed using external calibration (R(2)>0.9999; linearity verified by Mandel's fitting test at a 95% confidence level) and PULCON. Taurine concentrations in 20 different energy drinks were analyzed both by (1)H NMR and by LC-UV/vis. The deviation between (1)H NMR and LC-UV/vis results was always below the expanded measurement uncertainty of 12.2% for the LC-UV/vis method (95% confidence level), and at worst 10.4%. Given the close agreement with the LC-UV/vis data and adequate recovery rates (ranging between 97.1% and 108.2%), (1)H NMR is a suitable method to quantify taurine in energy drinks. PMID:24094700
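
    A minimal sketch of the external-calibration step: fit integrated signal area against standards of known concentration, then invert the fitted line for an unknown (values illustrative; the PULCON route instead references a single external spectrum):

```python
import numpy as np

# External calibration: integrated NMR peak area vs. known standards.
std_conc = np.array([0.5, 1.0, 2.0, 4.0])      # g/L taurine standards
std_area = np.array([0.98, 2.02, 3.97, 8.05])  # integrated peak areas

slope, intercept = np.polyfit(std_conc, std_area, 1)  # fit the line
unknown_area = 5.1
conc = (unknown_area - intercept) / slope             # invert for unknown
print(f"taurine: {conc:.2f} g/L")
```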

  6. Being Something: Prospects for a Property-Based Approach to Predicative Quantification

    ERIC Educational Resources Information Center

    Rieppel, Michael Olivier

    2013-01-01

    Few questions concerning the character of our talk about the world are more basic than how predicates combine with names to form truth-evaluable sentences. One particularly intriguing fact that any account of predication needs to make room for is that natural language allows for quantification into predicate position, through constructions like…

  7. Complex Quantification in Structured Query Language (SQL): A Tutorial Using Relational Calculus

    ERIC Educational Resources Information Center

    Kawash, Jalal

    2004-01-01

    The Structured Query Language (SQL) forms a substantial component of introductory database courses and is supported by almost every commercial database product. One disadvantage of SQL is that it does not provide a universal quantification construct. Queries that intertwine universal and existential quantifiers can be daunting for students,…
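
    SQL's missing FORALL is conventionally emulated with a double NOT EXISTS (relational division); a minimal runnable sketch using Python's sqlite3, with an illustrative schema ("students who passed every course"):

```python
import sqlite3

# "Students who passed ALL courses" -- universal quantification
# rewritten as: no course exists that the student did not pass.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE course(id INTEGER PRIMARY KEY);
    CREATE TABLE passed(student TEXT, course_id INTEGER);
    INSERT INTO course VALUES (1), (2);
    INSERT INTO passed VALUES ('ann', 1), ('ann', 2), ('bob', 1);
""")
rows = con.execute("""
    SELECT DISTINCT p.student
    FROM passed p
    WHERE NOT EXISTS (
        SELECT 1 FROM course c
        WHERE NOT EXISTS (
            SELECT 1 FROM passed q
            WHERE q.student = p.student AND q.course_id = c.id))
""").fetchall()
print(rows)  # [('ann',)]
```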

  8. Geometric Foundation and Quantification of the Flow in a Verbal Expression.

    ERIC Educational Resources Information Center

    Bierschenk, Bernhard

    This paper presents the geometric foundation and quantification of Agent-action-Objective (AaO) kinematics. The meaningfulness of studying the flows in verbal expressions by splitting and splicing the strings in a verbal flow is related to the fact that free parameters are not needed, since it is not required that the presented methodological…

  9. Quantification of Dehalospirillum multivorans in Mixed-Culture Biofilms with an Enzyme-Linked Immunosorbent Assay

    PubMed Central

    Bauer-Kreisel, P.; Eisenbeis, M.; Scholz-Muramatsu, H.

    1996-01-01

    A fast, highly selective and sensitive method to quantify specific biomasses in mixed-culture biofilms is described. It consists of detachment of a biofilm from its support material, resolution of the detached biofilm flocs in order to separate the enclosed cells and antigens, and quantification of specific biomass by an enzyme-linked immunosorbent assay. PMID:16535389

  10. Species identification and quantification in meat and meat products using droplet digital PCR (ddPCR).

    PubMed

    Floren, C; Wiedemann, I; Brenig, B; Schütz, E; Beck, J

    2015-04-15

    Species fraud and product mislabelling in processed food, albeit not a direct health issue, often result in consumer distrust. Methods for the quantification of undeclared species are therefore needed. Targeting mitochondrial DNA, e.g. the CYTB gene, for species quantification is unsuitable due to a fivefold inter-tissue variation in mtDNA content per cell, which results in either an underestimation (-70%) or an overestimation (+160%) of species DNA content. Here, we describe a reliable two-step droplet digital PCR (ddPCR) assay targeting the nuclear F2 gene for the precise quantification of cattle, horse, and pig in processed meat products. The ddPCR assay is advantageous over qPCR, showing a limit of quantification (LOQ) and a limit of detection (LOD) in different meat products of 0.01% and 0.001%, respectively. The specificity was verified against 14 different species. Hence, determining F2 in food by ddPCR can be recommended for quality assurance and control in production systems. PMID:25466124
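
    ddPCR converts positive/negative droplet counts into an absolute concentration through Poisson statistics; a minimal sketch, assuming a typical ~0.85 nL droplet volume (this value is instrument-specific):

```python
import math

def ddpcr_copies_per_ul(positive, total, droplet_vol_nl=0.85):
    """Absolute target concentration from droplet counts via Poisson
    statistics: lambda = -ln(1 - p), p = fraction of positive droplets.
    The ~0.85 nL droplet volume is an assumed typical value."""
    lam = -math.log(1.0 - positive / total)  # mean copies per droplet
    return lam / (droplet_vol_nl * 1e-3)     # copies per microliter

print(round(ddpcr_copies_per_ul(4000, 15000), 1))  # ~365 copies/uL
```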

  11. Towards Quantification of Functional Breast Images Using Dedicated SPECT With Non-Traditional Acquisition Trajectories

    PubMed Central

    Perez, Kristy L.; Cutler, Spencer J.; Madhav, Priti; Tornai, Martin P.

    2012-01-01

    Quantification of radiotracer uptake in breast lesions can provide valuable information to physicians in deciding patient care or determining treatment efficacy. Physical processes (e.g., scatter, attenuation), detector/collimator characteristics, sampling and acquisition trajectories, and reconstruction artifacts all contribute to incorrect measurement of absolute tracer activity and distribution. For these experiments, a cylinder containing three syringes of varying radioactivity concentration, and a fillable 800 mL breast phantom with two lesion phantoms containing aqueous 99mTc pertechnetate, were imaged using the SPECT sub-system of the dual-modality SPECT-CT dedicated breast scanner. SPECT images were collected using a compact CZT camera with various 3D acquisitions, including vertical-axis-of-rotation, 30° tilted, and complex sinusoidal trajectories. Different energy windows around the photopeak, along with appropriate scatter energy windows, were quantitatively compared to determine the best quantification accuracy after attenuation and dual-window scatter correction. Measured activity concentrations in the reconstructed images for syringes with greater than 10 µCi/mL corresponded to within 10% of the dose-calibrator-measured activity concentration for ±4% and ±8% photopeak energy windows. The same energy windows yielded lesion quantification results within 10% in the breast phantom as well. Results for the more complete complex sinusoidal trajectory are similar to those for the simple vertical-axis acquisition, while additionally allowing anterior chest wall sampling, avoiding image distortion, and providing reasonably accurate quantification. PMID:22262925
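
    A minimal sketch of the dual-energy-window scatter correction underlying such quantification; the window widths and scatter multiplier k below are illustrative, as the calibration is system-specific:

```python
# Dual-energy-window (DEW) scatter correction: scatter counts measured
# in a lower window are scaled into the photopeak window and subtracted.
def dew_corrected_counts(photopeak, scatter, w_pp=18.0, w_sc=6.0, k=0.5):
    """photopeak/scatter: counts in the photopeak and scatter windows;
    w_pp/w_sc: window widths in keV; k: empirical scatter multiplier."""
    return photopeak - k * scatter * (w_pp / w_sc)

print(dew_corrected_counts(12000, 1500))  # 12000 - 0.5*1500*3 = 9750.0
```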

  12. AVQS: attack route-based vulnerability quantification scheme for smart grid.

    PubMed

    Ko, Jongbin; Lim, Hyunwoo; Lee, Seokjun; Shon, Taeshik

    2014-01-01

    A smart grid is a large, consolidated electrical grid system that includes heterogeneous networks and systems. As a consequence, a smart grid system faces potential security threats through its network connectivity. To address this problem, we develop and apply a novel scheme to measure vulnerability in a smart grid domain. Vulnerability quantification can be the first step in security analysis because it helps prioritize security problems. However, existing vulnerability quantification schemes are not suitable for smart grids because they do not consider network vulnerabilities. We propose a novel attack route-based vulnerability quantification scheme that uses a network vulnerability score and an end-to-end security score, depending on the specific smart grid network environment, to calculate the vulnerability score for a particular attack route. To evaluate the proposed approach, we derive several attack scenarios from the advanced metering infrastructure domain. The experimental results of the proposed approach and the existing common vulnerability scoring system clearly show that network connectivity must be considered for better-optimized vulnerability quantification. PMID:25152923

  13. Quantification of Dehalospirillum multivorans in Mixed-Culture Biofilms with an Enzyme-Linked Immunosorbent Assay.

    PubMed

    Bauer-Kreisel, P; Eisenbeis, M; Scholz-Muramatsu, H

    1996-08-01

    A fast, highly selective and sensitive method to quantify specific biomasses in mixed-culture biofilms is described. It consists of detachment of a biofilm from its support material, resolution of the detached biofilm flocs in order to separate the enclosed cells and antigens, and quantification of specific biomass by an enzyme-linked immunosorbent assay. PMID:16535389

  14. Rapid quantification of soilborne pathogen communities in wheat-based long-term field experiments

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Traditional isolation and quantification of inoculum density is difficult for most soilborne pathogens. Quantitative PCR methods have been developed to rapidly identify and quantify many of these pathogens using a single DNA extract from soil. Rainfed experiments operated continuously for up to 84 y...

  15. Investigating Whole Root Systems: Advances in Root Quantification Tools and Techniques

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The efficient quantification of root traits remains a critical factor in exploiting many genetic resources during the study of root function and development. This is particularly true for the high throughput phenotyping of large populations for acid soil tolerance, including aluminum (Al) tolerance...

  16. DETECTION AND QUANTIFICATION OF COW FECAL POLLUTION WITH REAL-TIME PCR

    EPA Science Inventory

    Assessment of health risk and fecal bacteria loads associated with cow fecal pollution requires a reliable host-specific genetic marker and a rapid quantification method. We report the development of quantitative PCR assays for enumeration of two recently described cow-specific g...

  17. Transcriptome assembly and quantification from Ion Torrent RNA-Seq data

    PubMed Central

    2014-01-01

    Background: High-throughput RNA sequencing (RNA-Seq) can generate whole-transcriptome information at the single-transcript level, providing a powerful tool with multiple interrelated applications, including transcriptome reconstruction and quantification. The sequences of novel transcripts can be reconstructed from deep RNA-Seq data, but this is computationally challenging due to sequencing errors, uneven coverage of expressed transcripts, and the need to distinguish between highly similar transcripts produced by alternative splicing. Another challenge in transcriptomic analysis comes from ambiguities in mapping reads to transcripts. Results: We present MaLTA, a method for simultaneous transcriptome assembly and quantification from Ion Torrent RNA-Seq data. Our approach explores transcriptome structure and incorporates a maximum likelihood model into the assembly and quantification procedure. A new version of the IsoEM algorithm, suitable for Ion Torrent RNA-Seq reads, is used to accurately estimate transcript expression levels. The MaLTA-IsoEM tool is publicly available at: http://alan.cs.gsu.edu/NGS/?q=malta Conclusions: Experimental results on both synthetic and real datasets show that Ion Torrent RNA-Seq data can be successfully used for transcriptome analyses, and suggest increased transcriptome assembly and quantification accuracy for the MaLTA-IsoEM solution compared to existing state-of-the-art approaches. PMID:25082147
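
    At the heart of IsoEM-style quantification is an expectation-maximization loop that fractionally assigns ambiguous reads to transcripts; a minimal sketch (real implementations also model transcript lengths, insert sizes, and base qualities):

```python
import numpy as np

# Toy read-to-transcript compatibility matrix: row = read, col = transcript.
compat = np.array([[1, 1, 0],   # read 0 maps to transcripts 0 and 1
                   [1, 0, 0],
                   [0, 1, 1],
                   [0, 0, 1]], dtype=float)
theta = np.full(3, 1.0 / 3.0)   # initial relative abundances

for _ in range(100):
    w = compat * theta                       # E-step: weight by abundance
    w /= w.sum(axis=1, keepdims=True)        # fractional read assignment
    theta = w.sum(axis=0) / compat.shape[0]  # M-step: re-estimate abundances
print(np.round(theta, 3))
```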

  18. Comparison of biochemical and microscopic methods for quantification of mycorrhizal fungi in soil and roots

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Arbuscular mycorrhizal fungi (AMF) are well-known plant symbionts which provide enhanced phosphorus uptake as well as other benefits to their host plants. Quantification of mycorrhizal biomass and root colonization has traditionally been performed by root staining and microscopic examination methods...

  19. THE QUANTIFICATION OF AQUEOUS TRACERS IN LABORATORY AQUIFER MODELS USING A LIGHT TRANSMISSION VISUALIZATION METHOD - 1

    EPA Science Inventory

    The quantification of solute concentrations in laboratory aquifer models has been largely limited to the use of sampling ports, from which samples are collected for external analysis. One of the drawbacks to this method is that the act of sampling may disturb plume dynamics and ...

  20. THE QUANTIFICATION OF AQUEOUS TRACERS IN LABORATORY AQUIFER MODELS USING A LIGHT TRANSMISSION VISUALIZATION METHOD - 3

    EPA Science Inventory

    The quantification of solute concentrations in laboratory aquifer models has been largely limited to the use of sampling ports, from which samples are collected for external analysis. One of the drawbacks to this method is that the act of sampling may disturb plume dynamics and ...

  1. THE QUANTIFICATION OF AQUEOUS TRACERS IN LABORATORY AQUIFER MODELS USING LIGHT TRANSMISSION VISUALIZATION METHOD

    EPA Science Inventory

    The quantification of solute concentrations in laboratory aquifer models has been largely limited to the use of sampling ports, from which samples are collected for external analysis. One of the drawbacks to this method is that the act of sampling may disturb plume dynamics and ...

  2. THE QUANTIFICATION OF AQUEOUS TRACERS IN LABORATORY AQUIFER MODELS USING A LIGHT TRANSMISSION VISUALIZATION METHOD - 2

    EPA Science Inventory

    The quantification of solute concentrations in laboratory aquifer models has been largely limited to the use of sampling ports, from which samples are collected for external analysis. One of the drawbacks to this method is that the act of sampling may disturb plume dynamics and ...

  3. Detection and Quantification of Human Fecal Pollution with Real-Time PCR

    EPA Science Inventory

    Assessment of health risk and fecal bacteria loads associated with human fecal pollution requires a reliable host-specific genetic marker and a rapid quantification method. We report the development of quantitative PCR assays for enumeration of two recently described ...

  4. Improved quantification of pathogen DNA from soil using pressure cycling technology

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Detection and quantification of Rhizoctonia, Pythium and other soilborne pathogens are inconsistent at low pathogen populations and in hard-to-extract samples, despite use of sensitive diagnostic assays such as real-time PCR. An efficient and reproducible extraction system in which samples are subj...

  5. Identification and absolute quantification of enzymes in laundry detergents by liquid chromatography tandem mass spectrometry.

    PubMed

    Gaubert, Alexandra; Jeudy, Jérémy; Rougemont, Blandine; Bordes, Claire; Lemoine, Jérôme; Casabianca, Hervé; Salvador, Arnaud

    2016-07-01

    In a stricter legislative context, greener detergent formulations are being developed; synthetic surfactants are frequently replaced by bio-sourced surfactants and/or used at lower concentrations in combination with enzymes. In this paper, an LC-MS/MS method was developed for the identification and quantification of enzymes in laundry detergents. Prior to the LC-MS/MS analyses, a specific sample preparation protocol was developed to handle the matrix complexity (high surfactant percentages). Then, for each enzyme family commonly used in detergent formulations (protease, amylase, cellulase, and lipase), specific peptides were identified on a high-resolution platform. An LC-MS/MS method was then developed in selected reaction monitoring (SRM) mode for the light peptides and their corresponding heavy-labeled counterparts. The method was linear over peptide concentration ranges of 25-1000 ng/mL for protease, lipase, and cellulase; 50-1000 ng/mL for amylase; and 5-1000 ng/mL for cellulase, in both water and laundry detergent matrices. Application of the developed analytical strategy to real commercial laundry detergents enabled enzyme identification and absolute quantification. For the first time, identification and absolute quantification of enzymes in laundry detergent were realized by LC-MS/MS in a single run. PMID:27098933

  6. Immobilized Metal Affinity Chromatography Coupled to Multiple Reaction Monitoring Enables Reproducible Quantification of Phospho-signaling.

    PubMed

    Kennedy, Jacob J; Yan, Ping; Zhao, Lei; Ivey, Richard G; Voytovich, Uliana J; Moore, Heather D; Lin, Chenwei; Pogosova-Agadjanyan, Era L; Stirewalt, Derek L; Reding, Kerryn W; Whiteaker, Jeffrey R; Paulovich, Amanda G

    2016-02-01

    A major goal in cell signaling research is the quantification of phosphorylation pharmacodynamics following perturbations. Traditional methods of studying cellular phospho-signaling measure one analyte at a time with poor standardization, rendering them inadequate for interrogating network biology and contributing to the irreproducibility of preclinical research. In this study, we test the feasibility of circumventing these issues by coupling immobilized metal affinity chromatography (IMAC)-based enrichment of phosphopeptides with targeted multiple reaction monitoring (MRM) mass spectrometry to achieve precise, specific, standardized, multiplex quantification of phospho-signaling responses. A multiplex IMAC-MRM assay targeting phospho-analytes responsive to DNA damage was configured, analytically characterized, and deployed to generate phospho-pharmacodynamic curves from primary and immortalized human cells experiencing genotoxic stress. The multiplexed assays demonstrated linear ranges of ≥3 orders of magnitude, a median lower limit of quantification of 0.64 fmol on column, median intra-assay variability of 9.3%, median inter-assay variability of 12.7%, and median total CV of 16.0%. The IMAC-MRM assay enabled robust quantification of 107 DNA damage-responsive phosphosites from human cells following DNA damage. The assays have been made publicly available as a resource to the community. The approach is generally applicable, enabling wide interrogation of signaling networks. PMID:26621847
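
    MRM assays of this kind are usually read out by stable-isotope dilution: the endogenous (light) amount follows from the light/heavy peak-area ratio and the known heavy spike. A minimal sketch with illustrative numbers:

```python
# Stable-isotope-dilution readout: analyte amount equals the
# light/heavy peak-area ratio times the spiked heavy amount.
def light_amount_fmol(light_area, heavy_area, heavy_spike_fmol):
    return (light_area / heavy_area) * heavy_spike_fmol

print(light_amount_fmol(2.4e5, 1.0e5, heavy_spike_fmol=50.0))  # 120.0
```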

  7. A whole-cell electrochemical biosensing system based on bacterial inward electron flow for fumarate quantification.

    PubMed

    Si, Rong-Wei; Zhai, Dan-Dan; Liao, Zhi-Hong; Gao, Lu; Yong, Yang-Chun

    2015-06-15

    Fumarate is of great importance as it is an oncometabolite as well as a food-spoilage indicator. However, a cost-effective and fast quantification method for fumarate has been lacking, although one is urgently required. This work developed an electrochemical whole-cell biosensing system for fumarate quantification. A sensitive inward electric output (electron flow from the electrode into bacteria) in response to fumarate was characterized in Shewanella oneidensis MR-1, and an electrochemical fumarate biosensing system was developed without genetic engineering. The biosensing system delivered a symmetric current peak immediately upon fumarate addition, with the peak area increasing in proportion to fumarate concentration over a wide range of 2 μM-10 mM (R(2)=0.9997). The limit of detection (LOD) and the limit of quantification (LOQ) are 0.83 μM and 1.2 μM, respectively. The biosensing system displayed remarkable specificity for fumarate against other possible interferences, and was successfully applied to samples of apple juice and kidney tissue. This study adds a new dimension to electrochemical biosensor design and provides a simple, cost-effective, fast and robust tool for fumarate quantification. PMID:25558872

  8. New Real-Time Quantitative PCR Procedure for Quantification of Bifidobacteria in Human Fecal Samples

    PubMed Central

    Gueimonde, Miguel; Tölkkö, Satu; Korpimäki, Teemu; Salminen, Seppo

    2004-01-01

    The application of a real-time quantitative PCR method (5′ nuclease assay), based on the use of a probe labeled at its 5′ end with a stable, fluorescent lanthanide chelate, for the quantification of human fecal bifidobacteria was evaluated. The specificities of the primers and the primer-probe combination were evaluated by conventional PCR and real-time PCR, respectively. The results obtained by real-time PCR were compared with those obtained by fluorescent in situ hybridization, the current gold standard for intestinal microbiota quantification. In general, a good correlation between the two methods was observed. In order to determine the detection limit and the accuracy of the real-time PCR procedure, germfree rat feces were spiked with known amounts of bifidobacteria and analyzed by both methods. The detection limit of the method used in this study was found to be about 5 × 104 cells per g of feces. Both methods, real-time PCR and fluorescent in situ hybridization, led to an accurate quantification of the spiked samples with high levels of bifidobacteria, but real-time PCR was more accurate for samples with low levels. We conclude that the real-time PCR procedure described here is a specific, accurate, rapid, and easy method for the quantification of bifidobacteria in feces. PMID:15240297
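
    Real-time PCR quantification of this kind rests on a standard curve: Ct is linear in log10 of the starting template, and unknowns are read off the fitted line. A minimal sketch with illustrative values:

```python
import numpy as np

# Standard curve from serial dilutions of known copy number.
log_copies = np.array([4, 5, 6, 7, 8], dtype=float)  # log10(copies)
ct = np.array([30.1, 26.8, 23.4, 20.0, 16.7])        # measured Ct

slope, intercept = np.polyfit(log_copies, ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0   # ~1.0 means 100% efficiency

ct_unknown = 25.2
copies = 10 ** ((ct_unknown - intercept) / slope)
print(f"E = {efficiency:.2f}, unknown ~ {copies:.2e} copies")
```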

  9. Towards Quantification of Functional Breast Images Using Dedicated SPECT With Non-Traditional Acquisition Trajectories.

    PubMed

    Perez, Kristy L; Cutler, Spencer J; Madhav, Priti; Tornai, Martin P

    2011-10-01

    Quantification of radiotracer uptake in breast lesions can provide valuable information to physicians in deciding patient care or determining treatment efficacy. Physical processes (e.g., scatter, attenuation), detector/collimator characteristics, sampling and acquisition trajectories, and reconstruction artifacts all contribute to incorrect measurement of absolute tracer activity and distribution. For these experiments, a cylinder containing three syringes of varying radioactivity concentration, and a fillable 800 mL breast phantom with two lesion phantoms containing aqueous (99m)Tc pertechnetate, were imaged using the SPECT sub-system of the dual-modality SPECT-CT dedicated breast scanner. SPECT images were collected using a compact CZT camera with various 3D acquisitions, including vertical-axis-of-rotation, 30° tilted, and complex sinusoidal trajectories. Different energy windows around the photopeak, along with appropriate scatter energy windows, were quantitatively compared to determine the best quantification accuracy after attenuation and dual-window scatter correction. Measured activity concentrations in the reconstructed images for syringes with greater than 10 µCi/mL corresponded to within 10% of the dose-calibrator-measured activity concentration for ±4% and ±8% photopeak energy windows. The same energy windows yielded lesion quantification results within 10% in the breast phantom as well. Results for the more complete complex sinusoidal trajectory are similar to those for the simple vertical-axis acquisition, while additionally allowing anterior chest wall sampling, avoiding image distortion, and providing reasonably accurate quantification. PMID:22262925

  10. Will 3-dimensional PET-CT enable the routine quantification of myocardial blood flow?

    PubMed

    deKemp, Robert A; Yoshinaga, Keiichiro; Beanlands, Rob S B

    2007-01-01

    Quantification of myocardial blood flow (MBF) and flow reserve has been used extensively with positron emission tomography (PET) to investigate the functional significance of coronary artery disease. Increasingly, flow quantification is being applied to investigations of microvascular dysfunction in early atherosclerosis and in nonatherosclerotic microvascular disease associated with primary and secondary cardiomyopathies. Fully three-dimensional (3D) acquisition is becoming the standard imaging mode on new equipment, bringing with it certain challenges for cardiac PET, but also the potential for MBF to be measured simultaneously with routine electrocardiography (ECG)-gated perfusion imaging. Existing 3D versus 2D comparative studies support the use of 3D cardiac PET for flow quantification, and these protocols can be translated to PET-CT, which offers a virtually noise-free attenuation correction. This technology combines the strengths of cardiac CT for evaluation of anatomy with cardiac PET for quantification of the hemodynamic impact on the myocardium. High throughput clinical imaging protocols are needed to evaluate the incremental diagnostic and prognostic value of this technology. PMID:17556173

  11. Mass Spectrometric Quantification of N-Linked Glycans by Reference to Exogenous Standards.

    PubMed

    Mehta, Nickita; Porterfield, Mindy; Struwe, Weston B; Heiss, Christian; Azadi, Parastoo; Rudd, Pauline M; Tiemeyer, Michael; Aoki, Kazuhiro

    2016-09-01

    Environmental and metabolic processes shape the profile of glycoprotein glycans expressed by cells, whether in culture, developing tissues, or mature organisms. Quantitative characterization of glycomic changes associated with these conditions has been achieved historically by reductive coupling of oligosaccharides to various fluorophores following release from glycoprotein and subsequent HPLC or capillary electrophoretic separation. Such labeling-based approaches provide a robust means of quantifying glycan amount based on fluorescence yield. Mass spectrometry, on the other hand, has generally been limited to relative quantification in which the contribution of the signal intensity for an individual glycan is expressed as a percent of the signal intensity summed over the total profile. Relative quantification has been valuable for highlighting changes in glycan expression between samples; sensitivity is high, and structural information can be derived by fragmentation. We have investigated whether MS-based glycomics is amenable to absolute quantification by referencing signal intensities to well-characterized oligosaccharide standards. We report the qualification of a set of N-linked oligosaccharide standards by NMR, HPLC, and MS. We also demonstrate the dynamic range, sensitivity, and recovery from complex biological matrices for these standards in their permethylated form. Our results indicate that absolute quantification for MS-based glycomic analysis is reproducible and robust utilizing currently available glycan standards. PMID:27432553

  12. Quantification of event-related desynchronization/synchronization at low frequencies in a semantic memory task.

    PubMed

    Gómez, Juan; Aguilar, Mónica; Horna, Eduardo; Minguez, Javier

    2012-01-01

    Although several techniques have been developed for the visualization of EEG event-related desynchronization/synchronization (ERD/ERS) in both the time and frequency domains, none of the quantification methods exploits time and frequency resolution simultaneously. Existing techniques for quantifying ERD/ERS changes compute the average EEG power increase/decrease relative to a reference value over fixed time intervals and/or frequency bands (either fixed or individualized). Inaccuracy in the computation of these frequency bands (where the process is actually measured), in combination with the averaging over time, may lead to errors in any ERD/ERS quantification parameter. In this paper, we present a novel method for the automatic, individualized, and exact quantification of the most significant ERD/ERS region within a given window of the time-frequency domain. The method is exemplified by quantifying low-frequency ERS in 10 subjects performing a semantic memory task, and is compared with existing techniques. PMID:23366438
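
    The conventional quantification these authors improve upon is the classic band/interval average, ERD/ERS% = 100 * (A - R) / R, with R the band power in a reference interval and A the power in the test interval. A minimal sketch (Python; the array layout and window arguments are illustrative assumptions):

        import numpy as np

        def erd_ers_percent(power_tf, times, freqs, ref_window, test_window, band):
            # power_tf: 2-D array (freqs x times) of trial-averaged power.
            # Negative values indicate ERD, positive values ERS.
            f_sel = (freqs >= band[0]) & (freqs <= band[1])
            r_sel = (times >= ref_window[0]) & (times <= ref_window[1])
            t_sel = (times >= test_window[0]) & (times <= test_window[1])
            R = power_tf[np.ix_(f_sel, r_sel)].mean()
            A = power_tf[np.ix_(f_sel, t_sel)].mean()
            return 100.0 * (A - R) / R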

  13. Unification of gene expression data applying standard mRNA quantification references for comparable analyses

    Technology Transfer Automated Retrieval System (TEKTRAN)

    High throughput quantitative measurements of gene expression data have problems of reproducibility and comparability due to a lack of standard mRNA quantification references. Efforts have been made to safeguard data fidelity, yet generating quality expression data of inherent value remains a challe...

  14. Identification and Quantification of Pathogenic Rhizoctonia solani and R. oryzae Using Real-time PCR

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Rhizoctonia solani and R. oryzae are the principal causal agents of Rhizoctonia root rot in dryland cereal production systems of the Pacific Northwest. To facilitate the identification and quantification of these pathogens in agricultural samples, we developed SYBR Green I-based real-time quantitati...
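
    Real-time PCR quantification of this kind normally rests on a standard curve built from a dilution series of known target amounts. A minimal sketch (Python; the log-linear Ct response and the function names are standard conventions assumed here, not details from the truncated abstract):

        import numpy as np

        def fit_standard_curve(log10_quantity, ct_values):
            # Linear fit Ct = m*log10(Q) + b from a dilution series; a perfect
            # assay gives m ~ -3.32 and amplification efficiency ~ 1.0.
            m, b = np.polyfit(log10_quantity, ct_values, 1)
            efficiency = 10 ** (-1.0 / m) - 1.0
            return m, b, efficiency

        def quantify(ct, m, b):
            # Invert the standard curve to estimate target quantity in a sample.
            return 10 ** ((ct - b) / m)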

  15. Sensitive Targeted Quantification of ERK Phosphorylation Dynamics and Stoichiometry in Human Cells without Affinity Enrichment

    DOE PAGESBeta

    Shi, Tujin; Gao, Yuqian; Gaffrey, Matthew J.; Nicora, Carrie D.; Fillmore, Thomas L.; Chrisler, William B.; Gritsenko, Marina A.; Wu, Chaochao; He, Jintang; Bloodsworth, Kent J.; et al

    2014-12-17

    Mass spectrometry-based targeted quantification is a promising technology for site-specific quantification of posttranslational modifications (PTMs). However, a major constraint of most targeted MS approaches is the limited sensitivity for quantifying low-abundance PTMs, requiring the use of affinity reagents to enrich specific PTMs. Herein, we demonstrate the direct site-specific quantification of ERK phosphorylation isoforms (pT, pY, pTpY) and their relative stoichiometries using a highly sensitive targeted MS approach termed high-pressure, high-resolution separations with intelligent selection and multiplexing (PRISM). PRISM provides effective enrichment of target peptides within a given fraction from a complex biological matrix with minimal sample losses, followed by selected reaction monitoring (SRM) quantification. The PRISM-SRM approach enabled direct quantification of ERK phosphorylation in human mammary epithelial cells (HMEC) from as little as 25 µg of tryptic peptides from whole-cell lysates. Compared to immobilized metal-ion affinity chromatography, PRISM provided >10-fold improvement in signal intensities, presumably due to its better peptide recovery when handling small samples. This approach was applied to quantify ERK phosphorylation dynamics in HMEC treated with different doses of EGF at both peak activation (10 min) and steady state (2 h). At 10 min, maximal ERK activation was observed at the 0.3 ng/mL dose, whereas the maximal steady-state level of ERK activation at 2 h was at the 3 ng/mL dose, corresponding to 1200 and 9000 occupied receptors, respectively. At 10 min, the maximally activated pTpY isoform represented ~40% of total ERK, falling to less than 10% at 2 h. The time course and dose-response profiles of individual phosphorylated ERK isoforms indicated that singly phosphorylated pT-ERK never increases significantly, while the increase of pY-ERK paralleled that of pTpY-ERK. These data support a processive, rather than
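
    The relative stoichiometries reported above reduce to a share-of-total calculation over the isoform signals. A minimal sketch (Python; the assumption of comparable SRM response factors across isoforms, and the example numbers, are illustrative):

        def phospho_stoichiometry(peak_areas):
            # Fraction of total ERK contributed by each isoform, assuming the
            # SRM peak areas respond comparably across isoforms.
            total = sum(peak_areas.values())
            return {isoform: area / total for isoform, area in peak_areas.items()}

        # Hypothetical peak areas at peak activation (10 min):
        print(phospho_stoichiometry({'unmodified': 50, 'pT': 3, 'pY': 9, 'pTpY': 38}))
        # -> pTpY ~ 0.38, i.e. roughly the ~40% of total ERK cited above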

  17. Union Exon Based Approach for RNA-Seq Gene Quantification: To Be or Not to Be?

    PubMed Central

    Zhao, Shanrong; Xi, Li; Zhang, Baohong

    2015-01-01

    In recent years, RNA-seq has emerged as a powerful technology for estimating gene and/or transcript expression, and RPKM (Reads Per Kilobase per Million reads) is widely used to represent the relative abundance of mRNAs for a gene. In general, methods for gene quantification fall into two categories: the transcript-based approach and the 'union exon'-based approach. The transcript-based approach is intrinsically more difficult because different isoforms of a gene typically share a high proportion of genomic overlap. The 'union exon'-based approach, on the other hand, is much simpler and is therefore widely used in RNA-seq gene quantification. Biologically, however, a gene is expressed in one or more transcript isoforms, making the transcript-based approach the more meaningful of the two. Although gene quantification is a fundamental task in most RNA-seq studies, it remains unclear whether the 'union exon'-based approach is good practice. In this paper, we carried out a side-by-side comparison of the 'union exon'-based approach and the transcript-based method in RNA-seq gene quantification. We found that gene expression levels are significantly underestimated by the 'union exon'-based approach: the average RPKM from the 'union exon'-based method is less than 50% of the mean expression obtained from the transcript-based approach. The difference between the two approaches is primarily driven by the number of transcripts in a gene. We performed differential analysis at both the gene and transcript levels and found that more insights, such as isoform switching, are gained from isoform-level differential analysis. The accuracy of isoform quantification would improve if the read coverage pattern and exon-exon spanning reads were taken into account and incorporated into the EM (Expectation Maximization) algorithm. Our investigation discourages the use of
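
    The underestimation follows directly from the RPKM definition: merging all isoforms' exons inflates the length denominator beyond that of any single expressed transcript. A minimal sketch (Python; the read counts and lengths are hypothetical):

        def rpkm(read_count, feature_length_bp, total_mapped_reads):
            # RPKM = reads / (feature length in kb * total mapped reads in millions)
            return read_count * 1e9 / (feature_length_bp * total_mapped_reads)

        # Same reads attributed to a long union-exon footprint vs. the single
        # 2.5 kb transcript actually expressed (hypothetical numbers):
        gene_reads, total = 1000, 20_000_000
        print(rpkm(gene_reads, 6000, total))   # union exon model: ~8.3
        print(rpkm(gene_reads, 2500, total))   # transcript model: ~20.0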

  18. Osteosarcoma Microenvironment: Whole-Slide Imaging and Optimized Antigen Detection Overcome Major Limitations in Immunohistochemical Quantification

    PubMed Central

    Kunz, Pierre; Fellenberg, Jörg; Moskovszky, Linda; Sápi, Zoltan; Krenacs, Tibor; Poeschl, Johannes; Lehner, Burkhard; Szendrõi, Miklos; Ewerbeck, Volker; Kinscherf, Ralf; Fritzsching, Benedikt

    2014-01-01

    Background: In osteosarcoma, survival rates have not improved over the last 30 years. Novel biomarkers are needed to allow risk stratification of patients for more individualized treatment following initial diagnosis. Although previous studies of the tumor microenvironment have identified promising candidates, novel biomarkers have not been translated into routine histopathology. Substantial difficulties regarding immunohistochemical detection and quantification of antigens in decalcified and heterogeneous osteosarcoma may largely explain this translational shortcoming. Furthermore, we hypothesized that conventional hot spot analysis is often not representative of the whole section when applied to heterogeneous tissues like osteosarcoma. We aimed to overcome these difficulties for major biomarkers of the immunovascular microenvironment. Methods: Immunohistochemistry was systematically optimized for cell-surface (CD31, CD8) and intracellular antigens (FOXP3), including evaluation of 200 different antigen retrieval conditions. Distribution patterns of these antigens were analyzed in formalin-fixed, paraffin-embedded samples from 120 high-grade central osteosarcoma biopsies, and computer-assisted whole-slide analysis was compared with conventional quantification methods, including hot spot analysis. Results: More than 96% of osteosarcoma samples were positive for all antigens after optimization of immunohistochemistry. In contrast, standard immunohistochemistry returned false-negative results in 35–65% of decalcified osteosarcoma specimens. Standard hot spot analysis was applicable to homogeneously distributed FOXP3+ and CD8+ cells. However, the heterogeneous distribution of vascular CD31 did not allow reliable quantification with hot spot analysis in 85% of all samples. Computer-assisted whole-slide analysis of the total CD31-immunoreactive area proved to be the most appropriate quantification method. Conclusion: Standard staining and quantification procedures are not
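
    A minimal sketch of the whole-slide, area-based readout that outperformed hot spot counting for CD31 (Python; the threshold value, the pre-computed stain channel, and the mask handling are illustrative assumptions):

        import numpy as np

        def immunoreactive_area_fraction(dab_channel, threshold=0.15,
                                         tissue_mask=None):
            # Fraction of tissue area whose DAB (brown) stain intensity exceeds
            # a fixed threshold, computed over the whole slide rather than in
            # manually selected hot spots.
            # dab_channel: 2-D float array (0..1) from color deconvolution.
            positive = dab_channel > threshold
            if tissue_mask is None:
                tissue_mask = np.ones_like(positive, dtype=bool)
            return positive[tissue_mask].sum() / tissue_mask.sum()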

  19. Uncertainty quantification metrics for whole product life cycle cost estimates in aerospace innovation

    NASA Astrophysics Data System (ADS)

    Schwabe, O.; Shehab, E.; Erkoyuncu, J.

    2015-08-01

    The lack of defensible methods for quantifying cost-estimate uncertainty over the whole product life cycle of aerospace innovations such as propulsion systems or airframes poses a significant challenge to the creation of accurate and defensible cost estimates. Based on the axiomatic definition of uncertainty as the actual prediction error of the cost estimate, this paper provides a comprehensive overview of metrics used for the uncertainty quantification of cost estimates, based on a literature review, an evaluation of publicly funded projects such as those within the CORDIS or Horizon 2020 programs, and an analysis of established approaches used by organizations such as NASA, the U.S. Department of Defense, the ESA, and various commercial companies. The metrics are categorized by their foundational character (foundations), their use in practice (state-of-practice), their availability for practice (state-of-art), and their suggestion for future exploration (state-of-future). A key insight is that a variety of uncertainty quantification metrics exist whose suitability depends on the volatility of available relevant information, as defined by technical and cost readiness level, and on the number of whole product life cycle phases the estimate is intended to cover. Information volatility and the number of life cycle phases can thereby be considered to define multi-dimensional probability fields admitting various families of uncertainty quantification metrics, with identifiable thresholds for transitioning between them. The key research gaps identified were the lack of theoretically grounded guidance for selecting uncertainty quantification metrics and the lack of practical alternatives to metrics based on the Central Limit Theorem. An innovative uncertainty quantification framework consisting of a set-theory-based typology, a data library, a classification system, and a corresponding input-output model is put forward to address this research gap as the basis
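
    For context, the Central-Limit-Theorem-style metric the authors seek alternatives to treats the rolled-up cost of many elements as approximately normal and reports a symmetric interval around the point estimate. A minimal sketch (Python; the confidence level and sample handling are illustrative assumptions):

        import numpy as np
        from scipy import stats

        def clt_cost_interval(element_cost_samples, confidence=0.8):
            # Confidence interval for the mean cost estimate, assuming the
            # roll-up of many element costs is approximately normal (CLT).
            samples = np.asarray(element_cost_samples, dtype=float)
            mean = samples.mean()
            half = stats.t.ppf((1 + confidence) / 2, len(samples) - 1) * stats.sem(samples)
            return mean - half, mean + half

    Such intervals shrink as sample size grows regardless of information volatility, which is one reason the paper argues they are poorly suited to early life cycle phases.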

  20. Characterization and LC-MS/MS based quantification of hydroxylated fullerenes

    PubMed Central

    Chao, Tzu-Chiao; Song, Guixue; Hansmeier, Nicole; Westerhoff, Paul; Herckes, Pierre; Halden, Rolf U.

    2011-01-01

    Highly water-soluble hydroxylated fullerene derivatives are being investigated for a wide range of commercial products as well as for potential cytotoxicity. However, no analytical methods are currently available for their quantification at sub-ppm concentrations in environmental matrices. Here, we report on the development and comparison of liquid chromatography-ultraviolet/visible spectroscopy (LC-UV/vis) and mass spectrometry (LC-MS) based detection and quantification methods for a commercial fullerol. We achieved good separation efficiency using an amide-type hydrophilic interaction liquid chromatography (HILIC) column (plate number >2000) under isocratic conditions with 90% acetonitrile as the mobile phase. The method detection limits (MDLs) ranged from 42.8 ng/mL (UV detection) to 0.19 pg/mL (MS with multiple reaction monitoring, MRM). Other MS measurement modes achieved MDLs of 125 pg/mL (single-quadrupole scan, Q1) and 1.5 pg/mL (multiple ion monitoring, MI). Each detection method exhibited a linear response over several orders of magnitude. Moreover, we tested the robustness of these methods in the presence of Suwannee River fulvic acid (SRFA) as an example of the organic matter commonly found in environmental water samples. While SRFA significantly interfered with UV- and Q1-based quantification, the interference was relatively low using MI or MRM (relative error in the presence of SRFA: 8.6% and 2.5%, respectively). This first report of a robust MS-based quantification method for modified fullerenes dissolved in water suggests the feasibility of implementing MS techniques more broadly for the identification and quantification of fullerols and other water-soluble fullerene derivatives in environmental samples. PMID:21294534
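
    MDLs of the kind reported here are commonly derived from replicate low-level spike measurements via a one-sided Student-t quantile. A minimal sketch (Python; the EPA-style formula is a standard convention assumed here, and the alpha and replicate data are illustrative):

        import numpy as np
        from scipy import stats

        def method_detection_limit(replicate_concentrations, alpha=0.01):
            # EPA-style MDL: one-sided Student-t quantile (df = n - 1) times the
            # standard deviation of n replicate low-level spike measurements.
            reps = np.asarray(replicate_concentrations, dtype=float)
            t = stats.t.ppf(1 - alpha, df=len(reps) - 1)
            return t * reps.std(ddof=1)

        # Hypothetical seven-replicate spike series (pg/mL):
        print(method_detection_limit([0.21, 0.18, 0.20, 0.17, 0.19, 0.22, 0.18]))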